Some issues here:
1) The PostgreSQL Bulk Loader step only works with a PostgreSQL database; it cannot connect to a SQL Server database.
2) The Table Output step is probably your best option.
Set its "Commit size" to a fairly large number (e.g., 10,000) so that PDI doesn't issue COMMIT TRANSACTION commands after every few rows; fewer, larger transactions make a big difference for bulk inserts (see the JDBC sketch below for the idea).
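For what it's worth, the reasoning behind a large commit size is the same as plain JDBC batching: turn off auto-commit and commit once per batch instead of once per row. Here is a minimal sketch of that idea (the connection URL, credentials, table name, and batch size are placeholders, and it assumes Microsoft's JDBC driver is on the classpath):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class BatchInsertSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details -- substitute your own.
            String url = "jdbc:sqlserver://localhost:1433;databaseName=target_db";
            int commitSize = 10_000; // plays the role of Table Output's "Commit size"

            try (Connection con = DriverManager.getConnection(url, "user", "password")) {
                con.setAutoCommit(false); // no COMMIT per row
                String sql = "INSERT INTO target_table (id, name) VALUES (?, ?)";
                try (PreparedStatement ps = con.prepareStatement(sql)) {
                    for (int i = 0; i < 5_000_000; i++) {
                        ps.setInt(1, i);
                        ps.setString(2, "row " + i);
                        ps.addBatch();
                        if ((i + 1) % commitSize == 0) {
                            ps.executeBatch();
                            con.commit(); // one COMMIT per 10,000 rows
                        }
                    }
                    ps.executeBatch(); // flush the final partial batch
                    con.commit();
                }
            }
        }
    }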
3) As far as I know, there is no MS SQL Server bulk loader step in the standard product (7.1 and 8.1 are the versions I've used recently), nor in the Tools > Marketplace. If someone knows of one, I'd like to learn about it too.
4) If this is a one-time load, then PDI may not be the best way to go.
MS SQL Server ships with a command-line bulk loader called bcp. If you can export a file from Oracle in the needed format (or write one out via PDI if not), bcp will load your data much faster. If you do not need the indexes in place while the rows are loading, it is *much* faster still.
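A typical invocation looks something like this (the server, database, table, and file names here are made up; -c means character data, -t sets the field terminator, -b sets the commit batch size, and -e captures rejected rows):

    bcp TargetDb.dbo.target_table in C:\export\oracle_rows.csv -S sqlhost -U loader -P secret -c -t"," -b 50000 -e bcp_errors.log

Run bcp with no arguments to see the full option list.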
For anyone unfamiliar with the unit, 50 lakh is 5 million, so you are looking at 5+ million rows. If you use PDI for a load that size, I suggest you set it up to use more memory: edit the launch script (spoon.bat if running under Windows, spoon.sh if not) to give a larger value for the Java -Xmx (maximum heap) parameter, as in the snippet below.
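The exact line varies by PDI version, but in the 7.x/8.x scripts the heap settings live in a PENTAHO_DI_JAVA_OPTIONS variable near the top of the script; the 4 GB figure below is just an example:

    REM spoon.bat (Windows):
    set PENTAHO_DI_JAVA_OPTIONS="-Xms1024m" "-Xmx4096m"

    # spoon.sh (Linux/Mac):
    export PENTAHO_DI_JAVA_OPTIONS="-Xms1024m -Xmx4096m"

Restart Spoon after the change.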
Hope that helps!
John