I need to insert a huge CSV file into an Oracle database using JDBC. I read somewhere that a parallel insert is only possible when the statement has the form
insert into <tablename> select * from <tablename1>, i.e. we can insert in parallel from one table into another, whereas a statement like
insert into <tablename> values ... cannot be executed as a parallel insert.
My CSV file is very big, around 1-2 GB, and inserting it serially would take days to load into the database, which isn't acceptable. Can I use threading in this case, and how? I need to improve the performance drastically.
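To make the question concrete, here is a minimal sketch of the multi-threaded JDBC approach I have in mind: split the CSV rows into chunks and have each worker thread insert its chunk in batches over its own connection. The table `my_table`, its columns, and the JDBC URL are placeholders; the real schema and connection details would differ.

```java
import java.sql.*;
import java.util.*;
import java.util.concurrent.*;

public class ParallelCsvLoad {

    // Split the CSV rows into roughly equal chunks, one per worker.
    static List<List<String>> partition(List<String> rows, int workers) {
        List<List<String>> chunks = new ArrayList<>();
        int size = (rows.size() + workers - 1) / workers; // ceiling division
        for (int i = 0; i < rows.size(); i += size) {
            chunks.add(rows.subList(i, Math.min(i + size, rows.size())));
        }
        return chunks;
    }

    // Each worker opens its OWN connection (connections are not thread-safe)
    // and inserts its chunk with batched PreparedStatement executions.
    static void loadChunk(List<String> rows, String jdbcUrl) throws SQLException {
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO my_table (col1, col2) VALUES (?, ?)")) { // placeholder table
            conn.setAutoCommit(false);
            int count = 0;
            for (String row : rows) {
                String[] f = row.split(",");
                ps.setString(1, f[0]);
                ps.setString(2, f[1]);
                ps.addBatch();
                if (++count % 1000 == 0) ps.executeBatch(); // flush every 1000 rows
            }
            ps.executeBatch();
            conn.commit();
        }
    }

    // Submit one chunk per worker thread and wait for completion.
    static void loadInParallel(List<String> rows, int workers, String jdbcUrl)
            throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        for (List<String> chunk : partition(rows, workers)) {
            pool.submit(() -> {
                try {
                    loadChunk(chunk, jdbcUrl);
                } catch (SQLException e) {
                    e.printStackTrace();
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }

    public static void main(String[] args) {
        // Demo of the partitioning only (no database needed):
        List<String> rows = new ArrayList<>();
        for (int i = 0; i < 10; i++) rows.add("a" + i + ",b" + i);
        for (List<String> chunk : partition(rows, 4)) {
            System.out.println("chunk size: " + chunk.size()); // prints 3, 3, 3, 1
        }
    }
}
```

In real use one would call `loadInParallel(rows, workers, url)` after streaming the file in; for a 1-2 GB file the rows should be read and handed to workers incrementally rather than held in one list.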
The SQL you have doesn't demonstrate a parallel insert and won't really help you speed up your bulk data load.
SQL*Loader lets you perform parallel loads, where the loading work is spread across the available CPUs and disks. Have a read of the SQL*Loader documentation; it describes a wealth of possible performance improvements.
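As a rough sketch of what such a load might look like: split the CSV into several files and run one SQL*Loader session per file concurrently, each in direct-path parallel mode. The file, table, and column names below are placeholders.

```
-- load.ctl: control file for one slice of the CSV (names are placeholders)
LOAD DATA
INFILE 'part1.csv'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
```

Each concurrent session is then started with something like `sqlldr userid=... control=load.ctl direct=true parallel=true`; note that `PARALLEL=TRUE` requires direct-path mode and `APPEND`, so check the exact options against the SQL*Loader documentation for your Oracle version.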