I have to insert 100K records into a table with 4 fields. A plain JDBC insert loads about 2K records per minute. Any ideas on how to improve the execution time?
The flow is something like this:
Open a connection.
Read the record from a .txt file inside a loop.
Issue executeUpdate to insert the record.
Commit after every 10 records. (I have tried autocommit and committing after every x records, for various values of x.)
Outside the loop, once all the input has been read, I close the FileReader and BufferedReader objects.
In the finally block, I close the Statement and Connection objects.
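One common fix for this pattern is to replace the per-row `executeUpdate` with JDBC batching: queue rows with `PreparedStatement.addBatch()` and send them in one round trip with `executeBatch()`, committing per batch rather than every 10 rows. Below is a minimal sketch of the flow described above; the connection URL, credentials, file name, table and column names, and the comma-separated file format are all assumptions, not details from the question.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchLoader {
    /** Splits one line of the .txt file into its 4 fields (assuming comma-separated input). */
    static String[] parseFields(String line) {
        return line.split(",", 4);
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical URL, credentials, file name, and table/column names.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//host:1521/service", "user", "pass");
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO my_table (f1, f2, f3, f4) VALUES (?, ?, ?, ?)");
             BufferedReader in = new BufferedReader(new FileReader("records.txt"))) {
            conn.setAutoCommit(false);            // commit in large chunks, not per row
            int count = 0;
            String line;
            while ((line = in.readLine()) != null) {
                String[] f = parseFields(line);
                for (int i = 0; i < 4; i++) {
                    ps.setString(i + 1, f[i]);
                }
                ps.addBatch();                    // queue the row client-side
                if (++count % 1000 == 0) {
                    ps.executeBatch();            // one round trip for 1000 rows
                    conn.commit();
                }
            }
            ps.executeBatch();                    // flush the final partial batch
            conn.commit();
        }
    }
}
```

The batch size of 1000 is a starting point to tune, not a recommendation; the try-with-resources block also replaces the manual closes in the finally block.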
I have never worked with Oracle, but I had a similar problem with MySQL and the InnoDB engine. The problem was that InnoDB logs each insert. Since MySQL supports inserting multiple rows with one statement, I ended up building one big SQL statement (with a StringBuffer) that inserts 30-40k rows at once.
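The multi-row statement described above can be sketched roughly like this. The table and column names are hypothetical, and the values are inlined with naive quoting purely for illustration; real code should use placeholders or proper escaping to avoid SQL injection.

```java
import java.util.List;

public class MultiRowInsert {
    /**
     * Builds one multi-row INSERT of the form
     * INSERT INTO my_table (f1, f2, f3, f4) VALUES (...), (...), ...
     * Assumes values are already safe to inline; use placeholders in real code.
     */
    static String buildInsert(List<String[]> rows) {
        StringBuilder sql = new StringBuilder(
            "INSERT INTO my_table (f1, f2, f3, f4) VALUES ");
        for (int r = 0; r < rows.size(); r++) {
            if (r > 0) sql.append(", ");
            sql.append('(');
            String[] row = rows.get(r);
            for (int c = 0; c < row.length; c++) {
                if (c > 0) sql.append(", ");
                sql.append('\'').append(row[c]).append('\'');
            }
            sql.append(')');
        }
        return sql.toString();
    }
}
```

Note that multi-row VALUES lists are MySQL syntax; on Oracle the closer equivalent is JDBC batching or INSERT ALL.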