My Java web application has to parse a large data file of 500,000 records. These records will be stored in an Oracle 9i database table. I wonder what would be an efficient way to write the data to the table.
I tried to write about 1,000 records to the table in one go, but the table seems to be unusable now (Error: java.sql.SQLException: Protocol violation, SQL State: null, Error Code: 17401).
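If the load has to stay in Java, the usual approach is JDBC batching with an explicit commit every batch rather than one statement per row. A "Protocol violation" can also indicate a driver/server version mismatch, so it is worth confirming the Oracle JDBC driver matches the 9i server. Below is a minimal sketch, assuming a hypothetical RECORDS(ID, PAYLOAD) table; the table and column names are illustrative only:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.List;

public class BatchLoader {
    static final int BATCH_SIZE = 1000;

    // Number of executeBatch() calls needed for n rows
    // (full batches plus a final partial one).
    static int batchCount(int n) {
        return (n + BATCH_SIZE - 1) / BATCH_SIZE;
    }

    // Insert into a hypothetical RECORDS(ID, PAYLOAD) table,
    // committing once per batch instead of once per row.
    static void load(Connection conn, List<String[]> records) throws Exception {
        conn.setAutoCommit(false);                 // commit per batch, not per row
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO records (id, payload) VALUES (?, ?)");
        try {
            int count = 0;
            for (String[] rec : records) {
                ps.setString(1, rec[0]);
                ps.setString(2, rec[1]);
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch();             // one round trip per 1,000 rows
                    conn.commit();
                }
            }
            ps.executeBatch();                     // flush the final partial batch
            conn.commit();
        } finally {
            ps.close();
        }
    }
}
```

Committing every 1,000 rows keeps undo/rollback pressure bounded; 500,000 uncommitted inserts in one transaction is a common cause of trouble on 9i-era setups.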
Kelvin, my thought is that loading half a million records through Java code is not a great idea. Did you look at other ways to load the data (such as sqlldr, Oracle's SQL*Loader)? If your constraint is that Java must prepare the data [clean, transform], then you can write it out to a file and invoke a script that performs a direct-path load into Oracle.
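For reference, the suggestion above amounts to writing a delimited file from Java and then running sqlldr with a control file. A rough sketch follows; the file, table, and column names are all hypothetical, and the real control file has to match whatever format the Java code writes:

```
-- loader.ctl (hypothetical control file; adjust names to your schema)
LOAD DATA
INFILE 'records.dat'
APPEND
INTO TABLE records
FIELDS TERMINATED BY ','
(id, payload)
```

Invoked with something like `sqlldr userid=user/pass control=loader.ctl direct=true`, where `direct=true` requests a direct-path load that bypasses much of the conventional SQL engine overhead, which is why it tends to be much faster than row-by-row inserts from JDBC.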