


Data Migration

vivek makode
Greenhorn

Joined: Oct 01, 2001
Posts: 27
Hi,
I have an example.csv file (comma-separated values) containing millions of records generated from the SAP database.
I want to read this file and insert the same records into a Firebird database.
If I use my current JSP-based application to read and insert the records, the application terminates with a ServletException after a few thousand records.
If I use a standalone Java program to read and insert them, it terminates with an OutOfMemoryError.
What is the professional and best way to do this? Should I read the file part by part and insert accordingly?
Thanks in advance

Jeanne Boyarsky
internet detective
Marshal

Joined: May 26, 2003
Posts: 30356

Vivek,
Yes, you should read the file in batches and insert a bunch of records at a time. If you commit after each batch, you shouldn't have any problems, because the data won't all be held in memory.
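
For example, here is a minimal sketch of that approach using a PreparedStatement with addBatch()/executeBatch() and a commit after every batch. The JDBC URL, credentials, table name, and column layout are placeholders (it assumes the Jaybird driver for Firebird and a simple two-column table), so adjust them to your actual schema:

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CsvBatchLoader {

    private static final int BATCH_SIZE = 1000;

    public static void main(String[] args) throws Exception {
        // Placeholder connection details for Firebird via the Jaybird JDBC driver.
        String url = "jdbc:firebirdsql://localhost:3050/C:/data/example.fdb";

        try (Connection con = DriverManager.getConnection(url, "sysdba", "masterkey");
             BufferedReader in = new BufferedReader(new FileReader("example.csv"))) {

            // Turn off auto-commit so we control when each batch is flushed.
            con.setAutoCommit(false);

            // Hypothetical target table and columns; match these to the real SAP export.
            try (PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO sap_records (id, name) VALUES (?, ?)")) {

                String line;
                int count = 0;
                while ((line = in.readLine()) != null) {
                    // Naive split; a real CSV parser is safer if fields can contain commas.
                    String[] fields = line.split(",");
                    ps.setLong(1, Long.parseLong(fields[0].trim()));
                    ps.setString(2, fields[1].trim());
                    ps.addBatch();

                    if (++count % BATCH_SIZE == 0) {
                        ps.executeBatch(); // send this batch to the database
                        con.commit();      // nothing accumulates in memory between batches
                    }
                }
                ps.executeBatch();         // flush the final partial batch
                con.commit();
            }
        }
    }
}

Because the file is read line by line and only BATCH_SIZE rows are buffered at any moment, the program never holds millions of records in memory, which is what was causing the OutOfMemoryError.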


 