Handling massive number of records

Hussein Baghdadi
clojure forum advocate
Bartender

Joined: Nov 08, 2003
Posts: 3479

Hey,
In our application we have to read one day's worth of records from a MySQL database, make a small change to each retrieved record, and then save the records to a new database.
Please note that each day holds a massive number of records (it could be a million records).
What is the best strategy to implement this?
I'm thinking of using the MySQL LIMIT keyword to read 100 rows at a time. Do you have a better approach?
And what about saving the retrieved/modified rows, 100 at a time?
Using a JDBC batch?
Thanks.
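
Roughly, this is the loop I have in mind. The connection URLs, the records/records_copy tables, and the id, day and payload columns below are just placeholders, and I'm paging with WHERE id > lastId ... LIMIT instead of a growing OFFSET so that later chunks don't get slower:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class DailyRecordCopier {

    private static final int CHUNK_SIZE = 100;

    public static void copyDay(String day) throws SQLException {
        // Hypothetical connection URLs, credentials, table and column names.
        try (Connection src = DriverManager.getConnection(
                     "jdbc:mysql://source-host/sourcedb", "user", "pass");
             Connection dst = DriverManager.getConnection(
                     "jdbc:mysql://target-host/targetdb", "user", "pass")) {

            dst.setAutoCommit(false); // commit once per chunk, not per row

            String select = "SELECT id, payload FROM records "
                          + "WHERE day = ? AND id > ? ORDER BY id LIMIT ?";
            String insert = "INSERT INTO records_copy (id, payload) VALUES (?, ?)";

            try (PreparedStatement read = src.prepareStatement(select);
                 PreparedStatement write = dst.prepareStatement(insert)) {

                long lastId = 0;
                boolean more = true;
                while (more) {
                    read.setString(1, day);
                    read.setLong(2, lastId);
                    read.setInt(3, CHUNK_SIZE);

                    more = false;
                    try (ResultSet rs = read.executeQuery()) {
                        while (rs.next()) {
                            more = true;
                            lastId = rs.getLong("id");
                            String modified = transform(rs.getString("payload"));
                            write.setLong(1, lastId);
                            write.setString(2, modified);
                            write.addBatch();   // queue the modified row
                        }
                    }
                    write.executeBatch();       // flush one chunk of inserts
                    dst.commit();
                }
            }
        }
    }

    // Stand-in for the "little change" made to each record.
    private static String transform(String payload) {
        return payload.trim();
    }
}

Each chunk is flushed with executeBatch() and committed before the next read, so memory stays bounded; it's the overall throughput of this approach that I'm unsure about.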
Scott Selikoff
Saloon Keeper

Joined: Oct 23, 2005
Posts: 3704
    

If it's in the millions of records, I hope you have the two databases physically sitting near each other, or network latency is going to be high. For a situation like that, though, you may want to consider backing up the source database into a single file, transferring the file to the target database machine, and then loading/converting it there.

It seems like a lot of work, but databases are known to thrash with a high number of records, and getting JDBC to do batched operations optimally can be hard.
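
If you do end up pushing it through JDBC anyway, two MySQL Connector/J settings usually matter most: the rewriteBatchedStatements=true URL property on the insert side (so batched single-row INSERTs get rewritten into multi-row INSERTs) and streaming result sets on the read side. Rough sketch only, with made-up hosts, table and column names:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class MySqlBulkSettings {

    // Connector/J can rewrite addBatch()ed single-row INSERTs into multi-row
    // INSERTs when this URL property is set; without it a "batch" is often
    // just a loop of individual statements over the network.
    static Connection openTargetConnection() throws SQLException {
        return DriverManager.getConnection(
                "jdbc:mysql://target-host/targetdb?rewriteBatchedStatements=true",
                "user", "pass");
    }

    // On the read side, ask Connector/J to stream rows instead of buffering
    // the whole result set in memory (MySQL-specific fetch-size hint).
    static ResultSet openStreamingQuery(Connection src, String day) throws SQLException {
        PreparedStatement ps = src.prepareStatement(
                "SELECT id, payload FROM records WHERE day = ?",
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        ps.setFetchSize(Integer.MIN_VALUE);
        ps.setString(1, day);
        return ps.executeQuery();
    }
}

Without the streaming hint, Connector/J pulls the entire result set into memory before handing you the first row, which is exactly where a million-row day starts to hurt.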


My Blog: Down Home Country Coding with Scott Selikoff
 