JavaRanch » Java Forums » Databases » JDBC and Relational Databases

Handling massive number of records

Hussein Baghdadi
clojure forum advocate

Joined: Nov 08, 2003
Posts: 3479

In our application we have to read one day's worth of records from a MySQL database, make a small change to each retrieved record, and then save the records to a new database.
Please note that each day holds a massive number of records (it could be a million records).
What is the best strategy to implement this?
I'm thinking of using the MySQL LIMIT keyword to read 100 rows at a time; do you have a better approach?
And what about saving the retrieved/modified rows, 100 at a time?
Using a batch?
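For what it's worth, here is a minimal sketch of the chunked read-transform-write idea described above. The schema (`records(id, day, payload)`), table names, and the `transform` method are made up for illustration. It uses keyset paging (`WHERE id > ? ORDER BY id LIMIT n`) rather than `LIMIT offset, count`, because large offsets force MySQL to scan and discard all the skipped rows on every page, and it writes each chunk with a single JDBC batch:

```java
import java.sql.*;

public class Migrator {
    static final int CHUNK = 1000;

    // Pure helper: how many batches a day's worth of rows needs.
    static int batchesNeeded(long totalRows, int chunkSize) {
        return (int) ((totalRows + chunkSize - 1) / chunkSize);
    }

    // Stand-in for the real per-record change.
    static String transform(String payload) {
        return payload.trim();
    }

    // Hypothetical schema: records(id BIGINT PK, day DATE, payload VARCHAR).
    static void migrate(Connection src, Connection dst, Date day) throws SQLException {
        dst.setAutoCommit(false);
        long lastId = 0;
        try (PreparedStatement read = src.prepareStatement(
                 "SELECT id, payload FROM records"
                 + " WHERE day = ? AND id > ? ORDER BY id LIMIT " + CHUNK);
             PreparedStatement write = dst.prepareStatement(
                 "INSERT INTO records_out (id, payload) VALUES (?, ?)")) {
            while (true) {
                read.setDate(1, day);
                read.setLong(2, lastId);          // resume after the last id seen
                int rowsInChunk = 0;
                try (ResultSet rs = read.executeQuery()) {
                    while (rs.next()) {
                        lastId = rs.getLong("id");
                        write.setLong(1, lastId);
                        write.setString(2, transform(rs.getString("payload")));
                        write.addBatch();
                        rowsInChunk++;
                    }
                }
                if (rowsInChunk == 0) break;      // no more rows for this day
                write.executeBatch();             // one round trip per chunk
                dst.commit();                     // keep each transaction small
            }
        }
    }
}
```

With the MySQL Connector/J driver, adding `rewriteBatchedStatements=true` to the JDBC URL lets the driver collapse each batch into multi-row INSERTs, which usually helps a lot at this scale.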
Scott Selikoff
Saloon Keeper

Joined: Oct 23, 2005
Posts: 3753

If it's millions of records, I hope you have the two databases physically sitting near each other, or network latency is going to be high. For such a situation, though, you may consider backing up the database into a single file, transferring the file to the target database machine, then loading/converting it there.

It seems like a lot of work, but databases are known to thrash when handling large numbers of records, and getting JDBC to do batch inserts optimally can be hard.

[OCA 8 Book] [Blog]
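The dump-transfer-load route above might look something like the following sketch. The host names, database/table names, and the `day` column are placeholders; it is an ops fragment, not something to run as-is:

```shell
# Dump only the day in question as a single file (--single-transaction
# gives a consistent snapshot for InnoDB without locking the table).
mysqldump --single-transaction \
  --where="day='2010-05-01'" sourcedb records > day.sql

# Move the file to the target machine, then load it there.
scp day.sql target-host:/tmp/day.sql
ssh target-host "mysql targetdb < /tmp/day.sql"
```

The per-record transformation would then be done on the target side, e.g. with a single `UPDATE` statement, which keeps all the heavy row traffic off the network.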