
Handling massive number of records

 
Hussein Baghdadi
clojure forum advocate
Bartender
Posts: 3479
Hey,
In our application we have to read one day's records from a MySQL database, make a small change to each retrieved record, and then save the records to a new database.
Please note that each day holds a massive number of records (could be a million records).
What is the best strategy to implement this?
I'm thinking of using the MySQL LIMIT keyword to read 100 rows at a time; do you have a better approach?
And what about saving each batch of 100 retrieved/modified rows?
Using a batch?
Thanks.
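
As a rough sketch of the LIMIT-plus-batch idea (all connection URLs, the daily_records / daily_records_copy table names, the id and payload columns, and the transform() method are hypothetical placeholders, not anything from a real schema):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class DailyRecordCopier {

    private static final int PAGE_SIZE = 100;

    public static void main(String[] args) throws SQLException {
        try (Connection src = DriverManager.getConnection(
                     "jdbc:mysql://source-host/source_db", "user", "pass");
             Connection dst = DriverManager.getConnection(
                     "jdbc:mysql://target-host/target_db", "user", "pass")) {

            dst.setAutoCommit(false);

            String select = "SELECT id, payload FROM daily_records "
                          + "WHERE record_date = ? ORDER BY id LIMIT ? OFFSET ?";
            String insert = "INSERT INTO daily_records_copy (id, payload) VALUES (?, ?)";

            try (PreparedStatement read = src.prepareStatement(select);
                 PreparedStatement write = dst.prepareStatement(insert)) {

                int offset = 0;
                while (true) {
                    read.setString(1, "2011-05-30");    // the day being migrated
                    read.setInt(2, PAGE_SIZE);
                    read.setInt(3, offset);

                    int rowsInPage = 0;
                    try (ResultSet rs = read.executeQuery()) {
                        while (rs.next()) {
                            write.setLong(1, rs.getLong("id"));
                            // apply the "little change" to each record here
                            write.setString(2, transform(rs.getString("payload")));
                            write.addBatch();
                            rowsInPage++;
                        }
                    }
                    if (rowsInPage == 0) {
                        break;                          // no more rows for that day
                    }
                    write.executeBatch();               // one batched INSERT per page
                    dst.commit();                       // commit per page, not per row
                    offset += rowsInPage;
                }
            }
        }
    }

    private static String transform(String payload) {
        return payload.trim();                          // placeholder for the real change
    }
}

One thing to keep in mind: LIMIT with a growing OFFSET gets slower as the offset increases, because MySQL still has to scan past the skipped rows. For millions of rows, paging by the last seen primary key (WHERE id > ? ORDER BY id LIMIT 100) is usually faster and also safer if the source table is being written to while you copy.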
 
Scott Selikoff
author
Saloon Keeper
Posts: 4014
If it's millions of records, I hope you have the two databases physically sitting near each other, or network latency is going to be high. For such a situation, though, you may consider backing up the database into a single file, transferring the file to the target database machine, then loading/converting it there.

It seems like a lot of work, but databases are known to thrash with high numbers of records, and getting JDBC to do batch inserts optimally can be hard.
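
If you do stay on the JDBC route, two MySQL Connector/J settings are worth knowing when row counts get this large. Here is a minimal read-side sketch (the table and columns are the same hypothetical ones as above):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class StreamingReadSketch {
    public static void main(String[] args) throws Exception {
        try (Connection src = DriverManager.getConnection(
                     "jdbc:mysql://source-host/source_db", "user", "pass");
             // streaming requires a forward-only, read-only statement
             Statement stmt = src.createStatement(ResultSet.TYPE_FORWARD_ONLY,
                                                  ResultSet.CONCUR_READ_ONLY)) {

            // With MySQL Connector/J this value switches the driver to streaming
            // mode, so rows are fetched one at a time instead of the whole
            // result set being buffered in memory.
            stmt.setFetchSize(Integer.MIN_VALUE);

            try (ResultSet rs = stmt.executeQuery(
                    "SELECT id, payload FROM daily_records")) {  // hypothetical table
                while (rs.next()) {
                    // transform and hand each row to the batched writer here
                }
            }
        }
    }
}

On the writing connection, appending rewriteBatchedStatements=true to the JDBC URL lets Connector/J rewrite a batch of single-row INSERTs into multi-row statements, which usually improves batch insert throughput considerably.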
 