Team, I have a scenario with the following: 1. A large number of users (6 million or more). 2. A lot of data per user (10 tables, each containing around 25 rows per user). For each user I need to transform the data in all of those rows. The problem: I want to read 50 users at a time and loop through the transformations, then read 50 more users and do the same until they are all done. I am confused as to what to use. 1. Scrollable ResultSet: will this, along with cursors, do the job? That is, will it read only 50 users at a time, do the transformations, and then read the next 50 users? 2. Or should I use setFetchSize? Will that read only 50 rows at a time? I do know that this is best done in SQL, but the client is hell-bent on doing this through a Java program. regards Suchak Jani
I don't see what a scrollable ResultSet would buy you, as you can simply stream straight through the users you're processing -- there is no need to hop back and forth through your ResultSet. I'm not sure that setting the fetch size would buy you a lot either, but it's worth experimenting with. Are you planning to select the data for all users and commit your transaction every 50 users, holding the ResultSets open over the commit (which assumes a JDBC 3.0 driver)? Or are you selecting 50 users at a time? Whatever you do, use batch updates. - Peter [ May 21, 2003: Message edited by: Peter den Haan ]
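To make the streaming-plus-batching idea concrete, here is a minimal sketch. The table and column names (`user_data`, `user_id`, `payload`), the `transform` method, and the chunk size of 50 are all assumptions for illustration; `setFetchSize(50)` is only a hint to the driver, and committing while the ResultSet is still open requires a driver that supports result-set holdability over commit (JDBC 3.0).

```java
import java.sql.*;

public class UserTransformer {

    // How many passes of `chunkSize` are needed to cover `total` users.
    static int chunkCount(int total, int chunkSize) {
        return (total + chunkSize - 1) / chunkSize;
    }

    // Stand-in for the real per-row transformation (hypothetical).
    static String transform(String payload) {
        return payload.toUpperCase();
    }

    // Stream through all rows once, batching updates and committing
    // every 50 rows instead of scrolling back and forth.
    static void transformAll(Connection con) throws SQLException {
        con.setAutoCommit(false);
        try (PreparedStatement select = con.prepareStatement(
                 "SELECT user_id, payload FROM user_data ORDER BY user_id");
             PreparedStatement update = con.prepareStatement(
                 "UPDATE user_data SET payload = ? WHERE user_id = ?")) {
            select.setFetchSize(50); // hint: fetch ~50 rows per round trip
            try (ResultSet rs = select.executeQuery()) {
                int inBatch = 0;
                while (rs.next()) {
                    update.setString(1, transform(rs.getString("payload")));
                    update.setLong(2, rs.getLong("user_id"));
                    update.addBatch(); // batch updates, as advised above
                    if (++inBatch == 50) {
                        update.executeBatch();
                        con.commit(); // needs HOLD_CURSORS_OVER_COMMIT
                        inBatch = 0;
                    }
                }
                if (inBatch > 0) { // flush the final partial batch
                    update.executeBatch();
                    con.commit();
                }
            }
        }
    }
}
```

At 6 million users in chunks of 50 this works out to 120,000 commits, so it may also be worth experimenting with a larger batch/commit size.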