
limit result through JDBC

 
avihai marchiano
Ranch Hand
Posts: 342


I need to go over around 100,000 rows (each row has 7 columns, from a join between tables) in batches of 500-1000, run my processing (around 20 seconds per batch), and then fetch the next batch.

My question:

From a performance perspective, should I leave the result set open, or create a new result set for each batch and position it (with absolute()) just after the last row of the previous batch?

I understand that when you use absolute() the database iterates over all rows up to that point. Doesn't the database have any algorithm to do this without iterating over all of the preceding rows?
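The fetch-process-repeat loop being described might be sketched like this. Here `fetchBatch` is a hypothetical stand-in for the real JDBC query, paging through an in-memory "table" so the shape of the loop is clear without a database:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the chunked processing loop described above. In the real code the
// rows would come from a JDBC query; fetchBatch() is a hypothetical stand-in.
public class BatchLoop {
    static final int TOTAL_ROWS = 100_000;
    static final int BATCH_SIZE = 1_000;

    // Hypothetical data source: returns rows [offset, offset + limit).
    static List<Integer> fetchBatch(int offset, int limit) {
        List<Integer> batch = new ArrayList<>();
        for (int i = offset; i < Math.min(offset + limit, TOTAL_ROWS); i++) {
            batch.add(i);
        }
        return batch;
    }

    public static long processAll() {
        long processed = 0;
        int offset = 0;
        while (true) {
            List<Integer> batch = fetchBatch(offset, BATCH_SIZE);
            if (batch.isEmpty()) {
                break;                     // no more rows
            }
            processed += batch.size();     // the ~20-second processing goes here
            offset += batch.size();        // resume after the last batch row
        }
        return processed;
    }

    public static void main(String[] args) {
        System.out.println(processAll() + " rows processed");
    }
}
```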

Thank you
 
Arunkumar Subramaniam
Greenhorn
Posts: 8
Hi

You can use a CachedRowSet to achieve your goal.
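A minimal sketch of the CachedRowSet approach, assuming the JDK's javax.sql.rowset API. Normally you would run the query and call crs.populate(resultSet); here the rows are inserted by hand so the example runs without a database:

```java
import java.sql.SQLException;
import java.sql.Types;
import javax.sql.rowset.CachedRowSet;
import javax.sql.rowset.RowSetMetaDataImpl;
import javax.sql.rowset.RowSetProvider;

public class CachedRowSetDemo {

    // Builds a small CachedRowSet by hand. In real code you would instead do:
    //   crs.populate(statement.executeQuery(...));
    // and could then close the connection -- the rows stay cached in memory.
    public static CachedRowSet buildRowSet() {
        try {
            CachedRowSet crs = RowSetProvider.newFactory().createCachedRowSet();

            RowSetMetaDataImpl md = new RowSetMetaDataImpl();
            md.setColumnCount(1);
            md.setColumnName(1, "id");
            md.setColumnType(1, Types.INTEGER);
            crs.setMetaData(md);

            for (int i = 1; i <= 5; i++) {   // stand-in for populate(resultSet)
                crs.moveToInsertRow();
                crs.updateInt(1, i);
                crs.insertRow();
            }
            crs.moveToCurrentRow();
            crs.beforeFirst();
            return crs;
        } catch (SQLException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) throws SQLException {
        CachedRowSet crs = buildRowSet();
        // Scrolling works offline because every row is held in memory.
        int count = 0;
        while (crs.next()) {
            count++;
        }
        System.out.println(count + " cached rows");
    }
}
```

Note that a CachedRowSet holds all of its rows in memory, so for a very large result set it should be populated one page at a time (see its setPageSize/nextPage methods).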


Arunkumar S
 
avihai marchiano
Ranch Hand
Posts: 342
If you have 1,000 rows in the result set, will it copy all of the rows into memory?
 
Jeanne Boyarsky
author & internet detective
Marshal
Posts: 34095
337
Eclipse IDE Java VI Editor
Or you could use the SQL query itself to return only the rows you want, so each query is independent. Then you don't have to worry about holding resources open between requests. You could use rownum or the primary key of your table in the query.
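That primary-key approach might be sketched like this. The real query would be something along the lines of `SELECT ... FROM t WHERE id > ? ORDER BY id` with a row limit (the exact limit syntax varies by database); here an in-memory list stands in for the table so the loop is runnable:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

// Sketch of paging by primary key: each batch is fetched by an independent
// query that resumes after the last id seen, so no result set stays open.
public class KeysetPaging {
    // Stand-in for the table: ids 1..100,000, ordered by primary key.
    static final List<Integer> TABLE =
            IntStream.rangeClosed(1, 100_000).boxed().collect(Collectors.toList());

    // Stand-in for: SELECT ... FROM t WHERE id > ? ORDER BY id (limit pageSize)
    static List<Integer> nextPage(int lastSeenId, int pageSize) {
        return TABLE.stream()
                .filter(id -> id > lastSeenId)
                .limit(pageSize)
                .collect(Collectors.toList());
    }

    public static int countAllPages() {
        int pages = 0;
        int lastSeenId = 0;
        List<Integer> page;
        while (!(page = nextPage(lastSeenId, 1_000)).isEmpty()) {
            pages++;                                  // process the batch here
            lastSeenId = page.get(page.size() - 1);   // resume after last row
        }
        return pages;
    }

    public static void main(String[] args) {
        System.out.println(countAllPages() + " pages");
    }
}
```

Because each query seeks directly to `id > lastSeenId` through the primary-key index, the database does not have to re-scan all of the preceding rows the way absolute() positioning does.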
 