Unable to Iterate over large number of records

Raj Murthi
Greenhorn

Joined: Aug 29, 2005
Posts: 14
Hi All,

Please take a moment to share your views/comments on the following use case.

There are 300,000 records in the database table, but when I query the table for all of them, only 99,999 records come back.

It seems there is a limit on the number of records that can be fetched from the result set. Is this a driver restriction?

Database:- Sybase
Driver:- com.sybase.jdbc3.jdbc.SybDriver
JDK 1.5

I even tried Statement.setMaxRows(300000), but that didn't work out either; it continued to fetch only 99,999 records.
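
Here is a simplified sketch of what I am doing (host, database, table, and column names changed):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchAllRows {
    public static void main(String[] args) throws Exception {
        Class.forName("com.sybase.jdbc3.jdbc.SybDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:sybase:Tds:myhost:5000/mydb", "user", "password");
        Statement stmt = conn.createStatement();
        stmt.setMaxRows(300000); // tried with and without this; no difference
        ResultSet rs = stmt.executeQuery("SELECT id, name FROM big_table");
        int count = 0;
        while (rs.next()) {
            count++;
        }
        System.out.println("Rows fetched: " + count); // always prints 99999
        rs.close();
        stmt.close();
        conn.close();
    }
}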

Is anyone aware of any such issues/limitations and is there any solution?
Please share your ideas.

Thanks.
Scott Selikoff
author
Saloon Keeper

Joined: Oct 23, 2005
Posts: 3716
    

Have you tried calling setFetchSize(300000)?

As a design suggestion, perhaps you could reorganize the query so that you don't have to fetch 300,000 rows? That could be a substantial amount of data (depending on the size of each record) that, if fetched, could seriously strain network bandwidth and JVM memory. I would suggest design solutions (if possible) that have you fetching far fewer records at a time; see the sketch below.
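
Something along these lines shows both knobs (a rough sketch, untested against jConnect; big_table is a placeholder):

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class StreamRows {
    // Sketch: clear any max-rows limit and hint a modest fetch size so the
    // driver can stream rows in blocks instead of buffering everything.
    static int countAllRows(Connection conn) throws SQLException {
        Statement stmt = conn.createStatement();
        stmt.setMaxRows(0);       // 0 means "no limit" per the JDBC javadoc
        stmt.setFetchSize(1000);  // fetch size is only a hint to the driver
        ResultSet rs = stmt.executeQuery("SELECT id FROM big_table");
        int count = 0;
        while (rs.next()) {
            count++;
        }
        rs.close();
        stmt.close();
        return count;
    }
}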


My Blog: Down Home Country Coding with Scott Selikoff
Scott Selikoff
author
Saloon Keeper

Joined: Oct 23, 2005
Posts: 3716
    

As a side note, this is a good example of a case where stored procedures can be used to perform server-side operations that would be too costly to run through an application server.
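
For example, something along these lines, where the procedure name and parameters are made up and the heavy lifting happens inside the database:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Types;

public class SummaryCall {
    // Hypothetical stored procedure that aggregates rows server-side,
    // so only the summary value crosses the network.
    static int runSummary(Connection conn, int year) throws SQLException {
        CallableStatement cs = conn.prepareCall("{call summarize_orders(?, ?)}");
        cs.setInt(1, year);
        cs.registerOutParameter(2, Types.INTEGER);
        cs.execute();
        int total = cs.getInt(2);
        cs.close();
        return total;
    }
}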
Raj Murthi
Greenhorn

Joined: Aug 29, 2005
Posts: 14
I did try calling setFetchSize(300000), but it seems that is not working either.

Yes, reorganizing the query is a workaround, something like the paging sketch below. But is it really a jConnect driver limitation, or am I missing some tweak?
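
By reorganizing I mean paging on the primary key, roughly like this (a sketch; TOP may not be available on older ASE versions, where SET ROWCOUNT would be the alternative):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class KeysetPager {
    // Walk big_table in pages of 1000 rows, keyed on the primary key,
    // so no single result set ever approaches the 99999 ceiling.
    static void readInPages(Connection conn) throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "SELECT TOP 1000 id, name FROM big_table WHERE id > ? ORDER BY id");
        long lastId = 0;
        boolean more = true;
        while (more) {
            ps.setLong(1, lastId);
            ResultSet rs = ps.executeQuery();
            more = false;
            while (rs.next()) {
                lastId = rs.getLong(1); // remember where this page ended
                more = true;
                // ... process the row here ...
            }
            rs.close();
        }
        ps.close();
    }
}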
Wei Dai
Ranch Hand

Joined: Jun 22, 2005
Posts: 86
>> There are 300,000 records in the database table, but when I query the table for all of them, only 99,999 records come back.
> But is it really a jConnect driver limitation, or am I missing some tweak?
It is probably a limitation of the jConnect driver.
Scott Selikoff
author
Saloon Keeper

Joined: Oct 23, 2005
Posts: 3716
    

Originally posted by Raj Murthi:
Yes, reorganizing the query is a workaround


I'm not sure this really constitutes a 'workaround'. As I said earlier, even if the query were allowed to execute, the performance impact in most systems would be awful. Customers would be calling you every five minutes saying the system is hung.

The database world is less pure than the Java world in that you always have to take into account the performance impact of every query you write. (You have to do this in Java too, but it's less of an issue since the runtime can organize a lot of it for you.) I'll give you a good example of when I encountered this in real life...


I was using a provided web architecture that performed inserts into a database table and, as input, took an array of records to insert. I passed in about 200,000 records, assuming the architecture would be smart enough to organize this for me, but it was not: it was making a single database connection for each record. This resulted in terrible performance; it took *20 minutes* to import all the records.

Next, I tried forcing it to do the entire insert in a single database call. That was better in that it was only one call, but it carried a ton of data, which strained bandwidth and meant the server had to keep a connection open for a very long time; the web server I was using often crashed with timeout errors. Even when it succeeded, it still took about *5 minutes*.

In a final attempt, I wrote a chunking mechanism that inserted 1,000 records at a time, making 200 calls of 1,000 records each. This took *25 seconds* to insert all 200,000 records and never crashed. The logic was roughly what is sketched below.
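
Reconstructed from memory, with made-up table and column names, the chunked batch insert looked something like this:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class ChunkedInsert {
    private static final int CHUNK_SIZE = 1000;

    // Insert ids[i]/payloads[i] pairs, flushing a batch every CHUNK_SIZE rows
    // so each call to the database carries a bounded amount of data.
    static void insertInChunks(Connection conn, long[] ids, String[] payloads)
            throws SQLException {
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO records (id, payload) VALUES (?, ?)");
        try {
            for (int i = 0; i < ids.length; i++) {
                ps.setLong(1, ids[i]);
                ps.setString(2, payloads[i]);
                ps.addBatch();
                if ((i + 1) % CHUNK_SIZE == 0 || i == ids.length - 1) {
                    ps.executeBatch(); // one round trip per 1,000 rows
                }
            }
        } finally {
            ps.close();
        }
    }
}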
 
 