Large data query hangs indefinitely

laila NiHai
Ranch Hand

Joined: Oct 18, 2010
Posts: 35
Hi, I'm new to DB development, so please bear with me.

I need to query a large data set (potentially in the 100K range). Using NetBeans, the result is returned without issues; however, when I run the same query on the command line, a "2" is returned and it hangs indefinitely. In the application, the same query returns a result set and I can process the data objects... but only up to a certain row, then it just hangs without any error. I'm trying to take this large result set and put it in a JTable. I've explored Java Binding, but that doesn't seem to work either. I've spent countless hours on this... please point me in the right direction on how to efficiently query and display a large data set in a JTable.

I would really appreciate any pointers, because right now I feel clueless.
Thanks!
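
For the "hangs after a certain row" part, here is a minimal sketch of reading a large result set row by row with a fetch-size hint instead of caching every row. The JDBC URL, credentials, table and column names are invented for illustration, and setFetchSize is only a hint whose effect depends on the driver:

import java.sql.*;

public class StreamRows {
    public static void main(String[] args) throws SQLException {
        // Placeholder URL and credentials; substitute your own.
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
             Statement st = con.createStatement()) {
            st.setFetchSize(500); // hint: fetch rows in batches instead of all at once
            try (ResultSet rs = st.executeQuery("SELECT id, name FROM big_table")) {
                while (rs.next()) {
                    // Handle one row at a time; avoid keeping every row in memory.
                    long id = rs.getLong("id");
                    String name = rs.getString("name");
                    System.out.println(id + " " + name);
                }
            }
        }
    }
}
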
Jan Cumps
Bartender

Joined: Dec 20, 2006
Posts: 2500
    

Ah, that 2 means your command-line client is waiting for a second line of input; it thinks your statement isn't finished yet.

What happens if you enter a ; or a / on line 2?

OCUP UML fundamental and ITIL foundation
youtube channel
laila NiHai
Ranch Hand

Joined: Oct 18, 2010
Posts: 35
I added a ; at the end of the query on the command line and it works!! Can't believe I left that out. Thank you, Jan Cumps!

The query in my application works, but processing of the result set stops after a certain record. How can I improve this? I read that it has to do with the VM memory, but I'm not sure how to adjust it.
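
As a quick check of what the VM actually has to work with, the maximum heap can be printed before loading the data; the limit itself can only be raised when the JVM is started (for example with the standard -Xmx option). A minimal sketch:

// Prints the current and maximum heap of the running JVM.
// The maximum can be raised at startup, e.g. java -Xmx512m YourApp.
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        System.out.println("Max heap:   " + rt.maxMemory() / mb + " MB");
        System.out.println("Total heap: " + rt.totalMemory() / mb + " MB");
        System.out.println("Free heap:  " + rt.freeMemory() / mb + " MB");
    }
}
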
Jan Cumps
Bartender

Joined: Dec 20, 2006
Posts: 2500
    

You don't want to show 100K records in a JTable. No user will ever scroll through that.
Restrict the number of records you show to a number that makes sense (maybe 50?).

Read our article on Pagination: How do I limit the number of rows displayed in the results page?
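
The pagination idea looks roughly like this in JDBC. This is only a sketch, assuming a database that supports LIMIT/OFFSET and an invented big_table with id and name columns:

import java.sql.*;
import java.util.*;

class PageLoader {
    // Returns one page of (id, name) rows; LIMIT/OFFSET syntax varies by database.
    static List<Object[]> loadPage(Connection con, int pageNumber, int pageSize) throws SQLException {
        String sql = "SELECT id, name FROM big_table ORDER BY id LIMIT ? OFFSET ?";
        List<Object[]> rows = new ArrayList<>();
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setInt(1, pageSize);
            ps.setInt(2, pageNumber * pageSize);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    rows.add(new Object[] { rs.getLong("id"), rs.getString("name") });
                }
            }
        }
        return rows;
    }
}

Only one page of rows is ever held in memory; the UI simply asks for the next page number as the user navigates.
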


laila NiHai
Ranch Hand

Joined: Oct 18, 2010
Posts: 35
Thanks for the reference, but pagination is not an option in the requirement. A thread is required to capture all the records at a given point in time; the data might be changed by another thread while we wait for a request for the next set of rows. The user may not see 100K records all at once, but that data still needs to be captured from the DB. I can't think of a better way to do this than to get all the data and store it locally...
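
If the whole snapshot really does have to be pulled in one go, one common way to keep the Swing UI from freezing is to load it on a background thread and hand the finished model to the JTable afterwards. A rough sketch, assuming a two-column (id, name) result from an invented big_table:

import java.sql.*;
import javax.swing.*;
import javax.swing.table.DefaultTableModel;

class SnapshotLoader extends SwingWorker<DefaultTableModel, Void> {
    private final Connection con; // used only on the background thread here
    private final JTable table;

    SnapshotLoader(Connection con, JTable table) {
        this.con = con;
        this.table = table;
    }

    @Override
    protected DefaultTableModel doInBackground() throws Exception {
        // Runs off the Event Dispatch Thread, so the UI stays responsive while loading.
        DefaultTableModel model = new DefaultTableModel(new Object[] { "id", "name" }, 0);
        try (Statement st = con.createStatement()) {
            st.setFetchSize(500); // hint only; driver support varies
            try (ResultSet rs = st.executeQuery("SELECT id, name FROM big_table")) {
                while (rs.next()) {
                    model.addRow(new Object[] { rs.getLong("id"), rs.getString("name") });
                }
            }
        }
        return model;
    }

    @Override
    protected void done() {
        try {
            table.setModel(get()); // back on the EDT: swap in the finished model
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

It would be started with new SnapshotLoader(con, table).execute(); note that holding 100K+ rows in the model still needs enough heap, which is where the memory check above comes in.
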
Jan Cumps
Bartender

Joined: Dec 20, 2006
Posts: 2500
    

Sorry, I don't see that working.
Have you measured the time needed to retrieve that many records, and the memory needed to store them?
Ravi Kiran Va
Ranch Hand

Joined: Apr 18, 2009
Posts: 2234

laila NiHai wrote:
...but only up to a certain row, then it just hangs without any error.

Hi all, I just want to ask: can any profiling tool or memory tool help in this case, or is pagination or extending the heap space the only solution for this? Thanks.


Save India From Corruption - Anna Hazare.
Jan Cumps
Bartender

Joined: Dec 20, 2006
Posts: 2500
    

Ravi Kiran Va wrote:
...but only up to a certain row, then it just hangs without any error.

Hi all, I just want to ask: can any profiling tool or memory tool help in this case, or is pagination or extending the heap space the only solution for this? Thanks.

Did you read what the original issue was, and how we resolved it? No profiling or memory tool will ever find that you did not end an SQL statement with a semicolon.
Ravi Kiran Va
Ranch Hand

Joined: Apr 18, 2009
Posts: 2234

Jan Cumps wrote:
Did you read what the original issue was, and how we resolved it? No profiling or memory tool will ever find that you did not end an SQL statement with a semicolon.

Didn't you read "or is pagination or extending the heap space the only solution for this? Thanks."
Jan Cumps
Bartender

Joined: Dec 20, 2006
Posts: 2500
    

No I didn't.

Pagination might help.
Increasing heap space will only help if the number of records doesn't increase. What if the table grows to 150K, or 800K? There is no filter, so the program would consume more memory over time.


There are other solutions:
- take a step back and reconsider what you are trying to achieve; is the requirement really "fetch all records", or is it something else?
- try to restrict the number of records that need to be fetched by using WHERE clauses (a small sketch follows at the end of this post).
- reject the requirement because it cannot be achieved
- move the functionality into the database (a stored procedure?)
- ?

You can profile and optimize something that could work in principle but has resource issues; profiling can't make an unachievable requirement achievable.
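
To illustrate the WHERE-clause option from the list above, here is a minimal sketch using a bind variable; the updated_at column and the idea of filtering on a timestamp are invented purely for the example:

import java.sql.*;

class RecentRowsQuery {
    // Sketch only: fetch just the rows actually needed, e.g. those changed
    // after a given timestamp, instead of the whole table.
    static void printRecentRows(Connection con, Timestamp since) throws SQLException {
        String sql = "SELECT id, name FROM big_table WHERE updated_at > ? ORDER BY id";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setTimestamp(1, since);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("id") + " " + rs.getString("name"));
                }
            }
        }
    }
}
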