Hi, I'm new to DB development, so please bear with me.
I need to query a large data set (potentially in the 100K-row range). Using NetBeans, the query returns results without issues; however, running the same query on the command line, a "2" is returned and it hangs indefinitely. In my application, the same query returns a ResultSet and I can process the data objects, but only up to a certain row, and then it just hangs without any error. I'm trying to take this large ResultSet and put it in a JTable. I've explored Java Beans Binding, but that doesn't seem to work either. I've spent countless hours on this; please point me in the right direction on how to efficiently query and display a large data set in a JTable.
I would really appreciate any pointers, because right now I feel clueless.
Ah, that 2 means that your command line is expecting a second line.
What happens if you enter a ; or a / on line 2?
I added a ; at the end of the query and it works! Can't believe I left that out on the command line. Thank you, Jan Cumps!
The query in my application works, but processing of the ResultSet stops after a certain record. How can I improve this? I read that it has to do with the JVM memory, but I'm not sure how to fix it.
Pagination might help.

Thanks for the reference, but pagination is not an option per the requirement. It's required to capture all the records at a given time by a single thread; the data might be changed by another thread while waiting for a request for the next set of rows. The user may not see all 100K records at once, but that data still needs to be captured from the DB. I can't think of a better way to do this than to get all the data and store it locally.
Increasing heap space will only help if the number of records doesn't increase. What if the table grows to 150K, or 800K? There is no filter, so the program would consume more memory over time.
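To illustrate why growing the heap is a losing race: if every fetched record is materialized as full objects, memory grows linearly with the table. One alternative is to walk the result row by row and keep only the columns the table actually needs. This is a rough sketch, not tied to any driver: the `Row` record and its columns are invented for illustration, and in real code the iterator would wrap a JDBC ResultSet opened with `Statement.setFetchSize(...)` so the driver streams rows in batches instead of buffering them all.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Sketch: process rows one at a time, retaining only the display
// columns instead of every full record. "Row" and its columns are
// made up for illustration; a real implementation would iterate a
// JDBC ResultSet created with stmt.setFetchSize(...).
public class RowStreamer {

    // Minimal stand-in for one database record, including a heavy column.
    public record Row(int id, String name, byte[] blob) {}

    // Keep only the two display columns; drop the heavy blob.
    public static List<Object[]> collectDisplayColumns(Iterator<Row> rows) {
        List<Object[]> display = new ArrayList<>();
        while (rows.hasNext()) {
            Row r = rows.next();
            display.add(new Object[] { r.id(), r.name() });
            // r.blob() is never retained, so it is eligible for GC
            // as soon as the source discards it.
        }
        return display;
    }

    public static void main(String[] args) {
        List<Row> fake = new ArrayList<>();
        for (int i = 0; i < 5; i++) {
            fake.add(new Row(i, "name" + i, new byte[1024]));
        }
        List<Object[]> display = collectDisplayColumns(fake.iterator());
        System.out.println(display.size());
        System.out.println(display.get(0)[1]);
    }
}
```

This doesn't change how many rows arrive, but it caps the per-row cost at what the JTable actually displays, which is often the difference between fitting in the default heap and not.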
There are other solutions:
- Take a step back and reconsider what you are trying to achieve; is the requirement really "fetch all records", or is it something else?
- Try to restrict the number of records that need to be fetched by using WHERE clauses.
- Reject the requirement because it cannot be achieved.
- Move the functionality into the database (a stored procedure?).
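If the rows really must all live in the client, at least the JTable side can be kept cheap: a read-only AbstractTableModel over an already-fetched List<Object[]> avoids the extra copy that DefaultTableModel makes into its internal Vectors. A minimal sketch (the column names are invented for illustration):

```java
import java.util.List;
import javax.swing.table.AbstractTableModel;

// Read-only model over already-fetched rows. This avoids the extra
// copy DefaultTableModel would make into its internal Vectors.
// The column names here are invented for illustration.
public class LargeResultTableModel extends AbstractTableModel {

    private final String[] columns = { "ID", "Name" };
    private final List<Object[]> rows;

    public LargeResultTableModel(List<Object[]> rows) {
        this.rows = rows;
    }

    @Override public int getRowCount()    { return rows.size(); }
    @Override public int getColumnCount() { return columns.length; }
    @Override public String getColumnName(int col) { return columns[col]; }

    @Override
    public Object getValueAt(int rowIndex, int columnIndex) {
        return rows.get(rowIndex)[columnIndex];
    }
}
```

Usage would be `new JTable(new LargeResultTableModel(rows))`. Since the model wraps the list rather than copying it, the only per-row cost is the Object[] already fetched; and if rows are loaded on a background thread, `fireTableRowsInserted(...)` can publish them to the table in batches.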
You can profile and optimize something that could possibly work but has resource issues; profiling can't fix an unachievable requirement, though.