

JavaRanch » Java Forums » Databases » JDBC

Table having large volume of data

Ankit Tripathi
Ranch Hand

Joined: Oct 17, 2009
Posts: 198
Hi all,

I have to fetch data from a table that has about 10,00,00,000 (100 million) rows. When I run my program it fails with: Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

What can I do to avoid that?

My code is given below:



The VERTEX_TRF.ERROR_CODE_MAPPING (error rule) table contains only a few records (about 50), while the VERTEX_TRF.RISK_STATUS_DATA (error message) table has about 10,00,00,000 records. That means for each error rule the result set has to scan through 10,00,00,000 records, which seems to have huge memory requirements.

The exception trace is given below:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Vector.<init>(Vector.java:111)
at java.util.Vector.<init>(Vector.java:124)
at oracle.jdbc.driver.OracleResultSetCacheImpl.put(OracleResultSetCacheImpl.java:62)
at oracle.jdbc.driver.ScrollableResultSet.putCachedValueAt(ScrollableResultSet.java:2392)
at oracle.jdbc.driver.ScrollableResultSet.cacheCurrentRow(ScrollableResultSet.java:2073)
at oracle.jdbc.driver.ScrollableResultSet.cacheRowAt(ScrollableResultSet.java:2090)
at oracle.jdbc.driver.ScrollableResultSet.isValidRow(ScrollableResultSet.java:2038)
at oracle.jdbc.driver.ScrollableResultSet.get_refetch_size(ScrollableResultSet.java:2192)
at oracle.jdbc.driver.ScrollableResultSet.refreshRowsInCache(ScrollableResultSet.java:256)
at oracle.jdbc.driver.SensitiveScrollableResultSet.refreshRow(SensitiveScrollableResultSet.java:187)
at oracle.jdbc.driver.SensitiveScrollableResultSet.handle_refetch(SensitiveScrollableResultSet.java:275)
at oracle.jdbc.driver.SensitiveScrollableResultSet.next(SensitiveScrollableResultSet.java:68)
at oracle.jdbc.driver.UpdatableResultSet.next(UpdatableResultSet.java:330)
at com.ankit.DB.DatabaseFetcher.main(DatabaseFetcher.java:48)




I cannot find a way to resolve this problem. How can I optimize this?
Sudheer Bhat
Ranch Hand

Joined: Feb 22, 2011
Posts: 75
It's just that the number of rows you are processing is very large. Either you need to break the process into smaller parts, or increase your JVM heap size to an unreasonably high value.
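For reference, the heap size is raised with the -Xmx flag when launching the program (the 4g value here is just an example, not a recommendation):

```shell
java -Xmx4g com.ankit.DB.DatabaseFetcher
```

But as noted, this only delays the failure if the code still tries to hold all 10,00,00,000 rows at once.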

You filter for the error messages with status = 2 only on the Java side. Why don't you add this filter in the SQL WHERE clause itself? That will definitely decrease the number of rows returned from the DB to the Java layer. I don't think any production system will have 10,00,00,000 unprocessed error messages. You may also consider purging the processed error messages if you no longer need them.

Even if you manage to increase your JVM heap size to a very large value, the current code will take a long time to complete. You need to fetch only the data you need from the DB.
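A minimal sketch of this approach. Note the stack trace above shows oracle.jdbc.driver.ScrollableResultSet caching every row, which suggests the original code asked for a scrollable (and updatable) result set; a forward-only, read-only one lets the driver stream rows instead. Table and column names are taken from this thread; the connection details and the STATUS column name are assumptions for illustration:

```java
import java.sql.*;

public class ErrorMessageFetcher {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details -- adjust for your environment.
        String url = "jdbc:oracle:thin:@//dbhost:1521/service";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            // Push the filter into SQL so only matching rows cross the wire,
            // and use a forward-only, read-only result set so the driver
            // does not cache every row the way a scrollable one does.
            String sql = "SELECT * FROM VERTEX_TRF.RISK_STATUS_DATA WHERE STATUS = ?";
            try (PreparedStatement ps = conn.prepareStatement(
                    sql, ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
                ps.setInt(1, 2);        // status = 2, per the discussion above
                ps.setFetchSize(1000);  // stream rows from the DB in batches of 1000
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        // Process one row at a time; never accumulate
                        // all rows in a collection.
                    }
                }
            }
        }
    }
}
```

With a forward-only result set, memory use stays roughly proportional to the fetch size rather than the total row count.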
Ankit Tripathi
Ranch Hand

Joined: Oct 17, 2009
Posts: 198
Hi Sudheer,

Thanks for your post. As you suggested, I have applied the filter (STATUS_ID) along with two additional filters (BUSINESS_DATE & CONFIG_ID) in my SQL query, and now it returns at most about 10,000 rows at a time, which can easily be stored in an Excel sheet.
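For anyone finding this later, a hedged sketch of what that three-filter query might look like as a parameterized statement (only STATUS_ID, BUSINESS_DATE and CONFIG_ID come from this thread; the method shape and everything else is assumed):

```java
import java.sql.*;

// Hypothetical helper illustrating the three-filter query described above.
public class FilteredFetch {
    static void fetch(Connection conn, int statusId, Date businessDate, int configId)
            throws SQLException {
        String sql = "SELECT * FROM VERTEX_TRF.RISK_STATUS_DATA "
                   + "WHERE STATUS_ID = ? AND BUSINESS_DATE = ? AND CONFIG_ID = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, statusId);
            ps.setDate(2, businessDate);  // java.sql.Date
            ps.setInt(3, configId);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // At most ~10,000 rows now, so collecting or exporting is safe.
                }
            }
        }
    }
}
```

Using bind parameters (rather than string concatenation) also lets Oracle reuse the parsed statement across runs.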
 