JavaRanch » Java Forums » Databases » JDBC and Relational Databases

Table having large volume of data

Ankit Tripathi
Ranch Hand

Joined: Oct 17, 2009
Posts: 199
Hi all,

I have to fetch data from a table which has about 10,00,00,000 (100 million) rows. When I run my program it throws: Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

What can I do to avoid that?

Below my code is given-

The VERTEX_TRF.ERROR_CODE_MAPPING (error rule) table contains only a few records (about 50), while the VERTEX_TRF.RISK_STATUS_DATA (error message) table has about 10,00,00,000 records. That means for each error rule the result set has to scan through 10,00,00,000 records, which seems to have huge memory requirements.

The exception trace is given below -

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Vector.<init>(
at java.util.Vector.<init>(
at oracle.jdbc.driver.OracleResultSetCacheImpl.put(
at oracle.jdbc.driver.ScrollableResultSet.putCachedValueAt(
at oracle.jdbc.driver.ScrollableResultSet.cacheCurrentRow(
at oracle.jdbc.driver.ScrollableResultSet.cacheRowAt(
at oracle.jdbc.driver.ScrollableResultSet.isValidRow(
at oracle.jdbc.driver.ScrollableResultSet.get_refetch_size(
at oracle.jdbc.driver.ScrollableResultSet.refreshRowsInCache(
at oracle.jdbc.driver.SensitiveScrollableResultSet.refreshRow(
at oracle.jdbc.driver.SensitiveScrollableResultSet.handle_refetch(
at com.ankit.DB.DatabaseFetcher.main(

I cannot find a way to resolve this problem. How can I optimize things?
Sudheer Bhat
Ranch Hand

Joined: Feb 22, 2011
Posts: 75
It's just that the number of rows you are processing is large. Either you need to break the process into small parts, or increase your JVM heap size to an unreasonably high value.

You check in the Java code to keep only the error messages with status = 2. Why don't you add this filter in the SQL WHERE clause itself? That will definitely decrease the number of rows returned from the DB to the Java layer. I don't think any production system will have 10,00,00,000 unprocessed error messages. You may also consider purging processed error messages if you no longer need them.

Even if you manage to increase your JVM heap size to a very large value, the current code will take a long time to complete. You need to fetch only the data you need from the DB.
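The two ideas above (push the filter into the WHERE clause, and stream rows instead of caching them) can be sketched roughly like this. Your stack trace shows a scrollable, sensitive ResultSet, which Oracle's driver caches row by row; a plain forward-only, read-only cursor with an explicit fetch size avoids that cache. The table and the status = 2 filter come from this thread; the column names (ERROR_CODE, ERROR_TEXT) and the fetch size are illustrative assumptions, so adjust them to your schema:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class ErrorMessageFetcher {

    // Filter in SQL, not in Java: only matching rows cross the wire.
    // Column names here are assumptions -- replace with your real ones.
    static String buildQuery() {
        return "SELECT ERROR_CODE, ERROR_TEXT "
             + "FROM VERTEX_TRF.RISK_STATUS_DATA "
             + "WHERE STATUS = ?";
    }

    // Forward-only + read-only means the driver does not have to cache
    // visited rows, and setFetchSize bounds how many rows sit in the
    // heap per round trip.
    static void process(Connection conn) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(
                buildQuery(),
                ResultSet.TYPE_FORWARD_ONLY,
                ResultSet.CONCUR_READ_ONLY)) {
            ps.setInt(1, 2);        // status = 2, as in the original check
            ps.setFetchSize(1000);  // rows buffered per round trip (tunable)
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // handle one row, then let it become garbage
                    String text = rs.getString("ERROR_TEXT");
                    // ... process text ...
                }
            }
        }
    }
}
```

With this shape the heap holds at most one fetch batch at a time, regardless of how many rows the table has.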
Ankit Tripathi
Ranch Hand

Joined: Oct 17, 2009
Posts: 199
Hi Sudheer,

Thanks for your post. As you suggested, I applied the filter (STATUS_ID) along with two additional filters (BUSINESS_DATE & CONFIG_ID) in my SQL query, and now it returns at most about 10,000 rows at a time, which can easily be stored in an Excel sheet.