JavaRanch » Java Forums » Java » Performance

Out of Memory Exception

Sreenivasulu Naidu

Joined: Jan 11, 2007
Posts: 4

My application requires creating around 39,000,000 objects, and I use an ArrayList to hold them. I get an
OutOfMemoryError after roughly 1 million objects have been created and added. Here is my piece of code. Any thoughts?

ArrayList<SABAccessRuleDataBean> explodedList = new ArrayList<SABAccessRuleDataBean>();
for (MSSSiteData site : siteList) {
    for (MSSServiceData svc : svcList) {
        for (MSSCalendarData cal : calList) {
            SABAccessRuleDataBean sabARObj = new SABAccessRuleDataBean();
            sabARObj.accessType = accessType_p;
            sabARObj.authSite_id = site.siteID;
            sabARObj.authSite_code = site.siteCode;
            sabARObj.authSite_desc = site.siteDesc;

            sabARObj.service_id = svc.svcID;
            sabARObj.service_code = svc.svcCode;
            sabARObj.service_desc = svc.svcDesc;

            sabARObj.calendar_id = cal.calID;
            sabARObj.calendar_code = cal.calCode;
            sabARObj.calendar_desc = cal.resourceObj.desc;

            explodedList.add(sabARObj);
        }
    }
}

Jeanne Boyarsky
author & internet detective

Joined: May 26, 2003
Posts: 33125

A million records is a lot of memory. What are you trying to do with the objects - write to a file, some sort of processing, etc? Is it possible to do it in batches?
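If batching turns out to be an option, a minimal sketch of the idea (the BatchDemo class and the Integer payload here are made-up stand-ins for the real beans):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchDemo {
    // Walk the list in fixed-size slices; each slice can be written out
    // (to a file, a database, ...) and then discarded before the next one,
    // so the whole data set never has to sit in memory at once.
    static int processInBatches(List<Integer> items, int batchSize) {
        int batches = 0;
        for (int from = 0; from < items.size(); from += batchSize) {
            int to = Math.min(from + batchSize, items.size());
            List<Integer> batch = items.subList(from, to);
            // ... handle 'batch' here, then let it become garbage ...
            batches++;
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> items = new ArrayList<>();
        for (int i = 0; i < 100; i++) items.add(i);
        System.out.println(processInBatches(items, 30));
    }
}
```

Note that subList is only a view over the backing list; in a real batching setup each batch would come from a fresh query or read rather than from one giant in-memory list.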

[OCA 8 book] [Blog] [JavaRanch FAQ] [How To Ask Questions The Smart Way] [Book Promos]
Other Certs: SCEA Part 1, Part 2 & 3, Core Spring 3, TOGAF part 1 and part 2
Kees Jan Koster
JavaMonitor Support

Joined: Mar 31, 2009
Posts: 251
So basically, you are trying to re-implement a database in Java. I'd suggest you put those 39M records in a database and then use SQL to query them. Usually, SQL is *way* faster than Java at searching through large sets of data.

Java-monitor, JVM monitoring made easy <- right here on Java Ranch
Tim Holloway
Saloon Keeper

Joined: Jun 25, 2001
Posts: 17417

Kees Jan Koster wrote:Usually, SQL is *way* faster than Java at searching through large sets of data.

I'd be reluctant to make that assertion. For one thing, it depends on what you mean by "Java". For that matter, it depends on what you mean by "SQL". And there are some memory-resident SQL DBMSs, although they're not intended to store very large amounts of data.

The critical performance determinant is the size of the working set. That is, how much data has to be together in memory at the same time. Most commonly, when working with a large data set, you'll be iterating the data set proper, but you may need information from side tables which are smaller, but accessed randomly. So you might make those bits of data memory-resident and pass through the main data set stream-wise. That keeps the overall memory requirements down, saving resources and ensuring you don't end up compounding the situation courtesy of too much virtual memory paging (a/k/a "thrashing").
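The stream-wise idea can be sketched roughly like this (the class, the side table, and the id % 3 "foreign key" are all invented for illustration):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.stream.IntStream;

public class WorkingSetDemo {
    // Keep the small side table memory-resident for random access;
    // stream the large main set past it one record at a time.
    static long enrichAndCount(int mainSetSize, Map<Integer, String> sideTable) {
        return IntStream.range(0, mainSetSize)
                // pretend (id % 3) is the record's foreign key into the side table
                .filter(id -> sideTable.containsKey(id % 3))
                .count();
    }

    public static void main(String[] args) {
        Map<Integer, String> siteNames = new HashMap<>();
        siteNames.put(1, "North");
        siteNames.put(2, "South");
        // One pass over a million "records" without ever holding them all at once.
        System.out.println(enrichAndCount(1_000_000, siteNames));
    }
}
```

The working set here is just the HashMap plus one record at a time, which is what keeps the memory requirement flat no matter how big the main set grows.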

ORMs are a really good solution to things like that, since they keep you from having to invent the various caching mechanisms from scratch and can be fine-tuned by altering declarative information instead of altering (and debugging!) custom code.

An IDE is no substitute for an Intelligent Developer.
Kees Jan Koster
JavaMonitor Support

Joined: Mar 31, 2009
Posts: 251
Dear Tim,

OK, I was painting with an overly broad brush. To make my statement more precise: what I see happening with new Java devs who start working with SQL is that they basically do "SELECT * FROM table" and then iterate over the resulting set to do something useful. By moving the set operations into the SQL domain (e.g. SELECT COUNT(*) FROM table WHERE condition) instead of iterating in Java, performance is gained in all but edge cases.

Another nice one is fetching a list with one JDBC query and then issuing a separate SQL statement per item in the list to get its details. That runs nicely for small lists but does not scale all that well.

Kees Jan
steve souza
Ranch Hand

Joined: Jun 26, 2002
Posts: 862
Although there may be exceptions, it sounds like a questionable design to put 39 million of these objects in memory. What if this number grows? At some point you will probably need to handle the situation where you have more objects than can be held in memory. Of course, it is hard to make any comments at all when we don't know what you are trying to accomplish.

- a fast, free open source performance tuning api.
JavaRanch Performance FAQ
Hemanth H Bhat

Joined: Apr 14, 2008
Posts: 15
I am not sure what you are trying to achieve, but you can try increasing the memory allocated to the JVM that runs this application, using the -Xms and -Xmx arguments.
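For reference, the flags go on the java command line; the sizes and the MyApp class name below are placeholders, so pick values that fit your machine and data set:

```shell
# -Xms sets the initial heap size, -Xmx the maximum heap size.
# 4 GB is only an example; 39M beans may need more (or may not fit at all).
java -Xms512m -Xmx4g -cp . MyApp
```

Keep in mind this only postpones the problem if the object count keeps growing.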

Shariquddin Mohammed

Joined: Oct 10, 2009
Posts: 9
My suggestion is to use the database: query it with a prepared statement, fetch chunks of "relevant" data in batches, and then use a collection on the app layer.