JavaRanch » Java Forums » Java » Java in General

GC Overhead limit exceeded error when reading a text file

ch pravin
Greenhorn

Joined: Nov 25, 2010
Posts: 20
Hello All,

I am getting a java.lang.OutOfMemoryError: GC overhead limit exceeded error when reading from a text file. I am not sure what is going wrong. I am running my program on a cluster that has sufficient memory. The outer loop iterates 16,000 times, and for each iteration of the outer loop the inner loop iterates about 300,000 times. The error is thrown at different points in the program, but it occurs only when I insert the code which stores the <String, Float> pair in a HashMap. Any suggestions will be greatly appreciated.




Thanks.
Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 19060
    

ch pravin wrote:I am running my program on a cluster that has sufficient memory. The outer loop iterates 16,000 times, and for each iteration of the outer loop the inner loop iterates about 300,000 times. The error is thrown at different points in the program, but it occurs only when I insert the code which stores the <String, Float> pair in a HashMap. Any suggestions will be greatly appreciated.


Well, let's do some simple math... 16,000 iterations of the outer loop times 300,000 iterations of the inner loop gives you 4,800,000,000 iterations. Assuming the String in each pair is 80 characters long, making each pair about 84 bytes, and ignoring the memory needed for the map nodes and such, that gives a total memory footprint of a bit over 400 GB.
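That arithmetic can be checked in a few lines (a quick sketch; the 80-character strings and 84 bytes per pair are the stated assumptions above, not measured values):

```java
public class FootprintEstimate {
    public static void main(String[] args) {
        long pairs = 16_000L * 300_000L;          // 4,800,000,000 map insertions
        long bytesPerPair = 84L;                  // ~80-char String plus a Float
        double gb = pairs * bytesPerPair / 1e9;   // decimal gigabytes
        System.out.printf("%d pairs, about %.0f GB%n", pairs, gb);
    }
}
```

It prints roughly 403 GB, which is where the "a bit over 400 GB" figure comes from.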

Henry


Books: Java Threads, 3rd Edition, Jini in a Nutshell, and Java Gems (contributor)
ch pravin
Greenhorn

Joined: Nov 25, 2010
Posts: 20
Henry Wong wrote:
ch pravin wrote:I am running my program on a cluster that has sufficient memory. The outer loop iterates 16,000 times, and for each iteration of the outer loop the inner loop iterates about 300,000 times. The error is thrown at different points in the program, but it occurs only when I insert the code which stores the <String, Float> pair in a HashMap. Any suggestions will be greatly appreciated.


Well, let's do some simple math... 16,000 iterations of the outer loop times 300,000 iterations of the inner loop gives you 4,800,000,000 iterations. Assuming the String in each pair is 80 characters long, making each pair about 84 bytes, and ignoring the memory needed for the map nodes and such, that gives a total memory footprint of a bit over 400 GB.

Henry


I am clearing the HashMap every time the outer loop starts, so it contains only about 300,000 entries at any point. Also, the strings are at most 8 characters long.
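For reference, the loop structure described above would look roughly like this (a hypothetical reconstruction — the real file-reading code and key values were never posted, and the loop bounds are reduced here so the sketch runs quickly):

```java
import java.util.HashMap;
import java.util.Map;

public class LoopSketch {
    // real values: OUTER = 16_000, INNER = 300_000 (reduced for the sketch)
    static final int OUTER = 16;
    static final int INNER = 300;

    public static void main(String[] args) {
        Map<String, Float> map = new HashMap<>();
        for (int i = 0; i < OUTER; i++) {
            map.clear();                       // reset at the start of each outer pass
            for (int j = 0; j < INNER; j++) {
                // in the real program the key is read from the text file
                map.put("key" + j, (float) j);
            }
            // size stays bounded by INNER because of the clear() above
        }
        System.out.println("final size = " + map.size());
    }
}
```

If the map really is cleared like this, its size can never exceed the inner-loop count.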
Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 19060
    

ch pravin wrote:
I am clearing the HashMap every time the outer loop starts, so it contains only about 300,000 entries at any point. Also, the strings are at most 8 characters long.


It shouldn't be hard to confirm -- just put a check near the put() call to confirm that the count doesn't climb over some value, say 500k.

Otherwise, if you are already convinced that this is not the source of the leak, then you are back to square one. Get a profiler, and see which object types are growing past your expectations.
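A minimal version of that check, with the map and the 500k threshold as placeholders (reduced here to 600 entries and a threshold of 500 so it runs instantly):

```java
import java.util.HashMap;
import java.util.Map;

public class SizeCheck {
    public static void main(String[] args) {
        Map<String, Float> map = new HashMap<>();
        for (int j = 0; j < 600; j++) {           // 600,000 in the real program
            map.put("key" + j, (float) j);
            if (map.size() > 500) {               // 500,000 in the real program
                System.err.println("map grew past the expected bound: " + map.size());
                break;                            // stop at the first violation
            }
        }
    }
}
```

If the warning never fires in the real run, the map itself is cleared correctly and the leak is elsewhere.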

Henry
ch pravin
Greenhorn

Joined: Nov 25, 2010
Posts: 20
Henry Wong wrote:
It shouldn't be hard to confirm -- just put a check near the put to confirm that the count doesn't climb over some value, say 500k.

Otherwise, if you are already convinced that this is not the source of the leak, then you are back to square one. Get a profiler, and see which object types are growing past your expectations.

Henry


The size of the HashMap grows up to 40,000 entries and then the error is thrown. I am not sure how to use a profiler. Can you shed some light on what it will do? I am executing the code as a jar file on a remote machine.
Campbell Ritchie
Sheriff

Joined: Oct 13, 2005
Posts: 40034
    
Welcome to the Ranch

I think this question is too difficult for "beginning", so I shall move it.
Ilari Moilanen
Ranch Hand

Joined: Apr 15, 2008
Posts: 198
I trust that you have used Google to solve your problem? One of the first results is this
http://stackoverflow.com/questions/1393486/what-means-the-error-message-java-lang-outofmemoryerror-gc-overhead-limit-excee
which indicates that increasing the heap size could solve your problem. Of course if there is a major problem and the heap memory is not freed correctly (as Henry Wong suggested) then increasing heap space will not help in the long run.
Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 19060
    

ch pravin wrote:
The size of the HashMap grows up to 40,000 entries and then the error is thrown. I am not sure how to use a profiler. Can you shed some light on what it will do? I am executing the code as a jar file on a remote machine.


If what you said is true -- 8-character strings and a float -- then each data point is really small. Even if you assume each data point takes 100 bytes, to over-exaggerate the space needed for object headers and such, 40,000 points is only about 4 MB.

Either you are really incorrect about the size of each data point, or you have a memory leak during the processing of each data point.

Henry
ch pravin
Greenhorn

Joined: Nov 25, 2010
Posts: 20
Ilari Moilanen wrote:I trust that you have used Google to solve your problem? One of the first results is this
http://stackoverflow.com/questions/1393486/what-means-the-error-message-java-lang-outofmemoryerror-gc-overhead-limit-excee
which indicates that increasing the heap size could solve your problem. Of course if there is a major problem and the heap memory is not freed correctly (as Henry Wong suggested) then increasing heap space will not help in the long run.


I increased the heap size to 4 GB; should I increase it further? Also, I am using this command: java -Xmx4096m -jar Input.jar. I am not sure if it is the right command; I know -Xmx sets the maximum heap size.
Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 19060
    

Ilari Moilanen wrote:I trust that you have used Google to solve your problem? One of the first results is this
http://stackoverflow.com/questions/1393486/what-means-the-error-message-java-lang-outofmemoryerror-gc-overhead-limit-excee
which indicates that increasing the heap size could solve your problem.



This is actually a good point. I just assumed, when the OP said that there was plenty of memory, that the JVM was configured to use all of that memory. Just how much heap space is the JVM configured for?
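One way to answer that from inside the program itself (a sketch; Runtime.maxMemory() reports the heap ceiling the JVM is actually running with, regardless of what the command line was supposed to set):

```java
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory: the -Xmx ceiling; totalMemory: currently committed heap
        System.out.printf("max heap:   %d MB%n", rt.maxMemory() / (1024 * 1024));
        System.out.printf("total heap: %d MB%n", rt.totalMemory() / (1024 * 1024));
        System.out.printf("free heap:  %d MB%n", rt.freeMemory() / (1024 * 1024));
    }
}
```

Printing this once at startup on the remote machine would confirm whether the -Xmx flag is taking effect.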

Henry
ch pravin
Greenhorn

Joined: Nov 25, 2010
Posts: 20
Henry Wong wrote:
This is actually a good point. I just assumed, when the OP said that there was plenty of memory, that the JVM was configured to use all of that memory. Just how much heap space is the JVM configured for?

Henry


I am not sure how much heap space the JVM is configured for, since it's a remote machine. I tried increasing the heap size to 8 GB; that let the HashMap grow to about 80K entries before the code threw a java heap space error. You said there could be a memory leak when each data point is processed. Can you suggest some steps to resolve the issue?
Paul Clapham
Bartender

Joined: Oct 14, 2005
Posts: 18987
    

ch pravin wrote:I tried increasing the heap size to 8 GB; that let the HashMap grow to about 80K entries before the code threw a java heap space error. You said there could be a memory leak when each data point is processed. Can you suggest some steps to resolve the issue?


Bear in mind that 32-bit JVMs can only allocate about 3 GB (the theoretical maximum is 4 GB, but part of the address space is reserved for the OS and native code, so the usable maximum is less). Are you using a 64-bit JVM?
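A quick way to check that from the remote shell, as a sketch (the sun.arch.data.model property is specific to Sun/Oracle JVMs and may be absent elsewhere; os.arch is more portable):

```java
public class Bitness {
    public static void main(String[] args) {
        // "64" on a 64-bit Sun/Oracle JVM, "32" on a 32-bit one, null on other vendors
        System.out.println("data model: " + System.getProperty("sun.arch.data.model"));
        // portable hint, e.g. "amd64" or "x86"
        System.out.println("os.arch:    " + System.getProperty("os.arch"));
    }
}
```

Running `java -version` on the remote machine also usually says "64-Bit Server VM" when the JVM is 64-bit.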

As for steps to resolve the issue, it has already been recommended that you use a profiler. I would recommend that too.
Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 19060
    

ch pravin wrote:
I am not sure how much heap space the JVM is configured for, since it's a remote machine. I tried increasing the heap size to 8 GB; that let the HashMap grow to about 80K entries before the code threw a java heap space error.



Keep in mind that, with the exception of the Azul JVM and its pauseless GC, most JVMs don't work well with a heap that big. A Sun JVM may take a long time to collect an 8 GB heap, and under certain conditions it may never finish.

Henry
ch pravin
Greenhorn

Joined: Nov 25, 2010
Posts: 20
Henry Wong wrote:

Keep in mind that, with the exception of the Azul JVM and its pauseless GC, most JVMs don't work well with a heap that big. A Sun JVM may take a long time to collect an 8 GB heap, and under certain conditions it may never finish.

Henry


I am not sure how to use a profiler. Can you shed some light on what it will do, and which one is the best? I was looking at this: http://java.sun.com/developer/technicalArticles/Programming/HPROF.html but I don't know if it is the best one to use.
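HPROF (the tool in that article) is not a separate download -- it ships with the Sun JDK and is enabled with a JVM flag. A typical heap-profiling run, adapting the command already used above, might look like:

```shell
# Profile heap allocation sites; a report is written to java.hprof.txt on JVM exit
java -agentlib:hprof=heap=sites -Xmx4096m -jar Input.jar
```

The report lists which allocation sites account for the most live memory, which is exactly what's needed to find the leaking object type.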
Paul Clapham
Bartender

Joined: Oct 14, 2005
Posts: 18987
    

ch pravin wrote:I am not sure how to use a profiler. Can you shed some light on what it will do, and which one is the best?


First step: get a profiler. Second step: read its documentation. It's impossible for somebody to tell you how to use a piece of software without knowing what that piece of software is.

As for whether it's the best one, that's debatable. You want the best one for your circumstances, not the best one for someone else's circumstances. Since you have an immediate problem, then for you "best" means quickest to get running. So don't spend a lot of time analyzing profilers and their pros and cons. Pick one and use it.
ch pravin
Greenhorn

Joined: Nov 25, 2010
Posts: 20
I used a profiler and saw that the HashMap keys, which were String values, were consuming a lot of memory. I converted the strings to integers and the program is running fine now. Thanks for all the help!
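For anyone finding this thread later, the fix amounts to parsing each key once instead of storing it as a String (a sketch; it assumes the keys are numeric text, as they evidently were here):

```java
import java.util.HashMap;
import java.util.Map;

public class IntKeys {
    public static void main(String[] args) {
        // before: Map<String, Float> -- each 8-char String key carries object
        // and char[] overhead on top of the characters themselves
        Map<Integer, Float> map = new HashMap<>();
        String token = "12345678";                 // as read from the text file
        map.put(Integer.parseInt(token), 0.5f);    // one boxed int instead of a String
        System.out.println(map.get(12345678));     // 0.5
    }
}
```

A boxed Integer is a single small object, so per-entry memory drops substantially compared with a String key plus its backing character array.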
 