OutOfMemory

Nelson Nadal
Ranch Hand

Joined: Jun 06, 2002
Posts: 169
I have a txt data file with 50,000 records (about 5 MB); my application reads them and loads them into an ArrayList, and when I run it, it's fine.
But when the data is about 500,000 records (about 50 MB), it fails with an OutOfMemory error.
What should I do about this? Someone said I have to adjust the heap size in the JDK itself; is that true?
Or is my application wrong? Please help me.
Thanks to JavaRanch members and gurus out there.
Prakash Dwivedi
Ranch Hand

Joined: Sep 28, 2002
Posts: 452
You can increase the memory size of the JVM:
java -Xms200m -Xmx500m myApplication
Here 200m = 200 MB of initial heap and 500m = 500 MB of maximum heap.
I hope this helps.
Ernest Friedman-Hill
author and iconoclast
Marshal

Joined: Jul 08, 2003
Posts: 24168
The default heap size for Sun's recent JVMs is 64 MB; if you're loading 50 MB worth of ASCII text as Strings, it will take more than 100 MB of RAM to hold (Java uses 16-bit Unicode characters).
You may be able to change your application, but getting 50 MB into 64 MB is obviously going to be a tight fit no matter what you do (your app has other requirements, of course, besides just the text itself). So yes, enlarging the default heap is the best answer. With Sun's JVMs, the switch -XmxNNm, where NN is a number of megabytes, will do it. Try -Xmx128m and see what happens.
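A quick way to double-check what heap the JVM actually got is to print the limits at startup. A minimal sketch (the class name is just a placeholder):

// Minimal sketch: print the heap limits this JVM was started with.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("Max heap:   " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("Total heap: " + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("Free heap:  " + rt.freeMemory() / (1024 * 1024) + " MB");
    }
}

Run it with java -Xmx128m HeapCheck and the reported max heap should come out at roughly 128 MB.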


[Jess in Action][AskingGoodQuestions]
Nelson Nadal
Ranch Hand

Joined: Jun 06, 2002
Posts: 169
Thanks, OK, that helps a lot. But there is still a problem, because that Java application is actually a Java bean that gets called by a JSP page.
How do I permanently configure the JVM to use more memory in that case? I really appreciate your answers. Thanks again.
Ken Blair
Ranch Hand

Joined: Jul 15, 2003
Posts: 1078
Do you need to have the entire file in memory at once?
Nelson Nadal
Ranch Hand

Joined: Jun 06, 2002
Posts: 169
I think... yes.
The application is basically a raffle draw. The data consists of the hundreds of thousands of people who joined the raffle; from that text data file I put the records into an ArrayList. I generate a random number and compare it against their position number in the ArrayList. As simple as that; I believe it's the right algorithm.
Anyway, is it OK if I also post this question (only the JSP part) in the JSP topic?
Ken Blair
Ranch Hand

Joined: Jul 15, 2003
Posts: 1078
If that's the case, why don't you just generate the random number first and then iterate through the file until you get to that line, then read in that data and use it?
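A minimal sketch of that idea, assuming one record per line (the file name and record count are placeholders):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.Random;

public class RaffleDraw {
    public static void main(String[] args) throws IOException {
        int totalRecords = 500000;                              // assumed record count
        int winningLine = new Random().nextInt(totalRecords);   // 0-based index of the winner

        // Stream the file and stop at the winning line; only one line is in memory at a time.
        try (BufferedReader in = new BufferedReader(new FileReader("entries.txt"))) {
            String line = null;
            for (int i = 0; i <= winningLine; i++) {
                line = in.readLine();
            }
            System.out.println("Winner: " + line);
        }
    }
}

Since only one line is held at a time, the heap size stops being an issue.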
Nelson Nadal
Ranch Hand

Joined: Jun 06, 2002
Posts: 169
Hmm, OK. This is the pseudo code of my program...
There are 2 files: group A in one file and group B in another, with 400,000 records each (about 25 MB each).
I assign each record a counter number as I add it to a HashSet. The reason is to mix the A records with the B records at random.
So a possible arrangement is: 1 is A, 2 is B, 3 is B, 4 is A...
We don't want the scenario where 1-400,000 are all A and 400,001-800,000 are all B.
After putting the records into the Hashtable, I generate a random number and compare it against the counter number assigned to a record; that's the winner.
So I think even if I get the random number first, it might still be the last one in the HashList, and that produces an OutOfMemory error.
I hope you understand my explanation. Thanks...
Ken Blair
Ranch Hand

Joined: Jul 15, 2003
Posts: 1078
So you're generating a number between 1 and 800,000 and then finding the nth entry in the HashSet? I still don't see why you can't generate the number first. If it's <= 400,000, go into file A and find the nth entry; if it's >= 400,001, go through file B and find the (n - 400,000)th entry. Either way, if that's what you're doing, I don't see why there's any need to read everything into memory.
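Roughly like this, as a sketch only (file names and record counts are assumptions, one record per line):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.Random;

public class TwoFileDraw {
    // Read the nth line (0-based) of a file without keeping the rest in memory.
    static String nthLine(String file, int n) throws IOException {
        try (BufferedReader in = new BufferedReader(new FileReader(file))) {
            String line = null;
            for (int i = 0; i <= n; i++) {
                line = in.readLine();
            }
            return line;
        }
    }

    public static void main(String[] args) throws IOException {
        int countA = 400000, countB = 400000;            // assumed record counts
        int n = new Random().nextInt(countA + countB);   // 0 .. 799,999

        // Entries 0..countA-1 live in file A, the rest in file B.
        String winner = (n < countA)
                ? nthLine("groupA.txt", n)
                : nthLine("groupB.txt", n - countA);
        System.out.println("Winner: " + winner);
    }
}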
Marcus Howarth
Ranch Hand

Joined: Jan 04, 2002
Posts: 37
Nelson,
So I think even if I get the random number first, it might still be the last one in the HashList, and that produces an OutOfMemory error.

Have a look at RandomAccessFile - you don't need to read the entire file into the HashList to get at the last entry (there's a sketch of that idea below).

I assign each record a counter number as I add it to a HashSet. The reason is to mix the A records with the B records at random.

I don't quite see the benefit of this bit - if you're using a random number to pick your winner, why bother mixing up the records? Random selection from a randomly mixed-up data set isn't any more random.
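Here's a sketch of the RandomAccessFile idea; it assumes fixed-length records (say 32 bytes each, line terminator included), since seeking straight to record n needs a known record size:

import java.io.IOException;
import java.io.RandomAccessFile;

public class SeekDraw {
    public static void main(String[] args) throws IOException {
        int recordLength = 32;       // assumed fixed record length in bytes
        int winningRecord = 123456;  // 0-based index picked by the random number generator

        try (RandomAccessFile raf = new RandomAccessFile("entries.dat", "r")) {
            raf.seek((long) winningRecord * recordLength);   // jump straight to the record
            byte[] buf = new byte[recordLength];
            raf.readFully(buf);
            System.out.println("Winner: " + new String(buf, "US-ASCII").trim());
        }
    }
}

If the records are variable-length, you'd need an index of byte offsets instead of a fixed recordLength.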


Marcus
SCJP, March '02
Nelson Nadal
Ranch Hand

Joined: Jun 06, 2002
Posts: 169
First of all, again, thanks to you sirs:
Marcus Howarth
Ken Blair
Ernest Friedman-Hill
Prakash Dwivedi
This is how the program works....
File A
=======
Name1
Name2
Name3
Name4

File B
======
Name5
Name6
Name7
Name8
... I read these files, then add the records to the HashList. A possible mixed outcome:
HashList
========
B,Name5
A,Name1
A,Name4
B,Name7
B,Name6
A,Name2
B,Name8
A,Name3
While adding them to the HashList I concatenated their group name.
If the generated number is 5, then the winner is in group A, Name2, because it is at index 5 of the HashList.
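In code, the structure described above would look roughly like this sketch (file names are placeholders; the random A/B interleaving is omitted for brevity):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.Random;

public class InMemoryDraw {
    public static void main(String[] args) throws IOException {
        // Counter -> "group,name"; this map is what holds everything in memory
        // (and is what runs out of heap at 800,000 records).
        Map<Integer, String> entries = new HashMap<Integer, String>();
        int ctr = 0;
        ctr = load("groupA.txt", "A", entries, ctr);
        ctr = load("groupB.txt", "B", entries, ctr);

        int winner = new Random().nextInt(ctr);
        System.out.println("Winner: " + entries.get(winner));
    }

    static int load(String file, String group, Map<Integer, String> entries, int ctr) throws IOException {
        try (BufferedReader in = new BufferedReader(new FileReader(file))) {
            for (String line = in.readLine(); line != null; line = in.readLine()) {
                entries.put(ctr++, group + "," + line);
            }
        }
        return ctr;
    }
}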

With regard to Ken Blair's suggestion: the records are no longer separated, since I've already mixed them.
As for Marcus Howarth's input: once mixed, the data isn't in a file any more, only in memory. Thanks again.