Since I am processing large files most of the time, I have run into another problem with JVM memory management. I am trying to read a large file (320M) into memory to be processed later, so I wrote the following program:
package test1;

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.Reader;

public class Test2 {
    public static void main(String[] args) {
        String filename = "c:\\work\\data\\testfile1.xml";
        int size = 20000000; // 20M-char read buffer
        char[] buf = new char[size];
        try {
            Reader reader = new BufferedReader(new FileReader(filename));
            StringBuffer buffer = new StringBuffer();
            long total = 0;
            int num;
            // read() returns -1 at end of stream
            while ((num = reader.read(buf, 0, size)) != -1) {
                buffer.append(buf, 0, num);
                total += num;
                System.out.println("Total chars read:" + total);
            }
            reader.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
The result is as follows (JDK 1.4.2):
C:\work\workspace\test1> java -Xmx800M test1.Test2
Total chars read:20000000
Total chars read:40000000
Total chars read:60000000
Total chars read:80000000
Total chars read:100000000
Total chars read:120000000
Total chars read:140000000
Total chars read:160000000
Exception in thread "main" java.lang.OutOfMemoryError
I have more than 1G of physical memory in the machine, so that should not be the bottleneck. Increasing the maximum heap size does not seem to help, and neither did reducing the read buffer size from 20M down to 20K.
Does anybody know why I cannot hold 320M of data when the heap is much bigger than that? Any suggestion would be appreciated!
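Not sure if it is relevant, but as far as I understand, StringBuffer roughly doubles its internal char[] whenever it runs out of room, so during a resize it briefly holds both the old and the new array. This little self-contained demo (the class name CapacityDemo is just for illustration) prints the growth pattern; the exact capacities are whatever the JVM reports, I am not asserting anything specific about 1.4.2:

```java
// CapacityDemo: hypothetical helper class, only to visualize StringBuffer growth.
public class CapacityDemo {
    public static void main(String[] args) {
        StringBuffer sb = new StringBuffer();
        int lastCap = sb.capacity();
        System.out.println("initial capacity=" + lastCap);
        // Append one char at a time and report each time the backing array grows.
        for (int i = 0; i < 200; i++) {
            sb.append('x');
            if (sb.capacity() != lastCap) {
                lastCap = sb.capacity();
                System.out.println("length=" + sb.length()
                        + " -> new capacity=" + lastCap);
            }
        }
    }
}
```

If the same doubling happens around the 160M-char mark in my program, the resize alone would momentarily need the old 320MB char array plus a new 640MB one, which cannot fit in an 800M heap. But that is only my guess.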
--James