I have an application that needs to read data from files, and the files can be larger than 100 MB. I used BufferedReader to read the data from the file, and I get an "out of memory" exception when large files are read. My development environment is Windows, though I will actually be deploying on Unix.
Is this exception platform specific? I tried splitting the file into smaller chunks of bytes, but in the process the data format in the file was lost. So, is there any way to read large files without splitting them into smaller chunks of data?
I believe the problem may not be the file itself, but the fact that you may be trying to read the whole file into memory.
Depending on the size of the objects you are creating, the strategy you follow to ensure proper GC, and the amount of memory configured for your JVM, different things may happen.
See the JDK documentation on the java command for how to configure the Java HotSpot Virtual Machine to use different amounts of memory. To start, you will probably want to read the documentation about the -Xms and -Xmx flags, which set the initial and maximum heap size.
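For example, something like this lets the JVM grow its heap up to 512 MB (the sizes and the class name here are just placeholders for your own):

    java -Xms64m -Xmx512m com.mycompany.MyApp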
Trying to ensure that unused objects are properly garbage collected, such as those stored in collections or arrays, is another good option to get memory back.
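A quick sketch of what I mean (the names here are made up): once you are done with a collection, clear it or drop the reference so its contents become eligible for collection:

    import java.util.ArrayList;
    import java.util.List;

    public class ReleaseExample {
        public static void main(String[] args) {
            List<String> records = new ArrayList<String>();
            // ... fill and use the list ...
            records.clear();   // the objects held by the list can now be collected
            records = null;    // drop the reference to the list itself
        }
    }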
However, the best shot could be to page the file processing; that is, do not read the whole file into memory, only those portions you really need for your processing.
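For example, a minimal sketch of reading line by line, so that only one line is held in memory at a time (the file name and the process() method are placeholders for your own):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class LineByLineReader {
        public static void main(String[] args) throws IOException {
            BufferedReader reader = new BufferedReader(new FileReader("bigfile.txt"));
            try {
                String line;
                // Only the current line is kept in memory
                while ((line = reader.readLine()) != null) {
                    process(line);
                }
            } finally {
                reader.close();
            }
        }

        // Placeholder for whatever work each line needs
        private static void process(String line) {
            System.out.println(line.length());
        }
    }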
If the file is used to display information to the user, consider paging the information displayed, so that you do not need to read all the objects into memory.
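As a sketch of that idea (the file name, startLine, and pageSize are just assumptions for illustration), you could read only the lines for the page the user is currently looking at:

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;

    public class PagedFileView {
        // Returns only the lines for one "page" of the file
        static List<String> readPage(String fileName, int startLine, int pageSize)
                throws IOException {
            List<String> page = new ArrayList<String>();
            BufferedReader reader = new BufferedReader(new FileReader(fileName));
            try {
                String line;
                int current = 0;
                while ((line = reader.readLine()) != null && page.size() < pageSize) {
                    if (current >= startLine) {
                        page.add(line);  // keep only lines inside the requested page
                    }
                    current++;
                }
            } finally {
                reader.close();
            }
            return page;
        }
    }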