Hi, I have around two million records in a database and need to write them out to a flat file. What is the best way? I've tried BufferedWriter but ended up getting a java.lang.OutOfMemoryError, even though I call 'out.flush()' for every record. I also invoke Java with a large heap, e.g. 'java -Xms1024m -Xmx1024m'.
Any suggestions for handling this kind of massive operation? Thanks.
I don't see anything here that would necessarily give you a memory leak. How big are your records? Try leaving the StringBuffer out and just write each element individually. You're using a BufferedWriter, so you won't lose anything on the performance side; you'll probably gain, since BufferedWriter's default buffer (8 KB) may be smaller than your record size.
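To illustrate the suggestion, here is a minimal sketch of writing each field directly to the BufferedWriter instead of assembling records in a StringBuffer first. The file name, field values, and record count are placeholders; in the original poster's case the loop body would pull fields from the database (e.g. a JDBC ResultSet).

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class FlatFileExport {
    public static void main(String[] args) throws IOException {
        // A larger explicit buffer (64 KB) can help when records are big.
        BufferedWriter out = new BufferedWriter(new FileWriter("records.txt"), 64 * 1024);
        try {
            for (int i = 0; i < 1000; i++) {   // stand-in for ~2 million rows
                // Write each field individually; no intermediate StringBuffer.
                out.write(Integer.toString(i));
                out.write('|');
                out.write("some-field-value");  // hypothetical field
                out.newLine();
                // No per-record flush(): the buffer drains itself when full.
            }
        } finally {
            out.close();  // close() flushes whatever remains in the buffer
        }
    }
}
```

Dropping the per-record flush() also helps throughput: flushing after every record defeats the purpose of buffering.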
Originally posted by Jim Yingst: The problem right now is that you're adding all the results onto one huge StringBuffer.
I don't think that's the case. This line:
which _is_ inside the loop, should delete the contents of the StringBuffer. That said, it is good practice not to reuse StringBuffer instances: deleting the contents of a StringBuffer does not shrink its internal storage buffer, so a single particularly large record can cause a StringBuffer instance to hog a lot of memory for the duration of its existence.
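A small sketch of the capacity behavior described above: delete() empties the buffer but leaves its capacity at the high-water mark, whereas a fresh StringBuffer per record lets the large allocation be garbage-collected. The one-million-character record is an artificial example.

```java
public class BufferReuseDemo {
    public static void main(String[] args) {
        // Reused buffer: one unusually large record grows the internal array...
        StringBuffer reused = new StringBuffer();
        reused.append(new char[1000000]);
        reused.delete(0, reused.length());
        // ...and delete() does not give that memory back.
        System.out.println("length after delete:  " + reused.length());
        System.out.println("capacity after delete: " + reused.capacity());

        // Fresh buffer per record: storage becomes unreachable after each pass
        // and can be reclaimed by the garbage collector.
        for (int i = 0; i < 3; i++) {
            StringBuffer perRecord = new StringBuffer();
            perRecord.append("record ").append(i);
        }
    }
}
```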
Ah yes, I missed the delete(), thanks. I agree with your other points.