posted 14 years ago
Well, 4 MB isn't really massive, so bringing it all into memory should be fine. However, you were splitting it up into smaller temporary files before. Wouldn't it be possible to read one chunk of the file, do whatever you need to with it, and then read the next chunk? Do you need the whole file in memory at once?
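The chunked approach is just a fixed buffer and a read loop — something like this sketch (the file path and chunk size are placeholders; whatever processing you need goes inside the loop):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class ChunkedReader {

    // Process a file one fixed-size chunk at a time instead of loading it whole.
    // Only 'chunkSize' bytes are ever in memory, no matter how big the file is.
    public static long processInChunks(Path file, int chunkSize) throws IOException {
        long totalBytes = 0;
        byte[] buffer = new byte[chunkSize];
        try (InputStream in = Files.newInputStream(file)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                // ... do whatever you need with buffer[0..read) here ...
                totalBytes += read;
            }
        }
        return totalBytes;
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".bin");
        Files.write(tmp, new byte[10_000]); // a 10 KB file for demonstration
        long seen = processInChunks(tmp, 4096); // never more than 4 KB in memory
        if (seen != 10_000) throw new AssertionError("expected 10000 bytes, got " + seen);
        Files.delete(tmp);
    }
}
```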
I once rewrote a process that was reading a large XML file from a database record, transforming it with a DOM parser, then zipping the result, adding a header, and sending it to a remote client. The way it was implemented, it actually kept six copies of the file in memory in various forms, so as the files started getting larger, the thing became a huge memory hog. Through judicious use of IO streams (and a change to a SAX parser), I was able to read the file, transform, and send it with never more than a single buffer-full of it in memory at any given time.
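The zip-and-send part of that pipeline looks roughly like the following — a minimal sketch, not the actual code from that project; the class and method names are made up, and the real version also wrote a header and used a SAX transform upstream:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class StreamingZipper {

    // Copy 'in' to 'out', gzip-compressing on the fly. Only one buffer-full
    // of data is ever held in memory, regardless of the input's size —
    // no intermediate byte[] copy of the whole document is created.
    public static void zipStream(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                gzip.write(buffer, 0, read);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] original = "some large XML payload...".getBytes(StandardCharsets.UTF_8);

        // In the real process 'out' would be the socket to the remote client.
        ByteArrayOutputStream zipped = new ByteArrayOutputStream();
        zipStream(new ByteArrayInputStream(original), zipped);

        // Round-trip check: unzipping gives back the original bytes.
        GZIPInputStream unzip =
                new GZIPInputStream(new ByteArrayInputStream(zipped.toByteArray()));
        ByteArrayOutputStream restored = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = unzip.read(buf)) != -1) restored.write(buf, 0, n);

        if (!Arrays.equals(original, restored.toByteArray()))
            throw new AssertionError("round-trip mismatch");
    }
}
```

The key point is that each stage wraps the next as a stream, so data flows through in one pass instead of being fully materialized between stages.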