JavaRanch » Java Forums » Java » I/O and Streams

Efficient IO for large files (as large as 20 to 40 MB)

Ak Rahul
Greenhorn

Joined: Jul 08, 2009
Posts: 21
Hi,
I was wondering if there is an efficient way of doing IO on large files. Generally we use buffered streams, which read/load everything into memory; we then perform the transformation and write the transformed file back. The problem is that if hundreds of users are working on the same file, or on different large files, concurrently, loading every file to perform its transformation can take a lot of memory. I thought of using file streams, but the reads/writes from disk can slow down the entire processing. I was wondering if there is an efficient way of doing this in memory itself. Any help would be appreciated.
James Sabre
Ranch Hand

Joined: Sep 07, 2004
Posts: 781

Ak Rahul wrote: Hi,
Generally we use buffered streams, which read/load everything into memory.


Not true. A buffered stream holds some of a file/stream in a buffer in memory, but not usually the whole stream content, unless the stream is shorter than the buffer size. If the whole stream content ends up in memory, then you must have allocated that space and filled it yourself.
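To illustrate the point (this sketch is mine, not from the original posts): you can transform a stream of any size using only a fixed-size buffer, so memory use stays bounded at the buffer size rather than the file size. The class and method names here (`StreamTransform`, `transform`) and the uppercasing transform are made up for the example; the asker's real transformation would go in the loop body.

```java
import java.io.*;

public class StreamTransform {

    // Buffer size is the memory bound per stream, regardless of file size.
    static final int BUF_SIZE = 8 * 1024;

    // Copy 'in' to 'out' one chunk at a time, applying a per-byte
    // transformation (here: ASCII uppercasing, purely as a placeholder).
    static void transform(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[BUF_SIZE];
        int n;
        while ((n = in.read(buf)) != -1) {
            for (int i = 0; i < n; i++) {
                buf[i] = (byte) Character.toUpperCase(buf[i]);
            }
            out.write(buf, 0, n);
        }
    }

    public static void main(String[] args) throws IOException {
        // In-memory streams stand in for FileInputStream/FileOutputStream
        // so the sketch runs without any file on disk.
        byte[] data = "hello large file".getBytes("US-ASCII");
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        transform(new ByteArrayInputStream(data), sink);
        System.out.println(sink.toString("US-ASCII"));
    }
}
```

With real files you would wrap the streams in `BufferedInputStream`/`BufferedOutputStream`; the point stands either way: only `BUF_SIZE` bytes are resident per stream, so hundreds of concurrent transformations cost hundreds of buffers, not hundreds of whole files.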


Retired horse trader.