
Efficient IO for large files (as large as 20 to 40 MB)

 
Ak Rahul
Greenhorn
Posts: 21
Hi,
I was wondering if there is a way to do IO efficiently on large files. Generally we use buffered streams, which read/load everything into memory; then we perform the transformation and write the transformed file back. The problem is that if hundreds of users are working on the same file, or on different large files, concurrently, loading them all for transformation can take up a lot of memory. I thought of using file streams, but the disk reads/writes could slow down the whole process. Is there an efficient way of doing this in memory itself? Any help will be appreciated.
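To make it concrete, the pattern I mean is roughly this (the paths are placeholders, and it assumes Java 7's java.nio.file API):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class WholeFileTransform {
    public static void main(String[] args) throws IOException {
        // The whole file is read into the heap at once, so memory cost
        // grows with the file size and with each concurrent user.
        byte[] data = Files.readAllBytes(Paths.get("input.dat"));
        // ... transform the byte array in place ...
        Files.write(Paths.get("output.dat"), data);
    }
}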
 
James Sabre
Ranch Hand
Posts: 781
Java Netbeans IDE Ubuntu
Ak Rahul wrote: Generally we use buffered streams, which read/load everything into memory.


Not true. A buffered stream holds only part of the file/stream in an in-memory buffer; it does not normally hold the whole stream content unless the stream is shorter than the buffer. If the whole stream content ends up in memory, it is because you allocated the space and filled it yourself.
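For example, a copy-and-transform loop over a fixed buffer keeps memory use proportional to the buffer size, not the file size, however many users run it concurrently. A rough sketch (the paths and the per-byte transform are placeholders; assumes Java 7+ for try-with-resources):

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class StreamingTransform {
    public static void transform(String inPath, String outPath) throws IOException {
        byte[] buffer = new byte[8192]; // the only sizeable allocation: 8 KB
        try (BufferedInputStream in = new BufferedInputStream(new FileInputStream(inPath));
             BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(outPath))) {
            int read;
            // Read a chunk, transform it in place, write it out, repeat.
            while ((read = in.read(buffer)) != -1) {
                for (int i = 0; i < read; i++) {
                    // Placeholder transform: uppercase ASCII letters.
                    if (buffer[i] >= 'a' && buffer[i] <= 'z') {
                        buffer[i] -= 32;
                    }
                }
                out.write(buffer, 0, read);
            }
        }
    }
}

This only works when the transformation can be applied chunk by chunk; if it needs to see the whole file at once, you are back to allocating space for the whole content.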
 