
Data Truncated when converting from blob to stream to char buffer for huge files

aadhar sharma
Ranch Hand

Joined: Oct 09, 2006
Posts: 38
I am losing data when I try to convert a stream to a CharBuffer, as the output below shows.

The code looks fine to me, and it works 99% of the time when the file is small; however, the data is truncated for huge files (roughly 350,000 lines or more).

The behaviour is consistent: the file is cut off at the same character every single time.

Do CharBuffer and ByteBuffer have a size limitation? If so, is there a better way to rewrite this code so that it still returns a CharBuffer?



Any help appreciated.


InputStreamReader reader = null;
CharBuffer cbuf = null;

// get the blob to access its length
Blob blob = getBlob(attachment);
int capacity = (int) blob.length();
System.out.println("Capacity is this " + capacity);
System.out.println("Capacity is this long " + blob.length());
Log.info(this, "BinaryAttachmentManager getBufferredReader() allocating " + capacity + " buffer with characterset " + charsetName);
cbuf = CharBuffer.allocate(capacity);

BufferedInputStream bis = new BufferedInputStream(blob.getBinaryStream());
if (null != charsetName) {
    reader = new InputStreamReader(bis, charsetName);
} else {
    reader = new InputStreamReader(bis);
}

long start = System.currentTimeMillis();
Log.info(this, "BinaryAttachmentManager getBufferredReader() reading Blob bytes into CharBuffer");
int count = reader.read(cbuf);
cbuf.limit(count); // truncate any unused space
Log.info(this, "BinaryAttachmentManager getBufferredReader() read " + count + " characters, elapsed=" + (System.currentTimeMillis() - start) + " msec");

return cbuf;
---------------------------------------- Output ----------------------------------------

06:34:51,190 INFO BinaryAttachmentManager,main:46 - BinaryAttachmentManager getBufferredReader() allocating 16666465 buffer with characterset null
06:34:51,331 INFO BinaryAttachmentManager,main:46 - BinaryAttachmentManager getBufferredReader() reading Blob bytes into CharBuffer
06:34:53,394 INFO BinaryAttachmentManager,main:46 - BinaryAttachmentManager getBufferredReader() read 16654336 characters, elapsed=2063 msec

-----------------------------------------------------------------------------------------




Thanks and Regards
Paul Clapham
Bartender

Joined: Oct 14, 2005
Posts: 18541

I don't see anything in the API documentation for that read(CharBuffer) method which says that it guarantees to fill the whole buffer. Quite the contrary; it returns the number of characters that it did put into the buffer. It's your responsibility to write a loop which reads until the method returns -1, if you want to get all of the data in the reader.
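
For example, here is a minimal sketch of that loop (assuming a Reader and a pre-allocated CharBuffer as in your code; the readFully name and the helper-method shape are mine, not from your class):

import java.io.IOException;
import java.io.Reader;
import java.nio.CharBuffer;

// Keep calling read() until it reports end-of-stream with -1.
// A single read(CharBuffer) may stop early, so one call is not enough.
static CharBuffer readFully(Reader reader, CharBuffer cbuf) throws IOException {
    while (cbuf.hasRemaining()) {
        int n = reader.read(cbuf);
        if (n == -1) {
            break; // end of stream
        }
    }
    cbuf.flip(); // limit = characters actually read, position = 0
    return cbuf;
}

Note that flip() also resets the position to 0, which your limit(count) alone does not do, so the caller can start consuming the buffer immediately. The hasRemaining() guard matters too: if the buffer fills up before end-of-stream, read() returns 0 rather than -1, and a loop without the guard would spin forever.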
aadhar sharma
Ranch Hand

Joined: Oct 09, 2006
Posts: 38

Appreciate it, Paul.
 