getBlob throws OutOfMemoryError ?

Jo Ho
Greenhorn

Joined: Mar 22, 2003
Posts: 2
I am using a MySQL database with the org.gjt.mm.mysql.Driver JDBC driver. I have a result set from the db, but when I call getBlob it throws a java.lang.OutOfMemoryError. Has anyone seen this before, or have any suggestions on what may cause it?
Thanks
Greg Charles
Sheriff

Joined: Oct 01, 2001
Posts: 2840
It's most likely caused, as you might expect, by running out of memory. How big a blob are you retrieving? Assuming you have enough memory on your machine, you can increase the amount allocated to Java using the -Xms (for initial memory) and -Xmx (for maximum memory) switches to the JVM.
E.g.:
java -Xms64m -Xmx128m MyProgram
(Note there is no space between the switch and its value; "java -Xms 64m" is rejected by the JVM.)
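Raising the heap only goes so far, though; the more robust fix is to stream the blob in chunks instead of pulling it into one giant byte[]. A minimal sketch of the chunked-copy loop — in real code the InputStream would come from rs.getBinaryStream("data") (column name is made up here), which is simulated below with in-memory streams:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class BlobStreamDemo {

    // Copy an InputStream to an OutputStream in small chunks, so the
    // whole blob never has to fit in memory at once.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for rs.getBinaryStream("data"): a 1 MB fake blob.
        byte[] fakeBlob = new byte[1 << 20];
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(fakeBlob), sink);
        System.out.println(copied);
    }
}
```

In a real program the sink would typically be a FileOutputStream, so only the 8 KB buffer ever lives on the heap.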
jay vas
Ranch Hand

Joined: Aug 30, 2005
Posts: 407
Hi Guys :

I have a project with VERY large blobs (up to 50 MB) and MySQL 5.0.

It is very important that they are put in the database, because their relationship to other domain elements is important to maintain (so storing a pointer to the file on disk is absolutely not an option).

Will I run into any issues with JDBC? I'm using it with 1-2 MB files right now with no problems... but is there a client-side limit to how much data JDBC allows to be sent out from the JVM?

I know that on the server side you can increase the "max_allowed_packet" variable to very high numbers, but are there limitations as far as what you can do from the client side?

Are Blobs implemented with InputStreams? If so, JDBC could handle arbitrarily large blob content, since InputStreams can be arbitrarily large.

Thanks !

Jay
stu derby
Ranch Hand

Joined: Dec 15, 2005
Posts: 333
Yes, on most databases, you can work with a Blob of unbounded size by streaming it in and out, never having the full thing in memory...

The Java API is pretty explicit on how it's supposed to work; this is clearly reflected in the Blob Javadocs:
http://java.sun.com/j2se/1.5.0/docs/api/java/sql/Blob.html

It wouldn't surprise me if some drivers somewhere don't fully conform to the spec and "cheat" and really load the full Blob into memory, but most of the major DBs do it right. You might test this early in your project though...
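The streaming contract from those Javadocs can be exercised without a database, since the JDK ships an in-memory java.sql.Blob implementation (javax.sql.rowset.serial.SerialBlob). A sketch of the pattern — in real code the Blob would come from rs.getBlob("data") (hypothetical column), and the point is to prefer getBinaryStream() over getBytes(1, (int) blob.length()), which materializes the whole value in one array:

```java
import java.io.InputStream;
import java.sql.Blob;
import javax.sql.rowset.serial.SerialBlob;

public class BlobApiDemo {
    public static void main(String[] args) throws Exception {
        // SerialBlob is a JDK in-memory stand-in; a driver's Blob from
        // rs.getBlob("data") exposes the same java.sql.Blob interface.
        Blob blob = new SerialBlob(new byte[100_000]);

        // Read the blob in fixed-size chunks via its stream instead of
        // asking for the full contents as one byte[].
        long total = 0;
        try (InputStream in = blob.getBinaryStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                total += n;
            }
        }
        System.out.println(total);
    }
}
```

Whether the driver actually streams lazily behind that interface is exactly the conformance question raised above, so it's worth testing with a realistically sized blob early on.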
Feyna Reendrina
Greenhorn

Joined: Apr 18, 2006
Posts: 11
I have a similar problem...

I try to fetch large records from 3 tables, 100 records from each table... so I want to get 1000.000 records, but there is an exception:

Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space

I use NetBeans for the GUI...

I don't know where I must put the command... can you tell me??

Thanks
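If "the command" means the -Xms/-Xmx switches mentioned above: on the command line they go between `java` and the main class name. A sketch (class name is made up; pick heap sizes that fit your machine):

```shell
# JVM flags must come before the class to run, not after it
java -Xmx256m -cp . MyGuiApp
```

When launching from inside an IDE like NetBeans, the same flags are usually entered in the project's run configuration as "VM Options" rather than on a command line.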