JavaRanch » Java Forums » Databases » JDBC and Relational Databases

getBlob throws OutOfMemoryError ?

Jo Ho

Joined: Mar 22, 2003
Posts: 2
I am using a MySQL database with the JDBC driver. I have a result set from the database, but when I call getBlob on it, it throws a java.lang.OutOfMemoryError. Has anyone seen this before, or does anyone have suggestions on what might cause it?
Greg Charles

Joined: Oct 01, 2001
Posts: 2963

It's most likely caused, as you might expect, by running out of memory. How big a blob are you retrieving? Assuming your machine has enough memory, you can increase the amount allocated to Java with the -Xms (initial heap) and -Xmx (maximum heap) switches to the JVM. Note there is no space between the switch and its value:
java -Xms64m -Xmx128m MyProgram
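A quick way to confirm the switches took effect (an editor's sketch, not from the original post; Runtime.maxMemory() is standard Java):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Runtime.maxMemory() reports the JVM's heap ceiling (roughly the
        // -Xmx value), in bytes; rerun with different -Xmx values to see it change.
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("max heap ~ " + maxMb + " MB");
    }
}
```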
jay vas
Ranch Hand

Joined: Aug 30, 2005
Posts: 407
Hi guys:

I have a project with VERY large blobs (up to 50 MB) and MySQL 5.0.

It is very important that they are stored in the database, because their relationship to other domain elements must be maintained (so storing a pointer to a file on disk is absolutely not an option).

Will I run into any issues with JDBC? I'm using it with 1-2 MB files right now with no problems, but is there a client-side limit to how much data JDBC allows to be sent out from the JVM?

I know that on the server side you can raise the max_allowed_packet variable to very high values, but are there limitations on the client side?

Are Blobs implemented with InputStreams? If so, JDBC could handle an arbitrarily large byte[] as blob content, since InputStreams can be arbitrarily large.

Thanks !
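The reasoning above is sound: a driver that reads from the stream in fixed-size chunks never needs the whole value in memory at once. Here is a standalone sketch of that chunked copy in pure java.io (no driver involved; with a real driver, the JDBC entry point for writes would be PreparedStatement.setBinaryStream, which consumes the stream the same way):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedCopy {
    // Copies 'in' to 'out' through a fixed 8 KB buffer, so the loop's
    // memory use is constant no matter how large the source is.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[5 * 1024 * 1024]; // stand-in for a large blob
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println("copied " + copied + " bytes");
    }
}
```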

stu derby
Ranch Hand

Joined: Dec 15, 2005
Posts: 333
Yes, on most databases you can work with a Blob of unbounded size by streaming it in and out, never having the full thing in memory.

The Java API is pretty explicit about how this is supposed to work; it's clearly reflected in the java.sql.Blob Javadocs.

It wouldn't surprise me if some drivers out there don't fully conform to the spec and "cheat" by loading the full Blob into memory, but most of the major databases do it right. You might test this early in your project, though...
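The streaming read side of that Blob API can be sketched without a database by using the JDK's in-memory implementation, javax.sql.rowset.serial.SerialBlob (with a real driver you would obtain the Blob from ResultSet.getBlob instead; note SerialBlob itself holds its bytes in memory, so it only demonstrates the API shape, not the driver's lazy fetching):

```java
import java.io.InputStream;
import java.sql.Blob;
import javax.sql.rowset.serial.SerialBlob;

public class BlobStreamDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for rs.getBlob("data") from a real ResultSet.
        byte[] payload = new byte[1_000_000];
        Blob blob = new SerialBlob(payload);

        // Read through a stream with a small fixed buffer. With a
        // driver-backed Blob, this loop never holds the full value.
        long total = 0;
        try (InputStream in = blob.getBinaryStream()) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                total += n;
            }
        }
        System.out.println("streamed " + total + " bytes");
    }
}
```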
Feyna Reendrina

Joined: Apr 18, 2006
Posts: 11
I have a similar problem...

I am trying to fetch 3 large tables, 100 records from each table... in total I want to get 1,000,000 records, but there is an exception:

Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space

I am using NetBeans for the GUI.

I don't know where I should put the command (the -Xms/-Xmx switches mentioned above). Can you tell me?
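To answer where the command goes: -Xms and -Xmx are options to the java launcher, not something you put in your Java source or SQL. From a terminal it looks like this (MyApp is a placeholder for your main class); in NetBeans, the same switches go under Project Properties > Run > VM Options:

```shell
# JVM options come before the main class name; there is no space
# between the switch and its value. "MyApp" is a placeholder.
java -Xms64m -Xmx512m MyApp
```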
