
getBlob throws OutOfMemoryError?

 
Jo Ho
Greenhorn
Posts: 2
I am using a MySQL database with the org.gjt.mm.mysql.Driver JDBC driver. I have a result set from the database, but when I call getBlob it throws a java.lang.OutOfMemoryError. Has anyone seen this before, or does anyone have suggestions on what may cause it?
Thanks
 
Greg Charles
Sheriff
Posts: 2985
It's most likely caused, as you might expect, by running out of memory. How big a blob are you retrieving? Assuming you have enough memory on your machine, you can increase the amount allocated to Java using the -Xms (initial heap size) and -Xmx (maximum heap size) switches to the JVM. Note that there is no space between the switch and its value.
E.g.,
java -Xms64m -Xmx128m MyProgram
 
jay vas
Ranch Hand
Posts: 407
Hi guys:

I have a project with VERY large blobs (up to 50 MB) and MySQL 5.0.

It is very important that they are stored in the database, because their relationship to other domain elements must be maintained (so storing a pointer to a file on disk is absolutely not an option).

Will I run into any issues with JDBC? I'm using it with 1-2 MB files right now with no problems, but is there a client-side limit on how much data JDBC allows to be sent out from the JVM?

I know that on the server side you can increase the max_allowed_packet variable to very high values, but are there limitations on what you can do from the client side?

Are Blobs implemented with InputStreams? If so, JDBC could handle an arbitrarily large byte[] as blob content, since InputStreams can be arbitrarily large.

Thanks!

Jay
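For reference, what I'm doing now with the small files looks roughly like this. It's only a sketch: the `files` table, its columns, and the `insertFile` helper are all made up for illustration, but the idea is to hand the driver a stream rather than building one big byte[]:

```java
import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class BlobInsert {
    // Insert a file into a BLOB column by giving the driver an InputStream,
    // so the file contents never need to exist as a single byte[] on the heap.
    // Table and column names (files, id, data) are invented for this sketch.
    public static void insertFile(Connection conn, long id, File file) throws Exception {
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO files (id, data) VALUES (?, ?)");
        InputStream in = new BufferedInputStream(new FileInputStream(file));
        try {
            ps.setLong(1, id);
            // The driver reads the stream in chunks; with MySQL, the server's
            // max_allowed_packet still has to be at least the blob size.
            ps.setBinaryStream(2, in, (int) file.length());
            ps.executeUpdate();
        } finally {
            in.close();
            ps.close();
        }
    }
}
```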
 
stu derby
Ranch Hand
Posts: 333
Yes, on most databases you can work with a Blob of unbounded size by streaming it in and out, never holding the whole thing in memory.

The Java API is pretty explicit about how it's supposed to work; this is clearly reflected in the Blob Javadocs:
http://java.sun.com/j2se/1.5.0/docs/api/java/sql/Blob.html

It wouldn't surprise me if some drivers somewhere don't fully conform to the spec and "cheat" by loading the full Blob into memory, but most of the major DBs do it right. You might test this early in your project, though.
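On the read side, something along these lines keeps memory use at one small buffer regardless of blob size. It's a sketch, not tested against your driver, and the column name "data" is invented:

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.ResultSet;

public class BlobStream {
    // Copy a stream in fixed-size chunks; memory use stays at one 8 KB
    // buffer no matter how large the source is. Returns bytes copied.
    public static long copy(InputStream in, OutputStream out) throws java.io.IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    // Stream a BLOB column straight out of the ResultSet to any output,
    // e.g. a FileOutputStream. The column name "data" is made up here.
    public static long writeBlobTo(ResultSet rs, OutputStream out) throws Exception {
        InputStream in = rs.getBinaryStream("data");
        try {
            return copy(in, out);
        } finally {
            in.close();
        }
    }
}
```

Using ResultSet.getBinaryStream (or Blob.getBinaryStream) this way is what lets a conforming driver avoid materializing the whole value, which is exactly what getBytes() or a naive getBlob() usage cannot do.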
 
Feyna Reendrina
Greenhorn
Posts: 11
I have a similar problem.

I'm trying to retrieve large amounts of data from 3 tables, 100 records from each table at a time; in total I want to get 1,000,000 records, but I get this exception:

Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space

I'm using NetBeans for the GUI.

I don't know where I need to put that command (the -Xms/-Xmx switches). Can you tell me?

Thanks
 