I am using Hibernate and trying to insert a file of about 200 MB into a DB2 database (column type Blob). The corresponding Java attribute is of type byte[]. Things work fine with small files, but with big files the insert takes too much time. I have to read the file from a remote server and then insert it. Can anyone please tell me what the good practice is for doing this? The document has to be saved in the database, as that is a client requirement.
Bipra De wrote:I am using Hibernate and trying to insert a file of about 200 MB into a DB2 database (column type Blob). The corresponding Java attribute is of type byte[]. Things work fine with small files, but with big files the insert takes too much time...
That's incredibly vague. How much time is 'too much time'?
For example: how long does it take simply to copy a 200 MB file from your remote server?
If that alone takes "too long", then nothing you do in Java or Hibernate is going to help you. Once you know that baseline, you'll also be in a much better position to determine the actual overhead of the insert itself.
First measure; then optimize.
Bats fly at night, 'cause they aren't we. And if we tried, we'd hit a tree -- Ogden Nash (or should've been).
Joined: Nov 06, 2011
Regarding the file transfer, I think that can be accomplished very well using FTP, and that won't take much time either. It's taking almost 20-25 minutes to insert the file into the database using Hibernate, even when the file is on my local system and the database is on a different server.
Generally, with big Blobs, you are better off using java.sql.Blob instead of byte[]. Hibernate provides utility methods that can stream any InputStream into a Blob.
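To illustrate the streaming idea, here's a minimal, self-contained sketch (the `copyStream` helper and the sizes are mine, not from this thread) of moving data in fixed-size chunks so memory use stays constant no matter how big the file is. The Hibernate call shown in the comment, `session.getLobHelper().createBlob(in, length)`, is the kind of utility method I mean; check it against your Hibernate version's docs.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class BlobStreaming {

    // Copy in fixed-size chunks: memory use stays at one buffer's worth,
    // regardless of how large the file is.
    static long copyStream(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100_000];           // stand-in for the remote file
        InputStream in = new ByteArrayInputStream(data);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long copied = copyStream(in, out);
        System.out.println(copied);                // prints 100000

        // With Hibernate, instead of materialising a 200 MB byte[], map the
        // column as java.sql.Blob and hand the InputStream straight over:
        //   Blob blob = session.getLobHelper().createBlob(remoteStream, fileLength);
        //   document.setContent(blob);
        //   session.save(document);
    }
}
```

The point of the sketch is that the whole document is never in memory at once; only an 8 KB buffer is.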
However, having said that, I agree with Winston: first measure, then optimize. One thing you need to look at is how GC is behaving. Generally speaking, because of the way GC works, Java apps become slow before they actually run out of memory; many "performance issues" are really an app teetering on the edge of an OutOfMemoryError. So it's important to check GC when you observe that loading lots of data into memory has drastically reduced the performance of the system. I'm not saying that is the only issue, just another data point to look at.
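One way to do that GC check (the flags below are standard HotSpot options; `loader.jar` is a placeholder for your own application) is to run the loader with GC logging switched on and look for long or back-to-back full collections while the Blob is being built:

```shell
# Java 9 and later: unified GC logging
java -Xlog:gc*:file=gc.log -jar loader.jar

# Java 8 and earlier
java -verbose:gc -XX:+PrintGCDetails -jar loader.jar

# And if you do hit OOM, dump the heap so you can see what filled it
java -XX:+HeapDumpOnOutOfMemoryError -jar loader.jar
```

If the log shows the JVM spending most of its time in full GCs during the insert, the slowdown is memory pressure, not Hibernate or the network.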
Is there any API or something that inserts the data in chunks into DB2? I know inserting a big Blob can cause performance issues, but here I just need to get the value from the web service on the remote application and insert it into my database.
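As far as I know there's no special chunked-insert API for DB2 as such, but plain JDBC already streams for you: `PreparedStatement.setBinaryStream` takes an InputStream and the driver reads and sends it in pieces. A minimal sketch, with invented table and column names (`documents`, `id`, `content`):

```java
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class ChunkedBlobInsert {

    // Build the insert statement; the table and column names here
    // are placeholders for illustration only.
    static String insertSql(String table, String idCol, String blobCol) {
        return "INSERT INTO " + table + " (" + idCol + ", " + blobCol + ") VALUES (?, ?)";
    }

    // Hand the stream to the driver: it reads and sends the data in
    // chunks, so the 200 MB document never needs to fit in a byte[].
    static void insertDocument(Connection conn, int id,
                               InputStream content, long length) throws SQLException {
        try (PreparedStatement ps =
                 conn.prepareStatement(insertSql("documents", "id", "content"))) {
            ps.setInt(1, id);
            ps.setBinaryStream(2, content, length);
            ps.executeUpdate();
        }
    }
}
```

If the web service gives you an InputStream, you can pass it straight through here without ever buffering the whole document.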
The Blob's OutputStream is provided by the JDBC driver, and Hibernate doesn't have any control over how much buffering the JDBC driver does on its own end. We have tested this with extremely large Blobs (4 GB... yeah, don't ask why) on Oracle, and I'm pretty sure Oracle's JDBC driver pretty much just sends each chunk straight to the Oracle server. I don't know what the DB2 driver will do.