JavaRanch » Java Forums » Databases » JDBC
issue with jdbc batch

anagha patankar
Ranch Hand

Joined: Dec 26, 2005
Posts: 53
Hello,
I am trying to do a batch insert into a database table.
However, there are no errors, but the total number of records is much less than expected
(I expect 100,000 records to be inserted in the batch).
Printing the result of "int insertResult[] = pStmt.executeBatch();" shows 100,000,
but the database doesn't show that many records.

Here is the sample code :


The table is a simple three field table .
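The code snippet itself did not survive in this copy of the thread. As an illustration only, a single-batch insert of the shape described (three fields, one `executeBatch()`, commit at the end) typically looks something like the sketch below; the table and column names here are assumptions, not from the original post:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Collections;

public class SingleBatchInsert {

    // Builds the parameterized INSERT statement for the given table and columns.
    static String insertSql(String table, String... cols) {
        String placeholders = String.join(", ", Collections.nCopies(cols.length, "?"));
        return "INSERT INTO " + table + " (" + String.join(", ", cols)
                + ") VALUES (" + placeholders + ")";
    }

    // Adds every row to one batch and commits once at the end.
    // "sample_table" and its columns are hypothetical names.
    static int insertAll(Connection conn, String[][] rows) throws SQLException {
        conn.setAutoCommit(false);
        String sql = insertSql("sample_table", "col1", "col2", "col3");
        try (PreparedStatement pStmt = conn.prepareStatement(sql)) {
            for (String[] row : rows) {
                pStmt.setString(1, row[0]);
                pStmt.setString(2, row[1]);
                pStmt.setString(3, row[2]);
                pStmt.addBatch();
            }
            int insertResult[] = pStmt.executeBatch();
            conn.commit(); // easy to forget: with auto-commit off, a missing commit() loses rows
            return insertResult.length;
        }
    }
}
```

Note that `insertResult.length` is just the number of statements batched, not proof they all reached the table; a missing `commit()` with auto-commit off is one common cause of "executeBatch() reports 100,000 but the table shows fewer".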

My queries are:
1 > Is there a limit to the number of records that can be added in a batch?
2 > Is there a limit on the database side to the number of records it can commit?

Any help will be appreciated.

Thanks,
-anagha
Ulf Dittmer
Marshal

Joined: Mar 22, 2005
Posts: 39548
I don't think there's a limit to the batch size, but I wouldn't run a transaction with this many statements. When I need to perform batch inserts I generally commit every 1000 items or so. This ensures that the transaction doesn't become too big, which eats up memory on the client and takes a toll on the DB as well, since it needs to keep track of the rollback information.
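A sketch of that chunked-commit pattern (commit every 1000 rows; the table and column names are assumed for illustration):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class ChunkedBatchInsert {
    static final int BATCH_SIZE = 1000;

    // Pure helper: how many commits a chunked load of totalRows rows needs.
    static int commitCount(int totalRows, int batchSize) {
        return (totalRows + batchSize - 1) / batchSize;
    }

    // Inserts all rows, committing every BATCH_SIZE rows to bound transaction size.
    static void insertAll(Connection conn, String[][] rows) throws SQLException {
        conn.setAutoCommit(false);
        String sql = "INSERT INTO sample_table (col1, col2, col3) VALUES (?, ?, ?)"; // assumed schema
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.setString(3, row[2]);
                ps.addBatch();
                if (++pending == BATCH_SIZE) {
                    ps.executeBatch();
                    conn.commit();   // keep each transaction small
                    pending = 0;
                }
            }
            if (pending > 0) {       // flush the final partial batch
                ps.executeBatch();
                conn.commit();
            }
        }
    }
}
```

With 100,000 rows and a batch size of 1000 this performs 100 commits, so no single transaction ever holds more than 1000 pending rows.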


anagha patankar
Ranch Hand

Joined: Dec 26, 2005
Posts: 53
Thanks, Ulf, for your response.
When you suggest a commit after every 1000 records, a question comes to mind:

If my entire transaction (100,000 records) must be atomic, committing every 1000 records may not satisfy that requirement.

Queries:
1 > How do I handle such a situation?
2 > Could you kindly elaborate on why keeping a batch size of 1000 is better?

Thanks once again for your response.

Regards,
-anagha
Ulf Dittmer
Marshal

Joined: Mar 22, 2005
Posts: 39548
Yes, rolling back all the inserted rows isn't so easy any more. You need to be able to identify all the previously inserted (and committed) rows. How you might do that (and it may not actually be possible) depends on the nature of the data you work with, and the table attributes they end up in.

It's advantageous for the reasons I mentioned: memory usage on the client, and resource usage in the DB. Compare inserting 1000 rows a thousand times with inserting a million rows once and you'll see the difference.
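One common workaround (not from this thread, and only applicable if you can change the schema) for wanting all-or-nothing semantics across chunked commits is to stamp every row with a load id, then delete the already-committed rows for that load if any chunk fails. A hedged sketch, with an assumed `load_id` column on the same hypothetical table:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.UUID;

public class TaggedLoad {

    // A unique id stamped on every row of one load, so a failed load can be cleaned up.
    static String newLoadId() {
        return UUID.randomUUID().toString();
    }

    // Chunked insert; if anything fails, delete all rows committed under this load id.
    static void load(Connection conn, String loadId, String[][] rows) throws SQLException {
        conn.setAutoCommit(false);
        String sql = "INSERT INTO sample_table (col1, col2, col3, load_id) VALUES (?, ?, ?, ?)"; // assumed schema
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.setString(3, row[2]);
                ps.setString(4, loadId);
                ps.addBatch();
                if (++pending == 1000) {
                    ps.executeBatch();
                    conn.commit();
                    pending = 0;
                }
            }
            if (pending > 0) {
                ps.executeBatch();
                conn.commit();
            }
        } catch (SQLException e) {
            conn.rollback(); // undo the uncommitted tail
            try (PreparedStatement del = conn.prepareStatement(
                    "DELETE FROM sample_table WHERE load_id = ?")) {
                del.setString(1, loadId);
                del.executeUpdate();
                conn.commit(); // remove the chunks that were already committed
            }
            throw e;
        }
    }
}
```

This gives eventual all-or-nothing behavior rather than true atomicity: readers can observe the partial load between the failure and the cleanup delete.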