Error inserting data into database after a few successful inserts

Ranch Hand
Posts: 763

I'm trying to insert 3,000 records from a CSV file into a MySQL database using JDBC, but after roughly 1,300 records are inserted successfully, it fails with a "connection refused" error.

Stack trace:

Posts: 4335
jQuery Eclipse IDE Java
Usually this happens because of a timeout value somewhere in the connection between the Java process and the server. First, I'd check the MySQL configuration and increase the timeouts there. Then I'd examine the web server for similar timeouts.
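If the MySQL side turns out to be the culprit, the relevant server variables can be raised in its configuration file. The values below are purely illustrative, not recommendations; check what your server is actually using first:

```ini
# my.cnf -- illustrative values only; inspect the current settings
# with: SHOW VARIABLES LIKE '%timeout%';
[mysqld]
wait_timeout      = 28800   # seconds an idle connection is kept open
net_read_timeout  = 120     # seconds to wait for data from the client
net_write_timeout = 120     # seconds to wait while writing to the client
```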

Also, how are you grouping your inserts? One record per insert statement, multiple records per insert statement, or all records within a single insert statement?

You can either find the timeout value and increase it, or insert the records in batches. In my experience, the sweet spot for network bandwidth and database performance is about 500 records per insert statement (though keep that value configurable). In larger-scale environments this usually prevents the timeouts from ever being hit. In one case, inserting data on a shared server, I saw performance like this for bulk inserts:

1 record per insert statement: 1 hour to complete
500 records per insert statement: 35 seconds to complete
All records (50k) per insert statement: never completed due to web server timeout

In short, if bulk inserts are a common occurrence, they require more thought and planning than ordinary single-row inserts.
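The batching approach above can be sketched as follows. The table name, columns, and batch size are hypothetical; this is not a library API, just the two helpers you'd need: one to build a multi-row `INSERT ... VALUES (?, ?), (?, ?), ...` statement, and one to split the records into configurable batches.

```java
import java.util.ArrayList;
import java.util.List;

public class BulkInsertSketch {

    // Build a multi-row parameterized INSERT for one batch.
    // e.g. buildMultiRowInsert("records", new String[]{"a", "b"}, 2)
    //      -> "INSERT INTO records (a, b) VALUES (?, ?), (?, ?)"
    static String buildMultiRowInsert(String table, String[] columns, int rowCount) {
        String placeholders = "(" + "?, ".repeat(columns.length - 1) + "?)";
        StringBuilder sql = new StringBuilder(
                "INSERT INTO " + table + " (" + String.join(", ", columns) + ") VALUES ");
        for (int i = 0; i < rowCount; i++) {
            sql.append(i == 0 ? "" : ", ").append(placeholders);
        }
        return sql.toString();
    }

    // Split the full record list into batches of at most batchSize
    // (500 was the sweet spot described above -- keep it configurable).
    static <T> List<List<T>> partition(List<T> records, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += batchSize) {
            batches.add(records.subList(i, Math.min(i + batchSize, records.size())));
        }
        return batches;
    }

    // Usage sketch (assumes an open java.sql.Connection):
    //   for (List<Row> batch : partition(rows, 500)) {
    //       String sql = buildMultiRowInsert("records", columns, batch.size());
    //       try (PreparedStatement ps = conn.prepareStatement(sql)) {
    //           int p = 1;
    //           for (Row r : batch)
    //               for (Object v : r.values()) ps.setObject(p++, v);
    //           ps.executeUpdate();
    //       }
    //   }
}
```

Each batch becomes one round trip to the server, which is where the bandwidth and timeout savings come from.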