Hi guys, not sure whether this should be here or in the Tomcat forum. I'm sure someone will move it if it's in the wrong place.
My problem is that I am trying to import a large number of rows (10,000) into a MySQL database. If I import, say, 1,000 it seems to work OK, but with 10,000 I get connection errors (sorry, I'm at home and can't lay my hands on a stack trace, but it says something like "connect is already in use"). I'm using straight JDBC - no fussing around with connection pooling or what have you at the moment.
I have a getConnection method that supplies all the connections, and it is synchronized, which I would have thought would prevent this sort of problem.
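For reference, here is a minimal sketch of what such a synchronized factory method might look like (the class name, URL, and credentials are assumptions, not the poster's actual code). One thing worth noting: synchronized only serializes callers of the method - each call still opens a brand-new connection, so nothing here limits how many connections are alive at once.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class ConnectionSupplier {
    // Hypothetical connection details - adjust URL, user, and password.
    private static final String URL = "jdbc:mysql://localhost:3306/testdb";

    // synchronized prevents two threads entering at the same time, but it
    // does NOT cap the total number of open connections: every call still
    // creates a fresh one, and connections that are never closed pile up.
    public static synchronized Connection getConnection() throws SQLException {
        return DriverManager.getConnection(URL, "user", "password");
    }
}
```

If connections returned by a method like this are not closed after each batch, MySQL's connection limit can be exhausted partway through a large import, which would match the symptom of later rows failing.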
My thought is that the JVM might be trying to be clever and spawning multiple threads to access the database simultaneously.
Any diagnostic thoughts? Any suggested solutions? Any better approaches that I could be using? Or should I be asking this in the Tomcat forum?
Alister, the large batch insert is probably exceeding some buffer size or timeout; it could be a MySQL setting. Is it necessary to do so many updates in one shot? If you can do them in batches of 1,000, you could avoid this problem.
I agree that it makes more sense here than in the Tomcat forum, because it is a JDBC question.
Ah, sorry - I should have mentioned that I batch them up in lots of 100. Perhaps I need to change the number of available connections in MySQL. Hmm.
But what's odd is that out of the 10k, roughly 8k work OK - there is just a problem with some of them. And it isn't the first 8k that work: the errors happen in the last half of the upload, where some rows fail and then others succeed.
Just for the sake of interest: Here is the error I'm getting:
Hurrah! I've fixed it.
For anyone else with similar problems, the solution is to use the addBatch() method on PreparedStatement. I also set my batch size to 1000.
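A minimal sketch of the addBatch()/executeBatch() approach described above - the table name, columns, URL, and credentials are hypothetical, not taken from the thread:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertExample {

    private static final int BATCH_SIZE = 1000;

    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details - adjust URL, user, and password.
        String url = "jdbc:mysql://localhost:3306/testdb";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO items (name, quantity) VALUES (?, ?)")) {

            conn.setAutoCommit(false);          // commit once per batch, not per row
            for (int i = 0; i < 10_000; i++) {
                ps.setString(1, "item-" + i);
                ps.setInt(2, i);
                ps.addBatch();                  // queue the row instead of executing it

                if ((i + 1) % BATCH_SIZE == 0) {
                    ps.executeBatch();          // send 1000 queued rows in one round trip
                    conn.commit();
                }
            }
            ps.executeBatch();                  // flush any remaining queued rows
            conn.commit();
        }
    }
}
```

The key point is that one Connection and one PreparedStatement are reused for the whole import, so the driver makes far fewer round trips and no extra connections are opened along the way.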