
Multiple resultsets concurrently on a connection

 
Tina Smith
Ranch Hand
I have a Java app that uses a single shared Connection to an Oracle database. At any time, a number of threads may be using this connection to execute stored procedures and read back the results. These procedures typically take milliseconds to execute and read. There is one procedure (which is close to the most heavily used) that may take up to a couple of hundred ms to execute and read results. I've been told that in certain other languages (.NET), executing two procedures interleaved (so execute1, execute2, read1, read2) causes exceptions to be thrown. I seem to remember something about only a single active resultset being allowed in .NET...
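
Roughly, each call looks like the sketch below (the procedure name and parameters are made up, but the pattern is the same: every thread goes straight at the shared connection with no locking):

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Types;

public class SharedConnectionCaller {
    private final Connection sharedConnection; // the single connection shared by all threads

    public SharedConnectionCaller(Connection sharedConnection) {
        this.sharedConnection = sharedConnection;
    }

    // Called concurrently from many threads, with no locking around the connection.
    public int callLookup(int id) throws SQLException {
        try (CallableStatement cs = sharedConnection.prepareCall("{call lookup_proc(?, ?)}")) {
            cs.setInt(1, id);
            cs.registerOutParameter(2, Types.INTEGER);
            cs.execute();
            return cs.getInt(2);
        }
    }
}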

I have no concurrency controls in place as far as the connection is concerned. Then again, I've never seen any problems occur because of my approach.
So my question for the gurus out there is: is there a potential problem with my usage of the connection?

Thanks
 
Martin Vajsar
Sheriff
Generally, only one operation can be active at a time on a connection to an Oracle database. I'm pretty sure that if you try to execute two stored procedures in parallel (or any two SQL statements, actually), they will be processed serially by the JDBC driver/database. I believe this is true even for processing a resultset, i.e. only one thread can actually use the connection at a time, whether it is executing the procedure or fetching records from the resultset. If it were me, I'd add synchronization around all JDBC calls in your situation, just for peace of mind and because the processing is not going to be parallel anyway.

This might actually reduce the concurrency of your application: the resultset probably prefetches rows, so right now a read from the resultset likely blocks other JDBC calls only when it needs to fetch another batch of records from the database, whereas the solution I propose would synchronize on every read instead of, say, every tenth read from the resultset.
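
Just to illustrate, here is a minimal sketch of that synchronization, assuming the shared Connection itself is used as the lock; the procedure name and the ref-cursor column are made up:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

import oracle.jdbc.OracleTypes;

public class SynchronizedCaller {
    private final Connection sharedConnection;

    public SynchronizedCaller(Connection sharedConnection) {
        this.sharedConnection = sharedConnection;
    }

    // The execute and the entire resultset read happen inside one synchronized
    // block, so no other thread can interleave its own JDBC calls on this connection.
    public List<String> callAndReadAll(int id) throws SQLException {
        List<String> rows = new ArrayList<>();
        synchronized (sharedConnection) {
            try (CallableStatement cs = sharedConnection.prepareCall("{call some_proc(?, ?)}")) {
                cs.setInt(1, id);
                cs.registerOutParameter(2, OracleTypes.CURSOR); // ref cursor OUT parameter
                cs.execute();
                try (ResultSet rs = (ResultSet) cs.getObject(2)) {
                    while (rs.next()) {
                        rows.add(rs.getString(1));
                    }
                }
            }
        }
        return rows;
    }
}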

However, I'd propose a different solution: set up a connection pool. When processing a request, take a connection from the pool, use it to run the procedure and process all the results from the resultset, and then return it to the pool. This would allow you to process as many requests truly in parallel (even on the database side) as there are connections in the pool. This solution is more resource-expensive than your current one (the database connection is the expensive resource), but it seems more robust to me and prevents possible issues with transactionality: if your processing threads use transactions on a shared connection at all, the result is a mess - one thread's commit would commit all the other threads' work too, which could be a serious, and potentially hard to detect, bug.
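
A rough sketch of the pooled version, using HikariCP purely as an example (any connection pool would do; the JDBC URL, pool size and procedure name are placeholders):

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Types;

import com.zaxxer.hikari.HikariDataSource;

public class PooledCaller {
    private final HikariDataSource pool;

    public PooledCaller(String jdbcUrl, String user, String password) {
        pool = new HikariDataSource();
        pool.setJdbcUrl(jdbcUrl);     // e.g. "jdbc:oracle:thin:@//host:1521/service"
        pool.setUsername(user);
        pool.setPassword(password);
        pool.setMaximumPoolSize(10);  // upper bound on truly parallel requests
    }

    // Each request borrows its own connection, does all its work on it, and
    // returns it to the pool when the try-with-resources block ends.
    public int callLookup(int id) throws SQLException {
        try (Connection conn = pool.getConnection();
             CallableStatement cs = conn.prepareCall("{call lookup_proc(?, ?)}")) {
            cs.setInt(1, id);
            cs.registerOutParameter(2, Types.INTEGER);
            cs.execute();
            return cs.getInt(2);
        }
    }
}

The key point is that the execute and the whole resultset read happen on a connection no other thread can touch until it goes back to the pool.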
 