java.util.NoSuchElementException: Timeout waiting for idle object

 
Abhinandan Patil
Greenhorn
Posts: 8
Hello,

Greetings for the day!!

I'm getting a java.util.NoSuchElementException: Timeout waiting for idle object error in production when there is high load on the system. I'm calling JasperReports from Java to export reports using the Jasper libraries, and for this I'm passing a pooled connection from Java. As documented for the Jasper libraries, Jasper is not responsible for closing the connection when it finishes generating the report. Because of this, I explicitly close the connection in a finally block right after the call to Jasper.
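Roughly, the calling code looks like this (a simplified sketch; the class name, parameters and the specific JasperFillManager/JasperExportManager calls are just an example of the pattern, not my exact code):

import java.sql.Connection;
import java.util.Map;

import javax.sql.DataSource;

import net.sf.jasperreports.engine.JasperExportManager;
import net.sf.jasperreports.engine.JasperFillManager;
import net.sf.jasperreports.engine.JasperPrint;
import net.sf.jasperreports.engine.JasperReport;

public class ReportExporter {

    private final DataSource dataSource;      // pooled data source
    private final JasperReport compiledReport;

    public ReportExporter(DataSource dataSource, JasperReport compiledReport) {
        this.dataSource = dataSource;
        this.compiledReport = compiledReport;
    }

    public byte[] exportPdf(Map<String, Object> params) throws Exception {
        Connection connection = dataSource.getConnection();   // borrowed from the pool
        try {
            // Jasper uses the connection to run the report queries but does not close it.
            JasperPrint print = JasperFillManager.fillReport(compiledReport, params, connection);
            return JasperExportManager.exportReportToPdf(print);
        } finally {
            connection.close();   // returns the connection to the pool
        }
    }
}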

The connection pooling parameters I have configured are:


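(Roughly along these lines, using DBCP 1.x's BasicDataSource; apart from maxActive=50, the values and names below are placeholders rather than my exact production settings.)

import org.apache.commons.dbcp.BasicDataSource;

public class PoolConfig {

    // Only maxActive=50 reflects the real setting; everything else is a placeholder.
    public static BasicDataSource createDataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("oracle.jdbc.OracleDriver");   // placeholder driver
        ds.setUrl("jdbc:oracle:thin:@dbhost:1521:PROD");     // placeholder URL
        ds.setUsername("app_user");                          // placeholder credentials
        ds.setPassword("secret");
        ds.setMaxActive(50);     // the pool size discussed in this thread
        ds.setMaxIdle(10);       // placeholder
        ds.setMaxWait(30000);    // placeholder: give up after 30s waiting for a free connection
        return ds;
    }
}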
Can anyone suggest how to debug connection leak problems, or what JDBC pool parameter configuration should be used to deal with the system when it is serving its highest load? Let me know if any more information about the exception is required.

Thanks,
Abhinandan Patil.
 
Roel De Nijs
Sheriff
Posts: 11604
Are you sure it's a connection leak problem? Or is it because when your system is under high load, all 50 connections are used and the application needs another one?

Is the application itself responsible for getting connections from the pool, creating statements, executing queries, closing result sets, closing statements and so on? Or do you use a framework, like Spring, to do this dirty work?

I would start by monitoring the number of connections in the pool. If that number grows steadily, without a corresponding growth in the usage of your web application, you might have a connection leak. If you confirm that, you will have to investigate the leak.
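If your pool is a DBCP BasicDataSource (the "Timeout waiting for idle object" message comes from commons-pool, which DBCP uses), a very simple way to do that is to log the active and idle counts periodically. A sketch, where the interval and names are only an illustration:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import org.apache.commons.dbcp.BasicDataSource;

public class PoolMonitor {

    // Logs pool usage every minute; an active count that keeps growing while
    // traffic stays flat usually points to a connection leak.
    public static void start(final BasicDataSource ds) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(new Runnable() {
            public void run() {
                System.out.println("pool active=" + ds.getNumActive()
                        + " idle=" + ds.getNumIdle()
                        + " maxActive=" + ds.getMaxActive());
            }
        }, 0, 1, TimeUnit.MINUTES);
    }
}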

And/or you can change your data source settings to remove abandoned connections (and maybe log them as well). See Tomcat's JDBC Connection Pool documentation for the attributes removeAbandoned, removeAbandonedTimeout, and logAbandoned.
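In code, on a DBCP BasicDataSource, those settings would look something like this (the 60 second timeout is only an example value):

import org.apache.commons.dbcp.BasicDataSource;

public class AbandonedConnectionSettings {

    // Reclaim connections that stay borrowed for more than 60 seconds and
    // log the stack trace of the code that borrowed them.
    public static void enableAbandonedTracking(BasicDataSource ds) {
        ds.setRemoveAbandoned(true);        // reclaim abandoned connections
        ds.setRemoveAbandonedTimeout(60);   // seconds a connection may stay borrowed (example value)
        ds.setLogAbandoned(true);           // log where the connection was borrowed
    }
}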
 
Abhinandan Patil
Greenhorn
Posts: 8
Hi,

I'm a bit confused about whether, in my case, the "connection pool exhausted" problem is caused by connections not being closed and returned to the pool, or by the maximum number of active connections defined in the data source being too small to handle the current HIGH load on the system. As per my last comment, it is confirmed that when calling the third-party library (JasperReports) I pass the connection and close it in a finally block immediately after the call; closing of the result set and prepared statement is done inside the Jasper library. So I'm quite sure the "connection pool exhausted" issue cannot be caused by connections being closed incorrectly. I suspect that the maxActive parameter (here maxActive=50) may be allowing too few connections to handle the application load at peak time (report extraction is done around 1000 times in 2 hours, plus other application activities apart from report extraction).

Can you please explain the impact of increasing the maxActive connection parameter? Is 50 itself enough for production systems under high load? Or would you rather I analyse the abandoned connections and fix them?

Also, how feasible is it to enable the "logAbandoned" parameter in a production environment, since it may add overhead when creating connections or prepared statements?

Thanks,
Abhi

 
Roel De Nijs
Sheriff
Posts: 11604
  • Report post to moderator

Abhinandan Patil wrote: Can you please explain the impact of increasing the maxActive connection parameter? Is 50 itself enough for production systems under high load?


The default value for maxActive is 100. And that's what we use for our data sources without any problem.

Abhinandan Patil wrote: Or would you rather I analyse the abandoned connections and fix them?


You can't fix an issue if you don't know what the cause of the issue is. Currently you don't know whether you have abandoned connections (and thus a resource leak) or your system simply needs more than 50 active connections when it's heavily used. So if you want a permanent fix, you really have to find the root cause of this issue. Because if you really have a resource leak, then increasing maxActive won't solve the issue, it will only postpone its occurrence (and instead of having a NoSuchElementException each night, you'll have one every other night).

Abhinandan Patil wrote: Also, how feasible is it to enable the "logAbandoned" parameter in a production environment, since it may add overhead when creating connections or prepared statements?


It's of course better to (try to) simulate the issue in another environment (development, test, acceptance) and then change the data source settings in this environment to log the abandoned connections.
 
Abhinandan Patil
Greenhorn
Posts: 8
Hello Roel,

Thanks for your assistance. I will go ahead with simulating the issue using the logAbandoned property and verify the cause of the issue.

Thanks,
Abhi
 
Roel De Nijs
Sheriff
Posts: 11604

Abhinandan Patil wrote: I will go ahead with simulating the issue using the logAbandoned property and verify the cause of the issue.


Best of luck! And keep us posted with your findings.
 
Abhinandan Patil
Greenhorn
Posts: 8
Hi Roel,

I'm able to reproduce this issue in the development and QA environments. I set the maxActive count to the minimum value of 1 and tried to extract reports from my application. As expected, I received the "connection pool exhausted" exception on the second report-extraction request. This is what I had suspected most. I believe I'm closing the connections as soon as I'm done with the report execution, so I didn't think a connection leak would be the primary reason. But to be on the safe side, I have enabled the logAbandoned property of DBCP and I'm monitoring it for a few days.

Moving ahead to provide a solution to this issue, I'm planning to raise the maxActive count from 50 to 100 (double what we had earlier), because the load on the application during peak hours is very high: 600 branches (600*3 = 1800 end users) use the application to perform banking-card-related operations and reports. And from today's experiment, where the maxActive count was defined as 1, I logged into my application with two different users and tried extracting the same report from two different browser instances; the very first report request succeeded, while the second request received the connection pool exhausted error because the maxActive count had already been reached.

Please suggest whether you think my experience and observations are good enough to justify increasing the maxActive count.

Let's see if I get any connection leak traces in the Tomcat logs.

Important sites that I have gone through to analyse this error:

http://grokbase.com/t/tomcat/users/10b3140dn2/connection-leak

http://grokbase.com/t/tomcat/users/051r7fmqs2/connection-pool-leaking-how-to-detect-it

Thanks Roel for your timely assistance.
 
Roel De Nijs
Sheriff
Posts: 11604

Abhinandan Patil wrote: Please suggest whether you think my experience and observations are good enough to justify increasing the maxActive count.


I think using maxActive=1 is not the best situation to verify if you have resource leaks or not. Certainly not if different reports are requested concurrently. And maybe more than one connection is required to generate one report (I don't know, because I have never used Jasper reports).

But I would certainly decrease maxActive to a lower reasonable value (e.g. 5) and log abandoned connections for some days in development (and/or QA).

In the meantime you can inspect your code and look for places that create connections, statements and/or result sets. For every resource that is created, you should also invoke its close() method.
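For example, something along these lines (try-with-resources, if you are on Java 7 or later) guarantees that the connection, statement and result set are all closed even when an exception is thrown; the DAO, table and column names are made up:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import javax.sql.DataSource;

public class CustomerDao {

    // The connection, statement and result set are closed automatically at the
    // end of their try blocks, in reverse order, even if an exception occurs.
    public String findName(DataSource ds, long id) throws SQLException {
        String sql = "SELECT name FROM customer WHERE id = ?";
        try (Connection con = ds.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString("name") : null;
            }
        }
    }
}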
 