I am working on a performance issue reported in production, so not much debugging is possible. The application's performance degrades over time: the program runs for about 2 hours to process 10,000 records. The processing rate starts at about 200 records/minute, drops to 150 records/minute, and finally falls to about 25 records/minute. The processing is the same kind of work for every record. The overall time (2 hours) is acceptable at this point, but the concern is the gradual performance degradation.
What I am trying to find out is:
(1) Can this be due to a memory leak? Will the application slow down as memory usage grows? (A dumb question perhaps, but when I tried it with a small program that creates a lot of data, right up to the onset of OutOfMemoryError, I couldn't verify this: processing time was the same with and without huge data in memory.) By the way, I am using some collections that hold a large number of objects (roughly 10,000).
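The small program I tried was roughly along these lines (simplified, with made-up names and sizes):

    import java.util.ArrayList;
    import java.util.List;

    public class LeakDemo {
        // References kept here are never released, so the heap slowly fills.
        private static final List<byte[]> retained = new ArrayList<byte[]>();

        public static void main(String[] args) {
            long batchStart = System.nanoTime();
            for (int record = 1; record <= 100000; record++) {
                retained.add(new byte[10000]); // per-record state that is never freed
                if (record % 10000 == 0) {
                    long ms = (System.nanoTime() - batchStart) / 1000000;
                    System.out.println("batch ending at record " + record + " took " + ms + " ms");
                    batchStart = System.nanoTime();
                }
            }
        }
    }

With a generous heap the batch times stayed flat for me; I suspect the slowdown only shows up once the heap is nearly full, e.g. when the program is run with a small -Xmx.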
(2) I have seen some code using WeakReference-based caching. Will that be helpful here? I could verify that it makes the program more memory efficient: more objects were marked for garbage collection when I used WeakReference. (I verified this using a static counter that I increment and print in finalize().)
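The WeakReference caching I experimented with looks roughly like this (a sketch; the class name is made up):

    import java.lang.ref.WeakReference;
    import java.util.HashMap;
    import java.util.Map;

    public class WeakCache<K, V> {
        // Values are held only weakly, so the GC may reclaim them when memory is tight.
        private final Map<K, WeakReference<V>> cache = new HashMap<K, WeakReference<V>>();

        public void put(K key, V value) {
            cache.put(key, new WeakReference<V>(value));
        }

        public V get(K key) {
            WeakReference<V> ref = cache.get(key);
            V value = (ref == null) ? null : ref.get();
            if (value == null) {
                cache.remove(key); // the value was collected; drop the stale entry
            }
            return value; // null means a cache miss: reload from the source
        }
    }

(Note that java.util.WeakHashMap weakly references the keys, not the values, so it solves a different problem. For value caches, SoftReference is often the better fit, since soft references are cleared only when memory actually runs low.)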
Waiting for your expert opinion, thanks in advance!
Turn on the verbose GC options (such as -verbose:gc, -XX:+PrintGCDetails, and -XX:+PrintTenuringDistribution) and see how frequently garbage collection runs and what the pattern looks like. You might find the frequency increasing as time passes, which points to a memory leak: objects that are no longer needed but still occupy heap space. That will give you an initial idea.
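For example, with a Sun/Oracle HotSpot JVM the invocation would look something like the line below (the main class name is just a placeholder; in Java 9+ these flags were replaced by -Xlog:gc):

    java -verbose:gc -XX:+PrintGCDetails -XX:+PrintTenuringDistribution -Xloggc:gc.log com.example.BatchProcessor

If the log shows full collections becoming more frequent while reclaiming less memory each time, that is the classic leak signature.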
Yes, memory leaks can cause performance to drop, given that you have a limited amount of memory. If you want to test this, reduce the amount of memory you give the JVM at runtime and see how much sooner it slows down. If it does not slow down sooner (over time), then your problem is not a memory leak.
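For instance (heap sizes here are only examples), if the batch currently runs with

    java -Xmx512m com.example.BatchProcessor

rerun the same 10,000 records with

    java -Xmx128m com.example.BatchProcessor

If there is a leak, the drop from 200 to 25 records/minute should appear much earlier in the run; if the throughput curve looks about the same, the leak theory is out.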
This problem is resolved for now. The application was using TopLink for object-relational mapping, and for one particular business object TopLink caching was disabled. I changed the TopLink setup to enable caching, and the application is performing better now. (Before the change, the records processed per minute dropped from 250 to 30; after the fix it drops only from 250 to 150.) For now, this is acceptable to the client. If they come back with this issue later, I may have to look into memory issues.
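In our case the cache was switched on through the TopLink configuration, but programmatically the equivalent is a descriptor amendment roughly along these lines (shown against the ClassDescriptor API of EclipseLink, TopLink's open-source successor; treat it as a sketch and check the method names against your TopLink version):

    import org.eclipse.persistence.descriptors.ClassDescriptor;

    public class OrderDescriptorAmendment { // "Order" stands in for the business object
        public static void amend(ClassDescriptor descriptor) {
            descriptor.useSoftCacheWeakIdentityMap(); // enable the identity-map cache
            descriptor.setIdentityMapSize(500);       // keep up to ~500 instances cached
        }
    }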
I guess I will be reopening this post if the issue returns. Thanks all for your suggestions!