I am not sure how to assess the memory heap requirements of a Java program (a Spring & Hibernate Tomcat Web Application as it happens).
I've tried measuring the memory usage at the point of most usage (some batch processing) but the results are fairly meaningless.
I used Runtime.totalMemory() - Runtime.freeMemory() just after a garbage collect.
Given that a gc is only a suggestion and may not have completed by the time I take the measurement, I guess the noisy results are hardly surprising.
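For what it's worth, a slightly more stable version of that measurement samples repeatedly until the reading stops shrinking, rather than trusting a single System.gc() call. This is only a sketch of the idea; the retry count and sleep are arbitrary, and the result is still best-effort:

```java
// Best-effort heap measurement: System.gc() is only a hint, so sample
// several times and keep the smallest used-heap figure we see.
public class HeapProbe {

    /** Returns an approximate "used heap" in bytes after encouraging a collection. */
    public static long usedHeapAfterGc() {
        Runtime rt = Runtime.getRuntime();
        long best = rt.totalMemory() - rt.freeMemory();
        for (int i = 0; i < 5; i++) {
            System.gc();
            try {
                Thread.sleep(100); // give the collector a moment to actually run
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
            long used = rt.totalMemory() - rt.freeMemory();
            if (used >= best) {
                break; // no further shrinkage, keep the best reading
            }
            best = used;
        }
        return best;
    }

    public static void main(String[] args) {
        System.out.println("Used heap ~ " + usedHeapAfterGc() + " bytes");
    }
}
```

Even this only tells you live-object footprint at one instant, not the headroom the collector needs to run efficiently.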
Watching the memory usage in a profiler / monitoring tool is fairly pointless too, since Java will continue to use memory until a garbage collect - the apparent memory usage is a function of the max heap size. The more I give it the more it uses.
I'm left wondering if the only recourse is to choose a very small heap size and see if the software actually runs, slowly increasing or decreasing the maximum until I get a result?
The other alternative appears to be a memory dump, but I'm not convinced I will be able to time this exactly or understand the results (I'd probably use MAT).
I'm developing in Eclipse if that makes a difference.
I would certainly go for a load tester and try realistic loads with various memory settings.
You can only get so far with the measurements you have tried already.
Personally, I have used the HttpClient toolkit to create fairly realistic web client emulation.
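To show the shape of what I mean by web client emulation, here is a stripped-down sketch using only JDK classes (HttpURLConnection standing in for the HttpClient toolkit, and the JDK's built-in HttpServer as a throwaway target so the example is self-contained). The URL, user count, and request count are all made-up illustration values:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Minimal load-emulation sketch: N concurrent "users" each issue a series
// of GETs, and we count the successful (200) responses.
public class MiniLoadTest {

    /** Issue one GET, drain the body, and return the HTTP status code. */
    static int hit(String url) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setConnectTimeout(2000);
        conn.setReadTimeout(2000);
        int code = conn.getResponseCode();
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[4096];
            while (in.read(buf) != -1) { /* drain so the connection can be reused */ }
        }
        return code;
    }

    /** Fire users * requestsPerUser GETs concurrently; return the 200 count. */
    public static int run(String url, int users, int requestsPerUser) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(users);
        List<Future<Integer>> results = new ArrayList<>();
        for (int u = 0; u < users; u++) {
            results.add(pool.submit(() -> {
                int ok = 0;
                for (int r = 0; r < requestsPerUser; r++) {
                    if (hit(url) == 200) ok++;
                }
                return ok;
            }));
        }
        int ok = 0;
        for (Future<Integer> f : results) ok += f.get();
        pool.shutdown();
        return ok;
    }

    public static void main(String[] args) throws Exception {
        // Stand up a throwaway local endpoint so the sketch runs anywhere.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/ping", ex -> {
            byte[] body = "pong".getBytes();
            ex.sendResponseHeaders(200, body.length);
            ex.getResponseBody().write(body);
            ex.close();
        });
        server.start();
        String url = "http://localhost:" + server.getAddress().getPort() + "/ping";
        System.out.println("2xx responses: " + run(url, 5, 10));
        server.stop(0);
    }
}
```

A real test would add think-time delays between requests and a realistic mix of URLs, which is where a full toolkit like HttpClient (or a dedicated load tester) earns its keep.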
Thanks for the recommendation. I was wondering what to use for load testing and that sounds like a good solution.
I was wondering whether there was some magical measurement, e.g. gc thread time, used heap as a ratio of max heap, etc that was considered a 'warning level' that indicates that the application is hitting the limits?
My worry is that if I just increase the max heap size, Java will simply grow to use it each time, waiting until the last minute to gc?
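Both of the candidate measurements mentioned above (gc time and used/max heap ratio) are at least directly readable from the JVM via java.lang.management, so they can be logged and watched for trends. A sketch, with the 0.8 threshold being purely illustrative rather than any official warning level:

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

// Reads the two "warning level" candidates from the post: used heap as a
// fraction of max, and cumulative time spent in garbage collection.
public class HeapWatch {

    /** Fraction of the maximum heap currently in use (0.0 - 1.0). */
    public static double heapRatio() {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        return (double) heap.getUsed() / heap.getMax();
    }

    /** Total milliseconds spent in all collectors since JVM start. */
    public static long gcMillis() {
        long total = 0;
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            total += Math.max(0, gc.getCollectionTime()); // -1 means unsupported
        }
        return total;
    }

    public static void main(String[] args) {
        double ratio = heapRatio();
        System.out.printf("heap used/max = %.2f, gc time = %d ms%n", ratio, gcMillis());
        if (ratio > 0.8) { // illustrative threshold only, not a JVM standard
            System.out.println("WARNING: heap is over 80% of -Xmx");
        }
    }
}
```

In practice it is the trend under load that matters: a used/max ratio that stays high right after collections, or gc time growing faster than throughput, is the usual sign the heap is too small.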
It all seems so unscientific. I need to work out how many users and customers I can fit on a server. To work out a base cost. Much of the approach seems like waiting (or at best simulating) until it falls over.
This gets us into a very interesting problem in Systems Analysis - since requests arrive with some sort of random distribution and have some sort of distribution of required system resources, there is no direct analytical expression for determining (for instance) an optimum memory setting. Perhaps the best we can hope for is a system that gradually degrades instead of falling over.
(historical note - sorry, I just can't help it) The earliest computer language to introduce object oriented programming was Simula - created to simulate complex processes and systems analysis. (/historical note)
Simulating real loads gets us into many cool topics such as Markov chains.