There are several recommendations that I know of. HP suggests setting the start heap to the same size as the max heap; another HP recommendation is to set the initial heap to about the size needed for the maximum number of live objects, with the max heap 3-4 times that. Sun suggests that the start heap should be half the max heap; elsewhere they've suggested that the start heap should be between 1/10 and 1/4 of the max heap. For 1.1.x they recommended setting the start and max heaps to different values.
IBM suggests that, as a starting point for a server based on a single JVM, you consider setting the maximum heap size to 1/4 of the total physical memory on the server, and the start heap to 1/2 of the maximum heap. IBM points out that, in general, increasing the size of the Java heap improves throughput up to the point where the heap no longer fits in physical memory. Once the heap begins swapping to disk, Java performance suffers drastically. So the max heap setting should be small enough that the heap stays within physical memory. Also, large heaps can take several seconds to fill up, so garbage collection occurs less frequently, but the pause times due to GC increase.
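To make these concrete, here is roughly how the IBM rule might translate into startup flags on a server with, say, 2GB of physical memory (the numbers and the application class are purely illustrative, and the exact flag spellings vary between vendors and versions, e.g. older VMs take -ms/-mx rather than -Xms/-Xmx):

  java -Xms256m -Xmx512m MyServerApp

That is a max heap of 1/4 of physical memory (512MB) and a start heap of half that (256MB). The HP suggestion would simply set both flags to the same value, and the Sun half-of-max suggestion gives the same shape of line for whatever max heap you settle on.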
I don't really believe that anyone has general peer-reviewed tests to back up these suggestions.
There are some rationales. Assuming you've worked out or tuned the JVM so that you know what the max heap should be, then growing the JVM memory can be considered pure overhead: it requires multiple system calls and results in more segmented system memory allocation. If you figure that you are going to reach the max heap anyway, there is a good argument for simply starting out at the max heap (the HP suggestion), avoiding the growth overhead and getting memory which is less segmented. However, this can mean that when garbage collection kicks in it results in longer pauses, so the system load might not be smoothed out as much as you'd want. A generational garbage collector will not necessarily suffer from this problem, though, as it specifically tries to smooth out the GC load, and can do so.
An alternative view is that there is this lovely garbage collection system in the JVM which will grow the heap to just as big as it needs to be and no more, so why not let it do its job. This way, although there is some overhead in growing the JVM, you end up using the minimum resources and the GC gets to do what it does best, i.e. handle and maintain memory. With this argument, you set the start heap to 1MB and the max as high as is reasonable.
A combination of these two rationales might lead you to the Sun recommendations: e.g. assuming that the max heap is an overestimate of the ultimate JVM size, then half of max is probably a good starting point, minimizing the memory allocation and memory segmentation overheads while still giving the GC room to optimize memory usage.
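If you want to watch this behaviour for yourself, the following minimal sketch (the class name is mine, not from any vendor documentation) allocates memory in 1MB steps and prints the heap size reported by the Runtime, so you can see whether and when the JVM grows the heap. Run it once with the start heap equal to the max heap and once with a tiny start heap, and compare:

public class HeapGrowthWatcher {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // Hold references so the blocks stay live and the heap is forced to grow.
        // Needs a max heap of at least ~50MB, otherwise it throws OutOfMemoryError.
        byte[][] chunks = new byte[50][];
        for (int i = 0; i < chunks.length; i++) {
            chunks[i] = new byte[1024 * 1024];       // allocate 1MB per iteration
            long total = rt.totalMemory();           // heap currently reserved by the JVM
            long used = total - rt.freeMemory();     // rough figure for occupied heap
            System.out.println("allocated " + (i + 1) + "MB, heap = "
                + (total / (1024 * 1024)) + "MB, used = "
                + (used / (1024 * 1024)) + "MB");
        }
    }
}

With the start heap equal to the max heap the reported heap size should stay constant; with a small start heap you should see it step upwards as the JVM grows the heap on demand.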
All vendors recommend using verbosegc to help determine the optimum heap sizes, i.e. the sizes that minimize overall GC overhead. So all the recommendations should really be seen as suggested starting points.
Take your pick. Or come up with other rationales. Or test your system. For testing, remember that the goal is to minimize the GC overhead, so you need to measure the total amount of time that GC takes (verbosegc is probably the only way, and it produces non-standardized output) and play with the min/max heap sizes to minimize the total GC time.
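For example (again, the application class is just a placeholder, and the flag spelling differs between VMs, e.g. -verbose:gc on Sun VMs and -verbosegc on some others):

  java -verbose:gc -Xms64m -Xmx256m MyServerApp

Sum the GC times reported in the output over a representative run, adjust the -Xms/-Xmx values, and repeat until the total GC time stops improving.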
--Jack Shirazi
http://www.JavaPerformanceTuning.com/