I have a standalone Java program running with default settings. I have looked up the documentation, and it says that the default max heap size is 64MB for the 1.6 JVM. I am running in a 64-bit environment, so the documentation says 30% will be added on top of that, which comes to around 83MB.
Sure enough, when I print Runtime.maxMemory() in my program, I get the number 85196800.
The other thing I observe is the "used" memory, which I compute as Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory().
I noticed that whenever that "used" number approaches 85196800, it drops back down to the 6000000 range. Again, this makes perfect sense as I understand it, because the GC will try to clean up when usage is near the max heap size. So far so good, but correct me if I am wrong.
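The measurement described above can be sketched as a small standalone program (the class name is just an illustration):

```java
// Minimal sketch: compute "used" heap from Runtime, as described above,
// and compare it against the heap ceiling reported by maxMemory().
public class HeapUsage {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();                        // heap ceiling (-Xmx or the default)
        long used = rt.totalMemory() - rt.freeMemory();   // currently occupied heap
        System.out.println("max  = " + max + " bytes");
        System.out.println("used = " + used + " bytes");
        // "used" never exceeds "max"; the GC kicks in as usage approaches it
    }
}
```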
Now, the problem is this: if I run top on my Unix server, I notice that over time the Java process grows in the VIRT field. This field is supposed to represent the amount of virtual memory. It just keeps growing, and right now it is at 1450MB. Why is this? I thought the whole point of the max heap size was that the JVM would not grow past that amount of memory usage? Needless to say, this eats into my swap space and ultimately brought my server to its knees (the machine only has 256MB of RAM).
Another side point is that the RES field is always at 160MB. I don't quite understand how 160MB is still bigger than the 83MB; I'm guessing the other 77MB is for things that live outside of the heap.
Can anyone explain what is really going on here? And how do I limit this JVM process so that it will not eat up all the resources on this server and bring it to a halt?
Joined: Apr 27, 2003
Can you try running your Java application while explicitly stating the max heap size:
java -Xms128m -Xmx128m YourAppClass
I have seen default max heap sizes on Linux be different from those on Windows.
Joined: Apr 24, 2012
I am having a similar problem to the OP's.
For example, a JVM is set at -Xms200m -Xmx400m. When I check the virtual size allocated, it shows 650MB. What is the point of a max heap of 400MB if the process goes above it? Am I misunderstanding what "virtual" includes on top of the heap?
Accounting for used memory is actually not always easy. The virtual memory size reported by the OS contains not only the heap allocated by the program, but also the memory used to store the program itself (the CPU instructions), the program's stack(s), areas of memory-mapped files (if the application uses them), and perhaps some other items depending on your OS. Applications can also share some portions of memory, complicating things further. For example, if you run the same executable file twice, its code in memory can be shared by the two instances, which saves memory, but might be reported by the OS as being used by both of them; that means the sum of memory reported as used by the two processes is inaccurate.
In the case of Java, the -Xmx parameter is observed. However, Java actually allocates two memory areas, not one (the other one is used to store class data, that is, the bytecode of your classes). On top of that, some memory is used for stacks, the JVM executable code itself has to be loaded, and the code compiled by the JIT into native instructions has to be stored somewhere too. So by using -Xmx400m you'll limit the amount of memory available to the application for creating objects to 400 MB, but the application as a whole can take more. You'd probably have to experiment a bit if you want to make sure that the application won't take more than 400 MB in total, and you actually might not be able to enforce such a limit in a reliable way.
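One way to see the distinction is to print what the JVM itself thinks its heap ceiling is, and compare that against top for the same process. This is just a sketch; the class name is made up, and the exact value of maxMemory() can differ slightly from the -Xmx you pass:

```java
// Hypothetical check: run as `java -Xmx400m MaxHeapCheck` and compare the
// number printed here with the VIRT/RES columns top shows for this process.
public class MaxHeapCheck {
    public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("JVM heap ceiling: ~" + maxMb + " MB");
        // top will still report a larger process footprint: thread stacks,
        // class data, JIT-compiled code and JVM internals all live outside
        // the Java heap that -Xmx limits.
    }
}
```

The point is that -Xmx only bounds the object heap; the gap between this number and what the OS reports is the native overhead described above.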