I have to analyse a large heap dump file (3.6 GB) from a production environment. However, when I open it in Eclipse MAT, it throws an OutOfMemoryError. I tried increasing the Eclipse workbench Java heap size as well, but it doesn't help. I also tried VisualVM. Can we split the heap dump file into smaller pieces? Or is there any way to set a maximum heap dump file size in the JVM options, so that we collect heap dumps of a reasonable size?
No, you need to give MAT roughly as much memory as the size of the file. That's just a guideline; MAT may not need quite that much, but it's good to have.
This is the reason I don't use the MAT plugin for Eclipse; I prefer the standalone version instead. I want my Eclipse to behave well alongside the other apps on my server, so I have set my Eclipse memory to take a good bit of memory, just not all of it. I then change my MAT memory settings to match the heap dump I want to analyze.
Thanks for the reply. But I cannot ask my manager or my IT team to increase memory every time we get a large heap dump. Any other suggestions/solutions?
Jayesh A Lalwani wrote: If you are using the standalone version of MAT, you just change MemoryAnalyzer.ini and put in the -Xmx option.
Jayesh is right. You only need to tweak the .ini file.
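For concreteness, the tweak is to the -Xmx line in MemoryAnalyzer.ini, which sits next to the MemoryAnalyzer executable. A minimal sketch of the relevant part of the file is below; keep the launcher lines above -vmargs exactly as your MAT version ships them, and only adjust the -Xmx value (5g here is just an assumed figure for a 3.6 GB dump, pick a value that fits your machine):

```
-vmargs
-Xms512m
-Xmx5g
```

Everything after -vmargs is passed straight to the JVM that runs MAT, so this raises only MAT's heap without touching your Eclipse workbench settings.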
You can also try out IBM HeapAnalyzer... it provides support for IBM, Solaris, and HP-UX Java 6.0 ASCII/binary heap dumps... I'm not sure about other types of dump files.