
Tuning 64bit JVM (Windows)

 
Rob Poulos
Ranch Hand
Posts: 49
We have a third-party Java job-scheduling application that used to run in a 32-bit Windows environment, but with an increase in users and jobs its performance has taken a severe hit. This past weekend we migrated the scheduling application onto a 64-bit Windows 2003 box, and we are looking to squeeze every ounce of performance out of it.

The specs I have are below:

Java 64-bit 1.6.0_17
Windows Server 2003 64-bit (SP2)
Intel Xeon CPU
8 GB RAM

This server ONLY runs this application so all resources are really dedicated to it.

I did find this thread, which was rather helpful, but I still wanted input from you guys: http://onjava.com/onjava/2006/11/01/scaling-enterprise-java-on-64-bit-multi-core.html

How would you define the memory settings?
 
Peter Johnson
author
Bartender
Posts: 5852
What monitoring have you done? Have you monitored the heap and garbage collection? What does the app do? Does it access databases, and if so, have you monitored the database? You cannot tune what you have not measured.

I gave a presentation on monitoring GC a while back: http://www.cecmg.de/doc/tagung_2007/agenda07/24-mai/2b3-peter-johnson/index.html
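As a first measuring step on a HotSpot 1.6 JVM, GC logging can be switched on at launch. A minimal sketch follows; scheduler.jar is a placeholder for however the third-party application is actually started:

```shell
# Hypothetical launch command: turn on GC logging in HotSpot 1.6.
# scheduler.jar stands in for the real application launcher.
java -verbose:gc \
     -XX:+PrintGCDetails \
     -XX:+PrintGCTimeStamps \
     -Xloggc:gc.log \
     -jar scheduler.jar
```

The resulting gc.log shows pause times and heap occupancy per collection, which is the data you need before touching any tuning flags.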

By the way, our performance testing has shown that the only benefit of using a 64-bit JVM over a 32-bit JVM is that you can create a larger heap.
 
Rob Poulos
Ranch Hand
Posts: 49
Peter,

First off, thank you for the prompt response. Unfortunately we don't have any Java gurus on hand, so our level of monitoring is extremely primitive. I am a bit out of the loop as to what exactly is happening, but I can pretty much assure you we are not monitoring garbage collection performance; I don't even think we would know where to begin or what to look for. I guess I was more or less looking for rule-of-thumb settings to work from.

For example: given 8 GB of physical memory, you should never allocate more than 50% (4 GB) of it to the JVM.
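That rule of thumb is easy to sanity-check with a bit of arithmetic (a sketch only; the 8 GB and 50% figures are just the numbers from this thread, not a universal limit):

```shell
# Rule-of-thumb sizing: cap the JVM heap at half of physical memory.
# 8192 MB is the physical RAM on the box described in this thread.
PHYS_MB=8192
HEAP_MB=$(( PHYS_MB * 50 / 100 ))
echo "heap cap: ${HEAP_MB} MB"
```

That 4096 MB cap would translate into -Xmx4096m (or -Xmx4g) on the java command line.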

The application generates heavy database and network traffic, but the DBAs and network guys have confirmed that we are not maxing out database or network capacity, so I am confident the bottleneck isn't there.

Our server team analyzed performance on the old box, confirmed that CPU and memory utilization were essentially pegged the majority of the time, and suggested upgrading to a faster CPU, more memory, and a 64-bit OS. We have definitely noticed a difference in performance; however, we are running on the default JVM settings, and we believe that by allocating a bit more memory or setting options such as -XX:ThreadStackSize=256k, -server, or -XX:+UseConcMarkSweepGC we can increase performance even more.
 
Rob Poulos
Ranch Hand
Posts: 49
Something? Anything?

I'm trying to get my boss to try

-server -XX:ThreadStackSize=256k -XX:+UseConcMarkSweepGC -Xms4g

as a starting point.
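One caution on that line: -Xms sets only the initial heap, so it is usually paired with -Xmx, and the per-thread stack size is more commonly given with -Xss. A possible variant is sketched below; it is still just a starting point, scheduler.jar is a placeholder, and the exact flag syntax should be verified against your JVM:

```shell
# Hypothetical starting point: fix the heap at 4 GB (initial = max) so it
# never resizes, use a 256 KB thread stack, and the CMS collector.
java -server -Xms4g -Xmx4g -Xss256k -XX:+UseConcMarkSweepGC -jar scheduler.jar
```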
 
Pat Farrell
Rancher
Posts: 4660
Rob Poulos wrote: I guess I was more or less looking for rule-of-thumb settings to work from.
For example: given 8 GB of physical memory, you should never allocate more than 50% (4 GB) of it to the JVM.


Er, that basic question was answered in 1969 when Denning wrote up Working Sets. The answer is that computers are better at managing memory than programmers, so all the theory of overlays and other manual memory management went away.

His second observation is that memory-constrained computers work better with more real memory than with more virtual memory.

What do you think is going to use the other half of the physical memory?

If the machine is running lots of services (say Apache, DNS, Postfix), then it may make some sense. But it seems to me that heuristics are the way to go: start with 100% of memory allocated to the JVM, back it down by 20%, and repeat the tests, measuring performance, throughput, or response time as you wish.

Find the knee of the curve, and go back a bit bigger.

If needed, buy more memory. Keep buying more memory (it's dirt cheap) until you peg the CPU. Then buy more CPU.
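The back-off heuristic above can be sketched numerically: starting from the full 8192 MB and shaving 20% each round gives the candidate heap sizes to test (integer arithmetic, so the values round down):

```shell
# Generate candidate heap sizes: start at 100% of RAM, cut by 20% per run.
MB=8192
SIZES=""
for RUN in 1 2 3 4 5; do
  SIZES="$SIZES $MB"
  MB=$(( MB * 80 / 100 ))
done
echo "candidate heaps (MB):$SIZES"
```

Run the load test at each size, plot throughput or response time against heap size, and look for the knee Pat describes.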


 
Rob Poulos
Ranch Hand
Posts: 49
Thank you for the response. I'm going to make some suggestions; hopefully they'll listen.
 
Peter Johnson
author
Bartender
Posts: 5852
Rob Poulos wrote: I guess I was more or less looking for rule-of-thumb settings to work from.

My general rule of thumb is to start with a 1.2 GB heap and a 300 MB young generation, monitor the GC performance, and adjust from there. If you see large pause times for major collections and response times are critical, switch to the CMS collector. I give a half-day seminar on Java GC monitoring and tuning at the CMG conference, and even have a chapter on that topic in JBoss in Action; trying to condense all of that knowledge into a forum post is impossible.
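Expressed as HotSpot options, that starting point might look like the line below. This is a sketch only: scheduler.jar stands in for the real launcher, and the CMS flag is the optional switch for the pause-sensitive case mentioned above:

```shell
# ~1.2 GB heap with a 300 MB young generation, per the rule of thumb above.
# Add -XX:+UseConcMarkSweepGC only if major-collection pauses hurt response times.
java -Xms1200m -Xmx1200m \
     -XX:NewSize=300m -XX:MaxNewSize=300m \
     -jar scheduler.jar
```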
 