I decided to make a simple arcade game. In the main class I have a "while(true)" loop, and inside this loop it redraws the screen, listens for keystrokes, etc. I would assume this is fairly typical. I just want to know why, when I run the game on one machine, it uses almost 100% of the CPU, while on another machine it only uses 50%. The only differences I can think of are that one is an AMD and the other is a Pentium; both are running Windows XP Professional, and both are running Java 1.5. I haven't put anything in to regulate the FPS, so I'm assuming it uses as much of the CPU as it's allowed to. On the first machine I do want to slow it down a bit, but on the second machine... I think I'd rather make it go a little faster. I'm guessing that to slow it down I just need a timer and to only redraw the screen after X amount of time; as for speeding it up, I have no clue where to start.
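The "timer" idea is roughly how frame capping is usually done: measure how long the frame's work took, then sleep off the remainder of the frame budget. A minimal sketch (class and method names like `GameLoop`, `run`, and `TARGET_FPS` are just illustrative, not from the original post):

```java
public class GameLoop {
    // Assumed target frame rate; 60 is a common choice.
    static final int TARGET_FPS = 60;
    static final long FRAME_NANOS = 1000000000L / TARGET_FPS;

    /** Runs the capped loop for a fixed number of frames; returns elapsed wall-clock seconds. */
    static double run(int frames) throws InterruptedException {
        long start = System.nanoTime();
        for (int i = 0; i < frames; i++) {
            long frameStart = System.nanoTime();

            // update();  // game logic would go here
            // repaint(); // redrawing the screen would go here

            // Sleep off whatever is left of this frame's time budget,
            // instead of spinning straight into the next iteration.
            long elapsed = System.nanoTime() - frameStart;
            long sleepMillis = (FRAME_NANOS - elapsed) / 1000000L;
            if (sleepMillis > 0) {
                Thread.sleep(sleepMillis);
            }
        }
        return (System.nanoTime() - start) / 1e9;
    }

    public static void main(String[] args) throws InterruptedException {
        // 30 frames at 60 FPS should take roughly half a second.
        System.out.printf("30 frames took %.2f s%n", run(30));
    }
}
```

A real game would use `while(true)` instead of a fixed frame count; the frame count here is just so the sketch terminates. Note that `Thread.sleep` granularity on Windows XP can be coarse (10-15 ms), so the cap is approximate.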
You'd better give the main thread of the game a way to sleep for a bit each loop. But I have no idea how you could control the CPU usage directly... By the way, if you figure out a way, please let us know, ok?
I figured making the thread sleep would be the best way to regulate the speed of the game.
I just don't understand (and I guess this isn't obvious, and is probably platform/JVM-specific) why on one machine the JVM would take full control of the CPU and on another machine be throttled to only 50% of it, especially when they're both on the same OS and, AFAIK, the same JRE (1.5.06).
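To see concretely why sleeping helps, you can compare the actual CPU time a thread burns while spinning versus while sleeping using `ThreadMXBean` from `java.lang.management` (available since Java 5). A quick sketch; the class name `CpuDemo` and the 200 ms window are just made up for the demo:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class CpuDemo {
    /** Returns { cpuMillisWhileSpinning, cpuMillisWhileSleeping } over ~200 ms each. */
    public static long[] measure() throws InterruptedException {
        ThreadMXBean bean = ManagementFactory.getThreadMXBean();

        // Busy-wait for ~200 ms -- this is what a while(true) loop
        // with no sleep does: it consumes CPU the whole time.
        long before = bean.getCurrentThreadCpuTime();
        long end = System.nanoTime() + 200000000L;
        while (System.nanoTime() < end) {
            // spin
        }
        long busyMs = (bean.getCurrentThreadCpuTime() - before) / 1000000L;

        // Sleep for the same 200 ms -- the OS scheduler runs other
        // things meanwhile, so this thread burns almost no CPU time.
        before = bean.getCurrentThreadCpuTime();
        Thread.sleep(200);
        long sleepMs = (bean.getCurrentThreadCpuTime() - before) / 1000000L;

        return new long[] { busyMs, sleepMs };
    }

    public static void main(String[] args) throws InterruptedException {
        long[] ms = measure();
        System.out.println("busy-wait used " + ms[0] + " ms of CPU; sleeping used " + ms[1] + " ms");
    }
}
```

On a single-core box the busy-wait shows up as ~100% CPU in Task Manager; with sleeps in the loop the usage drops to nearly nothing.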
No, one's an Intel chip (P4 2.6 GHz) and the other is an AMD (XP 2900). On the Intel chip I've never seen any Java application use more than 50% of the processor (it's like there's a limit), but not so on the AMD machine. Both are running Windows XP SP2. Just seems odd to me.