How to force Java code to use dedicated GPU?

 
Gert Keldern
Greenhorn
Posts: 5
My Java code runs at 3200 frames per second on my PC. My laptop, however (which has very similar hardware except that it also has an integrated GPU), can only run the same code at a maximum of 100 FPS, with no dedicated GPU usage whatsoever.
How can I force my Java game to use the dedicated GPU?

Things that I tried:

--Setting the dedicated GPU as the default in the NVIDIA Control Panel

--Updating the drivers

--Uninstalling the integrated GPU
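
For reference, here is a minimal diagnostic sketch (just an illustration, not part of the game) that prints which display devices Java reports on a machine:

import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;

public class GpuCheck {
    public static void main(String[] args) {
        // Print every screen device the local graphics environment knows about,
        // to see which adapter Java picked up.
        GraphicsEnvironment env = GraphicsEnvironment.getLocalGraphicsEnvironment();
        for (GraphicsDevice device : env.getScreenDevices()) {
            System.out.println("Device: " + device.getIDstring()
                    + ", refresh rate: " + device.getDisplayMode().getRefreshRate() + " Hz");
        }
    }
}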
 
Stephan van Hulst
Saloon Keeper
Posts: 13366
Welcome to CodeRanch!

You're not giving us much to work with. What kind of application is it? What rendering library are you using?
 
Gert Keldern
Greenhorn
Posts: 5

Stephan van Hulst wrote:Welcome to CodeRanch!

You're not giving us much to work with. What kind of application is it? What rendering library are you using?



Thanks for the reply. It's a 2D Java game. It's pretty simple in terms of rendering; it uses a Graphics g object to render.
However, I have made some progress since last time: I've enabled hardware acceleration in the code, and now I get 300 FPS instead of 100, and it actually uses the dedicated GPU as well.
Still only about a tenth of the PC's FPS, but it's a step ahead.
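
The render loop is roughly along these lines (a simplified sketch of the approach, not the actual game code): a Canvas with a double-buffered BufferStrategy, drawing through the Graphics it hands back.

import java.awt.Canvas;
import java.awt.Color;
import java.awt.Graphics;
import java.awt.image.BufferStrategy;
import javax.swing.JFrame;

public class GameCanvas extends Canvas {

    public static void main(String[] args) {
        JFrame frame = new JFrame("Game");
        GameCanvas canvas = new GameCanvas();
        frame.add(canvas);
        frame.setSize(800, 600);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);

        // Two buffers: Java2D uses page flipping here when the surface is accelerated.
        canvas.createBufferStrategy(2);
        BufferStrategy strategy = canvas.getBufferStrategy();

        while (true) {
            Graphics g = strategy.getDrawGraphics();
            g.setColor(Color.BLACK);
            g.fillRect(0, 0, canvas.getWidth(), canvas.getHeight());
            // ... draw the game with g here ...
            g.dispose();
            strategy.show(); // make the freshly drawn back buffer visible
        }
    }
}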
 
Campbell Ritchie
Marshal
Posts: 74341
Welcome to the Ranch again

Why would you need 300fps when most screens only support 60fps? How did you enable that hardware acceleration?
 
Gert Keldern
Greenhorn
Posts: 5

Campbell Ritchie wrote:Welcome to the Ranch again

Why would you need 300fps when most screens only support 60fps? How did you enable that hardware acceleration?



Thank you for your reply.

It's 300 FPS on a powerful laptop and 3200 FPS on a similarly powerful PC, so I want the game to reach a similar FPS as well.
The game's engine doesn't allow the FPS to be modified, meaning it has to stay at a constant level like 60 or 144, so I cannot afford any FPS drops.

I have added this line to my Launcher class: System.setProperty("sun.java2d.opengl", "true");
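
Roughly like this; the property has to be set before any window is created, because Java2D reads it when the rendering pipeline is initialised (Game here just stands in for the real entry point):

public class Launcher {
    public static void main(String[] args) {
        // Set before any AWT/Swing component is created, otherwise the pipeline
        // is already initialised and the property is ignored.
        // The command-line equivalent is -Dsun.java2d.opengl=true.
        System.setProperty("sun.java2d.opengl", "true");

        new Game().start(); // stands in for whatever actually builds the window and starts the loop
    }
}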
 
Tim Moores
Saloon Keeper
Posts: 7162
Interesting. I know little about gaming, but I thought beyond 60fps or so the human eye wasn't able to discern changes that quickly.
 
Campbell Ritchie
Marshal
Posts: 74341

Tim Moores wrote:. . . I thought beyond 60fps or so the human eye wasn't able to discern changes that quickly.

It's more like 25‑30fps.
 
Stephan van Hulst
Saloon Keeper
Posts: 13366
I don't see why a 2D game would give even an older system any trouble when you use sensible programming techniques.

Let me put it this way: if your game runs fine on an integrated GPU, you obviously don't have to worry about getting it to run on a graphics card. If it only runs well using a graphics card, what about players that don't have one?
 
Tim Holloway
Saloon Keeper
Posts: 24499

Campbell Ritchie wrote:

Tim Moores wrote:. . . I thought beyond 60fps or so the human eye wasn't able to discern changes that quickly.

It's more like 25‑30fps.



The standard motion picture film frame rate is 24FPS. US (NTSC) TV gave an effective 60 fields per second by splitting each roughly 30FPS frame into two fields drawn on alternate scan lines. I think PAL was interlaced as well, just at a different resolution and rate.

If you moved your head suddenly, you would possibly see artefacts at 60FPS interlaced. These days, however, computer monitors generally run 60FPS non-interlaced and I'm not sure about artefacts, but I don't see any on still images at least. The old phosphor CRTs had to refresh frequently, since in order to display motion tolerably, the phosphors had to have a fairly fast fade rate. But not too fast, or you'd get a strobe-light effect.

LEDs themselves have no fade to speak of, although I think that the ones used for video screens may have a phosphor secondary, both to ensure the desired LED color and to avoid strobing as the LED matrix is scanned. Never gave it much thought.

So 100FPS, even with a very short-fade phosphor, is exceptional quality, and anything over that is overkill, I think.

In short, Campbell is mostly correct, except for the fact that most of us don't watch video Clockwork-Orange style and therefore head and eye movements can distort the image. Still, head and eye movement distort everything, so some restraint is in order.
 
Tim Holloway
Saloon Keeper
Posts: 24499
Oh, and in answer to the original question: Java always uses the video card to render graphics. Even when you have a headless server sitting in a rack in a dark data center, if the webapps create graphs and/or formatted text, the Java graphics routines still use whatever graphics card the server has installed. Most PC/server machines won't even boot unless there's at least a minimal graphics controller, either integrated or in a slot.
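
If you're curious whether Java2D considers a given surface accelerated on a particular machine, a quick sketch along these lines will tell you (illustrative only):

import java.awt.GraphicsConfiguration;
import java.awt.GraphicsEnvironment;
import java.awt.ImageCapabilities;
import java.awt.image.VolatileImage;

public class AccelerationCheck {
    public static void main(String[] args) {
        GraphicsConfiguration gc = GraphicsEnvironment.getLocalGraphicsEnvironment()
                .getDefaultScreenDevice().getDefaultConfiguration();

        // A VolatileImage lives in video memory when the pipeline can accelerate it.
        VolatileImage image = gc.createCompatibleVolatileImage(640, 480);
        ImageCapabilities caps = image.getCapabilities();

        System.out.println("Accelerated: " + caps.isAccelerated());
        System.out.println("True volatile: " + caps.isTrueVolatile());
    }
}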

A GPU may also be used for non-graphics purposes - like Bitcoin farming - but in that case you need an external library to allow the computational code to be fed to the GPU. And, of course, you have to program it in something that can be compiled to GPU machine language.
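
For example, with a third-party library such as Aparapi (assuming it's on the classpath; the package name varies between Aparapi versions), the data-parallel part of the code is translated to OpenCL and handed to the GPU. A rough sketch:

import com.aparapi.Kernel;
import com.aparapi.Range;

public class GpuAdd {
    public static void main(String[] args) {
        final int size = 1_000_000;
        final float[] a = new float[size];
        final float[] b = new float[size];
        final float[] sum = new float[size];

        // Aparapi translates the bytecode of run() to OpenCL and executes it on the GPU,
        // falling back to a Java thread pool if no OpenCL device is available.
        Kernel kernel = new Kernel() {
            @Override
            public void run() {
                int i = getGlobalId();
                sum[i] = a[i] + b[i];
            }
        };

        kernel.execute(Range.create(size));
        kernel.dispose();
    }
}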

Splitting the difference: if you need graphics capabilities that a supercharged graphics card has special functions for, above and beyond what the Java graphics libraries (and the OS graphics drivers) provide, then you'll also need some appropriate external graphics library - probably a dedicated game engine.
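
For instance, a binding like LWJGL talks to the native OpenGL driver directly, which is also what puts the discrete GPU to work. A bare-bones sketch (window setup only, no game logic):

import org.lwjgl.glfw.GLFW;
import org.lwjgl.opengl.GL;
import org.lwjgl.opengl.GL11;

public class LwjglWindow {
    public static void main(String[] args) {
        if (!GLFW.glfwInit()) {
            throw new IllegalStateException("Unable to initialize GLFW");
        }
        long window = GLFW.glfwCreateWindow(800, 600, "Game", 0L, 0L);
        GLFW.glfwMakeContextCurrent(window);
        GL.createCapabilities(); // bind the OpenGL context supplied by the native driver

        while (!GLFW.glfwWindowShouldClose(window)) {
            GL11.glClear(GL11.GL_COLOR_BUFFER_BIT);
            // ... issue OpenGL draw calls here ...
            GLFW.glfwSwapBuffers(window);
            GLFW.glfwPollEvents();
        }
        GLFW.glfwTerminate();
    }
}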
 