
Don't be like me

 
Ernest Friedman-Hill
Got bitten by a JOGL "portability bug" that drove me crazy for about three hours today, and I thought I'd report it so it doesn't happen to anybody else!

OpenGL defines an unsigned integer type, "GLuint", which JOGL maps to a Java int. That mapping isn't quite right: on most architectures GLuint is a 32-bit unsigned int, while a Java int is signed. That means some code originally written in C won't work correctly if it's naively translated to Java.
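To see the mismatch in isolation (the value below is made up for illustration): any GLuint with its high bit set comes back through the Java binding looking negative, even though the bits are unchanged.

    public class GluintDemo {
        public static void main(String[] args) {
            // A hypothetical GLuint of 0xC0000001 (3221225473 read as unsigned)
            // lands in a signed Java int as a negative number.
            int handle = (int) 3221225473L;
            System.out.println(handle);                // prints -1073741823

            // Masking with 0xFFFFFFFFL recovers the unsigned value as a long.
            System.out.println(handle & 0xFFFFFFFFL);  // prints 3221225473
        }
    }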

The function glGenLists() (which creates display lists, basically an enormous optimization) returns a GLuint: 0 on failure, and a meaningful non-zero handle on success. Now, there exists C code that looks like
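this (a rough sketch of the usual idiom, not the exact snippet):

    GLuint list = glGenLists(1);   /* ask for one display list name */
    if (list > 0) {                /* "> 0" is fine in C, where GLuint is unsigned */
        glNewList(list, GL_COMPILE);
        /* ... drawing commands ... */
        glEndList();
    }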



But in Java, glGenLists() can return a negative number on success, so this code would be broken.

So, to make a long story short: NVIDIA's drivers return small positive integers here with GeForce3, GeForce4, Quadro 8xx, and Quadro 9xx cards, but the very same drivers return large values, which show up as negative numbers in Java, when you've got a Quadro FX card. The symptom was that the application ran very slowly on these high-end cards (because the optimization wasn't in force) and fast on the cheaper ones. It took me three hours to track this down. Don't let this happen to you! Now the code says
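something like this (a sketch of the non-zero check; the JOGL GL object is assumed to be named gl):

    int list = gl.glGenLists(1);      // ask for one display list name
    if (list != 0) {                  // non-zero means success, even if it's negative in Java
        gl.glNewList(list, GL.GL_COMPILE);
        // ... drawing commands ...
        gl.glEndList();
    }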

 