I've been delving into a bit of game-related development lately, and I have a thread which constantly renders an offscreen buffer to a panel. I'm emulating 80x25 text mode, so 16 colors are available (a 4-bit IndexColorModel). I had quite the time figuring out how to get fades to work smoothly, both in speed and quality, but now I've got most everything working how I'd like.
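For reference, the offscreen setup looks roughly like this (simplified from my actual code; the palette values and the 640x400 size, assuming 8x16 glyph cells, are just illustrative):

    import java.awt.image.BufferedImage;
    import java.awt.image.IndexColorModel;

    // 16-entry palette for the text-mode colors (actual values elided).
    byte[] r = new byte[16], g = new byte[16], b = new byte[16];
    // ... fill r/g/b with the 16 colors ...
    IndexColorModel palette = new IndexColorModel(4, 16, r, g, b);

    // Offscreen buffer that the render thread paints to the panel each frame.
    BufferedImage offscreen =
        new BufferedImage(640, 400, BufferedImage.TYPE_BYTE_BINARY, palette);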
My only gripe is that whenever performing fades, I must recreate the offscreen image (a BufferedImage) with the altered ColorModel at each step along the way. This eats up a couple megs' worth during each fade. That wouldn't even particularly bother me, except that it causes garbage collection to trigger every so often, and my fades ungracefully pause.
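Each fade step currently does roughly the following (again simplified; buildFadedPalette and FADE_STEPS are just stand-ins for my palette-scaling code and step count):

    for (int step = 0; step < FADE_STEPS; step++) {
        // Scale the 16 palette entries toward black (or back up) for this step.
        IndexColorModel fadedPalette = buildFadedPalette(step);

        // A brand-new image every step -- this is the allocation I'd like to avoid.
        BufferedImage next =
            new BufferedImage(640, 400, BufferedImage.TYPE_BYTE_BINARY, fadedPalette);
        next.setData(offscreen.getRaster());  // copy the 4-bit index data across unchanged
        offscreen = next;                     // the previous image becomes garbage

        // ... the render thread picks up the new image and paints it to the panel ...
    }

The index data never changes during a fade; only the palette does, which is what makes the per-step reallocation feel so wasteful.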
If I call System.gc() before each fade, I get the wait up front and then the fades are always fine, but I'd really rather just figure out how to switch the color model without having to reallocate memory, i.e., a more elegant solution. I do realize that I might not always be able to avoid these kinds of pauses in my Java apps, but I can at least take some preventive measures. :-)
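For the moment the workaround is just this, right before kicking off a fade:

    System.gc();  // take the collection hit up front, before the fade starts
    doFade();     // placeholder for the fade loop above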
A ColorModel is just a rule for color conversion after all, right? It seems rather silly that there wouldn't be some simple mechanism available for swapping out different ones. Perhaps right under my nose? Wouldn't that be swell...
So far I've been unable to find anything that does what I want, and I've been picking through the API for I don't know how long (okay, a couple weeks in my spare time). I've also surfed around quite a bit in search of example code that does anything similar, but the only example I did find was something to do with fractals and palette rotation, and it also regenerates the image every frame.
Any help/hints anyone can offer would be much appreciated!