BufferedImage color model

Miklos Szeles
Ranch Hand

Joined: Oct 21, 2008
Posts: 142
Hi,

Can anybody tell me which color model is used most widely in computers? I thought it was INT_BGR, but I tested the ColorModel returned from the GraphicsConfiguration on 3 computers and it was 3BYTE_BGR. Based on Filthy Rich Clients I thought that most computers use 32-bit color models nowadays:
http://my.safaribooksonline.com/9780132413930/ch05lev1sec2
As far as I know, when my image is not in a format compatible with the graphics device, a color space conversion happens before the image is drawn. The image data comes from a native image decoder which at the moment can only produce 3BYTE_BGR output. So I thought it might be worth implementing the INT_BGR model, but if computers don't use it I won't spend time on INT_BGR support.
Thanks in advance,
Miki
Brian Cole
Author
Ranch Hand

Joined: Sep 20, 2005
Posts: 862
Miklos Szeles wrote:
Can anybody tell me which color model is used most widely in computers? I thought it was INT_BGR, but I tested the ColorModel returned from the GraphicsConfiguration on 3 computers and it was 3BYTE_BGR. Based on Filthy Rich Clients I thought that most computers use 32-bit color models nowadays:
http://my.safaribooksonline.com/9780132413930/ch05lev1sec2
As far as I know, when my image is not in a format compatible with the graphics device, a color space conversion happens before the image is drawn. The image data comes from a native image decoder which at the moment can only produce 3BYTE_BGR output. So I thought it might be worth implementing the INT_BGR model, but if computers don't use it I won't spend time on INT_BGR support.


I can't tell you which color model is used most widely in computers, but the ones most widely used for images in RAM are TYPE_INT_RGB and TYPE_INT_ARGB. I recommend you use one of those.


bitguru blog
Miklos Szeles
Ranch Hand

Joined: Oct 21, 2008
Posts: 142
Hi Brian,

When you run this little program on your computer, do you get 32 for pixel size?
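A minimal sketch of such a program, assuming it just queries the default screen's GraphicsConfiguration (the class name ColorModelCheck is only illustrative):

import java.awt.GraphicsEnvironment;
import java.awt.image.ColorModel;

public class ColorModelCheck {
    public static void main(String[] args) {
        // Ask the default screen device for the color model it actually uses.
        ColorModel cm = GraphicsEnvironment.getLocalGraphicsEnvironment()
                .getDefaultScreenDevice()
                .getDefaultConfiguration()
                .getColorModel();
        System.out.println("Color model type: " + cm.getColorSpace().getType());
        System.out.println("Pixel size: " + cm.getPixelSize());
        System.out.println("Color model: " + cm.getClass().getSimpleName());
    }
}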

Ulf Dittmer
Marshal

Joined: Mar 22, 2005
Posts: 39578
    
On OS X I'm getting

Color model type: 5
Pixel size: 32
Color model: DirectColorModel


Ping & DNS - updated with new look and Ping home screen widget
Brian Cole
Author
Ranch Hand

Joined: Sep 20, 2005
Posts: 862
Ulf Dittmer wrote:
Color model type: 5


That 5 is ColorSpace.TYPE_RGB but it doesn't really tell you much. (It's actually the color space type, not the color model type. See ColorSpace vs. ColorModel.)

The whole world uses the sRGB color space, including any BufferedImage created with the 3-arg constructor regardless of which color model is specified.
Miklos Szeles
Ranch Hand

Joined: Oct 21, 2008
Posts: 142
I thought so. When I realized the color model wasn't the information I was looking for, I found the pixel size, which tells much more. So the question is not whether RGB is used, but which type of RGB is used. Is it operating system dependent? Video card dependent?
Brian Cole
Author
Ranch Hand

Joined: Sep 20, 2005
Posts: 862
Miklos Szeles wrote:I thought so. When I realized the color model wasn't the information I was looking for, I found the pixel size, which tells much more. So the question is not whether RGB is used, but which type of RGB is used. Is it operating system dependent? Video card dependent?


I'm not sure exactly what you are asking, but I stand by my original recommendation to use either TYPE_INT_RGB or TYPE_INT_ARGB.


I said above that the whole world uses RGB-family color spaces, but that's not entirely true. Analog television broadcasts (NTSC) use a YUV color space, and print advertising often uses CMYK.

You, however, want to use sRGB. The question is how many bits do you want to dedicate to the R-channel, the G channel, and the B channel, and in what order. (also: Alpha channel? indexed palette?) The possible choices don't technically make different "types" of the sRGB color space, but are instead mere bit-packing issues that are beyond/beneath the scope of the color space. Does that make sense? (With YUV or CMYK there would still be these bit-packing issues.)

How about we use one 32-bit int per pixel, and assign 8 bits per channel? The Blue channel gets the low-order bits, Green the next lowest, and Red the next lowest. If you want Alpha then it gets the highest-order bits, otherwise we just "waste" those 8 bits. This is TYPE_INT_RGB/TYPE_INT_ARGB and is ubiquitous.
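To make the packing concrete, here's a quick sketch (the channel values are arbitrary):

// Pack four 8-bit channels into one TYPE_INT_ARGB pixel.
int a = 0xFF, r = 0x12, g = 0x34, b = 0x56;
int pixel = (a << 24) | (r << 16) | (g << 8) | b;

// Unpack again: each channel is an 8-bit slice of the int.
int alpha = (pixel >>> 24) & 0xFF;
int red   = (pixel >>> 16) & 0xFF;
int green = (pixel >>>  8) & 0xFF;
int blue  =  pixel         & 0xFF;

For TYPE_INT_RGB you simply ignore the top 8 bits.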

If you want to use one of the BGR orderings instead, knock yourself out. I believe the difference from most of them (ok, not the 3-byte one) to RGB is just an endian flip, which modern hardware can do fairly quickly.

btw, I'm kind of pretending to be an expert on this stuff but I'm not. I know just enough to be dangerous.

[aside: I originally wrote R-channel without the hyphen, but CodeRanch prevented me from posting because "'r' is a silly English abbreviation."]
Miklos Szeles
Ranch Hand

Joined: Oct 21, 2008
Posts: 142
I think you misunderstood me. What I want is the following:
When you create an image with createCompatibleImage you get an image with the color format used by your display system, so whenever you use drawImage no additional color space conversion occurs, since your image is compatible with the device. When you use any other format, the system must convert the image. I use a native decoder which decodes images in YV12 format and then converts them to 3BYTE_BGR. If your graphics device format is 3BYTE_BGR then everything is okay, but an additional conversion happens in the background when the device's color format is different. I made some performance tests and they show that the conversion doubles the drawing time, so it is really important to skip the conversion if possible.
I'm using Windows XP and in the display settings panel 32-bit is chosen, so I thought that my display device must use some 32-bit color format. But I was wrong.
I tested this on a few computers and found that their compatible images use 3BYTE_BGR. But Filthy Rich Clients says that nowadays computers use 32-bit representations for pixels. So it was strange, since I experienced something different, and that was the reason to start this topic. As Ulf reported, his system uses an INT representation. So I just wanted to know which part of a computer determines the color format of the display device. Is it the video card? Is it the monitor? Is it the operating system?...
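For reference, a rough sketch of the kind of one-time conversion I mean: copy the decoded frame into a compatible image up front, so the later drawImage calls don't have to convert anything (the toCompatible helper name is only illustrative):

import java.awt.Graphics2D;
import java.awt.GraphicsConfiguration;
import java.awt.GraphicsEnvironment;
import java.awt.Transparency;
import java.awt.image.BufferedImage;

class CompatibleImages {
    // decoded: the 3BYTE_BGR frame produced by the native decoder.
    static BufferedImage toCompatible(BufferedImage decoded) {
        GraphicsConfiguration gc = GraphicsEnvironment.getLocalGraphicsEnvironment()
                .getDefaultScreenDevice().getDefaultConfiguration();
        // Already in the device's format? Then there is nothing to convert.
        if (decoded.getColorModel().equals(gc.getColorModel())) {
            return decoded;
        }
        // Pay the conversion cost once here instead of on every drawImage call.
        BufferedImage compatible = gc.createCompatibleImage(
                decoded.getWidth(), decoded.getHeight(), Transparency.OPAQUE);
        Graphics2D g = compatible.createGraphics();
        g.drawImage(decoded, 0, 0, null);
        g.dispose();
        return compatible;
    }
}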

Brian Cole
Author
Ranch Hand

Joined: Sep 20, 2005
Posts: 862
Miklos Szeles wrote:I think you misunderstood me.


You are correct. I misunderstood you.

What I want is the following:
When you create an image with createCompatibleImage you get an image with the color format used by your display system, so whenever you use drawImage no additional color space conversion occurs, since your image is compatible with the device. When you use any other format, the system must convert the image. I use a native decoder which decodes images in YV12 format and then converts them to 3BYTE_BGR.


Might it make sense to write your own ColorSpace (not ColorModel) for YV12 and use that? (Perhaps not.)
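In case that direction interests you, a very rough sketch of what such a ColorSpace might look like, using the standard BT.601 YCbCr equations (the class name and the full-range scaling are assumptions on my part, and it only covers per-pixel component conversion, not YV12's planar memory layout):

import java.awt.color.ColorSpace;

class YCbCrColorSpace extends ColorSpace {
    private static final ColorSpace SRGB = ColorSpace.getInstance(ColorSpace.CS_sRGB);

    YCbCrColorSpace() {
        super(ColorSpace.TYPE_YCbCr, 3);
    }

    // Y in [0..1]; Cb and Cr stored offset by +0.5 so they also fit in [0..1].
    @Override
    public float[] toRGB(float[] ycc) {
        float y = ycc[0], cb = ycc[1] - 0.5f, cr = ycc[2] - 0.5f;
        return clamp(new float[] {
                y + 1.402f * cr,
                y - 0.344136f * cb - 0.714136f * cr,
                y + 1.772f * cb });
    }

    @Override
    public float[] fromRGB(float[] rgb) {
        float r = rgb[0], g = rgb[1], b = rgb[2];
        return new float[] {
                0.299f * r + 0.587f * g + 0.114f * b,
                -0.168736f * r - 0.331264f * g + 0.5f * b + 0.5f,
                0.5f * r - 0.418688f * g - 0.081312f * b + 0.5f };
    }

    // Route the CIEXYZ conversions through sRGB rather than deriving them directly.
    @Override
    public float[] toCIEXYZ(float[] ycc) { return SRGB.toCIEXYZ(toRGB(ycc)); }

    @Override
    public float[] fromCIEXYZ(float[] xyz) { return fromRGB(SRGB.fromCIEXYZ(xyz)); }

    private static float[] clamp(float[] v) {
        for (int i = 0; i < v.length; i++) v[i] = Math.min(1f, Math.max(0f, v[i]));
        return v;
    }
}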

If your graphics device format is 3BYTE_BGR then everything is okay, but an additional conversion happens in the background when the device's color format is different. I made some performance tests and they show that the conversion doubles the drawing time, so it is really important to skip the conversion if possible.
I'm using Windows XP and in the display settings panel 32-bit is chosen, so I thought that my display device must use some 32-bit color format. But I was wrong.
I tested this on a few computers and found that their compatible images use 3BYTE_BGR. But Filthy Rich Clients says that nowadays computers use 32-bit representations for pixels. So it was strange, since I experienced something different, and that was the reason to start this topic. As Ulf reported, his system uses an INT representation. So I just wanted to know which part of a computer determines the color format of the display device. Is it the video card? Is it the monitor? Is it the operating system?...


I don't know much about Windows, but the javadoc for TYPE_3BYTE_BGR mentions "a Windows-style BGR color model" so maybe that's the one to use.

If you've tried all the possibilities and know which one is the fastest, it seems like that should be good enough.
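If you want to put numbers on it, something like this rough sketch should show the difference; it draws into a compatible offscreen image as a stand-in for the screen, and the sizes and loop count are arbitrary:

import java.awt.Graphics2D;
import java.awt.GraphicsConfiguration;
import java.awt.GraphicsEnvironment;
import java.awt.image.BufferedImage;

public class DrawTimingSketch {
    public static void main(String[] args) {
        GraphicsConfiguration gc = GraphicsEnvironment.getLocalGraphicsEnvironment()
                .getDefaultScreenDevice().getDefaultConfiguration();
        BufferedImage target = gc.createCompatibleImage(1280, 720);
        time("TYPE_3BYTE_BGR", target, new BufferedImage(1280, 720, BufferedImage.TYPE_3BYTE_BGR));
        time("TYPE_INT_RGB  ", target, new BufferedImage(1280, 720, BufferedImage.TYPE_INT_RGB));
    }

    static void time(String label, BufferedImage target, BufferedImage src) {
        Graphics2D g = target.createGraphics();
        long start = System.nanoTime();
        for (int i = 0; i < 500; i++) {
            g.drawImage(src, 0, 0, null);  // conversion happens here if formats differ
        }
        long elapsed = System.nanoTime() - start;
        g.dispose();
        System.out.println(label + ": " + (elapsed / 1000000) + " ms for 500 draws");
    }
}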

 