Hello everyone,
I hope you can help me with this problem.
I'm working on a new Galaxy S3 and ran into a problem that doesn't occur on other phones.
My problem is strange. It appears when I draw my OpenGL models on top of my camera preview image, like this:
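(The original code snippet is missing here. A typical setup for this, assuming a translucent GLSurfaceView layered over the camera's SurfaceView — the variable names are my guesses, not the original code:)

```java
// Sketch of a translucent GLSurfaceView on top of the camera preview
// (my assumption of the setup; the original snippet was lost).
glSurfaceView.setEGLContextClientVersion(2);
glSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);     // RGBA8888 + depth
glSurfaceView.getHolder().setFormat(PixelFormat.TRANSLUCENT);
glSurfaceView.setZOrderMediaOverlay(true);                // above the camera SurfaceView
glSurfaceView.setRenderer(myRenderer);
```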
So the GLRenderer should be transparent, and it actually is: everything works fine for the opaque objects as well as for the background (where I see the camera image).
When I make the pixels semi-transparent in the fragment shader, however, I get a weird result: the brightest pixels seem to overexpose when blending. It looks like the sum is only clamped after overflowing, so values brighter than white lose (probably) their top bit and come out dark again.
So only the brightest pixels are affected.
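To illustrate what I mean by "losing the top bit" (this is my hypothesis of what happens, not confirmed): if the compositor treats the GL surface as premultiplied alpha while the shader outputs straight alpha, the composited value can exceed 1.0, and if the overflow wraps in 8 bits instead of clamping, exactly the brightest pixels turn dark:

```java
// Hypothetical model of the artifact: straight-alpha output composited
// with a premultiplied-alpha equation, with 8-bit wrap-around on overflow.
public class OverBright {
    // Straight-alpha blend, clamped: the result I would expect.
    static int blendStraight(int src, int dst, int a255) {
        double a = a255 / 255.0;
        double out = (src / 255.0) * a + (dst / 255.0) * (1.0 - a);
        return (int) Math.round(Math.min(1.0, out) * 255.0);
    }

    // Premultiplied compositing applied to a NOT premultiplied source,
    // with the overflow wrapping instead of clamping (hypothesis).
    static int compositeWrapped(int src, int dst, int a255) {
        double a = a255 / 255.0;
        double out = (src / 255.0) + (dst / 255.0) * (1.0 - a);
        return ((int) Math.round(out * 255.0)) & 0xFF; // top bit lost
    }

    public static void main(String[] args) {
        // White fragment at ~50% alpha over a white camera pixel:
        System.out.println(blendStraight(255, 255, 128));   // 255, as expected
        System.out.println(compositeWrapped(255, 255, 128)); // 126: wraps, comes out dark
    }
}
```

This would also explain why only the brightest pixels are affected: darker pixels never push the sum past 1.0, so they never wrap.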
I use a simple call like:
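(The original snippet is missing; judging from the next sentence it was probably a clamp on the fragment output. Something like this — GLSL shown as a Java string constant, and all names here are my guesses, not the original code:)

```java
// Hypothetical fragment shader clamping the fragment colour.
static final String FRAGMENT_SHADER =
    "precision mediump float;\n" +
    "uniform sampler2D uTexture;\n" +
    "varying vec2 vTexCoord;\n" +
    "void main() {\n" +
    "    vec4 color = texture2D(uTexture, vTexCoord);\n" +
    "    color.a = 0.5;                         // semi-transparent\n" +
    "    gl_FragColor = clamp(color, 0.0, 1.0); // the 'simple call'\n" +
    "}\n";
```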
So the actual GL fragment is clamped, and I'm pretty sure the extra brightness comes from the camera preview itself.
Here is an image that describes my problem:
I hope someone here has seen this problem before. Is there any way of "clamping" both values together? In the shader I can only clamp the rendered fragment itself. :-(
By the way: it works perfectly fine on my old Galaxy S1 and on the S2 as well.
Thank you for your help,
Tobias