Pixel format bug?


SLotman(Posted 2007) [#1]
I just made a BMP with some dots (kind of a colored heightmap) to place some objects on a terrain - the object being defined by the pixel color.

The problem is: everything works fine in windowed mode at 32bpp - but not if the desktop is 16bpp.

A pixel that should be 0,255,0 comes back as 0,252,0 in 16bpp!

Just try it for yourselves:

Create an image with a single dot of RGB 0,255,0, save it as mapinfo.bmp, then run the program below:

Graphics3D 640,480,0,2
Tex = LoadTexture("mapinfo.bmp")

; read the top-left texel straight from the texture buffer
LockBuffer TextureBuffer(Tex)
rgb = ReadPixelFast(0,0,TextureBuffer(Tex))
UnlockBuffer TextureBuffer(Tex)

; unpack the packed ARGB value into 8-bit components
R = rgb Shr 16 And %11111111
G = rgb Shr 8 And %11111111
B = rgb And %11111111

While Not KeyHit(1)
   Cls
   UpdateWorld
   RenderWorld
   Text 10,10, Str$(R) + "-" + Str$(G) + "-" + Str$(B)
   Flip
Wend
End


Even using the "getcolor" command I get the same problem! Does anyone know anything about this?

Changing the mode to Graphics3D 640,480,16,1 also returns 0,252,0...

EDIT: I just remembered that 16bpp has 5 bits per color... but shouldn't it convert/return values between 0-255, especially when using ColorRed/ColorBlue/ColorGreen?


Floyd(Posted 2007) [#2]
16-bit color means that not all values are possible. I think green gets 6 of those bits, so there are 2^6 = 64 available levels of green, which come back as multiples of 4 on the 0-255 scale. 252 (63 * 4) is one of them, but 255 is not.
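
As a quick illustration (added here, not Floyd's code), this is what you get if the components are truncated to 5-6-5 and shifted back up to 8 bits - the exact rounding is up to the driver, but it matches the 252 you're seeing:

r = 255 : g = 255 : b = 255
r16 = (r Shr 3) Shl 3   ; keep the top 5 bits: 255 -> 248
g16 = (g Shr 2) Shl 2   ; keep the top 6 bits: 255 -> 252
b16 = (b Shr 3) Shl 3   ; keep the top 5 bits: 255 -> 248
Print Str$(r16) + "," + Str$(g16) + "," + Str$(b16)   ; prints 248,252,248
WaitKey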


SLotman(Posted 2007) [#3]
This should "fix" the problem:

Graphics3D 640,480,16,1
Tex = LoadTexture("mapinfo.bmp",1+256)

If GraphicsDepth() = 16 Then
   ; 16-bit mode: GetColor returns the quantized values
   ; (max 248/252/248 on this card), so rescale back to 0-255
   SetBuffer TextureBuffer(Tex)
   GetColor 0,0
   R = ColorRed() / 248.0 * 255
   G = ColorGreen() / 252.0 * 255
   B = ColorBlue() / 248.0 * 255
   Color 255,255,255   ; GetColor changes the current drawing color, so restore it
   SetBuffer BackBuffer()
Else
   ; 32-bit mode: read the packed ARGB value directly
   LockBuffer TextureBuffer(Tex)
   rgb = ReadPixelFast(0,0,TextureBuffer(Tex))
   UnlockBuffer TextureBuffer(Tex)

   R = rgb Shr 16 And %11111111
   G = rgb Shr 8 And %11111111
   B = rgb And %11111111
End If


While Not KeyHit(1)
   Cls
   UpdateWorld
   RenderWorld
   Text 10,10, Str$(r) + "-" + Str$(g) + "-" + Str$(b)
   Flip
Wend
End



SLotman(Posted 2007) [#4]
And even better: now I'm reading the bitmap through my own code and don't have to care about the desktop bit depth ;)
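
In case it helps anyone, a minimal sketch of that idea (not the actual loader from this post; it assumes an uncompressed 24-bit BMP and only reads the bottom-left pixel, since BMP stores its rows bottom-up in BGR order):

file = ReadFile("mapinfo.bmp")
If file = 0 Then RuntimeError "mapinfo.bmp not found"
SeekFile file,10               ; header offset 10 holds the offset of the pixel data
pixelOffset = ReadInt(file)
SeekFile file,pixelOffset
b = ReadByte(file)
g = ReadByte(file)
r = ReadByte(file)
CloseFile file
Print Str$(r) + "-" + Str$(g) + "-" + Str$(b)   ; 0-255-0 regardless of desktop depth
WaitKey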


Rob Farley(Posted 2007) [#5]
That won't fix the problem.

It's all down to how the graphics card deals with 16-bit.

Some have 555 and, as you've found, 565. Some will emulate 24-bit.

The easiest way I've found is to write the pixel to an image buffer and read it back out again. That way you'll get the correct results regardless of the video card.
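
A sketch of one way to do that (an illustration added here, not code posted in the thread; it assumes the image buffer ends up in the same pixel format as the texture buffer): write the reference color into a 1x1 image, read it back, and compare texture pixels against that round-tripped value instead of against a literal 0,255,0.

Graphics3D 640,480,0,2

; round-trip pure green through an image buffer to get the value
; this card/mode will actually store
img = CreateImage(1,1)
LockBuffer ImageBuffer(img)
WritePixelFast 0,0,$00FF00,ImageBuffer(img)
refRGB = ReadPixelFast(0,0,ImageBuffer(img)) And $FFFFFF   ; mask off alpha
UnlockBuffer ImageBuffer(img)
FreeImage img

; read the texel as before and compare against the round-tripped value
Tex = LoadTexture("mapinfo.bmp")
LockBuffer TextureBuffer(Tex)
rgb = ReadPixelFast(0,0,TextureBuffer(Tex)) And $FFFFFF
UnlockBuffer TextureBuffer(Tex)

Cls
If rgb = refRGB Then Text 10,10,"green dot found" Else Text 10,10,"no match"
Flip
WaitKey
End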