ReadPixel Problem

Blitz3D Forums/Blitz3D Programming/ReadPixel Problem

HappyCat(Posted 2005) [#1]
I've mentioned this already in another thread, but it seems to have gone unnoticed, so I thought I'd try once more.

On one of my PCs (with a GeForce 4 graphics card), ReadPixelFast on a black pixel works fine and returns $00000000 as expected, but on another (with Intel graphics), ReadPixelFast returns $0000FFFF for the same pixel. In fact, for every pixel, the green and blue bytes are always $FF.

This is all in 32-bit colour.

Any thoughts as to why this might be?
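For context, ReadPixelFast returns a packed 32-bit ARGB value. A minimal sketch of unpacking all four bytes (variable names are just illustrative):

```
; Decode a packed ARGB pixel value
Local ARGB = $0000FFFF ; the value the affected PC returns for "black"
Local Alpha = (ARGB Shr 24) And $FF ; 0
Local Red   = (ARGB Shr 16) And $FF ; 0
Local Green = (ARGB Shr 8)  And $FF ; 255
Local Blue  = ARGB And $FF          ; 255
```

So the odd result is specifically "green and blue fully on, red and alpha zero", rather than random garbage.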


JoshK(Posted 2005) [#2]
You have a piece of hardware with the word "Xtreme" in the name.


big10p(Posted 2005) [#3]
Do you still get the same results if you use the slower ReadPixel command on an unlocked buffer?

You have been locking the buffer before using ReadPixelFast, right? Just checking. :)


HappyCat(Posted 2005) [#4]
You have a piece of hardware with the word "Xtreme" in the name.

"Express", but not "Xtreme". So, are there known problems with some cards then?

You have been locking the buffer

Yep.

Do you still get the same results if you use the slower ReadPixel command on an unlocked buffer?

Yep, still the same.


BlitzSupport(Posted 2005) [#5]
Can you post a small example that demonstrates this? Does it happen in different display sizes? Also, what specific graphics chipset is it?

I thought it might be related to this thing (which I think is the result of different display formats/pitches, but which usually only seems to show up in 16-bit modes), but that does seem rather a huge difference, i.e. $00000000 versus $0000FFFF.

What results do you get back when writing something like $FF000000 or $00000001?
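One way to try that is a write-then-read round trip on a locked buffer. A minimal sketch (the back-buffer setup here is illustrative, not from the original code):

```
; Sketch: write known ARGB values, then read them back to compare
Graphics 640, 480, 32, 2
Local Buff = BackBuffer()
LockBuffer(Buff)
WritePixelFast(0, 0, $FF000000, Buff)
WritePixelFast(1, 0, $00000001, Buff)
DebugLog(Hex$(ReadPixelFast(0, 0, Buff))) ; should match what was written if the buffer is faithful
DebugLog(Hex$(ReadPixelFast(1, 0, Buff)))
UnlockBuffer(Buff)
```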


HappyCat(Posted 2005) [#6]
The following recreates the problem for me on the affected PC and works fine on the other PC.

On the affected PC I get "0, 255, 255" for every pixel. On the other PC I get "0, 0, 0" for them all.

Just realised that it's the use of texture flag 2 on the CreateTexture that's causing it - miss that out and it works fine, but I need it for applying alpha to the texture, which is what I'm trying to do :-)

Graphics3D 640, 480, 0, 2

; Texture flag 2 (alpha) is what triggers the problem - leave it out and this works fine
Local tex = CreateTexture(100, 100, 2)

; Read back and print out the RGB values of every texel
Local TexBuff = TextureBuffer(tex)
LockBuffer(TexBuff)

For X = 0 To TextureWidth(tex) - 1
	For Y = 0 To TextureHeight(tex) - 1

		; Mask off the alpha byte, then unpack the RGB components
		Local RGB = ReadPixelFast(X, Y, TexBuff) And $00FFFFFF

		Local Red = (RGB Shr 16) And $FF
		Local Green = (RGB Shr 8) And $FF
		Local Blue = RGB And $FF

		DebugLog(Red + ", " + Green + ", " + Blue)

	Next
Next

UnlockBuffer(TexBuff)



BlitzSupport(Posted 2005) [#7]
Is it meant to be a pure alpha texture? What happens if you set flags to 1 + 2 (colour plus alpha)?


HappyCat(Posted 2005) [#8]
In my best Victor Meldrew ... I do not believe it!

1+2 works fine. I originally wrote the code on the other PC (where it worked with just 2), so I just didn't think about it. I was assuming the read and write pixel commands were the culprit.
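For the record, the fix amounts to one changed flag on the CreateTexture call (1 = colour, 2 = alpha, so 1+2 gives a colour texture with an alpha channel):

```
; Flag 1 (colour) + flag 2 (alpha) - works on both machines
Local tex = CreateTexture(100, 100, 1 + 2)
```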

Erm ... I'll just shuffle over into this dark corner here ...

PS. Thanks :-)


Shifty Geezer(Posted 2005) [#9]
That's invariably the way. I find posting on this forum often helps me find my own solutions before anyone helps out ;)


BlitzSupport(Posted 2005) [#10]

1+2 works fine


Glad you picked up the right flags. I've corrected my post in case some other poor sod stumbles upon it...