ReadPixel Discrepancy

pc_tek(Posted 2008) [#1]
Another strange one that has me banging my head against the desk.

In full screen, 'ReadPixel' returns a different colour from the one it returns in a windowed screen.

The command I use is 'ReadPixel(x,y) And $FFFFFF'.

I.e., a white (255,255,255) pixel read in a window returns 16777215. In full screen, the same pixel read returns 16317688.
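Stripped down, it's roughly this (a sketch; the resolution and pixel position are arbitrary):

Graphics 800,600,16,1                ; fullscreen, 16-bit: prints 16317688
;Graphics 800,600,0,2                ; windowed on a 32-bit desktop: prints 16777215
Color 255,255,255
Plot 10,10
Print ReadPixel(10,10) And $FFFFFF   ; mask off the alpha byte
WaitKey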

Any ideas as to what is happening here?


Gabriel(Posted 2008) [#2]
In windowed mode you cannot set a colour depth; the colour depth will be whatever your desktop is set to. So if your desktop is set to a different colour depth than the one you are setting in fullscreen, this would account for the differences. For example, perhaps your desktop is 16-bit colour but you're setting 32-bit colour in fullscreen? That would give you different colour values.

Although I wouldn't have expected white to be affected like this.


pc_tek(Posted 2008) [#3]
You are correct. My Windows desktop colour depth is 32-bit, while my full screen proggy is 16-bit.

Other colours are affected too, though, not just pure white.


Matty(Posted 2008) [#4]
That is normal for the other colours as well. In 32-bit, each colour component (red/green/blue) uses 8 bits, whereas in 16-bit it is something like 5 bits for two of the components and 6 bits for the other (can't remember which of blue/green/red). Therefore it is highly unlikely that the value returned by ReadPixel will be the same in both 32-bit and 16-bit modes.
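To see where 16317688 comes from (a quick worked sketch; the shifts assume the common 5-6-5 layout, which is what the number in the first post points to, since six bits survive in green):

; quantise pure white the way a 5-6-5 16-bit mode would
r = (255 Shr 3) Shl 3                 ; 5 red bits:   255 -> 248
g = (255 Shr 2) Shl 2                 ; 6 green bits: 255 -> 252
b = (255 Shr 3) Shl 3                 ; 5 blue bits:  255 -> 248
Print (r Shl 16) Or (g Shl 8) Or b    ; prints 16317688, i.e. $F8FCF8
WaitKey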


Snarkbait(Posted 2008) [#5]
Just force the depth by specifying it.
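Something like this (a sketch; the last Graphics parameter is the mode, and the depth is only honoured in fullscreen):

Graphics 800,600,32,1    ; fullscreen at an explicit 32-bit depth
;Graphics 800,600,32,2   ; windowed: the 32 is ignored, the desktop depth wins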


Yan(Posted 2008) [#6]
Real men search...

http://blitzbasic.com/Community/posts.php?topic=31316

;op


Gabriel(Posted 2008) [#7]
Just force the depth by specifying it.

As I said, you can't force colour depth in windowed mode. It will be the same as your desktop, regardless of what you specify.


jfk EO-11110(Posted 2008) [#8]
Today I wouldn't use 16-bit anymore; the problem mentioned here is only one of several that may appear. It is, however, nice to offer flexible screen settings, so even old rigs can run your game in a worst-case scenario like 640*480*16. Games usually don't need to search for exact RGB matches in pixels. If a tool is doing so, then you may force 32-bit.
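And if you do have to cope with both depths, one rough approach is to compare pixels with the low bits masked off rather than exactly (a sketch; the $F8FCF8 mask assumes a 5-6-5 layout):

; true if two ARGB values match within what a 16-bit mode can store
Function SameColour(argb1, argb2)
    Return (argb1 And $F8FCF8) = (argb2 And $F8FCF8)
End Function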


Warner(Posted 2008) [#9]
There is also the command GetColor; after calling it, ColorRed/ColorGreen/ColorBlue return the components. Maybe you can use that instead?
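Roughly like this (a sketch; GetColor reads from the current buffer, and the values are still subject to the same depth quantisation as ReadPixel):

Graphics 640,480,0,2   ; windowed
Color 255,255,255
Plot 10,10
GetColor 10,10         ; copies that pixel's colour into the current drawing colour
Print ColorRed()       ; 255 on a 32-bit desktop
Print ColorGreen()
Print ColorBlue()
WaitKey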


Gabriel(Posted 2008) [#10]
There is also the command GetColor; after calling it, ColorRed/ColorGreen/ColorBlue return the components. Maybe you can use that instead?

Why? The results retrieved with ReadPixel are correct.

Today I wouldn't use 16-bit anymore; the problem mentioned here is only one of several that may appear.

Agreed. There's no real reason to support 16-bit any more, but you do have to be aware that the end user's desktop could be set at 16-bit, so whether you like it or not, in windowed mode you might be stuck in 16-bit. Perhaps a little bit of jiggery pokery with the Windows API could detect if the desktop is at 16-bit and post an error message warning the user to change to 32-bit colour if it is?
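Possibly you don't even need the API. A sketch, assuming GraphicsDepth() reports the desktop's depth once a windowed mode is open:

Graphics 640,480,0,2   ; windowed: depth follows the desktop
If GraphicsDepth() = 16
    RuntimeError "Please switch your desktop to 32-bit colour and restart."
EndIf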


markcw(Posted 2008) [#11]
If you really wanted to maximize how many people played your game, you would support 16-bit. I never really found it much more work to implement. You should try to avoid using ReadPixel/WritePixel if possible, especially during gameplay. I think 16-bit is 565 in bitmaps and 555 for video.


Gabriel(Posted 2008) [#12]
If you really wanted to maximize how many people played your game, you would support 16-bit.

Well, if your game is really going to work in all other respects on a video card which is 10-12 years old and probably hasn't had any new drivers released for five years, I guess.

I never really found it much more work to implement.

It's not about the amount of work, it's about making the experience worse for more users than you improve it for. For every one person who really can't use 32-bit colour and gets to play your game in 16-bit colour, ten other people suffer glitchy graphics because you didn't tell them to switch to 32-bit colour in the control panel. You see, you don't just get fewer colours in 16-bit video mode, you also get vastly reduced z-buffer precision, which means far more vertex wobbling and depth-sorting issues. Not to mention the obvious texture banding problems.


Ross C(Posted 2008) [#13]
PLUS, a lot of cards are actually slower in 16-bit mode. I've read a good few articles on this. Since they work internally in 32-bit, everything has to be stepped down to 16-bit.


markcw(Posted 2008) [#14]
Well, my point wasn't really about supporting old video cards (although seeing as you have this option in B3D you might as well take it). Quite a few people (like myself) have cards with 16-bit and 32-bit options, so if a user is in 16-bit mode, as many will be, and the game is run in windowed mode, then you have to use 16-bit. If you don't, it won't look pretty. It's fine in full screen, though, as you can then switch to 32-bit (assuming the card has 32-bit, which it should).