What's the problem with 24-bit?


D4NM4N(Posted 2006) [#1]
Why can't Blitz handle 24-bit color?


GfK(Posted 2006) [#2]
It does. It's probably your card that doesn't.


Danny(Posted 2006) [#3]
Not meaning to hijack, but come to think of it: what would be the difference between running a Blitz3D app in 24-bit or 32-bit color (assuming your card supports both)?

32-bit doesn't give 'additional colors' and normally just means the addition of an alpha channel - but in realtime 3D terms, what's the difference, if any?

Danny


Damien Sturdy(Posted 2006) [#4]
Um, 32-bit colours are exactly the same as 24-bit; the extra 8 bits come from adding an alpha byte. That's it, AFAIK.
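
Something like this (a rough, untested sketch - the values and variable names are made up) shows how the four bytes pack into one 32-bit int:

	; pack four 8-bit channels into a single 32-bit ARGB int
	a = 255 : r = 200 : g = 100 : b = 50
	argb = (a Shl 24) Or (r Shl 16) Or (g Shl 8) Or b

	; and unpack them again
	r2 = (argb Shr 16) And $FF
	g2 = (argb Shr 8) And $FF
	b2 = argb And $FF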


Matty(Posted 2006) [#5]
Does it have any effect on the bit depth of the z-buffer? If so, it may simply increase the accuracy of the z-buffer in 32-bit mode compared to 24- or 16-bit mode.


markcw(Posted 2006) [#6]
I think it's because 24-bit is an image color format, not a screen/video format.


jfk EO-11110(Posted 2006) [#7]
Maybe the graphics card does use 24 bits per pixel internally. Compared to that, 32-bit leaves 8 bits unused, but access may be faster since each pixel can be read in a single aligned instruction, whereas 24-bit puts every second pixel at an odd address (probably slower) and also requires ANDing the value with $FFFFFF.

Since 2D graphics don't use an alpha channel in DX, I guess both modes look exactly the same.
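
Roughly like this, if you picture the frame buffer as a Blitz bank (hypothetical buffer, untested):

	; hypothetical 32 bpp frame buffer held in a bank, purely for illustration
	bank = CreateBank(640*480*4)
	n = 123                       ; some pixel index
	argb = PeekInt(bank, n*4)     ; one aligned 32-bit read per pixel
	rgb = argb And $FFFFFF        ; mask off the unused/alpha byte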


smilertoo(Posted 2006) [#8]
It's been a long time since I've seen any of my drivers offer 24-bit; it's always 16/32-bit now.


Damien Sturdy(Posted 2006) [#9]
Muk, my old Intel onboard supported 16- or 24-bit screen modes only :-)


D4NM4N(Posted 2006) [#10]
I'm writing for an embedded PC that only supports 16/24-bit, but for some reason Blitz cannot start 3D in 24-bit color, only 16. Windows, however, has no problems with 24-bit. I would like to get the 24-bit option working to cut out the z-buffer problems.

Even on my new PC Blitz seems to have problems with 24-bit, and that's a modern nvidia job.
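
For now I'm stuck on 16-bit, so I'm trying to tame the z-buffer by keeping the camera range as tight as I can - something like this (still testing on the embedded box):

	Graphics3D 800,600,16,1
	camera = CreateCamera()
	; raising the near plane helps 16-bit z precision far more than
	; lowering the far plane does
	CameraRange camera, 1.0, 1000.0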


big10p(Posted 2006) [#11]
Your PC may support 24-bit but it may not support 3D in a 24-bit mode, for some reason. Try using GfxMode3D to see if 3D is supported in the specified mode.
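
Something like this (untested) should list every mode the driver reports, and whether 3D works in it:

	; print every video mode and whether it supports 3D
	For i = 1 To CountGfxModes()
		s$ = GfxModeWidth(i) + "x" + GfxModeHeight(i) + "x" + GfxModeDepth(i)
		If GfxMode3D(i) Then s$ = s$ + "  (3D ok)" Else s$ = s$ + "  (no 3D)"
		Print s$
	Next
	WaitKey()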


Rroff(Posted 2006) [#12]
24-bit display modes used to be sloooow; even on cards/drivers that supported 16-, 24- and 32-bit, 32-bit was often 4x faster than 24-bit...

While your display adapter might have enough memory to do 24-bit in 2D, it might not have enough for 3D once you add the requirements of a z-buffer etc.


Sir Gak(Posted 2006) [#13]
Interesting discussion. If I remember correctly, 24-bit was claimed at some point to be a measure of the maximum number of colors that can be differentiated by the human eye. You know, like how the human ear can't hear audio frequencies below 20 Hz or above 20 kHz, so it was maintained that the human eye can't see more than 16,777,216 (i.e. 24 bits) worth of colors. Me, I think this was just a justification for using RGB values: 8 bits for red, 8 bits for green, and 8 bits for blue. Anybody know any credible science that establishes the maximum number of colors the human eye can see?

Also, as a side note on 24-bit: I normally use 32-bit color on my PC. I recently used some software that complained and insisted it could NOT run in 32-bit, and that I had to downgrade to 24-bit. Say, WHAT? What a bogus piece of work, as if I would change my system settings to suit them! Sheesh!

Blitzers, do NOT do this to your users!


big10p(Posted 2006) [#14]
Anybody know any credible science that establishes the maximum number of colors the human eye can see?
I don't know, but I do know 24 bits aren't enough when it comes to single shades of a primary colour, or grey. Each RGB channel only has 8 bits, so say I want a gradient from pure red to black - that only gives 256 shades to step through. Do that over a large area and you can clearly see banding. It's probably even more noticeable on a greyscale gradient.
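
Easy to see for yourself - this (untested) stretches the 256 red shades across a 1024-pixel-wide screen, so each shade is 4 columns wide and the bands jump out:

	Graphics 1024,768,32,1
	For x = 0 To 1023
		Color (x*256)/1024, 0, 0    ; 256 shades over 1024 columns
		Line x, 0, x, 767
	Next
	WaitKey()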


jfk EO-11110(Posted 2006) [#15]
Exactly. 24-bit and 32-bit were only a step after 8-bit and 16-bit color. Soon we'll have 48/64-bit, which will also remove banding artefacts on gradients. Good graphics apps already work with more than 24 bits internally: they expand the source RGB to e.g. 48-bit, do all the calculations, and only then convert back to 24-bit.
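
The same idea in miniature (toy numbers, untested): re-quantise to integer after every step and you lose values; keep the intermediate at higher precision and you don't:

	v = 100
	; naive: round back to integer after every operation
	a = (v * 110) / 256         ; darken - integer division truncates here
	a = (a * 256) / 110         ; try to undo it: a comes back as 97, not 100

	; better: keep the intermediate result in higher precision (a float here)
	b# = (v * 110.0) / 256.0
	b# = (b# * 256.0) / 110.0   ; comes back as exactly 100.0
	Print a + " vs " + Int(b#)  ; "97 vs 100"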


fredborg(Posted 2006) [#16]
The thing is that the human eye (a perfect one) can see over 100 million different colors.

The claim about 16.7 million colors is correct only in the sense that a screen cannot produce the full range of colors the eye can see - it simply doesn't cover a large enough part of the spectrum.

The banding problem does in fact not come from only 256 shades being available, but from most screens not remapping intensities in a 1:1 ratio, so you get visible color jumps between certain values. This is particularly noticeable on LCD screens, as they cover a smaller color range and have sharper definition compared to CRT screens.

Internally it makes sense to compute colors at higher precision, to avoid clamping and other artifacts, but when it comes to displaying them, 24-bit is enough at the current level of display technology.


D4NM4N(Posted 2006) [#17]
I thought the only difference between 24 & 32 was an extra 8 bits for the alpha channel, nothing to do with the number of colors (as jfk pointed out).

Could be wrong.

Anyway, the thing can't do 3D in 24-bit, only 16 (sucks). Oh well
:(


dynaman(Posted 2006) [#18]
> I thought the only difference between 24 & 32 was an extra 8 bits for the alpha channel, nothing to do with the number of colors

Unless there is some really goofy card out there, you are correct.

> Anyway, the thing can't do 3D in 24-bit, only 16 (sucks). Oh well
Most cards can't - don't feel bad. (Does anyone know a card that can do 3D in 24-bit?)


AbbaRue(Posted 2006) [#19]
One reason for using 32-bit instead of 24-bit is that 32-bit numbers are easier for the processor to manipulate. There are no 24-bit registers in the processor, only 8-, 16-, 32-, 64- and 128-bit registers. So every 24-bit value has to be widened to 32 bits before it can be used, which can cause some slowdown.
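
You can see the cost if you picture a 24 bpp image in a Blitz bank (hypothetical, just to illustrate) - every pixel has to be widened to a 32-bit int by hand:

	bank = CreateBank(640*480*3)   ; pretend this holds a 24 bpp image
	n = 123                        ; some pixel index
	off = n*3                      ; 24 bpp pixels aren't 4-byte aligned
	; three byte reads plus shifts, instead of a single PeekInt
	rgb = PeekByte(bank,off) Or (PeekByte(bank,off+1) Shl 8) Or (PeekByte(bank,off+2) Shl 16)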