32 bit depth slower than 16?

BlitzMax Forums/BlitzMax Beginners Area/32 bit depth slower than 16?

Rixarn(Posted 2009) [#1]
Hi!

I was wondering, what's the main difference between 32 and 16 bit depth? I'm making a game, and I want it to run on older PCs as far as possible :) So I'm willing to sacrifice a little picture quality in favor of performance and speed. Thanks! :D

(btw, I'm using PNG for my graphics... and when they have gradients or the like they look, well, not nice he he, gradients look horrible)


SLotman(Posted 2009) [#2]

(btw, I'm using PNG for my graphics... and when they have gradients or the like they look, well, not nice he he, gradients look horrible)



That's the difference. In 16bpp you get fewer colors, so less smooth transitions from one color to the next.

And unless you have a VERY old gfx card (pre-GeForce old), going to 16bpp won't give you any speed improvement. Lowering the resolution will have a much bigger effect on speed than changing the bit depth.

What you need to do is change render states as rarely as possible, and re-use and pack your images/textures as much as you can.
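The state-change advice above can be sketched like this (Python rather than BlitzMax, purely to illustrate the idea; `bind_texture` and `draw` are hypothetical stand-ins for whatever your engine actually provides). Sorting draw calls by texture means you bind each texture once per frame instead of once per sprite:

```python
from collections import namedtuple

Sprite = namedtuple("Sprite", "texture x y")

def draw_batched(sprites, bind_texture, draw):
    """Draw sprites grouped by texture so state changes are minimized."""
    current = None
    binds = 0
    # Sorting groups all sprites sharing a texture together.
    for s in sorted(sprites, key=lambda s: s.texture):
        if s.texture != current:
            bind_texture(s.texture)  # expensive state change
            current = s.texture
            binds += 1
        draw(s)                      # cheap once the texture is bound
    return binds  # one bind per distinct texture, not per sprite
```

With four sprites alternating between two textures, a naive loop would bind four times; the batched version binds only twice.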


ImaginaryHuman(Posted 2009) [#3]
I agree. When you're using 16-bit it's probably something like 5-bit red, 6-bit green and 5-bit blue, or it might be 5 bits each for red/green/blue plus 1 bit of alpha.

Either way, the difference between 5 bits and 8 bits per color component is the difference between 32 levels and 256 levels - that's a lot - an 8x loss of precision. So if you had a gradient going:

$00,$00,$00
$01,$01,$01

it's going to get changed in 16-bit to the equivalent of either

$00,$00,$00

or $08,$08,$08
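To make that quantization concrete, here's a small Python sketch (not BlitzMax; it just illustrates the arithmetic) of how an 8-bit channel value collapses to the nearest representable 5-bit level and back:

```python
def quantize_5bit(c8):
    """Truncate an 8-bit channel to 5 bits, the way a 16-bit 565/555
    mode stores it, then expand back to 8 bits by bit replication."""
    c5 = c8 >> 3                   # keep only the top 5 bits (32 levels)
    return (c5 << 3) | (c5 >> 2)   # replicate top bits back into 0..255

# The $01,$01,$01 step in the gradient collapses to the same level as $00:
print(quantize_5bit(0x00), quantize_5bit(0x01), quantize_5bit(0x08))
# -> 0 0 8  (the first visible step up from black is $08)
```

Every group of eight adjacent 8-bit values maps to one 5-bit level, which is exactly why smooth gradients turn into visible bands.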

Gradients especially are going to look bad in 16-bit - which can be slightly helped in some cases by switching on dithering. The hardware can usually do the dithering for you.

I have found that a graphics card designed to be optimal at 32-bit color may actually get slower when you go to 16-bit, because its fast paths aren't streamlined for that bit depth.

These days I think you can safely drop support for anything less than 32-bit color; it's just not worth it. Not so long ago, when I saw some stats on people's PCs, those with a 16-bit display were a really small percentage.


Rixarn(Posted 2009) [#4]
Hey, thanks both for your answers :) If that's so, then I have no problem doing my stuff at 32-bit depth... thanks!


MGE(Posted 2009) [#5]
Stick to 32-bit. GPUs for the past few years have been optimized to run faster at 32-bit; 16-bit may actually be slower, depending on what you're doing on the GPU.