Most common pixel format?

BlitzMax Forums/BlitzMax Programming/Most common pixel format?

ImaginaryHuman(Posted 2005) [#1]
What is the most common pixel data format, and on what platform?

RGBA? ARGB?


xlsior(Posted 2005) [#2]
Before assuming anything, just use PixmapFormat() and you can tailor your routines to run on any computer.


Hotcakes(Posted 2005) [#3]
Yeh AD, having had a quick look through some of the image loading routines, it seems some systems store data as GRB - so you can't really assume anything ;]


ImaginaryHuman(Posted 2005) [#4]
I'm not using pixmaps, I have my own bitmap system. The reason I ask about the format is that if more computers use one particular format, I can optimize for that one ahead of the others, so there's less likelihood of having to convert things all the time.

Having said that, I suppose that in order to upload a texture to OpenGL I have to define its source format and OpenGL handles the on-the-fly conversion, so I guess it doesn't matter too much.

But if OpenGL is slowed down in any way by trying to draw an RGBA texture rather than an ARGB texture, I'd like to know.
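(For reference: the format argument of glTexImage2D is where you name the source layout — GL_RGBA, or GL_BGRA with OpenGL 1.2 / the EXT_bgra extension — and the driver swizzles on upload. A minimal software equivalent of that BGRA-to-RGBA swizzle, just as a sketch, would be:)

```c
#include <stddef.h>
#include <stdint.h>

/* Convert a BGRA8888 pixel buffer to RGBA8888 in place - roughly what
   the driver does when you upload with
   glTexImage2D(..., GL_RGBA8, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, pixels). */
static void bgra_to_rgba(uint8_t *pixels, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; i++) {
        uint8_t *p = pixels + i * 4;
        uint8_t b = p[0];   /* swap blue and red;           */
        p[0] = p[2];        /* green and alpha stay put     */
        p[2] = b;
    }
}
```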


ImaginaryHuman(Posted 2005) [#5]
Bump.

What is the most common format? Do most PC cards use ARGB?
What does the Mac use?


ImaginaryHuman(Posted 2005) [#6]
Seems like it's ARGB most commonly


xlsior(Posted 2005) [#7]
Many of the Macs use the same video adapters as PCs do (Radeon 9x00), so I would expect them to be the same?


ImaginaryHuman(Posted 2005) [#8]
And that same format IS????????????


Hotcakes(Posted 2005) [#9]
Mine is GRBA (GeForce 6600). I don't think it's safe to assume anything.


marksibly(Posted 2005) [#10]
Win32 (little endian) : BGRA

MacOS (big endian) : RGBA

Fun stuff!
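(The byte order really does flip with the CPU: the same 32-bit pixel value lays its bytes out in memory in opposite orders on little- and big-endian machines. A small C check — the helper name here is just illustrative:)

```c
#include <stdint.h>
#include <string.h>

/* Returns 1 if this CPU is little-endian, so a 0xAARRGGBB pixel lands
   in memory as B,G,R,A; returns 0 if big-endian, where it lands as
   A,R,G,B. */
static int pixel_bytes_are_bgra(void)
{
    uint32_t argb = 0xAA112233u;   /* A=0xAA R=0x11 G=0x22 B=0x33 */
    uint8_t bytes[4];
    memcpy(bytes, &argb, 4);
    return bytes[0] == 0x33;       /* blue byte first => little-endian */
}
```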


xlsior(Posted 2005) [#11]
MacOS (big endian) : RGBA


So... Any idea if Apple will also change over to little endian when they switch to Intel CPUs in the not-too-distant future?
Isn't the big/little endianness tied to the CPU architecture, and not the operating system?


taxlerendiosk(Posted 2005) [#12]
Wait, is alpha the "leftmost" or "rightmost" byte? If you have a byte representing alpha do you have to shift it to the left or keep it where it is?


ImaginaryHuman(Posted 2005) [#13]
It's probably for efficiency: you usually want to access the RGB more often than the A, so you put the RGB in the first-accessed bytes, whether that's the little end or the big end.

I guess it doesn't matter too much. I prefer the Mac's RGBA way, but maybe doing stuff in ARGB is more efficient, coding-wise.
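(To make the shifting question above concrete: when pixels are handled as 32-bit ARGB integers, alpha is the "leftmost" — most significant — byte, so you shift it up by 24 when packing and down by 24 when extracting. A quick sketch:)

```c
#include <stdint.h>

/* Pack and unpack a 32-bit ARGB integer, where alpha is the most
   significant byte (0xAARRGGBB). */
static uint32_t pack_argb(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)a << 24) | ((uint32_t)r << 16)
         | ((uint32_t)g << 8)  |  (uint32_t)b;
}

static uint8_t alpha_of(uint32_t argb)
{
    return (uint8_t)(argb >> 24);   /* shift alpha back down to a byte */
}
```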


Hotcakes(Posted 2005) [#14]
Oh OK, so it's just an endian thing. And yes, xlsior, if Apple switch to Intel, their endianness will change too.