OpenGL reporting errors


ImaginaryHuman(Posted 2006) [#1]
It seems that when I run the program below, with a 16-bit screen, or indeed any kind of screen, even in a window on a 16-bit desktop, it reports 8 bits per color component, plus a 24-bit depth buffer, plus a stencil buffer. This doesn't seem to vary when I change the mode or the depth of the display.

Does anyone else get the same?

P.S. This program probably bombs out anyway, because the gfx card doesn't technically support 16-bit color with a stencil buffer or an alpha buffer - but it should still print the results.

' Open a 16-bit GL display requesting every optional buffer, then report what we actually got.
SetGraphicsDriver GLGraphicsDriver()
GLGraphics 640,480,16,60,GRAPHICS_BACKBUFFER|GRAPHICS_ALPHABUFFER|GRAPHICS_DEPTHBUFFER|GRAPHICS_STENCILBUFFER|GRAPHICS_ACCUMBUFFER

' glGet* writes through a pointer, so use a one-element Int array and reset it before each query.
Local a:Int[1]
a[0]=0
glGetIntegerv(GL_RED_BITS,a)
Print "Color red bits: "+a[0]
a[0]=0
glGetIntegerv(GL_GREEN_BITS,a)
Print "Color green bits: "+a[0]
a[0]=0
glGetIntegerv(GL_BLUE_BITS,a)
Print "Color  blue bits: "+a[0]
a[0]=0
glGetIntegerv(GL_ALPHA_BITS,a)
Print "Color alpha bits: "+a[0]
a[0]=0
glGetIntegerv(GL_DEPTH_BITS,a)
Print "Depth bits: "+a[0]
a[0]=0
glGetIntegerv(GL_STENCIL_BITS,a)
Print "Stencil bits: "+a[0]
a[0]=0
glGetIntegerv(GL_ACCUM_RED_BITS,a)
Print "Accum red bits: "+a[0]
a[0]=0
glGetIntegerv(GL_ACCUM_GREEN_BITS,a)
Print "Accum green bits: "+a[0]
a[0]=0
glGetIntegerv(GL_ACCUM_BLUE_BITS,a)
Print "Accum blue bits: "+a[0]
a[0]=0
glGetIntegerv(GL_ACCUM_ALPHA_BITS,a)
Print "Accum alpha bits: "+a[0]
a[0]=0
glGetBooleanv(GL_DOUBLEBUFFER,a)
Print "Double buffering supported?: "+a[0]
a[0]=0
glGetBooleanv(GL_STEREO,a)
Print "Stereo left/right buffers supported?: "+a[0]
a[0]=0
glGetIntegerv(GL_AUX_BUFFERS,a)
Print "Auxiliary buffers available?: "+a[0]
a[0]=0
glGetIntegerv(GL_MAX_PIXEL_MAP_TABLE,a)
Print "Maximum pixel-remap table size: "+a[0]
a[0]=0
glGetIntegerv(GL_MAX_TEXTURE_SIZE,a)
Print "Maximum texture size: "+a[0]+"x"+a[0]
Print "GraphicsWidth(): "+GraphicsWidth()
Print "GraphicsHeight(): "+GraphicsHeight()
Print "GraphicsDepth(): "+GraphicsDepth()
Print "GraphicsHertz(): "+GraphicsHertz()



ImaginaryHuman(Posted 2006) [#2]
I have found that if I open the graphics with GLGraphics 640,480,0 it correctly reports that the color depth is 5-bit red, 5-bit green, 5-bit blue, and 0-bit alpha (interesting that it is actually 15-bit color even though my display mode says 16-bit). It also correctly reports that there is no stencil buffer, no alpha buffer, and that the depth buffer is 16-bit. Also, GraphicsHertz() returns 60 even though the hertz rate for that screen mode in the list of modes says 0, which is interesting.
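
Something like this is what I mean by the list of modes - just a quick sketch using GraphicsModes(), assuming the stock TGraphicsMode fields (width, height, depth, hertz):

SetGraphicsDriver GLGraphicsDriver()

' Dump every mode the driver reports; a hertz of 0 just means no refresh rate was reported for it.
For Local m:TGraphicsMode = EachIn GraphicsModes()
	Print m.width+"x"+m.height+"x"+m.depth+" @ "+m.hertz+"Hz"
Next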


ImaginaryHuman(Posted 2006) [#3]
Oddly, GraphicsDepth() returns 0, when it should say 16. It looks as though you can rely on OpenGL to tell you the actual desktop depth, but you can't rely on GraphicsDepth()... or, more to the point, maybe 0 is reported because that's what I specified in GLGraphics()?

The depth buffer is also 16-bit on a 32-bit color desktop. I know it probably varies by manufacturer.


Chris C(Posted 2006) [#4]
I get this on a gt6600 (hope this helps you)
Color red bits: 5
Color green bits: 6
Color  blue bits: 5
Color alpha bits: 0
Depth bits: 24
Stencil bits: 0
Accum red bits: 16
Accum green bits: 16
Accum blue bits: 16
Accum alpha bits: 16
Double buffering supported?: 1
Stereo left/right buffers supported?: 0
Auxiliary buffers available?: 4
Maximum pixel-remap table size: 65536
Maximum texture size: 4096x4096
GraphicsWidth(): 640
GraphicsHeight(): 480
GraphicsDepth(): 16
GraphicsHertz(): 60



Chris C(Posted 2006) [#5]
interesingly "32-bit" mode gives
Depth bits: 24
but then including the alpha I suppose thats 32...


ImaginaryHuman(Posted 2006) [#6]
Hmm, Chris, that's pretty interesting that you get a 16-bit mode using all 16 bits (5-6-5), plus a 24-bit depth buffer and an accumulation buffer, but no stencil and no alpha. It's also interesting to see that your maximum texture size is 4096x4096; mine is 2048x2048. Also nice that you have 4 aux buffers; I have none on my better-specced machine but 4 on my low-spec machine. Seems to be quite variable.

It's kind of strange that you have a 24-bit depth buffer with 16-bit image data, because that's going to be somewhat wasteful of memory - although I guess they then use some of the remaining bytes for the accumulation buffer. Good to know that these things are all flexible based on the hardware and so on.

It's also weird that you have an alpha channel for the accumulation buffer but not for the backbuffer itself.

I find on my system that a 15-bit mode does actually open, but it's much slower and noticeably less attractive, while a 16-bit mode is much faster and so on - even though it isn't really 16-bit, it's 15-bit.

Why can't people agree on a standard? lol

I think what I'm going to have to do, instead of seeking out a mode that has all the properties I need for the game, is just give in and adapt the game to do the best it can based on what is available. That means removing some features on lower-spec displays, or finding a way to achieve the same things by other means. A nice 32-bit screen with stencil, alpha, depth and accum buffers is definitely the cream of the crop.
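
Something along these lines is what I mean - just a rough sketch that opens whatever the driver will give me and then switches features on or off based on the buffers that are actually there. The useXxx switches are made-up names purely for illustration:

SetGraphicsDriver GLGraphicsDriver()
GLGraphics 640,480,0,60,GRAPHICS_BACKBUFFER	' depth 0 = window at desktop depth, minimal flags

' See what we actually got and adapt the feature set to it.
Local bits:Int[1]
glGetIntegerv(GL_STENCIL_BITS,bits)
Local useStencilShadows:Int = (bits[0]>0)	' hypothetical game feature switch
glGetIntegerv(GL_ALPHA_BITS,bits)
Local useAlphaEffects:Int = (bits[0]>0)	' hypothetical game feature switch
glGetIntegerv(GL_ACCUM_RED_BITS,bits)
Local useMotionBlur:Int = (bits[0]>0)	' hypothetical game feature switch

Print "Stencil shadows: "+useStencilShadows
Print "Alpha effects: "+useAlphaEffects
Print "Motion blur: "+useMotionBlur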

Chris do you have any 24-bit modes on your computer? Could you test what you get in 24-bit?


Chris C(Posted 2006) [#7]
I get this in 24-bit mode (full results at the bottom of this post).
But you might want to consider that 640x480 is a rather low screen resolution by today's standards, unless you're running something stupid like Oblivion, which has such stupid requirements it's not true!

There's a problem with selecting a mode automagically: what "score" do you attribute to an accumulation buffer vs a stencil buffer?

I suppose you could do a test render of, say, 10 frames at each resolution with 10 different demo scenes, but the fastest might not be the most capable.

Better to do an Ogre-style display-settings dialog, with a simple default mode.
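
If you did want to score modes it would be something like this rough sketch - the weights are completely arbitrary, just for illustration, and you'd have to open each candidate mode and call it while that context is current:

SetGraphicsDriver GLGraphicsDriver()
GLGraphics 640,480,0
Print "Score for this mode: "+ScoreCurrentMode()

' Weighs up the buffers of whatever GL context is current; the weights are made up.
Function ScoreCurrentMode:Int()
	Local bits:Int[1],score:Int
	glGetIntegerv(GL_DEPTH_BITS,bits) ; score:+bits[0]
	glGetIntegerv(GL_STENCIL_BITS,bits) ; score:+bits[0]*2
	glGetIntegerv(GL_ALPHA_BITS,bits) ; score:+bits[0]*2
	glGetIntegerv(GL_ACCUM_RED_BITS,bits) ; score:+bits[0]
	Return score
End Function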

Oh, and btw, the gt6600 isn't exactly cheap, so don't think these caps are the norm - they probably aren't!

Color red bits: 5
Color green bits: 6
Color  blue bits: 5
Color alpha bits: 0
Depth bits: 24
Stencil bits: 0
Accum red bits: 16
Accum green bits: 16
Accum blue bits: 16
Accum alpha bits: 16
Double buffering supported?: 1
Stereo left/right buffers supported?: 0
Auxiliary buffers available?: 4
Maximum pixel-remap table size: 65536
Maximum texture size: 4096x4096
GraphicsWidth(): 640
GraphicsHeight(): 480
GraphicsDepth(): 16
GraphicsHertz(): 60



Tom(Posted 2006) [#8]
Radeon x800 pro here - both 16- and 32-bit modes give the same results, apart from GraphicsDepth() of course :)

Color red bits: 8
Color green bits: 8
Color blue bits: 8
Color alpha bits: 8
Depth bits: 24
Stencil bits: 8
Accum red bits: 16
Accum green bits: 16
Accum blue bits: 16
Accum alpha bits: 16
Double buffering supported?: 1
Stereo left/right buffers supported?: 0
Auxiliary buffers available?: 2
Maximum pixel-remap table size: 65536
Maximum texture size: 2048x2048
GraphicsWidth(): 640
GraphicsHeight(): 480
GraphicsDepth(): 16
GraphicsHertz(): 60


ImaginaryHuman(Posted 2006) [#9]
Chris, if that's a 24-bit mode, why does it say 5 6 5 for the bit depths of each component, and not 8 8 8? Does that mean your card is perhaps downgrading the mode to a 16-bit display because it doesn't have a regular 24-bit mode?

Tom, so it shows 8 bits for each component even in 16-bit mode, and also an 8-bit alpha channel, and a stencil?

Odd, most odd, and totally inconsistent.

Chris, yeah, I will have a display mode selector within the game itself, once the game has started up and actually has a screen to display it on. I guess if I had MaxGUI I could make a non-graphics requester to choose a mode before startup. But ideally, the first time the game runs I don't want the user to have to choose a mode first - just start up and be done with it. Hence trying to find a suitable mode.


ImaginaryHuman(Posted 2006) [#10]
Chris, didn't the first program, with the 16-bit display asking for all the buffers, just bomb out and not open the display? Your results show there is no stencil buffer supported. Normally on my systems, if a buffer isn't supported, it flat-out won't open the mode and the application bombs out.
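
What I'd really like is a graceful fallback rather than a bomb-out - something like this rough sketch, which assumes a failed GLGraphics throws an exception that Try/Catch can actually catch (on some setups it may just quit instead, as above), dropping buffers until something opens:

SetGraphicsDriver GLGraphicsDriver()

' Flag sets to try, from most to least demanding.
Local wanted:Int[] = [ ..
	GRAPHICS_BACKBUFFER|GRAPHICS_ALPHABUFFER|GRAPHICS_DEPTHBUFFER|GRAPHICS_STENCILBUFFER, ..
	GRAPHICS_BACKBUFFER|GRAPHICS_DEPTHBUFFER, ..
	GRAPHICS_BACKBUFFER ]

Local opened:Int = False
For Local flags:Int = EachIn wanted
	Try
		GLGraphics 640,480,0,60,flags	' depth 0 = window at desktop depth
		opened = True
		Exit
	Catch e:Object
		' Assumes the failure is catchable; on some systems it may simply abort instead.
	End Try
Next

If Not opened Then RuntimeError "No usable OpenGL mode found"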