Confused about window mode

ImaginaryHuman(Posted 2006) [#1]
I am confused about windowed mode in OpenGL.

When I open a window on a 16-bit desktop using GLGraphicsDriver() and GLGraphics(320,240,0), I can use

Local Component:Int[1]
glGetIntegerv(GL_RED_BITS,Component)

which will correctly state the number of red bits available. The same applies to green, blue and alpha - it gives the correct number of bits for each. It also correctly reports that there is no stencil buffer and no alpha channel.
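
For reference, the fuller check I'm using across all the bit planes looks roughly like this (same Int-array style as above, and it assumes a GLGraphics context is already open):

' Query how many bits the current GL context actually provides per plane.
Local red:Int[1], green:Int[1], blue:Int[1], alpha:Int[1]
Local depth:Int[1], stencil:Int[1]

glGetIntegerv(GL_RED_BITS, red)
glGetIntegerv(GL_GREEN_BITS, green)
glGetIntegerv(GL_BLUE_BITS, blue)
glGetIntegerv(GL_ALPHA_BITS, alpha)
glGetIntegerv(GL_DEPTH_BITS, depth)
glGetIntegerv(GL_STENCIL_BITS, stencil)

Print "RGBA bits: " + red[0] + "/" + green[0] + "/" + blue[0] + "/" + alpha[0]
Print "Depth bits: " + depth[0] + "  Stencil bits: " + stencil[0]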

Now, if I open a dummy window and read GraphicsHertz() and GraphicsDepth(), which report 60Hz and 16-bit (the desktop is set to 16-bit), and then open my proper graphics window with GLGraphics(320,240,0,Hertz,GRAPHICS_BACKBUFFER), the same calls to read the bit depths return 8 bits each for red, green, blue and alpha, plus a 16-bit depth buffer and an 8-bit stencil buffer. However, the OpenGL program that then runs does not operate correctly, because there is actually no alpha channel or stencil buffer present, and it becomes painfully slow and dithers the display.
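
Roughly what I'm doing for that two-step flow is this (just a sketch of my setup, not verified beyond my own card):

' Open a throwaway window just to read the desktop's refresh rate and depth.
Local probe:TGraphics = GLGraphics(320, 240, 0)
Local Hertz:Int = GraphicsHertz()
Local Depth:Int = GraphicsDepth()
CloseGraphics(probe)

' Re-open the real window at that hertz rate, requesting a back buffer.
GLGraphics(320, 240, 0, Hertz, GRAPHICS_BACKBUFFER)
' Re-running the GL_*_BITS queries above now reports 8/8/8/8 plus depth
' and stencil bits, even though the 16-bit desktop has neither in hardware.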

Further, if I am on a 16-bit desktop and I open a window with GLGraphics(320,240,0,Hertz,GRAPHICS_BACKBUFFER|GRAPHICS_ALPHABUFFER|GRAPHICS_STENCILBUFFER), rather than failing it actually gives me a stencil buffer and an alpha buffer. It also runs faster than the previous 16-bit mode, and you can't tell whether it's dithered because it looks like the 32-bit image, yet it's running on a 16-bit desktop.
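
That open call, plus a quick check of what was actually granted, looks roughly like this (Hertz being the refresh rate read from the dummy window earlier):

' Same 16-bit desktop, but explicitly request alpha and stencil buffers.
GLGraphics(320, 240, 0, Hertz, GRAPHICS_BACKBUFFER | GRAPHICS_ALPHABUFFER | GRAPHICS_STENCILBUFFER)

' See what the context actually provides - on my GeForce4 both come back non-zero.
Local AlphaBits:Int[1], StencilBits:Int[1]
glGetIntegerv(GL_ALPHA_BITS, AlphaBits)
glGetIntegerv(GL_STENCIL_BITS, StencilBits)
Print "Alpha bits: " + AlphaBits[0] + "  Stencil bits: " + StencilBits[0]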

It seems that if you specify any flags for GLGraphics(), or a hertz rate, OpenGL reports the highest bit depths and buffer availability possible. Does this mean the display gets `promoted` internally to 32-bit data, and then dithered down to the display? If you open without any flags, it reports correctly.

It also seems that if you ask for a 16-bit window on a 16-bit desktop, it will let you ask for and get a stencil buffer and an alpha channel, yet if you do the same in full-screen it will fail. So what does this mean? Do only some cards support this pseudo-32-bit mode in 16-bit windows? Somehow it supports a 16-bit display but with stencil and alpha. Is this just my graphics card? (NVidia GeForce4)

It's just all so confusing trying to figure out what will or will not be supported and what mode to go with.

Are 16-bit windows that support alpha and stencil buffers commonplace? Is lack of support for these in fullscreen commonplace?

It seems that 32-bit is the holy grail, as you know you're getting everything you ask for, unless it's not supported. ;-)


Chris C(Posted 2006) [#2]
I think if you opened a context directly with wgl calls you may well get different results!

It's worth checking what's going on in the Max OpenGL driver code when it opens a context, and even sticking a few printf's in the C code to see what's happening.

But a word of warning: it's not commented *at all*. You've got to pity anyone who has to maintain that code...


ImaginaryHuman(Posted 2006) [#3]
Hmm, that sounds a little too low-level and complicated for me, but it's a nice idea. I might take a look at it out of curiosity.

Thanks