GLGraphics() doesn't use the default flags

ImaginaryHuman(Posted 2006) [#1]
The following program demonstrates that the default flags set in SetGraphicsDriver(GLGraphicsDriver()) are ignored by the call to GLGraphics().

If you do GLGraphics(640,480,0,0,0), i.e. with 0 as the flags, the default flags specified in the driver are completely ignored.

You might notice that if you ask for a stencil buffer you're likely to be given a depth buffer even if you didn't ask for one; that's not a bug, it's normal GL behavior, and it might not apply to all platforms.

So the issue is with GLGraphics() ignoring the default flags, requiring you to respecify them in the GLGraphics() call.

Strict
' Driver default flags ask for every buffer type...
SetGraphicsDriver(GLGraphicsDriver(),GRAPHICS_BACKBUFFER|GRAPHICS_ALPHABUFFER|GRAPHICS_STENCILBUFFER|GRAPHICS_DEPTHBUFFER|GRAPHICS_ACCUMBUFFER)
' ...but only the flags passed to GLGraphics() itself take effect
Local g:TGraphics=GLGraphics(640,480,0,0,GRAPHICS_BACKBUFFER|GRAPHICS_STENCILBUFFER)
SetGraphics g
' Query the bit depth actually allocated for each buffer
Local a:Int
glGetIntegerv(GL_RED_BITS,Varptr(a))
Print "Red bits: "+a
glGetIntegerv(GL_GREEN_BITS,Varptr(a))
Print "Green bits: "+a
glGetIntegerv(GL_BLUE_BITS,Varptr(a))
Print "Blue bits: "+a
glGetIntegerv(GL_ALPHA_BITS,Varptr(a))
Print "Alpha bits: "+a
glGetIntegerv(GL_STENCIL_BITS,Varptr(a))
Print "Stencil Bits: "+a
glGetIntegerv(GL_DEPTH_BITS,Varptr(a))
Print "Depth bits: "+a
glGetIntegerv(GL_ACCUM_RED_BITS,Varptr(a))
Print "Accum red bits: "+a
glGetIntegerv(GL_ACCUM_GREEN_BITS,Varptr(a))
Print "Accum green bits: "+a
glGetIntegerv(GL_ACCUM_BLUE_BITS,Varptr(a))
Print "Accum blue bits: "+a
glGetIntegerv(GL_ACCUM_ALPHA_BITS,Varptr(a))
Print "Accum alpha bits: "+a

It should print that you have an 8-bit stencil buffer (or whatever your hardware provides), probably some depth bits, but no accumulation buffer.

[EDIT] GLMax2DDriver() in combination with CreateGraphics() doesn't have this problem.
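
For reference, a minimal sketch of that combination (untested here; it assumes, per the edit above, that CreateGraphics() with 0 for its flags falls back to the driver's default flags):

Strict
' Request stencil and depth buffers via the driver's default flags
SetGraphicsDriver(GLMax2DDriver(),GRAPHICS_BACKBUFFER|GRAPHICS_STENCILBUFFER|GRAPHICS_DEPTHBUFFER)
' Pass 0 for flags; with this driver the defaults above are reported to apply
Local g:TGraphics=CreateGraphics(640,480,0,60,0)
SetGraphics g
Local a:Int
glGetIntegerv(GL_STENCIL_BITS,Varptr(a))
Print "Stencil bits: "+a
glGetIntegerv(GL_DEPTH_BITS,Varptr(a))
Print "Depth bits: "+a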


skidracer(Posted 2006) [#2]
The SetGraphicsDriver command, as documented, only affects the behavior of the Graphics() function and has no effect on the behavior of the GLGraphics function, which has its own default flags behavior.


ImaginaryHuman(Posted 2006) [#3]
Umm, okay, sorry to suggest it was a bug.

GLGraphics() has its own default flags, separate from the flags that you pass to GLGraphics()?

How does that work?


skidracer(Posted 2006) [#4]
I just meant that the default value for the flags parameter of GLGraphics is GRAPHICS_BACKBUFFER, whereas for Graphics it's 0.
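
So with no flags argument at all, GLGraphics() behaves as though you had asked for a backbuffer only; a quick check along the lines of post #1:

Strict
' No driver defaults involved: GLGraphics() falls back to its own
' default flags parameter, which is GRAPHICS_BACKBUFFER
Local g:TGraphics=GLGraphics(640,480)
SetGraphics g
Local a:Int
glGetIntegerv(GL_STENCIL_BITS,Varptr(a))
Print "Stencil bits: "+a ' probably 0, since no stencil was requested
glGetIntegerv(GL_DEPTH_BITS,Varptr(a))
Print "Depth bits: "+a ' may still be non-zero; see post #1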

Mark might agree with you that this is misleading behavior; we'll see...


ImaginaryHuman(Posted 2006) [#5]
I would like GLGraphics() not to default to having a backbuffer if I don't want one. It would be nice for it to be consistent, so that you can set the driver's default buffers and not have to respecify them later.


marksibly(Posted 2006) [#6]
Hi,

Instead of GLGraphics, go...
SetGraphicsDriver GLGraphicsDriver(),my_default_flags
Graphics width,height,blah,etc...

...to get 'default flags' behaviour.
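
For completeness, a sketch of that pattern combined with the buffer query from post #1 (the particular flags are just an example):

Strict
' Set the buffers you want once, as the driver's default flags
SetGraphicsDriver GLGraphicsDriver(),GRAPHICS_BACKBUFFER|GRAPHICS_ALPHABUFFER|GRAPHICS_STENCILBUFFER|GRAPHICS_DEPTHBUFFER
' Graphics() with no flags argument picks up those defaults
Graphics 640,480
Local a:Int
glGetIntegerv(GL_STENCIL_BITS,Varptr(a))
Print "Stencil bits: "+a
glGetIntegerv(GL_DEPTH_BITS,Varptr(a))
Print "Depth bits: "+a
glGetIntegerv(GL_ALPHA_BITS,Varptr(a))
Print "Alpha bits: "+a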


ImaginaryHuman(Posted 2006) [#7]
Umm, ok. Will that set up a bunch of standard OpenGL display state or does it give me a completely uninitialized clean slate?

I suppose I could use CreateGraphics() as well, instead of Graphics, to get the default flags?

Or... I'll just use GLGraphics() and pass the flags to it. No biggie really.