Fullscreen single buffer graphics

BlitzMax Forums/BlitzMax Programming/Fullscreen single buffer graphics

Zolyx(Posted 2009) [#1]
I've searched and read a couple of topics about people who seem to have got single-buffer graphics working to their satisfaction, but for whatever reason I'm still having trouble getting it to work for me. All I'm after is a fullscreen, single-buffered graphics canvas.

Test code:
SetGraphicsDriver(D3D7Max2DDriver(), 0)

Graphics 800, 600, 32, 60, 0

Repeat
	DrawText "Hello single-buffered world!", 10, 10
Until KeyDown(KEY_ESCAPE) Or AppTerminate()

End


The lack of both Cls and Flip in the main loop is deliberate. If there truly were just a single graphics buffer, surely neither should be required? I would expect to see the text printed. But... I see nothing except a black screen :(

Adding a Flip makes it work fine.

Am I doing something horribly wrong to get a single-buffered graphics canvas set up?


Azathoth(Posted 2009) [#2]
I thought Graphics (or the driver) uses double buffering?


Bremer(Posted 2009) [#3]
There isn't any single buffer option, from what I remember.


GfK(Posted 2009) [#4]
The CreateGraphics function allows you (according to the documentation) to create a graphics object without a backbuffer. Can't say I've ever had any joy in getting any of that stuff to do what I assumed it would do. That said, I haven't spent a vast amount of time on it.


smilertoo(Posted 2009) [#5]
Try adding a Flip; I don't think you're going to see much without it... even if you don't want it.


MGE(Posted 2009) [#6]
The default is a double buffered display. Add a flip(1) and all is well.


ImaginaryHuman(Posted 2009) [#7]
I have no idea if DirectX supports it, but you CAN do single-buffered with OpenGL.

From the docs:

Function SetGraphicsDriver( driver:TGraphicsDriver,defaultFlags=GRAPHICS_BACKBUFFER )

You must set the `default flags` to the buffers that you want, when you define what driver you're using. Then when you use Graphics, or GLGraphics, or CreateGraphics, it should give you only those buffers.

Note that the SetGraphicsDriver function has a default value for the defaultFlags parameter - that means that if you do not provide a parameter it will default to GRAPHICS_BACKBUFFER. I think that possibly passing a 0 as your parameter might make it think you're not passing any flags, and thus will use a backbuffer?
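To spell out the two cases (just a sketch; whether an explicit 0 is honoured as "no buffers" may depend on the BlitzMax version):

SetGraphicsDriver D3D7Max2DDriver()		' flags omitted: defaults to GRAPHICS_BACKBUFFER
SetGraphicsDriver D3D7Max2DDriver(), 0	' explicit 0: asks for no buffers at all, which may not behave as expected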

Either way, what I do is ask for some other specific buffer but not a backbuffer - e.g.

SetGraphicsDriver GLGraphicsDriver(),GRAPHICS_ALPHABUFFER

or you could do

SetGraphicsDriver GLGraphicsDriver(),GRAPHICS_DEPTHBUFFER

since even a 24-bit (non-alpha-channel) screen can have a depth buffer, whether or not you actually use it.

Since you're specifying a requested buffer, but that doesn't include GRAPHICS_BACKBUFFER, it should NOT create a backbuffer. Then everything that you draw should be drawn immediately to the visible front buffer and you will NOT need to use Flip at all.

The trade-off: the user will be able to see every drawing operation as you issue it (or as the graphics card performs it), but in exchange you get the total removal of the overhead of performing a Flip (which typically copies the entire backbuffer to the screen).


TaskMaster(Posted 2009) [#8]
You know, if you can begin drawing at the moment the last vsync finishes, and get all of your drawing done before the next vsync, it would work fine. :)


Zolyx(Posted 2009) [#9]
Thanks for all of the replies :) After some experimentation, a friend and I managed to get it working with OpenGL-based graphics thanks to ImaginaryHuman's suggestion. For anyone else looking to do the same thing, the minimal code to get the job done is:
SetGraphicsDriver GLMax2DDriver(), GRAPHICS_DEPTHBUFFER
Graphics 800, 600, 32, 60, GRAPHICS_DEPTHBUFFER

Repeat
	DrawText "Hello single-buffered world!", 10, 10
	glFlush
Until KeyDown(KEY_ESCAPE) Or AppTerminate()

End

Cheers! :)


ImaginaryHuman(Posted 2009) [#10]
Your Graphics call does not need the GRAPHICS_DEPTHBUFFER flag; you've already made that a default buffer to be requested by putting it in the driver defaults. When you omit the `flags` from Graphics(), it should pull them from what you set in the SetGraphicsDriver line.
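In other words, the minimal version should reduce to this (a sketch, assuming the documented behaviour that an omitted flags argument falls back to whatever was passed to SetGraphicsDriver):

SetGraphicsDriver GLMax2DDriver(), GRAPHICS_DEPTHBUFFER
Graphics 800, 600, 32, 60	' flags omitted - the driver default (GRAPHICS_DEPTHBUFFER) should apply

Repeat
	DrawText "Hello single-buffered world!", 10, 10
	glFlush	' push queued GL commands to the visible front buffer
Until KeyDown(KEY_ESCAPE) Or AppTerminate()

End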

As to what TaskMaster said, it's not quite that simple. The vertical refresh is occurring kind of `all the time`, sweeping from the top of the screen down. So right after the vertical blank you could start drawing stuff and it won't be `torn` by the refresh point. But as time passes the `beam` (or whatever) will proceed down the screen and eventually it's going to run into the area that you're drawing to, creating a tear. That's the problem with single-buffered displays. If you can sort your drawing so that you start drawing stuff from the top of the screen down, you can always be ahead of the refresh and get no tearing, but that is highly impractical for most graphics.


MGE(Posted 2009) [#11]
"If you can sort your drawing so that you start drawing stuff from the top of the screen down, you can always be ahead of the refresh and get no tearing, but that is highly impractical for most graphics. "

Not to mention totally sporadic behavior depending on cpu/gpu configs. ;)

But still.... why the need for this???


ImaginaryHuman(Posted 2009) [#12]
It could be okay for some applications where you don't care about stuff being seen as it's drawn? i.e. nothing too graphically dynamic?