Trapping OpenGL Crash

BlitzMax Forums/BlitzMax Programming/Trapping OpenGL Crash

Grey Alien(Posted 2006) [#1]
Hi, I've got a laptop and when I try to run my game on it in OpenGL mode (set in an ini file), the game just bombs out after trying to create a full-screen OpenGL display. You actually see the screen go black for a split second, then it bombs. It works in windowed mode. Also, a simpler demo I made works in full-screen mode; it's just the more complex full game that doesn't work. It totally works in DirectX mode, by the way.

Why could this be? The laptop really isn't very powerful but it does seem to run OpenGL some of the time but not when the full-screen scene is too complex. Has anyone ever heard of this before?

Anyway, in an effort to trap an error so that I can see what is happening, I've put this code around the main code where the screen is created and drawn:

Try
'main code
Catch o:Object
	HandleGeneralError(o)
End Try

Function HandleGeneralError(o:Object)
	If TBlitzException(o) Then
		Notify TBlitzException(o).ToString()
	Else
		'Perhaps a string has been raised? If so, show it.
		Notify o.ToString()
	EndIf
End Function


This error trapping code works well for debug .exes because it displays the full error message when the exe is run outside of the IDE (which is what's happening on the laptop). However, with the OpenGL issue no error is returned, which makes me think it's never trapped and raised anywhere in Blitz for me to catch at the top level. I suppose I could put BMax on the laptop and try it in the IDE to see if anything shows up. I may also write a log file to see how far it gets.

I'm setting OpenGL mode with this: SetGraphicsDriver GLMax2DDriver() but I note that the function doesn't return a value you can check.

I've even got code protecting the Graphics() call like this:

		If FullScreen Then
			'Try 32 bit. Don't test for Hertz as some drivers return 0Hz!
			If GraphicsModeExists(ScreenWidth, ScreenHeight, 32) Then
				OK = Graphics(ScreenWidth, ScreenHeight, 32, Hertz)
			EndIf
			'Was 32 bit set OK?
			If OK = Null Then
				'some error code
			EndIf
		EndIf


The main reason I need to fix this is, if you offer the user the option to change to OpenGL and it fails, you need to put back DirectX (and in the ini file), but if it bombs immediately and is not trapped, every time they try to load it'll use OpenGL mode due to the ini file! i.e. game appears permanently broken = not good.

Any help/advice is welcome, thanks.


Dreamora(Posted 2006) [#2]
GraphicsModeExists actually doesn't give you any information about the context.

You need to create a TGraphics object with the desired settings and check if it is <> null and if so, set that graphics object.

The reason is your notebook, which most likely doesn't have enough VRAM, or the system simply doesn't have any kind of OpenGL emulation driver installed (Win98 before SE didn't have one by default, if I'm not wrong).
For OpenGL this means death.
For DX it's less of a problem, as DX works with managed surfaces and holds most of its data in system RAM, which OpenGL is not able to do.
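A minimal sketch of that idea in BlitzMax (ScreenWidth, ScreenHeight and Hertz are placeholder variables, not real globals): try the GL driver, check the TGraphics object that Graphics() returns, and fall back to the DirectX driver if it came back Null.

SetGraphicsDriver GLMax2DDriver()
Local g:TGraphics = Graphics(ScreenWidth, ScreenHeight, 32, Hertz)
If g = Null Then
	'OpenGL context could not be created - fall back to DirectX
	SetGraphicsDriver D3D7Max2DDriver()
	g = Graphics(ScreenWidth, ScreenHeight, 32, Hertz)
EndIf

Note this only catches a failure to create the context; as described later in the thread, the crash can also happen after the context is up, which this check won't see.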


Grey Alien(Posted 2006) [#3]
Thanks.

I see, so OpenGL tries to hold all the graphics in VRAM, runs out of room, and thus bombs. But it doesn't bomb in any nice way that can be trapped. Maybe I'll need to write a little file on exit called "goodexit" and then delete it when the game loads (before setting the screen up). Then if the goodexit file is missing when loading, I can revert to a standard .ini file with DirectX as the default mode. (The laptop is XP SP2 btw, I shoulda said.)

OK, so GraphicsModeExists may be redundant, but Graphics() does return a TGraphics object and I'm checking that for null and taking action if it is. Would you suggest a different method? It's probably bombing later anyway, by the time the graphics are fully loaded into VRAM and it overflows. There isn't any decent easy code to check for remaining VRAM in BMax, is there?...
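The "goodexit" scheme could look something like this in BlitzMax (the file name and the RestoreDefaultIni helper are made up for illustration):

'At startup, BEFORE opening the display:
If FileType("goodexit") = 0 Then
	'Marker missing - previous run crashed, revert ini to DirectX defaults
	RestoreDefaultIni()   'hypothetical helper that copies the default ini over
Else
	'Marker present - delete it so a crash this run leaves no marker behind
	DeleteFile("goodexit")
EndIf

'...run the game...

'On clean shutdown, recreate the marker:
Local f:TStream = WriteFile("goodexit")
If f Then CloseStream(f)

FileType() returns 0 when the file doesn't exist, so a crash anywhere between the delete and the clean-shutdown write leaves no marker and triggers the revert on the next launch.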


ImaginaryHuman(Posted 2006) [#4]
I used to work on an iBook G3 300Mhz, which had only a software OpenGL driver (no hardware acceleration). It did some of these same things.

The implementation of OpenGL may or may not support full screen displays. My ibook reported that OpenGL was supported in windowed mode but not in fullscreen. As soon as I tried to open fullscreen it would bomb out and it didn't seem like I could stop it without putting the SetGraphics() into a Try/Catch block and taking action from there.

I also heard that, at least on the Mac, you must have a certain amount of video RAM, e.g. 8MB+, in order for the OS to support and use Quartz Extreme, which supports fullscreen OpenGL. My iBook only has 2MB graphics RAM, so it barely has enough memory for one screen, let alone a whole OpenGL context with various buffers, let alone textures on top of that.

In my case, when it ran out of graphics memory in windowed mode (i.e. it was working, but there just wasn't enough VRAM for textures), it would start to spool textures to main RAM. That's a normal feature of OpenGL. The interesting thing is that, without hardware acceleration, textures in `vram` are just the same speed to draw as pixmaps, because they're all in main memory and software driven.

It may be that your particular OpenGL driver perhaps doesn't support fullscreen GL, or maybe has a limit on how much texture ram it recognizes, or maybe it doesn't spool excess graphics to main memory and then just bombs. There also might be other issues `out there` that none of us knows about yet until it happens.

Another thing you must consider: if you ask to open an OpenGL display and request certain buffers, and those buffers are not supported in that particular display mode, the context will fail. It might not necessarily bomb out or give an error message or an exception; you just won't see anything on-screen. I implemented a test to draw a pixel and read it back - if it comes back the right color value then I know the display is working. If you get nothing back, the context is broken.

You get all sorts of different buffer support based on a) the bit depth of the colors requested, b) the way the card manufacturer distributes pixels across bytes, and c) what particular buffers you ask for - sometimes asking for a given buffer automatically means you get another buffer as well; typically depth and stencil buffers come together. There is no alpha in 16-bit mode, and sometimes you'll get a depth buffer in 16-bit with no stencil, then if you ask for a stencil you might get an 8-bit stencil with a 24-bit depth buffer. It's not very predictable. In my game I'm just going with a requirement of 32-bit color with an alpha channel, and probably a must-have stencil buffer.
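A rough sketch of that draw-and-read-back test in BlitzMax (assumes Pub.OpenGL's glReadPixels binding, that the back buffer is readable before Flip, and the coordinates/threshold are illustrative only):

'After Graphics() has opened a GL context:
SetColor 255, 0, 0
Plot 10, 10
Local pixel:Byte[4]
'glReadPixels uses a bottom-left origin, so flip the Y coordinate
glReadPixels(10, GraphicsHeight() - 1 - 10, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, Varptr pixel[0])
If pixel[0] < 200 Then
	'Red didn't come back - treat the context as broken and fall back
EndIf

The exact readback behaviour can vary by driver, so treat this as a sanity check rather than a guarantee.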

And no, there still isn't decent code (or any) to check VRAM availability. Ideally you'd want to get that from OpenGL itself, but it doesn't have that feature. There should be OS calls you can tap into that try to tell you the VRAM total/usage, but who knows if that is accurate.


Gabriel(Posted 2006) [#5]
I still think the best way to do this is to write something to a file when your program closes cleanly and, if it's not there when the game begins, suggest that there may be a problem because the game did not shut down properly, and offer to run in "safe mode", which means back to DX and your default settings. It's entirely non-technical (sorry!) but it's failsafe, or as near as you're going to get.


Grey Alien(Posted 2006) [#6]
AngelDaniel: thanks for the detailed reply. Yeah, I'm not using fancy buffers, just a plain old 32-bit display.

Gabriel: Yeah that's what I'm gonna do, I suggest it higher up, but then was it you who suggested this a few weeks back?


Gabriel(Posted 2006) [#7]
Sorry, didn't spot it earlier in the thread. It might have been me who suggested it a few weeks back, yeah. I've noticed a couple of games have done this on me when they crashed, and I thought it was a nice option from a user point of view. We're used to it from Windows, and now I think Firefox does this too, so people will find it natural for the game to report like this and let them choose how to proceed, or at least be informed about what is about to happen.


Grey Alien(Posted 2006) [#8]
I was just gonna auto-revert to the default mode, but do you think it's better to actually tell them something like "Bad shutdown detected. Reverting to default settings..."?


ImaginaryHuman(Posted 2006) [#9]
Just remember the alpha buffer is also an extra buffer, so if you ask for 32-bit and their card only supports 24-bit it will fail.


Grey Alien(Posted 2006) [#10]
OK thanks.

Just to let everyone know, I now have code in place that detects if the driver was changed (ini file setting altered) and the game crashed (due to there not being a "goodexit" file in the data folder), so that it can copy in a default ini file with DirectX as the default driver. This works well on my crappy laptop. It bombs when you change to OpenGL mode on the options screen, then the next time you try to load it, it works because it has reverted to DirectX.

It seems that when I try to do OpenGL full-screen in my game, setting the graphics context succeeds and you even see the screen draw for like 1 frame, and then it bombs. So it's not instant, right after the setting is changed; it happens later, when OpenGL fills up the VRAM. Shame that OpenGL is so unstable in this respect and can't generate some nice traceable error codes.


ImaginaryHuman(Posted 2006) [#11]
OpenGL does generate error codes.

see glGetError()

If you do ANYTHING wrong, an error code is generated.

When glGetError() = GL_NO_ERROR there is no error; otherwise it's an error code that matches some other symbolic constant such as GL_INVALID_OPERATION etc. You should be checking the errors to make sure your display was set up right and nothing went wrong. If you keep tabs on the errors, it might clue you in as to which exact instruction(s) are causing problems.


Grey Alien(Posted 2006) [#12]
I'm using BMax and the display context is set up with Graphics(), which returns no errors; then eventually, when it tries to draw, the .exe just bombs. In debug mode no errors are thrown, so none can be trapped. DrawImage doesn't have any error return codes, so I don't see how I can find out what caused it. It's probably a VRAM-being-full issue.


ImaginaryHuman(Posted 2006) [#13]
Like I said you have to check with OpenGL directly as to what errors it might be coming up with.

1) Make a Max2D call to some graphics routine
2) Ask OpenGL if there were errors and then decipher it

e.g.

Graphics(640,480,0)
Local Err:Int = glGetError()   'glGetError takes no arguments

something like that

After every Max2D call, which makes internal calls to OpenGL, you can check directly with OpenGL as to what the error codes were if any.

You should also clear out any errors at the start of your app so you start with a clean slate - ie

Repeat
	Err = glGetError()
Until Err = GL_NO_ERROR

It doesn't matter that Max2D doesn't return error codes; that doesn't mean they aren't being generated or that you can't access them.


Grey Alien(Posted 2006) [#14]
aha I see, very interesting. Thanks!


ImaginaryHuman(Posted 2006) [#15]
Sure, NOW you see ;-)


Grey Alien(Posted 2006) [#16]
yep your second explanation made it clear.