Error: Invalid Drawable?
BlitzMax Forums/BlitzMax Programming/Error: Invalid Drawable?
---
I am trying to open a screen in 16-bit color with dithering switched on, and am using custom OpenGL to set up the display etc. In 16-bit mode, I get this error at runtime:

[code]
2006-04-23 11:45:44.021 Game[558] invalid drawable
[/code]

As I say, it works fine in 32-bit mode. I also wrote a quick prog to print the available modes; the output is shown below. I find it odd that a) the hertz shows as 0 (because it's an LCD?) and b) there is more than one mode at a given resolution that seems to be identical. Why? (Because they are different refresh rates but it just doesn't tell you?) More importantly, why the error in 16-bit?

[edit] It works fine in window mode, just not fullscreen.

[code]
mode: 0   wid: 1440  hig: 900  depth: 16  htz: 0  avail?: yes
mode: 1   wid: 1440  hig: 900  depth: 32  htz: 0  avail?: yes
mode: 2   wid: 1152  hig: 720  depth: 16  htz: 0  avail?: yes
mode: 3   wid: 1152  hig: 720  depth: 32  htz: 0  avail?: yes
mode: 4   wid: 1024  hig: 768  depth: 16  htz: 0  avail?: yes
mode: 5   wid: 1024  hig: 768  depth: 32  htz: 0  avail?: yes
mode: 6   wid: 1024  hig: 768  depth: 16  htz: 0  avail?: yes
mode: 7   wid: 1024  hig: 768  depth: 32  htz: 0  avail?: yes
mode: 8   wid: 1024  hig: 640  depth: 16  htz: 0  avail?: yes
mode: 9   wid: 1024  hig: 640  depth: 32  htz: 0  avail?: yes
mode: 10  wid: 800   hig: 600  depth: 16  htz: 0  avail?: yes
mode: 11  wid: 800   hig: 600  depth: 32  htz: 0  avail?: yes
mode: 12  wid: 800   hig: 600  depth: 16  htz: 0  avail?: yes
mode: 13  wid: 800   hig: 600  depth: 32  htz: 0  avail?: yes
mode: 14  wid: 800   hig: 500  depth: 16  htz: 0  avail?: yes
mode: 15  wid: 800   hig: 500  depth: 32  htz: 0  avail?: yes
mode: 16  wid: 640   hig: 480  depth: 16  htz: 0  avail?: yes
mode: 17  wid: 640   hig: 480  depth: 32  htz: 0  avail?: yes
mode: 18  wid: 640   hig: 480  depth: 16  htz: 0  avail?: yes
mode: 19  wid: 640   hig: 480  depth: 32  htz: 0  avail?: yes
[/code]
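For reference, a minimal sketch of the kind of mode-listing prog used above, assuming the stock BRL modules are auto-imported and using BlitzMax's `GraphicsModes()` function (which returns an array of `TGraphicsMode` objects with `width`, `height`, `depth` and `hertz` fields):

[code]
' Sketch: enumerate the display modes the driver reports.
Local modes:TGraphicsMode[] = GraphicsModes()
For Local i:Int = 0 Until modes.Length
	Local m:TGraphicsMode = modes[i]
	Print "mode: " + i + " wid: " + m.width + " hig: " + m.height + ..
		" depth: " + m.depth + " htz: " + m.hertz
Next
[/code]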
---
When does the error occur? Is it when you are creating the drawing buffers? Creating a scene? What error is returned by the OGL API? |
---
This piece of example code (not my actual app) produces the error:

[code]
SetGraphicsDriver GLGraphicsDriver()
GLGraphics 640,480,16,60,GRAPHICS_BACKBUFFER|GRAPHICS_STENCILBUFFER|GRAPHICS_ALPHABUFFER
[/code]

Whereas this piece of code is fine:

[code]
Graphics 640,480,16,60
[/code]

However, this piece of code produces the error:

[code]
Graphics 640,480,16,60,GRAPHICS_BACKBUFFER|GRAPHICS_STENCILBUFFER|GRAPHICS_ALPHABUFFER
[/code]

So perhaps the flags are causing the error? This piece of code is also fine:

[code]
Graphics 640,480,16,60,GRAPHICS_BACKBUFFER
[/code]

but this:

[code]
Graphics 640,480,16,60,GRAPHICS_BACKBUFFER|GRAPHICS_ALPHABUFFER
[/code]

and this:

[code]
Graphics 640,480,16,60,GRAPHICS_BACKBUFFER|GRAPHICS_STENCILBUFFER
[/code]

and this:

[code]
Graphics 640,480,16,60,GRAPHICS_BACKBUFFER|GRAPHICS_STENCILBUFFER|GRAPHICS_ALPHABUFFER
[/code]

all produce the error.
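A quick way to narrow this down is to try each flag combination in turn and report which ones fail. This is only a sketch: it assumes a failed `Graphics` call raises a catchable runtime error, which may not hold on every platform (on the Mac the "invalid drawable" message goes to the console):

[code]
Local flagSets:Int[] = [ ..
	GRAPHICS_BACKBUFFER, ..
	GRAPHICS_BACKBUFFER|GRAPHICS_ALPHABUFFER, ..
	GRAPHICS_BACKBUFFER|GRAPHICS_STENCILBUFFER, ..
	GRAPHICS_BACKBUFFER|GRAPHICS_STENCILBUFFER|GRAPHICS_ALPHABUFFER ]

For Local flags:Int = EachIn flagSets
	Try
		Graphics 640,480,16,60,flags
		Print "flags " + flags + ": OK"
		EndGraphics
	Catch ex:Object
		Print "flags " + flags + ": FAILED (" + ex.ToString() + ")"
	End Try
Next
[/code]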
---
Well, you can't get a stencil buffer in 16-bit mode on most cards (if not all), so that's your problem. The alpha buffer is (correct me if I'm wrong, going from memory here) not supported on most cards either, unless it's a backbuffer with alpha :) |
---
Really? Wow. Interesting. I would have thought that at least the alpha buffer would work. If I do:

[code]
Graphics 640,480,16,0,GRAPHICS_BACKBUFFER
[/code]

will that give me an RGBA buffer, or will it only be RGB? Doesn't OpenGL always default to RGBA? i.e. is GRAPHICS_ALPHABUFFER obsolete? |
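One way to answer this empirically is to open the display and ask OpenGL what it actually gave you. A sketch, assuming the `Pub.OpenGL` bindings and the classic `GL_ALPHA_BITS` / `GL_STENCIL_BITS` framebuffer queries (valid in OpenGL of this era):

[code]
SetGraphicsDriver GLGraphicsDriver()
Graphics 640,480,16,0,GRAPHICS_BACKBUFFER

Local alphaBits:Int, stencilBits:Int, depthBits:Int
glGetIntegerv(GL_ALPHA_BITS, Varptr alphaBits)      ' 0 here means RGB only
glGetIntegerv(GL_STENCIL_BITS, Varptr stencilBits)
glGetIntegerv(GL_DEPTH_BITS, Varptr depthBits)

Print "alpha bits: " + alphaBits
Print "stencil bits: " + stencilBits
Print "depth bits: " + depthBits
[/code]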
---
It seems that working in windowed mode is fine: even if your desktop is 16-bit and you open a window on it, you get a stencil buffer and an alpha buffer. But it doesn't work in fullscreen. So if the user wants fullscreen, they basically can't use 16-bit, because I need the stencil buffer and alpha channel. Do you know if this is consistent, i.e. that in windowed mode most cards will allow a stencil buffer and an alpha buffer? |
---
I guess I'll have to settle for 32-bit in either fullscreen or windowed mode, and then if users want to fall back to 16-bit they will have to use windowed mode only and let the desktop resolution decide things. |
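That fallback policy can be sketched as code. This is only a sketch: it assumes a failed `Graphics` call raises a catchable runtime error, which should be checked on each platform, and uses depth 0 for "windowed mode at the desktop's depth" as BlitzMax's `Graphics` documents:

[code]
Function OpenDisplay:TGraphics()
	Const FLAGS:Int = GRAPHICS_BACKBUFFER|GRAPHICS_STENCILBUFFER|GRAPHICS_ALPHABUFFER
	' Preferred: 32-bit fullscreen, which supports stencil and alpha.
	Try
		Return Graphics(640,480,32,60,FLAGS)
	Catch ex:Object
		' Fullscreen failed; fall through to windowed mode below.
	End Try
	' Fallback: windowed mode (depth 0 = use the desktop's depth, which
	' may itself be 16-bit but still provides stencil/alpha in a window).
	Return Graphics(640,480,0,60,FLAGS)
End Function

Local g:TGraphics = OpenDisplay()
If Not g RuntimeError "Could not open a display"
[/code]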