How do I detect if desktop is 256 color?

BlitzMax Forums/BlitzMax Programming/How do I detect if desktop is 256 color?

ImaginaryHuman(Posted 2006) [#1]
Using OpenGL methods only (it basically has to be cross-platform), how do I detect if the desktop is in a 256-color mode?

Would trying glReadPixels() with GL_COLOR_INDEX and checking whether glGetError() = GL_INVALID_OPERATION indicate that the desktop is in 256-color mode?

I have tried opening a 256-color desktop and then using glGetIntegerv() to get the red/green/blue/alpha bits, but it returns 0:5:5:5 even in 256-color mode!

How can I know for sure that the desktop is 256-color, so I can request that the desktop mode be switched?

(Is there some way to actually change the desktop mode from BlitzMax in a cross-platform way?)
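For reference, the glGetIntegerv() query described above looks roughly like this in BlitzMax (a sketch only, assuming BRL.GLGraphics; the bit depths reported are whatever the driver chose for the context, which is exactly the problem being discussed):

```
SuperStrict
Framework BRL.GLGraphics
Import BRL.StandardIO

GLGraphics 320, 240                      ' windowed context at the desktop depth

' Ask the context how many bits each colour component really has
Local r:Int, g:Int, b:Int, a:Int
glGetIntegerv GL_RED_BITS,   Varptr r
glGetIntegerv GL_GREEN_BITS, Varptr g
glGetIntegerv GL_BLUE_BITS,  Varptr b
glGetIntegerv GL_ALPHA_BITS, Varptr a

Print "Context bits A:R:G:B = " + a + ":" + r + ":" + g + ":" + b
```

On a 256-color desktop this can still report something like 0:5:5:5, as noted above.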


H&K(Posted 2006) [#2]
Don't. Just put minimum requirements on your product saying: MIN requirements: 16-bit colour graphics. Then when it crashes, the user knows it's because they are in 256-colour mode.


ImaginaryHuman(Posted 2006) [#3]
I'd rather `letting it crash` not be part of their experience. If I can at least exit gracefully that would be better, or give the option to wait until the mode is changed.


H&K(Posted 2006) [#4]
I have a Hercules 2-bit graphics card, and I don't expect things to work when I use it.


Warren(Posted 2006) [#5]
"How do I detect if desktop is 256 color?"

The easiest way is to check the system clock and see if it's 1997 or earlier.


ImaginaryHuman(Posted 2006) [#6]
That is surely a Windows-only solution and not cross-platform.

It MUST be a cross-platform technique.


Warren(Posted 2006) [#7]
All systems have clocks. It's how they tell time.


JoeRetro(Posted 2006) [#8]
I didn't think OpenGL worked in 256 color mode - I thought you had to be in 16/32-bit mode? If that's the case, why not test the graphics modes and, if they fail (which I would expect when initializing OpenGL), inform the user, including your little tid-bit about changing the color mode?


Yan(Posted 2006) [#9]
@AD - Didn't someone (d:bug?) give you a link to a module that'll give you the desktop colour depth?


Smurftra(Posted 2006) [#10]
FYI

I know at least 10 people who buy/play shareware-type games and have Win98 with a 256 color desktop.

Now, if I know 10 people like that, imagine how many there are.

Not supporting it can cost you customers. It's a choice you can make. (I choose to lose them, personally, but I totally understand if someone wants to support them.)

anyhow, AngelDaniel is dead on when he says

"I'd rather `letting it crash` not be part of their experience. If I can at least exit gracefully that would be better, or give the option to wait until the mode is changed."

Smurftra


H&K(Posted 2006) [#11]
You're a really bad friend for not telling them to buy a better graphics card.


d-bug(Posted 2006) [#12]
@MorrisTheMoroseMoose
Yes I did, but it's still not cross-platform... Haven't found a solution for Linux so far...

cheers


Yan(Posted 2006) [#13]
Didn't Skidracer post a possible solution for Linux in the same thread?

AFAIR, it was GetGraphicsMode(0, w, h, d, r) or did that not work (I don't run Linux and so can't test)?
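If Skidracer's suggestion works, it would look something like this (a sketch, untested here; whether index 0 really is the current desktop mode is, as noted, reportedly Linux-specific behaviour):

```
SuperStrict
Framework BRL.GLMax2D
Import BRL.StandardIO

' Mode 0 is reportedly the current desktop mode on Linux
Local w:Int, h:Int, d:Int, r:Int
GetGraphicsMode 0, w, h, d, r
Print "Mode 0: " + w + "x" + h + ", " + d + "-bit @ " + r + "Hz"
```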


Smurftra(Posted 2006) [#14]
Maybe they don't need to invest in a computer? Simple games like most shareware games don't need high-end computers/graphics cards. And as long as their word processor works, why pay more? Would you buy a new hammer if yours still worked for what you did? I mean, it's OK, don't support them, it's your choice. You can still miss a lot of sales just because of that.


ImaginaryHuman(Posted 2006) [#15]
Well, you all have some good points.

WarrenM, if I'm to trust that you're actually being serious that the system clock can tell you whether it's a 256-color mode, can you please explain exactly how that works? What does the year have to do with the bit depth of the desktop?

As to the other ideas, I don't know ... I don't like the idea of using separate bits of different code for each platform. To me, that is not truly cross-platform. I am using BlitzMax because it provides cross-platform capabilities and I don't in any way shape or form want to have to maintain *any* amount of windows or linux or mac specific code that only applies to that platform. To me that is undermining the whole cross-platform philosophy and I would like to keep my software as cross-platform `pure` as possible - that is, I should not have to change it or do anything to the code or make certain bits of code activate or deactivate to make it work on the different platforms.

That said, there are of course some features of the different platforms that I am okay with `taking into account`. For example, on Windows when you minimize a program you see part of the title text of the window in the task bar, whereas on the Mac you just get an icon, so it would be a good idea to adjust the title text when the window is minimized. But that doesn't require any `if platform=this then do that` stuff; it's one piece of code that applies to all. It's okay that BRL has provided the `?` compiler directive and all that, but I really hope I never have to use it. I'm sure many people would not think twice about using it, but that's not my choice.

So I also don't want to pursue an idea of testing the display depth that only works on one platform. Not only does that encourage BRL to make the cross-platform nature of BlitzMax `impure` but it also sets things up for being inconsistent which defeats the point of it. Okay so Linux gives you the desktop mode as mode 0 in the list of graphics modes, but the other two platforms don't so I'm not going to touch that.

As a cross-platform developer I should not have to make considerations for having any knowledge of what a given target platform is capable of or how certain things have to be achieved. I should not have to know nor keep track of pieces of information such as that Linux's mode 0 is the desktop mode. As soon as you start down the road of needing to know all these little platform-specific quirks, it's a bottomless pit of incompatibility and your whole cross-platform integrity goes down the poop chute.

So if it's going to be a solution, it has to be the same solution that works on all three platforms in the same way producing the same results with no need to know what the platform is or adapt to its individualistic behavior. Stepping up from single-platform development to cross-platform is a matter of moving from living in a box of individuality and separation, to removing all the barriers and approaching unity.

With regard to detecting the desktop, what I currently do is open a small window (may be hidden) on the desktop using MaxGUI or whatever, then use OpenGL to tell me the bit depths of each color component. I'm just going to assume that if it tells me there is a 15/16/24 or 32-bit display, that IS what the display is, and that IS what the context supports, and that IS what OpenGL is going to render in, regardless of whether there is some other uncontrollable unspecified translation from that down to the final 256-color desktop display. Since we can't predictably and accurately sense what the display *actually* is, and only what we internally know it to be, I'm just going to have to trust the information and go with it.

Yes, you can run an OpenGL app with a 256-color desktop. In this mode, on my system at least, it reports that you are actually working with a 15-bit doublebuffered display. You can open a window and draw graphics and it will appear to be drawing everything at a 16-bit color resolution, but it will then dither down the output to 256-color paletted using the desktop's palette. Also if the desktop is 16-bit, I can still use 32-bit textures and they will be rendered appropriately and be converted down to 16-bit on the fly. So it apparently doesn't matter what the desktop depth is, you have to go with what OpenGL gives you, which is probably totally implementation specific. If the display doesn't open and the desktop is 256-color, I will probably just report that the mode didn't work, not that it was because it was 256-color, but just because it generally didn't work.

I think if a graphics card supports actual 256-color indexed mode in the OpenGL driver, then you actually would get a 256-color indexed OpenGL context, and maybe it would return -1 or something for color depth (I noticed the stencil buffer returns -1 in a 256-color desktop, so maybe that is an indicator - or maybe that just means stencils aren't allowed in that mode?).

Anyway, bla bla bla.


Warren(Posted 2006) [#16]
"I know at least 10 ppl that do buy/play shareware type games and have win98 with 256 color desktop."

They have an 8-year-old operating system and a video card that was produced during the Reagan administration - but they buy shareware games. Sure they do.


H&K(Posted 2006) [#17]
WarrenM, we are not talking about writing games that work in 256 colours; we are trying to figure out how to tell the user that the software doesn't work in 256 colours. So your `you should support them` is pointless.

As a sensible suggestion: could you plot to one of the buffers, then read the pixel back and see if it's changed?

PS. I don't even have the option to drop to 256 colours any more.


Dreamora(Posted 2006) [#18]
The simplest way is to try to create the graphics device within a Try-Catch block. If it fails to create, it will throw an error you can then report to the user (telling them that their desktop depth is not supported and that they must start the game in fullscreen because of that).
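A sketch of that approach (assumption: whether Graphics() throws a catchable exception on failure, rather than a hard runtime error, may vary by platform and driver):

```
SuperStrict
Framework BRL.GLMax2D
Import BRL.StandardIO

Try
	Graphics 800, 600, 0   ' windowed, at the current desktop depth
Catch ex:Object
	Print "Couldn't create a graphics context - your desktop colour"
	Print "depth may be unsupported. Try 16/32-bit colour or fullscreen."
	End
End Try
```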


ImaginaryHuman(Posted 2006) [#19]
I am leaning toward your suggestion, Dreamora.

Are you saying that SetGraphicsDriver GLGraphicsDriver() is the thing I need to check for a failure from, or GLGraphics(), or CanvasGraphics()? I think there is going to be more than one reason why these might fail other than the desktop being 256-color - for example, there being no OpenGL driver installed.

I guess you have the option of either reporting the actual error message, which I think is not a great idea because they are not very explanatory, or just treating it as a general failure and falling back to trying something else.

H&K: I came to the same conclusion last night too, to plot some known color pixel to the backbuffer and read it back. If the value comes back different or with less resolution etc, maybe that can be used to see what the buffer's depth is. But I've also found that you cannot be sure of what buffer you are reading, whether it is back or front, nor can you reliably select the buffer with glReadBuffer(). On some platforms there are issues with the way the OpenGL driver handles the buffers, some platforms don't even allow you to access the front buffer, and you cannot know that the backbuffer really is the backbuffer after doing a flip.
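A rough sketch of that plot-and-read-back test (hedged, per the caveats above: which buffer glReadPixels() really reads is driver-dependent, so a mismatch is a hint, not proof):

```
SuperStrict
Framework BRL.GLMax2D
Import BRL.StandardIO

Graphics 320, 240, 0

' Plot a colour that an 8- or 16-bit buffer can't hold exactly
Cls
SetColor 253, 127, 1
Plot 10, 10

' Read it back from the buffer we just drew to.
' Note the y flip: GL's origin is bottom-left, Max2D's is top-left.
Local pix:Byte[4]
glReadPixels 10, GraphicsHeight() - 1 - 10, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, Varptr pix[0]

If pix[0] <> 253 Or pix[1] <> 127 Or pix[2] <> 1
	Print "Colour came back altered - buffer is probably less than 24-bit"
EndIf
```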


H&K(Posted 2006) [#20]
If you could get a pointer to the memory set aside for the buffer, could you see how big it is?


ImaginaryHuman(Posted 2006) [#21]
Yes I thought about that too. If you could find out how much memory the desktop consumes, you could divide it by (Width*Height) to find out how many bytes per pixel. That would be great. But I don't know of any cross-platform single reliable way (or any way!) to get that information. Even if I were to open my own window with, say, a canvas on it, I don't know how to find out how much memory that graphics space takes up. If there were any way to know how much memory a given area of the screen consumes, we could know with quite good certainty what the actual desktop color depth is.
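The arithmetic itself is trivial - the hard part is getting the framebuffer size, for which I know of no cross-platform call (vidMemBytes below is purely hypothetical):

```
SuperStrict
Framework BRL.StandardIO

' Hypothetical: suppose we could learn the desktop framebuffer's size
Local vidMemBytes:Long = 480000          ' e.g. what an 800x600 8-bit desktop uses
Local bytesPerPixel:Long = vidMemBytes / (800 * 600)
Print bytesPerPixel                      ' 1 = 8-bit, 2 = 15/16-bit, 4 = 32-bit
```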


ImaginaryHuman(Posted 2006) [#22]
I might try the `draw some pixels of a given color next to each other and read them back to see if they're a given bit depth` solution.


sswift(Posted 2006) [#23]
I'd go with checking that system clock. :-)


ozak(Posted 2006) [#24]
Why not implement an SDL graphics device? I mean SDL is already ported to BlitzMax (in some form), so it should be doable :)


ImaginaryHuman(Posted 2006) [#25]
Check out the desktopext module from d:bug. It reports the desktop width, height, depth and hertz. It is plenty sufficient and I would by no means want to use a large library like SDL to just find out such things.


d-bug(Posted 2006) [#26]
It's requested quite often, isn't it?

In the end, I decided to make a separate thread for it... *click*

cheers


Warren(Posted 2006) [#27]
Swift seems to understand what I'm saying. :P


ImaginaryHuman(Posted 2006) [#28]
I get you, Warren: 256 colors haven't been used since '97, so there's no point checking for them. That's your preference.


Warren(Posted 2006) [#29]
That's the reality.


ImaginaryHuman(Posted 2006) [#30]
That's part of the reality. There are people with 256 color desktops. Me for example. It's my own preference and decision whether I want to support such things. *digs heels in*


Warren(Posted 2006) [#31]
You have a 256 color desktop? I'm speechless.


H&K(Posted 2006) [#32]
Angel, I really think you should just give it up.

But, new suggestion: I assume there is a palette somewhere in the system directory that corresponds to the screen. (Yes, I know it will probably be different for each platform, but just do that ?Win, ?Mac thing they do in the mods.) Find out from people who know where this is stored, and look to see if it's there.

On a separate note, I agree with your comments about cross-platform to a degree, but I do think there are some things that we/you as developers should keep in mind.
In the past, this would have been that clock-difference thing. Now, if BMax were perfect, it would have a command that told you whether the screen was in 256 colours - but that command would probably be internally different for each platform.
So if you find the need to implement different methods for each platform, just do it; just make sure the actual command you write has the same footprint on each platform.
BUT send the code to BRL and maybe it will be included in the next build.


Warren(Posted 2006) [#33]
This is the last thing I'll say: All the time you've spent trying to figure this out and the time spent discussing it here - couldn't that time have been better spent adding something useful and fun to your game?

Think it over.