problem with Retina Macbook dual GPUs

Archives Forums/BlitzMax Bug Reports/problem with Retina Macbook dual GPUs

Sonic(Posted 2014) [#1]
hi there,

in my large project which uses MiniB3D, there is a 100% reproducible bug.

when running the app without being in 'discrete GPU' mode (i used gfxCardStatus to determine which GPU is in use, and checked the System Information app as well, to rule out gfxCardStatus itself as the cause), the following problems occur:

- the graphics appear (judging by the massive banding) to be in 8 bit mode. there are huge jumps in colour gradients etc.
- the resolution appears completely wrong, as if (in most cases) only a portion of the normal screen is being rendered, or the world is being rendered much larger than the viewport. various resolutions are off in different ways, with either borders or fractional images.

i can get this problem to disappear without fail by simply loading Photoshop (ensuring the discrete GPU, the NVidia 650M, kicks in). then i run the same program and all the GPU problems disappear (except 1200p resolution, see below). i have tried this on two separate Retina MacBook Pros, and other games run fine when forced to use the Intel graphics, so by a process of elimination it must be something that Blitz is doing.

also, and i'm not sure if this is related, but, previous to OS 10.9:

- 1920 x 1200 mode used to work perfectly (and still does for various games i play, written in Unity, Unreal Engine etc). now, even on the discrete GPU, this resolution fails to work, with a large black border (the actual horizontal resolution it was displaying, i worked out, was closer to 20xx, a non-standard one for sure!)

without these two features (working with the integrated GPU, and a working 1920x1200 mode on my 2880x1800 screen, which, as mentioned, used to work great) i will have to start my two-year project again in a new language, as these problems are too large to ignore.

hoping for some response on this.

thanks,

jasper


Brucey(Posted 2014) [#2]
So, since your upgrade to 10.9, your project hasn't worked properly?


Brucey(Posted 2014) [#3]
You could try applying the following to your application plist file :
defaults write /APP_LOCATION/APPNAME.app/Contents/Info.plist NSSupportsAutomaticGraphicsSwitching -bool YES

from the terminal, replacing APP_LOCATION and APPNAME appropriately.

You may also need to add a line to glgraphics.macos.m (brl.glgraphics), in the function _initAttrs() :
    attrs[n++]=kCGLPFAAllowOfflineRenderers;

(if that doesn't compile, you can use the value 96 instead, apparently. Can't test at the moment as I'm not on the Mac)
Ideally, I suppose this could (should?) probably be added as a proper FLAG, for the Graphics stuff, rather than being hard-coded.



This should force the app (according to all the pages on the internet that Google supplied me with) to use the integrated card if available.


References :
http://zacwe.st/blog/rdio-discrete-card/
http://stackoverflow.com/questions/8870304/dont-automatically-switch-to-the-higher-end-discrete-gpu
https://code.google.com/p/xee/issues/detail?id=346
https://developer.apple.com/library/mac/technotes/tn2229/_index.html
https://bugzilla.libsdl.org/show_bug.cgi?id=1934
(amongst others)


Google is your friend, etc, etc.


Sonic(Posted 2014) [#4]
hi Brucey, i know google is mine and everyone's friend, and of course i've tried extensively to track down the source of this problem before posting here.

regarding 10.9, that is a separate issue related to availability of 1920x1200 resolution. the graphics switching bug has occurred in 10.7 / 10.8 also.

i will try your plist suggestion and get back to you, thanks for the research! though i'm not convinced that forcing the iGPU is what i want to achieve (rather, i want the app to display correctly when the iGPU is in use.)

edit:

tried the .plist edit, and it made no difference with either YES or NO as the boolean value.
if i can't narrow this down, i'm going to have to look into other tech... as i've also been struggling with framerate issues in MiniB3D with what should be relatively simple scenes (perhaps due to its reliance on OpenGL immediate mode, or some bug in MiniB3D?), i'm starting to feel that my beloved Blitz might be a little long in the tooth.


Sonic(Posted 2014) [#5]
still stuck with this problem, and with the workaround of loading Photoshop to force the non-integrated GPU to kick in so that the game displays correctly. could it be related to render-to-texture (using code from the archives here) and/or MiniB3D? these are the only unconventional visual things i'm doing. just wondering how one goes about debugging something like this, which lines of enquiry to take, so to speak...


Sonic(Posted 2015) [#6]
still having this problem, have spent weeks and weeks trying to track it down.

- in os 10.9, BMax tells me the following resolutions are available:


No. of resolutions supported: 13

1440 x 900 Ratio 1.60:1
720 x 450 Ratio 1.60:1
1920 x 1200 Ratio 1.60:1
1680 x 1050 Ratio 1.60:1
1280 x 800 Ratio 1.60:1
1024 x 640 Ratio 1.60:1
840 x 525 Ratio 1.60:1
2880 x 1800 Ratio 1.60:1
2560 x 1600 Ratio 1.60:1
2048 x 1280 Ratio 1.60:1
1024 x 768 Ratio 1.33:1
800 x 600 Ratio 1.33:1
640 x 480 Ratio 1.33:1

No. of bit depths supported: 2

16 bpp
32 bpp

No. of refresh rates available: 1

0 hz


but only a few of them work (1280 x 800 is ok; 1920 x 1200 and 1680 x 1050 are not, etc.) the ones that don't work have the screen shunted off to one side.

the second problem, of having the resolution completely off, as well as 16 bit colour instead of 32, occurs whenever the iGPU (intel HD4000) is in use. the only way to get round this is by running Photoshop or another program which forces use of the discrete Nvidia 650 gpu.

in other words, even though the framerate is fine on intel graphics, the graphics mode is wrong and it renders the game unplayable.

as a test, i thought i'd try 'SavePixmapPNG' in both GPU modes:

here is Intel HD4000 at 1440 x 900 (native res)



you can clearly see the 16-bit colour banding. the weird thing is that the screenshot is cropped correctly, but this is not what you see when playing: on the machine itself you can only see the bottom-left 4/5 of the image. in other words, the screenshot captures the full buffer, but the buffer is displayed enlarged, so while playing you miss the top and right of it, and the centre is offset up and to the right accordingly. i have drawn the red lines on top (by hand) to indicate roughly the area of the screen that is actually visible.

here is nvidia 650 at 1440 x 900 (native res)



no colour banding. however, i have absolutely no idea why this white ring is showing here - it doesn't show in-game? now i'm really confused!


Sonic(Posted 2015) [#7]
i decided to do the simplest test possible, at the (halved) native resolution of the screen.

Graphics 1440,900,32,60
While Not KeyDown(KEY_ESCAPE)

	Cls
	DrawLine 0,0,GraphicsWidth(),GraphicsHeight()
	Flip -1

Wend


and this breaks it! with Photoshop loaded, forcing the discrete GPU, the line displays correctly. without it, the line is clearly cropped, seemingly letterboxed and enlarged. this is strange behaviour indeed, and leads me to suspect the problem really is in Blitz's code.

i also tried 2880x1800, and the Intel HD4000 simply failed to display the line (whereas the NVidia 650m displayed it correctly.)

as almost all Macs are currently using retina displays, this really needs to be fixed otherwise it makes the language truly obsolete.


skidracer(Posted 2015) [#8]
Are you sure your mac isn't 2560x1600? The following being half of that:

GLShareContexts
Graphics 1280,800,32,0
While Not KeyDown(KEY_ESCAPE)
Cls
DrawLine 0,0,GraphicsWidth(),GraphicsHeight()
Flip -1
Wend


Sonic(Posted 2015) [#9]
absolutely sure, skid! it's a 15" macbook pro, with 'best for retina' scaling being 1440x900. i believe the 13" has the resolution you mention.


Sonic(Posted 2015) [#10]
despite having been working on the codebase for two and a half years or so, i'm now pretty seriously entertaining moving the whole thing over to unity / C++ with SDL or something / monogame etc ... i'm just not sure what else to do.

most of the issues are probably specific to mac, as far as i can tell (i'm not focussed on Linux as a platform at this time.) but i develop on a mac and the platform is therefore important to me and a big reason for choosing BMax.

the first big issue is the lack of correct display settings (banding / incorrect resolutions etc) on the Intel HD4000 (and possibly the HD5000/6000 series etc). i haven't been able to test on any newer retina macbook pros, 13" models from 2012 or later, the new 5k imac, or the new retina macbook that just released. most of these don't have a discrete GPU, but they might still suffer from these issues for all i know. will these things be maintained any more? if so, by Mark, or by someone else? and thinking further ahead, will there be another CPU switch by Apple in the long run? a better-maintained language would be a much better guard against that (as would going hardcore, c++ / GL etc.)

secondly, the lack of letterboxed 1080p / 1200p (1920-wide) resolutions, even though they are supported in other games, is also a serious issue. i'm 99.9% sure they used to work in OS 10.8 / 10.7, but for some reason they won't work for me in 10.9. 1920x1200 is a good standard res and probably the ideal one to play at on these laptops, which can't fully stretch to their 2880 native res.

finally, in minib3d, there are some debug-less crashes, which could admittedly be something i need to track down in my own code, but there are also some definite performance problems in the codebase that i can't seem to get around, despite trying all the usual suspects in terms of optimisation (the game is complex, but it is graphically less demanding than, say, minecraft.)

i really wanted to believe in this language for this large project but i worry that i might be trying to make it do more than it was intended to.

- jasper


skidracer(Posted 2015) [#11]
OK, I have had a quick squiz. Sonic, sorry I did not see this thread earlier.

From reading through glgraphics.macos.m, the bit that sticks out is the use of CGDisplayBestModeForParameters, which is deprecated.

I will bring this up with Mark as this needs to be examined further.

In the meantime you could try replacing the last parameter of both calls with ,true to force an exact match:


			displayMode=CGDisplayBestModeForParameters( kCGDirectMainDisplay,depth,width,height,0 );
		}else{
			displayMode=CGDisplayBestModeForParametersAndRefreshRate( kCGDirectMainDisplay,depth,width,height,hertz,0 );



Also, you could try calling GetSettings on the graphics object to see what BlitzMax is actually creating:

Local gfx:TGraphics=GLGraphics(1920,1080,0)
Local w,h,d,hz,f
gfx.GetSettings(w,h,d,hz,f)
Print "settings="+w+","+h+","+d+","+hz+","+f



Sonic(Posted 2015) [#12]
hi Skid, thanks for the prompt reply, and for looking into this.

i have already begun porting to unity, as an experiment at least. i'm still concerned, even if this is going to get fixed. as you say, i've been looking into this for a long time.

i tried the macos C code edit, and it produced an unhandled exception error with both 'true' and '1' as values.

the GetSettings() function returned the requested resolution and bit depth (i think the right bit depth; i need to check that against my game) ... but as before, it didn't display correctly (in my simple line-test program, where i also tested the function, in iGPU mode as before)


skidracer(Posted 2015) [#13]
Just to confirm: this is with no external monitor plugged in, and without forcing the frequency (freq=0)? (your previous post suggests 0 is the only frequency supported)

Link for Mark:

https://developer.apple.com/library/mac/documentation/GraphicsImaging/Conceptual/OpenGL-MacProgGuide/EnablingOpenGLforHighResolution/EnablingOpenGLforHighResolution.html#//apple_ref/doc/uid/TP40001987-CH1001-SW4


Sonic(Posted 2015) [#14]
no external monitor plugged in. the 0hz threw me as well; i'm not sure what the cause could be. regardless, i run my games locked to 60hz, which is the default refresh rate of all Macs, i believe.


ShadowTurtle(Posted 2015) [#15]
1.
Use a "viewport coords <-> screen resolution <-> Mac model/OS" list.

2.
Using dithered graphics is better (retro look). Alternatively you can use a better blending mode.

3.
Shrinking ideas means expanding work. Port only if really necessary. You're the designer of your own game. Don't forget it.


Sonic(Posted 2015) [#16]
...4. Profit!

* * *

in all seriousness, i had some more thoughts weighing up my decision. (please forgive the rambling!)

so, testing a simple scene in Unity at the moment, one reason i'm still very fond of the BMax build is that there is no question as to which runs the 'smoothest'. i'm really sensitive to choppy / jerky movement, having grown up with SNES / Amigas etc, and the thing that has always been true about this language is that it gives you a wonderfully smooth 60hz when you want it. in most if not all of the unity titles i've seen running on mac, there is too much jitter and the odd annoying jerk, perhaps because too much is going on behind the scenes, who knows.

on the subject of the future of the language in terms of support... i've been able to get FBOs working with both 3D and Max2D, batched sprite meshes, GLSL shaders etc, with the help of the many useful posts here and the odd friend there (i'm not a great programmer by any stretch). i guess i feel it's a bit of a shame that Mark moved completely over to Monkey instead of adding these kinds of modern features to the language (and perhaps making something along the lines of a more stable MiniB3D, aka the legendary Max3D), and improving the IDE to add intellisense etc... it could have been a really great platform for the new wave of indies who want something with the libraries of Game Maker but without the GUI.

i understand what Mark wants to do with Monkey, and i completely respect his decision to focus on it, and i get that it would have been next to impossible to fully support new additions to two languages at the same time. it's just that Monkey wasn't the right fit for my needs, which for this game are as much performance and smoothness as i can get on the three desktop platforms only (and ideally PS4, the port of which i'm sure Mark has time to do! ;)

one thing that still concerns me, even if we can get a Max update (which would be awesome), is that my codebase is reaching the point where draw time causes spikes and therefore slowdown. what i'm actually rendering is fairly simple, i've already optimised as much as i feel i can at this stage, and the discrete GPU i'm using is pretty decent, so i feel like i should be able to keep a solid 60fps. this is where my fear of the now-unsupported MiniB3D comes in. it's weighing that, and potential future incompatibility issues in Max itself, against a switch to the less than perfectly smooth, heavy-on-the-GUI, but very powerful and likely well-supported Unity.

... ramble over...

i'd just like to say thanks again for looking into this, skidracer! and to Mark for making such a great language, i wish it was more widely known and used.


ShadowTurtle(Posted 2015) [#17]
Sometimes a developer gets curious about dependencies.

In its time, BMax was a real blockbuster among underground devs.
There were some cross-compatible compilers around back then.
In the end, BMax effectively had fewer bugs.


> i can get on the three desktop platforms only (and ideally PS4, the port of which i'm sure Mark has time to do! ;)

No. Monkey 1 is supporting WiiU too. (no joke)
See here: http://www.monkey-x.com/Community/posts.php?topic=8661&page=1

> in all seriousness, i had some more thoughts weighing up my decision.

I had created a Nintendo-alike game too.
Some stupid marketeer at Nintendo sent out a newsletter (via the ComputerBild publisher) in which MY freeware game was advertised too...
In the end my game was banned from the computerbild-spiele page :D
This is the last trace i can find at archive.org:
https://web.archive.org/web/20081024114637/http://www.computerbild.de/download/Shadowturles-Donkey-Kong-Clone_2110241.html

In the end the whole story promoted my page enormously... :-)


Sonic(Posted 2015) [#18]
ok, some interesting news: it seems that certain Unity 4 games exhibit the same issue on my computer.

i noticed it with my own game's simple scene. when the integrated GPU was engaged, the 1440 resolution was similarly enlarged, such that the centre point of the unity logo was clearly offset. i also found another game built in Unity 4 or earlier that exhibited the same issues (though i'm not entirely certain it exhibited the 16-bit colour issue i see in Max on the iGPU.)

i have a feeling this may have been solved in Unity 5, although i'm still digging to find out. i have played one game which appears to have a different unity logo, and that one works correctly. however, the difference there is that no resolutions above the monitor's own 'virtual resolution' of 1440x900 are offered (despite there actually being twice that many pixels across.)

i initially had this same set of resolutions available in my own game's test scene with Unity's default build settings, but in 'Player Settings' i checked the enigmatic 'Mac Fullscreen Mode' and found a 'Capture' option (it was set by default to 'Fullscreen Window') ... selecting that made the higher resolutions available, and the game behaved like blitzmax.

so i realised this other game i tried, whether in Unity 5 or not, was probably just using the same setting.

so i have no real evidence to say it's been solved, or that it is even Blitz's fault. it may well be at Apple's door, this one... and unfortunately Mavericks is no longer updated, so there's no chance of driver fixes, and i don't plan to exclude Mavericks users. it's a shame. in unity, i will probably have to use the fullscreen-window type arrangement.

i found this thread, which backs up a lot of what i'm surmising (eg running the unity editor, which probably kickstarts the discrete GPU, makes things work correctly etc):

http://forum.unity3d.com/threads/mac-os-fullscreen-problem-retina-display.217858/

other thread of interest, seemingly related:

http://steamcommunity.com/app/287980/discussions/0/35221584557738041/

it seems 'capture mode', which i'm assuming is what Max uses, is now deprecated in Unity. but does that mean there's actually no fix for this?

it's hard to know without trying unity 5 whether this problem persists, or if they found a fix for it. i will look into it.

edit:

ok, found this too, on http://docs.unity3d.com/Manual/HOWTO-PortToAppleMacStore.html :

Fix MacBook Pro Retina fullscreen problems (see http://forum.unity3d.com/threads/145534-Mountain-Lion-MacBook-Pro-Retina-gt-problem-for-Unity-games) by adding something like the following. It only needs to be called once! It will create a flicker, since it goes out of and back into fullscreen.
if (Screen.fullScreen)
{
    // MacBook Pro Retina 15: width = 2880, MacBook Pro Retina 13: width = 2496?
    // could check the device model name, but not sure about the retina 13 model name
    // if the last resolution is almost the retina resolution...
    var resolutions : Resolution[] = Screen.resolutions;
    if (resolutions.length && resolutions[resolutions.length - 1].width > 2048)
    {
        Screen.fullScreen = false;
        yield;
        Screen.fullScreen = true;
        yield;
    }
}




Sonic(Posted 2015) [#19]
just to confirm that capture mode is broken in unity 5 as well when using the integrated GPU in Mavericks. it seems they never solved the problem either. i wonder if it's also a problem in Yosemite (though that's mainly academic, as i couldn't release a game with this problem on an OS as recent as 10.9.)

so it looks as if, until i hear some really good news, i am going to push on with my unity port (which is coming together surprisingly fast, despite it feeling like i'm fighting the GUI sometimes.) it sure is nice to have shaders and shadows on tap! :) but i do miss Blitz's code style already. :(

this actually feels like Apple's fault, their software QA has really fallen off lately imo.