DX vs. OpenGL Render Performance


Richard Betson(Posted 2012) [#1]
Hi,

I am getting wild differences in performance between DirectX and OpenGL in BlitzMax. I am working on a project, and users are reporting that the DX drivers are slower by large margins in FPS; for example, in my application a user might hit 20 FPS in DX and 250 FPS in OpenGL. Even on my low-end system I get similar results.

I was wondering why this might be. Even col's new DX11 driver seems to suffer from the same DX performance lag. I use odd.mod, but I have tried just the native DX drivers in BlitzMax with the same results. Is this just a poor DX implementation on the video card end of things, or possibly something in BlitzMax's DX drivers?
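
For anyone who wants to reproduce the comparison outside my client, here is a minimal sketch of the kind of test I am running (the driver names are the stock Max2D ones; D3D9Max2DDriver() only exists in newer BlitzMax releases, and the scene drawing is just filler):

SuperStrict

' Pick the render path before Graphics() is called:
SetGraphicsDriver GLMax2DDriver()      ' OpenGL
'SetGraphicsDriver D3D7Max2DDriver()   ' DirectX 7 (the Windows default)
'SetGraphicsDriver D3D9Max2DDriver()   ' DirectX 9 (newer releases only)

Graphics 1366, 768, 0

Local frames:Int, fps:Int, t:Int = MilliSecs()

While Not KeyHit(KEY_ESCAPE)
	Cls
	' ... draw the scene here ...
	frames :+ 1
	If MilliSecs() - t >= 1000
		fps = frames
		frames = 0
		t = MilliSecs()
	EndIf
	DrawText "FPS: " + fps, 10, 10
	Flip 0 ' no vsync wait, so the raw throughput difference shows
Wend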

You can test my client and compare the performance between DX and OpenGL. Available here:
http://redeyeware.uphero.com/phoenix_ss/phoenix_ss_client_a82b.zip

Project Pages:
http://redeyeware.uphero.com/phoenix_ss/main.html
http://www.subspace.co/forum/510-phoenix/


Richard Betson(Posted 2012) [#2]
I have done a clean install and rebuild of BlitzMax, and it did not help.

Are there certain limitations in DX that OpenGL may not have, such as image size (I'm using 128 x 128 tile images)? The difference in performance between the DX and OpenGL drivers on so many different video cards is just blowing my mind. :/


Richard Betson(Posted 2012) [#3]
OK :)

I found the problem. I had SetViewport() off by one pixel, and that was killing the performance. So, given a display resolution of 1366 x 768:

SetViewport(0,0,1366,768) = So slow it hurts

But,

SetViewport(1,1,1366,768) = Now as fast as OpenGL in DX9

DX7 is still way slow, but I expect DX11 users will see the same speed increase I got with DX9. I find it a little weird that bleeding the viewport off the display by one pixel row and column would change DX's performance so dramatically.
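
If anyone wants to verify this on their own card, here is a quick toggle sketch (assuming a 1366 x 768 mode and the D3D9 driver from newer BlitzMax releases; the DrawRect calls are just filler standing in for my 128 x 128 tiles). Press SPACE to flip between the two viewport calls and watch the FPS counter:

SuperStrict

SetGraphicsDriver D3D9Max2DDriver() ' the driver where I measured the difference
Graphics 1366, 768, 0

Local off:Int = 0 ' 0 = exact-fit viewport (slow for me), 1 = one pixel off (fast)
Local frames:Int, fps:Int, t:Int = MilliSecs()

While Not KeyHit(KEY_ESCAPE)
	If KeyHit(KEY_SPACE) Then off = 1 - off
	SetViewport off, off, 1366, 768
	Cls
	For Local i:Int = 0 Until 200
		DrawRect Rnd(1366), Rnd(768), 128, 128
	Next
	frames :+ 1
	If MilliSecs() - t >= 1000
		fps = frames
		frames = 0
		t = MilliSecs()
	EndIf
	DrawText "offset=" + off + "  FPS: " + fps, 10, 10
	Flip 0
Wend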


*(Posted 2012) [#4]
Maybe it's storing that extra pixel each way as an extra screen :)


Richard Betson(Posted 2012) [#5]
Hi,

From what I have read, setting the viewport larger than the backbuffer will cause the viewport to fail in DirectX, so no clipping occurs. That would explain the speed difference: with the one-pixel-off viewport the clip is never applied.

Maybe SetViewport() should automatically clamp the viewport to the dimensions of the backbuffer. Just a thought.
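
Something along these lines as a wrapper would do it. This is just a hypothetical helper (SetViewportClamped() is my name for it, not part of Max2D), clamping the requested rectangle against GraphicsWidth()/GraphicsHeight() before passing it on:

SuperStrict

Graphics 1366, 768, 0

' Hypothetical helper: clamp the requested viewport to the backbuffer
' so DirectX never sees an oversized rectangle and clipping stays valid.
Function SetViewportClamped(x:Int, y:Int, w:Int, h:Int)
	Local x2:Int = Min(x + w, GraphicsWidth())
	Local y2:Int = Min(y + h, GraphicsHeight())
	x = Max(x, 0)
	y = Max(y, 0)
	SetViewport x, y, Max(x2 - x, 0), Max(y2 - y, 0)
End Function

' On a 1366 x 768 backbuffer this quietly becomes SetViewport 0,0,1366,768:
SetViewportClamped(-5, -5, 1400, 800)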