Dx9 Max2d Updated
BlitzMax Forums/BlitzMax Programming/Dx9 Max2d Updated
| ||
I have fixed a couple of minor issues with repeated calling of Graphics/EndGraphics reported by James. I appreciate the feedback I have received so far. Also optimized setup of textures from images.

Link to updated driver: http://smokenmirrors.com/Downloads/Dx9Max2d(0.2).zip

I have seen some interesting numbers on the Clock test; the only one I can't understand is Grisu's. Although Clock is a lousy benchmark, DXTest is a little better as it is a little more aggressive in what's rendered. It seems to be working OK on ATI and Nvidia cards; I would really like to see what an Intel motherboard-based chipset can do. Anybody got one?

If Skid reads this: in looking at the Dx7 implementation of texture creation, what is the purpose of smear edges, and what would I use for a test case to see the benefit?

Thanks for feedback and testing in advance. Doug Stastny |
| ||
dude, you rock, thanks so much for this. |
| ||
I get about double the performance of DX9 in OpenGL with the DXTest application. ( I only tried clock in the previous version. ) If I keep clicking as fast as I can ( unscientific, I know, but how else can I measure? ) it won't drop below 150FPS in DX9. In OGL it never drops below 320FPS. Clock is still faster, but only just barely. Not as much as it was in the last version. Definitely seems slower. 6600GT, as before, and no changes to drivers or anything else. |
| ||
Thanks Gabriel. It's not easy to test one vs. the other by clicking; I myself can't click fast enough. If you exceed the particle cache it will grow; I just defaulted to 2600 as that was the fastest I could click. By "definitely seems slower", do you mean slower than the last version, or that Dx9 seems slower? I assume Dx9 seems slower, as I didn't change anything that would affect rendering speed.

As for DX vs. OpenGL, the big problem is unfortunately architectural: limitations of the design of Max2d. It's not so much that OpenGL is a better API or DirectX is; it's more a fundamental difference in what the two APIs are designed to do. The DirectX API wants a more batch-oriented scene graph, so it's difficult to get similar performance, especially if you create a lot of render state changes, which is what the 2d rendering in Max2d does. The other issue is that the DirectX drivers need to do more setup work in BlitzMax code, and although that's fast, it's not as fast as C code, which is why testing performance with DEBUG on makes DirectX look really bad. I do have a couple of optimizations I need to look at; reading the dx9 driver debug spew, I have many unnecessary state changes. But alas, it's mostly a balance of what is good enough for a limited 2d API.

Personally I understand what Max2d is trying to do; however, from a card/API design point of view, there would be a better way to set this stuff up for rendering and get extreme performance at the expense of an easy coding style. Just looking at how DrawText and Images or Anim Images work makes me cringe, as it thrashes the textures. There's just nothing I can do, as the driver has to implement the code the way it's structured. Thanks again for the feedback; still looking for someone with that Intel motherboard GPU. Doug Stastny |
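Doug's point about render state changes can be made concrete with a small, language-neutral sketch (Python here; the names are illustrative, not from the driver). Sorting queued draws by texture before submitting them collapses most SetTexture-style state changes, which is exactly the batching a DirectX-oriented scene graph buys you:

```python
def count_texture_switches(draws):
    """Count how often the bound texture would have to change."""
    switches, bound = 0, None
    for tex, _sprite in draws:
        if tex != bound:
            switches += 1
            bound = tex
    return switches

# Interleaved draws (texture, sprite), in the order naive immediate-mode code issues them:
draws = [("font", 1), ("ship", 1), ("font", 2), ("ship", 2), ("font", 3), ("ship", 3)]
print(count_texture_switches(draws))          # 6: one state change per draw
print(count_texture_switches(sorted(draws)))  # 2: one per texture after batching
```

The catch, as Doug notes, is that Max2d's immediate-mode API commits to draw-call order, so a driver can only merge consecutive draws that already share state; it can't reorder them behind the user's back.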
| ||
Sorry, I had debug on, and for some reason OpenGL renders at the same speed with debug on and off ( which tricked me ) and DX9 renders considerably slower with debug on. With debug off, you're right, no speed change. Sorry for the false alarm. The particle test thingy is about the same speed between OpenGL and DX9, the clock is still a good 25% faster. |
| ||
That seems to have sorted out the problem I had. Thanks Doug... great work, and much appreciated. |
| ||
Thanks a lot Budman. I have an Intel chipset. DXTest: Debug 90-100, Release 100-120. Clock: Debug 1700, Release 1700. |
| ||
Clock: DX9: 7,000-13,000; OpenGL: 15,000-17,000; DX7 buffered: 7,000-8,000. This release seems to be a bit more unstable in terms of framerate for me. X850 XT PE with the newest Cat 5.9 and an NForce 4 Ultra chipset. |
| ||
@Gabriel: on the debug on/off difference, the DirectX driver basically has to emulate the OpenGL functionality, so there is a lot more Max code to make the same effects happen. Also, "I think" the module is compiled with ?NODEBUG, so regardless of the DEBUG compile it really is always debug off. Not sure on NODEBUG though.

@Grisu: not sure why framerate would be affected, as I didn't change anything with regard to rendering, only the circular reference problem with garbage collection of textures at shutdown that James found, and some wasted clock cycles loading the textures. The fact that all of them show fluctuation of some sort with the Clock is interesting; that's about as simple a program as you can get. And the X850 is older but a pretty high-end ATI card, right? What kind of CPU do you have?

@Khomy: thanks. I am assuming it rendered the fireworks OK? I need to make a program that just tests the mipmapping functionality and prompts for the driver, as correct rendering is the highest priority. Granted it's not stellar performance, but there are a lot of XP (SP2) machines out there with that chipset, and keeping things above the CPU framerate is good, although the logic of DXTest should handle abrupt frame drops and still keep logic in sync independent of render speed.

Todo's:
- Check the CAP bits of the card and handle the case where auto-mipmapping is not available.
- Find out if SmearEdges is needed (Skid, help me understand here).
- Reduce excess render state changes as much as possible.

The goal is to provide exact replacement functionality for Dx7. I have some ideas that I have tested and can make work, but I wonder what people think:
1. Smart window-mode positioning. Right now it's pretty crappy.
2. Optional driver flags to have DirectX not save FPU state. This increases performance, but you can't use any math with doubles. Not safe unless you really know what you're doing with BMax code; the clock, for example, goes nuts.
3. Changing graphics modes and toggling window modes on the fly without calling EndGraphics/Graphics. This kinda works; I have to do 1 to make it a reality.
4. Adding extended Max2d commands to improve performance of image rendering, e.g. FastText (single-texture fonts) and DrawSubImage (single-texture animation strips).

Any comments and suggestions welcome. Thanks so much for all the feedback. Doug Stastny |
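To illustrate why a DrawSubImage-style command avoids the texture thrashing described earlier, here is a hedged sketch (Python; the function name and layout are hypothetical, not from the driver) of the only math such a command needs. Every frame of an animation strip lives on one texture, and drawing frame N is just picking a UV sub-rectangle instead of binding a different texture:

```python
def frame_uv(frame, frame_w, frame_h, tex_w, tex_h, columns):
    """UV rectangle (u0, v0, u1, v1) for one cell of an animation strip
    laid out left-to-right, top-to-bottom on a single texture."""
    col = frame % columns
    row = frame // columns
    u0 = col * frame_w / tex_w
    v0 = row * frame_h / tex_h
    return (u0, v0, u0 + frame_w / tex_w, v0 + frame_h / tex_h)

# A 256x256 texture holding 32x32 frames, 8 per row: frame 9 sits at row 1, col 1.
print(frame_uv(9, 32, 32, 256, 256, 8))  # (0.125, 0.125, 0.25, 0.25)
```

The same idea underlies FastText: one font texture, one quad per glyph, zero texture switches per string.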
| ||
Those DXTest FPS numbers are while rendering as many fireworks as I could click. With no fireworks, it renders about 200-220 FPS. |
| ||
Hi Budman, thanks again for the great driver. I'm trying to use CreateOffscreenPlainSurface in DX9Utils but can't make out what to supply for DDevice9:IDirect3DDevice9. Thanks for any help. P.S. I'm assuming I can create an offscreen surface, SetRenderTarget, draw to it and then convert to a pixmap... Am I right? |
| ||
@tonyg thanks for the kind comments. The offscreen plain surface is used to allow copying from a render target to a surface that can't be used as a texture, or to save to system memory; I use it to grab images. It can't be a render target. For that you need to use the following API:

HRESULT CreateRenderTarget(
    UINT Width,
    UINT Height,
    D3DFORMAT Format,
    D3DMULTISAMPLE_TYPE MultiSample,
    DWORD MultisampleQuality,
    BOOL Lockable,
    IDirect3DSurface9** ppSurface,
    HANDLE* pSharedHandle
);

Here is an example that hooks the DirectX driver safely and creates a render target. The current Max2d driver is not compatible with it, but you can use raw DirectX to draw to it. Max2d could do it, but you really would have to know all the little things to flush and keep maintained between switching render targets. Here is a simple sample:

Strict

Import "Dx9Max2dDriver\d3d9.bmx"
Import "Dx9Max2dDriver\Dx9Max2dGraphicsDriver.bmx"

SetGraphicsDriver D3D9Max2DDriver()

Type TDx9DeviceObject
	Field _D3DDevice9:IDirect3DDevice9

	Method SendMessage:Object( message:Object, sender:Object )
		Local Msg:TDx9GraphicsDeviceMessage = TDx9GraphicsDeviceMessage(message)
		If Msg
			Select Msg.Message
			Case TDx9GraphicsDeviceMessage.DX9GRAPHICSDEVICECREATED
				OnDeviceCreate(TDx9GraphicsDevice(Sender))
			Case TDx9GraphicsDeviceMessage.DX9GRAPHICSDEVICERESET
				OnDeviceReset(TDx9GraphicsDevice(Sender))
			Case TDx9GraphicsDeviceMessage.DX9GRAPHICSDEVICELOST
				OnDeviceLost(TDx9GraphicsDevice(Sender))
			Case TDx9GraphicsDeviceMessage.DX9GRAPHICSDEVICEDESTROYED
				OnDeviceDestroy(TDx9GraphicsDevice(Sender))
			End Select
		End If
	End Method

	Method OnDeviceLost(Dx9GraphicsDevice:TDx9GraphicsDevice)
		Print "Device OnDeviceLost"
	End Method

	Method OnDeviceReset(Dx9GraphicsDevice:TDx9GraphicsDevice)
		Print "Device OnDeviceReset"
	End Method

	Method OnDeviceDestroy(Dx9GraphicsDevice:TDx9GraphicsDevice)
		Print "Device OnDeviceDestroy"
		_D3DDevice9 = Null
	End Method

	Method OnDeviceCreate(Dx9GraphicsDevice:TDx9GraphicsDevice)
		Print "Device OnDeviceCreate"
		' now we have our own reference to the D3D9 device interface to make raw dx9 calls
		_D3DDevice9 = Dx9GraphicsDevice._D3DDev9
	End Method
End Type

Type TDx9RenderTarget Extends TDx9DeviceObject
	Field _renderTarget:IDirect3DSurface9
	Field _saveTarget:IDirect3DSurface9

	' since this is not a managed surface we must create and destroy on reset/lost
	Method OnDeviceLost(Dx9GraphicsDevice:TDx9GraphicsDevice)
		Super.OnDeviceLost(Dx9GraphicsDevice)
		If _renderTarget
			_renderTarget.Release_()
			_renderTarget = Null
		End If
	End Method

	Method OnDeviceReset(Dx9GraphicsDevice:TDx9GraphicsDevice)
		Super.OnDeviceReset(Dx9GraphicsDevice)
		If _D3DDevice9.CreateRenderTarget(128, 128, D3DFMT_X8R8G8B8, D3DMULTISAMPLE_NONE, 0, False, _renderTarget, Null) <> 0
			Print "Failed to create"
		Else
			Print "Created render target"
		End If
	End Method

	Method Use()
		If _renderTarget
			If _D3DDevice9.GetRenderTarget(0, _saveTarget) = 0 Then
				If _D3DDevice9.SetRenderTarget(0, _renderTarget) = 0 Print "Set target"
			End If
		End If
	End Method

	Method Restore()
		If _saveTarget
			If _D3DDevice9.SetRenderTarget(0, _saveTarget) = 0 Print "Restored target"
			_saveTarget.Release_()
			_saveTarget = Null
		End If
	End Method

	Method Cls()
		If _D3DDevice9.Clear(0, Null, D3DCLEAR_TARGET, $FFFF0000, 1.0, 0) <> 0 Throw "Clear failed"
	End Method
End Type

' this hooks into the DirectX device management messaging system:
' as devices are created and destroyed you can create your own DX code
Global rt:TDx9RenderTarget = New TDx9RenderTarget
D3D9Max2DDriver()._DXDriver.Dx9GraphicsDevice().AddDeviceObject(rt)

Graphics 320, 240, 0
SetClsColor 255, 244, 0

While Not KeyHit(KEY_ESCAPE)
	rt.Use()
	rt.Cls()
	rt.Restore()
	Cls
	DrawText "Regular Buffer", 0, 0
	Flip 0
Wend

The device object pattern was designed just for these types of extensions. It will safely ensure you have a render surface (or not) regardless of screen mode changes. Hope this helps. If you can give me an idea of what you're trying to do, or a sample of what you want to accomplish, I will provide you the extension. Doug Stastny |
| ||
I was hoping to:
- create a render target the size of a loaded image,
- draw the image to it,
- draw another image or primitive,
- convert to a pixmap,
- load the pixmap back to the original image,
- reset the render target to the back buffer.

Not sure how quick it would be, but it's another attempt to get 'imagebuffers'. I still think it's possible with the standard DX7 driver *except* that it uses DX system-managed textures, which prevents the DX7 driver's images being rendered to directly. |
| ||
Let me see if I understand. Do you want to do this every frame? If so, the technique you're thinking of would be very slow. The Dx9 driver uses managed textures as well, mainly because that's easy for generic rendering like BlitzMax does. Trying to write an optimized texture manager for Max would be a nightmare, since you can call anything at any time.

Basically you want to create an image, render with Max2d to it, then be able to draw that image on the back buffer? If that's it, I don't have much time right now, but I can probably complete an extension like the one above. Something like this:

rt = CreateRenderTarget()
SetRenderTarget(rt)
Cls
DrawText "Hello World", 10, 10
RestoreRenderTarget()
DrawRenderTarget rt, x, y
...

I think I can make it do that without having to convert back to a pixmap, but I can still provide a mechanism to convert to a managed image or pixmap if wanted. Might be fun to pull that off. Doug Stastny |
| ||
I've had this working to some degree with the DX7 driver, and Indiepath has the RTT module. However, in both cases they create a new surface, and I haven't cracked how to write that back to a managed texture. |
| ||
Well, you need to create a new surface for rendering to; DX7 or DX9 render targets need to be in video memory. The trick is getting it back to system memory for a pixmap. For DX9 I have disabled the ability to get a device context. That is how DX7 does it, but the lock is slow, and in DX9 there are StretchRect and UpdateSurface, which work using a blitter style of function, based upon what I see.

As you noted, you can't render to a managed texture; you can, however, create a texture that is a render target. Once you have it as a texture you can use it. The big issue is that it will need to be recreated if the device is lost, since a render target can't be managed. I would assume you want to update this render target every frame. You could also just draw on the render target and call GrabImage or GrabPixmap.

So there are really two choices here: create a render texture and be able to use it in functions similar to DrawImage, or create a render target and copy it back to the managed texture. I'll see if I can make it work over the weekend, or if I can steal some time at work tomorrow. Doug Stastny |
| ||
In DX7, when I attempt to Blt the render target back to the managed texture, the 'image' reduces in size. I suppose I could create a sprite type which has a BaseImage and space for a 'renderimage'. Thanks for anything you come up with, but I wouldn't put too much focus on it. |
| ||
I suspect your problem is not the blit but the actual drawing. When you set the render target you need to change the viewport and projection matrix to match your offscreen surface, then switch them back when done drawing to it.

Now, I did get render targets working; I have a couple of glitches, and found bugs in the Dx9 driver when converting offscreen surfaces to pixmaps. I'll get those fixes into the driver and finish an ImageBuffer style of render target. It will be either a plain render target or a full render-target texture. I now know exactly what you're trying to do; I read the ImageBuffer docs :). The only issue might be maintaining the buffer if the surface is lost. If I can capture the surface when lost and restore it upon reset, I can avoid copying the image to a pixmap every frame, but I'm not sure I can avoid that performance hit. Doug Stastny |
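The viewport/projection fix described above boils down to rebuilding the 2D orthographic projection for the offscreen surface's size. A rough sketch of the math (Python, row-vector convention; illustrative, not code from the driver, and the half-pixel offset a real D3D9 setup also wants is omitted):

```python
def ortho_2d(w, h):
    """Projection mapping pixel (x, y), y-down, onto clip space [-1, 1]
    with the y axis flipped, as a D3D-style 2D setup expects."""
    return [[2.0 / w,  0.0,      0.0, 0.0],
            [0.0,     -2.0 / h,  0.0, 0.0],
            [0.0,      0.0,      1.0, 0.0],
            [-1.0,     1.0,      0.0, 1.0]]

def project(m, x, y):
    """Transform (x, y, 0, 1) by the matrix, row-vector style."""
    return (x * m[0][0] + y * m[1][0] + m[3][0],
            x * m[0][1] + y * m[1][1] + m[3][1])

m = ortho_2d(128, 128)       # sized for a 128x128 offscreen surface
print(project(m, 0, 0))      # (-1.0, 1.0): top-left pixel -> top-left of clip space
print(project(m, 128, 128))  # (1.0, -1.0): bottom-right pixel -> bottom-right
```

If you keep the back buffer's projection while drawing to a smaller surface, everything lands scaled and offset, which matches the shrinking-image symptom.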
| ||
Another Intel (82815) based test: "Unhandled Memory Exception Error" when running clock.bmx :( Also, only OpenGL-driven apps work with BMax on this machine; D3D7Max2DDriver does not work. |
| ||
Interesting that only OpenGL works but not DX7 or DX9. I suspect a driver problem. You might want to reinstall DX9 and find the latest driver for the 82815. Can you run the DX9 version with DEBUG and give the line that causes the exception? Thanks, Doug Stastny |
| ||
"I suspect a driver problem." Maybe... the latest drivers I managed to find are installed, plus DirectX 9.0c. In debug mode the following line is highlighted: SetOrigin 160,120. If I comment out this line, then the next one is highlighted. |
| ||
Hmmm, sounds like the Graphics command is failing to initialize the driver. Try changing the Graphics command to this:

Global MyGraphics:TGraphics = Graphics(Screen_Width, Screen_Height)
If MyGraphics = Null Then Print "Failed to init Graphics" ; End

Also try commenting out all the SetGraphicsDriver commands and see if it works with the basic DirectX7 driver. Do any of the Max2d programs work with this card? If the problem is with the Dx9 driver and the buffered Dx7 driver, but it works with the basic driver, I think I might be able to track down the issue. If no DirectX works at all, it might be tougher. The card driver is here: http://www.intel.com/support/graphics/intel815/ Does anyone else have one of these Intel graphics cards that works/doesn't work? Thanks, Doug Stastny |
| ||
Hi Budman, I had driver version 5.12.01.2593 installed for the 82815 Intel. Now (thanks to your link) I have driver version 6.13.01.3196, and I can run D3D7Max2D applications without any problem so far, which is a slight improvement :) If I put in the line you posted above to verify the graphics initialisation went OK, clock.bmx crashes in debug mode with the DrawText fps,0,0 line highlighted, and with the message: "Unhandled exception: Failed to Create Texture:-2005530516". |
| ||
Well, that's a good start. Now I have to figure out why it's failing on the texture create. Thanks for the feedback; I am looking into possible causes of the failure. Doug Stastny |
| ||
Upon further reading, that chipset is pretty old; the DirectX SDK doesn't give much guidance. I am pretty sure I know the cause of the failure, in that I am using the auto-gen mipmap capability when creating textures. This is actually an area where I was unsure how it would be handled if the card doesn't support it. The SDK says it should still create the texture, but I guess the Intel drivers just say, umm, no. I'll see if I can code a fallback case for when mipmap generation is not supported. Look for an updated version to retest in the future. Glad I got you at least DX7 support :) Doug Stastny |
| ||
@BennyBoy Can you try this? Comment out these lines in Dx9Max2dGraphicsDriver.bmx, lines 67-70:

' If (_Flags & MIPMAPPEDIMAGE)
'	usage = D3DUSAGE_AUTOGENMIPMAP
'	level = 0
' End If

This will disable mipmapping, but it will help determine whether this is the cause of the problem. Thanks, Doug Stastny |
| ||
My old HP Pavilion desktop has one of those Intel 815 chipset cards (pretty sure). My BMax projects run really slow on that thing. |
| ||
Sorry, I didn't take the time to visit the forums on the weekend, so a late reply: with the lines commented out it still fails to create the texture, same error message as before. In debug, the following line is shown after the crash:

If hr <> D3D_OK Then Throw "Failed to Create Texture:" + HR

I would be glad to help you more here, but unfortunately my knowledge of DirectX sucks; the best I can do is simply try the examples. |
| ||
@Benny - Thanks for trying that. I am pretty much at a loss as to why CreateTexture is failing on that card/chipset.

@Chugs40 - Have you tried the DX9 driver on that older machine?

I have done some research on the card and its ability to support DX9. For the most part I see issues with CreateTexture calls failing, and game manufacturers saying the chipset is not supported. This shows up in Microsoft's SDK examples as well. Now, Intel's site shows compatibility with some games, but a lot have problems, so it's hard to say whether a game is really using the DX9 API or not. Many times game vendors will say a game requires DX9 but actually use an older API, for compatibility rendering with older cards.

Since the Clock example runs in a window: what display depth is your desktop set to? If it's 16-bit, can you try changing it to 32-bit? I wonder if there is an incompatibility with 32-bit textures on a 16-bit display with this chipset? Thanks for trying all this out. Doug Stastny |
| ||
I was in 16-bit. Switching to 24-bit (no 32 available) on Windows 2000 results in the graphics init failing when running clock.bmx. (This PC I am using now with the 82815 Intel will soon be replaced by a better one, so I won't be using it any more.) |
| ||
Does this chipset support 32-bit 3D contexts at all? (If so, then perhaps only in fullscreen, as 32-bit windowed mode was something cards learned later.) |
| ||
@Dreamora - I think you identified the problem; I completely forgot about older cards not supporting 32-bit. All the textures in the DX9 driver require 32-bit support, so if this card cannot do that depth, the API will fail. That raises another area I need to look at: checking the CAPS to see if the texture format is supported. I have only been testing against newer cards, but that would definitely cause the issue. I appreciate all the feedback and help testing; the more cards the better. Doug Stastny |
| ||
Yep, that's the issue. Found this on Intel's site: "This game is trying to create 32 bit textures for displaying the mouse cursor and the Intel® 815/810 chipset family does not support 32 bit textures." Let me see what I need to do... Grrrr. Thanks, Doug Stastny |
| ||
No, I have not tried the driver, but I may try it soon! |
| ||
OK, looking more at this chipset's documentation/specs, I don't think there is any way I can make this work acceptably... err, defeat! The issue surrounds the card's lack of 32-bit texture support and BlitzMax's lack of 16-bit support for pixmaps. The way Max works with images/pixmaps, all of them are internally in 32-bit format. To get them onto this card would require sampling down to 16 bits, and the problem is that the best format I can get would be A4R4G4B4. The image quality would be stinko, and runtime image conversion is not something I really want to code: 16->32 is easy, 32->16 not so easy. If, and that's a big if, Max supported 16-bit pixmaps, it might be better, since the image could be preprocessed before loading.

Very weird how DX7 seems to handle this but DX9 doesn't, as the Dx7 format is 32-bit; except it requests texture formats from DirectDraw surfaces as 24-bit RGB with 8-bit alpha, and the Intel driver likes that. Dx9 does not support alpha with a "24 bit" format.

I think the big issue is that this is a cheapo Dx7 chip that barely cut it when it was released; six years later, well, it still stinks. This chip is going to have trouble with pretty much any Max2d implementation (OpenGL/Dx7/Dx9) if you start pushing heavy graphics at it, as it barely qualifies as a 3d card. It would probably be OK for a puzzle game with low-FPS rendering but nothing more, in which case the DX7 driver is best. Sorry. Doug Stastny |
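For the record, the 32->16 downsample being discussed is mechanically simple but lossy, which is the point. A sketch in Python (illustrative only, not driver code): packing ARGB8888 into A4R4G4B4 keeps only the top 4 bits of each channel, so 16 distinct 8-bit values collapse into each 4-bit one, producing visible banding in gradients.

```python
def argb8888_to_argb4444(p):
    """Pack 32-bit ARGB into 16-bit A4R4G4B4, keeping each channel's top 4 bits."""
    a, r, g, b = (p >> 24) & 0xFF, (p >> 16) & 0xFF, (p >> 8) & 0xFF, p & 0xFF
    return ((a >> 4) << 12) | ((r >> 4) << 8) | ((g >> 4) << 4) | (b >> 4)

def argb4444_to_argb8888(p):
    """Expand back, replicating each 4-bit channel into 8 bits (0xC -> 0xCC)."""
    a, r, g, b = (p >> 12) & 0xF, (p >> 8) & 0xF, (p >> 4) & 0xF, p & 0xF
    rep = lambda c: (c << 4) | c
    return (rep(a) << 24) | (rep(r) << 16) | (rep(g) << 8) | rep(b)

roundtrip = argb4444_to_argb8888(argb8888_to_argb4444(0xFF346798))
print(hex(roundtrip))  # 0xff336699: the low nibbles are gone, hence the banding
```

Going the other way, 16->32, is the lossless direction, which matches the remark that 16->32 is easy and 32->16 is not.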