Forcing Anti-Aliasing for B3D stuff

Blitz3D Forums/Blitz3D Programming/Forcing Anti-Aliasing for B3D stuff

Dock(Posted 2007) [#1]
I'm working on a project that absolutely NEEDS anti-aliasing, otherwise the game looks like garbage.

It uses a lot of cel-outline stuff which just turns into jaggy rubbish without anti-aliasing. It looks great if I force anti-aliasing in the hardware control panel of the machines I try it on, but it's obviously not ideal for a distributable product.

I don't mind about the minimum hardware requirement that anti-aliasing would cause, it's just crucial for what I'm doing.

Does anyone know whether it is possible to write a 'wrapper' in C++ or otherwise which forces D3D to anti-alias another piece of software?

I presume there is still no proper anti-aliasing solution for B3D. I'd really like to force it on with a DLL or something similar, but I'm not familiar with anything.


boomboom(Posted 2007) [#2]
I think there is one already around, and it's quite OK to use. I don't know the name of it, but I am just posting to give you hope :)


Robert Cummings(Posted 2007) [#3]
There is no hope as that dll only works on an ancient version of blitz.

Forget it, you will never have anti-aliasing in Blitz3D. I asked for a long, long time and was basically told they have no plans to fix it.


Dock(Posted 2007) [#4]
An ancient version of Blitz? Which DLL are you referring to?

I know that BRL has no intentions to fix it, but I was more curious as to whether a more hacky approach to 'wrap' the executable might be theoretically possible.


Ross C(Posted 2007) [#5]
Try the sswift blur routine? You could use that to get some reduction in aliasing.


Dock(Posted 2007) [#6]
Ross, that doesn't really help me, because the problem comes from fine polygon areas that turn into dotted, jaggy lines. The sub-pixel sampling of anti-aliasing draws a fine line instead.


Ross C(Posted 2007) [#7]
Can't you just up the resolution? Sometimes that is just as good as anti-aliasing, and costs less in performance; as far as I know, all the graphics card does is render the scene to a much larger texture and use that to perform the anti-aliasing.


Dock(Posted 2007) [#8]
It doesn't really solve my problem to increase the resolution unfortunately. I would sooner run at 800x600 with anti-aliasing than 1600x1200 without.

Ah well, I guess I will have to accept this limitation of Blitz.


Rroff(Posted 2007) [#9]
Ultimately it's up to the end user... Intel integrated GPUs aside, you can force anti-aliasing on/off in the control panel for nVidia and ATI drivers; for nVidia you can even create an application-specific profile with FSAA forced... not sure about ATI.

I'd be tempted to put up a big alert on first run advising the end user to enable anti-aliasing on their end - because even if you did use another program/DLL to force it on, quite a lot of people run with global FSAA disabled and only enable it for specific games.


Gabriel(Posted 2007) [#10]
Ultimately it's up to the end user... Intel integrated GPUs aside, you can force anti-aliasing on/off in the control panel for nVidia and ATI drivers; for nVidia you can even create an application-specific profile with FSAA forced... not sure about ATI.

Forcing it on or off usually has no effect on a Blitz3D program. Forcing it on usually does nothing, and I can remember some reports from people who had antialiasing enabled even when they forced it off.

My advice? Just forget about antialiasing, because it's not going to happen. Or if you're completely desperate for it then try rendering the scene to a large texture and scaling it down to the size of the screen. Performance might be a bit dodgy, you'll have to adjust because resolutions are not generally powers of two, and I'm not even sure how good it will look, but it's probably the nearest you're going to get.
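For what it's worth, the render-big-then-shrink idea can be sketched in stock Blitz3D, with one big caveat: the backbuffer can never be larger than the display mode, so the downscaled frame ends up smaller than the screen rather than filling it. Everything here - the 1600x1200 mode, the 2048 texture size, the sprite scale, the placeholder scene - is an assumption to illustrate the approach, and the UV and scale numbers will need fiddling:

```blitzbasic
; Sketch only: render at 1600x1200, grab the frame into a texture,
; then show it on a sprite at roughly half size.
Graphics3D 1600,1200,32,1
SetBuffer BackBuffer()

scenecam = CreateCamera()
light = CreateLight()
; placeholder scene - put the real cel-shaded world here
cube = CreateCube() : PositionEntity cube,0,0,5

; flag 256 stores the texture in vram, needed for a fast CopyRect
tex = CreateTexture(2048,2048,256)
; only 1600x1200 of the 2048x2048 texture gets written, so stretch
; the UVs to show just that region (may need flipping on some cards)
ScaleTexture tex,2048.0/1600.0,2048.0/1200.0

; second camera parked away from the scene, so each pass sees
; only its own geometry
quadcam = CreateCamera()
PositionEntity quadcam,0,1000,0
quad = CreateSprite(quadcam)      ; parented to quadcam
EntityTexture quad,tex
MoveEntity quad,0,0,2
ScaleSprite quad,1.33,1.0         ; roughly half-screen; adjust to taste

While Not KeyHit(1)
	; pass 1: render the scene at the full 1600x1200 and grab it
	HideEntity quadcam : ShowEntity scenecam
	RenderWorld
	CopyRect 0,0,1600,1200,0,0,BackBuffer(),TextureBuffer(tex)

	; pass 2: show the frame on the sprite at a smaller size -
	; the card's bilinear filter does the smoothing as it shrinks
	HideEntity scenecam : ShowEntity quadcam
	RenderWorld
	Flip
Wend
End
```

The result is a smoothed picture letterboxed inside the larger mode, which is part of why nobody in this thread is thrilled with the approach.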


skidracer(Posted 2007) [#11]
All the AA methods implemented by DirectX 7 (edge filtering) were unfortunately discontinued in favor of full-screen methods, so it is extremely rare to find modern drivers with the relevant caps enabled.

As Gab says, one solution may be to use the maximum rendering size of the device and, as a final pass, stretchblit to the DirectDraw backbuffer - I doubt many Blitz3D games are currently bottlenecked in this area. AA is never free anyway...


Rroff(Posted 2007) [#12]
Forcing it on works fine for me, on all the PCs here - though, my laptop aside, they are all nVidia hardware of different generations.


t3K|Mac(Posted 2007) [#13]
Stretchblitting to the DirectDraw buffer does 'AA' on the whole texture/image, not just the edges, so the whole screen looks blurry.


IPete2(Posted 2007) [#14]
Is there any way you can render and blur just the outlines during a separate RenderWorld?

I know that a number of very experienced coders here use a fair few separate RenderWorld calls for specific reasons - lighting and effects etc. Maybe this is an occasion where you can use that.


IPete2.


Ross C(Posted 2007) [#15]
Here's me thinking Anti-aliasing actually blurred the whole screen. I never knew it only worked on edges :o)


Vertigo(Posted 2007) [#16]
Have you tried the ashadow library? Using the createblur camera, alpha# command will blur just the edges of a mesh if you keep that alpha param low. Just a thought.


boomboom(Posted 2007) [#17]
There are different types of anti-aliasing. The more common one nowadays renders the screen bigger, then shrinks it down.

For example, 2x anti-aliasing renders the screen twice as big as the graphics size set by the user, then scales it down. This makes everything a lot smoother. Of course this is all done in hardware on the GPU, but it still slows things down, especially when you go into 4x or 8x modes.


Pete Carter(Posted 2007) [#18]
I've had the same issue and have changed my camera angle and zoom values to lessen the jaggy effects. One way round it is to buy BlitzMax and use miniB3D, as I believe AA works in that. I've been meaning to move to Max anyway; it seems after my current project I will make the change. Only downside is that I will then be moaning about ATI's poor OpenGL performance, but you can't win them all :O)


t3K|Mac(Posted 2007) [#19]
I am moving to Cobra when my project is finished. Max is no option for me. And I doubt that AA will work in miniB3D - if it does, why doesn't it work in native B3D?


jhocking(Posted 2007) [#20]
The Antialias command works in miniB3D. Given that the AA problem in Blitz3D is a DirectX 7 thing, and miniB3D uses OpenGL, I'm not sure why you doubt it would work.
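For anyone trying it, and assuming miniB3D keeps the Blitz3D command names (the module path below is the usual sidesign.minib3d install; check the exact AntiAlias signature in your version), enabling it is a one-liner at startup:

```blitzmax
' Sketch: AA in BlitzMax + miniB3D. Assumes the stock
' sidesign.minib3d module and Blitz3D-style AntiAlias syntax.
Import sidesign.minib3d

Graphics3D 800,600,0,2
AntiAlias True                     ' the command jhocking refers to

Local cam:TCamera = CreateCamera()
Local cube:TMesh = CreateCube()
PositionEntity cube,0,0,5

While Not KeyHit(KEY_ESCAPE)
	TurnEntity cube,0,1,0
	RenderWorld
	Flip
Wend
```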


Dreamora(Posted 2007) [#21]
It would work in B3D as well if Mark implemented the solution posted on the board by some users.
It's just that the init of the graphics context seems to be broken.


Pete Carter(Posted 2007) [#22]
max is no option for me.


I used to think that, but if you can program in Blitz3D, Max isn't that different once you add miniB3D. Cobra uses OpenGL too, so I don't think it's much different to Max apart from having to learn all the new syntax. So Max is an option. Don't let the miniB3D name make you think it's not powerful, because it's an OpenGL B3D that works great.

Pete


t3K|Mac(Posted 2007) [#23]
I have no problem learning a new syntax. I support Graham, because he promised us a 3D engine and he delivered it (BRL did not!). If B3D gets a fully working DX9 engine, I think I would stick to B3D. But time will tell... and the next few months are fully packed with B3D coding. I am still happy with B3D (for my current project).


Pete Carter(Posted 2007) [#24]
Cobra3D isn't DirectX 9, it's OpenGL, and BlitzMax already has an OpenGL engine. BRL, as far as I know, never said there would be a DX9 engine for Blitz3D. They did say there would be a Max3D, which has changed, but there is miniB3D, which doesn't seem any different to Cobra3D.

What advantages does Cobra have over the BlitzMax/miniB3D combo?

Points against:

1. It's just for Windows.
2. It's much more expensive.


t3K|Mac(Posted 2007) [#25]
I know that Cobra is not DX9; I am not stuck on DX. Fact is, BRL promised us a working 3D engine for BMax for a loooong time, but it never happened. miniB3D is not the engine I expect from such a long development time.
Fact is, Graham did a good job and I support him. If BRL had managed to produce a good engine for BMax, I'd support them instead. Easy, eh? And that Cobra is Windows-only - so what? My current B3D game is only for Windows too.

But we are drifting away from the real topic, which is AA for B3D.


Ross C(Posted 2007) [#26]
Doesn't Cobra3D have a built-in physics engine, and easy-to-use shaders and shadows?


Pete Carter(Posted 2007) [#27]
But we are drifting away from the real topic, which is AA for B3D.


Well, that's been covered on this forum so many times before. IT DOESN'T WORK AND NEVER WILL! That's cleared that up. :O)

I'm not putting down Cobra - it seems quite nice - I just think that BlitzMax and miniB3D shouldn't be dismissed. If Max3D comes out as a games SDK with editors etc, which is what Mark's last worklog sounded like it said, I'll be very happy with BRL. I agree that Max3D should have been released, as I think Mark was quite far into making it before he changed his mind about the direction of the project.


t3K|Mac(Posted 2007) [#28]
@ross: yes, stencil shadows are easy to use. Shaders yes, but I haven't tried them yet :)

I don't dismiss BMax, I only say it's no solution for ME personally.


Pete Carter(Posted 2007) [#29]
t3K|Mac : ok fair play.

For me, I don't need shaders. I need my game to run well on as many systems as possible and look clean and unfussy. I think careful use of BlitzMax and miniB3D will be just the ticket. OpenGL has driver issues in Windows at the moment, but I still think it's a better option than DirectX 7.


t3K|Mac(Posted 2007) [#30]
What bothers me with DX7 is that animations are performed by the CPU instead of the GPU, and things like that. Shaders are a great addition for swaying grass, for example... I tried that with DX7 (vertex manipulation), and it was way too slow.
So it heavily depends what type of games you are making.
I don't need cutting-edge new features, but a solid base of "modern" effects (like shadows, bumpmapping, anti-aliasing, ...).


Pete Carter(Posted 2007) [#31]
It's a hard position to be in at the moment, because OpenGL has lots of driver issues with Windows - ATI seem to be affected the most by this. I have no idea how to code shaders, so I'd be looking at shaders that other people have coded, or I guess programs like this one http://developer.apple.com/graphicsimaging/opengl/shader_image.html
could be the answer for OpenGL.

I really hope that miniB3D and the extended version, which has shadows, shaders and physics, keep improving. I've been very impressed at what they have got so far.


t3K|Mac(Posted 2007) [#32]
We just have to wait and see what will change in the near future. OpenGL is in a hard position at the moment, especially with Vista's nearly nonexistent OpenGL support. But for the next few months I am happy with B3D as it is ;)


Gabriel(Posted 2007) [#33]
I have no idea how to code shaders, so I'd be looking at shaders that other people have coded, or I guess programs like this one http://developer.apple.com/graphicsimaging/opengl/shader_image.html
could be the answer for OpenGL.

It's a completely different type of programming and it's a new language, but it's not actually hard. Well, like any programming, the really good stuff is tricky, but the basics are definitely not hard. I don't know about GLSL, but CG and HLSL are fairly high-level languages, similar in syntax to C, except that you have a number of useful functions built in, automatic type casting, and no pointers, so it's actually a good bit easier than C, I'd say.

It takes a while to get your head around, because up until Shader Model 4 (which I've done no programming in at all) you could only program with a pixel or a vertex. It's odd thinking in terms of just one vertex or just one pixel, but when it does click, it takes a big weight off. You no longer have the problems associated with traditional programming, because all you're ever writing is a little snippet of code to deal with one pixel or one vertex.

Debugging can be tricky, as you can forget about debug logs and breakpoints, but you learn to put your debug information into something obvious (like the color of each pixel) and then it's not too bad.

It's kinda fun too. If you ever find yourself missing the days when programming wasn't such a chore and used to be kinda fun, then playing with shader programming might well be the way to go back to that. I found shader programming as much fun as any programming I'd done when I first got into it. And it doesn't cost anything, of course, because all the major tools are free.
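To make the "one vertex, one pixel" model concrete, here is a minimal HLSL sketch (Shader Model 2 style; all names are illustrative, not from any particular engine), including the put-your-debug-info-in-the-pixel-color trick described above:

```hlsl
// The host application sets this constant each frame.
float4x4 WorldViewProj;

struct VS_OUT
{
    float4 pos    : POSITION;
    float3 normal : TEXCOORD0;
};

// Vertex shader: runs once per vertex and knows nothing about
// its neighbours - all it does is transform one vertex.
VS_OUT vs_main(float4 pos : POSITION, float3 normal : NORMAL)
{
    VS_OUT o;
    o.pos    = mul(pos, WorldViewProj);
    o.normal = normal;
    return o;
}

// Pixel shader: runs once per pixel. To "debug", write the value
// you care about straight into the output color and eyeball it.
float4 ps_main(VS_OUT i) : COLOR
{
    float3 n = normalize(i.normal);
    return float4(n * 0.5 + 0.5, 1.0);   // visualize normals as RGB
}
```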


Pete Carter(Posted 2007) [#34]
That's cool, I'll have a look today. I'm going to look at GLSL because I'm looking at using an OpenGL engine for my next project. The last time I had the type of fun in programming you're talking about was when I got JV-ODE.


Dreamora(Posted 2007) [#35]
The main difference between normal coding and shader coding is that the hardware differs massively, and you have to know exactly what you are developing for: depending on the minimum target, you may only get a few textures, very few instructions, no branching, etc.