A real antialias routine

Blitz3D Forums/Blitz3D Programming/A real antialias routine

JoshK(Posted 2004) [#1]
All the antialias routines I have seen written in Blitz just kind of blur the image. Here's one that works. These images have not been retouched in any way.

Before:


After:


Give me credit if you use it, and a free copy of the application, whatever it is:



xmlspy(Posted 2004) [#2]
Wow!


xmlspy(Posted 2004) [#3]
BufferWidth, BufferHeight functions missing.


sswift(Posted 2004) [#4]
Guess someone's not banned any more. :-)


Question:

Why would you want to create a function to antialias an image when every modern 3D card likely to be capable of running your game can do antialiasing in hardware, just by going:

Antialias True

???

Any card which doesn't have built in antialiasing is unlikely to render images fast enough to do it with any other technique. And antialiasing in software like that will kill your framerate.

What is the purpose of this?


N(Posted 2004) [#5]
Antialias True

???

Any card which doesn't have built in antialiasing is unlikely to render images fast enough to do it with any other technique. And antialiasing in software like that will kill your framerate.

What is the purpose of this?


Actually, that doesn't work on a lot of people's systems. Setting AntiAlias True certainly doesn't work for me, and I have 2x, Quincunx 2x, and 4x available (and I've seen it in action in other games, so I know it works). Blitz has issues with antialiasing support, as far as I'm concerned.


sswift(Posted 2004) [#6]
"Setting AntiAlias True certainly doesn't work for me"

1. Are you running Blitz in a video mode which your card supports antialiasing in? My current card can only do antialiasing in 800x600 or below. It doesn't have enough video RAM to do it in higher-res modes. Your stats indicate you have a better card, but that might not help if you're trying to antialias 1600x1200.

2. Do you have your Direct3D set up to use "application preference" for antialiasing in your display properties? If not, then Direct3D will override your settings in Blitz.


N(Posted 2004) [#7]
1. Yes.
2. Yes.


sswift(Posted 2004) [#8]
Still too slow to antialias in software. :-)


Mustang(Posted 2004) [#9]
Nice stuff, Halo! But... I do consider AA an essential feature in today's games, yet I too doubt whether software AA would be fast enough. For simple games it might be, like platform games done in 3D? Also, what happens if the user HAS AA enabled and you do software AA on top of that?

Blitz3D should REALLY have multiple-choice AA settings (that really work)... would it be possible for someone to write a .dll that asks whether 2x, 4x, etc. are available and then sets them?


JoshK(Posted 2004) [#10]
This is just for making renderings. I wanted a reliable Antialias renderer for CShop. Even if you can tweak your graphics card settings to make it work, you should be able to have total control over it.

Plus, since it takes a few seconds, it makes it seem like I have a really amazing advanced renderer.


sswift(Posted 2004) [#11]
Well, what you're doing isn't really antialiasing. :-)

It's pretty much just finding changes between pixels which are too drastic, and smoothing those areas out. In the image above, you can see that this method is imperfect... The pipes are still jaggy, and indented corners become rounded.

To really antialias an image, you need to render it at 4x resolution and then average each block of four pixels down to 1. This would probably be faster than all these compares you're doing.

But of course you might not be able to render at 4x res cause the video card might not support that high a resolution. There must be some way to shift the camera though where you could render four images and average them to produce the same effect.
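sswift's render-at-4x-then-average scheme is easy to sketch. Below is an illustrative Python version (not Blitz code; the image is assumed to be a plain 2D list of grayscale values, and the function name is hypothetical):

```python
def downsample_2x2(pixels, width, height):
    """Average each 2x2 block of an image rendered at double
    resolution down to one output pixel (4 samples per pixel)."""
    out_w, out_h = width // 2, height // 2
    out = [[0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Sum the four high-res samples covering this output pixel.
            total = (pixels[2 * y][2 * x] + pixels[2 * y][2 * x + 1] +
                     pixels[2 * y + 1][2 * x] + pixels[2 * y + 1][2 * x + 1])
            out[y][x] = total // 4
    return out
```

The same averaging is what a video card's supersampling mode does in hardware, which is why it handles texture detail as well as polygon edges.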


WolRon(Posted 2004) [#12]
To really antialias an image, you need to render it at 4x resolution


I don't think that's necessary.

You can achieve antialiasing with another image of the same size, in other words, only twice the memory.


JoshK(Posted 2004) [#13]
Well, it's better than you have done, and it's free, so quit complaining. :-)

:-) :-) :-)



Who was John Galt?(Posted 2004) [#14]
I was going to snitch on halo and get him banned for the tone of that last comment, but the final :-) changed my mind.

;-) ;-) ;-)


JoshK(Posted 2004) [#15]
I hope that was sarcastic enough to convey my disgust.


AntonyWells(Posted 2004) [#16]
Ladies and pre-op men, please... calm down and proceed to the nearest mud pit to sort this out like proper women.

;) <genuine smiley that time!
-

Halo, although it's slow to do in software, check out shaders (assuming you're still using your custom engine).
What you can do is draw the color buffer to a texture (modern hardware can even use non-power-of-2 textures), then draw a quad across the entire screen (2D mode), textured with the screen, using your blur shader. You don't need to associate a vertex shader to use just a pixel shader.

You get the built-in filtering (or none) of GL, plus you can blur the scene in the shader. I use similar methods for a few things, and it's very fast (real-time fast).
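The full-screen blur pass Antony describes is just a small convolution applied per pixel. Here is a CPU-side Python sketch of the same 3x3 box filter a blur pixel shader would apply per fragment (illustrative only, not shader code; the image is a 2D list of grayscale values):

```python
def box_blur(img):
    """3x3 box blur over a 2D grayscale grid: average each pixel
    with its in-bounds neighbours, as a blur shader would."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = cnt = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        acc += img[ny][nx]
                        cnt += 1
            out[y][x] = acc // cnt  # average of the neighbourhood
    return out
```

On a GPU the texture units do the neighbour fetches and the filtering for free, which is why the shader version runs at real-time speed.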


CyBeRGoth(Posted 2004) [#17]
Cheers halo

This type of AA would be good for taking in-game screenshots and make them look better :)


Braincell(Posted 2004) [#18]
To really antialias an image, you need to render it at 4x resolution and then average each block of four pixels down to 1.


Yep, that's true. I know 3ds Max does that, and it is definitely the only real way of doing AA.


Gabriel(Posted 2004) [#19]
I don't think that's necessary.

You can achieve antialiasing with another image of the same size, in other words, only twice the memory.


Similar effect perhaps, but SSwift is correct when he says that true antialiasing is done by rendering an image twice as wide and twice as high and sampling from that.


Halo's effect is useful for anyone writing any kind of app though, because being able to take a nice snapshot with less jaggies is always useful. It's more of an edge blur routine than an antialias, but it's still useful.

And BTW, I have the same answers as Noel to AntiAlias True: yes, my card supports AA at the resolutions I'm running; yes, I have my drivers set to application preference; and no, AntiAlias True doesn't do a single thing. It's pretty broken.


JoshK(Posted 2004) [#20]
It is antialiasing. The first shot is aliased. The second shot is not.

This isn't a blur effect; it looks for lines and interpolates edge pixels along the slope of the line.

If you have something like this:

00000000000000000005555
00000000000000055550000
00000000000555500000000
00000005555000000000000
00055550000000000000000
55500000000000000000000

It creates this:

00000000000000012345555
00000000000123455554321
00000001234555543210000
00012345555432100000000
23455554321000000000000
55543210000000000000000
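The ramps in Halo's example can be approximated per scanline: find a hard step and spread it over a few leading pixels. A rough Python sketch of that idea (a hypothetical helper, not the actual posted routine):

```python
def ramp_edge(row, window=4):
    """Spread each hard step in a scanline of values over `window`
    pixels leading up to it, crudely mimicking interpolation along
    the slope of a detected line."""
    out = row[:]
    n = len(row)
    for i in range(n - 1):
        step = row[i + 1] - row[i]
        if abs(step) > 1:  # hard edge found between i and i+1
            for k in range(1, window + 1):
                j = i - k + 1  # pixel k-1 places before the edge
                # Only ramp pixels that belong to the flat run.
                if 0 <= j < n and row[j] == row[i]:
                    out[j] = row[i] + step * (window - k + 1) // (window + 1)
    return out
```

Run on a row like `[0,0,0,0,0,5,5,5]` this produces the `0,1,2,3,4,5` gradient shown above; the real routine presumably also follows the slope across scanlines, which a single-row sketch can't capture.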


Regular K(Posted 2004) [#21]
Nice stuff :)


JoshK(Posted 2004) [#22]
There's also no way to render a 4000-pixel-wide image with Blitz.


Tom(Posted 2004) [#23]
Looks nice!

Good to have you back Halo! :)
Tom


Rob Farley(Posted 2004) [#24]
>> There's also no way to render a 4000-pixel-wide image with Blitz.

There probably is, if you're clever enough with CameraZoom and camera positioning: point it at the right place and do x renders to build up the big image. It's way beyond my little brain, but I'm sure someone like you or Sswift could work it out (if you want to!)

Anyway, looks really good Halo. Certainly something to put in the code archives for future use.


John Blackledge(Posted 2004) [#25]
Good god - stop knocking the man! He does brilliant work and is always pushing back the boundaries. Keep going Halo!
'You can always tell the trailblazers, by the knives sticking out of their backs.'


gotelin(Posted 2004) [#26]
Can you tell me how I use the antialias routines?

Thanks


Dreamora(Posted 2004) [#27]
Pushing back?
It uses fast pixel operations... so anything lower than a GeForce3 is out of the race, at least for anything more than a nice screenshot function.


Picklesworth(Posted 2004) [#28]
Looks like you just call AntialiasBuffer on whatever buffer you want it on.


gotelin(Posted 2004) [#29]
Excuse me, but I don't understand you.
Can you post some example code, please?

Thanks


Eric Draven(Posted 2004) [#30]
"And BTW I have the same answers as Noel to antialias true, yes my card supports AA at the resolutions I'm running, yes I have my drivers set to application preference, and no Antialias True doesn't do a single thing. It's pretty broken."

Same here. GeForceFX 5600 128MB.


Drey(Posted 2004) [#31]
Hardware AA is done differently. It takes either 2, 4, or 6 samples of colors from within ONE pixel and averages them together. ATI uses a rotated-grid sample pattern, roughly like this:

 ________
|   *    |
|      * |
| *      |
|     *  |
 --------
(4 samples)

NVIDIA uses an ordered grid:

 ________
| *   *  |
|        |
| *   *  |
|        |
 --------
(4 samples)

which isn't as good as ATI's.

Anyways, if you want your games to look better for a screenshot or some other reason, that's fine. But anything that's run-time? No way. I think you know that though.


poopla(Posted 2004) [#32]

Credit is fair...but possibly a full commercial application?

Good to have you back Halo! :)



He didn't ask for anything that wasn't fair (as in, he has that right, and if you use it, you live with it). So live with it ;).


RGR(Posted 2004) [#33]
;-


skidracer(Posted 2004) [#34]
<halo boast deleted due to obvious flame bait content>


Tom(Posted 2004) [#35]
You're a better coder than me; I follow your threads because most of the time you get into juicier (more technical) stuff, and I sometimes learn something. I appreciate you for that!

But then along comes your ego...

I've not taken offence at the 'stupid' remark, because I know I'm not. I'm just not as motivated & dedicated as you :)

My first post was intended as a light hearted welcome back, but I'll remove it to keep the peace.

Take it easy!
Tom
p.s there's a giant kiwi thumb hovering over my house!!! HELP!


gotelin(Posted 2004) [#36]
Good afternoon

Can you send me an example with real antialias code, please?

I don't know how to use the routine.
Can you help me, please?

Thanks


Gabriel(Posted 2004) [#37]
Hardware AA is done differently. It takes either 2, 4, or 6 samples of colors from within ONE pixel and averages them together.


If it took 6 samples of color from ONE pixel, wouldn't all six samples be the same color? :)


Gabriel(Posted 2004) [#38]
Good afternoon

Can you send me an example with real antialias code, please?

I don't know how to use the routine.
Can you help me, please?




Seriously man, it's one function call. If you need a code sample, I find it hard to believe you're going to be able to make any use of it. But what the hey..

AntiAliasBuffer(FrontBuffer(),2)


AntiAliasBuffer(BackBuffer())


AntiAliasBuffer(ImageBuffer(AnyImageHandle),2,15)


There's three.


AntonyWells(Posted 2004) [#39]
You do realise he's going to use all three at once right? :)

(Just kidding gotelin ;))


AdrianT(Posted 2004) [#40]
Antialiasing in hardware with Blitz causes weird flickering on my system, like one black frame every 2-5 frames.


Gabriel(Posted 2004) [#41]
You do realise he's going to use all three at once right? :)



hahaha, don't say that. I'll weep if he does. Literally.


gburgess(Posted 2004) [#42]
There's no correct way to do antialiasing. Some people seem to think that it was invented for 3d graphics cards, and the method they used is the 'right' method.

My Archimedes did font antialiasing in 1988, using a different method to both Halo's and the 3d graphics card. It's all good, baby.


sswift(Posted 2004) [#43]
"If it took 6 samples of color from ONE pixel, wouldn't all six samples be the same color? :)"

I assume you're being facetious, but for others and in case you're not...

Antialiasing does take multiple samples from within the same pixel, but one sample might be at the center of the pixel, and another at the edge.

Normally, each pixel on the screen is the color of whatever happens to be at the very center of that square. Everything else in the square is ignored. That is why a pixel will go from being black to instantly being white as a square moves over it. In reality, if you took all the colors in that square defined by the pixel, you'd get shades of grey as a white square passes over a pixel with a black background.

Basically subpixel sampling is like making lots of smaller pixels within a pixel.
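sswift's description of subpixel sampling can be sketched directly: test a shape predicate at several offsets inside the pixel's square and average the results. This is an illustrative Python sketch; the `inside` predicate and the offset positions are assumptions, not any card's actual sample pattern:

```python
def sample_pixel(inside, px, py,
                 offsets=((0.25, 0.25), (0.75, 0.25),
                          (0.25, 0.75), (0.75, 0.75))):
    """Estimate how much of the pixel at (px, py) is covered by a
    shape, by testing `inside(x, y)` at several subpixel positions
    and averaging the hits (0.0 = empty, 1.0 = fully covered)."""
    hits = sum(1 for ox, oy in offsets if inside(px + ox, py + oy))
    return hits / len(offsets)
```

With a white shape on black, a coverage of 0.5 becomes the mid-grey sswift describes for a pixel the edge passes through, instead of snapping from black straight to white.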


Drey(Posted 2004) [#44]
Look, I read it somewhere. It takes samples around the center. No, they're not the same color, because there are different polygons inside the pixel. The box didn't come out right for whatever reason, but it takes those samples and averages them together. Just check out sites like anandtech.com and ati.com and read up on it.


skidracer(Posted 2004) [#45]
I would be more concerned that there are still nasty jaggies on the top of horizontal edges and the left side of vertical edges due to the imperfections of the implementation posted.

5 points to the coder who points out where Halo is going wrong.


AntonyWells(Posted 2004) [#46]
I'll take 'He needs to run each pixel through an edge-detection matrix and blur each 8x8 sample by the average edge bias of that sample instead', for fifty.

How wrong was I?


AntonyWells(Posted 2004) [#47]
Then again, I don't see how either method would work with textures applied. (How can you tell apart a texture detail and an edge?)


Gabriel(Posted 2004) [#48]
I assume you're being facetious, but for others and in case you're not...


I was. I'm glad someone spotted it.


Yue(Posted 2011) [#49]
Would someone be kind enough to give me a practical example of this function? It gives me a ton of errors.


Adam Novagen(Posted 2011) [#50]
Since this was for a version of Blitz released over six years ago, it's no surprise you got errors. There could be any number of changes in modern Blitz that would render this code useless.


Yasha(Posted 2011) [#51]
There's a faster solution here: http://www.blitzbasic.com/Community/posts.php?topic=86928

...but it only really works for low resolutions (1024 and below).


Subirenihil(Posted 2011) [#52]
Blitz discontinued AntiAlias True a while back; AntiAlias does nothing now. If you want your game to be antialiased, you'll need to change settings on your graphics card. Most cards have a default setting for applications; mine is set to antialias a program unless it has overrides built in. A Blitz program can be antialiased fairly easily. The unfortunate part is that it can't be done within the program itself; it has to be set in the graphics card controls.


rio29(Posted 2014) [#53]
Can't complain :) That's a lot of work, sir. For a noob like me on Blitz3D,
I will learn something at least. Thanks very much for the code, and if I happen
to write the ultimate game ever :D with Blitz3D, then sure, why not, the credit
will always be yours.