Wow, Antialiasing really sucks on GF2!!


ChrML(Posted 2003) [#1]
Antialiasing really sucks on the GeForce 2 MX 400. That's why I added a function to enable/disable it. When antialiasing is disabled on the GF2 I get a framerate of 74-76 in my game (without a frame limiter), but with antialiasing I only get 30-32. Any ideas on what makes antialiasing that slow?


Ross C(Posted 2003) [#2]
Antialiasing is usually slow, unless you've got one of the newer cards. At least you can get your antialiasing to work. On a number of machines (there's a thread on the Blitz3D bug report forum) it just refuses to work in Blitz.

Well, I've never really needed it anyway. Just bump up the resolution. :)

What res are you doing antialiasing at, anyway?


ChrML(Posted 2003) [#3]
I'm doing antialiasing at 800x600, but for some reason I can't see a big difference. So bumping up the resolution seems better.


sswift(Posted 2003) [#4]
4x Antialiasing is worthless, unless your monitor is of limited resolution.

The reason for this is that when you turn on 4x antialiasing, what the card is doing is rendering an image twice as wide and twice as tall... at 800x600, that means 1600x1200. It then scales this down, setting each pixel to the average of the 4 pixels at that location in the larger image.
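
For illustration, here's roughly what that averaging step amounts to, written out as a Blitz-style sketch (the buffer names are made up, and the card does all of this in hardware, not per pixel like this):

; sketch only: average each 2x2 block of a double-size render into one output pixel
Function Downsample4x(bigbuf, smallbuf, width, height)
	For y = 0 To height - 1
		For x = 0 To width - 1
			r = 0
			g = 0
			b = 0
			For dy = 0 To 1
				For dx = 0 To 1
					argb = ReadPixel(x*2 + dx, y*2 + dy, bigbuf)
					r = r + ((argb Shr 16) And 255)
					g = g + ((argb Shr 8) And 255)
					b = b + (argb And 255)
				Next
			Next
			; each output pixel is the average of its four samples
			WritePixel x, y, ((r/4) Shl 16) Or ((g/4) Shl 8) Or (b/4), smallbuf
		Next
	Next
End Function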

It's worthless, because you could just be displaying the graphics in that higher resolution and be getting better performance! Honestly... would you rather look at a blurry 800x600 image, or a sharp 1600x1200 image? I think the answer is obvious. :-)

But as I said, it is useful if you, say, have a monitor which can only do 1024x768. Then you can improve the image quality with antialiasing. That's why console games use antialiasing: TVs can only do 640x480.

The reason for your slowdown, if you haven't yet figured it out, is that you're really rendering the scene at 1600x1200 when you have antialiasing on. So naturally your card chokes.

Try 1152x864 without antialiasing. Or 1280x960 (or 1280x1024 if you're using an LCD screen with that as its native resolution). It'll look nice and sharp and be reasonably fast.


Ross C(Posted 2003) [#5]
Tv's only have a resolution of 640*480??

Learn something new every day :)


Bouncer(Posted 2003) [#6]
My Philips Widescreen can do 1024*768


fredborg(Posted 2003) [#7]
TVs are 720x576 (at least for PAL in Europe), but part of the image is outside the visible area.


sswift(Posted 2003) [#8]
PAL is different than NTSC, and I'm not counting the area outside of the visible area as that would be pointless. :-)

NTSC is actually advertised as having 720x525 resolution, but only 480 vertical lines are visible, and I've never heard of a game console that supported more than 640 pixels across. To do otherwise would mean a non-square pixel aspect ratio.


"Tv's only have a resolution of 640*480??"

Well actually, it's more complicated than that.

TVs have 480 vertical lines of resolution, but only HALF of those lines (even/odd) are updated each frame. The fields are drawn at 60 fields per second, so you only get one full-screen picture change 30 times a second.

The result of this is that you either get 30fps progressive video (progressive means full-screen changes), 60fps interlaced, or half the vertical resolution at 60fps. Interlaced means you get flickering and jagged edges. If you capture an interlaced TV show on your PC and display it on your monitor, it looks like crap, because you can see the interlacing really easily on a monitor, whereas it's harder to see on a TV because the TV phosphors are slower to change, which gives the image a bit of motion blur.

I think some of the newer TVs can display 60 full FRAMES per second, and I know that some consoles like the GameCube have special filters built into the hardware to blur the fields a bit, to hide the interlacing while still getting high resolution and high framerates.

And something else interesting to note is that movies are 24 frames per second. As you can't display 24fps video on a TV which can only display 30fps video, they have to do a special conversion called a "3:2 pulldown". This means that one progressive frame is placed on three fields in a row (even, odd, even), then the next progressive frame is placed on two fields in a row (odd, even), and so on: (odd, even, odd), (even, odd), repeat.
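
To make the cadence concrete, here's a little Blitz-style sketch of it (the numbering is just for illustration):

; sketch: map 24 film frames onto 60 NTSC fields using the 3:2 pattern
fieldcount = 0
For frame = 0 To 23
	; alternate between 3 fields and 2 fields per film frame
	If (frame Mod 2) = 0 Then fields = 3 Else fields = 2
	For f = 1 To fields
		Print "film frame " + frame + " -> TV field " + fieldcount
		fieldcount = fieldcount + 1
	Next
Next
; 12 frames x 3 fields + 12 frames x 2 fields = 60 fields, i.e. one second of video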

This results in some artifacts, and DVD players try to compensate for them. And when you want to capture these kinds of movies to your PC, not only do you have to deal with the interlacing, but now you also have to deal with the pulldown. So you do an "inverse pulldown", which reverses the interlacing and the 3:2 pulldown and converts the movie back to 24 frames per second progressive.


Ross C(Posted 2003) [#9]
Very cool. Thanks for that wee lesson :)


fredborg(Posted 2003) [#10]
PAL is different than NTSC, and I'm not counting the area outside of the visible area as that would be pointless. :-)

Why? The teletext is stored in the top 15-20 lines of the TV image. It would be so sweet if a game for one of the consoles utilized this :)


ChrML(Posted 2003) [#11]
OK, now I see why antialiasing really sucks, but does Blitz3D really use 4x antialiasing?


sswift(Posted 2003) [#12]
Only if you enable it.


Qube(Posted 2003) [#13]
lol, nice info on TV image generation sswift :)


smilertoo(Posted 2003) [#14]
Anti-aliasing is fine on my PC; it can make a nice difference in some games.


Anthony Flack(Posted 2003) [#15]

Honestly... would you rather look at a blurry 800x600 image, or a sharp 1600x1200 image? I think the answer is obvious. :-)



Okay, I know I'm in the minority here, but I would rather have the blurry 800x600 - honestly!

I much prefer the soft edges. And I think that the look of 3D games running at really high res is a little... creepy, somehow. The actual level of detail in the models and textures generally can't match that of the screen res.


ChrML(Posted 2003) [#16]
Well, I do enable antialiasing with this:
AntiAlias True

My game loads a config file, and if it's set to true there then it's set to true in-game, so my external configurator made in Delphi can set it (like most games do) - roughly like the sketch below. But is there a way to control whether it uses 2x or 4x?
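
A rough sketch of that config loading (the "settings.cfg" name and the "aa=1" line format are just assumptions about what the external configurator writes out):

Graphics3D 800, 600, 32, 1

; read the AA setting the external configurator saved
useaa = False
file = ReadFile("settings.cfg")
If file
	line$ = Lower$(Trim$(ReadLine$(file)))
	If line$ = "aa=1" Then useaa = True
	CloseFile file
EndIf

AntiAlias useaa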


Red Ocktober(Posted 2003) [#17]
I've gotta look at this more... I'm noticing more and more people complaining about frame rate in relation to antialiasing...

... especially with the GeForce2 MX

To be honest, I haven't noticed much slowdown, but there's gotta be something to all this discussion...

--Mike


ChrML(Posted 2003) [#18]
Yup, there is. When I don't have AA enabled I get 74-76 fps in 800x600, but when I enabled AA the mouse really hung, the controls reacted slowly, and I got only 26-34 fps. It felt much slower than if I set the fps to 30 with my frame limiter.


Kuron(Posted 2003) [#19]
I have a GeForce2 MX 400 64MB PCI in a 1GHz Celeron with 512MB RAM, and when using AA, even at 4x, I have always got nice results speed-wise. That said, I usually have it on 2x, and only use 4x when I'm grabbing screenshots to use for eye candy...


Red Ocktober(Posted 2003) [#20]
hey HyperBlitzer...

Do you get the same or similar results when in higher resolutions?

What color depths are you using?

--Mike


Mustang(Posted 2003) [#21]
Do note that AA sucks on some cards because it needs LOADS of extra VRAM, and if you don't have it then what follows is massive swapping over the AGP bus. So if you have an old card with, say, 32MB of VRAM, AA probably sucks (and those GPUs are already slow and ancient), and upping the resolution with AA just consumes more VRAM. Use a low resolution if you need to use AA.
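
To put rough numbers on that (assuming 32-bit colour and a 32-bit depth buffer): 4x AA at 800*600 means the card is effectively holding a 1600*1200 render target, and 1600*1200*4 bytes is about 7.3MB for colour plus roughly the same again for depth, on top of the normal front/back buffers and all your textures. On a 32MB card that eats a big chunk of VRAM before you've drawn a single polygon.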

And I agree Anthony - I'd rather have AA'ed 800*600 ANY DAY over sharp 1600*1200! 1600*1200 does make the jaggies go away somewhat without AA, but it makes everything too sharp and pointy IMO, as generally speaking textures are not designed to look good at that high a rez. And there's the FILLRATE problem creeping in too; it gets BAD very easily when you're using high resolutions.

My ideal (default in my own game) resolution setup is 1024*768*32, plus AA if you have a card fast enough for it, like the new Radeons.


Mustang(Posted 2003) [#22]
is there a way to control whether it uses 2x or 4x?


I don't think so... it's either ON or OFF, and it probably uses the default value you have set in your drivers. To be sure, you'd have to do tests: render and save a screen from Blitz3D, then force each AA option in turn from the drivers, and compare those rendered Blitz3D screens to the one you got with Blitz3D's AA on.
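
A rough sketch of that test, dropped into the render loop (the file name is arbitrary):

RenderWorld
; grab the frame before it hits the screen, then compare this file
; against shots taken with 2x/4x AA forced in the driver panel
SaveBuffer BackBuffer(), "aa_test.bmp"
Flip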


Rottbott(Posted 2003) [#23]
In my case Blitz gives me a choice of either OFF or OFF. Pity. Even forcing it on in my graphics driver settings doesn't turn it on in Blitz programs.


Anthony Flack(Posted 2003) [#24]

And I agree Anthony -



Hooray! I'm not alone...


Kuron(Posted 2003) [#25]
Hooray! I'm not alone...
Count me in too.

I have 20/200 vision; I can take off my glasses and have awesome AA with no FPS drop ;c)


smilertoo(Posted 2003) [#26]
I wouldn't class an MX card as a true GeForce; they're crippled, and even NVIDIA have stopped the practice with the new FX cards... now they just sell slower cards, not castrated cards.

(FX cards are not much faster than GF4) :(


ChrML(Posted 2003) [#27]
Hmm, I just think it's weird that my framerate is halved and my computer's reaction time drops to a tenth when AA is on.

Could my drivers have something to do with it?

To Red October: Yeah, I do.


Red Ocktober(Posted 2003) [#28]
... yeah, agreed... that does seem to be a bit too much of a hit. Not sure of a solution yet...

Mustang... I think that you CAN control whether you are running 2X or 4X AA... although maybe not in Blitz. Download the Enigma Rising Tide demo (ver 201) and go to the video preferences... there is a choice of off, 2x, or 4x AA.

--Mike