Leadwerks Engine compatibility test


JoshK(Posted 2007) [#1]
http://www.leadwerks.com/ccount/click.php?id=46

This one removes a stencil test that was not supported by ATI cards and makes some internal changes. We are curious to see how this performs on the ATI X1xxx series, as well as the GeForce 6xxx series.

This will not run on an ATI X800, GeForce 5 series, or below.

NVidia released a new set of drivers a few days ago. Please update your drivers.


Gabriel(Posted 2007) [#2]
Runs between 90 and 130 FPS, depending on where in the scene I am.

That's on a Geforce 8600GTS.

There are some nasty shadow artifacts on the balls as they roll. Presumably you have an option to disable self-shadowing, and you probably want to do that on the balls.

Otherwise it seems fine, although I hope there's a lot more going on there than I can see because 90 FPS on an 8600 GTS on such a simple scene is scary.


JoshK(Posted 2007) [#3]
The 8600 is 3-5 times slower than the 8800. There is an additional optimization we can make that will improve lighting performance by about 200%, but NVidia is having trouble with it in their current drivers.


Rob Farley(Posted 2007) [#4]
I just get a black screen with a flare moving around...

104 FPS at first... GeForce FX 5900XT, Win XP Home, Athlon XP 2600


xmlspy(Posted 2007) [#5]
ATI Radeon X1950 Pro 512MB
AMD Athlon 64 X2 3800+ (dual core)
2GB RAM

Averages 1 FPS (yes, one FPS)
50% CPU, 80,000K mem usage

Scene shows up for a few seconds and then it turns into a black screen with a flare.


AJirenius(Posted 2007) [#6]
Hmm, don't know what's wrong, but I get way below 1 FPS on my ATI X1950 Pro card. Is it supposed to be like that?

(By "way below" I mean it shows 1 FPS but acts like 0.2 FPS.)


JoshK(Posted 2007) [#7]
The GeForce 5900 is an SM 2.0 card and will not run this.

We are fixing some ATI incompatibilities right now; we have identified the issue. The reason you are getting 1 FPS is that the card cannot support a feature and is falling back to software emulation, which always runs at 0-2 FPS.

AJirenius, does the scene appear correctly, even if it is only 1 FPS? We can fix the framerate, but I want to know whether the scene renders properly.
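
As an aside on detecting that fallback: OpenGL has no formal "running in software" query, so one heuristic of this era is to scan the shader info log, where some drivers mention the fallback. A rough sketch - the scanned wording and the loaded-pointer name (glGetShaderInfoLog_) are assumptions, not anything Leadwerks actually does:

#include <GL/gl.h>
#include <GL/glext.h>
#include <cstring>

// Hypothetical already-loaded GL 2.0 entry point (e.g. via wglGetProcAddress).
extern PFNGLGETSHADERINFOLOGPROC glGetShaderInfoLog_;

// Heuristic only: some drivers note a software fallback in the info log,
// but the exact wording varies by vendor, so this can miss or false-match.
bool LikelySoftwareFallback(GLuint shader)
{
    char log[4096] = {0};
    glGetShaderInfoLog_(shader, sizeof(log) - 1, 0, log);
    return strstr(log, "software") != 0;
}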


Rob Farley(Posted 2007) [#8]
The GeForce 5900 is an SM 2.0 card and will not run this.
All is clear... Lovely flare anyway! I'm sure you've already thought of this, but would it be worth listing in your original post which card series definitely won't work, so people like me don't waste their own time and yours?


Amon(Posted 2007) [#9]
I'm disappointed in the speeds considering the system I have.

I'll stick with TrueVision3D.


JoshK(Posted 2007) [#10]
Like I said, the 8800 is 10-15 times faster than the 7 series.


puki(Posted 2007) [#11]
This makes no sense to me - compared to some peoples' results.

On my 8800 GTX, Core 2 Duo E6600 processor with high performance RAM, I only get 72 FPS (constant) on either test.

Can people post their frame rates from the start position (i.e. don't move the camera)? Mine is a static 72 FPS.

EDIT:
Actually, I am just going to download the latest nVidia drivers.


ChaosCoder(Posted 2007) [#12]
0 FPS. I can see the scene, but it only updates about once every 30 seconds...

NVidia 6200 | AMD Athlon XP 2200+ | 512 MB RAM


taumel(Posted 2007) [#13]
I guess my good old GeForce3 (my adventure card) isn't supported, but hmmm, I can run Ogre's shadow test, and there, with default settings, I get 60 FPS with stencil shadows and 86 FPS with their texture implementation, as long as I use DirectX as the renderer. With OpenGL - *boom* - Ogre crashes...


puki(Posted 2007) [#14]
Hold on!

I just updated my nVidia drivers and the FPS dropped to 61.

I've lost 11 FPS.

How come I get 61 FPS on my set-up when some people get about double that on lesser hardware?


puki(Posted 2007) [#15]
Wait a minute - this thing locks on to your current refresh rate.

When I updated the driver, it automatically set my refresh rate to 60Hz.

How do I let this thing fly at top speed?


puki(Posted 2007) [#16]
Stop press!

I forced the refresh rate off - I now get like 300+ FPS - saw it hit 468 at one point!

Right, I am going to leave that sucker switched off in future - this thing is absolutely flying.

EDIT:
It actually hit 570 FPS at one point - when facing a wall.

EDIT:
Make that 601 (self-touch) maximum FPS!!!

EDIT:
Got to 641 FPS when looking at the floor.


siread(Posted 2007) [#17]
1 FPS on Radeon X1600.


jfk EO-11110(Posted 2007) [#18]
"when looking at the floor"...wow. Just add this to the manual. "To improve you Framerate simply look at the floor all the time"

Sure halo wants to do something good, but right now it seems the engine is a little picky.


puki(Posted 2007) [#19]
People probably need to test with their vertical refresh/sync setting set to 'off', just to see what sort of difference it makes.

Made a huge difference for me.

EDIT:
Not the one in the Leadwerks app - I'm talking about the one in your display settings for your card.


Gabriel(Posted 2007) [#20]
People probably need to test with their vertical refresh/sync setting set to 'off', just to see what sort of difference it makes.

Ideally, it should make none. The default (and best) setting for VSync is application preference, and since the demo gives you that preference, you should be able to turn it on and off without ever touching your drivers.
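
For reference, a minimal sketch of what "application preference" means for a Windows OpenGL app: the application asks for the WGL_EXT_swap_control extension and sets the swap interval itself. This is the standard mechanism; whether the demo does exactly this is an assumption:

#include <windows.h>
#include <GL/gl.h>

// wglSwapIntervalEXT is provided by the WGL_EXT_swap_control extension.
typedef BOOL (APIENTRY *PFNWGLSWAPINTERVALEXTPROC)(int interval);

// Requires a current OpenGL context. Returns false when the driver does
// not expose the extension - e.g. when the control panel forces VSync
// on or off instead of leaving it to the application.
bool SetVSync(bool enabled)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (!wglSwapIntervalEXT)
        return false;
    return wglSwapIntervalEXT(enabled ? 1 : 0) != FALSE;
}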

The 8600 is 3-5 times slower than the 8800.

I don't doubt it, but I render test scenes in my game with slightly greater complexity than this - with shadows, animated models, normal mapping, specular lighting, bumpmapping, etc., very much as you have here - and get 600-700 FPS.


JoshK(Posted 2007) [#21]
Another thing to keep in mind is the engine is using a renderer designed to scale well with large scenes. The light overdraw in this test is only around 1.0, so it's too small to see the difference it makes. As scene complexity increases, the lighting requirements don't change...it just processes lighting on the final pixels, regardless of how many polys make up the scene. So this engine can display scenes with real-time lighting that would be impossible in many other engines. The engine renders large outdoor scenes without any problem. We'll have some more demos later to demonstrate this point.

If you graphed performance vs. scene complexity for Doom 3 and Leadwerks Engine, Doom 3 would start off faster but would decline more quickly as complexity rises, and in more demanding scenes Leadwerks Engine would run much faster. This is similar to how the STALKER engine's deferred renderer works.

So Puki, a much larger scene with the same number of lights will run at about the same speeds you are getting now.
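
To make that scaling argument concrete, here is a toy cost model (a sketch in arbitrary work units, not Leadwerks measurements; the light count and resolution are assumptions):

#include <cstdio>

// Toy cost model for forward vs. deferred lighting. The numbers are
// arbitrary "work units"; only the shape of the two curves matters.
int main()
{
    const long long lights = 8;
    const long long shadedPixels = 1024LL * 768LL; // fixed per frame

    for (long long tris = 10000; tris <= 10000000; tris *= 10) {
        // Forward (Doom 3 style): geometry is effectively re-processed
        // for every light that touches it.
        long long forward = tris * lights;
        // Deferred (STALKER style): geometry once, then lighting on the
        // final pixels only, regardless of polygon count.
        long long deferred = tris + shadedPixels * lights;
        printf("%9lld tris: forward %11lld, deferred %11lld units\n",
               tris, forward, deferred);
    }
    return 0;
}

The deferred column starts out higher (the fixed per-pixel lighting cost dominates small scenes) but grows far more slowly, which is exactly the crossover described above.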


JoshK(Posted 2007) [#22]
Lowering the screen resolution will give the biggest increase in speed. Also, don't forget to disable VSync (press v).

Try this:
http://www.leadwerks.com/files/Engine_640x480.zip


Dustin(Posted 2007) [#23]
Looks great and runs well on my beefy system.

But whatever you do... DON'T PRESS THE WINDOWS KEY! Man, am I dizzy!


Naughty Alien(Posted 2007) [#24]
Running at a stable 72 FPS here..

nVidia 7600GT, 256MB RAM


FBEpyon(Posted 2007) [#25]
Well, it doesn't even load for me. I guess it must be Vista :(


_33(Posted 2007) [#26]
Regarding the 8800GTS slowdowns:
http://www.theinq.com/?article=41351

So you'll have to be pretty patient, guys.

Not only that, but basing a game engine on one graphics card model isn't proving to be a good choice. Better to widen compatibility and stop being an nVidia fanboy.


Svenart(Posted 2007) [#27]
130 FPS with a GeForce 8800GTS - and this while LightWave is rendering a high-poly scene with 16 threads... The demo looks very nice to me, well done.


Loktar(Posted 2007) [#28]
38 FPS
7600GS
1.5GB RAM
AMD 3500+ @ 2.2GHz


taumel(Posted 2007) [#29]
Hmmm, I wonder how reasonable it is to write a 3D engine that is good at large scenes but not as fast at smaller ones, if you're targeting indie developers and their customer base. Well, at least time is on your side.

May I ask which light types support shadow casting so far?


Dragon57(Posted 2007) [#30]
All I get is a black screen on the machine with the GeForce 6800 GT (see my sig).


LarsG(Posted 2007) [#31]
Engine and Engine_low demos: the window opens.. then closes again..
640 demo: fullscreen opens (turns black).. then closes again.

This is on my work laptop:
Asus A7J 17" WXGA+, ATI X1600 256MB, Core Duo T2300, 1GB

Might be because I'm running an extended desktop on an external monitor.. I might try without, later..

I've been following your "worklog" on the Leadwerks forum, and I must say it looks like you're doing some cool work, Josh..
Too bad this engine isn't cross-platform, seeing as I'm on a Mac at home.. I would definitely consider buying it if it weren't Windows-only..


[edit]
tried without extended desktop, and it still won't work..
the same thing happened on the three demos.. :(
[/edit]


xlsior(Posted 2007) [#32]
I see a scene with some barrels for about a second, then a black screen with a moving flare, then just black.
ATI X1650XT (which speed-wise should be similar to a GeForce 7600GT)

FPS: 1
vsync: 1 (press v to toggle)
GFill odrw: 1.00000000
GFill time: 0.00000000 ms
Light odrw: 0.00000000
Light time: 0.00000000 ms
Ligh:-1.#IND0000 ns/pixel
Ligh:-1.#IND0000 MPix/s
shadow time: 0.00000000 ms


Perturbatio(Posted 2007) [#33]
In single-GPU mode I get a constant 61 FPS with VSync off.

(Back against the wall, facing barrels)


puki(Posted 2007) [#34]
WAIT A MINUTE. "Naughty Alien" is claiming a stable 72 FPS - which is what I had at first.

I found that toggling the 'vsync' option inside the engine test made no difference - I had to do it via the hardware driver software.

Some people are being held to their default refresh rate. Deactivate it via your Catalyst Control Centre or nVidia Control Panel settings - you at least then get to see the potential speed.

Right, I'm off to try the second demo thing now.

EDIT:
Mmm, I find the 'vsync toggle' is working now - could be that I didn't have my hardware setting set right previously.


puki(Posted 2007) [#35]
With that new demo, I am stabilising at 565 FPS whilst not moving the mouse - i.e. standing still, facing the barrels. Maximum FPS (when looking at a floor piece) was 833 FPS.


puki(Posted 2007) [#36]
I managed to break the physics too. Not sure if this is a known issue or not.

I find that with the stack of barrels - you can knock into it and some of the barrels will lean, but not fall - even though nothing is holding them in place.


Any chance of a dynamic light demo? Like walking around with a light source? Like carrying the lantern that is lighting the scene?


JoshK(Posted 2007) [#37]
Here's what the engine is designed for: lots of dynamic lights in a large scene. (An additional variance option will soften the shadows a lot more and allow a lower resolution.) You can also ignore the toon shading here:



degac(Posted 2007) [#38]
First test:
38 FPS (VSync on)
62 FPS (VSync off)


NVidia released a new set of drivers a few days ago. Please update your drivers.



Sigh... second test:
25 FPS (VSync on)
38 FPS (VSync off)
and the whole room is painted blue.

Sigh :(

Athlon 64 3500+, nVidia GeForce 6600, 1GB RAM


Doiron(Posted 2007) [#39]
I only see a flashing brownish screen with a flare.

Specs: AMD X2 5200+, ATI Radeon X1950 Pro 512MB (GeCube), 2GB RAM.


JoshK(Posted 2007) [#40]
http://www.leadwerks.com/post/ATI_Test.zip

Please extract the exe and shaders into the Test3 folder. Make sure you delete or overwrite the old shaders folder.

If you have an ATI card above an X800, please let us know your results.

Thank you.


SebHoll(Posted 2007) [#41]
nVidia Go 7600GT with 256MB RAM here, with an Intel Core 2 Duo 7200; runs at around 45 FPS on Vista x86 Ultimate.


AJirenius(Posted 2007) [#42]
As I said: below 1 FPS on the X1950, but it looks OK.


puki(Posted 2007) [#43]
Mmm.

In Doom 3, I got very excited on a level where I had to escort a sort of scientist bloke - he was carrying a lantern to light our way.

Does the engine mimic this? Can I carry a light source with me in that way?

I love games that feature light and dark - like the Thief games, Doom 3, Oblivion, S.T.A.L.K.E.R., etc.


Doiron(Posted 2007) [#44]
Please extract the exe and shaders into the Test3 folder. Make sure you delete or overwrite the old shaders folder.

Same as before: a flashing brownish screen with a flare.

I forgot to mention that I am using Vista (with the latest Catalyst drivers).


Genexi2(Posted 2007) [#45]
Running in windowed mode here, if that's intentional. Anyway...

V-Sync 1: 38 FPS
V-Sync 0: 52-60ish

One weird anomaly I've spotted, though: the right portion of the screen gets skewed for some reason:


Anyway, system specs -

AMD64 3600+ (2.21GHz)
2GB DDR RAM
GeForce 6800 GT (256MB)


[EDIT] Forgot to mention this isn't with the recently released nVidia drivers (weird that they changed the numbering system on the new ones); I'll try again with them afterwards.


JoshK(Posted 2007) [#46]
Once you update your drivers, the clamping will be fixed, but the screen will be blue. I love developing graphics. :\


Matthew Smith(Posted 2007) [#47]
Josh,

Updated to the latest ATI drivers - running a 9600SE - and get 1 FPS, a black screen and a moving spot.

Here is my shader log:




_33(Posted 2007) [#48]
Matt Smith, the 9600SE is not up to the standard the engine is looking for. The engine test here seems to be developed to use features that only a Shader Model 3 graphics card can provide, so the 9600SE is too old for this.

NOTE: Even my quite capable 500MHz R480-powered X800 series graphics card (16 shader pipelines - 48 in actual fact - SM 2.1 standard) with 256MB of GDDR3 isn't supported...


Matthew Smith(Posted 2007) [#49]
@ _33 - Thanks!


CodeOrc(Posted 2007) [#50]
1 FPS - but still very nice looking.

Nvidia GeForce 6800
2.6GHz processor
1GB RAM

Yes, I know, I have a lame system :)


JoshK(Posted 2007) [#51]
This one uses lightmaps with bumpmaps. It should be very fast:
www.leadwerks.com/post/fastbumps.zip


Amon(Posted 2007) [#52]
I know I said I'd stick with TrueVision, but I've been messing about with LE and it is a nice engine.

I get 86 FPS with the bumpmaps demo.

Is there no way at all to optimise the engine so that it works better with older hardware? Maybe have two versions?


JoshK(Posted 2007) [#53]
I think that is what we are going to do...finish up the HL2-style renderer now and work on the crazy-high-tech stuff in the background until it is ready.

The old renderer is very nearly finished, so it makes sense to complete that and have a nice finished engine out there.


Tachyon(Posted 2007) [#54]
I find these Leadwerks Engine posts fascinating, but why is it so hard to find any info on it? The website gives you the option to purchase it, but I'd certainly need to see how it actually integrates with BlitzMax, view some sample code showing how to put together a small demo using shaders and physics, find out what the licensing limitations are, etc.


John Blackledge(Posted 2007) [#55]
The window titlebar and border appear (no contents), then it immediately shuts down.

Log says:

Compiling vertex shader object "shaders\surface.vse"...
Linking shader program...
Failed to link shader program.

Vertex info
-----------
(3) : error C5041: cannot locate suitable resource to bind parameter "N"
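
For context, a log like that is what the standard GLSL link check produces. A minimal sketch of that check, assuming the GL 2.0 entry points have already been loaded (the underscored pointer names are hypothetical, and this is not necessarily the engine's code):

#include <GL/gl.h>
#include <GL/glext.h>
#include <cstdio>

// Hypothetical already-loaded GL 2.0 function pointers.
extern PFNGLGETPROGRAMIVPROC      glGetProgramiv_;
extern PFNGLGETPROGRAMINFOLOGPROC glGetProgramInfoLog_;

// After glLinkProgram, query the status and dump the driver's log.
// On hardware that cannot fit the shader (such as an SM 1.x GeForce4),
// the driver reports errors like the C5041 one quoted above.
bool CheckLink(GLuint program)
{
    GLint linked = 0;
    glGetProgramiv_(program, GL_LINK_STATUS, &linked);
    if (linked)
        return true;
    char log[4096] = {0};
    glGetProgramInfoLog_(program, sizeof(log) - 1, 0, log);
    fprintf(stderr, "Failed to link shader program.\n%s\n", log);
    return false;
}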


Filax(Posted 2007) [#56]
Impressive shot, Josh! As always :) Can't wait for the next engine update ;)


JoshK(Posted 2007) [#57]
John Blackledge, without knowing your gfx card, that information means nothing.


John Blackledge(Posted 2007) [#58]
Oh, sorry.
I thought the important thing might be to trace through your code and find out when and why this happens, rather than knowing a specific card.

Pentium 4 1.60GHz, 768MB RAM, nVidia GeForce4 Ti4200, DirectX 9.


xlsior(Posted 2007) [#59]
This one uses lightmaps with bumpmaps. It should be very fast:


I'm getting a rock-solid 60 FPS in that one, which is the same as my monitor's refresh rate. Any way of turning off the VSync?

Radeon X1650XT


Chalky(Posted 2007) [#60]
Engine.exe:
VSync 1 = 30 FPS
VSync 0 = 37 FPS

Engine_640x480.exe:
VSync 1 = 60 FPS
VSync 0 = 87 FPS

The whole scene was wishy-washy purple in both cases.

Pentium 4 2.8GHz, 1GB RAM, nVidia 6600GT (latest drivers), DirectX 9.0c (latest)


JoshK(Posted 2007) [#61]
Hahaha, you are using a GeForce4?


bytecode77(Posted 2007) [#62]
That's all I get to see:

(running at 2 FPS maximum)

Both executables show the same result.
Also, lots of your previous engine demos were not running properly or weren't running at all...


JoshK(Posted 2007) [#63]
I've decided to finish up the old renderer. It is nearly done, so I might as well complete it. It seems the new renderer will probably require a GeForce 8800, and most of the feedback I have gotten is that people just want something that runs fast on their 3-year-old card. I might as well tie up the remaining loose ends, put the finished engine out there, and let it get used for a few years.

You should consider the recent demos a preview of things to come. Call it Leadwerks Engine 2.x, Pro, Advanced, whatever. It won't be out for a while, and it will require a good graphics card.


So to reiterate:

Leadwerks Engine 1.x
-Half-Life 2-style renderer
-Bumpmaps, specular, water, HDR, etc.
-Runs fast on low-end machines (probably SM 2.0)
-New patch soon

Leadwerks Engine 2.x, Pro, Advanced, whatever
-Unreal 3-style renderer
-Shadow mapping, unified lighting, deferred rendering
-Not available for a while
-Probably requires an 8800 (or higher). And I don't care if you can't afford that.


Amon(Posted 2007) [#64]
Perfect.

Don't give up on the advanced Unreal-style renderer, because it does make things look lush. What people do want, though, is something they can develop in and end up with a product that works on lots of hardware, old and new.

I personally would be happy with an HL2 renderer.

Once you've finally nailed the problems with the Unreal-style renderer, will there be an upgrade path to LE version 2.0? Will we get it for free?


bytecode77(Posted 2007) [#65]
I would like to buy this, but I won't - simply because 75% of your demos didn't work on my high-end PC.


puki(Posted 2007) [#66]
I want the Unreal 3-style renderer.


JoshK(Posted 2007) [#67]
There aren't really any problems with the deferred renderer... it just requires a GeForce 7 or 8.


bytecode77(Posted 2007) [#68]
I have:

- ATI Sapphire X1900GT
- 2x 1GB DDR2 RAM
- AMD 2400 dual core

What else do I have to have?

You can't really tell me that your engine doesn't work with ATI cards, or what?


puki(Posted 2007) [#69]
Well, an nVidia card for starters.


Canardian(Posted 2007) [#70]
@Devils Child: if you mean the test demos, they were not supposed to work on all systems (he even said in one test that X800 users shouldn't bother trying); the whole point was to gather information on which functions didn't work on which graphics cards. Now Josh has made a very smart move: there will be a new 1.x version in parallel with a new 2.x version, just like with TGE and TGEA. The lower version works with all old systems and graphics cards while still delivering the best possible graphics quality and speed, while the higher version is like Windows Vista: you need to upgrade your hardware, but then you get the best graphics quality and speed possible nowadays (the speed part doesn't apply to Vista, though, until SP1 or even SP2 comes out).


John Blackledge(Posted 2007) [#71]
Hahaha, you are using a GeForce4?

What people do want, though, is something they can develop in and end up with a product that works on lots of hardware, old and new.

75% of your demos didn't work on my high-end PC.



taumel(Posted 2007) [#72]
ATI's OpenGL implementation can be a nightmare on XP (Vista is a bit better) and is slower than DirectX. If you're no masochist and want to do some shader stuff, I suggest using DirectX instead...


_33(Posted 2007) [#73]
ATI's X1x00 series - namely the X1300, X1800, X1900 and X1950 - runs OpenGL as fast as the nVidia equivalents of the same period, meaning the 7300, 7600, 7800 and 7900. The reason is the 512-bit ring-bus memory architecture and graphics driver optimisations. These cards should be supported, as they are as numerous on the market as the equivalent nVidia cards, and they do run Shader Model 3.


taumel(Posted 2007) [#74]
Obviously you've never programmed them on your own...


JoshK(Posted 2007) [#75]
It is impossible to design an engine that scales performance between high-end cards and low-end cards that are 32 times slower. Thus, I am finishing up the old renderer, and when we move on to Engine 2.x the requirement will probably be a GeForce 8 series card.

BTW, the X1550 I have been testing with has errors with basic GLSL functionality. The drivers are really bad. I have not tested their newer cards yet, but judging by what I have seen so far, I do not recommend using an ATI card for any OpenGL app. Apparently the guy who wrote C4 has said the same thing; I have run into a lot of the problems he mentions.


_33(Posted 2007) [#76]
Leadwerks, in that case you should get nVidia to sponsor your products and leave ATI alone for good ;). I mean, put nVidia ads on your website... *** MADE FOR NVIDIA ***

From what I read, the X1550 card is the same GPU as the low-cost X1300 model.


Defoc8(Posted 2007) [#77]
"It is impossible to design an engine that scales performance between high-end cards and low-end cards that are 32 times slower. "

hmmmm....this must be were "impossible" means, "i cant be arsed!" ;)


Chroma(Posted 2007) [#78]
This test gives me a black screen and locks up my comp. I have an ATI X800 Pro card, BTW.


Rob Farley(Posted 2007) [#79]
Chroma... I think you need to read the first post! ;)
This will not run on an ATI X800, GeForce 5 series, or below.

Personally, I have no beef with an engine only working on cards of a certain calibre. The biggest problem you've got with this approach is cutting off your audience, but that's been going on for years with minimum specs, so I don't see a problem with saying you have to have a decent gfx card.


taumel(Posted 2007) [#80]
That's what I think too.

Almost no one will use features that only run reasonably on nVidia 8800 cards, unless you have a defined target platform (a kiosk system, for instance) or are doing research on your own.


Chroma(Posted 2007) [#81]
Crap. Well, I'm not upgrading my graphics card, so I guess Leadwerks Engine isn't an option anymore.


Rob Farley(Posted 2007) [#82]
The other thing to remember, of course, is that it'll probably take you a good year or so to write a game after you've got the engine, so these graphics cards will be much more commonplace by the time your game is ready to ship.


Gabriel(Posted 2007) [#83]
Considering nVidia - across every card - only has 33% (I was generous and rounded up) of the market now, I can't see "GeForce 8 series and above only" being a very high percentage of the market, even a year from now.


_33(Posted 2007) [#84]
What Leadwerks is building is a fantastic engine with fantastic graphics capabilities. But what I do not understand is focusing on 8800 cards when there are so many Shader Model 3 cards out there that would probably run these demos. Fast or slow, it doesn't matter what speed a card runs the demo at, as long as it's compatible - that's how I see it in my book. If users want to run the game (demo/test) on their hardware, all they have to do is limit the resolution to 1024x768 and turn off antialiasing and anisotropic filtering.

I mean, even if the said card is 32 times slower (in Leadwerks' words), these cards belong to the majority of potential customers for ANY game on the market. They shouldn't be discarded, as the technology is practically the same, just slower (which is what is suggested). It's very different from the days they shipped MX cards, which were cut-down, previous-generation cards. Today, and I could be wrong, all they do is cut the number of shader engines and cut memory performance and power consumption, but that's generally the idea.


Wings(Posted 2007) [#85]
The ATI X800 Pro can run Oblivion, though not with all features enabled. Mine is about 3 years old.

You're gonna miss many players.

Do you know why it doesn't support the ATI X800? Is it too many surfaces? (That's one weak point of that graphics card.)


Wings(Posted 2007) [#86]
My girlfriend's laptop:
nVidia GeForce Go 7300, got 15 FPS.. everything works. :)


JoshK(Posted 2007) [#87]
We have put off the deferred renderer for a future version, probably not aimed at indies. We have learned that only about 5% of our current market has the hardware for this, so we would have to charge 20 times as much just to make the same amount of money.

The deferred renderer will not be available for some time, and it will probably require OpenGL 3.0 and a Shader Model 4 card. The features of SM 4.0 do not scale at all to cards that are 30 times slower. There's no way to "scale" the fact that ATI doesn't support a required FBO format, or that the GeForce 7 series does not support texture arrays.

We are adding features to the older renderer and that will continue to be used in Leadwerks Engine 1.x.
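
As an aside, the kind of hard capability gap described above is typically probed at startup with extension checks. A rough sketch: the extension names are real OpenGL extensions, but treating GL_ARB_texture_float as the "required FBO format" is only an assumption, since the post doesn't name it:

#include <GL/gl.h>
#include <cstring>

// Naive substring match on the extension string - the common 2007-era
// idiom (it can false-match on prefixes, which a more careful
// tokenizing check would avoid).
static bool HasExtension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts && strstr(exts, name) != 0;
}

// Illustrative requirements only: GL_EXT_texture_array is the
// texture-array feature the GeForce 7 lacks; GL_ARB_texture_float
// stands in for the unnamed FBO format requirement.
bool SupportsDeferredPath()
{
    return HasExtension("GL_EXT_texture_array")
        && HasExtension("GL_EXT_framebuffer_object")
        && HasExtension("GL_ARB_texture_float");
}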


puki(Posted 2007) [#88]
Sniff.


taumel(Posted 2007) [#89]
Hmm, I think I've never before witnessed someone trying to develop a commercial 3D engine, meant to be sold, without looking reasonably at what it takes to get it running on more than the developer's own system specs.

I also wonder how much success you'll have making it an OpenGL 3 and SM4-only thing in the future, because a) SM2 and SM3 cards are widely spread, and b) due to the wide gap in performance, the mid-to-high SM2/SM3 cards outperform the mid-to-low SM4 sector. The performance issues do not only show up between different generations; they are primarily an issue within each generation (8400GS vs 8800 Ultra).

Furthermore, the success of OpenGL 3 heavily depends on the vendors' driver support (availability, feature-completeness, stability, speed). There is no question that DirectX will have the best support in the future as well. If the vendors put the same effort into OpenGL 3 as they have until now, then it will be a rather dead product. Maybe Apple's success is some motivation for making better drivers for all systems.


Naughty Alien(Posted 2007) [#90]
..by constantly changing direction, you're telling me that you guys actually aren't sure where to go.. I hope I'm wrong..