Leadwerks Engine 2 Demo 1

Community Forums/Showcase/Leadwerks Engine 2 Demo 1

JoshK(Posted 2008) [#1]
Minimum requirement is a Shader Model 3.0 GPU. A GeForce 8800 GTS is recommended.
http://www.leadwerks.com/ccount/click.php?id=49




GfK(Posted 2008) [#2]
Neither of the EXEs work.

Archipelago.exe = displays "loading" then exits to desktop, no error.

Minimal.exe = "GLSL 1.20 is not supported. Please update your graphics drivers or replace your hardware." o.O


JoshK(Posted 2008) [#3]
What GPU do you have?


GfK(Posted 2008) [#4]
Intel GMA X3100, latest drivers.


JoshK(Posted 2008) [#5]
Your hardware is not supported. A graphics card is required.


GfK(Posted 2008) [#6]
Your hardware is not supported.
Yeah, I worked that out by myself.

Your post didn't mention any hardware requirements when I downloaded it.


Wiebo(Posted 2008) [#7]
It works here, on my AMD 3200+, Win XP sp2, 2gb ram, with geforce 7600, 256mb. Not a top rig, but hey.

30fps on low settings, 1280x1024. 15fps on all high settings, 800x600.

Things look really nice with all settings up :) kinda like having a hangover on a tropical island or something...


puki(Posted 2008) [#8]
I just ran Archipelago with highest settings on my 8800GTX - it ran fine.

I ran the Minimal one - I only saw a spinning cube.


JoshK(Posted 2008) [#9]
The minimal example just shows what you need to make a simple app. The point of it is mostly so you can look at the included source code and see how it works.


boomboom(Posted 2008) [#10]
So are Intel graphics chips not going to be supported by this engine?


JoshK(Posted 2008) [#11]
No, unless they support SM 3.0. And even then they would likely be too slow.


boomboom(Posted 2008) [#12]
ok :)


Mortiis(Posted 2008) [#13]
I have an ATI Radeon X1950 Pro 512MB; it's still a powerful card and supports SM 3.0, but it couldn't run either of these.

I have the latest drivers.

Archipelago.exe
Failed to Link Shader Program
Fragment shader(s) failed to link, vertex shader(s) linked.
Fragment Shader not supported by HW


Minimal.exe
OpenGL extension GL_ARB_texture_non_power_of_two is not supported. Please update your graphics drivers or replace your hardware.



Gabriel(Posted 2008) [#14]
I have an ATI Radeon X1950 Pro 512MB; it's still a powerful card and supports SM 3.0, but it couldn't run either of these.

By any chance have you recently upgraded from an Nvidia card?


Mortiis(Posted 2008) [#15]
You could say that; I had my old card replaced with this one (it was the same model, but it was overheating, so they replaced it with a new one).

While I wasn't using it, I bought a cheap Nvidia card, an EN7200. But I deleted the Nvidia drivers and applications, then installed the ATI ones.


Doiron(Posted 2008) [#16]
I receive the following messages:

Archipelago: (This one is similar to Mortiis', except for the missing last line)
Failed to Link Shader Program
Fragment shader(s) failed to link, vertex shader(s) linked.


Minimal:
OpenGL extension GL_ARB_texture_non_power_of_two is not supported.
Please update your graphics drivers or replace your hardware.



Running an AMD X2 5200+, 3GB DDR2-800, ATI Radeon X1950 Pro 512MB, Windows Vista. Drivers updated to the latest version. Clean system, no conflicts or partially uninstalled drivers.

@Gabriel: never replaced hardware on my system


JoshK(Posted 2008) [#17]
The extension in question is an ARB extension, meaning that if your card claims to support OpenGL 2.0, it has to support it.

I wonder if ATI has their own version of the extension?

I have been thinking about switching to rectangle textures, which I think have somewhat wider support than non-power-of-two textures. We'll have to try that soon and see how it goes.
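For what it's worth, the fallback being weighed here can be sketched roughly as follows. The extension strings are the real OpenGL names; the `choose_buffer` helper itself is hypothetical, not Leadwerks code:

```python
# Render-buffer selection sketch: prefer true non-power-of-two textures,
# fall back to rectangle textures, and as a last resort round the buffer
# up to the next power of two (rendering into its lower-left corner).

def next_pow2(n):
    """Smallest power of two >= n."""
    p = 1
    while p < n:
        p *= 2
    return p

def choose_buffer(width, height, extensions):
    """Pick a texture target and size for a render buffer,
    given the set of supported OpenGL extension strings."""
    if "GL_ARB_texture_non_power_of_two" in extensions:
        return ("GL_TEXTURE_2D", width, height)             # exact size, normalized coords
    if "GL_ARB_texture_rectangle" in extensions:
        return ("GL_TEXTURE_RECTANGLE_ARB", width, height)  # exact size, texel coords
    # Neither extension: oversized power-of-two buffer.
    return ("GL_TEXTURE_2D", next_pow2(width), next_pow2(height))
```

On a card like the X1950 that rejects NPOT but accepts rectangle textures, this would take the middle branch; on hardware with neither, an 800x600 buffer pads out to 1024x1024.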


The r0nin(Posted 2008) [#18]
Really impressive! My only hesitation is that I just can't generate the art that would actually make full use of the engine. But I am quite impressed with the quality of the engine...


plash(Posted 2008) [#19]
Arch feels fast but runs around 15-20fps, and I'm seeing a really weird line through the shadow textures:

AMD 64 2.21ghz, 2.5gigs ram, ATI Radeon HD 2600 XT 512MB SM 3.0


JoshK(Posted 2008) [#20]
That line appears on the transition between cascade levels on ATI cards. I am not sure why this occurs, but it can be fixed by editing just one of the shader files. I will have to test it some more and see what I can do.
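The "cascade levels" here refer to cascaded shadow maps: the view frustum is sliced into a few depth ranges, each with its own shadow map, and the visible seam sits at a slice boundary. A common way to place the slices is the "practical split scheme" (Zhang et al.), which blends logarithmic and uniform spacing. This is a generic sketch of how cascades are usually placed, not Leadwerks' actual code:

```python
# Cascaded-shadow-map split placement: blend logarithmic splits
# (good resolution near the camera) with uniform splits (stable
# coverage far away).

def cascade_splits(near, far, count, lam=0.75):
    """Return the far plane of each cascade.
    lam=1.0 is fully logarithmic, lam=0.0 fully uniform."""
    splits = []
    for i in range(1, count + 1):
        f = i / count
        log_d = near * (far / near) ** f   # logarithmic split
        uni_d = near + (far - near) * f    # uniform split
        splits.append(lam * log_d + (1 - lam) * uni_d)
    return splits
```

The seam line people are seeing would fall at one of these split distances, where the renderer switches from one shadow map to the next.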


SabataRH(Posted 2008) [#21]
Nice demo, ran fine here with everything jacked up. 115 fps on average.
Shadowing looked good, cam controls were ultra smooth. Impressive, but I still think I'll wait on it, as compatibility issues seem enormous at this stage.

amd x2-dual5000's / 8800Gt 512mb


Gabriel(Posted 2008) [#22]
While I wasn't using it, I bought a cheap nvidia card, en7200. But I deleted nvidia drivers and applications then installed the ati ones.

Well I've seen that exact same error on the exact same card, and it's because Nvidia uninstallers don't remove all their files. If you try DriverCleaner or something like that, and follow all the instructions to the letter, you might well solve that, and fix yourself a few other issues. You're actually lucky by comparison because I had a machine which was crashing regularly playing games and I eventually found out that it had left files from four completely different driver versions on the machine.

WRT the demo, I ran it with the default settings and got 60FPS. I have an Nvidia card (8600GTS) and I also got the lines on the border between the cascaded shadow maps. I tried running it again with shadows set to high and I couldn't see the lines any more. I was getting about 50FPS with shadows on high.


Reactor(Posted 2008) [#23]
I had a quick look on my 7600GS. It ran quite well considering what was being thrown around. With everything on maximum at 1024 with no AA I managed 10-15fps. With the default mediums, I managed 15-20fps... meaning it has the same issue as all SM3 engines on cards like the 7600: while the level may have been dropped back graphically to look like a DX8 game, it runs much slower. That's not the engine's fault though, obviously.

The issues for me were: the look of the demo is quite bad (I'm looking forward to the indoor demo much more), some of the glow effects looked chunky (especially along the edges of plant leaves), and on the medium settings the shadow level-of-detail draw distance was set so close it looked incredibly ugly. If that can be pushed back without too much difficulty it's a non-issue, but... I wouldn't ever want to see that kind of LOD on shadow drawing in a game. It's just too noticeable.

Otherwise, nice work...


JoshK(Posted 2008) [#24]
If your card does not support non-power-of-two textures, please try this test app, which uses rectangle textures for the render buffers. I believe rectangle buffers may be more widely supported. This works on my ATI HD 3870:
http://www.leadwerks.com/post/TextureRectangleTest.zip

Extract the EXE to the demo directory. Water and post-processing effects will not be rendered. You will either be able to move around the scene, or you will get an empty/corrupted screen or, more likely, an error message.


Yahfree(Posted 2008) [#25]
Hey, nice demo and very high quality!

With everything maxed out I get an avg of 65-75 fps in the middle of it.
With the standard settings I get 122 fps avg.

Additionally, I don't know what the minimal app does, but it's just a spinning white cube for me.

My specs:

NVIDIA 9600GT, 512mb
P4 processor - 3GHz and 1GB RAM
Windows XP


ImaginaryHuman(Posted 2008) [#26]
Couldn't run this on my system; I have GLSL 1.1 but not 1.2.

What does your app do that requires 1.2?


N(Posted 2008) [#27]
What does your app do that requires 1.2?
Check the shaders.


plash(Posted 2008) [#28]
I get a slightly higher fps on that demo, no errors or anything.

EDIT: With everything on full, I get a fantastical 5-9fps

Here is a better image showing the shadow detail radius thingy:



JoshK(Posted 2008) [#29]
Plash, you mean the demo using rectangle textures?

Regarding the CSM border line, I am not too worried about that yet.


plash(Posted 2008) [#30]
Plash, you mean the demo using rectangle textures?
Yep.

If I do no shadows on Arch I get about 100fps.


JoshK(Posted 2008) [#31]
I have tested on the ATI HD 3--- and 2--- series, but not on the X1--- series. I ordered an X1550 for testing, which will be the 7th GPU I have lying around here.


mtnhome3d(Posted 2008) [#32]
The Archipelago demo won't open. It shows the menu with settings, then when I click OK it opens a window frame and then closes without further ado.
And the minimal one says that my hardware doesn't support non-power-of-two textures.


JoshK(Posted 2008) [#33]
What GPU?


mtnhome3d(Posted 2008) [#34]
nvidia geforce 7000m/ nforce 610m on laptop


JoshK(Posted 2008) [#35]
Well, that is theoretically an SM 3.0 card, according to the specs.

Do you want to do me a favor and download and run the test I posted above using rectangle textures? You have a very rare card, so that would help a lot.

http://www.leadwerks.com/post/TextureRectangleTest.zip

Extract the EXE to the demo directory. Water and post-processing effects will not be rendered. You will either be able to move around the scene, or you will get an empty/corrupted screen or, more likely, an error message.


mtnhome3d(Posted 2008) [#36]
OK, I tried it and it gave me an error that said "Failed to create world".


JoshK(Posted 2008) [#37]
Could you post the contents of the log file please?


mtnhome3d(Posted 2008) [#38]
Here's everything in the text file:
Warning: Failed to initialize Newton.



JoshK(Posted 2008) [#39]
You did not extract the exe to the directory the demo is in.


Paul "Taiphoz"(Posted 2008) [#40]
7600 GS -
Everything up full, I get 6 FPS.
With default settings I get 25.

Looks really nice, but it's not nearly compatible enough with enough systems to make it of value to me. Thanks for showing us a demo though; I was going to buy it.


nawi(Posted 2008) [#41]
Not even Crysis requires an SM 3.0 card. This seems to be a badly programmed graphics library.


Vorderman(Posted 2008) [#42]
Hmm, very slow - my work PC is a quad-core 3GHz Xeon with an 8800GTX, and it could only manage 25fps on medium, 10 to 15fps on high. Seems very slow for what is quite a small scene.


puki(Posted 2008) [#43]
I have an Intel dual core and an 8800GTX and was getting much higher frame rates.


Doiron(Posted 2008) [#44]
@JoshKlint: I've tried the "TextureRectangleTest.exe" demo (unzipped in the same folder as the older one) and the window simply hangs for 3-4 seconds, then closes down.


SabataRH(Posted 2008) [#45]

You did not extract the exe to the directory the demo is in.



And to think this is a forum supposedly full of programmers... tsk tsk.


Hmm, very slow - my work PC is a quad-core 3GHz Xeon with an 8800GTX, and it could only manage 25fps on medium, 10 to 15fps on high. Seems very slow for what is quite a small scene.



Something's seriously wrong with your rig, dude. Or do Intels drag that far behind AMDs? Your system is quite a bit more powerful than mine and I managed 115 fps with everything on. How odd.

But it does seem Leadwerks needs some better shadow management; that seems to be the crucial slowdown in most of these reports. Are shadows really required for such distant objects? Or is there a parameter that allows the coder to adjust the distance cues?

Regarding the CSM area, I would prefer that removed completely. The little circle around the camera that renders higher-quality shadows is just a noticeable distraction; I would much prefer the dull, blurred shadows everywhere if this is the case. It's just too noticeable. When I first ran the demo it was the FIRST thing I noticed. Not good.

- amd x2-dual5000's / 8800Gt 512mb / vista x64 (83% native)


boomboom(Posted 2008) [#46]
I have got a rig similar to Vorderman's and got the same sort of speeds as him. Maybe it's quad-cores?


Koriolis(Posted 2008) [#47]
Same here.


SabataRH(Posted 2008) [#48]
Odd, possibly some AMD optimizations in the engine core?
Dunno.


_33(Posted 2008) [#49]
I didn't notice the seams Plash saw on my system. But I remember that my old ATI X800-series card showed seams on cubemaps and in other situations.

Sooo, I'm getting 60 FPS average with everything set to max at 1600x1200, which is pretty good. I'm using an 8800GTS 512 (G92 GPU) with an AMD Opteron 165 at 2.8GHz and an Nforce 4 motherboard. But I was expecting good performance, considering that games like Crysis run flawlessly on my system.


M2PLAY(Posted 2008) [#50]
Mmmm, very, very slow!!!
Default configuration: 9 fps.
My configuration: Intel Core 2 Duo / 3GB / 8500GT 512MB / XP Pro.
Thanks for the demo Josh.


johnnyfreak(Posted 2008) [#51]
Doesn't work here: Intel Core Duo 1.85GHz, GeForce Go 7400.


mtnhome3d(Posted 2008) [#52]
OK, I tried it again. This time I put it in the correct folder and it errored out, so here's the log:
 Initializing Renderer...
OpenGL Version: 2.1.1
GLSL Version: 1.20 NVIDIA via Cg compiler
Render device: GeForce 7000M / nForce 610M/PCI/SSE2/3DNOW!
Vendor: NVIDIA Corporation
OpenGL extension GL_ARB_texture_rectangle is not supported.  Please update your graphics drivers or replace your hardware.



plash(Posted 2008) [#53]
It's not an SM 3.0 card then.


mtnhome3d(Posted 2008) [#54]
OK, I wasn't sure.


Mortiis(Posted 2008) [#55]
Here is my log using TextureRectangleTest.exe

It opens a window and hangs a minute, then quits without an error.

Initializing Renderer...
OpenGL Version: 2.1.7415 Release
GLSL Version: 1.20
Render device: Radeon X1950 Pro
Vendor: ATI Technologies Inc.
GPU instancing supported: 0
Shader model 4.0 supported: 0
Loading mesh "zip::c:/documents and settings/sarper soher/desktop/new folder/meshes.pak//meshes/skydome.obj"...
Loading material "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/skydome.mat"...
Loading texture "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/sky1.jpg"...
Loading texture "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/sand_01.jpg"...
Loading texture "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/clumpy_grass.jpg"...
Loading material "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/reflection.mat"...
Loading texture "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/water004_normal.tga"...
Warning: Failed to load texture "abstract::water02.tga": Path not found.
Loading shader "abstract::postfilter.vert", "abstract::postfilter_bloom.frag"...
Fragment shader was successfully compiled to run on hardware.
Vertex shader was successfully compiled to run on hardware.
Fragment shader(s) linked, vertex shader(s) linked.
Loading mesh "zip::c:/documents and settings/sarper soher/desktop/new folder/meshes.pak//meshes/obj__tree1.obj"...
Loading material "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/tree1.mat"...
Loading texture "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/tree1.bmp"...
Loading mesh "zip::c:/documents and settings/sarper soher/desktop/new folder/meshes.pak//meshes/tree_palm.gmf"...
Loading material "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/palmt3.mat"...
Loading texture "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/palmt3.tga"...
Loading mesh "zip::c:/documents and settings/sarper soher/desktop/new folder/meshes.pak//meshes/tree_banana.gmf"...
Loading material "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/banan2.mat"...
Loading texture "zip::c:/documents and settings/sarper soher/desktop/new folder/materials.pak//materials/banan2.tga"...
Loading shader "abstract::grass_shadow.vert", "abstract::mesh_shadow.frag"...
Fragment shader was successfully compiled to run on hardware.
Vertex shader was successfully compiled to run on hardware.
Fragment shader(s) linked, vertex shader(s) linked.
Loading shader "abstract::terrain.vert", "abstract::terrain.frag"...
Fragment shader was successfully compiled to run on hardware.
Vertex shader was successfully compiled to run on hardware.
Error: Failed to link shader program.
Fragment shader(s) failed to link,  vertex shader(s) linked. 
Fragment Shader not supported by HW



JoshK(Posted 2008) [#56]
It looks like the low-end ATI cards do support rectangle textures, so I can probably rely on those. It would be possible to use a pow2-sized texture for render buffers on systems that support neither extension, but those systems are likely to be so slow anyway that it is probably not worth bothering with. I don't know; it might be worth doing just to avoid problems with all the people with ancient laptops and those who refuse to ever update drivers.

As for the "Fragment Shader not supported by HW" message, I ordered a really low-end ATI card, and will test it myself so I can make it compatible with those cards.

To the person getting 9 FPS with an 8500, you have something really wrong with your computer, since a GeForce 7 series performs much better than that.


Vorderman(Posted 2008) [#57]
Doesn't work at all on my home PC (Core2Duo 1.8 + Radeon x1950 pro).


Mark Tiffany(Posted 2008) [#58]
Both run okay on my pc (spec in sig but 6600GT based), with two annoying minor exceptions:
- the text in the top left, which I assume shows FPS, is screwed.
- the mouse is 100% controlled by the demo, even if I minimise it, i.e. I have to close it to do anything else

Presumably both are just the fact it's a demo at this stage.

On default settings, I guesstimate about 10-15 fps initially, dropping to about 5-10 FPS if I head into some trees.

If I set max settings @ 1280x1024, it's about 4 FPS until I head into the trees. And. Then. It. Slows. Doooooooooooooooooooooowwnnn....

Silky smooth on bare minimum settings 640x480.

So not bad I guess for a comparatively old card.


clownhunter(Posted 2008) [#59]
I got ~50fps on full with my AMD Athlon X2 5200|3 Gigs RAM|8800GT-512MB|Vista.

For some reason I feel that's pretty slow, but meh, I didn't notice any lag so it's fine by me.

By any chance, do you still have that one demo with the 2000 barrels with full physics around? I'd like to see how that would perform on my computer. :D


JoshK(Posted 2008) [#60]
Define "screwed".


Mark Tiffany(Posted 2008) [#61]
Unreadable. There's clearly something it's trying to display, but all I see are some black triangles and squares. Will try to screen grab... have to re-download first...


Mark Tiffany(Posted 2008) [#62]
Screwed=

It does change as you move round, but something's preventing it from rendering correctly.


Tab(Posted 2008) [#63]
Works fine on my Pentium D 2.8 / Nvidia 7600GS / 2GB RAM / Win Vista.

Default Settings 640 -> 20 ~ 52 FPS
Maximum Settings 640 -> 11 ~ 30 FPS

Default Settings 800 -> 19 ~ 42 FPS
Maximum Settings 800 -> 8 ~ 15 FPS

Default Settings 1024 -> 15 ~ 22 FPS
Maximum Settings 1024 -> 9 ~ 15 FPS


KimoTech(Posted 2008) [#64]
I haven't tried the demo yet, but on such powerful computers, isn't that simple scene running pretty slow (10-30 FPS avg.)?
Because about 180,000 polys at 20 FPS on an 8800GT is pretty slow, even with soft shadows. That's slower than Crysis!


Reactor(Posted 2008) [#65]
This demo is much faster than Crysis on my 7600GS.


JoshK(Posted 2008) [#66]
I haven't tried the demo yet, but on such powerful computers, isn't that simple scene running pretty slow (10-30 FPS avg.)?
Because about 180,000 polys at 20 FPS on an 8800GT is pretty slow, even with soft shadows. That's slower than Crysis!

Those results are obviously not valid when people with much slower 7-series cards are getting much better performance.


plash(Posted 2008) [#67]
I think Windows has something to do with it (i.e. when I tested a particle demo, it ran a lot faster on a separate desktop with no other windows being drawn by the OS). You should compile fullscreen versions.


Reactor(Posted 2008) [#68]
What AA options do people have on when running the demo? No one seems to have mentioned anything about it...


KimoTech(Posted 2008) [#69]
Tried it out now. Is 8 FPS on my GF 8600GT not valid? 180,000 polys?


JoshK(Posted 2008) [#70]
I think you have something wrong with your computer, because a GeForce 7200 performs better than that.


Vorderman(Posted 2008) [#71]
You can't dismiss results just because the framerate is low; they are just as valid. Lots of people getting such low framerates on frankly ridiculously powerful computers would suggest that something else is wrong, and seeing as my PC runs Crysis, GRID, etc. just fine, I would suggest that the something lies within your engine.


boomboom(Posted 2008) [#72]
I get around 12 fps on a quad-core and a series 8 Nvidia card, but was able to play COD4 pretty well on it at much higher speeds.


puki(Posted 2008) [#73]
Personally, I think ATI users should switch to nVidia.

Quad-core users need to switch to dual-core - throw "Quadimodo" in the bin.

Get yourself an E6600 and an 8800 GTX and Large-it-up!


Paul "Taiphoz"(Posted 2008) [#74]
Oh, I don't think I mentioned:

3GB of RAM,
dual core,
and Vista Ultimate.

Tried it again and switched some drivers around, and it still sucks ass.


chwaga(Posted 2008) [#75]
Can this engine use .fx shaders?


chwaga(Posted 2008) [#76]
evil double posts...


AdrianT(Posted 2008) [#77]
Works fine here


Athlon 64 3000+, 2gb DDR ram, 256mb 7800GS OC (AGP)

With default settings I get 20-30fps
If I max everything I get 9-20 FPS

Both run at the default 1024x768 res.


taumel(Posted 2008) [#78]
Is there a video around to get a feel for how it looks and plays?

By the way, the X3100 is an SM3 gfx card, but with, uhm, "awesome" drivers, and its fillrate is quite limited; it's widespread in notebooks, anyway.


JoshK(Posted 2008) [#79]
ATI cards struggle with stuff like this because they don't have GPU instancing, so every single plant instance requires a draw call. However, I have actually seen high-end ATI hardware perform slightly faster in indoor scenes.

Can this engine use .fx shaders?

No, that is a DirectX format.

Today I tried performing the lighting in a deferred pass. So far the results indicate about a 200% performance gain, so that is certainly interesting. I will get that ATI card I ordered on Monday probably. Working on an ATI 3870 right now.
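The claim about instancing comes down to draw-call counts: without hardware instancing, every placed plant is its own draw call, while with instancing you pay roughly one call per unique mesh. A toy illustration (`draw_calls` is a hypothetical helper, not a Leadwerks API):

```python
# Why per-instance draw calls hurt dense foliage: the CPU overhead of
# submitting a call per object dwarfs the GPU work for small meshes.

def draw_calls(instances, hw_instancing):
    """instances: list of mesh names, one entry per placed object."""
    if hw_instancing:
        return len(set(instances))  # one instanced call per unique mesh
    return len(instances)           # one call per placed object

# Example scene: 5000 palms, 3000 banana trees, 40000 grass clumps.
scene = ["palm"] * 5000 + ["banana"] * 3000 + ["grass"] * 40000
```

With instancing that example scene is 3 draw calls; without it, 48,000 — which is the kind of gap being described between the two vendors' paths.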


WMSteadman(Posted 2008) [#80]
I got 40fps with a quad core and 2x GeForce 8800GTs SLI'd, with everything on the highest possible settings (except screen res, only 1280x1024, the highest my aging flatscreen monitor supports). I suspect my lower fps compared to my fellow GF8-series users is down to compatibility with SLI.

Bloom was a bit OTT, I felt.

When I moved out to sea, facing away from all the plants and everything, I got up to 120fps. My current project is very low-poly with no effects and it seems to be capped at 60fps, so I am wondering what I am doing wrong.

I have been viewing your previous screenshots and videos, and am incredibly impressed with this engine, especially the real-time shadows. Doom 3's attempt at this was appalling in my opinion; black polys do not mean realistic shadows, and I would sooner have had the option to turn them off. It really lets the game down; I can't play it for too long without getting depressed, yet other gory, miserable games do not seem to bother me.

Keep up the impressive work. Perhaps a demo demonstrating the realtime shadows would be cool to see. :)


WMSteadman(Posted 2008) [#81]
Yup, I turned off SLI and got 55-60fps. Not a massive jump, but closer to par with other cards of the same spec.


Reactor(Posted 2008) [#82]
WMSteadman, why are you so impressed? It's not as if Josh invented how to make these kinds of shadows.


KimoTech(Posted 2008) [#83]
I get around 12 fps on a quad-core and a series 8 Nvidia card, but was able to play COD4 pretty well on it at much higher speeds.


There you see Josh, i think there is a compitability problem between GF8 cards, Quad-cores and your engine?


WMSteadman(Posted 2008) [#84]
Reactor, it's not simply the type of shadows; it's the quality and speed they render at that impressed me.


Reactor(Posted 2008) [#85]
That's pretty average for SM3 cards. Doom 3's shadows are a completely different type which run much better on lower-end hardware. Carmack designed them for the tech of the day, and considering the card I could run Doom 3 with, I think he did a pretty good job.


AdrianT(Posted 2008) [#86]
Texture-based shadows still require quite a bit of work to get running properly. I'm waiting for Lina to set them up so we can use them in Flow; hopefully it's easy to adjust my shader materials to receive the texture shadows correctly.

Stencil shadows are great for low-end systems and lower-poly scenes, but they are getting a bit dated now. I've heard that with texture shadows there's a point where higher-poly scenes start to overtake the performance of stencils. Looking forward to finding out where that happens.


JoshK(Posted 2008) [#87]
Well, I have the results for deferred lighting:
A test scene with 8 point lights reveals a 400% speed increase with deferred lighting on a GeForce 8800. A GeForce 7-series card renders 6 deferred point lights at a rate of 90 frames per second, versus 27 FPS when forward rendering is used.
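A gain of that order is plausible if you count shading work: forward lighting runs every light on every rasterized fragment, overdrawn ones included, while deferred lighting rasterizes the geometry once and then runs each light only over screen pixels. A back-of-the-envelope model (my own estimate, not engine code; the 3x overdraw figure is an assumption):

```python
# Rough fragment-shading cost model for forward vs deferred lighting.

def forward_light_ops(rasterized_fragments, lights):
    # Forward: every fragment, overdraw included, is lit by every light,
    # even fragments that are later overwritten.
    return rasterized_fragments * lights

def deferred_light_ops(rasterized_fragments, screen_pixels, lights):
    # Deferred: one geometry pass writes surface attributes, then the
    # lighting pass runs once per screen pixel per light.
    return rasterized_fragments + screen_pixels * lights

pixels = 1024 * 768   # screen pixels at the demo's default resolution
frags = 3 * pixels    # rasterized fragments, assuming ~3x overdraw
```

With 6 lights this gives about 14.2M lighting operations forward versus about 7.1M deferred, and the gap widens further as overdraw or light count grows.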




plash(Posted 2008) [#88]
it's the quality and speed they render at that impressed me.
40fps is pretty fast for realtime shadows!![/sarcasm]


_33(Posted 2008) [#89]
From all the demos and screenshots I have seen Josh, this last one is the most impressive (taking into account the possibilities of this technique).


puki(Posted 2008) [#90]
eh?

I didn't see a download for the deferred lighting demo. I want it.


Ruz(Posted 2008) [#91]
The Archipelago demo runs OK on mine, about 12 fps maxed out,
running an 8600M GT 1280MB - 2GB RAM.
Bloom's a bit heavy :)
No graphic glitches apart from the slow frame rate.


plash(Posted 2008) [#92]
running an 8600M GT 1280MB
Wow. That doesn't sound right..


Ruz(Posted 2008) [#93]
Well, that's what it says on my laptop. It does say "up to" 1280MB, so I suppose it's part dedicated graphics memory, part borrowed from somewhere (probably my system memory).

http://www.acerdirect.co.uk/Acer_Aspire_9920G_Laptop_LX.AKE0U.004/version.asp#top


Reactor(Posted 2008) [#94]
A test scene with 8 point lights reveals a 400% speed increase with deferred lighting on a GEForce 8800.


And how does it do when the lighting isn't taking place in the world's most basic test scene? Do things like shader effects have an impact on the speed increase?


JoshK(Posted 2008) [#95]
Lighting is independent of scene complexity, because the light routine only processes the final screen buffer. The lighting routine will run at exactly the same speed no matter what is onscreen.


Gabriel(Posted 2008) [#96]
How do you handle alpha transparency when using deferred shading?


Azaratur(Posted 2008) [#97]
I tried all the options in the demo, and with maximum settings I get 35 fps with an old Nvidia 7600 (dual core 3GHz, 2GB RAM).
Have you tried installing the latest ForceWare (or ATI) drivers and 3DMark06?

I tried on another (clean) computer with an Nvidia 8800, and the fps was around 60-80.
In many cases I see more than 800,000 polys.

I don't know why it's not working well on your computers. I'll try a few more times on other computers and report my results.

Aza


plash(Posted 2008) [#98]
Fullscreen
Demos.

asap.


JoshK(Posted 2008) [#99]
How do you handle alpha transparency when using deferred shading?

That is a little tricky. Deferred rendering changes the nature of the renderer, eliminating old limitations but creating a few new ones. You won't be able to have a hallway of z-sorted glass planes like in older engines.

You can do a multiplicative blend for dirty windows with no problem. You can also make transparency the same way we do our refraction effects, by rendering the refracted surface in a second step. Another option is to render the alpha-blended glass with a regular forward lighting process in the glass shader. Another cheap technique is to dither the transparent surface so that only every other pixel is drawn.

So it's not really a problem, and there are several ways of handling it, but it isn't quite as "automatic" as forward rendering techniques.

I have added an optional secondary color buffer, and I am interested in having some objects write to that buffer, and then rendering it with a bloom effect. So the result would be a selective bloom that only occurs on some objects, so a bright light or a computer screen would emit a dynamic glow effect. I think this is something the STALKER renderer does.
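The "dither every other pixel" option mentioned above is usually called screen-door transparency: instead of blending, you discard fragments in a fixed pattern so that roughly an alpha-sized fraction of them survive. Sketched below with a 4x4 Bayer threshold matrix (a generic illustration of the technique, not the engine's implementation; in a real shader this test would end in a fragment discard):

```python
# Screen-door ("dithered") transparency: compare alpha against a
# repeating ordered-dither threshold pattern and keep or drop each
# fragment outright, so no depth sorting or blending is needed.

BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def keep_fragment(x, y, alpha):
    """True if the fragment at screen position (x, y) survives
    for the given alpha in [0, 1]."""
    threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0
    return alpha > threshold
```

At 50% alpha exactly half the pixels in each 4x4 tile survive, which reads as translucency from a distance; the trade-off is the visible stipple pattern up close.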


Jiffy(Posted 2008) [#100]
Up until now I was really looking forward to buying this engine. I seriously hope that you will consider a fallback procedure to accommodate older technology. People may run out, drop cash, and up their rig for Crysis or Unreal, but an indie game will probably be looking at a market of only a percentage of those people who already have "high-end" graphics cards (and know which ones they have), not a percentage of the total market of computer users. A little backward compatibility goes a long way.

I know this is a WIP to some degree, and congrats on what you've got so far, but 'waiting for' the world to catch up with technology really is a 'bent' way to program. You are programming for the future which is smart, but you're really thinning your market when you don't have to. After all- games made with this engine people will want to play- even if they don't look like what's 'on the tin'.

I know that fallback paths can double or triple your coding workload- but that is why people will shell out money for your engine. In any case, your website, price and features are good- I wish you luck.


HNPhan(Posted 2008) [#101]
I found out that setting the "Multi-display/Mixed-GPU acceleration" setting in the NVIDIA control panel to "Single display performance mode" made the speed jump from 15 to 60 fps...

I also fully support Josh's view on making it a non-compromising engine, since any games made now will probably be done in 2-3 years' time, so there's enough time for people to catch up.


Jiffy(Posted 2008) [#102]
Yeah, in 2-3 years a new crop of consoles will be out, which is where all the games are going to be, because of the lack of a good market on the PC due to all the "unsupportable" variations and crap.

Last I checked, compromise was usually seen as a good thing.

Hey, it's his engine; I wish him well. I was very interested in buying it. I just see development time and future sales better geared toward something that is "cutting edge" without cutting off its userbase. "Shiny" is great; "green" is better.


Reactor(Posted 2008) [#103]
I also fully support Josh's view on making it a non-compromising engine, since any games made now will probably be done in 2-3 years' time, so there's enough time for people to catch up...


If you're planning on selling a game, the best thing someone can do is capture as much of the market as possible.


HNPhan(Posted 2008) [#104]
I agree with what you guys are saying, but my needs are obviously different from yours.
Here's what I have so far with the engine. I need some optimization code.






plash(Posted 2008) [#105]
Yay! real grass!


JoshK(Posted 2008) [#106]
Those shadows and grass look pretty amazing.

Grass is probably the slowest thing you can render in the engine. I finished deferred point and spot lights, and am working on deferred directional lights now. I think this will have a really big impact on the framerate of your scene here, since the grass is being drawn in a lot of layers.


Reactor(Posted 2008) [#107]
Yay! real grass!


Gamer 'real', yes.


*(Posted 2008) [#108]
I would add variation on the trees TBH


HNPhan(Posted 2008) [#109]
Josh, you just made me very happy :)

I'll eventually make some trees for added variation, just not right now.


Naughty Alien(Posted 2008) [#110]
@ Josh
I see on your web site it says >>Leadwerks Engine 2.0 Is Now Available For Pre-Order<<

and below it is a button for purchase... so is it available for purchase or what?


KimoTech(Posted 2008) [#111]
Grass is probably the slowest thing you can render in the engine.

Doesn't your engine support instancing?


JoshK(Posted 2008) [#112]
Yes, but it still uses a lot of texture lookups when rendering the grass shadows, and the grass has a lot of overdraw, so the forward lighting gets calculated on many pixels that end up getting discarded. The deferred lighting only processes the final screen pixels.


KimoTech(Posted 2008) [#113]
Isn't it just one big shadow map for the scene or is it one shadow map per object?


JoshK(Posted 2008) [#114]
One big shadowmap, but it still has to render the scene three times for the lighting.

Normally shadow rendering doesn't run with a fragment program, but for masked shadows a texture lookup is required to discard fragments.

I think the "hardware requirements are too high" crowd is motivated by their own low-end hardware more than concern for their own end users.


Dreamora(Posted 2008) [#115]
1. Aside from maybe 5% of users, everyone has what you call "low end hardware" (even though 600 GT series cards are upper mid-range according to the official market segment naming).
2. Creating SM3 content is even harder. You should have realized that by now. That content is not optional anymore if you enforce crazy minimum requirements.

3. You set SM3.0 as the minimum requirement but don't properly qualify hardware for it. Given that the Intel X3100 is !!DX10 compliant!!, I would guess you are hiding your own errors behind a "your graphics hardware is just crap" attitude, to avoid diving into what you will have to dive into anyway: vendor-dependent extensions. You will have to use those in any case, even if you hate to, and you will have to do it for all 3 major GPU vendors, not only NVIDIA, at least if you intend to sell an engine that runs efficiently on all 3. Intel's X3100 might not be great at performance, but for simple outdoor scenes with little going on besides eye candy it should be more than enough to run in realtime. If not, then there is still lots of room left for optimization (which is actually needed on the Intel side, and quite well documented on their page as to what is efficient and what the right way to do it is).

4. As far as I can see, you aren't creating a game, but a technology. So you should stop assuming which target audience we intend to deliver our games to, and instead deliver technology that enables us to do so. We are not interested in tech-demo eye candy, and so far that's the only thing I've seen done with it: no real-world example showing it is actually usable in a real environment with stuff going on, etc.
If we were crazy for eye candy, we would prototype our games in the Crysis Sandbox / mod it, which is cheaper than a license for your engine, more performant and more stable, and approach potential investors with that instead of investing in your technology, because yours will cut off any possibility of approaching an online portal anyway, so we end up at an investor / publisher either way.


Reactor(Posted 2008) [#116]
I think the "hardware requirements are too high" crowd is motivated by their own low-end hardware more than concern for their own end users.


Since the "hardware requirements are too high" crowd are your potential customers, perhaps you should be concerned with what they're motivated by.


_33(Posted 2008) [#117]
Eventually, everyone on the Blitz forums wanting to make 3D games for the masses will have to realize that their games will likely be compared to console games on the Xbox and PlayStation, so aiming at the high end is a good thing. The PC has evolved and will keep evolving. There was a time when the Voodoo cards were hot, and so were Nvidia's TNT and ATI's Rage3D...


Doggie(Posted 2008) [#118]
Works great for me. 50fps on default setting and from 19 -25fps on highest.

No errors. High settings are a little bright but other than that.

Vista Home Premium
Intel(R)Core(TM)2Duo CPU E4500 @2.2ghz(2cpus),~2.2ghz
3070MB RAM
NVIDIA 8600 GT 256MB


JoshK(Posted 2008) [#119]
Since the "hardware requirements are too high" crowd are your potential customers

No they aren't.

The 8600 has 32 stream processors. The 8800 has 128, so it is roughly 4 times faster. The example scene does have a lot of overdraw with all those layers of leaves, so I think I can still speed the lighting up a lot. I will be working on deferred directional lights tomorrow. I am not sure what the speed gains will be yet, but with point and spot lights the speed gains were enormous.


Reactor(Posted 2008) [#120]
No no, you said that the wrong way around. You should have said, "I don't worry about it anymore, and because of that they aren't."

EDIT: Nice edit of your post there.

Anyway, I have no problem if you want to create an engine for one area of the market... just don't make comments about people who are right in saying your engine would be more suited to the market if its focus wasn't so narrow. They have valid reasons for not choosing Leadwerks, and it very likely has nothing to do with the mid to low-end cards in their own box.


JoshK(Posted 2008) [#121]
Good. You should not use this engine. Sales have been great and there are other people who like it. So go away and quit spamming this thread telling me I should be designing my engine differently.


Reactor(Posted 2008) [#122]
Dude, I'll gladly say goodbye to this thread. You have completely missed my point. Have fun in your own little personal utopia.


Baystep Productions(Posted 2008) [#123]
I think this is awesome. But I have sticker shock. So I shall not purchase. Plus I'm a B3D developer who is starting to do C++ graphical stuff. So I can find what I need for free. Cause I don't program for the money. But if this was free for personal/educational use I definitely would give it a shot. But my hobby doesn't require purchases in the hundreds.


Good stuff over-all. I haven't seen something that smooth looking in a long long time. Reminds me of Unreal Engine. (more sticker shock)


Tab(Posted 2008) [#124]
I can't understand why people complain about Josh's work... The engine is pretty nice, supports shaders, shadows and a lot of great stuff in an easy way... maybe it is a little slow, but this is the first release; in some weeks or months it can be better and faster.

From my point of view, it is a pretty good option for making a game aimed at SM 3.0+ graphics; it only needs a little more work...


chwaga(Posted 2008) [#125]
hmm... I'm considering buying this, but before I do, Josh, you'd probably get a bunch of people buying if you made a mini game demo to demonstrate the engine's power. I'm sure the people on your forums would do it in a contest if the best game got a free copy of 3DWS or whatever.


JustLuke(Posted 2008) [#126]
I can't understand why people complain about Josh's work...

I think that it's as much to do with his personality and attitude as it is the quality of his work (which seems to be pretty impressive if you've got high-spec hardware).


AdrianT(Posted 2008) [#127]
How many Blitzers ever actually finish anything? Most seem to make tech demos at best, and many only use it for fun and experimentation and simply want cutting-edge tech to play with. Meanwhile, others aren't interested in supporting low-spec systems and want to release something that will play well a couple of years from now, when PS 3.0 runs well on budget hardware.

All these people can buy the engine and still get value for money.


Dreamora(Posted 2008) [#128]
The main problem with the "in a few years" thinking is that OpenGL 3 is the first time the new OpenGL is not just an extension of the old one but breaks with some of its core aspects.
You cannot develop now with an engine that will need 4 years to work usefully on mainstream systems, at least not if it uses the current-generation API.

You wouldn't go for DX9.0c with overdone effects either, hoping to see them working on mainstream hardware in 4 years; you would go with DX10, since that is what GPUs are being designed for, not DX9 anymore.


JoshK(Posted 2008) [#129]
You cannot develop now with an engine that will need 4 years to work usefully on mainstream systems, at least not if it uses the current-generation API.

You are right. This engine is terrible. I have no idea what I am doing, and you should not buy it.

Now please go away since this obviously does not meet your standards.


nawi(Posted 2008) [#130]
I wonder why anyone buys stuff from person like you JoshKlint.


Tab(Posted 2008) [#131]
I think that it's as much to do with his personality and attitude as it is the quality of his work (which seems to be pretty impressive if you've got high-spec hardware).

But you buy his engine, not his personality or attitude.

Anyway... I don't like the polemics... so... I'm leaving this thread.

Keep up the work, Josh.


nawi(Posted 2008) [#132]
But you buy his engine, not his personality or attitude.


Even if a store has high quality goods, I'm not going to shop there if they are rude or something.


_33(Posted 2008) [#133]
I think Josh broadened his horizons in the last few months. He received ATI graphics cards, and he seemed to indicate that his engine will run on all SM 3 graphics cards. I think it's very good news! It is a next-gen engine after all. Why all the arguing about supporting onboard graphics and 4+ year old graphics cards?

I was arguing this matter before, as I was using an X800 graphics card. But at some point, I had to face the fact that most newer games don't play on that card (ie. Crysis, BioShock, Splinter Cell: Double Agent, Call of Duty 4, DiRT, Rainbow Six: Vegas, Jericho, Gears of War, Grid, ... etc etc). I still remember the time when all gaming sites were praising the visuals of HDR enabled games. Again, HDR is SM 3.0+....

Next gen graphics require a decent card, so the guys that can't deal with that are facing a nice problem, that is obviously not Josh's.


nawi(Posted 2008) [#134]
This engine is not as advanced as the engine used in Crysis, but has higher requirements, therefore it is not really well programmed. It is just lazy to code support only for his own latest gfx card. There is no real justification for SM3.0 requirement.


_33(Posted 2008) [#135]
This engine is not as advanced as the engine used in Crysis, but has higher requirements, therefore it is not really well programmed. It is just lazy to code support only for his own latest gfx card. There is no real justification for SM3.0 requirement.

When I had my X800 card, I tried the Crysis demo, and it crashed after 30 minutes of gameplay, and never was able to display the graphics properly afterwards. You are right, Josh doesn't have to limit his engine to SM 3.0 ... And what are you gonna do about it? Have you tried the demo 1 (which is the subject of this thread, btw) ? Why be so harsh on Josh in the first place?

Let me quote you 1 week ago:
Not even Crysis require a SM 3.0 card. This seems to be a badly programmed graphics library.

Don't you consider yourself rude right there?


Vorderman(Posted 2008) [#136]
Seems fair enough to me - Crysis runs well on my PC, but this techdemo won't run at all. Crysis runs well on my work PC whereas this techdemo runs at a very low framerate.

The fact that I can play HL2, CoD4, Dirt, GRID, MoH Airborne etc... all perfectly well on my home PC would suggest that this engine is very inefficient, and with such ridiculous minimum system requirements who is going to use it? Most potential customers couldn't even run a game made with it.


JoshK(Posted 2008) [#137]
This engine is not as advanced as the engine used in Crysis, but has higher requirements, therefore it is not really well programmed. It is just lazy to code support only for his own latest gfx card. There is no real justification for SM3.0 requirement.

I agree. This engine is not suitable for your needs. So why are you still here? You keep threatening not to buy it. I don't want you to buy it. Go away.

This engine is aimed at an SM 4.0 GPU with 128 stream processors. Anything less than that will require settings to be scaled down. I don't need to convince you that you need it. I have lots of other people I am working with who like it a lot.


N(Posted 2008) [#138]
Most of you seem to be very confused by the idea that maybe you're not the target audience. Josh made a decent engine and has targeted a specific hardware range. This hardware range is not your hardware range, so what else is there to discuss? Go away. I was much more interested in reading the useful information in this thread, and the rest of you are just being annoying.


taumel(Posted 2008) [#139]
From the impressions you get reading through this thread, as well as the threads on the Leadwerks forum, I would say it seems to be the way Evak already pointed out.

The target audience of the engine is people who like to play around with a current-technology engine. Those are also the ones who don't mind buying a decent mid/upper-range gfx card if they don't own one already. But several things count against the engine for anyone heading towards a professional release: its early status (there still seems to be a lot of work to be done), the relatively high hardware requirements for today, the lack of fallbacks/workarounds for older or even common hardware like the X3100, missing optimisations, concept changes, and simply the way he deals with his engine and the situation.

Maybe this will change in the future, but at the moment I don't see how someone could seriously consider the engine for a stable commercial release, for instance.

But on the other hand, that's fine too. That's what he offers and what you get, and if you're fine with it, then why not.


Jiffy(Posted 2008) [#140]
Some of you seem to be very confused by the idea that discussion forums will attract discussion. Yes, it is confusing to some that a vendor with a product wants to sell it only to a specific niche of customers. People might mistakenly think he would want more, and would be surprised that he would rather be rude than understanding. But life is about experiences of all types, both expected and unexpected.

This is not your forum. If you do not like people participating in threads with legitimate, albeit peripheral questions/comments, then it is you who should go away. The thread creator does not own it. The usual assumption is that threads are created to facilitate discussion. Learn from this and post to your worklogs from now on- you can completely control the environment there.

No one can ask 'annoying' questions about your product and how it can (or in this case, cannot) run on their systems, or stupidly try to 'help' by stating what they think might be improvements (like it being able to run on more 'other people's' systems), you know, so that the license to use your engine eventually becomes profitable. That you consider this a 'problem' reflects on you more than on the 'annoying' people who should go away.

Be clever. Be rude. Make your money. I'm sure you're doing fine. Just hope no one else figures out that good fps and great gfx on a new tech gfx card reflects the gfx card, not your tiny little, minimally interactive test environments.

But most of all know that you're doing things right and people like me who don't pay for your friendship can kiss your butt. Well, you've lost a customer- oh, right- I'm not a customer till I've made a purchase. Gotta remember that- well 'potential sales' are down- but don't worry about that- if you can't spend it, it doesn't matter.

Despite all that, I still wish you luck. You're an obviously sharp coder. Make tons of sales- that is what matters to you, right? Not the annoying 'customers'...

Later.


Moraldi(Posted 2008) [#141]
Jiffy: I totally agree with you


IPete2(Posted 2008) [#142]
I'd just like to ask - of all those who are 'complaining here' -

How many of you have ever written, from scratch,

(a) Their own game engine and shown it publically - even given it away for free so others can learn?

(b) Finished writing a commercial game and found a willing publisher?

(c) Made numerous, fully commercial tools for the development of indie games?

(d) Just messed about with something like B3d or BMAX to make a little, yet highly impressive, fully finished demo?

To those who honestly answered 'Well, er, no, but I'm gonna...', I say this:

Josh doesn't talk about stuff he is 'gonna do'; by the time he opens his mouth, it is normally already a reality. Unlike most people on this thread, who just like a good 'belly ache', Josh just gets on with it, driving onwards. He has his own 'gps' and his own journey and his own agenda, but he IS in the race, in control and moving at speed.

If you wanna 'drive' - go get a car and pass your test, otherwise be a passenger, or worse be a bystander. I see a few passengers on this thread, but most are bystanders.

IPete2.
Team Driver.

PS for those with no car yet, this is only one of many you can learn to drive; look around and find the one that suits your driving best.


JoshK(Posted 2008) [#143]
Deferred lighting is implemented, and will be included in version 2.1. It looks like deferred directional lights yield about a 200-300% performance improvement. Regardless of what is rendered onscreen, the lighting routine takes the same amount of time to render, so the scene usually becomes limited by the vertex pipeline rate, which tends to be quite fast on modern hardware:



Wiebo(Posted 2008) [#144]
It runs like crap on my machine but I thought Josh would like to know that is does run though. I have no use for this engine, I am not the target audience, but I sure like what I am seeing and Josh has done a good job with it. Congrats.


Hotshot2005(Posted 2008) [#145]
Impressive screenshot, JoshKlint.


_33(Posted 2008) [#146]
Indeed!


Tab(Posted 2008) [#147]
OMG... 1.174.421 polys and 35 FPS xD


fredborg(Posted 2008) [#148]
It seems to work fine on both the machines in my signature... Well done!


plash(Posted 2008) [#149]
.exe for last screenshot plx.

P.S. Why do you keep changing your name back to Leadwerks?


Zetto(Posted 2008) [#150]
demo runs at ~8FPS on:

e8400 @ 5ghz (100% stable. great chip, BUY ONE!)
SLI 9800GTX
4gb ram
vista ultimate
latest drivers

Not impressed.


plash(Posted 2008) [#151]
rofl.


JoshK(Posted 2008) [#152]
SLI has problems on Vista. Set the "Multi-display/Mixed-GPU acceleration" setting in the NVIDIA control panel to "Single display performance mode".


Zetto(Posted 2008) [#153]
Didn't make a difference at all. It's not my hardware; I can run ANY game at 1920x1200 with full everything and 16x QSAA, no problem. This is the first thing I've seen lag on my computer since I put it together.


Robert Cummings(Posted 2008) [#154]
200fps here with all settings maxed 1280x1024

intel quad core with 8800GT card


*(Posted 2008) [#155]
Now that I have a shiny new HD2600 Shader Model 4 card I can test it 8)

Archipelago comes up with "Loading" then quits; Minimal tells me GLSL 1.20 isn't supported, try changing driver or hardware. I have the latest drivers for my ATI Radeon HD2600 graphics card, and it has support for Shader Model 4 etc.

I can run Bioshock, UT3, and Oblivion with the settings all set to max, but can't run ya engine :s


JoshK(Posted 2008) [#156]
The 2600 isn't actually an SM4 card, but it should be able to run the engine. If GLSL 1.20 is not supported you do not have recent drivers.


_33(Posted 2008) [#157]
The 2600 isn't actually an SM4 card

http://ati.amd.com/products/radeonhd2600/specs.html

Full support for Microsoft® DirectX® 10.0

* Shader Model 4.0
* Geometry Shaders
* Stream Output
* Integer and Bitwise Operations
* Alpha to Coverage
* Constant Buffers
* State Objects
* Texture Arrays



ATI doesn't suck as much as one might think!


JoshK(Posted 2008) [#158]
No ATI card supports the OpenGL Shader Model 4 extension.


KimoTech(Posted 2008) [#159]
Josh, I just think people want to say that maybe you could try to optimize your engine a little bit, and maybe make some smaller shaders, so computers with SM2.0 hardware can run it too. I'm not trying to bring your project down; I think it is good work. But as a suggestion, I agree with people that your engine is a little lazy in performance. I hope you understand.


*(Posted 2008) [#160]
Got it to work; it seemed to be a graphics issue. After a restart of the computer it worked OK. I can run it with everything maxed out at 30fps :)


JoshK(Posted 2008) [#161]
maybe make some smaller shaders, so computers with SM2.0 hardware can run it too.

It would be easier for you to shell out 32 bucks for a GEForce 8 series card:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127354


KimoTech(Posted 2008) [#162]
Josh, do you know what the problem is here?

As every middleware developer should know, compatibility is not so important for the developer, but it is for the end consumer of the game!

I mean, I've already got a GeForce 8 card, but if I sell my game and it requires an SM3.0 or higher GeForce/ATI card, and people don't have such new hardware, they won't buy my game.

So yes, it would be easier for me, but if I, for example, sell 4000 copies of a game, $32 * 4000 is a lot of money, and some people won't buy new hardware because of one game. So it is important for you, and for me as a game developer, that the engine has good compatibility.

That's also why Blitz3D is so popular and still used: it runs on almost any hardware today!

Do you see my point, Josh? I am only trying to help you. :)


JoshK(Posted 2008) [#163]
You should use Blitz3D. Fixed-function pipeline rendering is clearly better.


KimoTech(Posted 2008) [#164]
I have used Blitz3D, but I need an engine with better performance, shader support, and support for the new SM4.0 techniques. But still, I want my game to run on an "old" GeForce 4 too (SM2.0).

Josh, what are you so angry about? Why do you tell people that they have to find another engine? Why don't you listen to them and improve your engine, to sell more of it?


JoshK(Posted 2008) [#165]
Because SM 2 cards lack support for simple fundamental things like reading the depth from a texture, but you think it is just a matter of disabling a few effects.


KimoTech(Posted 2008) [#166]
Having written my own engine, yes, it is.
I am not dumb at engine programming, Josh. Maybe you will have to rewrite some of your code, but as far as I can see your engine relies a lot on shaders, so I don't think it is just a matter of rewriting the engine.
And you can also use floating-point textures instead of depth-stencil textures on SM2.0 hardware, so that's only a tiny problem!
My engine can, for example, render shadow maps on SM2.0 hardware using FP16 textures, hardware shadow maps on some SM2.0 and SM3.0+ hardware using a depth-stencil texture, and finally soft shadows on SM3.0+ hardware.


JoshK(Posted 2008) [#167]
Maybe we should all use your engine instead of spamming this thread with requests for hardware I have said will never ever be supported in a million years.


KimoTech(Posted 2008) [#168]
As long as you are treating your customers like shit, yes, I welcome them to stick with my upcoming engine. At least I listen to their suggestions, and I try to make my engine fit the customers' needs.


KimoTech(Posted 2008) [#169]
And I'm just curious, Josh, but are you faking your company?
That picture you have on your home page is a picture of UTEK, Industrial Biotechnical Corp. in Sarasota, FL:



http://sharesleuth.com/2006/10/utek_corp.html

Scroll down a bit, and the picture appears there.


JoshK(Posted 2008) [#170]
I thought it was a nice picture of a generic-looking office building.


DStastny(Posted 2008) [#171]
Normally I don't jump into these threads, but this is too funny.

Only the photo was retouched. He didn't even change the name of the file.

It's common practice to use clipart for websites. Could be the other company is faking it. Could be Leadwerks...

Only the Shadow knows...

LOL


tonyg(Posted 2008) [#172]
That does it. I'm using a picture of George Clooney in my passport from now on.


KimoTech(Posted 2008) [#173]
Are your phone and address fake too? And what about the pic of yourself?

Josh, I don't appreciate a developer who fakes his "company".

Think about the people who buy your engine; they are paying money to a faker.

Grats, LeadJerks... oops, spelling mistake.


Jiffy(Posted 2008) [#174]
That's rich. 'Unintentionally misleading', but funny as all get up.

He's doing what he wants, how he wants to, and making money from customers as a side effect (who hopefully realize how much consideration will be given to 'reasonable' requests). Leave him alone. He's happy as is. Our world is not his world.

It's a good engine, it's got a great price. And it could be even better- but he's the one who has chosen to do the work- or not. It's his sweat.
At least he's not promising the moon and not intending to deliver. He is in fact delivering on what he promises, just not more if asked nicely or responding politely.

I admire what he has done, even if it's not good for me.
'Course, even if he was doing what I wanted, his attitude would kill the deal for me- but that's just me. I'm stupid that way- I like friendly customer service. To each their own.

I'm going back to Ogre. Once you get past their wonky forum layout, you realize people are giving away source to features and stuff you'd be paying for elsewhere- shaders included. People seem to think shaders are engine dependent, when in fact you can just about cram them into any opengl/dx9 rendering engine if you know what you're doing.

Oh, and IPete2? Nice invoking of the 'only experts have a right to speak' credo.
Mind if I ask why it is...

(a) only experts (people who already made engines) should need to buy his at all?
(b) only experts (people who've published games) should want to limit their market (after success with 'lower quality engines')?
(c) only experts (people who've made numerous commercial tools (for indie games))(wtf?) can even relate?
(d) only experts ('HIGHLY impressive' b3d/bmax demo makers) can ask?

My response? Customers aren't always experts. Their money spends the same. Maybe some people count on that.

You may equate backward compatibility with belly aching, but it's about market share and customer base.

If I wanna drive a car that's shiny, fast, only makes right turns, and has no rear view mirror or reverse- well- maybe the future is 'full throttle, no left turn' highway design. But until then, don't give people crap for asking about those left turns and rear views.


Dreamora(Posted 2008) [#175]
It's interesting how threads from Josh end up showing his attitude towards suggestions and input from users and potential users.
Sad that there are still so many people funding that attitude.

Josh is a great programmer with a lot of experience writing great technology, but for some reason he only focuses on the latest instead of focusing on the latest and the broad market at the same time.
My suggestion for you, Josh: get a job at ATI or NVIDIA in the demo-reel groups of those companies. They are always looking for capable and talented guys to build great demos alongside their new GPU generations.

Because that's what you are actually targeting with your ultra-restrictive hardware requirements: demos for the highest-end machines.
The difference in working for them would be a stable income, most likely higher than what you have now, and less damage to a reputation that is hardening into that of a "drops it when it gets too hard", "I do as I please, and if you don't like it and annoy me I will terminate your license illegally" developer.
Your reputation isn't far from Anthony's anymore (you might remember him: he developed 3 or 4 3D engines, sold them to the few who were willing to trust him again, and then abandoned them a month into development, nearly totally broken). At that point you had better not show your head in any community that knows you, since you are reselling through EU-based companies that can get seriously sued for things like illegal license terminations (EULAs have no legal binding outside the USA, in case you see them as the legal basis for your behavior, and neither do non-standard paragraphs in the license).


Reactor(Posted 2008) [#176]
I agree with Dreamora and many other people on here with a distaste for Josh's attitude, but I know that kind of thing always comes back to bite people badly in the bottom at the most inconvenient of times. Plus, I agree the engine could be a much more popular one if it'd actually been created... well, with people in mind, instead of being another of Josh's personal adventures into doing whatever Josh feels like at the time.

That said, I agree with Jiffy. This is his engine and he has the right to go about things any way he likes. If he wants to make it for SM3 cards only, that's his choice, and there's nothing wrong with that. If he wants to treat people the way he has, that's his choice too, and he bears the weight of whatever else he does on his own head. So let's be professional and keep the threads on this forum about the Leadwerks engine as Josh would hope they would be... just as we'd like our own threads to be. He is a person after all, and keeping these threads civil (they help the development of the engine continue) is the respectful thing to do.


Sonic(Posted 2008) [#177]
Wow I'm sorry for you Josh that people are giving you so much grief for not tailor-making an engine for them.

Want a future-proof, forward-looking engine? Josh's engine shows a lot of promise. It's young, but I'm sure it'll grow, and it's unfair to compare him to Anthony.

Want another B3D? Use B3D. Or MiniB3D. Or myriad alternatives. Stop bitching at someone who's clearly put his heart and soul into this project, which, believe it or not, was not written specifically for you.

Josh doesn't claim it will run on SM2.0, so why be annoyed when you realise he's right?

I'm personally comfortable with Ogre now; it's finally led me away from Blitz, because I couldn't wait any longer for a DX9 version of B3D or Max. I'm looking forward to trying Flow, but then again I've built my own C++/Ogre framework, and Mark may just have lost me with his failure to deliver a working Max 3D engine.

I'll still always use the Blitz family for 2D work, but when it comes to 3D, you have to choose the engine that suits you and run with it. If it's not right, there's no need to moan... there's bound to be an alternative that will be.


Reactor(Posted 2008) [#178]
You lost a bit of credibility there by using the term, 'future-proof'.


IPete2(Posted 2008) [#179]
Jiffy,


Nice invoking of the 'only experts have a right to speak' credo.
Mind if I ask why it is...



You misunderstood me, not sure why, but hey. In my world everyone has a voice; the ones I choose to listen to tend to be people who have proved themselves with their portfolio. Most of the 'belly achers' in this thread have empty portfolios or just don't like Josh. That's fine too, but it does tend to muddy the water.

IPete2.


Damien Sturdy(Posted 2008) [#180]
Heh, I don't think Josh deserves all that criticism. He's decided on his target audience and stuck to it. What's wrong with that?


JoshK(Posted 2008) [#181]
I only bother posting here because this site is something like our third most frequent referrer.


*(Posted 2008) [#182]
TBH I have bought Josh's products in the past (on the whole they are good products, still use CharacterShop :) ), if I had the dosh I would get his engine too. Before you all hiss etc, the main reason is simple by the time I have made a game in it that uses the engine to its capabilities the graphics card industry would be on Shader Model 6 by then and the games industry would practically be rendering real life images.

To put the point across: the engine is a tool, and it will take us ages to make something with it. Yeah, it's future-proofed, but look at it this way: the engine will let us make games that, when released, can take on the other games out there at the time, rather than looking like something from ten years ago.

If Josh wants to make the engine the best he can, then good luck to him; if it's for a small audience, then that's his decision. TBH most hardcore gamers would have cards that support that specification anyway.


Sonic(Posted 2008) [#183]
Reactor,

Not that I want to get any more involved in this, but why do you say his engine isn't future-proof, being aimed at DX10 and Shader Model 3.0? I mean if you're saying that future-proof means never needing anything better then I see your point - but I take it to mean something that is abreast of current technology.

Anyway, good luck to you Josh...


Naughty Alien(Posted 2008) [#184]
..my dear Sonic.. to say that something is 'future proof' while you can't control a single bit of that future is not wise.. not at all.. so far I have never heard of anything in human civilization being 'future proof', and I highly doubt that something as dynamic as a piece of software can live up to that claim.. other than that.. nice engine, Josh..


Reactor(Posted 2008) [#185]
Abreast of current technology... okay, fair enough.


plash(Posted 2008) [#186]
I only bother posting here because this site is something like our third most frequent referrer.
Doesn't sound like it, at 8300 posts..


Sonic(Posted 2008) [#187]
Hehe, sorry Naughty, our definitions clearly differ. No hard feelings, eh? By your definition, 'future-proof' doesn't exist then? I'm kidding though, please don't take me too seriously. I'm just blown away by what Josh has achieved, and even if he is a little 'blunt', I think we should be applauding his efforts.


Damien Sturdy(Posted 2008) [#188]

Doesn't sound like it, at 8300 posts..



He's been here for quite a few years.


Digital Anime(Posted 2008) [#189]
@KimoTech: you are just a dork for being this rude to Leadwerks. After your posts in here, I wouldn't support you even if your engine was better. "That felt good!"

If the Leadwerks engine is made for high-end graphics cards, so be it. If I were searching for a new engine for a project that will take some years to complete, I would go for the latest shaders now; by the time the game is done, they will be the minimum requirements for the latest games anyway.

If you want to play quality games, you need a quality computer.

[Edit: Tried the demo and it works fine on my rig (details shown below). Used 1024x768 at the highest quality and got between 40 and 140 frames per second. The lowest framerate was when I was going through the bushes.]


Reactor(Posted 2008) [#190]
If you want to play quality games, you need a quality computer.


I know what you're saying, but... not all quality games are about the graphics ;)


JoshK(Posted 2008) [#191]
Early implementation of volumetric light scattering for version 2.1:



_33(Posted 2008) [#192]
Looks good.


The r0nin(Posted 2008) [#193]
OK... That's mighty impressive! Does the light scattering create a significant FPS hit, or is it done within your deferred lighting pass?


ShadowTurtle(Posted 2008) [#194]
"OpenGL extension GL_ARB_texture_non_power_of_two is not supported. Please update your graphics drivers or replace your hardware."
Oh my god.. as if it's really that hard these days to scale textures down/up yourself... (on the CPU ;)