Shaders Example


Jay Kyburz(Posted 2006) [#1]
Hey All,

I see a few of you are using GL shaders in your engines. I was wondering if any of you have any super simple examples of setting up a shader on an object and rendering it? A little test app or something you might have lying around?

I'm not looking for anything too fancy in my game; I just want diffuse, normal, spec and mask, and possibly emissive in some situations.

I'm a game artist at work and familiar with using shaders; I just know nothing about setting them up.

Jay


Chris C(Posted 2006) [#2]
You could do a lot worse than playing with the Ogre SDK: you can edit the material scripts and the frag and vert shader code, then re-run one of the example exes.

There's a simple example (test) of using a shader with BlitzMax on my site.


Drey(Posted 2006) [#3]
Check out RenderMonkey by ATI and Shader Designer too.


Grover(Posted 2006) [#4]
Here's an example I knocked up.
Note that it is ARB assembly - no Cg or GLSlang etc. This should work on pretty much any card with basic vertex shaders (V1.0).

It's very rudimentary - you should be able to easily change the vertex shader loading to file based rather than string based. It has been a while since I used Blitz, so I didn't bother looking up how to do it ;)

For fragment shaders you can use almost the same setup as for vertex shaders, and using params etc. is fairly simple. I'll update the example to be more complete when I have some time.

www.gagagames.com/Files/glcube_shader.bmx
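
The core of the ARB setup in an example like that boils down to a few calls. A minimal sketch, assuming a current GL context, that Pub.Glew is initialised and exposes the ARB_vertex_program entry points, and with LoadText and the file name purely illustrative:

SuperStrict
Import Pub.Glew
Import BRL.TextStream

' assumes GLGraphics/glewInit() have already set up the context (see below)
Local prog:Int
glGenProgramsARB(1, Varptr prog)
glBindProgramARB(GL_VERTEX_PROGRAM_ARB, prog)

' upload the ARB assembly source - file based here, but a literal string works too
Local src:String = LoadText("vertexshader.vsh")
Local cstr:Byte Ptr = src.ToCString()
glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB, src.Length, cstr)
MemFree cstr

' vertex program mode must be enabled before drawing with it
glEnable(GL_VERTEX_PROGRAM_ARB)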


Dreamora(Posted 2006) [#5]
It should more or less work, but one elemental thing is missing: before using the first extension command, glewInit() is needed to initialise Pub.Glew.
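
In Pub.Glew terms the ordering looks like this (a minimal sketch; GLGraphics is the stock BlitzMax way to get a raw GL context):

Import BRL.GLGraphics
Import Pub.Glew

GLGraphics 640, 480   ' a GL context must exist first...
glewInit()            ' ...then the extension entry points get resolved
' from here on, extension commands (glCreateShader etc.) are safe to call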


Grover(Posted 2006) [#6]
You shouldn't need any glewInit() - that's probably doing the function setup gear. If you look at the source, the functions are being fetched manually (there should be error checks, but hey, it's quick and dirty). I would think glewInit() would probably check whether there is ARB support in the extension string - again, more an error-check thing than a requirement to run a shader (all 3D cards since V1.0 should have ARB support).
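
The manual fetch being described looks roughly like this - an illustrative, Windows-only sketch (wglGetProcAddress is the standard WGL lookup; the declarations here are mine, not taken from glcube_shader.bmx):

' ask the driver for the entry point by name instead of using glewInit()
Extern "win32"
	Function wglGetProcAddress:Byte Ptr(name$z)
End Extern

' a function-pointer variable matching the ARB call's signature
Global glGenProgramsARB(n:Int, ids:Int Ptr) "win32"

Local p:Byte Ptr = wglGetProcAddress("glGenProgramsARB")
If Not p Then RuntimeError "ARB_vertex_program not supported"
glGenProgramsARB = p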


Grover(Posted 2006) [#7]
Added a fragment shader and a simple file loader for file-based shader programs. These shaders are ultra simple, and only there to provide an example of shader usage.

http://users.on.net/~dlannan/Files/fragmentshader.psh
http://users.on.net/~dlannan/Files/vertexshader.vsh


The vertex shader is a classic colour cube. The fragment shader is a simple additive blend of colour and texture.
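
For anyone who doesn't want to download them, minimal ARB programs along those lines look like this (a guessed equivalent, not the actual files):

!!ARBvp1.0
# colour cube: transform the position by the modelview-projection matrix
# and pass the vertex colour straight through
PARAM mvp[4] = { state.matrix.mvp };
DP4 result.position.x, mvp[0], vertex.position;
DP4 result.position.y, mvp[1], vertex.position;
DP4 result.position.z, mvp[2], vertex.position;
DP4 result.position.w, mvp[3], vertex.position;
MOV result.color, vertex.color;
END

!!ARBfp1.0
# additive blend of the interpolated colour and the texture sample
TEMP texel;
TEX texel, fragment.texcoord[0], texture[0], 2D;
ADD result.color, texel, fragment.color;
END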


Drey(Posted 2006) [#8]
Using asm-like programming for shaders is wild style. Look into GLSL - I really like it.


http://www.lighthouse3d.com/opengl/glsl/

It has links to a nice beginners' program called Shader Designer. Once I got into shaders, that's all I wanted to program with.
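
For comparison with the ARB version above, the same additive colour + texture blend as a GLSL fragment shader is only a few lines (a sketch; the sampler name is arbitrary):

uniform sampler2D tex;

void main()
{
	// add the interpolated vertex colour to the texture sample
	gl_FragColor = gl_Color + texture2D(tex, gl_TexCoord[0].st);
}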


Grover(Posted 2006) [#9]
GLSL is good, but it has problems with wide-ranging card support (we had various issues, especially SGI, ATI and NV hardware-specific problems - like fog). ARB is generally supported by all cards with pixel and vertex shaders (even back to the old GF3 cards). You'll find the example should run on pretty much any GPU - GLSL doesn't do this well. For higher-level coding, IMHO Cg is probably the best if you are happy to stick with NV solutions (brilliant debugging tools too). Of course, this is rarely the case... really, DX is a far better choice regardless.

There are a few other shader languages for OGL too, but few that are as nice as HLSL, for instance, for wide GPU support. One of the problems with OGL is its shader support - quite horrible even at the best of times. It's a pity, because the OGL API is quite easily the best API around for gfx; it's just been horribly mangled with extensions.

I agree shader designer apps like ShaderX, RenderMonkey and FX are all great tools for shader design, but few people realise the problems with integrating from an app like those - you need to use the same env/temp/param setups as those programs, and this can often conflict with the design of the system you have created. Shaders are a minefield of problems - this is simply a base example to get people running shaders in just over two pages of code.

I should note that GLSL is a nice language. But when you can't even run simple GLSL shaders on modern cards, I'm not too keen to use it as a simple example (the odds it won't work on many cards are pretty high - including on one of my machines: the X700 has problems with various GLSL shaders but is fine with the ARB asm, as an example :) ).


Drey(Posted 2006) [#10]
Well, my goal was to shoot for systems that have GL 2.0 (not too interested in programming fallbacks, because the game will be free). Though I have taken note of your X700 issue - I was expecting the X-hundred and X1K series to be fine with it. My 6600GT ran GLSL fine, and my friend's 6800 and my X1900XT do as well so far. Odds are that by the time the game comes out, using GLSL shouldn't be an issue.


Drey(Posted 2006) [#11]
I have a question. I was reading that you can attach as many frag and vertex shaders to a program as you want. Does that mean I could break shaders down into segmented concepts (normal map calcs, light property calcs) and then link them all and run them?


Grover(Posted 2006) [#12]
Not sure what you mean by attach - you can only run one of each at any one time. BTW, if you want very good performance, make sure you minimise frag and vert shader swaps. Prioritise on shaders first, then materials (textures), then things like vertex list sizes (batch if possible).

If you mean having a single shader for each specific feature op, then be very careful, because swapping over causes a state flush - and this is very expensive on any GPU. If you can, try to get as much rendering into a single set of shaders as you can: render as much of the scene as possible with them, then do the extra 'effects shaders' on a few elementary portions of the screen. There is a caveat to this, though - beware shader sizes, of course, because many older cards don't support large shaders... so it's a bit of a balancing act. Ineffective batching of shaders can reduce frame rates massively; I have seen fps go from 40 to 400 just from changing to batched shaders.
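
As a sketch of that priority order - shader first, then texture - a BlitzMax render queue might sort its draw calls like this (the type and field names are illustrative, not from anyone's engine):

SuperStrict

' one queued draw call; sorting groups equal shaders, then equal textures
Type TRenderItem
	Field shaderId:Int
	Field textureId:Int

	Method Compare:Int(other:Object)
		Local o:TRenderItem = TRenderItem(other)
		If shaderId <> o.shaderId Then Return shaderId - o.shaderId
		Return textureId - o.textureId
	End Method
End Type

Local queue:TList = New TList
' ... fill the queue during scene traversal ...
queue.Sort()   ' glUseProgram/glBindTexture now only change at group boundaries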

On the X700 thing, it's a bit of an oddity (or annoyance), because I came across it with a GLSL realtime bumpmap sample a while back:
http://www.shadertech.com/shaders/stsummer04/AndreasHalm-bin.zip
And I had problems with it in a job, with alpha textures and fog as well... ATI cards seemed to have some pretty serious issues with certain shaders, alphas and fog :)

The problem with GLSL is not the language; it's the GPUs and compilers (which are embedded in the drivers). It's built mainly for newer cards, which is fair enough, but as I mentioned, if you need to cover the huge range of card differences out there, it'll be pretty messy - especially when you find so many 'cut down' GPUs are used in consumer cards. Often the advanced full-pipe cards are the least problematic (like the 6800 and X1900), because they have multiple texel units, multiple vert units and fairly flexible pathways too (and compilers to suit).

Something to watch out for, anyway. Just advise people to use the latest drivers with their cards (hopefully compiler fixes will help - for GLSL they live in the driver).


Drey(Posted 2006) [#13]
Yeah, that's how I have my rendering structure set up: Shader > Materials > Mesh (vertex grouping with VAs and VBOs).

Yeah, I realise it's a card/driver issue on older cards - I'm just not targeting them. Seems like the X700 might be out of luck then.

I know you can only run one program at a time. I mean this:


Texture = glCreateShader(GL_FRAGMENT_SHADER)
Bump = glCreateShader(GL_FRAGMENT_SHADER)
LightProps = glCreateShader(GL_FRAGMENT_SHADER)
FinalColor = glCreateShader(GL_FRAGMENT_SHADER)

' load the shader sources and compile each one (glShaderSource/glCompileShader)

P = glCreateProgram()

' attach every compiled fragment shader object to the one program...
glAttachShader(P, Texture)
glAttachShader(P, Bump)
glAttachShader(P, LightProps)
glAttachShader(P, FinalColor)

' ...then link and run them as a single combined program
glLinkProgram(P)
glUseProgram(P)


Grover(Posted 2006) [#14]
Ahh, k. These are calls I steer clear of personally - GL 2.0 (again due to the non-compatibility with older cards). The reason I go for older-card compat is the sheer number of aged cards out there; newer cards only represent about 5% of the OGL market.

Attach is a safe call, though, and mixing and matching to suit should be fine; it's the Link and Use that will cost - Use causes a state flush, and Link invokes the linker in the driver (which can be very slow).

So for batching, make sure you don't have too many variances of glUseProgram. And certainly try to steer clear of glLinkProgram at runtime - unless you have a dual core or something neat like that (schedule a link on the other core, and then Use when ready).
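
A sketch of pushing the Link cost to load time - glGetProgramiv/GL_LINK_STATUS is the standard GL 2.0 status query, assuming Pub.Glew exposes it, and the helper name is made up:

' link once at load time and verify, so only glUseProgram runs per frame
Function LinkProgramChecked:Int(prog:Int)
	glLinkProgram(prog)
	Local status:Int
	glGetProgramiv(prog, GL_LINK_STATUS, Varptr status)
	Return status   ' non-zero means the link succeeded
End Function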

Each shader object should be considered a separate shader, and thus when switching you get all the performance problems of changing to a totally different shader (state flushes, essentially). Also, there's the issue of the fixed-function pipeline being disabled while shader objects are in use (worth remembering if you are running on older cards - you'll get blank screens and all sorts; one of the main pains of writing fallbacks).
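
Worth knowing in that context: binding program object 0 hands control back to the fixed-function pipeline, so a frame can mix both paths (DrawShaderBatches is a made-up placeholder):

glUseProgram(P)    ' fixed function is bypassed while a program is bound
DrawShaderBatches()
glUseProgram(0)    ' back to the fixed-function pipeline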


Drey(Posted 2006) [#15]
I'm going to have a collection of programs that are linked at load time (if everything works the way I think it does).

I plan for this game to take over a year, and probably close to two. I'm not trying to sell it. The issue with fallbacks is that it's time taken away from making the game - I enjoy engine programming, but I want to make a game. The new NVIDIA and ATI mobos have integrated graphics with shaders on them, so it isn't a concern of mine to get someone's 8500 or GF3 working with this game.

I'm thinking shader grouping is going to be done in a custom editor for the artists on the team - just some checkboxes and blending options. Most of the shaders are going to be static (like normals calcs), but the frag colours and such I see being a little more dynamically written, with set blend options. I'm trying to design a pretty general-purpose shader that'll handle most things anyhow, and to keep the shader variation low.

Thanks for the insight, Grover! Please keep it coming if you have more to say. Do you work in the industry?


Grover(Posted 2006) [#16]
Yeah - that's a pretty decent move, especially since a modern PC-based computer game is not an easy thing to achieve. Definitely the best way to handle the resources you have.

I like the idea of a specific editor to help the artists etc. develop shaders for your engine - it will speed up production big time.

The more prebuilt static data you can make, the better (especially if you'd later like to move to console or handheld).

Yeah, I have done a few little bits in the game industry :) Keeps me off the streets (or out of the pub, if you ask my wife ;) ).