Plash: duct.mod shader error


xlsior(Posted 2009) [#1]
I came across duct.mod the other day, which includes the interesting-looking protog2d.mod, which supposedly has some 2D shaders for BlitzMax.
URL: http://www.komiga.com/?q=node/25

However, when I try to run the enclosed shaders example, it consistently errors out compiling the blur.glsl shader.

It seems like the enclosed blur.glsl is somehow broken, but looking at it I have no idea what the problem would be.

Plash? Anyone?


CyBeRGoth(Posted 2009) [#2]
You are lucky to get that far, I still can't get the module to compile :P


ImaginaryHuman(Posted 2009) [#3]
I couldn't get it to compile the modules either, gave up, couldn't be bothered.


xlsior(Posted 2009) [#4]
Heh.

Compiling the actual module was no problem at all, no errors there... it just bombs out with the shader error when I try to run the shader.bmx example.

The others (like the text alignment one) work OK.


jkrankie(Posted 2009) [#5]
The modules compile, but you have a dependency on one of Noel's modules that isn't explained. Shaders worked fine on my dev box.

Cheers
Charlie


ImaginaryHuman(Posted 2009) [#6]
Yeah, I noticed that; I couldn't be bothered/didn't want to install his modules as well. Oh well. Doesn't the error code clue you in to something?


plash(Posted 2009) [#7]
Yikes! I've had this issue come up a few times before..
I don't know whether it's OpenGL freaking out because of the line-endings or simply the module not setting up the shader correctly.

I'll look into it..

The modules compile, but you have a dependency on one of Noels modules that isn't explained.
That is correct. Keep in mind the entire duct branch is nowhere near complete, and I have not had to deal with the issues other developers have had with it.. because it's been just me up until this point.

EDIT: I don't check the forums so often anymore, so if you'd like to get ahold of me faster (I only knew of this thread by reference from Noel), hop on irc.blitzed.org/#blitzbasic.


xlsior(Posted 2009) [#8]
I'll look into it..


Thanks.

Oh, one other thing I'm wondering: What happens if you're trying to call one of those shader functions on a card that doesn't support shaders? Does it fail gracefully, or blow up?

Is it possible to check in advance whether or not a card supports shaders before trying to call them?


plash(Posted 2009) [#9]
Oh, one other thing I'm wondering: What happens if you're trying to call one of those shader functions on a card that doesn't support shaders? Does it fail gracefully, or blow up?
It'll likely blow up. There is not much error-checking in Protog2D.

Is it possible to check in advance whether or not a card supports shaders before trying to call them?
Pretty sure that's possible.. I'll look into that too.


plash(Posted 2009) [#10]
Well.. I'm not really sure what to do about the shader 'issue'. I cannot reproduce it on my machine, but I do recall having to fix this in the past.
Also, I think OpenGL gets cocky about line endings in shaders (particularly if you stick a bunch of empty lines at the end of the source).
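
If it is the line endings, normalizing the source before it ever reaches OpenGL would rule that out. Untested sketch (CleanShaderSource is just an illustrative name, it isn't in duct):

' Strip carriage returns and trailing blank lines from shader source
' before compiling, in case the driver is picky about them.
Function CleanShaderSource:String(source:String)
	Local cleaned:String = source.Replace("~r~n", "~n").Replace("~r", "~n")
	' Drop extra empty lines at the end of the file
	While cleaned.EndsWith("~n~n")
		cleaned = cleaned[..cleaned.Length - 1]
	Wend
	Return cleaned
End Function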

@xlsior: What OS are you testing it on?
I'm pretty sure OpenGL on Linux doesn't accept a shader program without both the vertex and the fragment shaders present (Protog2D currently only works with fragment shaders, due to some strange view matrix issues [I think that's the issue anyhow]).

Could you try this in place of blur.glsl (make sure to clear the entire file before pasting)?
//@fragment
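// p2d_rendertexture and p2d_viewportsize come from Protog2D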
vec2 blurscale = vec2(2.0, 2.0);
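// 15-tap Gaussian kernel; the weights sum to ~1.0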
float blurfactors[15] = float[15](0.0044, 0.0115, 0.0257, 0.0488, 0.0799, 0.1133, 0.1394, 0.1494, 0.1394, 0.1133, 0.0799, 0.0488, 0.0257, 0.0115, 0.0044);

void main() {
	vec4 color = vec4(0.0);
	for(int i = 0; i < 15; ++i) {
		vec2 tc = min(gl_TexCoord[0].st + blurscale * float(i - 7), p2d_viewportsize);
		color += texture2DRect(p2d_rendertexture, tc) * blurfactors[i];
	}
	gl_FragColor = color;
}



xlsior(Posted 2009) [#11]
@xlsior: What OS are you testing it on?


Windows 7 64-bit, with an ATI Radeon HD4670 video card.

After trying the blur.glsl you mention above, I no longer get the original error, but now get this instead:

Executing:shaders.debug.exe
DebugLog:1
DebugLog:Exception caught:
(TShaderAssist.LinkProgram) Failed to link program
Infolog:
Fragment shader(s) failed to link, no vertex shader(s) defined.
ERROR: error(#280) Not all shaders have valid object code


Are you sure that the other files on the repository are the latest versions as well?


plash(Posted 2009) [#12]
After trying the blur.glsl you mention above, I no longer get the original error, but now get this instead
Interesting. It runs fine on XP, but not on Linux (same error you get).
I disabled vertex shaders because they were messing up the entire render.. I think it had something to do with the projection matrix in OpenGL getting all mangled up.

Are you sure that the other files on the repository are the latest versions as well?
Yes.

I guess I'll have to try getting vertex shaders working again.
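
For reference, a bare pass-through vertex shader ought to keep the linker happy without touching the projection. Assuming the same //@ section-marker convention blur.glsl uses (untested):

//@vertex
void main() {
	// Pass the texcoord through and use the fixed-function transform as-is
	gl_TexCoord[0] = gl_MultiTexCoord0;
	gl_Position = ftransform();
}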


plash(Posted 2009) [#13]
Righto. I think I got it working now (it's working in Linux at least)..
Pull the repo again or amend the changes yourself: http://github.com/komiga/duct/commit/11dd900efea8ff90d1f918eddfa906672636fcd8

EDIT: Oh yeah, the blur shader was sorta stolen from Max3D, and it does this strange tearing in the bottom-left corner of the screen.. (my guess is the min() in the sampling loop only clamps the far edge, so the texcoords can still go negative near the origin, but I haven't checked).


xlsior(Posted 2009) [#14]
Ok, looks like that did the trick, it's running now!

EDIT: Oh yeah, the blur shader was sorta stolen from Max3D, and it does this strange tearing in the bottom-left corner of the screen..


It has bigger problems than that -- there are a lot of tiny glitches throughout the screen, but most of them are very small. The bottom-left one is just the most obvious of all... But the other two shaders do seem to work great without any weirdness. Very nice!

There currently isn't a way to verify beforehand that shaders are indeed supported on the video card, correct?


plash(Posted 2009) [#15]
There currently isn't a way to verify beforehand that shaders are indeed supported on the video card, correct?
Ah yes, this..
I'm pretty certain the GL_ARB_vertex_shader, GL_ARB_fragment_shader and GL_ARB_shading_language_### (e.g. 100, 110, 120, etc.) constants (in pub.glew) can be used to check for shader support.

EDIT: But I have not implemented it yet.
EDIT2: They're not actually constants btw, they're just globals initialized by GLEW.
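EDIT3: Untested sketch of the check, assuming those globals are nonzero once a GL context is up:

' Check for GLSL support before using any Protog2D shaders.
Framework BRL.GLMax2D
Import BRL.StandardIO
Import Pub.Glew

SetGraphicsDriver GLMax2DDriver()
Graphics 800, 600
glewInit() ' harmless if the driver already called it

If GL_ARB_vertex_shader And GL_ARB_fragment_shader And GL_ARB_shading_language_100
	Print "GLSL is supported"
Else
	Print "No shader support, fall back to plain rendering"
End If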