Please help me test this opengl stuff
| ||
Hi! I'm messing around a bit with OpenGL at the moment, trying to figure out what's the best way to do things. I did a simple 2D scene to have something to compare with. Both should draw the same thing, and print FPS regularly to the console/output. I'd like to see some figures from other people, and know if it works (at all!). GLMax2D: Custom OpenGL: EDIT: Codeboxes are wonderful little things :) |
| ||
OK, first thing is your shader code - it's not GLSL (not sure, possibly HLSL?). You need to change it to the following:

Function PlainVShaderSource:String()
	Local str:String=""
	str:+"#version 120~n"
	str:+"uniform vec2 vertex_pos;~n"
	str:+"uniform vec4 vertex_col;~n"
	str:+"uniform mat4 pmatrix;~n"
	str:+"varying vec4 f_col;~n"
	str:+"void main(void) {~n"
	str:+" gl_Position=pmatrix*vec4(vertex_pos, -1.0, 1.0);~n"
	str:+" f_col=vertex_col;~n"
	str:+"}"
	Return str
End Function

Function PlainFShaderSource:String()
	Local str:String=""
	str:+"#version 120~n"
	str:+"varying vec4 f_col;~n"
	str:+"void main(void) {~n"
	str:+"gl_FragColor=vec4(f_col);~n"
	str:+"}~n"
	Return str
End Function

Then you have no texture shaders, so it reports an error. My suggestion here is to drop any thoughts of textures until you get the basic colour shader to work, then add textures later :) |
| ||
Hi! Updated the shaders to something like you suggested. I'm using a reference card for OpenGL ES 2 from Khronos along with Google. I haven't done any shader coding before, so I'm not too sure about what I'm doing. The sample is not using textures, so that part is not a problem :) EDIT: You're supposed to see a slowly changing gradient background, a whole lot of low-alpha rectangles on top of that, and a spiral of dots slowly spinning. Everything should be some shade of blue/black. |
| ||
I just realized you're probably not supposed to try and do this with desktop OpenGL :o hmm. |
| ||
Updated the code again to hopefully work with OpenGL 3 / GLSL 1.50. |
| ||
First off: you are going to have issues with anything over GLSL 120, so for compatibility you'll have to stick with #version 120. This makes even more sense as you are not using anything remotely 150-related - always go for the lowest compatible version. Lordy!!!!!! attribute! in! out! These are NOT GLSL - it won't work, period! See above. Once corrected - still no output on desktop. For testing purposes I would suggest either starting from scratch and just displaying a single triangle, then adding the shader, or stripping out everything apart from a single triangle and possibly shader support. Hint: OpenGL is notoriously difficult to debug. If you can't see anything, it could be all sorts of issues - better to start with something that works and go forward :) |
| ||
Well, it all works perfectly fine for me. I'm a bit stumped with this, since the shader code is based on various GLSL examples for ES/desktop I've found via Google. I won't have time for a few days to mess around, but would you have a reference to an official GLSL spec? |
| ||
SDL has some examples of "default" shaders: https://github.com/eddieringle/SDL/blob/master/src/render/opengles2/SDL_shaders_gles2.c I also have a BaH.glslopt module (GLSL Optimizer) that you can use to optimize your code. |
| ||
Hello, you mentioned something about desktop, but I tried it anyway.
-----------------------------------------------------------------
Setup: Intel Quad Core 2, ATI Radeon HD 3400, Win 7
Result: Both ran fine.
GLMax -> 17 FPS
Custom -> 25 FPS
-Henri |
| ||
Thanks. I'm trying to get it to work on the standard BlitzMax targets first, but I only have a Win7 box (HD 6870 card) myself to test on. I thought I could get away with using the ES2 spec for desktop too, since it's supposed to be a subset of desktop OpenGL. It should be fine for the OpenGL part, but it turned out that ES uses a completely different version of GLSL. @AdamStrange: I've been using this spec: http://www.opengl.org/registry/doc/GLSLangSpec.1.50.11.pdf and it says in/out are perfectly fine GLSL. Exactly how does the sample program fail for you? |
| ||
For a start, 1.20 is the maximum GLSL version a Mac can do with OpenGL. And as you are not using anything 1.50-related - use the lowest for compatibility. Regarding the in/out variable situation: I used what was available on the net, and all references to them do not exist. But, more worrying, the Mac just bombs out on encountering those. So I would expect them not to be there, and they need to be removed before any shaders will compile. I've always found that specs are usually a bad starting point. Much better to use a tutorial that you know works - getting your hands dirty is much more productive, I've found. OK, so once the shaders compile, the window appears, and it's just black. That is what I mean by starting with something small that works and adding to it - OpenGL is not easy to debug - that's why I didn't try :) |
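For reference, the two versions being argued about really do use different interface qualifiers: `attribute`/`varying` are the GLSL 1.20 keywords, while `in`/`out` on globals only became valid in GLSL 1.30 and later. A minimal sketch of the same vertex-shader interface under both versions (illustration only, not code from this thread):

```glsl
// GLSL 1.20 - what #version 120 (and older Macs) accept
#version 120
attribute vec2 vertex_pos;   // per-vertex input
varying vec4 f_col;          // interpolated output to the fragment shader

// GLSL 1.50 (OpenGL 3.2 core) - the same interface with the newer keywords
#version 150
in vec2 vertex_pos;          // replaces 'attribute'
out vec4 f_col;              // replaces 'varying' on the vertex side
```

So a driver that only compiles 1.20 will reject `in`/`out` declarations outright, which matches the "bombs out" behaviour described above.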
| ||
Dunno about you, but some of these GLSL things are rather splendid: http://glsl.heroku.com/e#18709.0 ooooerr.... |
| ||
Here are some quick 1.20 GLSL shaders written according to the spec, if you want to try them, Adam. Otherwise I should have access to a Mac in a few days, which should make this much more straightforward to fix. Sorry for being stubborn with this code; I'll do something simpler when I have time for it. |
| ||
Output: and a black window... FPS ticks down the output. |
| ||
Can you try it again, Brucey? I had a stray line left in the fragment shader code, and that error message suggests you copied the code before I made a ninja-edit :p |
| ||
compiles ok
shaders ok
output = black |
| ||
Can you try it again Brucey? Yeah, I'd tried it again after commenting out that line, and had the same result as Mr Strange. I've yet to try it on the Raspberry Pi though. |
| ||
Well, I didn't get my hands on a Mac for testing. Here's a simple example; it should draw a green triangle on a purple background. I'd be interested in the console output. |
| ||
Kind of like a very pink background... no triangle.

Building untitled1
Executing:untitled1
Matrix 4x4:
0.00249999994, 0.00000000, 0.00000000, 0.00000000
0.00000000, -0.00333333341, 0.00000000, 0.00000000
0.00000000, 0.00000000, -1.00000000, 0.00000000
-1.00000000, 1.00000000, -0.00000000, 1.00000000
vertex shader id: 1
fragment shader id: 2
attribute pos index: 0
uniform projection index: 0
successfully created buffer
Entering main loop
glUniformMatrix4fv : 0
glUniformMatrix4fv : 0
... lots of these

Maybe your RGB is mixed up :-)
GPU on my Mac Mini: AMD Radeon HD 6630M 256 MB |
| ||
Well, I guess that part is a bit subjective. Anyways, that helps a bunch. Updated the code with some relevant error tests, so please try it again :) EDIT: On a bit of a side note, I saw that AMD offers native ES support on desktop via this http://developer.amd.com/tools-and-sdks/graphics-development/amd-opengl-es-sdk/ . Would that be possible to wrap for BlitzMax? |
| ||
Similar to Brucey, output is

Initial thoughts are that the matrix doesn't look correct, but I'm used to looking at 3D ones on D3D lately, so I could be talking out of my ....

Shouldn't the math be
grid[12]=-((pr+pl)/(pr-pl))
grid[13]=-((pt+pb)/(pt-pb))
grid[14]=-((pf+pn)/(pf-pn))

That won't stop the error though, as you're telling OGL that you're passing in 16 matrices when in fact there's only one. Try
glUniformMatrix4fv(projection, 1, False, matrix.grid)
and see how you get on with those adjustments. |
| ||
Thanks a bunch! Updated the latest code with these changes. The matrix code was from some tutorial, and I just assumed the count was there to specify 16 elements, so I didn't bother looking it up in the specs, but it seems you are correct :) Funny that it wouldn't bomb out on Windows. Changed the matrix code too, although it gives the same result. |
| ||
I now see a green triangle on the bright pink background ;-) |
| ||
Pink background, green triangle here too :) Hint: be very careful with Windows and OpenGL. MS tried everything they could to not have it, so it is no surprise that it is not compliant. Macs only have OpenGL, so their spec is much stricter. |
| ||
Yeah, the impression I've been getting is that while the OpenGL spec works on most things if you keep to it, different vendors/OSes will tolerate different kinds of stray errors like this, which is not really helpful. Anyways, I updated the first code samples that I posted, hoping that this was the offending thing for you guys :) |
| ||
different vendors/OSes will tolerate different kinds of stray errors like this, which is not really helpful. In which case, you should try to develop on the least tolerant system ;-) |
| ||
It is called the "Monkey X" approach :p bye Ron |
| ||
Yeah well, I wouldn't complain if someone donated a Mac to me, but I'll have to work with what I have :) Reiterating these: Max2D: Custom OpenGL: |
| ||
And if anyone is interested, in release mode I get ~45 FPS on the Max2D version, and ~132 FPS on the custom one. |
| ||
You can emulate Mac OS X in VirtualBox (it works on AMD, but Intel is preferred because then you do not need a modified bootloader). Snow Leopard (< Mountain Lion, which is 10.8) works really well here (except that "restart" segfaults in the VM, so "restart" means "cold start" for me :D). Of course it is slower than on real hardware, but it might save you some time. And of course you have to buy a licence for Mac OS X somewhere to stay at least a bit more legal (depending on the country you live in, Apple's EULA is valid for you and you are only allowed to use Mac OS X on real hardware, or whatever). bye Ron |
| ||
debug: max2d 21-23, custom 30-31
release: max2d 61-65, custom 167-174 |
| ||
Err, the above I wrote was on Win7 :-)

The latest test -
debug: native ~25, custom ~37
release: native ~56, custom ~117

With your custom code I get this report for debug and release:

(Initialize) OpenGL 3 Target
(Initialize) GLSL: 3.30 - Intel Build 8.15.10.2696 (1.20 required)
(Initialize) Matrix Debug:
Matrix 4x4:
0.00249999994, 0.000000000, 0.000000000, 0.000000000
0.000000000, -0.00333333341, 0.000000000, 0.000000000
0.000000000, 0.000000000, -1.00000000, 0.000000000
-1.00000000, 1.00000000, -0.000000000, 1.00000000
(Initialize) Setting colors to <1, 1, 1, 1>
(CompileShader) Compiling vertex shader
(CompileShader) Successfully compiled shader!
(CompileShader) Compiling fragment shader
(CompileShader) Successfully compiled shader!
(Initialize) Created plain shader program!
ERROR (CompileShader) No shader source!
ERROR (CompileShader) No shader source!
(Initialize) Created texture shader program!
ERROR (ValidateShaderprogram) Supplied program is not valid! (in context)
(UpdateShaderLayout) attrib_pos: 0
(UpdateShaderLayout) attrib_uv: -1
(UpdateShaderLayout) attrib_col: 1
(UpdateShaderLayout) unif_projection: 23724032
(UpdateShaderLayout) unif_tex: -1 |
| ||
@Dave: Don't happen to have an Nvidia card then? Anyways, those errors are for texturing stuff that's not used, so it looks OK to me. @Derron: I didn't think about that, I'll have to take a look at setting that up sometime. Anyways, the batch thing seems to work fairly well, at least for this simple scene. It can also be hidden away completely in a Max2D driver. Sadly though, for single draw calls the deprecated immediate mode seems to be a lot faster (~2x for me). Also, Max2D is doing transformations while my code is not. Gonna try and get some more test code for other aspects together in the next few days. |
| ||
It even works in 64-bit :-) shader_test64_osx.zip 286kb Also, Max2D is doing transformations while my code is not. Can't that be done in the shader? |
| ||
Yes, but 2D transformations are so straightforward I'm not sure if that will give a performance boost or not. It's up for testing sooner or later though :) |
| ||
Don't happen to have an Nvidia card then? The laptop I use is blessed with an Nvidia and an ATI. Great for testing :) Edit: to optimise for batching, you could try setting everything up for the batch before you enter the main loop (maybe using some kind of pooling). During setup you do all the checks for validity, so in the main loop you can then assume that all data is valid and just blast it to the GPU (your OGL calls). In the latest code the main loop is checking for shader validity, and you're changing state several times for each iteration of drawing a particle. Ultimately, try to keep all objects using the same state together, to keep state changes to a minimum. |
| ||
Reviving this post. This is a quick module hack (written before the code posted above) to test if I could get a new Max2D driver up and running. I changed the issues from the above example, so it *should* run. It does run the digesteroids sample fine on my Windows 7 box, but it'd be nice to see if it works for anyone else. It should do plot, line, rect and image commands fine. It's lacking some blend modes etc., and anything poly won't work. This is in no way a good example of OpenGL programming, since I'm not really familiar with it; I settled for something that works well enough. Basically posting it here for Brucey and co :) |
| ||
I just get a black screen in my demo app. My app uses no special GL commands or so. Same error when running a simple sample: -> black screen. Log of this sample: I am on Linux Mint, 64-bit, AMD Llano CPU/GPU - using one of the proprietary AMD drivers. bye Ron |
| ||
OK, I'll repost it with a bunch of error checking later today. Would you mind testing if your simple example can change the clear color? EDIT: Version with more error checks: |
| ||
Nothing gets displayed; the window opens, stays black, and the window closes. bye Ron |
| ||
That should be easy enough to fix, try this one: |
| ||
rem removed ... board system does not allow fast editing of the same posting ... it just adds another post. end rem |
| ||
Works. Edit: is there some kind of frame limiter? It does not matter if I use flip -1, 0 or 1 ... it gets max 75 (when using "flip 0"; "flip -1" is at screen refresh rate, "flip 1" even cuts down to 35 FPS). Edit2: as you requested, "setClsColor" works too. Edit3: Images work too - but displaying one image cuts FPS from 75 to 65 ... maybe you recompile things every run (or upload things to the GPU ... I know nothing about that subject). PS: I of course removed the "print"s when measuring. bye Ron |
| ||
It's basically a quick hack I did to see if I'd be able to get an OGL3 module up and running at all. Don't expect any kind of performance from it whatsoever :) Posted it here since Brucey was struggling with textures. |
| ||
Posted it here since Brucey were struggling with textures. Thanks :-) |
| ||
Might be worth pointing out that the batch example I started this thread with was meant for testing purposes only. The example "scene" is very naive, and it's not really working well for more realistic workloads. For the module, I aimed (and still am aiming) to get all the functionality in, and disregard performance until I have everything working. Regarding performance, it's really hard to come up with a system that works for *everything*. A reasonable approach would probably be to rewrite the standard glmax2d module to collect data on pretty much everything it does, and ask people here to run their projects with it and post the results. |
| ||
Do not "disregard performance" at all ... you might not "optimize" while tinkering with something, but you should keep performance in mind when designing working routines (how batching is done, how "auto batching" could work, etc.). bye Ron |
| ||
The problem is that I knew nothing about how OpenGL 3 works, which makes it hard to write efficient code, considering I didn't even know what the code would have to do. As soon as we have a module that does everything the standard Max2D modules do, I'll try and redesign it for performance. |
| ||
Got a small break from school stuff, so I tinkered on a bit with the module. It should now handle multiple images and pixmaps properly, and I think I got polygons working (drawoval uses drawpoly now). Please test and report any weird things :) |