More distractions...

BlitzMax Forums/Brucey's Modules/More distractions...

Brucey(Posted 2008) [#1]
I blame Winni for this one...

Anyhoo... here's a screenshot from a "made-in-an-evening" module I started today:


It shows a movie playing at 2x zoom... without sound for now, until I can work out where to feed the audio stream - I'm sure it's easy enough.

I've still got to mess about with the framerate in the example, but it's not bad for an evening's work.

Looks like it'll work on all three platforms too, which is kinda groovy.


But it's probably another of those niche modules...


Winni(Posted 2008) [#2]
I feel so innocent, and I've been mostly offline for almost two weeks... What did I do, and when, to get you working on this one?


Brucey(Posted 2008) [#3]
Yeah... you indirectly mentioned avbin ;-)


Brucey(Posted 2008) [#4]
Video is working... properly... now... using what I suppose is delta timing to keep the video running in real-time.
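Roughly speaking, the timing loop boils down to something like this - the names below are just placeholders to show the idea, not the actual module API:

	' accumulate real elapsed time and only advance to the next frame when it's due
	Local lastTime:Int = MilliSecs()
	Local frameDue:Double = 0    ' seconds until the next frame should be shown
	While playing                ' placeholder flag
		Local now:Int = MilliSecs()
		frameDue :- (now - lastTime) / 1000.0
		lastTime = now
		If frameDue <= 0 Then
			ShowNextFrame()              ' placeholder - decode and draw the next frame
			frameDue :+ 1.0 / frameRate  ' placeholder - the stream's frame rate
		End If
	Wend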

So I've discovered that FreeAudio doesn't do anything other than play normal samples - even creating a "static" sample doesn't work properly. I have a fix for the module but, to be honest, I can't be bothered with it - it sucks eggs. Moving on...


Tachyon(Posted 2008) [#5]
Nice Brucey! I'd love to see this module finished and working on all platforms!


Brucey(Posted 2008) [#6]
Ooooh... I appear to have sound playing with the video... although it's not quite in sync... which is a bit of a problem...

Not sure how I'm going to get them sync'd up properly...


Brucey(Posted 2008) [#7]
Hmm, so currently the video is sync'd to "real time"... and the sound is just playing...

Should I sync the video to the audio timestamp? I suppose that sounds better...
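Something along these lines, I imagine - the names here are just placeholders, not the real module calls:

	' use the audio clock as the master, and hold a video frame back
	' until the audio has caught up to its timestamp
	Local audioTime:Double = PlayedAudioSeconds()   ' placeholder - seconds of audio actually played
	If nextFrame <> Null Then
		If nextFrame.timestamp <= audioTime Then
			DrawFrame(nextFrame)                    ' placeholder - draw it however you like
			nextFrame = DecodeNextVideoFrame()      ' placeholder - pull the next frame
		End If
	End If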


Brucey(Posted 2008) [#8]
Sorted :-)


Brucey(Posted 2008) [#9]
Here's another little screenie...

You'll have to take my word for it on the audio playback ;-)
This particular example is Ogg Theora 854x480 w/ 48kHz stereo. Sounds sweet!

I've found that a "hacked" version of the DrawPixmap function works great, although I guess your mileage will vary depending on what you want to do.
Since I'm getting back RGB8 images, there's little point in converting them to RGBA8 as you are only going to be blitting (you know what I mean) them to the screen anyway. If you want to do fancy alpha effects and what not, I guess you'd have to work out something else.

Since my running draw routine is using this:
	glRasterPos2i 0, 0
	' a zero-sized glBitmap just moves the current raster position by (x, -y - t.height)
	glBitmap 0, 0, 0, 0, x, -y - t.height, Null
	' blit the RGB frame straight to the framebuffer
	glDrawPixels t.width, t.height, GL_RGB, GL_UNSIGNED_BYTE, t.pixels

I've no idea whether you could apply scaling etc. to it, or whether those functions are effectively just a blit. I'm not an expert.

Anyhoo... it's been quite fun, these last 24 hours or so :-)

Thanks to Winni for the heads-up on the library that he didn't really mention. I have to say... open source really rocks! (if you know where to look, apparently).


Brucey(Posted 2008) [#10]
Okay... committed...

I've tried testing the example on Win32 (on Parallels), but it can't chuck the stuff around fast enough to work very well - however, I can see that it is trying to do it, so I'm guessing it works :-p

Things you will need to install (once you've built the module - via SVN)...

AVbin (link to the downloads page).

AVbin comes in the form of a DLL / .dylib that needs to be installed.
For Mac, in theory, you could ship it in the app bundle, but I haven't tried it yet.

And to try the example app on Win32:
OpenAL (link to the setup .zip)
...if you don't already have it installed.


Your mileage may, as usual, vary considerably. But I'm happy enough with what it is doing so far - playing video with sound.
You may have issues running the example with different types of media, as I may not be using the library in the correct way for the start of a stream... I think you're meant to use the "start timestamp" stuff to determine when things are meant to kick off. I ain't no expert by a long way.

The example also doesn't handle the end of the stream properly. In reality the streams should be closed, then the file closed - and, of course, OpenAL stopped.

Some work could be done on the "drawing" stuff. BlitzMax by default prefers to draw RGBA8 textures; it would have been nice to have a choice - and therefore less data copying. But this GL business is all a bit fuzzy to me, so I've done what I could to get it working so-so.

But, given a lack of integrated video+audio in BlitzMax, this appears to be the best there is for now.

Comments, suggestions, feedback (and patches) always welcome :-)

I'll do a proper release once things settle down a bit...

:o)


Brucey(Posted 2008) [#11]
Latest commit adds support for Linux, which I'm very happy to say is working too :-)

One note for OpenAL on Linux. There is a bug with the library loader in Pub.OpenAL that requires you to have the OpenAL devel package installed to be able to run apps with it.
Hopefully this will be fixed by BRL shortly.

Have fun...

:o)


markcw(Posted 2008) [#12]
Can't you pass GL_RGBA to the format parameter to get RGBA8 images back?
glDrawPixels t.width,t.height,GL_RGBA,GL_UNSIGNED_BYTE,t.pixels

Edit: oh that's the draw pixel code, sorry.


plash(Posted 2008) [#13]
Hey! That was a good short film (Blender ftw!)

What library is this based on??


Brucey(Posted 2008) [#14]
I suppose the idea behind using RGB8 instead of RGBA8 is that for a video stream, you don't really need an alpha channel?

"What library is this based on??"

FFmpeg :-)


markcw(Posted 2008) [#15]
You can use glPixelZoom to scale the image; glDrawPixels is a blit operation. It can be slow on some cards (see here), but I don't think there's an alternative. FFmpeg seems to support up to 32-bit data - maybe the file you used was 24-bit? Btw, this tutorial was quite interesting.
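For example, something like this should scale up your existing draw code (using your t variable from above):

	glPixelZoom 2.0, 2.0   ' zoom factors apply to subsequent glDrawPixels calls
	glDrawPixels t.width, t.height, GL_RGB, GL_UNSIGNED_BYTE, t.pixels
	glPixelZoom 1.0, 1.0   ' restore the default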


Brucey(Posted 2008) [#16]
I was using AVbin rather than FFmpeg directly, because it simplifies the process of acquiring data somewhat.
I suppose what's really needed to speed things up, as that tutorial points out, is to run the various parts in their own threads.
I can see that the audio decoding could be better done in a separate thread, since I'm not doing anything BlitzMax-specific in there - all we'd need to do is feed it with packets and let it take care of itself.
That would give the rest of the code a few more cycles per frame...

Maybe I should return to my old ffmpeg attempt from a year-or-so ago... but I'm reasonably happy with progress so far with AVbin.

I'd prefer it if BlitzMax gave us a bit more control over what "kind" of textures we can create - without having to code directly in OpenGL. Perhaps we need a new Max2D with more choices... :-)


Brucey(Posted 2008) [#17]
Heh... runs like a three-legged dog on my work PC... Sure, I wasn't expecting much, but I thought it might at least handle my test video.

So, where does one start looking for bottlenecks in my crappy example code?

:-p


markcw(Posted 2008) [#18]
No idea, as I'm not looking at the code. glDrawPixels doesn't perform well on some OS/video card setups, so that might be it - and if it is, there's not much you can do about it, since the video card drivers are the problem in that case, according to the Sun bug report I linked to.


Brucey(Posted 2008) [#19]
I've got a workaround which uses glTexSubImage2D, via a modified version of GLMax2D. Works great - I can now run the test video on my work PC in debug mode without stutter... yay.

What my version of GLMax2D does is allow you to use PF_RGB888 format pixmaps as well as PF_RGBA8888. Rather than converting the passed-in pixmap to RGBA, it lets you use the pixmap in its original form, which removes a whole level of pixel conversion along the way.

And I suppose I can now apply all those funky effects to the image, since it's a texture rather than a straight blit to the card.
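The core of it is just re-uploading the pixmap into an existing texture each frame, roughly like this (simplified - not the exact GLMax2D code, and tex is assumed to be a texture created earlier):

	glBindTexture GL_TEXTURE_2D, tex
	glPixelStorei GL_UNPACK_ALIGNMENT, 1   ' RGB rows aren't necessarily 4-byte aligned
	' overwrite the texture contents with the new frame's pixels
	glTexSubImage2D GL_TEXTURE_2D, 0, 0, 0, pixmap.width, pixmap.height, GL_RGB, GL_UNSIGNED_BYTE, pixmap.pixels
	' ...then draw a textured quad as usual, with whatever scaling/rotation/alpha you fancy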


I still fancy a go at threading off the decoding routines, though it'll take me a while to get a module set up to mix threading with BlitzMax - I only intend running threads in C++, so it shouldn't be a problem for the GC.


Lord Danil(Posted 2008) [#20]
The video stops before the recording ends.


Brucey(Posted 2008) [#21]
I don't doubt it for a moment.

It seems there are better ways to sync the video with the audio than my example uses. Ideally both would sync against a common "current" time, which I haven't done in the example.

Still, it's early days yet, and I've never really played with time-critical streams before.


Gabriel(Posted 2008) [#22]
Would it be possible to use this with my own 3D engine? I can change the data within a texture just by providing a pointer to the new data, and I don't need audio. Would I have to modify much, given that you're (understandably) wrapping it up for use with Max2D?


Brucey(Posted 2008) [#23]
Only the example has anything to do with Max2D... the module itself is renderer-agnostic :-)

If you don't need audio, it's a lot easier to manage the streams.
Each video packet contains a timestamp which you can use to determine when that particular frame should be shown.


Tachyon(Posted 2009) [#24]
Brucey - did this module ever become "official"? I re-discovered it because of another post someone made. Is it ready to go - i.e. stable and complete enough to use in a commercial product?


LT(Posted 2010) [#25]
Sadly, I have not been able to get this to compile. :(