GStreamer


Some time in 2012, the GStreamer team was busy working toward the GStreamer 1.0 major release. Along the way, I did my part and ported the DVD playback components from 0.10. DVD is a pretty complex playback scenario (let’s not talk about Blu-ray).

I gave a talk about it at the GStreamer conference way back in 2010 – video here. Apart from the content of that talk, the thing I liked most was that I used Totem as my presentation tool :)

With all the nice changes that GStreamer 1.0 brought, DVD playback worked better than ever. I was able to delete a bunch of hacks and workarounds from the 0.10 days. There have been some bugs, but mostly minor things. Recently though, I became aware of a whole class of DVDs that didn’t work for a very silly reason. The symptom was that particular discs would error out at the start with a cryptic “The stream is in the wrong format” message.

It turns out that these are DVDs that begin with a piece of video that has no sound.

Sometimes that’s implemented on a disc as a video track with accompanying silence, but the discs that were broken have no audio track at all for that initial section. For a normal file, GStreamer would handle that by not creating any audio decoder chain or audio sink output element and just decoding and playing the video. For DVD though, very few discs are entirely without audio – we’re going to need the audio decoder chain sooner or later, and there’s no point creating and destroying it each time an audio track appears or disappears.

Accordingly, we create an audio output pad, GStreamer plugs in a suitable audio sink, and then nothing happens, because the pipeline can’t get to the Playing state – it’s stuck in Paused. Before a pipeline can start playing, it has to progress through the Ready and Paused states and then to Playing. The key to getting from Paused to Playing is that each output element (the video sink and the audio sink, in our case) has to receive some data and be ready to output it, a process called pre-rolling. Pre-rolling the pipeline avoids stuttering at the start, because otherwise the decoders would have to race to deliver something in time for it to get on screen.

With no audio track, there are no actual audio packets to deliver, so the audio sink can’t pre-roll. The solution in GStreamer 1.0 is a GAP event, sent to indicate that there is a gap in the data, which elements should skip or fill as appropriate. In the audio sink’s case, it should handle the event by considering itself pre-rolled, allowing the pipeline to go to Playing and starting the ring buffer and the audio clock – from which the rest of the pipeline will be timed.

Everything up to that point was working OK – the sink received the GAP event… and then errored out. It expects to be told what format the audio samples it’s receiving are in, so it knows how to fill the gap. When there’s no audio track and no audio data, it was never being told.

In the end, the fix was to make the dummy place-holder audio decoder choose an audio sample format if it receives a GAP event before any data – any format, it doesn’t really matter, as long as it’s reasonable. It’ll be discarded, and a new format selected and propagated, when some audio data really is encountered later in playback.
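
The logic of the fix can be sketched roughly like this. This is a toy Python illustration of the decision being made, not the real code (which is C inside GStreamer), and the 48 kHz stereo default is my own assumption, not necessarily what the commit chose:

```python
# Toy sketch of the fix's logic: when a GAP event arrives before any audio
# data has been seen, pick an arbitrary but sensible placeholder format so
# the sink knows how to render the silence. The default here (48 kHz stereo
# S16LE) is illustrative only.

DEFAULT_AUDIO_CAPS = "audio/x-raw,format=S16LE,rate=48000,channels=2"

def caps_for_gap(negotiated_caps):
    """Format to use when a GAP event arrives: the already-negotiated
    format if there is one, otherwise the placeholder default."""
    return negotiated_caps if negotiated_caps else DEFAULT_AUDIO_CAPS

# No data seen yet: fall back to the placeholder default...
print(caps_for_gap(None))
# ...but never override a format that was actually negotiated.
print(caps_for_gap("audio/x-raw,format=F32LE,rate=44100,channels=1"))
```

The point is simply that any sane placeholder works, because real data arriving later triggers normal caps negotiation anyway.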

That fix is #c24a12 – later fixed up a bit by thiagoss to add the ‘sensible’ part to format selection. The initial commit liked to choose a samplerate of 1Hz :)

If you have any further bugs in your GStreamer DVD playback, please let us know!

Hi world! It’s been several years since I used this blog, and a lot has happened to us since then. I don’t even live on the same continent as I did.

More on that in a future post. Today, I have an announcement to make – a new Open Source company! Together with fellow GStreamer hackers Tim-Philipp Müller and Sebastian Dröge, I have founded a new company: Centricular Ltd.

From 2007 until July, I was working at Oracle on Sun Ray thin client firmware. Oracle shut down the project in July, and my job along with it – opening up this excellent opportunity to try something I’ve wanted for a while and start a business, while getting back to Free Software full time.

Our website has more information about the Open Source technologies and services we plan to offer. This list is not complete and we will try to broaden it over time, so if you have anything interesting that is not listed there but you think we can help with, get in touch.

As Centricular’s first official contribution to the software pool, here’s my Raspberry Pi Camera GStreamer module. It wraps code from Raspivid to allow direct capture from the official camera module and hardware encoding to H.264 in a GStreamer pipeline – without the shell pipe and fdsrc hack people have been using to date. Take a look at the README for more information.
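
To give a flavour of what that looks like in practice, a capture pipeline ends up something like the sketch below. The rpicamsrc element name comes from the module, but the exact properties and caps here are my assumptions, so check the README for the real details:

```shell
#!/bin/sh
# Hypothetical example: record hardware-encoded H.264 from the Pi camera
# module via rpicamsrc. Property names and values here are illustrative
# assumptions; consult the module's README for the real ones.
if command -v gst-inspect-1.0 >/dev/null 2>&1 \
        && gst-inspect-1.0 rpicamsrc >/dev/null 2>&1; then
    # num-buffers limits the capture; -e sends EOS so mp4mux finalises the file
    gst-launch-1.0 -e rpicamsrc bitrate=1000000 num-buffers=300 \
        ! video/x-h264,width=1280,height=720 \
        ! h264parse ! mp4mux ! filesink location=camera.mp4
    RESULT="recorded camera.mp4"
else
    RESULT="rpicamsrc not available; pipeline not run"
fi
echo "$RESULT"
```

The nice part is that the encoded H.264 arrives as ordinary GStreamer buffers, so it can be muxed, streamed or transcoded like any other source, with no shell pipes involved.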

Raspberry Pi Camera GStreamer element

Sebastian, Tim and I will be at the GStreamer Conference in Edinburgh next week.

 GStreamer logo

I gave my talk titled “Towards GStreamer 1.0” at the Gran Canaria Desktop Summit on Sunday. The slides are available here.

My intention with the talk was to present some of the history and development of the GStreamer project as a means to look at where we might go next. I talked briefly about the origins of the project, its growth, and some of my personal highlights from the work we’ve done in the last year. To prepare the talk, I extracted some simple statistics from our commit history. In those, it’s easy to see both the general growth of the project, in terms of development energy/speed, as well as the increase in the number of contributors. It’s also possible to see the large hike in productivity that switching to Git in January has provided us.

The second part of the talk was discussing some of the pros and cons around considering whether to embark on a new major GStreamer release cycle leading up to a 1.0 release. We’ve successfully maintained the 0.10 GStreamer release series with backwards-compatible ABI and API (with some minor glitches) for 3.5 years now, and been very successful at adding features and improving the framework while doing so.

After 3.5 years of stable development, it’s clear to me that when we made GStreamer 0.10, it really ought to have been 1.0. Nevertheless, there are some parts of GStreamer 0.10 that we’re collectively not entirely happy with and would like to fix, but can’t without breaking backwards compatibility – so I think that even if we had called it 1.0 at that point, I’d want to be doing 1.2 by now.

Some examples of things that are hard to do in 0.10:

  • Replacing ugly or hard-to-use API
  • Fixing ABI mistakes, such as structure members that should be private having been accidentally exposed in a release
  • Expanding public structures that have run out of padding members
  • Removing deprecated API (and the associated dead code paths)

There are also some enhancements that fall into a more marginal category, in that they are technically possible to achieve in incremental steps during the 0.10 cycle, but are made more difficult by the need to preserve backwards compatibility. These include things like adding per-buffer metadata to buffers (for extensible timestamping/timecode information, pan & scan regions and others), variable strides in video buffers and creating/using more base classes for common element types.

In the cons category are considerations like the obvious migration pain that breaking ABI will cause our applications, and the opportunity cost of starting a new development cycle. The migration cost is mitigated somewhat by the ability to have parallel installations of GStreamer. GStreamer 0.10 applications will be able to coexist with GStreamer 1.0 applications.

The opportunity cost is a bit harder to ignore. When making the 0.9 development series, we found that the existing 0.8 branch became essentially unmaintained for 1.5 years, which is a phenomenon we’d all like to avoid with a new release series. I think that’s possible to achieve this time around, because I expect a much smaller scope of change between 0.10 and 1.0. Apart from the few exceptions above, GStreamer 0.10 has turned out really well and become a great framework, used in all sorts of exciting ways, and it doesn’t need large changes.

Weighing up the pros and cons, it’s my opinion that it’s worth making GStreamer 1.0. With that in mind, I made the following proposal at the end of my talk:

  • We should create a shared Git playground and invite people to use it for experimental API/ABI branches
  • Merge from 0.10 master into the playground regularly, and rebase/fix the experimental branches
  • Keep developing most things in 0.10, relying on the regular merges to get them into the playground
  • After accumulating enough interesting features, pull the experimental branches together as a 0.11 branch and make some releases
  • Target GStreamer 1.0 to come out in time for GNOME 3.0 in March 2010

This approach wasn’t really possible the last time around when everything was stored in CVS – it’s having a fast revision control system with easy merging and branch management that will allow it.

GStreamer Summit

On Thursday, we’re having a GStreamer summit in one of the rooms at the university. We’ll be discussing my proposal above, as well as talking about some of the problems people have with 0.10, and what they’d like to see in 1.0. If we can, I’d like to draw up a list of features and changes that define GStreamer 1.0 that we can start working towards.

Please come along if you’d like to help us push GStreamer forward to the next level. You’ll need to turn up at the university GCDS venue and then figure out on your own which room we’re in. We’ve been told there is one organised, but not where – so we’ll all be in the same boat.

The summit starts at 11am.

We’re leaving tomorrow afternoon for 11 days holiday in New York and Washington D.C. While we’re there, I’m hoping to catch up with Luis and Krissa and Thom May. It’s our first trip to either city, so we’re really excited – there’s a lot of fun, unique stuff to do in both places and we’re looking forward to trying to do all of it in our short visit.

On the GStreamer front, I just pushed a bunch of commits I’ve been working on for the past few weeks upstream into Totem, gst-plugins-base and gst-plugins-bad. Between them they fix a few DVD issues like multiangle support and playback in playbin2. The biggest visible feature though is the API that allowed me to (finally!) hook up the DVD menu items in Totem’s UI. Now the various ‘DVD menu’, ‘Title Menu’ etc menu items work, as well as switching angles in multiangle titles, and it provides the nice little ‘cursor switches to a hand when over a clickable button’ behaviour.

I actually had it all ready yesterday, but people told me April 1 was the wrong day to announce any big improvements in totem-gstreamer DVD support :-)

Nearly 4 months after the fact, my birthday is finally complete – my “Netherlands and Architecture” “Open Source” coin finally arrived from the Royal Dutch Mint:

Royal Dutch Mint coin - Netherlands and Architecture

The delay was caused by transmission errors introduced somewhere while communicating our delivery address.

Obligatory GStreamer bit

The other nice thing from today is this script Luis and I put together to convert any supported video into a format suitable for playback on his new BlackBerry Storm after he had trouble with encoding errors trying to use FFmpeg for the task.

It’s a simple shell script that uses GStreamer’s gst-launch utility to do a two-pass conversion to H.264 and AAC in an MPEG-4 container. You can find it here if you’re interested.
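
The script itself is linked above; the two-pass approach looks roughly like the sketch below. The element and property names (x264enc’s pass/stats-file, the faac encoder) are from the GStreamer 0.10 era as I recall them, so treat the details as assumptions rather than a copy of the real script:

```shell
#!/bin/sh
# Sketch of a two-pass encode: pass one gathers encoding statistics, pass
# two uses them for the real encode. Element/property names are assumptions
# from the GStreamer 0.10 era; the real script linked above may differ.
INPUT=input.avi
if command -v gst-launch >/dev/null 2>&1 && [ -f "$INPUT" ]; then
    # Pass 1: analyse the video only, discarding the encoded output
    gst-launch filesrc location="$INPUT" ! decodebin ! ffmpegcolorspace \
        ! x264enc pass=pass1 stats-file=stats.log ! fakesink
    # Pass 2: encode for real using the stats, mux with AAC audio into MP4
    gst-launch filesrc location="$INPUT" ! decodebin name=d \
        d. ! ffmpegcolorspace ! x264enc pass=pass2 stats-file=stats.log ! mux. \
        d. ! audioconvert ! faac ! mux. \
        mp4mux name=mux ! filesink location=out.mp4
    RESULT="encoded out.mp4"
else
    RESULT="GStreamer 0.10 tools or input file missing; sketch only"
fi
echo "$RESULT"
```

The second pass pulls both branches out of the same decodebin instance (named ‘d’) and feeds video and audio into the named mp4mux.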

As an added bonus, Luis reports that the GStreamer conversion is noticeably faster than the erroneous FFmpeg one.


We took a trip to London this weekend. It started with some cheap BA tickets and a desire to see the musical Wicked – which was wonderfully done.

We were introduced to Wicked after reading Gregory Maguire’s book of the same name. I really enjoyed the idea of taking a story that everyone knows inside out (The Wizard of Oz) and creating an entire other story within and around it. I’m obviously not alone, since the book, its sequel and the musical have all been smash hits all over the world.

While we were in London, we took the opportunity to visit Zaheer and Alia and see their lovely new-to-us house. We also did some unsuccessful hunting for a copy of Starfarers of Catan.

On Sunday, since the skies were clear, we decided to brave the queues and went on the London Eye. Afterward, we visited Madame Tussauds, which was fun.

Mr T pities the foo'
I pity the foo’ that’s never visited a Madame Tussauds.

On an unrelated note from the Serendipitous Amusements department, a curious moment occurred while I was in Hamburg for work a few weeks ago. One night, as we were looking for a restaurant in town, we tried to print a map from Google Maps. When I got to the printer, it was blocked with a paper jam from earlier in the day. After clearing the jam, the printer proceeded to divest itself of other people’s spooled jobs.

The first print job seemed to be a printout of someone’s computer magazine subscription, and the very first page that emerged was this one:

Klocwork code analysis ad

As I waited for my page, I was reading the Klocwork advertisement, and then was amused to note that amongst the source code at the bottom, I could discern lines that are clearly from GStreamer.

I recognise some of the lines from other open source projects, so hopefully someone else is as amused as I was to spot code from something they’ve contributed to :)

We’re back at home after an awesome time in Istanbul, in which we attended GUADEC, did some sight-seeing (not enough), hung out with a bunch of really fun awesome people, and attended a great conference about what people are hacking on and planning for GNOME.

Friday
Arrived from Dublin around 11pm and got to the Golden Horn Sirkeci. Rang Wim & Christian and set one tone for the week by going out for beers until 3am.
Saturday
Woke up late. Hungover for some reason. Went out and explored the spice market and Grand Bazaar. On the way back to the hotel ran into Collabora people sitting on the terrace of their hotel.

Luis and Krissa

After a beer there, we all went searching for dinner and discovered the cushions on the street next to the Golden Horn, conveniently already populated with several other attendees, including Luis, Krissa and Phillip Van Hoof. Enjoyed dinner, beer & nargileh until after midnight.

Sunday
Up early. Had breakfast at the hotel, and then off to the impromptu GStreamer summit from the last blog post. Jaime went exploring while we chatted. We were short a few key GStreamer hackers who either couldn’t make it to Istanbul or missed my poor excuse for an announcement, but had a nice solid representation of core hackers nevertheless. Quick summary:

  • Discussed that we should move GStreamer out of CVS, and our best overall choice is Git. FDO already has the infrastructure, we have people who know how to use it and will be there to help those of us who don’t and, most importantly, Tim & Edward are willing to do the migration work.
  • People weren’t interested in doing a 0.12 branch along the lines of 0.8->0.10 where everyone stops hacking on 0.10 for 12 months while the new version is produced. Instead, we should try a process where (after we move to Git and can do this sort of thing easily) people create their own experimental branches. After, when we have enough interesting experimental branches we try and bring them together and call it 0.12.
  • Saw and did interesting demos of in progress hacking: Edward’s HDV camera support. Wim’s basetransform rework. My DVD navigation pieces.

After, went hunting food and stopped at a place where, as soon as they saw us looking at the menu, about 10 nice policemen jumped up from the tables they were sitting at and took their food inside so we could sit outside.
After lunch, Andy declared “Let’s go to Asia!”, and we caught a ferry. Saw police in riot gear and tanks, then discovered the protest they appeared to be prepared for. Had beer, caught another ferry and reclined in the Cushion Street again for more food, beer and nargileh.
Monday
I know this one! It’s Eunuchs!

Up early and off to see Topkapi palace. I found myself constantly assessing features of the palace for ledges, handholds and other climbing points. Clearly played too much PoP.

After, caught the tram to the conference venue. We had a little trouble finding the place, but not as much as some. Met up with some more people, and went to Ghee’s printing BOF and Marina’s Online Desktop widget talk. Went for dinner at 360 restaurant before joining the opening drinks at Reddim night club. Didn’t stay long after the price of beer changed and the temperature went up.
Tuesday
dondurma bressert
Had Turkish icecream for breakfast, then I went to the conference while Jaime did some more sightseeing. Enjoyed the Banshee BOF – Banshee is cool. Inspired to try using Banshee as my primary media player for a while.
Ran into Marc-Andre and Meriam on the way to dinner and they joined Wingo, Zeeshan and us for dinner and chilling followed by more chilling under the Galata bridge with others.
Wednesday
Straight to university in the morning. Went to the conference opening, then Travis’ Soylent talk, the Clutter Guts talk, lunch with Lennart and kfish. As always after a clutter talk, felt inspired to try building a video app in clutter.
After lunch, Leisa’s great keynote on user experience design. Inspired to reimplement Silverback by gluing Istanbul and Cheese together.
Florian’s Elisa talk was interesting, and we talked with him about why Elisa doesn’t work for our video-watching workflow. It still feels like Clutter and Pigment duplicate a lot of effort, but I understand why combining is tricky.
Pippin’s GEGL talk was great. I really liked his ‘semantic desktop prototype’ presentation tool. Inspired to hack pretty things with GEGL.
A second good keynote from Blizzard. I want to see more of his presentations. Interesting stuff about using Firefox to influence the future of the web. Intrigued by their optimisation tools and techniques. Inspired to go optimise something. Drinks on the roof of the university with live music, followed by Cushion Bar.

Thursday
Matt Webb’s keynote, then Rob’s Telepathy talk. I want to see features like ‘Right-click->share my desktop to Joe’ land in releases. Actually, I want to see ‘Right-click->Frag Joe in Quake ]|[‘ first.

Michael’s Moonlight canvas talk. There are lots of people exploring paths for the ‘platform for next generation UIs’ thing – clutter, moonlight, webkit, mozilla. It’ll be interesting to see where they converge.
After lunch, Kristian’s “GTK+ State of the Union” keynote. I like their approach to GTK+ 3.0. The care it takes to preserve ABI/API while still producing new features and working around design shortcomings slows GStreamer development down too.
Went to Colin’s and then Owen’s Online Desktop talks. Glad to hear later that the APOC guys and the Online Desktop guys had some conversations about the commonalities and differences between them.
Collabora boat party

Not sure why Alp’s WebKit talk qualified as a keynote, but maybe because I already saw substantially the same talk at FOSDEM.
Fun with penis-enlargement pumps at dinner. More fun at the Collabora boat party. I came 3rd in the icecream eating competition. Inspired to eat icecream, forever.
After the boat party, we went out for kebabs with a crowd led by Pete and Miguel. Amusing chilli eating competition between Aaron and Jono. A good time was had by all.
Friday
Went to Benjamin’s Flash talk, then Bastien’s update on bluetooth development. Interested in PackageKit, but worried that it has a restrictive (prescriptive?) view of what package management systems should be allowed to ask as they install things. Apparently I’m not the first.
Excellent set of lightning talks after lunch, then the AGM. I liked the inspirational tone of Federico’s Document-Centric-Gnome proposal, but found the paper design unconvincing. I’d like to see a prototype though :)
Google sponsored cup-o-beer-foam party had a good light show, but I can’t find a good pic of it.
Saturday
Aya Sofia

Had a sleep-in and then visited the Blue Mosque and Aya Sofia before going to Alberto and Joerg’s APOC talk. Also went to Alberto’s talk about polishing GTK’s image. Somewhat inspired to create a ‘GNOME stack SDK’ for Windows that’d come with GStreamer, GTK, Pango, etc in one big bundle.
After, went to the Galata tower, and then met up with Collabora guys and more to go to Asia again for dinner. Amusingly, the good restaurant from the guide book turned out to be the place we’d randomly ended up in on Sunday for drinks. Turns out the food is quite good.
Sunday
Caught the plane back to Dublin. Did some good hacking on the way – integrating my recent GStreamer DVD navigation work into playbin so (e.g.) Totem can use it. Check it out:
Totem DVD

Hoping to have it feature complete within the next 2.5 months.

Current Mood: cheerful

We’re having a fairly impromptu GStreamer summit this Sunday in Istanbul.

If you’re interested in discussing where GStreamer is at, and where it might go next, come along :)

Details:

http://gstreamer.freedesktop.org/wiki/IstanbulSummit2008

I just finished uploading the releases of 3 of the big GStreamer modules – the Good, Bad & Ugly plugins modules. These releases are really big, because all 3 haven’t been released since June last year! 8 months of sweet hacking has produced some nice stuff. Among my favourite features of these releases:

  • DVB support – soon to be integrated directly into Totem for TV watching from the comfort of… wherever you’re comfortably using Totem
  • Vastly improved RTSP and RTP support. Sweet.
  • Video4Linux 2 capture support improved – go Cheese!
  • Wrappers for native QuickTime and DirectShow plugins on OS X and Windows. Thanks, Songbird and Fluendo!
  • A bunch of plugins hit the quality standard and graduated from the Bad collection to Good and Ugly. Helping to improve the plugins in the Bad collection is a great way to get involved!
  • OpenGL support became sophisticated enough to warrant its own module, and gst-plugins-gl was born.

There are, of course, lots of other changes – check it out.

Tomorrow evening, Jaime and I will be landing in Belgium for FOSDEM, which is going to be awesome. I’m a big fan of our community gatherings and getting to meet up with y’all. I was sort of looking forward to giving a talk with some other GStreamer guys on Sunday in the GNOME/CrossDesktop devroom, but confusion over whether I said we would or not means the slot has been filled. We may still end up doing something on Saturday instead – we’ll have to see.

We’re planning on checking out Brussels during the day on Friday, and then meeting up with everyone at the pub on Friday night. If you’re coming too, make sure to say hi!

This is a bit of a long post, but bear with me :)

Last week, I found out that OpenSolaris has recently added the Video4Linux2 APIs, and now provides a v4l2 driver for UVC compatible video cameras. That’s slightly funny, of course, because it has the word ‘Linux’ right there in the name. I think it’s really cool to see that the OpenSolaris guys aren’t suffering from NIH syndrome about that though.

To continue, I’ve had my eye on the Logitech Quickcam Pro 9000 for a little while, and it happens to be a UVC camera. This seemed like a good opportunity, so I ordered one. Nice webcam – I’m really pleased with it.

After that arrived, I was playing with V4L2 on one of the machines at work, and put a couple of patches from Brian Cameron into GStreamer‘s gst-plugins-good module to make our v4l2src plugin compile and work. The biggest difference from the Linux V4L2 implementation is that the header is found in /usr/include/sys/videodev2.h instead of /usr/include/linux/videodev2.h. That, and a small fix to gracefully handle ioctls that the OpenSolaris usbvc driver doesn’t implement, and I was up and away.

Coincidentally, Tim Foster was looking for talks for the Irish OpenSolaris User Group meeting last Thursday night. I thought people might enjoy a quick demo, so I volunteered.

I started off by showing that the camera shows up nicely in Ekiga, but I don’t have a SIP account so I wasn’t able to do much more there. Also, I really wanted to show the camera in GStreamer, since I’m a GStreamer guy. Note, for people who want to follow along at home I was using the version of GStreamer’s V4L2 plugin from CVS. It’s in the gst-plugins-good module, which is due for release on the 18th February.

I tried a simple pipeline to show the camera output in a window:

gst-launch v4l2src ! ffmpegcolorspace ! xvimagesink

This captures the video using v4l2 (v4l2src is the name of the element responsible) and feeds it via the ‘ffmpegcolorspace’ element to the ‘xvimagesink’ output. xvimagesink feeds video to the XVideo extension, and ffmpegcolorspace provides any format conversion needed between the frame buffer formats the camera supports and what Xv can handle. Actually, this pipeline didn’t work by default on Tim’s laptop. For some reason, it tried to capture at 1600×1208 pixels, which the camera doesn’t support. It might work for you, not sure.

Anyway, the obvious fix was to explicitly choose a particular resolution to capture at:

gst-launch v4l2src ! video/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! xvimagesink

This is the same as the first pipeline, with the addition of the ‘video/x-raw-yuv,width=320,height=240’ bit, which in GStreamer jargon is called a ‘caps filter’ – it filters down the set of formats that are allowed for data transfer between the two elements. By default, the pipeline will ask v4l2src and ffmpegcolorspace what formats they have in common, and pick one. By filtering it down, I’m forcing it to choose 320×240. Doing that made a little window pop up with the video in it. It looked a little like this, although this one was actually from later:

OSUG USB camera shot

Next, I thought I’d show how to save the incoming video to a file. In this case, as an AVI with MJPEG in it:

gst-launch v4l2src ! video/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=osug-1.avi

The difference here is, instead of feeding the video to an Xv window, it goes through a JPEG encoder (jpegenc), gets put into an AVI container (avimux) and then written to a file (filesink location=$blah). I let it run for 5 or 6 seconds, and then stopped it with ctrl-c. The result looked like this:

osug-1.jpg

Apologies for the blurriness – I was waving the camera around and it didn’t really get a chance to focus.

Alternatively, I could have used something like this to record to Ogg/Theora:

gst-launch v4l2src ! video/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! theoraenc ! oggmux ! filesink location=osug-1.ogg

I can play the recorded video back with a pipeline like:

gst-launch filesrc location=osug-1.avi ! decodebin ! ffmpegcolorspace ! xvimagesink

This uses filesink’s cousin ‘filesrc’ to read from an existing file, feeds it to the nice ‘decodebin’ element – which encapsulates the ability to decode any audio or video file GStreamer has installed plugins for, and then feeds the result (a sequence of raw YUV video buffers) to ‘ffmpegcolorspace ! xvimagesink’ for colorspace conversion and display in a window.

Anyone who watched the clip might be wondering why there is a guy in the front holding up a bright orange folder. For the next trick, I wanted to show the nice ‘alpha’ plugin. By default, alpha simply adds an alpha channel with a given opacity to the video as it passes through. However, it also has a green-screen mode. Or, in this case, orange-screening.

First, I played the video I captured in Totem, and paused it at a suitable frame. Then I used the ‘Take Screenshot’ item in the Edit menu to save it out as a PNG – which actually became the first photo above. Next, I opened the PNG in the GIMP and used the eyedropper to isolate the RGB triple for the orange colour: somewhere around Red=244, Green=161, Blue=11.

At this point, I used live video for the rest of the demo, but I didn’t think to capture any of it. Instead, I’ll use the short clip I captured earlier as a canned re-enactment. So, I ran the video through a pipeline like this (using v4l2src etc instead of filesrc ! decodebin):

gst-launch filesrc location=osug-1.avi ! decodebin ! alpha method=custom target-r=245 target-g=161 target-b=11 angle=10 ! videomixer ! ffmpegcolorspace ! xvimagesink

This pipeline uses the alpha plugin in ‘custom’ (custom colour) mode, to add an alpha channel based on the bright orange colour-key, and then uses ‘videomixer’ to blend a default transparent looking background in behind it. Here:

OSUG demo 2 - colourkeying

The colour-key effect breaks up a little in places, because the skin tones and the wood of the desk get a little too close to the orange of the folder. A better choice of colour and filming conditions are really needed to do it well :)

And now for the really tricky bit:

gst-launch filesrc location=osug-1.avi ! decodebin ! alpha method=custom target-r=245 target-g=161 target-b=11 angle=10 ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink \
filesrc location=bg.avi ! decodebin ! ffmpegcolorspace ! mix.

This pipeline adds two things to the previous one:
1) It adds a ‘name=mix’ property assignment to the instance of the videomixer element. This makes it easier to refer to the instance by name later in the pipeline.
2) It adds a second filesrc, reading in a file named bg.avi, decoding it and then feeding it into the element named ‘mix’ – which is the instance of the videomixer element. Adding a period to the name lets GStreamer know that the connection should be made to an existing instance of something with that name, rather than creating a new instance of some element as all the other links do.

To save the output, I replaced the ‘xvimagesink’ with ‘jpegenc ! avimux ! filesink location=osug-3.avi’ to produce this:

OSUG demo 3 - colourkeying and compositing

There you go – a quick demo of doing a couple of fun things with GStreamer that don’t require writing any code. Have fun!
