{"id":40,"date":"2008-02-02T12:59:49","date_gmt":"2008-02-02T01:59:49","guid":{"rendered":"http:\/\/noraisin.net\/~jan\/diary\/?p=40"},"modified":"2008-02-02T12:59:49","modified_gmt":"2008-02-02T01:59:49","slug":"fun-things-to-do-with-gstreamer-command-lines","status":"publish","type":"post","link":"https:\/\/noraisin.net\/diary\/?p=40","title":{"rendered":"Fun things to do with GStreamer command-lines"},"content":{"rendered":"<p>This is a bit of a long post, but bear with me \ud83d\ude42<\/p>\n<p>Last week, I found out that OpenSolaris has recently added the Video4Linux2 APIs, and now provides a v4l2 driver for UVC compatible video cameras. That&#8217;s slightly funny, of course, because it has the word &#8216;Linux&#8217; right there in the name. I think it&#8217;s really cool to see that the OpenSolaris guys aren&#8217;t suffering from <a href=\"http:\/\/en.wikipedia.org\/wiki\/Not_Invented_Here\">NIH syndrome<\/a> about that though.<\/p>\n<p>To continue, I&#8217;ve had my eye on the <a href=\"http:\/\/www.logitech.com\/index.cfm\/38\/3056&#038;cl=us,en\">Logitech Quickcam Pro 9000<\/a> for a little while, and it happens to be a UVC camera. This seemed like a good opportunity, so I ordered one. Nice webcam &#8211; I&#8217;m really pleased with it.<\/p>\n<p>After that arrived, I was playing with V4L2 on one of the machines at work, and put a couple of patches from Brian Cameron into <a href=\"http:\/\/gstreamer.freedesktop.org\">GStreamer<\/a>&#8216;s gst-plugins-good module to make our v4l2src plugin compile and work. The biggest difference from the Linux V4L2 implementation is that the header is found in \/usr\/include\/sys\/videodev2.h instead of \/usr\/include\/linux\/videodev2.h. 
That, and a small fix to gracefully handle ioctls that the OpenSolaris usbvc driver doesn&#8217;t implement, and I was up and away.<\/p>\n<p>Coincidentally, <a href=\"http:\/\/blogs.sun.com\/timf\/\">Tim Foster<\/a> was looking for talks for the Irish OpenSolaris User Group meeting last Thursday night. I thought people might enjoy a quick demo, so I volunteered.<\/p>\n<p>I started off by showing that the camera shows up nicely in Ekiga, but I don&#8217;t have a SIP account so I wasn&#8217;t able to do much more there. Also, I really wanted to show the camera in GStreamer, since I&#8217;m a GStreamer guy. Note, for people who want to follow along at home, I was using the version of GStreamer&#8217;s V4L2 plugin from CVS. It&#8217;s in the gst-plugins-good module, which is due for <a href=\"http:\/\/gstreamer.freedesktop.org\/wiki\/ReleasePlanning2008\">release<\/a> on the 18th February.<\/p>\n<p>I tried a simple pipeline to show the camera output in a window:<\/p>\n<p>gst-launch v4l2src ! ffmpegcolorspace ! xvimagesink<\/p>\n<p>This captures the video using v4l2 (v4l2src is the name of the element responsible), and feeds it via the &#8216;ffmpegcolorspace&#8217; element to the &#8216;xvimagesink&#8217; output. xvimagesink feeds video to the XVideo extension, and ffmpegcolorspace provides for any format conversion between the frame buffer formats the camera supports, and what Xv can handle. Actually, this pipeline didn&#8217;t work by default on Tim&#8217;s laptop. For some reason, it tried to capture at 1600&#215;1208 pixels, which the camera doesn&#8217;t support. It might work for you, not sure.<\/p>\n<p>Anyway, the obvious fix was to explicitly choose a particular resolution to capture at:<\/p>\n<p>gst-launch v4l2src ! video\/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! 
xvimagesink<\/p>\n<p>This is the same as the first pipeline, with the addition of the &#8216;video\/x-raw-yuv,width=320,height=240&#8217; bit, which in GStreamer jargon is called a &#8216;caps filter&#8217; &#8211; it filters down the set of formats that are allowed for data transfer between the two elements. By default, the pipeline will ask v4l2src and ffmpegcolorspace what formats they have in common, and pick one. By filtering it down, I&#8217;m forcing it to choose 320&#215;240. Doing that made a little window pop up with the video in it. It looked a little like this, although this one was actually from later:<\/p>\n<p><a href='http:\/\/noraisin.net\/~jan\/diary\/wp-content\/uploads\/2008\/02\/osug-photo.jpg' title='OSUG USB camera shot'><img src='http:\/\/noraisin.net\/~jan\/diary\/wp-content\/uploads\/2008\/02\/osug-photo.thumbnail.jpg' alt='OSUG USB camera shot' \/><\/a><\/p>\n<p>Next, I thought I&#8217;d show how to save the incoming video to a file. In this case, as an AVI with MJPEG in it:<\/p>\n<p>gst-launch v4l2src ! video\/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=osug-1.avi<\/p>\n<p>The difference here is that, instead of feeding the video to an Xv window, it goes through a JPEG encoder (jpegenc), gets put into an AVI container (avimux) and then gets written to a file (filesink location=$blah). I let it run for 5 or 6 seconds, and then stopped it with ctrl-c. The result looked like this:<\/p>\n<p><a href='http:\/\/people.freedesktop.org\/~thaytan\/osug-1.avi'><img src='http:\/\/noraisin.net\/~jan\/diary\/wp-content\/uploads\/2008\/02\/osug-1.jpg' alt='osug-1.jpg' \/><\/a><\/p>\n<p>Apologies for the blurriness &#8211; I was waving the camera around and it didn&#8217;t really get a chance to focus.<\/p>\n<p>Alternatively, I could have used something like this to record to Ogg\/Theora:<\/p>\n<p>gst-launch v4l2src ! video\/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! theoraenc ! oggmux ! 
filesink location=osug-1.ogg<\/p>\n<p>I can play the recorded video back with a pipeline like:<\/p>\n<p>gst-launch filesrc location=osug-1.avi ! decodebin ! ffmpegcolorspace ! xvimagesink<\/p>\n<p>This uses filesink&#8217;s cousin &#8216;filesrc&#8217; to read from an existing file, feeds it to the nice &#8216;decodebin&#8217; element &#8211; which encapsulates the ability to decode any audio or video file GStreamer has installed plugins for, and then feeds the result (a sequence of raw YUV video buffers) to &#8216;ffmpegcolorspace ! xvimagesink&#8217; for colorspace conversion and display in a window.<\/p>\n<p>Anyone who watched the clip might be wondering why there is a guy in the front holding up a bright orange folder. For the next trick, I wanted to show the nice &#8216;alpha&#8217; plugin. By default, alpha simply adds an alpha channel with a given opacity to the video as it passes through. However, it also has a green-screen mode. Or, in this case, orange-screening.<\/p>\n<p>First, I played the video I captured in <a href=\"http:\/\/www.gnome.org\/projects\/totem\/\">totem<\/a>, and paused it at a suitable frame. Then I used the &#8216;Take Screenshot&#8217; item in the Edit menu to save it out as a png &#8211; which actually became the first photo above. Next, I opened the png in the <a href=\"http:\/\/www.gimp.org\/\">Gimp<\/a> and used the eyedropper to isolate the RGB triple for the orange colour. Somewhere around Red=244, Green=161, Blue=11.<\/p>\n<p>At this point, I used live video for the rest of the demo, but I didn&#8217;t think to capture any of it. Instead, I&#8217;ll use the short clip I captured earlier as a canned re-enactment. So, I ran the video through a pipeline like this (using v4l2src etc instead of filesrc ! decodebin):<\/p>\n<p>gst-launch filesrc location=osug-1.avi ! decodebin ! alpha method=custom target-r=245 target-g=161 target-b=11 angle=10 ! videomixer ! ffmpegcolorspace ! 
xvimagesink<\/p>\n<p>This pipeline uses the alpha plugin in &#8216;custom&#8217; (custom colour) mode, to add an alpha channel based on the bright orange colour-key, and then uses &#8216;videomixer&#8217; to blend a default transparent-looking background in behind it. Here:<\/p>\n<p><a href='http:\/\/people.freedesktop.org\/~thaytan\/osug-2.avi'><img src='http:\/\/noraisin.net\/~jan\/diary\/wp-content\/uploads\/2008\/02\/osug-2.jpg' alt='OSUG demo 2 - colourkeying' \/><\/a><\/p>\n<p>The colour-key effect breaks up a little in places, because the skin tones and the wood of the desk get a little too close to the orange of the folder. A better choice of colour and better filming conditions are really needed to do it well \ud83d\ude42<\/p>\n<p>And now for the really tricky bit:<\/p>\n<p>gst-launch filesrc location=osug-1.avi ! decodebin ! alpha method=custom target-r=245 target-g=161 target-b=11 angle=10 ! videomixer name=mix ! ffmpegcolorspace ! xvimagesink \\<br \/>\nfilesrc location=bg.avi ! decodebin ! ffmpegcolorspace ! mix.<\/p>\n<p>This pipeline adds two things to the previous one:<br \/>\n1) It adds a &#8216;name=mix&#8217; property assignment to the instance of the videomixer element. This makes it easier to refer to the instance by name later in the pipeline.<br \/>\n2) It adds a second filesrc, reading in a file named <a href=\"http:\/\/people.freedesktop.org\/~thaytan\/bg.avi\">bg.avi<\/a>, decoding it and then feeding it into the element named &#8216;mix&#8217; &#8211; which is the instance of the videomixer element. Adding a period to the name lets GStreamer know that the connection should be made to an existing instance of something with that name, rather than creating a new instance of some element as all the other links do.<\/p>\n<p>To save the output, I replaced the &#8216;xvimagesink&#8217; with &#8216;jpegenc ! avimux ! 
filesink location=osug-3.avi&#8217; to produce this:<\/p>\n<p><a href='http:\/\/people.freedesktop.org\/~thaytan\/osug-3.avi'><img src='http:\/\/noraisin.net\/~jan\/diary\/wp-content\/uploads\/2008\/02\/osug-3.jpg' alt='OSUG demo 3 - colourkeying and compositing' \/><\/a><\/p>\n<p>There you go &#8211; a quick demo of doing a couple of fun things with GStreamer that don&#8217;t require writing any code. Have fun!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This is a bit of a long post, but bear with me \ud83d\ude42 Last week, I found out that OpenSolaris has recently added the Video4Linux2 APIs, and now provides a v4l2 driver for UVC compatible video cameras. That&#8217;s slightly funny, of course, because it has the word &#8216;Linux&#8217; right there in the name. I think &hellip; <a href=\"https:\/\/noraisin.net\/diary\/?p=40\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Fun things to do with GStreamer command-lines&#8221;<\/span><\/a><\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[4],"tags":[],"class_list":["post-40","post","type-post","status-publish","format-standard","hentry","category-gstreamer"],"_links":{"self":[{"href":"https:\/\/noraisin.net\/diary\/index.php?rest_route=\/wp\/v2\/posts\/40","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/noraisin.net\/diary\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/noraisin.net\/diary\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/noraisin.net\/diary\/index.php?rest_route=\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/noraisin.net\/diary\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=40"}],"version-history":[{"count":0,"href":"https:\/\/noraisin.net\/diary\/index.php?rest_route=\/wp\/v2\/posts\/40\/revisions"}],"wp:attachme
nt":[{"href":"https:\/\/noraisin.net\/diary\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=40"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/noraisin.net\/diary\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=40"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/noraisin.net\/diary\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=40"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}