Basically you build up a chain of elements, linking your sources to sinks through pads.
GStreamer comes with two handy little command-line tools:
- gst-launch
Lets you test your chain (before you write it into your program and compile it, just to see it fail :-/)
- gst-inspect
Lets you see the options/parameters of each element.
A couple of words on gst-inspect:
- Use gst-inspect without any arguments to see all available elements, and gst-inspect followed by an element name to see that element's options (capabilities); see the example just below.
- You can't always trust the capabilities of source elements, as these may depend on the actual source (example of an untrustworthy source element: v4l2src). Such a source element needs to be either paused or playing before you can see what it actually offers.
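For example, to inspect a specific element (videotestsrc here is just an arbitrary element, pick whichever one you care about), you would run:
gst-inspect-0.10 videotestsrc
The interesting part for pipeline building is the "Pad Templates" section near the bottom, which lists the capabilities of each pad.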
All the elements in your chain have one or more pads that you can use to link them to other elements. Each pad has a set of capabilities, and when two pads are linked those capabilities are negotiated to see if they are compatible.
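On the command line the linking is done with the ! operator, so a pipeline generally takes this form (the element names here are only placeholders, not real elements):
gst-launch-0.10 some-source ! some-filter ! some-sink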
An example just showing a test video stream on screen:
gst-launch-0.10 videotestsrc ! ximagesink
To specify a width and height, we can pass the stream through another set of capabilities (a caps filter), forcing the negotiation between the elements to settle on our desired width and height:
gst-launch-0.10 videotestsrc ! video/x-raw-rgb,width=640,height=480 ! ximagesink
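Other capabilities can be forced the same way; for instance, assuming your sink can handle it (ximagesink accepts a wide framerate range, as you'll see below), you should also be able to request a specific framerate in the same caps string:
gst-launch-0.10 videotestsrc ! video/x-raw-rgb,width=640,height=480,framerate=15/1 ! ximagesink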
The videotestsrc element, being a test source, is very generous about the formats it will output.
ximagesink is not; it only accepts video/x-raw-rgb.
You can tell by using gst-inspect:
[urup@c4a012 ~]$ gst-inspect-0.10 ximagesink
Factory Details:
Long name: Video sink
Class: Sink/Video
Description: A standard X based videosink
Author(s): Julien Moutte
Rank: secondary (128)
Plugin Details:
Name: ximagesink
Description: X11 video output element based on standard Xlib calls
Filename: /usr/lib/gstreamer-0.10/libgstximagesink.so
Version: 0.10.36
License: LGPL
Source module: gst-plugins-base
Source release date: 2012-02-20
Binary package: GStreamer Base Plugins (Archlinux)
Origin URL: http://www.archlinux.org/
GObject
 +----GstObject
       +----GstElement
             +----GstBaseSink
                   +----GstVideoSink
                         +----GstXImageSink
Implemented Interfaces:
GstImplementsInterface
GstNavigation
GstXOverlay
Pad Templates:
SINK template: 'sink'
Availability: Always
Capabilities:
video/x-raw-rgb
framerate: [ 0/1, 2147483647/1 ]
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
...
And hence we set the video format accordingly, to rgb.
One of the more frequently used elements when working with video is ffmpegcolorspace. It provides conversion between different colorspace formats.
We could therefore rewrite the above as:
gst-launch-0.10 videotestsrc ! video/x-raw-yuv, width=640, height=480 ! ffmpegcolorspace ! ximagesink
It still works, even though the source format doesn't match what ximagesink accepts, because ffmpegcolorspace converts it for us.
Note: Whenever possible you should make sure your formats match up, instead of converting them. This is especially true if you're on a platform with limited capabilities, or you find your playback is lagging.
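As an aside (just a suggestion, not something from the pipelines above): if your X server supports the Xv extension, the xvimagesink element can take yuv directly, so the yuv test stream could be shown without any conversion element at all:
gst-launch-0.10 videotestsrc ! video/x-raw-yuv,width=640,height=480 ! xvimagesink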
If you want to know what formats are being used, you can use the -v argument to gst-launch:
[urup@c4a012 ~]$ gst-launch-0.10 -v videotestsrc ! video/x-raw-yuv, width=640, height=480 ! ffmpegcolorspace ! ximagesink
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw-yuv, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, format=(fourcc)YUY2, color-matrix=(string)sdtv, chroma-site=(string)mpeg2
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, format=(fourcc)YUY2, color-matrix=(string)sdtv, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-yuv, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, format=(fourcc)YUY2, color-matrix=(string)sdtv, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:src: caps = video/x-raw-rgb, bpp=(int)32, depth=(int)24, endianness=(int)4321, red_mask=(int)65280, green_mask=(int)16711680, blue_mask=(int)-16777216, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:sink: caps = video/x-raw-yuv, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, format=(fourcc)YUY2, color-matrix=(string)sdtv, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstXImageSink:ximagesink0.GstPad:sink: caps = video/x-raw-rgb, bpp=(int)32, depth=(int)24, endianness=(int)4321, red_mask=(int)65280, green_mask=(int)16711680, blue_mask=(int)-16777216, width=(int)640, height=(int)480, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstXImageSink:ximagesink0: Output window was closed
...
Here you can see that the source element (videotestsrc) delivers video in yuv format.
The ffmpegcolorspace element receives video in yuv format (sink), but sends it out in rgb format (src).
Ok, simple enough (once you get the hang of it).
Now to the point of this whole blog post...
How to tell what formats (capabilities) are supported by any given source element:
Ok, so GStreamer is a powerful thing, and its error messages are actually understandable, so you know where to go and fix things when it breaks.
First up, as stated before, the source needs to be connected and running before you can actually know this!
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-yuv,width=640,height=480,format=\(fourcc\)I420 ! appsink
I'm using a webcam (uvcvideo), which is connected as /dev/video0.
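(If you just want a quick visual sanity check that the camera works before digging into caps, something like the following should do it, assuming the camera can deliver yuv at 640x480; ffmpegcolorspace handles whatever ximagesink needs:)
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-yuv,width=640,height=480 ! ffmpegcolorspace ! ximagesink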
Set the environment variable GST_DEBUG:
export GST_DEBUG=v4l2src:5
You should of course replace v4l2src with the element name of your source!
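The debug category usually has the same name as the element, but if in doubt you can list all available categories with the standard debug option:
gst-launch-0.10 --gst-debug-help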
Next, make your terminal window BIG, and run your pipeline as above.
If you're connected to another device over a slow line (like serial), I suggest redirecting the output to a file using "2>", as shown below.
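Putting it all together, with the debug output going to a file (the filename is just an example):
export GST_DEBUG=v4l2src:5
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-yuv,width=640,height=480,format=\(fourcc\)I420 ! appsink 2> v4l2-debug.log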
Now, I'm not going to show you the entire output of this, but somewhere in there, there's this (and a lot more like it):
...
0:00:00.137710333 30105 0x8c05120 DEBUG v4l2src gstv4l2src.c:450:gst_v4l2src_negotiate:
...
These are the actual supported formats from the camera!
Note that before this point it may print all the capabilities it thinks it supports (which isn't really worth anything)!
You can also read which formats end up being used from this log; just look a little further down.
There's also the official GStreamer documentation, which is worth reading:
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/gst-running.html