# Basic tutorial 10: GStreamer tools
# Goal
GStreamer (and the GStreamer SDK) come with a set of tools which range
from handy to absolutely essential. There is no code in this tutorial,
just sit back and relax, and we will teach you:
- How to build and run GStreamer pipelines from the command line,
without using C at all\!
- How to find out what GStreamer elements you have available and their
capabilities.
- How to discover the internal structure of media files.
# Introduction
These tools are available in the bin directory of the SDK. You need to
move to this directory to execute them, because it is not added to the
system’s `PATH` environment variable (to avoid polluting it too much).
Just open a terminal (or console window) and go to the `bin` directory
of your GStreamer SDK installation (Read again the [Installing the
SDK](Installing%2Bthe%2BSDK.html) section to find out where this is),
and you are ready to start typing the commands given in this tutorial.
> **Note:** On Linux, though, you can use the provided
> `/opt/gstreamer-sdk/bin/gst-sdk-shell` script to enter the GStreamer SDK
> shell environment, in which the `bin` directory is in the path. In this
> environment, you can use the GStreamer tools from any folder.
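For example (assuming the default Linux install location mentioned in the
note above):
```
# Enter the GStreamer SDK shell environment
/opt/gstreamer-sdk/bin/gst-sdk-shell
# The versioned tools are now on the PATH and can be run from any folder
gst-launch-0.10 --version
```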
In order to allow multiple versions of GStreamer to coexist in the
same system, these tools are versioned, that is, a GStreamer version
number is appended to their name. This version of the SDK is based on
GStreamer 0.10, so the tools are called `gst-launch-0.10`,
`gst-inspect-0.10` and `gst-discoverer-0.10`.
# `gst-launch`
This tool accepts a textual description of a pipeline, instantiates it,
and sets it to the PLAYING state. It allows you to quickly check if a
given pipeline works, before going through the actual implementation
using GStreamer API calls.
Bear in mind that it can only create simple pipelines. In particular, it
can only simulate the interaction of the pipeline with the application
up to a certain level. In any case, it is extremely handy to test
pipelines quickly, and is used by GStreamer developers around the world
on a daily basis.
Please note that `gst-launch` is primarily a debugging tool for
developers. You should not build applications on top of it. Instead, use
the `gst_parse_launch()` function of the GStreamer API as an easy way to
construct pipelines from pipeline descriptions.
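For reference, here is a minimal sketch (using the 0.10 API and the same
test-pattern pipeline that appears in the next section) of how an
application can hand a textual description to `gst_parse_launch()`:
```
#include <gst/gst.h>

int main (int argc, char *argv[]) {
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;
  GError *error = NULL;

  gst_init (&argc, &argv);

  /* Build the pipeline from a textual description, just like gst-launch does */
  pipeline = gst_parse_launch ("videotestsrc ! ffmpegcolorspace ! autovideosink", &error);
  if (pipeline == NULL) {
    g_printerr ("Could not build pipeline: %s\n", error->message);
    g_error_free (error);
    return -1;
  }

  /* Start playing and wait for an error or end-of-stream */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  /* Free resources */
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```
The string passed to `gst_parse_launch()` is exactly what you would type
after `gst-launch-0.10` on the command line.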
Although the rules to construct pipeline descriptions are very simple,
the concatenation of multiple elements can quickly make such
descriptions resemble black magic. Fear not, for everyone learns the
`gst-launch` syntax, eventually.
The command line for gst-launch consists of a list of options followed
by a PIPELINE-DESCRIPTION. Some simplified instructions are given next;
see the complete documentation at [the reference page](gst-launch.html)
for `gst-launch`.
#### Elements
In simple form, a PIPELINE-DESCRIPTION is a list of element types
separated by exclamation marks (\!). Go ahead and type in the following
command:
```
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! autovideosink
```
You should see a window with an animated video pattern. Use CTRL+C on
the terminal to stop the program.
This instantiates a new element of type `videotestsrc` (an element which
generates a sample video pattern), an `ffmpegcolorspace` (an element
which does color space conversion, making sure other elements can
understand each other), and an `autovideosink` (a window to which video
is rendered). Then, GStreamer tries to link the output of each element
to the input of the element appearing on its right in the description.
If more than one input or output Pad is available, the Pad Caps are used
to find two compatible Pads.
#### Properties
Properties may be appended to elements, in the form
*property=value* (multiple properties can be specified, separated by
spaces). Use the `gst-inspect` tool (explained next) to find out the
available properties for an element.
```
gst-launch-0.10 videotestsrc pattern=11 ! ffmpegcolorspace ! autovideosink
```
You should see a static video pattern, made of circles.
#### Named elements
Elements can be named using the `name` property. In this way, complex
pipelines involving branches can be created. Names allow linking to
elements created previously in the description, and are indispensable
when using elements with multiple output pads, like demuxers or tees,
for example.
Named elements are referred to using their name followed by a
dot.
```
gst-launch-0.10 videotestsrc ! ffmpegcolorspace ! tee name=t ! queue ! autovideosink t. ! queue ! autovideosink
```
You should see two video windows, showing the same sample video pattern.
If you see only one, try to move it, since it is probably on top of the
second window.
This example instantiates a `videotestsrc`, linked to an
`ffmpegcolorspace`, linked to a `tee` (remember from [Basic tutorial 7:
Multithreading and Pad
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html) that
a `tee` copies to each of its output pads everything coming through its
input pad). The `tee` is named simply ‘t’ (using the `name` property)
and then linked to a `queue` and an `autovideosink`. The same `tee` is
referred to using ‘t.’ (mind the dot) and then linked to a second
`queue` and a second `autovideosink`.
To learn why the queues are necessary, read [Basic tutorial 7:
Multithreading and Pad
Availability](Basic%2Btutorial%2B7%253A%2BMultithreading%2Band%2BPad%2BAvailability.html).
#### Pads
Instead of letting GStreamer choose which Pad to use when linking two
elements, you may want to specify the Pads directly. You can do this by
adding a dot plus the Pad name after the name of the element (it must be
a named element). Learn the names of the Pads of an element by using
the `gst-inspect` tool.
This is useful, for example, when you want to retrieve one particular
stream out of a
demuxer:
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_00 ! matroskamux ! filesink location=sintel_video.mkv
```
This fetches a media file from the internet using `souphttpsrc` , which
is in webm format (a special kind of Matroska container, see [Basic
tutorial 2: GStreamer
concepts](Basic%2Btutorial%2B2%253A%2BGStreamer%2Bconcepts.html)). We
then open the container using `matroskademux` . This media contains both
audio and video, so `matroskademux` will create two output Pads, named
`video_00` and `audio_00` . We link `video_00` to a `matroskamux` element
to re-pack the video stream into a new container, and finally link it to
a `filesink` , which will write the stream into a file named
"sintel\_video.mkv" (the `location` property specifies the name of the
file).
All in all, we took a webm file, stripped it of audio, and generated a
new matroska file with the video. If we wanted to keep only the
audio:
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d d.audio_00 ! vorbisparse ! matroskamux ! filesink location=sintel_audio.mka
```
The `vorbisparse` element is required to extract some information from
the stream and put it in the Pad Caps, so the next element,
`matroskamux` , knows how to deal with the stream. In the case of video
this was not necessary, because `matroskademux` already extracted this
information and added it to the Caps.
Note that in the above two examples no media has been decoded or played.
We have just moved from one container to another (demultiplexing and
re-multiplexing again).
#### Caps filters
When an element has more than one output pad, it might happen that the
link to the next element is ambiguous: the next element may have more
than one compatible input pad, or its input pad may be compatible with
the Pad Caps of all the output pads. In these cases GStreamer will link
using the first pad that is available, which pretty much amounts to
saying that GStreamer will choose one output pad at random.
Consider the following
pipeline:
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux ! filesink location=test
```
This is the same media file and demuxer as in the previous example. The
input Pad Caps of `filesink` are `ANY` , meaning that it can accept any
kind of media. Which one of the two output pads of `matroskademux` will
be linked against the filesink? `video_00` or `audio_00` ? You cannot
know.
You can remove this ambiguity, though, by using named pads, as in the
previous sub-section, or by using **Caps
Filters**:
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux ! video/x-vp8 ! matroskamux ! filesink location=sintel_video.mkv
```
A Caps Filter behaves like a pass-through element which does nothing and
only accepts media with the given Caps, effectively resolving the
ambiguity. In this example, between `matroskademux` and `matroskamux` we
added a `video/x-vp8` Caps Filter to specify that we are interested in
the output pad of `matroskademux` which can produce this kind of video.
To find out the Caps an element accepts and produces, use the
`gst-inspect` tool. To find out the Caps contained in a particular file,
use the `gst-discoverer` tool. To find out the Caps an element is
producing for a particular pipeline, run `gst-launch` as usual, with the
`-v` option to print Caps information.
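For example, re-running the very first pipeline of this tutorial with
`-v` also prints, among other messages, the Caps negotiated on each pad:
```
gst-launch-0.10 -v videotestsrc ! ffmpegcolorspace ! autovideosink
```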
#### Examples
Play a media file using `playbin2` (as in [Basic tutorial 1: Hello
world\!](Basic%2Btutorial%2B1%253A%2BHello%2Bworld%2521.html)):
```
gst-launch-0.10 playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm
```
A fully operational playback pipeline, with audio and video (more or less
the same pipeline that `playbin2` will create
internally):
```
gst-launch-0.10 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d ! queue ! vp8dec ! ffmpegcolorspace ! autovideosink d. ! queue ! vorbisdec ! audioconvert ! audioresample ! autoaudiosink
```
A transcoding pipeline, which opens the webm container and decodes both
streams (via uridecodebin), then re-encodes the audio and video branches
with a different codec, and puts them back together in an Ogg container
(just for the sake of
it).
```
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm name=d ! queue ! theoraenc ! oggmux name=m ! filesink location=sintel.ogg d. ! queue ! audioconvert ! audioresample ! flacenc ! m.
```
A rescaling pipeline. The `videoscale` element performs a rescaling
operation whenever the frame size is different in the input and the
output caps. The output caps are set by the Caps Filter to
320x200.
```
gst-launch-0.10 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! queue ! videoscale ! video/x-raw-yuv,width=320,height=200 ! ffmpegcolorspace ! autovideosink
```
This short description of `gst-launch` should be enough to get you
started. Remember that you have the [complete documentation available
here](gst-launch.html).
# `gst-inspect`
This tool has three modes of operation:
- Without arguments, it lists all available element types, that is,
the types you can use to instantiate new elements.
- With a file name as an argument, it treats the file as a GStreamer
plugin, tries to open it, and lists all the elements described
inside.
- With a GStreamer element name as an argument, it lists all
information regarding that element.
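For instance, the first two modes are invoked like this (the plugin file
name is only an illustrative path; the actual location depends on your
platform and installation, as the `Filename` field in the output below
shows):
```
# List all available element types
gst-inspect-0.10
# List the elements contained in a particular plugin file
gst-inspect-0.10 /opt/gstreamer-sdk/lib/gstreamer-0.10/libgstvp8.so
```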
Let's see an example of the third mode:
```
gst-inspect-0.10 vp8dec
Factory Details:
Long name: On2 VP8 Decoder
Class: Codec/Decoder/Video
Description: Decode VP8 video streams
Author(s): David Schleef <ds@entropywave.com>
Rank: primary (256)
Plugin Details:
Name: vp8
Description: VP8 plugin
Filename: I:\gstreamer-sdk\2012.5\x86\lib\gstreamer-0.10\libgstvp8.dll
Version: 0.10.23
License: LGPL
Source module: gst-plugins-bad
Source release date: 2012-02-20
Binary package: GStreamer Bad Plug-ins (GStreamer SDK)
Origin URL: http://www.gstreamer.com
GObject
+----GstObject
+----GstElement
+----GstBaseVideoCodec
+----GstBaseVideoDecoder
+----GstVP8Dec
Pad Templates:
SRC template: 'src'
Availability: Always
Capabilities:
video/x-raw-yuv
format: I420
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
SINK template: 'sink'
Availability: Always
Capabilities:
video/x-vp8
Element Flags:
no flags set
Element Implementation:
Has change_state() function: gst_base_video_decoder_change_state
Has custom save_thyself() function: gst_element_save_thyself
Has custom restore_thyself() function: gst_element_restore_thyself
Element has no clocking capabilities.
Element has no indexing capabilities.
Element has no URI handling capabilities.
Pads:
SRC: 'src'
Implementation:
Has custom eventfunc(): gst_base_video_decoder_src_event
Has custom queryfunc(): gst_base_video_decoder_src_query
Provides query types:
(1): position (Current position)
(2): duration (Total duration)
(8): convert (Converting between formats)
Has custom iterintlinkfunc(): gst_pad_iterate_internal_links_default
Has getcapsfunc(): gst_pad_get_fixed_caps_func
Has acceptcapsfunc(): gst_pad_acceptcaps_default
Pad Template: 'src'
SINK: 'sink'
Implementation:
Has chainfunc(): gst_base_video_decoder_chain
Has custom eventfunc(): gst_base_video_decoder_sink_event
Has custom queryfunc(): gst_base_video_decoder_sink_query
Has custom iterintlinkfunc(): gst_pad_iterate_internal_links_default
Has setcapsfunc(): gst_base_video_decoder_sink_setcaps
Has acceptcapsfunc(): gst_pad_acceptcaps_default
Pad Template: 'sink'
Element Properties:
name : The name of the object
flags: readable, writable
String. Default: "vp8dec0"
post-processing : Enable post processing
flags: readable, writable
Boolean. Default: false
post-processing-flags: Flags to control post processing
flags: readable, writable
Flags "GstVP8DecPostProcessingFlags" Default: 0x00000003, "demacroblock+deblock"
(0x00000001): deblock - Deblock
(0x00000002): demacroblock - Demacroblock
(0x00000004): addnoise - Add noise
deblocking-level : Deblocking level
flags: readable, writable
Unsigned Integer. Range: 0 - 16 Default: 4
noise-level : Noise level
flags: readable, writable
Unsigned Integer. Range: 0 - 16 Default: 0
```
The most relevant sections are:
- Pad Templates: This lists all the kinds of Pads this
element can have, along with their capabilities. This is where you
look to find out if an element can link with another one. In this
case, it has only one sink pad template, accepting only
`video/x-vp8` (encoded video data in VP8 format) and only one source
pad template, producing `video/x-raw-yuv` (decoded video data).
- Element Properties: This lists the properties of the
element, along with their type and accepted values.
For more information, you can check the [documentation
page](gst-inspect.html) of `gst-inspect`.
# `gst-discoverer`
This tool is a wrapper around the `GstDiscoverer` object shown in [Basic
tutorial 9: Media information
gathering](Basic%2Btutorial%2B9%253A%2BMedia%2Binformation%2Bgathering.html).
It accepts a URI from the command line and prints all information
regarding the media that GStreamer can extract. It is useful to find out
what container and codecs have been used to produce the media, and
therefore what elements you need to put in a pipeline to play it.
Use `gst-discoverer --help` to obtain the list of available options,
which basically control the amount of verbosity of the output.
Let's see an
example:
```
gst-discoverer-0.10 http://docs.gstreamer.com/media/sintel_trailer-480p.webm -v
Analyzing http://docs.gstreamer.com/media/sintel_trailer-480p.webm
Done discovering http://docs.gstreamer.com/media/sintel_trailer-480p.webm
Topology:
container: video/webm
audio: audio/x-vorbis, channels=(int)2, rate=(int)48000
Codec:
audio/x-vorbis, channels=(int)2, rate=(int)48000
Additional info:
None
Language: en
Channels: 2
Sample rate: 48000
Depth: 0
Bitrate: 80000
Max bitrate: 0
Tags:
taglist, language-code=(string)en, container-format=(string)Matroska, audio-codec=(string)Vorbis, application-name=(string)ffmpeg2theora-0.24, encoder=(string)"Xiph.Org\ libVorbis\ I\ 20090709", encoder-version=(uint)0, nominal-bitrate=(uint)80000, bitrate=(uint)80000;
video: video/x-vp8, width=(int)854, height=(int)480, framerate=(fraction)25/1
Codec:
video/x-vp8, width=(int)854, height=(int)480, framerate=(fraction)25/1
Additional info:
None
Width: 854
Height: 480
Depth: 0
Frame rate: 25/1
Pixel aspect ratio: 1/1
Interlaced: false
Bitrate: 0
Max bitrate: 0
Tags:
taglist, video-codec=(string)"VP8\ video", container-format=(string)Matroska;
Properties:
Duration: 0:00:52.250000000
Seekable: yes
Tags:
video codec: On2 VP8
language code: en
container format: Matroska
application name: ffmpeg2theora-0.24
encoder: Xiph.Org libVorbis I 20090709
encoder version: 0
audio codec: Vorbis
nominal bitrate: 80000
bitrate: 80000
```
# Conclusion
This tutorial has shown:
- How to build and run GStreamer pipelines from the command line using
the `gst-launch` tool.
- How to find out what GStreamer elements you have available and their
capabilities, using the `gst-inspect` tool.
- How to discover the internal structure of media files, using
`gst-discoverer` .
It has been a pleasure having you here, and see you soon\!