Update the last basic tutorials

This commit is contained in:
Olivier Crête 2016-06-15 20:00:24 -04:00
parent ab17d47d29
commit 0a973622a9
10 changed files with 458 additions and 619 deletions

View file

@ -1,5 +1,10 @@
# Basic tutorial 15: Clutter integration
> ![Warning](images/icons/emoticons/warning.png)
>
> **THIS TUTORIAL HAS NOT BEEN UPDATED, CLUTTER IS DEPRECATED, DON'T USE IT**
# Goal
“[Clutter](https://clutter-project.org/) is an open source software

View file

@ -1,6 +1,6 @@
# Basic tutorial 11: Debugging tools
# Goal
## Goal
Sometimes things won't go as expected and the error messages retrieved
from the bus (if any) just don't provide enough information. Luckily,
@ -13,7 +13,7 @@ hint what the problem might be. This tutorial shows:
- How to get pipeline graphs
# Printing debug information
## Printing debug information
### The debug log
@ -22,7 +22,7 @@ the code where a particularly interesting piece of information is
printed to the console, along with time stamping, process, category,
source code file, function and element information.
The debug output is controlled with the `GST_DEBUG` environment
variable. Here's an example with
`GST_DEBUG=2`:
@ -40,54 +40,43 @@ at once.
The first category is the Debug Level, which is a number specifying the
amount of desired output:
| # | Name | Description |
|---|---------|---|
| 0 | none | No debug information is output. |
| 1 | ERROR | Logs all fatal errors. These are errors that do not allow the core or elements to perform the requested action. The application can still recover if programmed to handle the conditions that triggered the error. |
| 2 | WARNING | Logs all warnings. Typically these are non-fatal, but user-visible problems are expected to happen. |
| 3 | FIXME | Logs all "fixme" messages. These typically indicate that a codepath known to be incomplete has been triggered. It may work in most cases, but may cause problems in specific instances. |
| 4 | INFO | Logs all informational messages. These are typically used for events in the system that only happen once, or are important and rare enough to be logged at this level. |
| 5 | DEBUG | Logs all debug messages. These are general debug messages for events that happen only a limited number of times during an object's lifetime; these include setup, teardown, change of parameters, ... |
| 6 | LOG | Logs all log messages. These are messages for events that happen repeatedly during an object's lifetime; these include streaming and steady-state conditions. This is used for log messages that happen on every buffer in an element for example. |
| 7 | TRACE | Logs all trace messages. These are messages that happen very, very often. This is logged, for example, each time the reference count of a GstMiniObject, such as a GstBuffer or GstEvent, is modified. |
| 8 | MEMDUMP | Logs all memory dump messages. This is the heaviest logging and may include dumping the content of blocks of memory. |
To enable debug output, set the `GST_DEBUG` environment variable to the
desired debug level. All levels below that will also be shown (i.e., if
you set `GST_DEBUG=2`, you will get both `ERROR` and
`WARNING` messages).
Furthermore, each plugin or part of GStreamer defines its own
category, so you can specify a debug level for each individual category.
For example, `GST_DEBUG=2,audiotestsrc:6`, will use Debug Level 6 for
the `audiotestsrc` element, and 2 for all the others.
The `GST_DEBUG` environment variable, then, is a comma-separated list of
*category*:*level* pairs, with an optional *level* at the beginning,
representing the default debug level for all categories.
The `'*'` wildcard is also available. For example
`GST_DEBUG=2,audio*:6` will use Debug Level 6 for all categories
starting with the word `audio`. `GST_DEBUG=*:2` is equivalent to
`GST_DEBUG=2`.
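If it is more convenient, the same thresholds can also be set from application code instead of the environment. A minimal sketch (assuming GStreamer 1.2 or newer, where `gst_debug_set_threshold_from_string()` is available):

``` c
#include <gst/gst.h>

int main (int argc, char *argv[]) {
  gst_init (&argc, &argv);

  /* Same effect as running with GST_DEBUG=2,audiotestsrc:6;
   * TRUE resets any previously configured thresholds first. */
  gst_debug_set_threshold_from_string ("2,audiotestsrc:6", TRUE);

  GST_ERROR ("This message is printed at ERROR level in the default category");

  return 0;
}
```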
Use `gst-launch-1.0 --gst-debug-help` to obtain the list of all
registered categories. Bear in mind that each plugin registers its own
categories, so, when installing or removing plugins, this list can
change.
Use `GST_DEBUG` when the error information posted on the GStreamer bus
does not help you nail down a problem. It is common practice to redirect
the output log to a file, and then examine it later, searching for
specific messages.
@ -99,48 +88,18 @@ is:
0:00:00.868050000 1592 09F62420 WARN filesrc gstfilesrc.c:1044:gst_file_src_start:<filesrc0> error: No such file "non-existing-file.webm"
```
| Example | Explained |
|---------------------|-----------|
| `0:00:00.868050000` | Time stamp in HH:MM:SS.sssssssss format since the start of the program |
| `1592` | Process ID from which the message was issued. Useful when your problem involves multiple processes |
| `09F62420` | Thread ID from which the message was issued. Useful when your problem involves multiple threads |
| `WARN` | Debug level of the message |
| `filesrc` | Debug Category of the message |
| `gstfilesrc.c:1044` | Source file and line in the GStreamer source code where this message is printed |
| `gst_file_src_start`| Function from which the message was issued |
| `<filesrc0>` | Name of the object that issued the message. It can be an element, a Pad, or something else. Useful when you have multiple elements of the same kind and need to distinguish among them. Naming your elements with the name property will make this debug output more readable (otherwise, GStreamer assigns each new element a unique name). |
| `error: No such file "non-existing-file.webm"` | The actual message. |
### Adding your own debug information
@ -150,8 +109,8 @@ have all debug output in the same file and the temporal relationship
between different messages is preserved.
To do so, use the `GST_ERROR()`, `GST_WARNING()`, `GST_INFO()`,
`GST_LOG()` and `GST_DEBUG()` macros. They accept the same parameters as
`printf`, and they use the `default` category (`default` will be shown
as the Debug category in the output log).
To change the category to something more meaningful, add these two lines
@ -171,34 +130,34 @@ GST_DEBUG_CATEGORY_INIT (my_category, "my category", 0, "This is my very own");
This registers a new category (that is, for the duration of your
application: it is not stored in any file), and sets it as the default
category for your code. See the documentation
for `GST_DEBUG_CATEGORY_INIT()`.
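Putting it together, here is a minimal, self-contained sketch of a program that logs to its own category (the category name and messages are made up for illustration):

``` c
#include <gst/gst.h>

/* Declare the category; defining GST_CAT_DEFAULT makes the plain
 * GST_ERROR()/GST_WARNING()/GST_DEBUG() macros log to it. */
GST_DEBUG_CATEGORY_STATIC (my_category);
#define GST_CAT_DEFAULT my_category

int main (int argc, char *argv[]) {
  gst_init (&argc, &argv);

  /* Register the category; do this once, after gst_init() */
  GST_DEBUG_CATEGORY_INIT (my_category, "my_category", 0,
      "This is my very own category");

  GST_WARNING ("Something looks suspicious");  /* level 2 */
  GST_DEBUG ("Current value is %d", 42);       /* level 5, printf-style */

  return 0;
}
```

Run it with `GST_DEBUG=my_category:5` to see both messages.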
### Getting pipeline graphs
For those cases where your pipeline starts to grow too large and you
lose track of what is connected with what, GStreamer has the capability
to output graph files. These are `.dot` files, readable with free
programs like [GraphViz](http://www.graphviz.org), that describe the
topology of your pipeline, along with the caps negotiated in each link.
This is also very handy when using all-in-one elements like `playbin`
or `uridecodebin`, which instantiate several elements inside them. Use
the `.dot` files to learn what pipeline they have created inside (and
learn a bit of GStreamer along the way).
To obtain `.dot` files, simply set
the `GST_DEBUG_DUMP_DOT_DIR` environment variable to point to the
folder where you want the files to be placed. `gst-launch-1.0` will create
a `.dot` file at each state change, so you can see the evolution of the
caps negotiation. Unset the variable to disable this facility. From
within your application, you can use the
`GST_DEBUG_BIN_TO_DOT_FILE()` and
`GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS()` macros to generate `.dot` files
at your convenience.
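As a minimal sketch of the application side (the pipeline description and file name here are illustrative), dumping a graph boils down to one macro call:

``` c
#include <gst/gst.h>

int main (int argc, char *argv[]) {
  GstElement *pipeline;

  gst_init (&argc, &argv);

  pipeline = gst_parse_launch ("videotestsrc ! videoconvert ! autovideosink", NULL);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  /* Wait for the state change to finish so the dump shows negotiated caps */
  gst_element_get_state (pipeline, NULL, NULL, GST_CLOCK_TIME_NONE);

  /* Writes $GST_DEBUG_DUMP_DOT_DIR/snapshot.dot; the variable must be
   * set in the environment before the program starts, or nothing is written. */
  GST_DEBUG_BIN_TO_DOT_FILE (GST_BIN (pipeline), GST_DEBUG_GRAPH_SHOW_ALL,
      "snapshot");

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```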
Here you have an example of the kind of pipelines that playbin
generates. It is very complex because `playbin` can handle many
different cases: Your manual pipelines normally do not need to be this
long. If your manual pipeline is starting to get very big, consider
using `playbin`.
@ -208,20 +167,15 @@ using `playbin`.
## Conclusion
This tutorial has shown:
- How to get more debug information from GStreamer using the
`GST_DEBUG` environment variable.
- How to print your own debug information into the GStreamer log with
the `GST_ERROR()` macro and relatives.
- How to get pipeline graphs with the
`GST_DEBUG_DUMP_DOT_DIR` environment variable.
It has been a pleasure having you here, and see you soon!

View file

@ -1,44 +1,46 @@
# Basic tutorial 10: GStreamer tools
## Goal
GStreamer comes with a set of tools which range from handy to
absolutely essential. There is no code in this tutorial, just sit back
and relax, and we will teach you:
- How to build and run GStreamer pipelines from the command line,
without using C at all!
- How to find out what GStreamer elements you have available and their
capabilities.
- How to discover the internal structure of media files.
## Introduction
These tools are available in the bin directory of the SDK. You need to
move to this directory to execute them, because it is not added to the
system's `PATH` environment variable (to avoid polluting it too much).
Just open a terminal (or console window) and go to the `bin` directory
of your GStreamer SDK installation (Read again the [Installing the
SDK](sdk-installing.html) section to find out where this is),
and you are ready to start typing the commands given in this tutorial.
> ![Information](images/icons/emoticons/information.png)
>
>On Linux, though, you can use the provided
>`/opt/gstreamer-sdk/bin/gst-sdk-shell` script to enter the
>GStreamer SDK shell environment, in which the `bin`
>directory is in the path. In this environment, you can use the
>GStreamer tools from any folder.
**FIXME: What is this now? Just refer to /usr/bin of the distro??**
In order to allow for multiple versions of GStreamer to coexist in the
same system, these tools are versioned, that is, a GStreamer version
number is appended to their name. This version of the SDK is based on
GStreamer 1.0, so the tools are called `gst-launch-1.0`,
`gst-inspect-1.0` and `gst-discoverer-1.0`.
## `gst-launch-1.0`
This tool accepts a textual description of a pipeline, instantiates it,
and sets it to the PLAYING state. It allows you to quickly check if a
@ -51,60 +53,60 @@ up to a certain level. In any case, it is extremely handy to test
pipelines quickly, and is used by GStreamer developers around the world
on a daily basis.
Please note that `gst-launch-1.0` is primarily a debugging tool for
developers. You should not build applications on top of it. Instead, use
the `gst_parse_launch()` function of the GStreamer API as an easy way to
construct pipelines from pipeline descriptions.
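For reference, a minimal sketch of that approach: the same description you would give to `gst-launch-1.0` is handed to `gst_parse_launch()` (error handling is omitted for brevity):

``` c
#include <gst/gst.h>

int main (int argc, char *argv[]) {
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);

  /* Build the same pipeline you would pass to gst-launch-1.0 */
  pipeline = gst_parse_launch ("videotestsrc ! videoconvert ! autovideosink", NULL);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Wait until error or EOS */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg)
    gst_message_unref (msg);

  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```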
Although the rules to construct pipeline descriptions are very simple,
the concatenation of multiple elements can quickly make such
descriptions resemble black magic. Fear not, for everyone learns the
`gst-launch-1.0` syntax, eventually.
The command line for gst-launch-1.0 consists of a list of options followed
by a PIPELINE-DESCRIPTION. Some simplified instructions are given next,
see the complete documentation at [the reference page](gst-launch.md)
for `gst-launch-1.0`.
### Elements
In simple form, a PIPELINE-DESCRIPTION is a list of element types
separated by exclamation marks (!). Go ahead and type in the following
command:
```
gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink
```
You should see a window with an animated video pattern. Use CTRL+C on
the terminal to stop the program.
This instantiates a new element of type `videotestsrc` (an element which
generates a sample video pattern), a `videoconvert` (an element
which does raw video format conversion, making sure other elements can
understand each other), and an `autovideosink` (a window to which video
is rendered). Then, GStreamer tries to link the output of each element
to the input of the element appearing on its right in the description.
If more than one input or output Pad is available, the Pad Caps are used
to find two compatible Pads.
### Properties
Properties may be appended to elements, in the form
*property=value* (multiple properties can be specified, separated by
spaces). Use the `gst-inspect-1.0` tool (explained next) to find out the
available properties for an
element.
```
gst-launch-1.0 videotestsrc pattern=11 ! videoconvert ! autovideosink
```
You should see a static video pattern, made of circles.
### Named elements
Elements can be named using the `name` property; in this way, complex
pipelines involving branches can be created. Names allow linking to
elements created previously in the description, and are indispensable to
use elements with multiple output pads, like demuxers or tees, for
@ -114,7 +116,7 @@ Named elements are referred to using their name followed by a
dot.
```
gst-launch-1.0 videotestsrc ! videoconvert ! tee name=t ! queue ! autovideosink t. ! queue ! autovideosink
```
You should see two video windows, showing the same sample video pattern.
@ -122,42 +124,36 @@ If you see only one, try to move it, since it is probably on top of the
second window.
This example instantiates a `videotestsrc`, linked to a
`videoconvert`, linked to a `tee` (Remember from [](sdk-basic-tutorial-multithreading-and-pad-availability.md) that
a `tee` copies to each of its output pads everything coming through its
input pad). The `tee` is named simply `t` (using the `name` property)
and then linked to a `queue` and an `autovideosink`. The same `tee` is
referred to using `t.` (mind the dot) and then linked to a second
`queue` and a second `autovideosink`.
To learn why the queues are necessary read [](sdk-basic-tutorial-multithreading-and-pad-availability.md).
### Pads
Instead of letting GStreamer choose which Pad to use when linking two
elements, you may want to specify the Pads directly. You can do this by
adding a dot plus the Pad name after the name of the element (it must be
a named element). Learn the names of the Pads of an element by using
the `gst-inspect-1.0` tool.
This is useful, for example, when you want to retrieve one particular
stream out of a
demuxer:
```
gst-launch-1.0 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d d.video_00 ! matroskamux ! filesink location=sintel_video.mkv
```
This fetches a media file from the internet using `souphttpsrc`, which
is in webm format (a special kind of Matroska container, see [](sdk-basic-tutorial-concepts.md)). We
then open the container using `matroskademux`. This media contains both
audio and video, so `matroskademux` will create two output Pads, named
`video_00` and `audio_00`. We link `video_00` to a `matroskamux` element
to re-pack the video stream into a new container, and finally link it to
a `filesink`, which will write the stream into a file named
"sintel\_video.mkv" (the `location` property specifies the name of the
@ -168,20 +164,20 @@ new matroska file with the video. If we wanted to keep only the
audio:
```
gst-launch-1.0 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d d.audio_00 ! vorbisparse ! matroskamux ! filesink location=sintel_audio.mka
```
The `vorbisparse` element is required to extract some information from
the stream and put it in the Pad Caps, so the next element,
`matroskamux`, knows how to deal with the stream. In the case of video
this was not necessary, because `matroskademux` already extracted this
information and added it to the Caps.
Note that in the above two examples no media has been decoded or played.
We have just moved from one container to another (demultiplexing and
re-multiplexing again).
### Caps filters
When an element has more than one output pad, it might happen that the
link to the next element is ambiguous: the next element may have more
@ -197,10 +193,10 @@ pipeline:
gst-launch-1.0 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux ! filesink location=test
```
This is the same media file and demuxer as in the previous example. The
input Pad Caps of `filesink` are `ANY`, meaning that it can accept any
kind of media. Which one of the two output pads of `matroskademux` will
be linked against the filesink? `video_00` or `audio_00`? You cannot
know.
You can remove this ambiguity, though, by using named pads, as in the
@ -213,34 +209,33 @@ gst-launch-1.0 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trail
A Caps Filter behaves like a pass-through element which does nothing and
only accepts media with the given Caps, effectively resolving the
ambiguity. In this example, between `matroskademux` and `matroskamux` we
added a `video/x-vp8` Caps Filter to specify that we are interested in
the output pad of `matroskademux` which can produce this kind of video.
To find out the Caps an element accepts and produces, use the
`gst-inspect-1.0` tool. To find out the Caps contained in a particular file,
use the `gst-discoverer-1.0` tool. To find out the Caps an element is
producing for a particular pipeline, run `gst-launch-1.0` as usual, with the
`-v` option to print Caps information.
### Examples
Play a media file using `playbin` (as in [](sdk-basic-tutorial-hello-world.md)):
```
gst-launch-1.0 playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm
```
A fully operational playback pipeline, with audio and video (more or less
the same pipeline that `playbin` will create
internally):
```
gst-launch-1.0 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! matroskademux name=d ! queue ! vp8dec ! videoconvert ! autovideosink d. ! queue ! vorbisdec ! audioconvert ! audioresample ! autoaudiosink
```
A transcoding pipeline, which opens the webm container and decodes both
streams (via uridecodebin), then re-encodes the audio and video branches
with a different codec, and puts them back together in an Ogg container
(just for the sake of
@ -250,20 +245,20 @@ it).
gst-launch-1.0 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm name=d ! queue ! theoraenc ! oggmux name=m ! filesink location=sintel.ogg d. ! queue ! audioconvert ! audioresample ! flacenc ! m.
```
A rescaling pipeline. The `videoscale` element performs a rescaling
operation whenever the frame size is different in the input and the
output caps. The output caps are set by the Caps Filter to
320x200.
```
gst-launch-1.0 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! queue ! videoscale ! video/x-raw,width=320,height=200 ! videoconvert ! autovideosink
```
This short description of `gst-launch-1.0` should be enough to get you
started. Remember that you have the [complete documentation available
here](gst-launch.md).
## `gst-inspect-1.0`
This tool has three modes of operation:
@ -279,120 +274,114 @@ Let's see an example of the third mode:
```
gst-inspect-1.0 vp8dec
 
Factory Details:
  Rank                     primary (256)
  Long-name                On2 VP8 Decoder
  Klass                    Codec/Decoder/Video
  Description              Decode VP8 video streams
  Author                   David Schleef <ds@entropywave.com>, Sebastian Dröge <sebastian.droege@collabora.co.uk>

Plugin Details:
  Name                     vpx
  Description              VP8 plugin
  Filename                 /usr/lib64/gstreamer-1.0/libgstvpx.so
  Version                  1.6.4
  License                  LGPL
  Source module            gst-plugins-good
  Source release date      2016-04-14
  Binary package           Fedora GStreamer-plugins-good package
  Origin URL               http://download.fedoraproject.org

GObject
 +----GInitiallyUnowned
       +----GstObject
             +----GstElement
                   +----GstVideoDecoder
                         +----GstVP8Dec

Pad Templates:
  SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-vp8

  SRC template: 'src'
    Availability: Always
    Capabilities:
      video/x-raw
                 format: I420
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]

Element Flags:
  no flags set

Element Implementation:
  Has change_state() function: gst_video_decoder_change_state

Element has no clocking capabilities.
Element has no URI handling capabilities.

Pads:
  SINK: 'sink'
    Pad Template: 'sink'
  SRC: 'src'
    Pad Template: 'src'

Element Properties:
  name                : The name of the object
                        flags: readable, writable
                        String. Default: "vp8dec0"
  parent              : The parent of the object
                        flags: readable, writable
                        Object of type "GstObject"
  post-processing     : Enable post processing
                        flags: readable, writable
                        Boolean. Default: false
  post-processing-flags: Flags to control post processing
                        flags: readable, writable
                        Flags "GstVP8DecPostProcessingFlags" Default: 0x00000403, "mfqe+demacroblock+deblock"
                           (0x00000001): deblock          - Deblock
                           (0x00000002): demacroblock     - Demacroblock
                           (0x00000004): addnoise         - Add noise
                           (0x00000400): mfqe             - Multi-frame quality enhancement
  deblocking-level    : Deblocking level
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 16 Default: 4
  noise-level         : Noise level
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 16 Default: 0
  threads             : Maximum number of decoding threads
                        flags: readable, writable
                        Unsigned Integer. Range: 1 - 16 Default: 0
```
The most relevant sections are:
- Pad Templates: This lists all the kinds of Pads this
element can have, along with their capabilities. This is where you
look to find out if an element can link with another one. In this
case, it has only one sink pad template, accepting only
`video/x-vp8` (encoded video data in VP8 format) and only one source
pad template, producing `video/x-raw` (decoded video data).
- Element Properties: This lists the properties of the
element, along with their type and accepted values.
For more information, you can check the [documentation
page](gst-inspect.md) of `gst-inspect-1.0`.
## `gst-discoverer-1.0`
This tool is a wrapper around the `GstDiscoverer` object shown in [](sdk-basic-tutorial-media-information-gathering.md).
It accepts a URI from the command line and prints all information
regarding the media that GStreamer can extract. It is useful to find out
what container and codecs have been used to produce the media, and
therefore what elements you need to put in a pipeline to play it.
Use `gst-discoverer-1.0 --help` to obtain the list of available options,
which basically control the amount of verbosity of the output.
Let's see an
@ -438,7 +427,7 @@ Properties:
Duration: 0:00:52.250000000
Seekable: yes
Tags:
video codec: VP8 video
language code: en
container format: Matroska
application name: ffmpeg2theora-0.24
@ -449,15 +438,15 @@ Properties:
bitrate: 80000
```
## Conclusion
This tutorial has shown:
- How to build and run GStreamer pipelines from the command line using
the `gst-launch-1.0` tool.
- How to find out what GStreamer elements you have available and their
capabilities, using the `gst-inspect-1.0` tool.
- How to discover the internal structure of media files, using
`gst-discoverer-1.0`.
It has been a pleasure having you here, and see you soon!

View file

@ -1,19 +1,18 @@
# Basic tutorial 14: Handy elements
## Goal
This tutorial gives a list of handy GStreamer elements that are worth
knowing. They range from powerful all-in-one elements that allow you to
build complex pipelines easily (like `playbin`), to little helper
elements which are extremely useful when debugging.
For simplicity, the following examples are given using the
`gst-launch-1.0` tool (Learn about it in
[](sdk-basic-tutorial-gstreamer-tools.md)). Use the `-v` command line
parameter if you want to see the Pad Caps that are being negotiated.
## Bins
These are Bin elements which you treat as a single element and they take
care of instantiating all the necessary internal pipeline to accomplish
@ -24,76 +23,75 @@ their task.
This element has been extensively used throughout the tutorials. It
manages all aspects of media playback, from source to display, passing
through demuxing and decoding. It is so flexible and has so many options
that a whole set of tutorials are devoted to it. See the [](sdk-playback-tutorials.md) for more details.
### `uridecodebin`
This element decodes data from a URI into raw media. It selects a source
element that can handle the given URI scheme and connects it to
a `decodebin` element. It acts like a demuxer, so it offers as many
source pads as streams are found in the
media.
``` bash
gst-launch-1.0 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! videoconvert ! autovideosink
```
``` bash
gst-launch-1.0 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! audioconvert ! autoaudiosink
```
### `decodebin`
This element automatically constructs a decoding pipeline using
available decoders and demuxers via auto-plugging until raw media is
obtained. It is used internally by `uridecodebin` which is often more
convenient to use, as it creates a suitable source element as well. It
acts like a demuxer, so it
offers as many source pads as streams are found in the
media.
``` bash
gst-launch-1.0 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! decodebin ! autovideosink
```
## File input/output
### `filesrc`
This element reads a local file and produces media with `ANY` Caps. If
you want to obtain the correct Caps for the media, explore the stream by
using a `typefind` element or by setting the `typefind` property
of `filesrc` to
`TRUE`.
``` bash
gst-launch-1.0 filesrc location=f:\\media\\sintel\\sintel_trailer-480p.webm ! decodebin ! autovideosink
```
### `filesink`
This element writes to a file all the media it receives. Use the
`location` property to specify the file
name.
```
gst-launch-1.0 audiotestsrc ! vorbisenc ! oggmux ! filesink location=test.ogg
```
## Network
### `souphttpsrc`
This element receives data as a client over the network via HTTP using
the [libsoup](https://wiki.gnome.org/Projects/libsoup) library. Set the URL to retrieve through the `location`
property.
``` bash
gst-launch-1.0 souphttpsrc location=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! decodebin ! autovideosink
```
## Test media generation
These elements are very useful to check if other parts of the pipeline
are working, by replacing the source by one of these test sources which
@ -102,24 +100,24 @@ are “guaranteed” to work.
### `videotestsrc`
This element produces a video pattern (selectable among many different
options with the `pattern` property). Use it to test video pipelines.
``` bash
gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink
```
### `audiotestsrc`
This element produces an audio wave (selectable among many different
options with the `wave` property). Use it to test audio pipelines.
``` bash
gst-launch-1.0 audiotestsrc ! audioconvert ! autoaudiosink
```
## Video adapters
### `videoconvert`
This element converts from one color space (e.g. RGB) to another one
(e.g. YUV). It can also convert between different YUV formats (e.g.
@ -130,13 +128,13 @@ When not needed, because its upstream and downstream elements can
already understand each other, it acts in pass-through mode having
minimal impact on the performance.
As a rule of thumb, always use `videoconvert` whenever you use
elements whose Caps are unknown at design time, like `autovideosink`, or
that can vary depending on external factors, like decoding a
user-provided file.
``` bash
gst-launch-1.0 videotestsrc ! videoconvert ! autovideosink
```
### `videorate`
@ -156,7 +154,7 @@ rate is unknown at design time, just in
case.
``` bash
gst-launch-1.0 videotestsrc ! video/x-raw,framerate=30/1 ! videorate ! video/x-raw,framerate=1/1 ! videoconvert ! autovideosink
```
### `videoscale`
@ -172,15 +170,15 @@ and RGB formats and is therefore generally able to operate anywhere in a
pipeline.
If the video is to be output to a window whose size is controlled by the
user, it is a good idea to use a `videoscale` element, since not all
video sinks are capable of performing scaling
operations.
``` bash
gst-launch-1.0 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm ! videoscale ! video/x-raw,width=178,height=100 ! videoconvert ! autovideosink
```
## Audio adapters
### `audioconvert`
@ -189,7 +187,7 @@ formats. It supports integer to float conversion, width/depth
conversion, signedness and endianness conversion and channel
transformations.
Like `videoconvert` does for video, you use this to solve
negotiation problems with audio, and it is generally safe to use it
liberally, since this element does nothing if it is not needed.
@ -215,7 +213,7 @@ gst-launch-1.0 uridecodebin uri=http://docs.gstreamer.com/media/sintel_trailer-4
This element takes an incoming stream of time-stamped raw audio frames
and produces a perfect stream by inserting or dropping samples as
needed. It does not allow the sample rate to be changed
as `videorate` does, it just fills gaps and removes overlapped samples
so the output stream is continuous and “clean”.
It is useful in situations where the timestamps are going to be lost
@ -223,13 +221,14 @@ It is useful in situations where the timestamps are going to be lost
will require all samples to be present. It is cumbersome to exemplify
this, so no example is given.
> ![Warning](images/icons/emoticons/warning.png)
>
> Most of the time, `audiorate` is not what you want.
## Multithreading
### `queue`
Queues have been explained in [](sdk-basic-tutorial-multithreading-and-pad-availability.md). Basically, a queue performs two tasks:
- Data is queued until a selected limit is reached. Any attempt to
push more buffers into the queue blocks the pushing thread until
@ -237,14 +236,13 @@ Basically, a queue performs two tasks:
- The queue creates a new thread on the source Pad to decouple the
processing on sink and source Pads.
Additionally, `queue` triggers signals when it is about to become empty
or full (according to some configurable thresholds), and can be
instructed to drop buffers instead of blocking when it is full.
As a rule of thumb, prefer the simpler `queue` element
over `queue2` whenever network buffering is not a concern to you.
See [](sdk-basic-tutorial-multithreading-and-pad-availability.md)
for an example.
### `queue2`
@ -254,16 +252,15 @@ goals but follows a different implementation approach, which results in
different features. Unfortunately, it is often not easy to tell which
queue is the best choice.
`queue2` performs the two tasks listed above for `queue`, and,
additionally, is able to store the received data (or part of it) on a
disk file, for later retrieval. It also replaces the signals with the
more general and convenient buffering messages described in
[](sdk-basic-tutorial-streaming.md).
As a rule of thumb, prefer `queue2` over `queue` whenever network
buffering is a concern to you. See [](sdk-basic-tutorial-streaming.md)
for an example (`queue2` is hidden inside `playbin`).
### `multiqueue`
@ -271,46 +268,42 @@ This element provides queues for multiple streams simultaneously, and
eases their management, by allowing some queues to grow if no data is
being received on other streams, or by allowing some queues to drop data
if they are not connected to anything (instead of returning an error, as
a simpler queue would do). Additionally, it synchronizes the different
streams, ensuring that none of them goes too far ahead of the others.
This is an advanced element. It is found inside `decodebin`, but you
will rarely need to instantiate it yourself in a normal playback
application.
### `tee`
[](sdk-basic-tutorial-multithreading-and-pad-availability.md) already
showed how to use a `tee` element, which splits data to multiple pads.
Splitting the data flow is useful, for example, when capturing a video
where the video is shown on the screen and also encoded and written to a
file. Another example is playing music and hooking up a visualization
module.
One needs to use separate `queue` elements in each branch to provide
separate threads for each branch. Otherwise a blocked dataflow in one
branch would stall the other
branches.
```
gst-launch-1.0 audiotestsrc ! tee name=t ! queue ! audioconvert ! autoaudiosink t. ! queue ! wavescope ! videoconvert ! autovideosink
```
## Capabilities
### `capsfilter`
[](sdk-basic-tutorial-gstreamer-tools.md) already
explained how to use Caps filters with `gst-launch-1.0`. When building a
pipeline programmatically, Caps filters are implemented with
the `capsfilter` element. This element does not modify data as such,
but enforces limitations on the data format.
``` bash
gst-launch-1.0 videotestsrc ! video/x-raw, format=GRAY8 ! videoconvert ! autovideosink
```
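In code, the equivalent of that caps filter is a `capsfilter` element whose `caps` property is set from a string. A minimal sketch (the pipeline and format chosen here are illustrative, not the only option):

``` c
#include <gst/gst.h>

int main (int argc, char *argv[]) {
  gst_init (&argc, &argv);

  GstElement *pipeline = gst_pipeline_new ("caps-demo");
  GstElement *src = gst_element_factory_make ("videotestsrc", NULL);
  GstElement *filter = gst_element_factory_make ("capsfilter", NULL);
  GstElement *convert = gst_element_factory_make ("videoconvert", NULL);
  GstElement *sink = gst_element_factory_make ("autovideosink", NULL);

  /* The caps property of capsfilter constrains the negotiated format */
  GstCaps *caps = gst_caps_from_string ("video/x-raw, format=GRAY8");
  g_object_set (filter, "caps", caps, NULL);
  gst_caps_unref (caps);

  gst_bin_add_many (GST_BIN (pipeline), src, filter, convert, sink, NULL);
  gst_element_link_many (src, filter, convert, sink, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_usleep (3 * G_USEC_PER_SEC);  /* run for a few seconds, then clean up */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```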
### `typefind`
@ -320,20 +313,19 @@ typefind functions in the order of their rank. Once the type has been
detected it sets its source Pad Caps to the found media type and emits
the `have-type` signal.
It is instantiated internally by `decodebin`, and you can use it too to
find the media type, although you can normally use the
`GstDiscoverer` which provides more information (as seen in
[](sdk-basic-tutorial-media-information-gathering.md)).
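As a hedged sketch of using `typefind` directly (the file name is a placeholder), one connects to its `have-type` signal:

``` c
#include <gst/gst.h>

/* Signature of the have-type signal: probability (0-100) and the caps found */
static void
on_have_type (GstElement *typefind, guint probability, GstCaps *caps,
    gpointer user_data)
{
  gchar *str = gst_caps_to_string (caps);
  g_print ("Found media type %s (probability %u%%)\n", str, probability);
  g_free (str);
}

int main (int argc, char *argv[]) {
  GstElement *pipeline, *src, *typefind, *sink;

  gst_init (&argc, &argv);

  pipeline = gst_pipeline_new (NULL);
  src = gst_element_factory_make ("filesrc", NULL);
  typefind = gst_element_factory_make ("typefind", NULL);
  sink = gst_element_factory_make ("fakesink", NULL);

  /* "some-file.webm" is a placeholder; point it at any local media file */
  g_object_set (src, "location", "some-file.webm", NULL);
  g_signal_connect (typefind, "have-type", G_CALLBACK (on_have_type), NULL);

  gst_bin_add_many (GST_BIN (pipeline), src, typefind, sink, NULL);
  gst_element_link_many (src, typefind, sink, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_usleep (G_USEC_PER_SEC);  /* give it a moment to detect the type */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```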
## Debugging
### `fakesink`
This sink element simply swallows any data fed to it. It is useful when
debugging, to replace your normal sinks and rule them out of the
equation. It can be very verbose when combined with the `-v` switch
of `gst-launch-1.0`, so use the `silent` property to remove any unwanted
noise.
```
@ -344,7 +336,7 @@ gst-launch-1.0 audiotestsrc num-buffers=1000 ! fakesink sync=false
This is a dummy element that passes incoming data through unmodified. It
has several useful diagnostic functions, such as offset and timestamp
checking, or buffer dropping. Read its documentation to learn all the
things this seemingly harmless element can
do.
@ -352,11 +344,11 @@ do.
gst-launch-1.0 audiotestsrc ! identity drop-probability=0.1 ! audioconvert ! autoaudiosink
```
## Conclusion
This tutorial has listed a few elements which are worth knowing, due to
their usefulness in the day-to-day work with GStreamer. Some are
valuable for production pipelines, whereas others are only needed for
debugging purposes.
It has been a pleasure having you here, and see you soon!

View file

@ -1,6 +1,6 @@
# Basic tutorial 9: Media information gathering
## Goal
Sometimes you might want to quickly find out what kind of media a file
(or URI) contains, or if you will be able to play the media at all. You
@ -12,9 +12,9 @@ shows:
- How to find out if a URI is playable
## Introduction
`GstDiscoverer` is a utility object found in the `pbutils` library
(Plug-in Base utilities) that accepts a URI or list of URIs, and returns
information about them. It can work in synchronous or asynchronous
modes.
@ -29,9 +29,8 @@ The recovered information includes codec descriptions, stream topology
(number of streams and sub-streams) and available metadata (like the
audio language).
As an example, this is the result
of discovering http://docs.gstreamer.com/media/sintel_trailer-480p.webm:
Duration: 0:00:52.250000000
Tags:
@ -66,15 +65,14 @@ The following code tries to discover the URI provided through the
command line, and outputs the retrieved information (If no URI is
provided it uses a default one).
This is a simplified version of what the `gst-discoverer-1.0` tool does
([](sdk-basic-tutorial-gstreamer-tools.md)), which is
an application that only displays data, but does not perform any
playback.
## The GStreamer Discoverer
Copy this code into a text file named `basic-tutorial-9.c` (or find it
in the SDK installation).
**basic-tutorial-9.c**
@ -298,31 +296,22 @@ int main (int argc, char **argv) {
}
```
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](sdk-installing-on-linux.md#InstallingonLinux-Build), [Mac OS X](sdk-installing-on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](sdk-installing-on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> ``gcc basic-tutorial-9.c -o basic-tutorial-9 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-pbutils-1.0` ``
>
>If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](sdk-installing-on-linux.md#InstallingonLinux-Run), [Mac OS X](sdk-installing-on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](sdk-installing-on-windows.md#InstallingonWindows-Run).
>
> This tutorial opens the URI passed as the first parameter in the command line (or a default URI if none is provided) and outputs information about it on the screen. If the media is located on the Internet, the application might take a bit to react depending on your connection speed.
>
> Required libraries: `gstreamer-pbutils-1.0` `gstreamer-1.0`
## Walkthrough
These are the main steps to use the `GstDiscoverer`:
@ -336,9 +325,9 @@ if (!data.discoverer) {
}
```
`gst_discoverer_new()` creates a new Discoverer object. The first
parameter is the timeout per file, in nanoseconds (use the
`GST_SECOND` macro for simplicity).
``` c
/* Connect to the interesting signals */
@ -354,7 +343,7 @@ snippet for their callbacks.
gst_discoverer_start (data.discoverer);
```
`gst_discoverer_start()` launches the discovering process, but we have
not provided any URI to discover yet. This is done
next:
@ -367,7 +356,7 @@ if (!gst_discoverer_discover_uri_async (data.discoverer, uri)) {
}
```
`gst_discoverer_discover_uri_async()` enqueues the provided URI for
discovery. Multiple URIs can be enqueued with this function. As the
discovery process for each of them finishes, the registered callback
functions will be fired
@ -380,8 +369,8 @@ g_main_loop_run (data.loop);
```
The usual GLib main loop is instantiated and executed. We will get out
of it when `g_main_loop_quit()` is called from the
`on_finished_cb` callback.
``` c
/* Stop the discoverer process */
@ -389,7 +378,7 @@ gst_discoverer_stop (data.discoverer);
```
Once we are done with the discoverer, we stop it with
`gst_discoverer_stop()` and unref it with `g_object_unref()`.
Let's now review the callbacks we have
registered:
@ -408,11 +397,11 @@ static void on_discovered_cb (GstDiscoverer *discoverer, GstDiscovererInfo *info
```
We got here because the Discoverer has finished working on one URI, and
provides us with a `GstDiscovererInfo` structure containing all the information.
The first step is to retrieve the particular URI this call refers to (in
case we had multiple discovery processes running, which is not the case in
this example) with `gst_discoverer_info_get_uri()` and the discovery
result with `gst_discoverer_info_get_result()`.
``` c
@ -451,15 +440,15 @@ if (result != GST_DISCOVERER_OK) {
}
```
As the code shows, any result other than `GST_DISCOVERER_OK` means that
there has been some kind of problem, and this URI cannot be played. The
reasons can vary, but the enum values are quite explicit
(`GST_DISCOVERER_BUSY` can only happen when in synchronous mode, which
is not used in this example).
If no error happened, information can be retrieved from the
`GstDiscovererInfo` structure with the different
`gst_discoverer_info_get_*` methods
(`gst_discoverer_info_get_duration()`, for example).
Bits of information which are made of lists, like tags and stream info,
@ -474,10 +463,10 @@ if (tags) {
```
Tags are metadata (labels) attached to the media. They can be examined
with `gst_tag_list_foreach()`, which will call `print_tag_foreach` for
each tag found (the list could also be traversed manually, for example,
or a specific tag could be searched for with
`gst_tag_list_get_string()`). The code for `print_tag_foreach` is pretty
much self-explanatory.
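A minimal sketch of what such a callback can look like (this variant serializes the value and indents it by the depth passed through `user_data`):

``` c
static void print_tag_foreach (const GstTagList *tags, const gchar *tag,
    gpointer user_data) {
  GValue val = { 0, };
  gchar *str;
  gint depth = GPOINTER_TO_INT (user_data);

  /* Copy the tag's value and turn it into a printable string */
  gst_tag_list_copy_value (&val, tags, tag);
  if (G_VALUE_HOLDS_STRING (&val))
    str = g_value_dup_string (&val);
  else
    str = gst_value_serialize (&val);

  g_print ("%*s%s: %s\n", 2 * depth, " ", gst_tag_get_nick (tag), str);

  g_free (str);
  g_value_unset (&val);
}
```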
``` c
@ -492,10 +481,10 @@ print_topology (sinfo, 1);
gst_discoverer_stream_info_unref (sinfo);
```
`gst_discoverer_info_get_stream_info()` returns
a `GstDiscovererStreamInfo` structure that is parsed in
the `print_topology` function, and then discarded
with `gst_discoverer_stream_info_unref()`.
``` c
/* Print information regarding a stream and its substreams, if any */
@ -524,25 +513,25 @@ static void print_topology (GstDiscovererStreamInfo *info, gint depth) {
}
```
The `print_stream_info` function's code is also pretty much
self-explanatory: it prints the stream's capabilities and then the
associated tags, using `print_tag_foreach` too.
Then, `print_topology` looks for the next element to display. If
`gst_discoverer_stream_info_get_next()` returns a non-NULL stream info,
it refers to our descendant and that should be displayed. Otherwise, if
we are a container, recursively call `print_topology` on each of our
children obtained with `gst_discoverer_container_info_get_streams()`.
Otherwise, we are a final stream, and do not need to recurse (this part
of the Discoverer API is admittedly a bit obscure).
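A minimal sketch of that recursion (assuming the `print_stream_info` helper described above):

``` c
static void print_topology (GstDiscovererStreamInfo *info, gint depth) {
  GstDiscovererStreamInfo *next;

  if (!info)
    return;

  print_stream_info (info, depth);

  next = gst_discoverer_stream_info_get_next (info);
  if (next) {
    /* A descendant stream: display it one level deeper */
    print_topology (next, depth + 1);
    gst_discoverer_stream_info_unref (next);
  } else if (GST_IS_DISCOVERER_CONTAINER_INFO (info)) {
    /* A container: recurse into each of its children */
    GList *tmp, *streams;

    streams = gst_discoverer_container_info_get_streams (
        GST_DISCOVERER_CONTAINER_INFO (info));
    for (tmp = streams; tmp; tmp = tmp->next)
      print_topology ((GstDiscovererStreamInfo *) tmp->data, depth + 1);
    gst_discoverer_stream_info_list_free (streams);
  }
}
```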
## Conclusion
This tutorial has shown:
- How to recover information regarding a URI using the `GstDiscoverer`
- How to find out if a URI is playable by looking at the return code
  obtained with `gst_discoverer_info_get_result()`.
It has been a pleasure having you here, and see you soon!
View file
@ -1,4 +1,4 @@
# Basic tutorial 7: Multithreading and Pad Availability
## Goal
@ -184,7 +184,7 @@ int main(int argc, char *argv[]) {
> ``gcc basic-tutorial-7.c -o basic-tutorial-7 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](sdk-installing-on-linux.md#InstallingonLinux-Run), [Mac OS X](sdk-installing-on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](sdk-installing-on-windows.md#InstallingonWindows-Run).
>
> This tutorial plays an audible tone through the audio card and opens a window with a waveform representation of the tone. The waveform should be a sinusoid, but due to the refreshing of the window might not appear so.
>
> Required libraries: `gstreamer-1.0`
View file
@ -1,55 +1,57 @@
# Basic tutorial 16: Platform-specific elements
## Goal
Even though GStreamer is a multiplatform framework, not all the elements
are available on all platforms. For example, the video sinks
depend heavily on the underlying windowing system, and a different one
needs to be selected depending on the platform. You normally do not need
to worry about this when using elements like `playbin` or
`autovideosink`, but, for those cases when you need to use one of the
sinks that are only available on specific platforms, this tutorial hints
at some of their peculiarities.
## Cross Platform
### `glimagesink`
This video sink is based on
[OpenGL](http://en.wikipedia.org/wiki/OpenGL) or [OpenGL ES](http://en.wikipedia.org/wiki/OpenGL_ES). It supports rescaling
and filtering of the scaled image to alleviate aliasing. It implements
the VideoOverlay interface, so the video window can be re-parented
(embedded inside other windows). This is the video sink recommended on
most platforms. In particular, on Android and iOS, it is the only
available video sink. It can be decomposed into
`glupload ! glcolorconvert ! glimagesinkelement` to insert further OpenGL
hardware accelerated processing into the pipeline.
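For instance, a sketch of such a decomposed pipeline built with `gst_parse_launch()` (`videotestsrc` is just a convenient test source):

``` c
GstElement *pipeline;
GError *error = NULL;

/* Same as using glimagesink alone, but extra GL elements can now be
 * inserted between glupload and glcolorconvert */
pipeline = gst_parse_launch (
    "videotestsrc ! glupload ! glcolorconvert ! glimagesinkelement",
    &error);
```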
## Linux
### `ximagesink`
A standard RGB only X-based video sink. It implements the VideoOverlay
interface, so the video window can be re-parented (embedded inside
other windows). It does not support scaling or color formats other
than RGB; these have to be handled by other means (using the
`videoscale` and `videoconvert` elements, for example).
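A sketch of a pipeline that performs both operations in software before this sink (`videotestsrc` is again just a test source):

``` c
GstElement *pipeline;
GError *error = NULL;

/* ximagesink only accepts RGB: convert and scale in software first */
pipeline = gst_parse_launch (
    "videotestsrc ! videoconvert ! videoscale ! ximagesink", &error);
```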
### `xvimagesink`
An X-based video sink, using the [X Video
Extension](http://en.wikipedia.org/wiki/X_video_extension) (Xv). It
implements the VideoOverlay interface, so the video window can be
re-parented (embedded inside other windows). It can perform scaling
efficiently, on the GPU. It is only available if the hardware and
corresponding drivers support the Xv extension.
### `alsasink`
This audio sink outputs to the sound card via
[ALSA](http://www.alsa-project.org/) (Advanced Linux Sound
Architecture). This sink is available on almost every Linux platform. It
is often seen as a “low level” interface to the sound card, and can be
complicated to configure (See the comment on
[](sdk-playback-tutorial-digital-audio-pass-through.md)).
### `pulsesink`
@ -58,29 +60,18 @@ server. It is a higher level abstraction of the sound card than ALSA,
and is therefore easier to use and offers more advanced features. It has
been known to be unstable on some older Linux distributions, though.
## Mac OS X
### `osxvideosink`
This is the video sink available to GStreamer on Mac OS X. It is also
possible to draw via OpenGL using `glimagesink`.
### `osxaudiosink`
This is the only audio sink available to GStreamer on Mac OS X.
## Windows
### `directdrawsink`
@ -92,15 +83,15 @@ rescaling and filtering of the scaled image to alleviate aliasing.
### `dshowvideosink`
This video sink is based on [Direct
Show](http://en.wikipedia.org/wiki/Direct_Show). It can use different
rendering back-ends, like
[EVR](http://en.wikipedia.org/wiki/Enhanced_Video_Renderer),
[VMR9](http://en.wikipedia.org/wiki/Direct_Show#Video_rendering_filters)
or
[VMR7](http://en.wikipedia.org/wiki/Direct_Show#Video_rendering_filters),
EVR only being available on Windows Vista or more recent. It supports
rescaling and filtering of the scaled image to alleviate aliasing. It
implements the VideoOverlay interface, so the video window can be
re-parented (embedded inside other windows).
### `d3dvideosink`
@ -108,22 +99,10 @@ re-parented (embedded inside other windows).
This video sink is based on
[Direct3D](http://en.wikipedia.org/wiki/Direct3D) and it's the most
recent Windows video sink. It supports rescaling and filtering of the
scaled image to alleviate aliasing. It implements the VideoOverlay
interface, so the video window can be re-parented (embedded inside other
windows).
### `directsoundsink`
This is the default audio sink for Windows, based on [Direct
@ -132,50 +111,44 @@ all Windows versions.
### `dshowdecwrapper`
[Direct Show](http://en.wikipedia.org/wiki/Direct_Show) is a multimedia
framework similar to GStreamer. They are different enough, though, so
that their pipelines cannot be interconnected. However, through this
element, GStreamer can benefit from the decoding elements present in
Direct Show. `dshowdecwrapper` wraps multiple Direct Show decoders so
they can be embedded in a GStreamer pipeline. Use the `gst-inspect-1.0` tool
(see [Basic tutorial 10: GStreamer
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html)) to see the
available decoders.
## Android
### `openslessink`
This is the only audio sink available to GStreamer on Android. It is
based on [OpenSL
ES](http://en.wikipedia.org/wiki/OpenSL_ES).
based on [OpenSL ES](http://en.wikipedia.org/wiki/OpenSL_ES).
### `openslessrc`
This is the only audio source available to GStreamer on Android. It is
based on [OpenSL ES](http://en.wikipedia.org/wiki/OpenSL_ES).
### `androidmedia`
[android.media.MediaCodec](http://developer.android.com/reference/android/media/MediaCodec.html)
is an Android specific API to access the codecs that are available on
the device, including hardware codecs. It is available since API level
16 (JellyBean) and GStreamer can use it via the androidmedia plugin
for audio and video decoding. On Android, attaching the hardware
decoder to the `glimagesink` element can produce a high-performance
zero-copy decodebin pipeline.
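A sketch of such a setup, with `playbin` doing the decoder selection (the URI is a hypothetical placeholder):

``` c
GstElement *pipeline;
GError *error = NULL;

/* playbin picks the androidmedia decoder and links it straight to the
 * GL sink, avoiding buffer copies between decoding and display */
pipeline = gst_parse_launch (
    "playbin uri=file:///sdcard/example.mp4 video-sink=glimagesink",
    &error);
```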
### `ahcsrc`

This video source can capture from the cameras on Android devices. It is part
of the androidmedia plugin and uses the [android.hardware.Camera API](https://developer.android.com/reference/android/hardware/Camera.html).

## iOS
### `osxaudiosink`
@ -185,23 +158,23 @@ This is the only audio sink available to GStreamer on iOS.
Source element to read iOS assets, that is, documents stored in the
Library (like photos, music and videos). It can be instantiated
automatically by `playbin` when URIs use the
`assets-library://` scheme.
### `iosavassetsrc`
Source element to read and decode iOS audiovisual assets, that is,
documents stored in the Library (like photos, music and videos). It can
be instantiated automatically by `playbin` when URIs use the
`ipod-library://` scheme. Decoding is performed by the system, so
dedicated hardware will be used if available.
## Conclusion
This tutorial has shown a few specific details about some GStreamer
elements which are not available on all platforms. You do not have to
worry about them when using multiplatform elements like `playbin` or
`autovideosink`, but it is good to know their individual quirks if
instantiating them manually.
It has been a pleasure having you here, and see you soon!
View file
@ -1,6 +1,6 @@
# Basic tutorial 13: Playback speed
## Goal
Fast-forward, reverse-playback and slow-motion are all techniques
collectively known as *trick modes* and they all have in common that
@ -12,7 +12,7 @@ shows:
forward and backwards.
- How to advance a video frame-by-frame
## Introduction
Fast-forward is the technique that plays a media at a speed higher than
its normal (intended) speed; whereas slow-motion uses a speed lower than
@ -30,45 +30,37 @@ media besides changing the subsequent playback rate (only to positive
values). Seek Events, additionally, allow jumping to any position in the
stream and set positive and negative playback rates.
In [](sdk-basic-tutorial-time-management.md) seek
events have already been shown, using a helper function to hide their
complexity. This tutorial explains a bit more how to use these events.
Step Events are a more convenient way of changing the playback rate, due
to the reduced number of parameters needed to create them; however,
their implementation in GStreamer still needs a bit more polishing
so Seek Events are used in this tutorial instead.
**FIXME: Is that even true ???**
To use these events, they are created and then passed onto the pipeline,
where they propagate upstream until they reach an element that can
handle them. If an event is passed onto a bin element like `playbin`,
it will simply feed the event to all its sinks, which will result in
multiple seeks being performed. The common approach is to retrieve one
of `playbin`'s sinks through the `video-sink` or
`audio-sink` properties and feed the event directly into the sink.
Frame stepping is a technique that allows playing a video frame by
frame. It is implemented by pausing the pipeline, and then sending Step
Events to skip one frame each time.
## A trick mode player
Copy this code into a text file named `basic-tutorial-13.c`.
**basic-tutorial-13.c**
``` c
#include <string.h>
#include <stdio.h>
#include <gst/gst.h>
typedef struct _CustomData {
@ -184,7 +176,7 @@ int main(int argc, char *argv[]) {
data.pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Add a keyboard watch so we get notified of keystrokes */
#ifdef G_OS_WIN32
io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
#else
io_stdin = g_io_channel_unix_new (fileno (stdin));
@ -216,34 +208,24 @@ int main(int argc, char *argv[]) {
}
```
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](sdk-installing-on-linux.md#InstallingonLinux-Build), [Mac OS X](sdk-installing-on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](sdk-installing-on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-13.c -o basic-tutorial-13 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](sdk-installing-on-linux.md#InstallingonLinux-Run), [Mac OS X](sdk-installing-on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](sdk-installing-on-windows.md#InstallingonWindows-Run).
>
> This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The console shows the available commands, composed of a single upper-case or lower-case letter, which you should input followed by the Enter key.
>
> Required libraries: `gstreamer-1.0`
## Walkthrough
There is nothing new in the initialization code in the main function: a
`playbin` pipeline is instantiated, an I/O watch is installed to track
keystrokes and a GLib main loop is executed.
Then, in the keyboard handler function:
@ -265,7 +247,7 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
break;
```
Pause / Playing toggle is handled with `gst_element_set_state()` as in
previous tutorials.
``` c
@ -285,18 +267,17 @@ case 'd':
Use S and s to double or halve the current playback rate, and d to
reverse the current playback direction. In both cases, the
`rate` variable is updated and `send_seek_event` is called. Let's
review this function.
``` c
/* Send seek event to change rate */
static void send_seek_event (CustomData *data) {
  gint64 position;
  GstEvent *seek_event;

  /* Obtain the current position, needed for the seek event */
  if (!gst_element_query_position (data->pipeline, GST_FORMAT_TIME, &position)) {
    g_printerr ("Unable to retrieve current position.\n");
    return;
  }
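  /* A sketch of the seek event creation: for positive rates, play from
   * the current position onwards; for negative rates, play the segment
   * that ends at the current position (assumed behaviour) */
  if (data->rate > 0) {
    seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME,
        GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
        GST_SEEK_TYPE_SET, position, GST_SEEK_TYPE_END, 0);
  } else {
    seek_event = gst_event_new_seek (data->rate, GST_FORMAT_TIME,
        GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
        GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_SET, position);
  }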
@ -336,7 +317,7 @@ if (data->video_sink == NULL) {
As explained in the Introduction, to avoid performing multiple Seeks,
the Event is sent to only one sink, in this case, the video sink. It is
obtained from `playbin` through the `video-sink` property. It is read
at this time instead of at initialization time because the actual sink may
change depending on the media contents, and this won't be known until
the pipeline is PLAYING and some media has been read.
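A sketch of that delivery (assuming a `video_sink` field cached in `CustomData`, as this tutorial does):

``` c
/* Fetch the sink lazily: it only exists once playback has started */
if (data->video_sink == NULL)
  g_object_get (data->pipeline, "video-sink", &data->video_sink, NULL);

/* Send the event directly to the video sink */
gst_element_send_event (data->video_sink, seek_event);
```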
@ -369,36 +350,25 @@ A new Step Event is created with `gst_event_new_step()`, whose
parameters basically specify the amount to skip (1 frame in the example)
and the new rate (which we do not change).
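A minimal sketch of that call (the two booleans request a flushing, non-intermediate step; `ABS (data->rate)` is assumed to keep the step rate positive, consistent with the description above):

``` c
/* Skip a single video frame without changing the rate's magnitude */
gst_element_send_event (data->video_sink,
    gst_event_new_step (GST_FORMAT_BUFFERS, 1, ABS (data->rate), TRUE, FALSE));
```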
The video sink is grabbed from `playbin` in case we didn't have it yet,
just like before.
And with this we are done. When testing this tutorial, keep in mind that
backward playback is not optimal in many elements.
> ![Warning](images/icons/emoticons/warning.png)
>
> Changing the playback rate might only work with local files. If you cannot modify it, try changing the URI passed to `playbin` in line 114 to a local URI, starting with `file:///`
## Conclusion
This tutorial has shown:
- How to change the playback rate using a Seek Event, created with
  `gst_event_new_seek()` and fed to the pipeline
  with `gst_element_send_event()`.
- How to advance a video frame-by-frame by using Step Events, created
  with `gst_event_new_step()`.
It has been a pleasure having you here, and see you soon!
View file
@ -12,8 +12,7 @@ any time, in a variety of ways. This tutorial shows:
- How to access and manipulate this data.
[](sdk-playback-tutorial-short-cutting-the-pipeline.md) explains
how to achieve the same goals in a playbin-based pipeline.
## Introduction
@ -69,8 +68,7 @@ this simplified vision should suffice for now.
As an example, a `filesrc` (a GStreamer element that reads files)
produces buffers with the “ANY” caps and no time-stamping information.
After demuxing (see [](sdk-basic-tutorial-dynamic-pipelines.md))
buffers can have some specific caps, for example “video/x-h264”. After
decoding, each buffer will contain a single video frame with raw caps
(for example, “video/x-raw”) and very precise time stamps indicating
@ -78,8 +76,7 @@ when should that frame be displayed.
### This tutorial
This tutorial expands [](sdk-basic-tutorial-multithreading-and-pad-availability.md) in
two ways: firstly, the `audiotestsrc` is replaced by an `appsrc` that
will generate the audio data. Secondly, a new branch is added to the
`tee` so data going into the audio sink and the wave display is also
@ -498,9 +495,7 @@ gst_buffer_unref (buffer);
```
Once we have the buffer ready, we pass it to `appsrc` with the
`push-buffer` action signal (see information box at the end of [](sdk-playback-tutorial-playbin-usage.md)), and then
`gst_buffer_unref()` it since we no longer need it.
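A sketch of that hand-off (assuming the `app_source` field of this tutorial's `CustomData`):

``` c
GstFlowReturn ret;

/* Hand the buffer to appsrc; it takes its own reference */
g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

/* We no longer need our reference */
gst_buffer_unref (buffer);
```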
``` c
@ -539,8 +534,7 @@ This tutorial has shown how applications can:
- Manipulate this data by accessing the `GstBuffer`.
In a playbin-based pipeline, the same goals are achieved in a slightly
different way. [](sdk-playback-tutorial-short-cutting-the-pipeline.md) shows
how to do it.
It has been a pleasure having you here, and see you soon!
View file
@ -1,6 +1,6 @@
# Basic tutorial 12: Streaming
## Goal
Playing media straight from the Internet without storing it locally is
known as Streaming. We have been doing it throughout the tutorials
@ -11,7 +11,7 @@ particular:
- How to enable buffering (to alleviate network problems)
- How to recover from interruptions (lost clock)
## Introduction
When streaming, media chunks are decoded and queued for presentation as
soon as they arrive from the network. This means that if a chunk is
@ -26,7 +26,7 @@ waiting.
As it turns out, this solution is already implemented in GStreamer, but
the previous tutorials have not been benefiting from it. Some elements,
like the `queue2` and `multiqueue` found inside `playbin`, are capable
of building this buffer and posting bus messages regarding the buffer level
(the state of the queue). An application wanting to have more network
resilience, then, should listen to these messages and pause playback if
@ -45,18 +45,9 @@ When the clock is lost, the application receives a message on the bus;
to select a new one, the application just needs to set the pipeline to
PAUSED and then to PLAYING again.
## A network-resilient example
Copy this code into a text file named `basic-tutorial-12.c`.
**basic-tutorial-12.c**
@ -162,35 +153,24 @@ int main(int argc, char *argv[]) {
}
```
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](sdk-installing-on-linux.md#InstallingonLinux-Build), [Mac OS X](sdk-installing-on-mac-osx.md#InstallingonMacOSX-Build) or [Windows](sdk-installing-on-windows.md#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-12.c -o basic-tutorial-12 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](sdk-installing-on-linux.md#InstallingonLinux-Run), [Mac OS X](sdk-installing-on-mac-osx.md#InstallingonMacOSX-Run) or [Windows](sdk-installing-on-windows.md#InstallingonWindows-Run).
>
> This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. In the console window, you should see a buffering message, and playback should only start when the buffering reaches 100%. This percentage might not change at all if your connection is fast enough and buffering is not required.
>
> Required libraries: `gstreamer-1.0`
## Walkthrough
The only special thing this tutorial does is react to certain messages;
therefore, the initialization code is very simple and should be
self-explanatory by now. The only new bit is the detection of live
streams:
``` c
@ -208,14 +188,14 @@ if (ret == GST_STATE_CHANGE_FAILURE) {
Live streams cannot be paused, so they behave in PAUSED state as if they
were in the PLAYING state. Setting live streams to PAUSED succeeds, but
returns `GST_STATE_CHANGE_NO_PREROLL`, instead of
`GST_STATE_CHANGE_SUCCESS` to indicate that this is a live stream. We
are receiving the NO_PREROLL return code even though we are trying to
set the pipeline to PLAYING, because state changes happen progressively
(from NULL to READY, to PAUSED and then to PLAYING).
We care about live streams because we want to disable buffering for
them, so we take note of the result of `gst_element_set_state()` in the
`is_live` variable.
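A sketch of that check (assuming the `is_live` flag in this tutorial's data structure):

``` c
GstStateChangeReturn ret;

ret = gst_element_set_state (data.pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
  g_printerr ("Unable to set the pipeline to the playing state.\n");
  gst_object_unref (data.pipeline);
  return -1;
} else if (ret == GST_STATE_CHANGE_NO_PREROLL) {
  /* Live sources never reach PAUSED: remember this to skip buffering */
  data.is_live = TRUE;
}
```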
Let's now review the interesting parts of the message parsing callback:
@ -239,7 +219,7 @@ case GST_MESSAGE_BUFFERING: {
First, if this is a live source, ignore buffering messages.
We parse the buffering message with `gst_message_parse_buffering()` to
retrieve the buffering level.
Then, we print the buffering level on the console and set the pipeline
@ -264,7 +244,7 @@ For the second network issue, the loss of clock, we simply set the
pipeline to PAUSED and back to PLAYING, so a new clock is selected,
waiting for new media chunks to be received if necessary.
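Putting both cases together, a minimal sketch of the corresponding `switch` branches (assuming the callback's `GstMessage *msg` and the `CustomData *data` pointer used throughout these tutorials; the pause-below-100% policy is the one described above):

``` c
case GST_MESSAGE_BUFFERING: {
  gint percent = 0;

  /* If the stream is live, we do not care about buffering */
  if (data->is_live) break;

  gst_message_parse_buffering (msg, &percent);
  g_print ("Buffering (%3d%%)\r", percent);
  /* Wait until buffering is complete before starting/resuming playback */
  if (percent < 100)
    gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
  else
    gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
  break;
}
case GST_MESSAGE_CLOCK_LOST:
  /* Get a new clock by pausing and resuming */
  gst_element_set_state (data->pipeline, GST_STATE_PAUSED);
  gst_element_set_state (data->pipeline, GST_STATE_PLAYING);
  break;
```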
## Conclusion
This tutorial has described how to add network resilience to your
application with two very simple precautions:
@ -275,11 +255,4 @@ application with two very simple precautions:
Handling these messages improves the application's response to network
problems, increasing the overall playback smoothness.
It has been a pleasure having you here, and see you soon!