Update all playback tutorials

Olivier Crête 2016-06-15 21:26:59 -04:00
parent 820891918c
commit 06563a56ef
11 changed files with 507 additions and 779 deletions

TODO.md

@@ -4,15 +4,6 @@ This is just a simple TODO list to follow progress of the port from
gstreamer.com content to hotdoc
Pages to review:
- sdk-android-tutorials.md
- sdk-android-tutorial-video.md
- sdk-android-tutorial-media-player.md
@@ -48,6 +39,19 @@ Code:
Reviewed pages:
- index.md
- sdk-basic-tutorials.md
- sdk-basic-tutorial-concepts.md
- sdk-basic-tutorial-dynamic-pipelines.md
- sdk-basic-tutorial-time-management.md
- sdk-basic-tutorial-media-formats-and-pad-capabilities.md
- sdk-basic-tutorial-multithreading-and-pad-availability.md
- sdk-basic-tutorial-short-cutting-the-pipeline.md
- sdk-basic-tutorial-media-information-gathering.md
- sdk-basic-tutorial-gstreamer-tools.md
- sdk-basic-tutorial-debugging-tools.md
- sdk-basic-tutorial-streaming.md
- sdk-basic-tutorial-playback-speed.md
- sdk-basic-tutorial-handy-elements.md
- sdk-basic-tutorial-platform-specific-elements.md
- sdk-installing.md
- sdk-installing-for-android-development.md
- sdk-building-from-source-using-cerbero.md
@@ -56,23 +60,19 @@ Reviewed pages:
- sdk-android-tutorial-link-against-gstreamer.md
- sdk-android-tutorial-a-running-pipeline.md
- sdk-api-reference.md
- sdk-playback-tutorials.md
- sdk-playback-tutorial-playbin-usage.md
- sdk-playback-tutorial-subtitle-management.md
- sdk-playback-tutorial-short-cutting-the-pipeline.md
- sdk-playback-tutorial-progressive-streaming.md
- sdk-playback-tutorial-color-balance.md
- sdk-playback-tutorial-audio-visualization.md
- sdk-playback-tutorial-custom-playbin-sinks.md
- sdk-playback-tutorial-hardware-accelerated-video-decoding.md
- sdk-playback-tutorial-digital-audio-pass-through.md
- sdk-basic-tutorial-hello-world.md
- sdk-gst-inspect.md
- gst-launch.md
For-later pages:
- sdk-qt-tutorials.md [tpm: this should all be rewritten from scratch with qmlglsink; QtGStreamer is outdated and unmaintained, we should not promote it]

sdk-playback-tutorial-audio-visualization.md

@@ -1,6 +1,6 @@
# Playback tutorial 6: Audio visualization
## Goal
GStreamer comes with a set of elements that turn audio into video. They
can be used for scientific visualization or to spice up your music
@@ -9,33 +9,24 @@ player, for example. This tutorial shows:
- How to enable audio visualization
- How to select the visualization element
## Introduction
Enabling audio visualization in `playbin` is actually very easy. Just
set the appropriate `playbin` flag and, when an audio-only stream is
found, it will instantiate the necessary elements to create and display
the visualization.
If you want to specify the actual element that you want to use to
generate the visualization, you instantiate it yourself and then tell
`playbin` about it through the `vis-plugin` property.
This tutorial searches the GStreamer registry for all the elements of
the Visualization class, tries to select `goom` (or another one if it is
not available) and passes it to `playbin`.
## A fancy music player
Copy this code into a text file named `playback-tutorial-6.c`.
**playback-tutorial-6.c**
@@ -72,7 +63,7 @@ int main(int argc, char *argv[]) {
  gst_init (&argc, &argv);
  /* Get a list of all visualization plugins */
  list = gst_registry_feature_filter (gst_registry_get (), filter_vis_features, FALSE, NULL);
  /* Print their names */
  g_print("Available visualization plugins:\n");
@@ -131,34 +122,24 @@ }
}
```
> ![information] If you need help to compile this code, refer to the
> **Building the tutorials** section for your platform: [Mac] or
> [Windows] or use this specific command on Linux:
>
> `` gcc playback-tutorial-6.c -o playback-tutorial-6 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the
> tutorials** section for your platform: [Mac OS X], [Windows][1], for
> [iOS] or for [android].
>
> This tutorial plays music streamed from the [HBR1](http://www.hbr1.com/) Internet radio station. A window should open displaying somewhat psychedelic color patterns moving with the music. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed.
>
> Required libraries: `gstreamer-1.0`
## Walkthrough
First off, we indicate to `playbin` that we want an audio visualization by
setting the `GST_PLAY_FLAG_VIS` flag. If the media already contains
video, this flag has no effect.
``` c
@@ -168,19 +149,19 @@ flags |= GST_PLAY_FLAG_VIS;
g_object_set (pipeline, "flags", flags, NULL);
```
If no visualization plugin is enforced by the user, `playbin` will use
`goom` (audio visualization will be disabled if `goom` is not
available). The rest of the tutorial shows how to find out the available
visualization elements and enforce one of them on `playbin`.
``` c
/* Get a list of all visualization plugins */
list = gst_registry_feature_filter (gst_registry_get (), filter_vis_features, FALSE, NULL);
```
`gst_registry_feature_filter()` examines all elements currently in the
GStreamer registry and selects those for which
the `filter_vis_features` function returns TRUE. This function selects
only the Visualization plugins:
``` c
@@ -199,15 +180,15 @@ static gboolean filter_vis_features (GstPluginFeature *feature, gpointer data) {
```
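The hunk above shows only the function's signature; a minimal sketch of its body, consistent with the description that follows (discard non-factories, then keep only factories whose class includes “Visualization”), might be:
``` c
static gboolean filter_vis_features (GstPluginFeature *feature, gpointer data) {
  GstElementFactory *factory;

  /* Discard any Feature which is not an Element Factory */
  if (!GST_IS_ELEMENT_FACTORY (feature))
    return FALSE;
  factory = GST_ELEMENT_FACTORY (feature);

  /* Keep only factories whose class string mentions "Visualization" */
  if (!g_strrstr (gst_element_factory_get_klass (factory), "Visualization"))
    return FALSE;

  return TRUE;
}
```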
A bit of theory regarding the organization of GStreamer elements is in
order here: each of the files that GStreamer loads at runtime is known as a
Plugin (`GstPlugin`). A Plugin can contain many Features
(`GstPluginFeature`). There are different kinds of Features, among them,
the Element Factories (`GstElementFactory`) that we have been using to
build Elements (`GstElement`).
This function simply disregards all Features which are not Factories,
and then all Factories whose class (obtained with
`gst_element_factory_get_klass()`) does not include “Visualization”. As
stated in the documentation for `GstElementFactory`, a Factory's class
is a “string describing the type of element, as an unordered list
separated with slashes (/)”. Examples of classes are “Source/Network”,
@@ -231,7 +212,7 @@ for (walk = list; walk != NULL; walk = g_list_next (walk)) {
```
Once we have the list of Visualization plugins, we print their names
(`gst_element_factory_get_longname()`) and choose one (in this case,
GOOM).
``` c
@@ -242,8 +223,8 @@ if (!vis_plugin)
  return -1;
```
The selected factory is used to instantiate an actual `GstElement` which
is then passed to `playbin` through the `vis-plugin` property:
``` c
/* set vis plugin for playbin */
@@ -252,20 +233,22 @@ g_object_set (pipeline, "vis-plugin", vis_plugin, NULL);
And we are done.
## Conclusion
This tutorial has shown:
- How to enable Audio Visualization in `playbin` with the
  `GST_PLAY_FLAG_VIS` flag
- How to enforce one particular visualization element with the
  `vis-plugin` `playbin` property
It has been a pleasure having you here, and see you soon\!
[information]: images/icons/emoticons/information.png
[Mac]: sdk-installing-on-mac-osx.md
[Windows]: Installing+on+Windows
[Mac OS X]: sdk-installing-on-mac-osx.md#building-the-tutorials
[1]: sdk-installing-on-windows.md#running-the-tutorials
[iOS]: sdk-installing-for-ios-development.md#building-the-tutorials
[android]: sdk-installing-for-android-development.md#building-the-tutorials
[warning]: images/icons/emoticons/warning.png

sdk-playback-tutorial-color-balance.md

@@ -1,6 +1,6 @@
# Playback tutorial 5: Color Balance
## Goal
Brightness, Contrast, Hue and Saturation are common video adjustments,
which are collectively known as Color Balance settings in GStreamer.
@@ -9,43 +9,33 @@ This tutorial shows:
- How to find out the available color balance channels
- How to change them
## Introduction
[](sdk-basic-tutorial-toolkit-integration.md) has
already explained the concept of GObject interfaces: applications use
them to find out if certain functionality is available, regardless of
the actual element which implements it.
`playbin` implements the Color Balance interface (`GstColorBalance`),
which allows access to the color balance settings. If any of the
elements in the `playbin` pipeline support this interface,
`playbin` simply forwards it to the application, otherwise, a
colorbalance element is inserted in the pipeline.
This interface allows querying for the available color balance channels
(`GstColorBalanceChannel`), along with their name and valid range of
values, and then modifying the current value of any of them.
## Color balance example
Copy this code into a text file named `playback-tutorial-5.c`.
**playback-tutorial-5.c**
``` c
#include <string.h>
#include <stdio.h>
#include <gst/gst.h>
#include <gst/video/colorbalance.h>
typedef struct _CustomData {
  GstElement *pipeline;
@@ -161,7 +151,7 @@ int main(int argc, char *argv[]) {
  data.pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
  /* Add a keyboard watch so we get notified of keystrokes */
#ifdef G_OS_WIN32
  io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
#else
  io_stdin = g_io_channel_unix_new (fileno (stdin));
@@ -190,36 +180,25 @@ }
}
```
> ![information] If you need help to compile this code, refer to the
> **Building the tutorials** section for your platform: [Mac] or
> [Windows] or use this specific command on Linux:
>
> `` gcc playback-tutorial-5.c -o playback-tutorial-5 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-video-1.0` ``
>
> If you need help to run this code, refer to the **Running the
> tutorials** section for your platform: [Mac OS X], [Windows][1], for
> [iOS] or for [android].
>
> This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed.
>
> The console should print all commands (each command is a single upper-case or lower-case letter) and list all available Color Balance channels, typically CONTRAST, BRIGHTNESS, HUE and SATURATION. Type each command (letter) followed by the Enter key.
>
> Required libraries: `gstreamer-1.0 gstreamer-video-1.0`
## Walkthrough
The `main()` function is fairly simple. A `playbin` pipeline is
instantiated and set to run, and a keyboard watch is installed so
keystrokes can be monitored.
@@ -242,12 +221,12 @@ static void print_current_values (GstElement *pipeline) {
This method prints the current value for all channels, and exemplifies
how to retrieve the list of channels. This is accomplished through the
`gst_color_balance_list_channels()` method. It returns a `GList` which
needs to be traversed.
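The function body is elided by the diff; a sketch of the traversal it describes, assuming the pipeline exposes the `GstColorBalance` interface, could be:
``` c
static void print_current_values (GstElement *pipeline) {
  const GList *channels, *l;

  /* Ask the Color Balance interface for its list of channels */
  channels = gst_color_balance_list_channels (GST_COLOR_BALANCE (pipeline));
  for (l = channels; l != NULL; l = l->next) {
    GstColorBalanceChannel *channel = (GstColorBalanceChannel *) l->data;
    gint value = gst_color_balance_get_value (GST_COLOR_BALANCE (pipeline), channel);

    /* Show the value as a percentage of the channel's dynamic range */
    g_print ("%s: %3d%%\n", channel->label,
        100 * (value - channel->min_value) / (channel->max_value - channel->min_value));
  }
}
```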
Each element in the list is a `GstColorBalanceChannel` structure,
informing of the channel's name, minimum value and maximum value.
`gst_color_balance_get_value()` can then be called on each channel to
retrieve the current value.
In this example, the minimum and maximum values are used to output the In this example, the minimum and maximum values are used to output the
@@ -300,26 +279,29 @@ stored and indexed by something more efficient than a string.
The current value for the channel is then retrieved, changed (the
increment is proportional to its dynamic range), clamped (to avoid
out-of-range values) and set using `gst_color_balance_set_value()`.
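A hedged sketch of that retrieve–change–clamp–set sequence (the helper name `update_color_channel` and the 10% `step` are illustrative, not taken from the diff):
``` c
/* Hypothetical helper: nudge one channel up or down by 10% of its range */
static void update_color_channel (GstColorBalance *balance,
    GstColorBalanceChannel *channel, gboolean increase) {
  gint value = gst_color_balance_get_value (balance, channel);
  gint step = (channel->max_value - channel->min_value) / 10;

  value += increase ? step : -step;
  /* Clamp to the channel's valid range before applying it */
  value = CLAMP (value, channel->min_value, channel->max_value);
  gst_color_balance_set_value (balance, channel, value);
}
```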
And there is not much more to it. Run the program and observe the effect
of changing each of the channels in real time.
## Conclusion
This tutorial has shown how to use the color balance interface.
Particularly, it has shown:
- How to retrieve the list of available color balance channels
  with `gst_color_balance_list_channels()`
- How to manipulate the current value of each channel using
  `gst_color_balance_get_value()` and `gst_color_balance_set_value()`
It has been a pleasure having you here, and see you soon\!
[information]: images/icons/emoticons/information.png
[Mac]: sdk-installing-on-mac-osx.md
[Windows]: Installing+on+Windows
[Mac OS X]: sdk-installing-on-mac-osx.md#building-the-tutorials
[1]: sdk-installing-on-windows.md#running-the-tutorials
[iOS]: sdk-installing-for-ios-development.md#building-the-tutorials
[android]: sdk-installing-for-android-development.md#building-the-tutorials
[warning]: images/icons/emoticons/warning.png

sdk-playback-tutorial-custom-playbin-sinks.md

@@ -1,55 +1,46 @@
# Playback tutorial 7: Custom playbin sinks
## Goal
`playbin` can be further customized by manually selecting its audio and
video sinks. This allows applications to rely on `playbin` to retrieve
and decode the media and then manage the final render/display
themselves. This tutorial shows:
- How to replace the sinks selected by `playbin`.
- How to use a complex pipeline as a sink.
## Introduction
Two properties of `playbin` allow selecting the desired audio and video
sinks: `audio-sink` and `video-sink` (respectively). The application
only needs to instantiate the appropriate `GstElement` and pass it to
`playbin` through these properties.
This method, though, only allows using a single Element as sink. If a
more complex pipeline is required, for example, an equalizer plus an
audio sink, it needs to be wrapped in a Bin, so it looks to
`playbin` as if it was a single Element.
A Bin (`GstBin`) is a container that encapsulates partial pipelines so
they can be managed as single elements. As an example, the
`GstPipeline` we have been using in all tutorials is a type of
`GstBin`, which does not interact with external Elements. Elements
inside a Bin connect to external elements through Ghost Pads
(`GstGhostPad`), that is, Pads on the surface of the Bin which simply
forward data from an external Pad to a given Pad on an internal Element.
![](images/bin-element-ghost.png)
**Figure 1:** A Bin with two Elements and one Ghost Pad.
`GstBin`s are also a type of `GstElement`, so they can be used wherever
an Element is required, in particular, as sinks for `playbin` (and they
are then known as **sink-bins**).
## An equalized player
Copy this code into a text file named `playback-tutorial-7.c`.
**playback-tutorial-7.c**
@@ -111,31 +102,21 @@ int main(int argc, char *argv[]) {
}
```
> ![information] If you need help to compile this code, refer to the
> **Building the tutorials** section for your platform: [Mac] or
> [Windows] or use this specific command on Linux:
>
> `` gcc playback-tutorial-7.c -o playback-tutorial-7 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the
> tutorials** section for your platform: [Mac OS X], [Windows][1], for
> [iOS] or for [android].
>
> This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The higher frequency bands have been attenuated, so the movie sound should have a more powerful bass component.
>
> Required libraries: `gstreamer-1.0`
## Walkthrough
``` c
/* Create the elements inside the sink bin */
@@ -149,7 +130,7 @@ if (!equalizer || !convert || !sink) {
```
All the Elements that compose our sink-bin are instantiated. We use an
`equalizer-3bands` and an `autoaudiosink`, with an `audioconvert` in
between, because we are not sure of the capabilities of the audio sink
(since they are hardware-dependent).
@@ -175,14 +156,13 @@ Now we need to create a Ghost Pad so this partial pipeline inside the
Bin can be connected to the outside. This Ghost Pad will be connected to
a Pad in one of the internal Elements (the sink pad of the equalizer),
so we retrieve this Pad with `gst_element_get_static_pad()`. Remember
from [](sdk-basic-tutorial-multithreading-and-pad-availability.md) that
if this was a Request Pad instead of an Always Pad, we would need to use
`gst_element_request_pad()`.
The Ghost Pad is created with `gst_ghost_pad_new()` (pointing to the
inner Pad we just acquired), and activated with `gst_pad_set_active()`.
It is then added to the Bin with `gst_element_add_pad()`, transferring
ownership of the Ghost Pad to the bin, so we do not have to worry about
releasing it.
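The corresponding code is elided by the hunk; a minimal sketch of the sequence just described, reusing the `equalizer`, `convert` and `sink` elements created above (with `bin` a `GstElement *` and `pad`, `ghost_pad` `GstPad *`), might be:
``` c
/* Create the sink bin, add the elements and link them */
bin = gst_bin_new ("audio_sink_bin");
gst_bin_add_many (GST_BIN (bin), equalizer, convert, sink, NULL);
gst_element_link_many (equalizer, convert, sink, NULL);

/* Expose the equalizer's sink pad on the bin through a Ghost Pad */
pad = gst_element_get_static_pad (equalizer, "sink");
ghost_pad = gst_ghost_pad_new ("sink", pad);
gst_pad_set_active (ghost_pad, TRUE);
gst_element_add_pad (bin, ghost_pad); /* the bin takes ownership */
gst_object_unref (pad);
```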
@@ -190,14 +170,14 @@ Finally, the sink Pad we obtained from the equalizer needs to be released
with `gst_object_unref()`.
At this point, we have a functional sink-bin, which we can use as the
audio sink in `playbin`. We just need to instruct `playbin` to use it:
``` c
/* Set playbin's audio sink to be our sink bin */
g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);
```
It is as simple as setting the `audio-sink` property on `playbin` to
the newly created sink.
``` c
@@ -209,33 +189,33 @@ g_object_set (G_OBJECT (equalizer), "band2", (gdouble)-24.0, NULL);
The only bit remaining is to configure the equalizer. For this example,
the two higher frequency bands are set to the maximum attenuation so the
bass is boosted. Play a bit with the values to feel the difference (look
at the documentation for the `equalizer-3bands` element for the allowed
range of values).
## Exercise
Build a video bin instead of an audio bin, using one of the many
interesting video filters GStreamer offers, like `solarize`,
`vertigotv` or any of the Elements in the `effectv` plugin. Remember to
use the color space conversion element `videoconvert` if your
pipeline fails to link due to incompatible caps.
## Conclusion
This tutorial has shown:
- How to set your own sinks to `playbin` using the audio-sink and
  video-sink properties.
- How to wrap a piece of pipeline into a `GstBin` so it can be used as
  a **sink-bin** by `playbin`.
It has been a pleasure having you here, and see you soon\!
[information]: images/icons/emoticons/information.png
[Mac]: sdk-installing-on-mac-osx.md
[Windows]: Installing+on+Windows
[Mac OS X]: sdk-installing-on-mac-osx.md#building-the-tutorials
[1]: sdk-installing-on-windows.md#running-the-tutorials
[iOS]: sdk-installing-for-ios-development.md#building-the-tutorials
[android]: sdk-installing-for-android-development.md#building-the-tutorials
[warning]: images/icons/emoticons/warning.png

sdk-playback-tutorial-digital-audio-pass-through.md

@@ -1,10 +1,10 @@
# Playback tutorial 9: Digital audio pass-through
## Goal
This tutorial shows how GStreamer handles digital audio pass-through.
## Introduction
Besides the common analog format, high-end audio systems usually also
accept data in digital form, either compressed or uncompressed. This is
@@ -23,7 +23,7 @@ In this scenario, GStreamer does not need to perform audio decoding; it
can simply output the encoded data, acting in *pass-through* mode, and
let the external audio system perform the decoding.
## Inner workings of GStreamer audio sinks
First off, digital audio output must be enabled at the system level. The
method to achieve this depends on the operating system, but it generally
@@ -31,25 +31,24 @@ involves going to the audio control panel and activating a checkbox
reading “Digital Audio Output” or similar.
The main GStreamer audio sinks for each platform, Pulse Audio
(`pulsesink`) for Linux, `osxaudiosink` for OS X and Direct Sound
(`directsoundsink`) for Windows, detect when digital audio output is
available and change their input caps accordingly to accept encoded
data. For example, these elements typically accept `audio/x-raw` data:
when digital audio output is enabled in the system, they may also
accept `audio/mpeg`, `audio/x-ac3`, `audio/x-eac3` or `audio/x-dts`.
Then, when `playbin` builds the decoding pipeline, it realizes that the
audio sink can be directly connected to the encoded data (typically
coming out of a demuxer), so there is no need for a decoder. This
process is automatic and does not need any action from the application.
On Linux, there exist other audio sinks, like Alsa (`alsasink`) which
work differently (a “digital device” needs to be manually selected
through the `device` property of the sink). Pulse Audio, though, is the
commonly preferred audio sink on Linux.
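For example, a hedged sketch of selecting such a device (the device string is purely illustrative; actual names depend on your hardware, see `aplay -L`):
``` c
/* Hypothetical: point alsasink directly at an S/PDIF (IEC958) device */
GstElement *sink = gst_element_factory_make ("alsasink", "audio_sink");
g_object_set (sink, "device", "iec958:CARD=PCH,DEV=0", NULL);
```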
## Precautions with digital formats
When Digital Audio Output is enabled at the system level, the GStreamer
audio sinks automatically expose all possible digital audio caps,
@@ -60,8 +59,8 @@ supported, and, in fact, the cable can even be disconnected during this
process.
For example, after enabling Digital Audio Output in the system's Control
Panel, `directsoundsink` will automatically expose `audio/x-ac3`,
`audio/x-eac3` and `audio/x-dts` caps in addition to `audio/x-raw`.
However, one particular external decoder might only understand raw
integer streams and would try to play the compressed data as such (a
painful experience for your ears, rest assured).
@@ -77,28 +76,26 @@ configuration panel, from the same place where Digital Audio Output is
enabled, but, unfortunately, this option is not available in all audio
drivers.
Another solution involves using a custom sinkbin (see
[](sdk-playback-tutorial-custom-playbin-sinks.md)) which includes a
`capsfilter` element (see [](sdk-basic-tutorial-handy-elements.md))
and an audio sink. The caps that the external decoder supports are
then set in the capsfilter so the wrong format is not output. This
allows the application to enforce the appropriate format instead of
relying on the user to have the system correctly configured. This
still requires user intervention, but it can be used regardless of
the options the audio driver offers.
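A minimal sketch of such a sink-bin, assuming the external decoder only understands AC-3 (the caps string and the `pulsesink` choice are illustrative, not from the diff):
``` c
GstElement *bin = gst_bin_new ("audio_sink_bin");
GstElement *filter = gst_element_factory_make ("capsfilter", "filter");
GstElement *sink = gst_element_factory_make ("pulsesink", "audio_sink");
GstCaps *caps = gst_caps_from_string ("audio/x-ac3");
GstPad *pad, *ghost;

/* Only AC-3 may reach the sink; any other format will fail to link */
g_object_set (filter, "caps", caps, NULL);
gst_caps_unref (caps);
gst_bin_add_many (GST_BIN (bin), filter, sink, NULL);
gst_element_link (filter, sink);

/* Expose the capsfilter's sink pad so playbin can connect to the bin */
pad = gst_element_get_static_pad (filter, "sink");
ghost = gst_ghost_pad_new ("sink", pad);
gst_pad_set_active (ghost, TRUE);
gst_element_add_pad (bin, ghost);
gst_object_unref (pad);

/* Hand the bin to playbin (a `pipeline` variable holding playbin is assumed) */
g_object_set (pipeline, "audio-sink", bin, NULL);
```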
Please do not use `autoaudiosink` as the audio sink, as it currently
only supports raw audio, and will ignore any compressed format.
## Conclusion
This tutorial has shown a bit of how GStreamer deals with digital audio.
In particular, it has shown that:
- Applications using `playbin` do not need to do anything special to
  enable digital audio output: it is managed from the audio control
  panel of the operating system.
It has been a pleasure having you here, and see you soon!

sdk-playback-tutorial-hardware-accelerated-video-decoding.md

@@ -1,6 +1,6 @@
# Playback tutorial 8: Hardware-accelerated video decoding
## Goal
Hardware-accelerated video decoding has rapidly become a necessity, as
low-power devices grow more common. This tutorial (more of a lecture,
@@ -11,7 +11,7 @@ Sneak peek: if properly setup, you do not need to do anything special to
activate hardware acceleration; GStreamer automatically takes advantage
of it.
## Introduction
Video decoding can be an extremely CPU-intensive task, especially for
higher resolutions like 1080p HDTV. Fortunately, modern graphics cards,
@@ -20,152 +20,114 @@ allowing the CPU to concentrate on other duties. Having dedicated
hardware becomes essential for low-power CPUs which are simply incapable
of decoding such media fast enough.
In the current state of things (June 2016) each GPU manufacturer offers
a different method to access their hardware (a different API), and a
strong industry standard has not emerged yet.
As of June 2016, there exist at least 8 different video decoding
acceleration APIs:
- [VAAPI](http://en.wikipedia.org/wiki/Video_Acceleration_API) (*Video
  Acceleration API*): Initially designed by
  [Intel](http://en.wikipedia.org/wiki/Intel) in 2007, targeted at the X
  Window System on Unix-based operating systems, now open-source. It now also
  supports Wayland through dmabuf. It is
  currently not limited to Intel GPUs as other manufacturers are free to
  use this API, for example, [Imagination
  Technologies](http://en.wikipedia.org/wiki/Imagination_Technologies) or
  [S3 Graphics](http://en.wikipedia.org/wiki/S3_Graphics). Accessible to
  GStreamer through the [gstreamer-vaapi](https://cgit.freedesktop.org/gstreamer/gstreamer-vaapi/) package.
- [VDPAU](http://en.wikipedia.org/wiki/VDPAU) (*Video Decode and
  Presentation API for UNIX*): Initially designed by
  [NVidia](http://en.wikipedia.org/wiki/NVidia) in 2008, targeted at the X
  Window System on Unix-based operating systems, now open-source. Although
  it is also an open-source library, no manufacturer other than NVidia is
  using it yet. Accessible to GStreamer through
  the [vdpau](http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/sys/vdpau) element in plugins-bad.
- [OpenMAX](http://en.wikipedia.org/wiki/OpenMAX) (*Open Media
  Acceleration*): Managed by the non-profit technology consortium [Khronos
  Group](http://en.wikipedia.org/wiki/Khronos_Group "Khronos Group"),
  it is a "royalty-free, cross-platform set of C-language programming
  interfaces that provides abstractions for routines especially useful for
  audio, video, and still images". Accessible to GStreamer through
  the [gst-omx](http://git.freedesktop.org/gstreamer/gst-omx) plugin.
- [OVD](http://developer.amd.com/sdks/AMDAPPSDK/assets/OpenVideo_Decode_API.PDF)
  (*Open Video Decode*): Another API from [AMD
  Graphics](http://en.wikipedia.org/wiki/AMD_Graphics), designed to be a
  platform-agnostic method for software developers to leverage the
  [Universal Video
  Decode](http://en.wikipedia.org/wiki/Unified_Video_Decoder) (UVD)
  hardware inside AMD Radeon graphics cards. Currently unavailable to
  GStreamer.
- [DCE](http://en.wikipedia.org/wiki/Distributed_Codec_Engine)
  (*Distributed Codec Engine*): An open source software library ("libdce")
  and API specification by [Texas
  Instruments](http://en.wikipedia.org/wiki/Texas_Instruments), targeted
  at Linux systems and ARM platforms. Accessible to GStreamer through
  the [gstreamer-ducati](https://github.com/robclark/gst-ducati) plugin.
- [Android
  MediaCodec](https://developer.android.com/reference/android/media/MediaCodec.html): This is Android's API to access the device's
  hardware decoder and encoder if available. This is accessible through the
  `androidmedia` plugin in gst-plugins-bad. This includes both encoding and
  decoding.
- Apple VideoToolbox Framework: Apple's API to access hardware codecs is available
  through the `applemedia` plugin, which includes both encoding through
  the `vtenc` element and decoding through the `vtdec` element.
- Video4Linux: Recent Linux kernels have a kernel API to expose
  hardware codecs in a standard way; this is now supported by the
  `v4l2` plugin in `gst-plugins-good`. This can support both decoding
  and encoding depending on the platform.
## Inner workings of hardware-accelerated video decoding plugins
These APIs generally offer a number of functionalities, like video
decoding, post-processing, or presentation of the decoded
frames. Correspondingly, plugins generally offer a different GStreamer
element for each of these functions, so pipelines can be built to
accommodate any need.
For example, the `gstreamer-vaapi` plugin offers the `vaapidecode`,
`vaapipostproc` and `vaapisink` elements that allow
hardware-accelerated decoding through VAAPI, post-processing of decoded
frames, and presentation of GPU frames, respectively.
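As a hedged illustration (element availability depends on your driver and on gstreamer-vaapi being installed; the file name is made up), a manually-built pipeline using these elements might look like:
``` c
/* playbin assembles the equivalent of this automatically when VAAPI
 * elements are available and ranked high enough */
GstElement *pipeline = gst_parse_launch (
    "filesrc location=movie.mp4 ! qtdemux ! h264parse ! "
    "vaapidecode ! vaapipostproc ! vaapisink", NULL);
```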
It is important to distinguish between conventional GStreamer frames,
which reside in system memory, and frames generated by
hardware-accelerated APIs. The latter reside in GPU memory and cannot
be touched by GStreamer. They can usually be downloaded to system
memory and treated as conventional GStreamer frames when they are
mapped, but it is far more efficient to leave them in the GPU and
display them from there.
GStreamer needs to keep track of where these “hardware buffers” are
though, so conventional buffers still travel from element to
element. They look like regular buffers, but mapping their content is
much slower as it has to be retrieved from the special memory used by
hardware-accelerated elements. These special memory types are
negotiated using the allocation query mechanism.
This all means that, if a particular hardware acceleration API is
present in the system, and the corresponding GStreamer plugin is also
available, auto-plugging elements like `playbin` are free to use
hardware acceleration to build their pipelines; the application does not
need to do anything special to enable it. Almost:
When `playbin` has to choose among different equally valid elements,
like conventional software decoding (through `vp8dec`, for example) or
hardware-accelerated decoding (through `vaapidecode`, for example), it
uses their *rank* to decide. The rank is a property of each element that
indicates its priority; `playbin` will simply select the element that
is able to build a complete pipeline and has the highest rank.
So, whether `playbin` will use hardware acceleration or not will depend
on the relative ranks of all elements capable of dealing with that media
type. Therefore, the easiest way to make sure hardware acceleration is
enabled or disabled is by changing the rank of the associated element,
@ -195,9 +157,9 @@ static void enable_factory (const gchar *name, gboolean enable) {
```
The first parameter passed to this method is the name of the element to
modify, for example, `vaapidecode` or `fluvadec`.
The key method is `gst_plugin_feature_set_rank()`, which will set the
rank of the requested element factory to the desired level. For
convenience, ranks are divided into NONE, MARGINAL, SECONDARY and PRIMARY,
but any number will do. When enabling an element, we set it to
@ -205,132 +167,8 @@ PRIMARY+1, so it has a higher rank than the rest of elements which
commonly have PRIMARY rank. Setting an element's rank to NONE will make
the auto-plugging mechanism never select it.
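
For illustration, a helper built around this call might look as follows
(a minimal sketch; the names and error handling here are ours, and may
differ from the complete tutorial source):

``` c
/* Sketch: look up an element factory by name and raise or lower its
 * rank, so auto-pluggers like playbin prefer it or never pick it */
static void enable_factory (const gchar *name, gboolean enable) {
  GstElementFactory *factory = gst_element_factory_find (name);
  if (!factory)
    return;

  gst_plugin_feature_set_rank (GST_PLUGIN_FEATURE (factory),
      enable ? GST_RANK_PRIMARY + 1 : GST_RANK_NONE);

  /* Re-register the feature so the new rank takes effect */
  gst_registry_add_feature (gst_registry_get (), GST_PLUGIN_FEATURE (factory));
}
```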
> ![warning] The GStreamer developers often rank hardware decoders lower than
> the software ones when they are defective. This should act as a warning.
# Conclusion
@ -343,4 +181,6 @@ accelerated video decoding. Particularly,
- Hardware acceleration can be enabled or disabled by changing the
  rank of the decoding element with `gst_plugin_feature_set_rank()`.
It has been a pleasure having you here, and see you soon!
[warning]: images/icons/emoticons/warning.png
@ -1,10 +1,10 @@
# Playback tutorial 1: Playbin usage
## Goal
We have already worked with the `playbin` element, which is capable of
building a complete playback pipeline without much work on our side.
This tutorial shows how to further customize `playbin` in case its
default values do not suit our particular needs.

We will learn:
@ -15,10 +15,10 @@ We will learn:
- How to gather information regarding each stream.
As a side note: in GStreamer 0.10 this element was called `playbin2`, to
distinguish it from the original (and deprecated) `playbin`; in
GStreamer 1.0 it simply took over the `playbin` name, and nobody should
be using the old element.
## Introduction
More often than not, multiple audio, video and subtitle streams can be
found embedded in a single file. The most common cases are regular
@ -52,9 +52,9 @@ The following code recovers the amount of streams in the file, their
associated metadata, and allows switching the audio stream while the
media is playing.
## The multilingual player
Copy this code into a text file named `playback-tutorial-1.c` (or find
it in the SDK installation).

**playback-tutorial-1.c**
@ -123,7 +123,7 @@ int main(int argc, char *argv[]) {
  gst_bus_add_watch (bus, (GstBusFunc)handle_message, &data);

  /* Add a keyboard watch so we get notified of keystrokes */
#ifdef G_OS_WIN32
  io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
#else
  io_stdin = g_io_channel_unix_new (fileno (stdin));
@ -266,7 +266,7 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
  gchar *str = NULL;

  if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) == G_IO_STATUS_NORMAL) {
    int index = g_ascii_strtoull (str, NULL, 0);
    if (index < 0 || index >= data->n_audio) {
      g_printerr ("Index out of bounds\n");
    } else {
@ -283,26 +283,27 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
> ![information] If you need help to compile this code, refer to the
> **Building the tutorials** section for your platform: [Mac] or
> [Windows] or use this specific command on Linux:
>
> `` gcc playback-tutorial-1.c -o playback-tutorial-1 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the
> tutorials** section for your platform: [Mac OS X], [Windows][1], for
> [iOS] or for [android].
>
> This tutorial opens a window and displays a movie, with accompanying
> audio. The media is fetched from the Internet, so the window might take
> a few seconds to appear, depending on your connection speed. The number
> of audio streams is shown in the terminal, and the user can switch from
> one to another by entering a number and pressing enter. A small delay is
> to be expected.
>
> Bear in mind that there is no latency management (buffering), so on slow
> connections, the movie might stop after a few seconds. See how
> [](sdk-basic-tutorial-streaming.md) solves this issue.
>
> Required libraries: `gstreamer-1.0`

## Walkthrough
``` c
/* Structure to contain all our information, so we can pass it around */
@ -338,9 +339,9 @@ typedef enum {
Later we are going to set some of `playbin`'s flags. We would like to
have a handy enum that allows manipulating these flags easily, but since
`playbin` is a plug-in and not a part of the GStreamer core, this enum
is not available to us. The “trick” is simply to declare this enum in
our code, as it appears in the `playbin` documentation: `GstPlayFlags`.
GObject allows introspection, so the possible values for these flags can
be retrieved at runtime without using this trick, but in a far more
cumbersome way.
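
A sketch of that declaration, limited to the flags this tutorial
touches (the values match the `playbin` documentation):

``` c
typedef enum {
  GST_PLAY_FLAG_VIDEO = (1 << 0), /* We want video output */
  GST_PLAY_FLAG_AUDIO = (1 << 1), /* We want audio output */
  GST_PLAY_FLAG_TEXT  = (1 << 2)  /* We want subtitle output */
} GstPlayFlags;
```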
@ -352,17 +353,17 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
```
Forward declarations for the two callbacks we will be using.
`handle_message` for the GStreamer messages, as we have already seen,
and `handle_keyboard` for key strokes, since this tutorial is
introducing a limited amount of interactivity.
We skip over the creation of the pipeline, the instantiation of
`playbin` and pointing it to our test media through the `uri`
property. `playbin` is in itself a pipeline, and in this case it is the
only element in the pipeline, so we skip the creation of a separate
pipeline and use the `playbin` element directly.

We focus on some of the other properties of `playbin`, though:
``` c
/* Set flags to show Audio and Video but ignore Subtitles */
@ -372,8 +373,8 @@ flags &= ~GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin, "flags", flags, NULL);
```
`playbin`'s behavior can be changed through its `flags` property, which
can have any combination of `GstPlayFlags`. The most interesting values
are:

| Flag | Description |
@ -389,7 +390,7 @@ are:
In our case, for demonstration purposes, we are enabling audio and video
and disabling subtitles, leaving the rest of the flags at their default
values (this is why we read the current value of the flags with
`g_object_get()` before overwriting it with `g_object_set()`).
``` c
/* Set connection speed. This will affect some internal decisions of playbin */
@ -397,10 +398,10 @@ g_object_set (data.playbin, "connection-speed", 56, NULL);
```
This property is not really useful in this example.
`connection-speed` informs `playbin` of the maximum speed of our network
connection, so, in case multiple versions of the requested media are
available on the server, `playbin` chooses the most appropriate. This is
mostly used in combination with streaming protocols like `mms` or
`rtsp`.
We have set all these properties one by one, but we could have set all of
@ -410,7 +411,7 @@ them with a single call to `g_object_set()`:
g_object_set (data.playbin, "uri", "http://docs.gstreamer.com/media/sintel_cropped_multilingual.webm", "flags", flags, "connection-speed", 56, NULL);
```

This is why `g_object_set()` requires NULL as the last parameter.
``` c
/* Add a keyboard watch so we get notified of keystrokes */
@ -438,13 +439,13 @@ g_main_loop_run (data.main_loop);
To allow interactivity, we will no longer poll the GStreamer bus
manually. Instead, we create a `GMainLoop` (GLib main loop) and set it
running with `g_main_loop_run()`. This function blocks and will not
return until `g_main_loop_quit()` is issued. In the meantime, it will
call the callbacks we have registered at the appropriate
times: `handle_message` when a message appears on the bus, and
`handle_keyboard` when the user presses any key.
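
Wiring this up takes only a few lines; a sketch of the registration,
reusing the `io_stdin` channel created above and assuming a `main_loop`
field in our `CustomData`:

``` c
/* Sketch: call handle_keyboard whenever there is input on stdin,
 * then start the main loop */
g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);

data.main_loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (data.main_loop);
```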
There is nothing new in `handle_message`, except that when the pipeline
moves to the PLAYING state, it will call the `analyze_streams` function:
``` c
/* Extract some metadata from the streams and print it on the screen */
@ -461,9 +462,9 @@ static void analyze_streams (CustomData *data) {
```
As the comment says, this function just gathers information from the
media and prints it on the screen. The number of video, audio and
subtitle streams is directly available through the `n-video`,
`n-audio` and `n-text` properties.
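
Reading them is a plain `g_object_get()`; a minimal sketch:

``` c
/* Sketch: read the number of streams of each kind from playbin */
g_object_get (data->playbin, "n-video", &data->n_video, NULL);
g_object_get (data->playbin, "n-audio", &data->n_audio, NULL);
g_object_get (data->playbin, "n-text", &data->n_text, NULL);
```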
``` c
for (i = 0; i < data->n_video; i++) {
@ -481,13 +482,14 @@ for (i = 0; i < data->n_video; i++) {
```
Now, for each stream, we want to retrieve its metadata. Metadata is
stored as tags in a `GstTagList` structure, which is a list of data
pieces identified by a name. The `GstTagList` associated with a stream
can be recovered with `g_signal_emit_by_name()`, and then individual
tags are extracted with the `gst_tag_list_get_*` functions
like `gst_tag_list_get_string()` for example.
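
For example, recovering the language of audio stream `i` could look
like this (a sketch using `GST_TAG_LANGUAGE_CODE`, one of the
standardized tags):

``` c
/* Sketch: emit the action signal to get the tag list, then read one tag */
GstTagList *tags = NULL;
gchar *str = NULL;

g_signal_emit_by_name (data->playbin, "get-audio-tags", i, &tags);
if (tags) {
  if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
    g_print ("  language: %s\n", str);
    g_free (str);
  }
  gst_tag_list_free (tags);
}
```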
> ![information]
> This rather unintuitive way of retrieving the tag list
> is called an Action Signal. Action signals are emitted by the
> application to a specific element, which then performs an action and
> returns a result. They behave like a dynamic function call, in which
@ -496,7 +498,7 @@ like `gst_tag_list_get_string()` for example.
> documentation along with the regular signals, and are tagged “Action”.
> See `playbin`, for example.
`playbin` defines 3 action signals to retrieve metadata:
`get-video-tags`, `get-audio-tags` and `get-text-tags`. The names of the
tags are standardized, and the list can be found in the `GstTagList`
documentation. In this example we are interested in the
@ -511,11 +513,11 @@ g_object_get (data->playbin, "current-text", &data->current_text, NULL);
Once we have extracted all the metadata we want, we get the streams that
are currently selected through 3 more properties of `playbin`:
`current-video`, `current-audio` and `current-text`.
It is important to always check the currently selected streams and
never make any assumptions. Multiple internal conditions can make
`playbin` behave differently in different executions. Also, the order in
which the streams are listed can change from one run to another, so
checking the metadata to identify one particular stream becomes crucial.
@ -525,7 +527,7 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
  gchar *str = NULL;

  if (g_io_channel_read_line (source, &str, NULL, NULL, NULL) == G_IO_STATUS_NORMAL) {
    int index = g_ascii_strtoull (str, NULL, 0);
    if (index < 0 || index >= data->n_audio) {
      g_printerr ("Index out of bounds\n");
    } else {
@ -542,38 +544,36 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
Finally, we allow the user to switch the running audio stream. This very
basic function just reads a string from the standard input (the
keyboard), interprets it as a number, and tries to set the
`current-audio` property of `playbin` (which previously we have only
read).
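
The switch itself is a single property write; a sketch of the branch
that performs it:

``` c
/* Sketch: select the requested audio stream while playing */
g_print ("Setting current audio stream to %d\n", index);
g_object_set (data->playbin, "current-audio", index, NULL);
```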
Bear in mind that the switch is not immediate. Some of the previously
decoded audio will still be flowing through the pipeline, while the new
stream becomes active and is decoded. The delay depends on the
particular multiplexing of the streams in the container, and the length
`playbin` has selected for its internal queues (which depends on the
network conditions).
If you execute the tutorial, you will be able to switch from one
language to another while the movie is running by pressing 0, 1 or 2
(and ENTER). This concludes this tutorial.
## Conclusion
This tutorial has shown:
- A few more of `playbin`'s properties: `flags`, `connection-speed`,
  `n-video`, `n-audio`, `n-text`, `current-video`, `current-audio` and
  `current-text`.

- How to retrieve the list of tags associated with a stream
  with `g_signal_emit_by_name()`.

- How to retrieve a particular tag from the list with
  `gst_tag_list_get_string()` or `gst_tag_list_get_uint()`

- How to switch the current audio simply by writing to the
  `current-audio` property.
The next playback tutorial shows how to handle subtitles, either
embedded in the container or in an external file.
@ -583,7 +583,7 @@ code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!
[Playback tutorial 2: Subtitle management]: sdk-playback-tutorial-subtitle-management.md
[information]: images/icons/emoticons/information.png
[Mac]: sdk-installing-on-mac-osx.md
[Windows]: sdk-installing-on-windows.md
@ -591,6 +591,3 @@ It has been a pleasure having you here, and see you soon!
[1]: sdk-installing-on-windows.md#running-the-tutorials
[iOS]: sdk-installing-for-ios-development.md#building-the-tutorials
[android]: sdk-installing-for-android-development.md#building-the-tutorials
@ -1,12 +1,11 @@
# Playback tutorial 4: Progressive streaming
## Goal
[](sdk-basic-tutorial-streaming.md) showed how to
enhance the user experience in poor network conditions, by taking
buffering into account. This tutorial further expands
[](sdk-basic-tutorial-streaming.md) by enabling
the local storage of the streamed media, and describes the advantages of
this technique. In particular, it shows:
@ -15,11 +14,11 @@ this technique. In particular, it shows:
- How to know where it has been downloaded
- How to limit the amount of downloaded data that is kept
## Introduction
When streaming, data is fetched from the network and a small buffer of
future-data is kept to ensure smooth playback (see
[](sdk-basic-tutorial-streaming.md)). However, data
is discarded as soon as it is displayed or rendered (there is no
past-data buffer). This means that if a user wants to jump back and
continue playback from a point in the past, data needs to be
@ -30,25 +29,16 @@ downloaded data stored locally for this contingency. A graphical widget
is also normally used to show how much of the file has already been
downloaded.
`playbin` offers similar functionalities through the `DOWNLOAD` flag
which stores the media in a local temporary file for faster playback of
already-downloaded chunks.
This code also shows how to use the Buffering Query, which allows
knowing what parts of the file are available.
## A network-resilient example with local storage
Copy this code into a text file named `playback-tutorial-4.c`.
**playback-tutorial-4.c**
@ -56,7 +46,7 @@ Copy this code into a text file named `playback-tutorial-4.c`.
#include <gst/gst.h>
#include <string.h>

#define GRAPH_LENGTH 78

/* playbin flags */
typedef enum {
@ -74,6 +64,7 @@ static void got_location (GstObject *gstobject, GstObject *prop_object, GParamSp
  gchar *location;
  g_object_get (G_OBJECT (prop_object), "temp-location", &location, NULL);
  g_print ("Temporary file: %s\n", location);
  g_free (location);
  /* Uncomment this line to keep the temporary file after the program exits */
  /* g_object_set (G_OBJECT (prop_object), "temp-remove", FALSE, NULL); */
}
@ -131,7 +122,6 @@ static gboolean refresh_ui (CustomData *data) {
  if (result) {
    gint n_ranges, range, i;
    gchar graph[GRAPH_LENGTH + 1];
    gint64 position = 0, duration = 0;

    memset (graph, ' ', GRAPH_LENGTH);
@ -141,14 +131,14 @@ static gboolean refresh_ui (CustomData *data) {
    for (range = 0; range < n_ranges; range++) {
      gint64 start, stop;
      gst_query_parse_nth_buffering_range (query, range, &start, &stop);
      start = start * GRAPH_LENGTH / GST_FORMAT_PERCENT_MAX;
      stop = stop * GRAPH_LENGTH / GST_FORMAT_PERCENT_MAX;
      for (i = (gint)start; i < stop; i++)
        graph [i] = '-';
    }
    if (gst_element_query_position (data->pipeline, GST_FORMAT_TIME, &position) &&
        GST_CLOCK_TIME_IS_VALID (position) &&
        gst_element_query_duration (data->pipeline, GST_FORMAT_TIME, &duration) &&
        GST_CLOCK_TIME_IS_VALID (duration)) {
      i = (gint)(GRAPH_LENGTH * (double)position / (double)(duration + 1));
      graph [i] = data->buffering_level < 100 ? 'X' : '>';
@ -226,37 +216,34 @@ int main(int argc, char *argv[]) {
}
```
> ![information] If you need help to compile this code, refer to the
> **Building the tutorials** section for your platform: [Mac] or
> [Windows] or use this specific command on Linux:
>
> `` gcc playback-tutorial-4.c -o playback-tutorial-4 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the
> tutorials** section for your platform: [Mac OS X], [Windows][1], for
> [iOS] or for [android].
>
> This tutorial opens a window and displays a movie, with accompanying
> audio. The media is fetched from the Internet, so the window might
> take a few seconds to appear, depending on your connection
> speed. In the console window, you should see a message indicating
> where the media is being stored, and a text graph representing the
> downloaded portions and the current position. A buffering message
> appears whenever buffering is required, which might never happen if
> your network connection is fast enough.
>
> Required libraries: `gstreamer-1.0`
## Walkthrough

This code is based on that of [](sdk-basic-tutorial-streaming.md). Let's review
only the differences.
### Setup
``` c
/* Set the download flag */
@ -265,18 +252,18 @@ flags |= GST_PLAY_FLAG_DOWNLOAD;
g_object_set (pipeline, "flags", flags, NULL);
```
By setting this flag, `playbin` instructs its internal queue (a
`queue2` element, actually) to store all downloaded
data.
``` c
g_signal_connect (pipeline, "deep-notify::temp-location", G_CALLBACK (got_location), NULL);
```
`deep-notify` signals are emitted by `GstObject` elements (like
`playbin`) when the properties of any of their children elements
change. In this case we want to know when the `temp-location` property
changes, indicating that the `queue2` has decided where to store the
downloaded data.
@ -285,30 +272,25 @@ static void got_location (GstObject *gstobject, GstObject *prop_object, GParamSp
  gchar *location;
  g_object_get (G_OBJECT (prop_object), "temp-location", &location, NULL);
  g_print ("Temporary file: %s\n", location);
  g_free (location);
  /* Uncomment this line to keep the temporary file after the program exits */
  /* g_object_set (G_OBJECT (prop_object), "temp-remove", FALSE, NULL); */
}
```
The `temp-location` property is read from the element that triggered the
signal (the `queue2`) and printed on screen.

When the pipeline state changes from `PAUSED` to `READY`, this file is
removed. As the comment reads, you can keep it by setting the
`temp-remove` property of the `queue2` to `FALSE`.
> ![warning]
> On Windows this file is usually created inside the `Temporary Internet Files` folder, which might hide it from Windows Explorer. If you cannot find the downloaded files, try to use the console.
### User Interface
In `main` we also install a timer which we use to refresh the UI every
second.

``` c
@ -316,7 +298,7 @@ second.
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
```
The `refresh_ui` method queries the pipeline to find out which parts of
the file have been downloaded and what the currently playing position
is. It builds a graph to display this information (sort of a text-mode
user interface) and prints it on screen, overwriting the previous one so
@ -324,9 +306,9 @@ it looks like it is animated:
    [---->-------                        ]
The dashes `-` indicate the downloaded parts, and the greater-than
sign `>` shows the current position (turning into an `X` when the
pipeline is paused). Keep in mind that if your network is fast enough,
you will not see the download bar (the dashes) advance at all; it will
be completely full from the beginning.
@ -339,19 +321,18 @@ static gboolean refresh_ui (CustomData *data) {
```
The first thing we do in `refresh_ui` is construct a new Buffering
`GstQuery` with `gst_query_new_buffering()` and pass it to the pipeline
(`playbin`) with `gst_element_query()`. In [](sdk-basic-tutorial-time-management.md) we have
already seen how to perform simple queries like Position and Duration
using specific methods. More complex queries, like Buffering, need to
use the more general `gst_element_query()`.
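
Creating and issuing the query might look like this (a sketch, using
the PERCENT format that this tutorial requests):

``` c
/* Sketch: build a buffering query, run it on the pipeline, free it */
GstQuery *query = gst_query_new_buffering (GST_FORMAT_PERCENT);
gboolean result = gst_element_query (data->pipeline, query);
if (result) {
  /* parse the buffering ranges here, as shown below */
}
gst_query_unref (query);
```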
The Buffering query can be made in different `GstFormat` (TIME, BYTES,
PERCENTAGE and a few more). Not all elements can answer the query in all
the formats, so you need to check which ones are supported in your
particular pipeline. If `gst_element_query()` returns `TRUE`, the query
succeeded. The answer to the query is contained in the same
`GstQuery` structure we created, and can be retrieved using multiple
parse methods:
``` c
@ -359,8 +340,8 @@ n_ranges = gst_query_get_n_buffering_ranges (query);
for (range = 0; range < n_ranges; range++) {
  gint64 start, stop;
  gst_query_parse_nth_buffering_range (query, range, &start, &stop);
  start = start * GRAPH_LENGTH / GST_FORMAT_PERCENT_MAX;
  stop = stop * GRAPH_LENGTH / GST_FORMAT_PERCENT_MAX;
  for (i = (gint)start; i < stop; i++)
    graph [i] = '-';
}
@ -369,13 +350,13 @@ for (range = 0; range < n_ranges; range++) {
Data does not need to be downloaded in consecutive pieces from the
beginning of the file: seeking, for example, might force downloading to
start from a new position and leave a downloaded chunk behind.
Therefore, `gst_query_get_n_buffering_ranges()` returns the number of
chunks, or *ranges*, of downloaded data, and then the position and size
of each range is retrieved with `gst_query_parse_nth_buffering_range()`.

The format of the returned values (start and stop position for each
range) depends on what we requested in the
`gst_query_new_buffering()` call. In this case, PERCENTAGE. These
values are used to generate the graph.
``` c
@ -396,7 +377,7 @@ percentage.
The current position is indicated with either a `>` or an `X`
depending on the buffering level. If it is below 100%, the code in the
`cb_message` method will have set the pipeline to `PAUSED`, so we print
an `X`. If the buffering level is 100% the pipeline is in the
`PLAYING` state and we print a `>`.
@ -411,7 +392,7 @@ if (data->buffering_level < 100) {
Finally, if the buffering level is below 100%, we report this
information (and delete it otherwise).
### Limiting the size of the downloaded file
``` c
/* Uncomment this line to limit the amount of downloaded data */
@ -423,23 +404,25 @@ size of the temporary file, by overwriting already played regions.
Observe the download bar to see which regions are kept available in the
file.
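
The property takes a size in bytes; a sketch of enabling the limit (the
4 MB figure is just an example):

``` c
/* Sketch: cap the temporary download file at roughly 4 MB; playbin then
 * overwrites already-played regions instead of growing the file */
g_object_set (pipeline, "ring-buffer-max-size", (guint64)4000000, NULL);
```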
## Conclusion
This tutorial has shown:
- How to enable progressive downloading with the
  `GST_PLAY_FLAG_DOWNLOAD` `playbin` flag
- How to know what has been downloaded using a Buffering `GstQuery`
- How to know where it has been downloaded with the
  `deep-notify::temp-location` signal
- How to limit the size of the temporary file with
  the `ring-buffer-max-size` property of `playbin`.
It has been a pleasure having you here, and see you soon!
[information]: images/icons/emoticons/information.png
[Mac]: sdk-installing-on-mac-osx.md
[Windows]: sdk-installing-on-windows.md
[Mac OS X]: sdk-installing-on-mac-osx.md#building-the-tutorials
[1]: sdk-installing-on-windows.md#running-the-tutorials
[iOS]: sdk-installing-for-ios-development.md#building-the-tutorials
[android]: sdk-installing-for-android-development.md#building-the-tutorials
[warning]: images/icons/emoticons/warning.png
@ -1,42 +1,30 @@
# Playback tutorial 3: Short-cutting the pipeline
## Goal
[](sdk-basic-tutorial-short-cutting-the-pipeline.md) showed
how an application can manually extract or inject data into a pipeline
by using two special elements called `appsrc` and `appsink`.

`playbin` allows using these elements too, but the method to connect
them is different. To connect an `appsink` to `playbin` see
[](sdk-playback-tutorial-custom-playbin-sinks.md).
This tutorial shows:

- How to connect `appsrc` with `playbin`
- How to configure the `appsrc`
## A playbin waveform generator
Copy this code into a text file named `playback-tutorial-3.c`.
**playback-tutorial-3.c**
``` c
#include <gst/gst.h>
#include <gst/audio/audio.h>
#include <string.h>

#define CHUNK_SIZE 1024   /* Amount of bytes we are sending in each buffer */
#define SAMPLE_RATE 44100 /* Samples per second we are sending */

/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
@ -59,6 +47,7 @@ static gboolean push_data (CustomData *data) {
  GstBuffer *buffer;
  GstFlowReturn ret;
  int i;
  GstMapInfo map;
  gint16 *raw;
  gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
  gfloat freq;
@ -71,7 +60,8 @@ static gboolean push_data (CustomData *data) {
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (CHUNK_SIZE, GST_SECOND, SAMPLE_RATE);

  /* Generate some psychedelic waveforms */
  gst_buffer_map (buffer, &map, GST_MAP_WRITE);
  raw = (gint16 *)map.data;
  data->c += data->d;
  data->d -= data->c / 1000;
  freq = 1100 + 1000 * data->d;
@ -80,6 +70,7 @@ static gboolean push_data (CustomData *data) {
    data->b -= data->a / freq;
    raw[i] = (gint16)(500 * data->a);
  }
  gst_buffer_unmap (buffer, &map);
  data->num_samples += num_samples;

  /* Push the buffer into the appsrc */
@ -133,16 +124,16 @@ static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
/* This function is called when playbin has created the appsrc element, so we have
 * a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
  GstAudioInfo info;
  GstCaps *audio_caps;

  g_print ("Source has been created. Configuring.\n");
  data->app_source = source;

  /* Configure appsrc */
  gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, SAMPLE_RATE, 1, NULL);
  audio_caps = gst_audio_info_to_caps (&info);
  g_object_set (source, "caps", audio_caps, "format", GST_FORMAT_TIME, NULL);
  g_signal_connect (source, "need-data", G_CALLBACK (start_feed), data);
  g_signal_connect (source, "enough-data", G_CALLBACK (stop_feed), data);
  gst_caps_unref (audio_caps);
@ -185,16 +176,16 @@ int main(int argc, char *argv[]) {
}
```
To use an `appsrc` as the source for the pipeline, simply instantiate a
`playbin` and set its URI to `appsrc://`
``` c
/* Create the playbin element */
data.pipeline = gst_parse_launch ("playbin uri=appsrc://", NULL);
```
`playbin` will create an internal `appsrc` element and fire the
`source-setup` signal to allow the application to configure
it:
``` c
@ -202,7 +193,7 @@ g_signal_connect (data.pipeline, "source-setup", G_CALLBACK (source_setup), &dat
```
In particular, it is important to set the caps property of `appsrc`,
since, once the signal handler returns, `playbin` will instantiate the
next element in the pipeline according to these
caps:
@ -210,16 +201,16 @@ caps:
/* This function is called when playbin has created the appsrc element, so we have
 * a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
  GstAudioInfo info;
  GstCaps *audio_caps;

  g_print ("Source has been created. Configuring.\n");
  data->app_source = source;

  /* Configure appsrc */
  gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, SAMPLE_RATE, 1, NULL);
  audio_caps = gst_audio_info_to_caps (&info);
  g_object_set (source, "caps", audio_caps, "format", GST_FORMAT_TIME, NULL);
  g_signal_connect (source, "need-data", G_CALLBACK (start_feed), data);
  g_signal_connect (source, "enough-data", G_CALLBACK (stop_feed), data);
}
```

The configuration of the `appsrc` is exactly the same as in
[](sdk-basic-tutorial-short-cutting-the-pipeline.md):
the caps are set to `audio/x-raw`, and two callbacks are registered,
so the element can tell the application when it needs to start and stop
pushing data. See [](sdk-basic-tutorial-short-cutting-the-pipeline.md)
for more details.
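
For reference, here is a minimal sketch of those two callbacks, as they
appear in that tutorial (it assumes `CustomData` keeps a `sourceid`
field holding the id of the GLib idle source that drives `push_data`):

``` c
/* Sketch: when appsrc needs data, install an idle handler that feeds it */
static void start_feed (GstElement *source, guint size, CustomData *data) {
  if (data->sourceid == 0) {
    g_print ("Start feeding\n");
    data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
  }
}

/* Sketch: when appsrc has queued enough data, remove the idle handler */
static void stop_feed (GstElement *source, CustomData *data) {
  if (data->sourceid != 0) {
    g_print ("Stop feeding\n");
    g_source_remove (data->sourceid);
    data->sourceid = 0;
  }
}
```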

From this point onwards, `playbin` takes care of the rest of the
pipeline, and the application only needs to worry about generating more
data when told to do so.
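
The hand-over itself (elided from the listing above) boils down to
emitting the `push-buffer` action signal on the stored `appsrc`. A
minimal sketch of how `push_data` ends, assuming the buffer has already
been filled and timestamped (required, since the `appsrc` operates in
`GST_FORMAT_TIME`):

``` c
  GstFlowReturn ret;

  /* appsrc takes its own reference to the buffer, so drop ours after pushing */
  g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);
  gst_buffer_unref (buffer);

  /* Returning FALSE from a GSourceFunc removes the idle handler */
  return (ret == GST_FLOW_OK);
```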

To learn how data can be extracted from `playbin` using the
`appsink` element, see [](sdk-playback-tutorial-custom-playbin-sinks.md).
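
As a preview, a hypothetical sketch of that technique (the `audio-sink`
property and the `emit-signals`/`new-sample`/`pull-sample` appsink API
are real; `new_sample_cb` and `install_app_sink` are illustrative names
of our own):

``` c
/* Called whenever appsink has a decoded sample ready for the application */
static GstFlowReturn new_sample_cb (GstElement *sink, gpointer user_data) {
  GstSample *sample;
  g_signal_emit_by_name (sink, "pull-sample", &sample);
  if (!sample)
    return GST_FLOW_ERROR;
  /* ... inspect or copy the sample's data here ... */
  gst_sample_unref (sample);
  return GST_FLOW_OK;
}

/* Route playbin's decoded audio into the appsink instead of a sound card */
static void install_app_sink (CustomData *data) {
  GstElement *appsink = gst_element_factory_make ("appsink", "app_sink");
  g_object_set (appsink, "emit-signals", TRUE, NULL);
  g_signal_connect (appsink, "new-sample", G_CALLBACK (new_sample_cb), NULL);
  g_object_set (data->pipeline, "audio-sink", appsink, NULL);
}
```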

## Conclusion

This tutorial applies the concepts shown in
[](sdk-basic-tutorial-short-cutting-the-pipeline.md) to
`playbin`. In particular, it has shown:

- How to connect `appsrc` with `playbin` using the special
  URI `appsrc://`
- How to configure the `appsrc` using the `source-setup` signal

It has been a pleasure having you here, and see you soon!

# Playback tutorial 2: Subtitle management

## Goal

This tutorial is very similar to the previous one, but instead of
switching among different audio streams, we will use subtitle streams.
This will allow us to learn:

- How to choose the subtitle stream
- How to add external subtitles
- How to customize the font used for the subtitles

## Introduction

We already know (from the previous tutorial) that container files can
hold multiple audio and video streams, and that we can very easily
choose among them by changing the `current-audio` or
`current-video` `playbin` property. Switching subtitles is just as
easy.
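
As a quick sketch of how easy (assuming `playbin` points at the playbin
element; the wrap-around choice is just for illustration):

``` c
gint n_text, current_text;

/* How many subtitle streams were found, and which one is active? */
g_object_get (playbin, "n-text", &n_text, "current-text", &current_text, NULL);

if (n_text > 0) {
  /* Switch to the next subtitle stream, wrapping around */
  g_object_set (playbin, "current-text", (current_text + 1) % n_text, NULL);
}
```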

It is worth noting that, just as with audio and video,
`playbin` takes care of choosing the right decoder for the subtitles,
and that the plugin structure of GStreamer allows adding support for new
formats as easily as copying a file. Everything is invisible to the
application developer.

Besides subtitles embedded in the container, `playbin` offers the
possibility to add an extra subtitle stream from an external URI.

This tutorial opens a file which already contains 5 subtitle streams,
and adds another one from another file (for the Greek language).

## The multilingual player with subtitles

Copy this code into a text file named `playback-tutorial-2.c` (or find
it in the SDK installation).

**playback-tutorial-2.c**

``` c
#include <stdio.h>
#include <gst/gst.h>

/* Structure to contain all our information, so we can pass it around */

/* ... (structure definition, forward declarations and the first part of
 * main omitted in this excerpt) ... */

  gst_bus_add_watch (bus, (GstBusFunc)handle_message, &data);

  /* Add a keyboard watch so we get notified of keystrokes */
#ifdef G_OS_WIN32
  io_stdin = g_io_channel_win32_new_fd (fileno (stdin));
#else
  io_stdin = g_io_channel_unix_new (fileno (stdin));
#endif

/* ... (the rest of the program is omitted in this excerpt) ... */
```

> ![information] Need help?
>
> If you need help to compile this code, refer to the **Building the
> tutorials** section for your platform: [Linux], [Mac OS X] or
> [Windows], or use this specific command on Linux:
>
> `` gcc playback-tutorial-2.c -o playback-tutorial-2 `pkg-config --cflags --libs gstreamer-1.0` ``
>
> If you need help to run this code, refer to the **Running the
> tutorials** section for your platform: [Linux][1], [Mac OS X][2] or
> [Windows][3].
>
> This tutorial opens a window and displays a movie, with accompanying
> audio. The media is fetched from the Internet, so the window might
> take a few seconds to appear, depending on your connection
> speed. The number of subtitle streams is shown in the terminal, and
> the user can switch from one to another by entering a number and
> pressing enter. A small delay is to be
> expected. Bear in mind that
> there is no latency management (buffering), so on slow connections,
> the movie might stop after a few seconds. See how
> [](sdk-basic-tutorial-streaming.md) solves this issue.
>
> Required libraries: `gstreamer-1.0`

## Walkthrough

This tutorial is copied from
[](sdk-playback-tutorial-playbin-usage.md) with some changes, so let's
review only the changes.

``` c
/* Set the subtitle URI to play and some font description */
g_object_set (data.playbin, "suburi", "http://docs.gstreamer.com/media/sintel_trailer_gr.srt", NULL);
g_object_set (data.playbin, "subtitle-font-desc", "Sans, 18", NULL);
```

After setting the media URI, we set the `suburi` property, which points
`playbin` to a file containing a subtitle stream. In this case, the
media file already contains multiple subtitle streams, so the one
provided in the `suburi` is added to the list, and will be the currently
selected one.

The metadata of a subtitle stream (like its language)
resides in the container file, therefore, subtitles not embedded in a
container will not have metadata. When running this tutorial you will
find that the first subtitle stream does not have a language tag.
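
A sketch of how this metadata can be queried, stream by stream (the
`get-text-tags` action signal and `GST_TAG_LANGUAGE_CODE` are real API;
`n_text` is assumed to have been read from the `n-text` property):

``` c
gint i;
for (i = 0; i < n_text; i++) {
  GstTagList *tags = NULL;
  gchar *lang = NULL;

  /* Retrieve the tag list attached to subtitle stream i, if any */
  g_signal_emit_by_name (data.playbin, "get-text-tags", i, &tags);
  if (tags) {
    if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &lang)) {
      g_print ("  subtitle stream %d language: %s\n", i, lang);
      g_free (lang);
    }
    gst_tag_list_unref (tags);
  }
}
```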

The `subtitle-font-desc` property allows specifying the font to render
the subtitles. Since [Pango](http://www.pango.org/) is the library used
to render fonts, you can check its documentation to see how this font
should be specified, in particular, the
[pango-font-description-from-string](http://developer.gnome.org/pango/stable/pango-Fonts.html#pango-font-description-from-string) function.

In a nutshell, the format of the string representation is `[FAMILY-LIST]
[STYLE-OPTIONS] [SIZE]`, where `FAMILY-LIST` is a comma-separated list of
families optionally terminated by a comma, `STYLE-OPTIONS` is a
whitespace-separated list of words where each word describes one of
style, variant, weight, or stretch, and `SIZE` is a decimal number
(size in points). For example, the following are all valid string
representations:

- sans bold 12
- serif, monospace bold italic condensed 16
- normal 10

The commonly available font families are: Normal, Sans, Serif and
Monospace.

The available styles are: Normal (the font is upright), Oblique (the
font is slanted, but in a roman style), Italic (the font is slanted in
an italic style).

The available variants are: Normal, Small\_Caps (a font with the lower
case characters replaced by smaller variants of the capital characters).

The available stretch styles
are: Ultra-Condensed, Extra-Condensed, Condensed, Semi-Condensed, Normal, Semi-Expanded, Expanded,
Extra-Expanded, Ultra-Expanded.
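
For instance, a hypothetical value combining several of these fields
(chosen only for illustration):

``` c
/* Illustrative only: condensed bold italic, serif with monospace fallback, 16pt */
g_object_set (data.playbin, "subtitle-font-desc",
    "Serif, Monospace Bold Italic Condensed 16", NULL);
```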
 
``` c
/* Set flags to show Audio, Video and Subtitles */
@ -356,17 +357,16 @@ flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO | GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin, "flags", flags, NULL); g_object_set (data.playbin, "flags", flags, NULL);
``` ```

We set the `flags` property to allow Audio, Video and Text (Subtitles).
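
The same mechanism can later hide the subtitles again; a sketch (the
flag values come from the `GstPlayFlags` enum used in the listing):

``` c
gint flags;

/* Clear GST_PLAY_FLAG_TEXT to hide subtitles, leaving audio and video on */
g_object_get (data.playbin, "flags", &flags, NULL);
flags &= ~GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin, "flags", flags, NULL);
```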

The rest of the tutorial is the same as [](sdk-playback-tutorial-playbin-usage.md), except
that the keyboard input changes the `current-text` property instead of
`current-audio`. As before, keep in mind that stream changes are not
immediate, since there is a lot of information flowing through the
pipeline that needs to reach the end of it before the new stream shows
up.

## Conclusion

This tutorial showed how to handle subtitles from `playbin`, whether
they are embedded in the container or in a different file:

- Subtitles are chosen using the `current-text` and `n-text`
  properties of `playbin`.
- External subtitle files can be selected using the `suburi` property.
- Subtitle appearance can be customized with the
  `subtitle-font-desc` property.

The next playback tutorial shows how to change the playback speed.

Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.

It has been a pleasure having you here, and see you soon!

[information]: images/icons/emoticons/information.png

# Playback tutorials

These tutorials explain everything you need to know to produce a media
playback application using GStreamer.