Remove all references to playbin2

Thibault Saunier 2016-05-27 14:19:02 -04:00
parent 42e05576ed
commit 437968774c
37 changed files with 258 additions and 258 deletions


@@ -15,7 +15,7 @@ Android device. It shows:
It also uses the knowledge gathered in the [Basic
tutorials](Basic%2Btutorials.html) regarding:
- How to use `playbin` to play any kind of media
- How to handle network resilience problems
# Introduction
@@ -23,10 +23,10 @@ tutorials](Basic%2Btutorials.html) regarding:
From the previous tutorials, we already have almost all necessary pieces
to build a media player. The most complex part is assembling a pipeline
which retrieves, decodes and displays the media, but we already know
that the `playbin` element can take care of all that for us. We only
need to replace the manual pipeline we used in [Android tutorial 3:
Video](Android%2Btutorial%2B3%253A%2BVideo.html) with a single-element
`playbin` pipeline and we are good to go\!
However, we can do better than that. We will add a [Seek
Bar](http://developer.android.com/reference/android/widget/SeekBar.html),
@@ -303,7 +303,7 @@ public class Tutorial4 extends Activity implements SurfaceHolder.Callback, OnSee
### Supporting arbitrary media URIs
The C code provides the `nativeSetUri()` method so we can indicate the
URI of the media to play. Since `playbin` will be taking care of
retrieving the media, we can use local or remote URIs interchangeably
(`file://` or `http://`, for example). From Java, though, we want to
keep track of whether the file is local or remote, because we will not
@@ -536,7 +536,7 @@ typedef struct _CustomData {
gboolean is_live; /* Live streams do not use buffering */
} CustomData;
/* playbin flags */
typedef enum {
GST_PLAY_FLAG_TEXT = (1 << 2) /* We want subtitle output */
} GstPlayFlags;
@@ -834,7 +834,7 @@ static void *app_function (void *userdata) {
g_main_context_push_thread_default(data->context);
/* Build pipeline */
data->pipeline = gst_parse_launch("playbin", &error);
if (error) {
gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
g_clear_error (&error);
@@ -925,7 +925,7 @@ static void gst_native_finalize (JNIEnv* env, jobject thiz) {
GST_DEBUG ("Done finalizing");
}
/* Set playbin's URI */
void gst_native_set_uri (JNIEnv* env, jobject thiz, jstring uri) {
CustomData *data = GET_CUSTOM_DATA (env, thiz, custom_data_field_id);
if (!data || !data->pipeline) return;
@@ -1090,14 +1090,14 @@ GStreamer with
and
[ReleaseStringUTFChars()](http://docs.oracle.com/javase/1.5.0/docs/guide/jni/spec/functions.html#wp17294).
`playbin` will only care about URI changes in the READY to PAUSED state
change, because the new URI might need a completely different playback
pipeline (think about switching from a local Matroska file to a remote
OGG file: this would require, at least, different source and demuxing
elements). Thus, before passing the new URI to `playbin` we set its
state to READY (if we were in PAUSED or PLAYING).
`playbin`'s URI is exposed as a common GObject property, so we simply
set it with `g_object_set()`.
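A minimal sketch of that sequence (the helper name is hypothetical; `pipeline` is the playbin and `target_state` the state to return to):

``` lang=c
/* playbin only picks up a new URI on the READY to PAUSED transition,
 * so drop to READY first, set the property, then restore the state. */
static void
set_playbin_uri (GstElement *pipeline, const gchar *uri, GstState target_state)
{
  if (target_state >= GST_STATE_PAUSED)
    gst_element_set_state (pipeline, GST_STATE_READY);
  g_object_set (pipeline, "uri", uri, NULL); /* plain GObject property */
  gst_element_set_state (pipeline, target_state);
}
```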
We then reset the clip duration, so it is re-queried later, and bring
@@ -1150,7 +1150,7 @@ static void check_media_size (CustomData *data) {
```
We first retrieve the video sink element from the pipeline, using the
`video-sink` property of `playbin`, and then its sink Pad. The
negotiated Caps of this Pad, which we recover using
`gst_pad_get_negotiated_caps()`, are the Caps of the decoded media.
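In sketch form (variable names follow this tutorial's `CustomData`; note that in GStreamer 1.0 the same call is named `gst_pad_get_current_caps()`):

``` lang=c
GstElement *video_sink = NULL;
GstPad *sink_pad;
GstCaps *caps;

/* Ask playbin which video sink it selected, then inspect its sink Pad */
g_object_get (data->pipeline, "video-sink", &video_sink, NULL);
sink_pad = gst_element_get_static_pad (video_sink, "sink");
caps = gst_pad_get_negotiated_caps (sink_pad); /* caps of the decoded media */
if (caps) {
  gchar *caps_str = gst_caps_to_string (caps);
  GST_DEBUG ("Decoded caps: %s", caps_str);
  g_free (caps_str);
  gst_caps_unref (caps);
}
gst_object_unref (sink_pad);
gst_object_unref (video_sink);
```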
@@ -1404,7 +1404,7 @@ in this tutorial the URI does not change, but it will in the next one).
# Conclusion
This tutorial has shown how to embed a `playbin` pipeline into an
Android application. This effectively turns the application into a
basic media player, capable of streaming and decoding all the formats
GStreamer understands. More particularly, it has shown:


@@ -74,7 +74,7 @@ allows you to choose a local media file, no matter what extension or
MIME type it has.
If a new media file is selected, it is passed to the native code (which
will set the pipeline to READY, pass the URI to `playbin`, and bring
the pipeline back to the previous state). The current position is also
reset, so the new clip does not start at the previous position.


@@ -408,7 +408,7 @@ void Player::setUri(const QString & uri)
realUri = QUrl::fromLocalFile(realUri).toEncoded();
}
if (!m_pipeline) {
m_pipeline = QGst::ElementFactory::make("playbin").dynamicCast<QGst::Pipeline>();
if (m_pipeline) {
//let the video widget watch the pipeline for new video sinks
watchPipeline(m_pipeline);
@@ -617,7 +617,7 @@ void Player::setUri(const QString & uri)
realUri = QUrl::fromLocalFile(realUri).toEncoded();
}
if (!m_pipeline) {
m_pipeline = QGst::ElementFactory::make("playbin").dynamicCast<QGst::Pipeline>();
if (m_pipeline) {
//let the video widget watch the pipeline for new video sinks
watchPipeline(m_pipeline);
@@ -637,7 +637,7 @@ void Player::setUri(const QString & uri)
Here, we first ensure that the pipeline will receive a proper URI. If
`Player::setUri()` is called with `/home/user/some/file.mp3`, the path
is modified to `file:///home/user/some/file.mp3`. `playbin` only
accepts complete URIs.
The pipeline is created via `QGst::ElementFactory::make()`. The
@@ -704,7 +704,7 @@ void Player::handlePipelineStateChange(const QGst::StateChangedMessagePtr & scm)
}
```
Finally, we tell `playbin` what to play by setting the `uri` property:
``` lang=c
m_pipeline->setProperty("uri", realUri);


@@ -228,7 +228,7 @@ Let's recap a bit. Today we have learned:
- How to quickly build a pipeline from a textual description using
`gst_parse_launch()`.
- How to create an automatic playback pipeline using `playbin`.
- How to signal GStreamer to start playback using
`gst_element_set_state()`.
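Put together, those three points fit in a few lines. A minimal sketch, using the same test clip as these tutorials:

``` lang=c
#include <gst/gst.h>

int main (int argc, char *argv[]) {
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;

  gst_init (&argc, &argv);
  /* Build a playbin pipeline from a textual description */
  pipeline = gst_parse_launch (
      "playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm",
      NULL);
  /* Start playback and wait for an error or the end of the stream */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  if (msg)
    gst_message_unref (msg);
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```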


@@ -225,15 +225,15 @@ producing for a particular pipeline, run `gst-launch-1.0` as usual, with the
#### Examples
Play a media file using `playbin` (as in [Basic tutorial 1: Hello
world\!](Basic%2Btutorial%2B1%253A%2BHello%2Bworld%2521.html)):
```
gst-launch-1.0 playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm
```
A fully operational playback pipeline, with audio and video (more or less
the same pipeline that `playbin` will create
internally):
```


@@ -182,7 +182,7 @@ to output graph files. These are `.dot` files, readable with free
programs like [GraphViz](http://www.graphviz.org), that describe the
topology of your pipeline, along with the caps negotiated in each link.
This is also very handy when using all-in-one elements like `playbin`
or `uridecodebin`, which instantiate several elements inside them. Use
the `.dot` files to learn what pipeline they have created inside (and
learn a bit of GStreamer along the way).
@@ -197,11 +197,11 @@ within your application, you can use the
`GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS()` macros to generate `.dot` files
at your convenience.
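For instance, a short sketch (assuming a `pipeline` variable; the `GST_DEBUG_DUMP_DOT_DIR` environment variable must point to an existing directory, or the macro writes nothing):

``` lang=c
/* Dumps the whole topology, caps included, to
 * $GST_DEBUG_DUMP_DOT_DIR/pipeline-snapshot.dot */
GST_DEBUG_BIN_TO_DOT_FILE (GST_BIN (pipeline),
    GST_DEBUG_GRAPH_SHOW_ALL, "pipeline-snapshot");
```

The resulting file can then be rendered with GraphViz, for example: `dot -Tpng pipeline-snapshot.dot -o pipeline.png`.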
Here you have an example of the kind of pipelines that `playbin`
generates. It is very complex because `playbin` can handle many
different cases: your manual pipelines normally do not need to be this
long. If your manual pipeline is starting to get very big, consider
using `playbin`.
![](attachments/327830/2424840.png)
@@ -224,4 +224,4 @@ It has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[playbin.png](attachments/327830/2424840.png) (image/png)


@@ -26,7 +26,7 @@ waiting.
As it turns out, this solution is already implemented in GStreamer, but
the previous tutorials have not been benefiting from it. Some elements,
like the `queue2` and `multiqueue` found inside `playbin`, are capable
of building this buffer and posting bus messages regarding the buffer level
(the state of the queue). An application wanting to have more network
resilience, then, should listen to these messages and pause playback if
@@ -131,7 +131,7 @@ int main(int argc, char *argv[]) {
memset (&data, 0, sizeof (data));
/* Build the pipeline */
pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
bus = gst_element_get_bus (pipeline);
/* Start playing */
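A sketch of the buffering logic described in the introduction (field names follow this tutorial's `CustomData` and are assumptions):

``` lang=c
/* Pause while the internal queue fills up, resume at 100%.
 * Live streams must not be paused while buffering. */
static void cb_buffering (GstBus *bus, GstMessage *msg, CustomData *data) {
  gint percent = 0;

  if (data->is_live)
    return;
  gst_message_parse_buffering (msg, &percent);
  g_print ("Buffering (%3d%%)\r", percent);
  gst_element_set_state (data->pipeline,
      percent < 100 ? GST_STATE_PAUSED : GST_STATE_PLAYING);
}
```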


@@ -42,10 +42,10 @@ so Seek Events are used in this tutorial instead.
To use these events, they are created and then passed onto the pipeline,
where they propagate upstream until they reach an element that can
handle them. If an event is passed onto a bin element like `playbin`,
it will simply feed the event to all its sinks, which will result in
multiple seeks being performed. The common approach is to retrieve one
of `playbin`'s sinks through the `video-sink` or
`audio-sink` properties and feed the event directly into the sink.
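A sketch of that approach (names are illustrative):

``` lang=c
GstElement *video_sink = NULL;
GstEvent *seek_event;

/* Retrieve a single sink so the seek is performed only once */
g_object_get (pipeline, "video-sink", &video_sink, NULL);
seek_event = gst_event_new_seek (1.0, GST_FORMAT_TIME,
    GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
    GST_SEEK_TYPE_SET, 10 * GST_SECOND, /* jump to the 10-second mark */
    GST_SEEK_TYPE_NONE, 0);
gst_element_send_event (video_sink, seek_event);
gst_object_unref (video_sink);
```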
Frame stepping is a technique that allows playing a video frame by
@@ -181,7 +181,7 @@ int main(int argc, char *argv[]) {
" 'Q' to quit\n");
/* Build the pipeline */
data.pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
@@ -243,7 +243,7 @@ int main(int argc, char *argv[]) {
# Walkthrough
There is nothing new in the initialization code in the main function: a
`playbin` pipeline is instantiated, an I/O watch is installed to track
keystrokes and a GLib main loop is executed.
Then, in the keyboard handler function:
@@ -336,7 +336,7 @@ if (data->video_sink == NULL) {
As explained in the Introduction, to avoid performing multiple Seeks,
the Event is sent to only one sink, in this case, the video sink. It is
obtained from `playbin` through the `video-sink` property. It is read
at this time instead of at initialization time because the actual sink may
change depending on the media contents, and this won't be known until
the pipeline is PLAYING and some media has been read.
@@ -369,7 +369,7 @@ A new Step Event is created with `gst_event_new_step()`, whose
parameters basically specify the amount to skip (1 frame in the example)
and the new rate (which we do not change).
The video sink is grabbed from `playbin` in case we didn't have it yet,
just like before.
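In sketch form (the `data` fields follow this tutorial's structure):

``` lang=c
/* Skip exactly one frame; the rate of a Step Event must be positive,
 * hence the ABS() on the current playback rate. */
gst_element_send_event (data->video_sink,
    gst_event_new_step (GST_FORMAT_BUFFERS, 1, ABS (data->rate),
        TRUE, FALSE));
```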
And with this we are done. When testing this tutorial, keep in mind that
@@ -379,7 +379,7 @@ backward playback is not optimal in many elements.
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/warning.png" width="16" height="16" /></td>
<td><p>Changing the playback rate might only work with local files. If you cannot modify it, try changing the URI passed to <code>playbin</code> in line 114 to a local URI, starting with <code>file:///</code></p></td>
</tr>
</tbody>
</table>


@@ -4,7 +4,7 @@
This tutorial gives a list of handy GStreamer elements that are worth
knowing. They range from powerful all-in-one elements that allow you to
build complex pipelines easily (like `playbin`), to little helper
elements which are extremely useful when debugging.
For simplicity, the following examples are given using the
@@ -19,7 +19,7 @@ These are Bin elements which you treat as a single element and they take
care of instantiating all the necessary internal pipeline to accomplish
their task.
### `playbin`
This element has been extensively used throughout the tutorials. It
manages all aspects of media playback, from source to display, passing
@@ -263,7 +263,7 @@ tutorial 12: Streaming](Basic%2Btutorial%2B12%253A%2BStreaming.html).
As a rule of thumb, prefer `queue2` over `queue` whenever network
buffering is a concern to you. See [Basic tutorial 12:
Streaming](Basic%2Btutorial%2B12%253A%2BStreaming.html) for an example
(`queue2` is hidden inside `playbin`).
### `multiqueue`


@@ -94,7 +94,7 @@ int main(int argc, char *argv[]) {
g_signal_connect (texture, "size-change", G_CALLBACK (size_change), NULL);
/* Build the GStreamer pipeline */
pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Instantiate the Clutter sink */
sink = gst_element_factory_make ("autocluttersink", NULL);
@@ -226,7 +226,7 @@ This texture is everything GStreamer needs to know about Clutter.
g_object_set (pipeline, "video-sink", sink, NULL);
```
Finally, tell `playbin` to use the sink we created instead of the
default one.
Then the GStreamer pipeline and the Clutter timeline are started and the


@@ -6,7 +6,7 @@ Even though GStreamer is a multiplatform framework, not all the elements
are available on all platforms. For example, the audio and video sinks
depend heavily on the underlying windowing system, and a different one
needs to be selected depending on the platform. You normally do not need
to worry about this when using elements like `playbin` or
`autovideosink`, but, for those cases when you need to use one of the
sinks that are only available on specific platforms, this tutorial points
out some of their peculiarities.
@@ -185,14 +185,14 @@ This is the only audio sink available to GStreamer on iOS.
Source element to read iOS assets, that is, documents stored in the
Library (like photos, music and videos). It can be instantiated
automatically by `playbin` when URIs use the
`assets-library://` scheme.
### `iosavassetsrc`
Source element to read and decode iOS audiovisual assets, that is,
documents stored in the Library (like photos, music and videos). It can
be instantiated automatically by `playbin` when URIs use the
`ipod-library://` scheme. Decoding is performed by the system, so
dedicated hardware will be used if available.
@@ -200,7 +200,7 @@ dedicated hardware will be used if available.
This tutorial has shown a few specific details about some GStreamer
elements which are not available on all platforms. You do not have to
worry about them when using multiplatform elements like `playbin` or
`autovideosink`, but it is good to know their quirks when
instantiating them manually.


@@ -494,10 +494,10 @@ are already printing on screen the type of the video pads.
You should now see (and hear) the same movie as in [Basic tutorial 1:
Hello world!](Basic+tutorial+1+Hello+world.markdown). In
that tutorial you used `playbin`, which is a handy element that
automatically takes care of all the demuxing and pad linking for you.
Most of the [Playback tutorials](Playback+tutorials.markdown) are devoted
to `playbin`.
## Conclusion
@@ -515,7 +515,7 @@ You can now continue with the basic tutorials and learn about performing
seeks and time-related queries in [Basic tutorial 4: Time
management](Basic+tutorial+4+Time+management.markdown) or move
to the [Playback tutorials](Playback+tutorials.markdown), and gain more
insight about the `playbin` element.
Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.


@@ -41,7 +41,7 @@ understand each other). This is the main goal of Capabilities.
As an application developer, you will usually build pipelines by linking
elements together (to a lesser extent if you use all-in-one elements
like `playbin`). In this case, you need to know the *Pad Caps* (as they
are familiarly referred to) of your elements, or, at least, know what
they are when GStreamer refuses to link two elements with a negotiation
error.


@@ -14,7 +14,7 @@ any time, in a variety of ways. This tutorial shows:
[Playback tutorial 3: Short-cutting the
pipeline](Playback+tutorial+3+Short-cutting+the+pipeline.markdown) explains
how to achieve the same goals in a playbin-based pipeline.
## Introduction
@@ -499,8 +499,8 @@ gst_buffer_unref (buffer);
Once we have the buffer ready, we pass it to `appsrc` with the
`push-buffer` action signal (see information box at the end of [Playback
tutorial 1: Playbin
usage](Playback+tutorial+1+Playbin+usage.markdown)), and then
`gst_buffer_unref()` it since we no longer need it.
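In sketch form (the surrounding variable names follow the tutorial's conventions and are assumptions):

``` lang=c
GstFlowReturn ret;

/* The "push-buffer" action signal does not take ownership of the
 * buffer, so we drop our own reference once it has been queued. */
g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);
gst_buffer_unref (buffer);
if (ret != GST_FLOW_OK)
  g_printerr ("push-buffer failed\n");
```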
``` lang=c
@@ -538,7 +538,7 @@ This tutorial has shown how applications can:
- Retrieve data from a pipeline using the `appsink` element.
- Manipulate this data by accessing the `GstBuffer`.
In a playbin-based pipeline, the same goals are achieved in a slightly
different way. [Playback tutorial 3: Short-cutting the
pipeline](Playback+tutorial+3+Short-cutting+the+pipeline.markdown) shows
how to do it.


@@ -2,9 +2,9 @@
# Goal
We have already worked with the `playbin` element, which is capable of
building a complete playback pipeline without much work on our side.
This tutorial shows how to further customize `playbin` in case its
default values do not suit our particular needs.
We will learn:
@@ -14,7 +14,7 @@ We will learn:
- How to gather information regarding each stream.
As a side note, this element was called `playbin2` in GStreamer 0.10:
it replaced the original `playbin`, which had been deprecated, and took
over its name in GStreamer 1.0.
@@ -66,7 +66,7 @@ it in the SDK installation).
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin; /* Our one and only element */
gint n_video; /* Number of embedded video streams */
gint n_audio; /* Number of embedded audio streams */
@@ -79,7 +79,7 @@ typedef struct _CustomData {
GMainLoop *main_loop; /* GLib's Main Loop */
} CustomData;
/* playbin flags */
typedef enum {
GST_PLAY_FLAG_VIDEO = (1 << 0), /* We want video output */
GST_PLAY_FLAG_AUDIO = (1 << 1), /* We want audio output */
@@ -101,27 +101,27 @@ int main(int argc, char *argv[]) {
gst_init (&argc, &argv);
/* Create the elements */
data.playbin = gst_element_factory_make ("playbin", "playbin");
if (!data.playbin) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin, "uri", "http://docs.gstreamer.com/media/sintel_cropped_multilingual.webm", NULL);
/* Set flags to show Audio and Video but ignore Subtitles */
g_object_get (data.playbin, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO;
flags &= ~GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin, "flags", flags, NULL);
/* Set connection speed. This will affect some internal decisions of playbin */
g_object_set (data.playbin, "connection-speed", 56, NULL);
/* Add a bus watch, so we get notified when a message arrives */
bus = gst_element_get_bus (data.playbin);
gst_bus_add_watch (bus, (GstBusFunc)handle_message, &data);
/* Add a keyboard watch so we get notified of keystrokes */
@@ -133,10 +133,10 @@ int main(int argc, char *argv[]) {
g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);
/* Start playing */
ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
g_printerr ("Unable to set the pipeline to the playing state.\n");
gst_object_unref (data.playbin);
return -1;
}
@@ -148,8 +148,8 @@ int main(int argc, char *argv[]) {
g_main_loop_unref (data.main_loop);
g_io_channel_unref (io_stdin);
gst_object_unref (bus);
gst_element_set_state (data.playbin, GST_STATE_NULL);
gst_object_unref (data.playbin);
return 0;
}
@@ -161,9 +161,9 @@ static void analyze_streams (CustomData *data) {
guint rate;
/* Read some properties */
g_object_get (data->playbin, "n-video", &data->n_video, NULL);
g_object_get (data->playbin, "n-audio", &data->n_audio, NULL);
g_object_get (data->playbin, "n-text", &data->n_text, NULL);
g_print ("%d video stream(s), %d audio stream(s), %d text stream(s)\n",
data->n_video, data->n_audio, data->n_text);
@@ -172,7 +172,7 @@ static void analyze_streams (CustomData *data) {
for (i = 0; i < data->n_video; i++) {
tags = NULL;
/* Retrieve the stream's video tags */
g_signal_emit_by_name (data->playbin, "get-video-tags", i, &tags);
if (tags) {
g_print ("video stream %d:\n", i);
gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
@@ -186,7 +186,7 @@ static void analyze_streams (CustomData *data) {
for (i = 0; i < data->n_audio; i++) {
tags = NULL;
/* Retrieve the stream's audio tags */
g_signal_emit_by_name (data->playbin, "get-audio-tags", i, &tags);
if (tags) {
g_print ("audio stream %d:\n", i);
if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) {
@@ -208,7 +208,7 @@ static void analyze_streams (CustomData *data) {
for (i = 0; i < data->n_text; i++) {
tags = NULL;
/* Retrieve the stream's subtitle tags */
g_signal_emit_by_name (data->playbin, "get-text-tags", i, &tags);
if (tags) {
g_print ("subtitle stream %d:\n", i);
if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
@@ -219,9 +219,9 @@ static void analyze_streams (CustomData *data) {
}
}
g_object_get (data->playbin, "current-video", &data->current_video, NULL);
g_object_get (data->playbin, "current-audio", &data->current_audio, NULL);
g_object_get (data->playbin, "current-text", &data->current_text, NULL);
g_print ("\n");
g_print ("Currently playing video stream %d, audio stream %d and text stream %d\n",
@@ -250,7 +250,7 @@ static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data)
case GST_MESSAGE_STATE_CHANGED: {
GstState old_state, new_state, pending_state;
gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
if (new_state == GST_STATE_PLAYING) {
/* Once we are in the playing state, analyze the streams */
analyze_streams (data);
@@ -274,7 +274,7 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
} else {
/* If the input was a valid audio stream index, set the current audio stream */
g_print ("Setting current audio stream to %d\n", index);
g_object_set (data->playbin, "current-audio", index, NULL);
}
}
g_free (str);
@@ -335,7 +335,7 @@ Required libraries: <code>gstreamer-0.10</code>
``` lang=c
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin; /* Our one and only element */
gint n_video; /* Number of embedded video streams */
gint n_audio; /* Number of embedded audio streams */
@@ -356,7 +356,7 @@ to use a different mechanism to wait for messages that allows
interactivity, so we need a GLib main loop object.
``` lang=c
/* playbin flags */
typedef enum {
GST_PLAY_FLAG_VIDEO = (1 << 0), /* We want video output */
GST_PLAY_FLAG_AUDIO = (1 << 1), /* We want audio output */
@@ -364,11 +364,11 @@ typedef enum {
} GstPlayFlags;
```
Later we are going to set some of `playbin`'s flags. We would like to
have a handy enum that allows manipulating these flags easily, but since
`playbin` is a plug-in and not a part of the GStreamer core, this enum
is not available to us. The “trick” is simply to declare this enum in
our code, as it appears in the `playbin` documentation: `GstPlayFlags`.
GObject allows introspection, so the possible values for these flags can
be retrieved at runtime without using this trick, but in a far more
cumbersome way.
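For reference, a sketch of that introspective alternative (it assumes a playbin has already been instantiated, so the `GstPlayFlags` type is registered):

``` lang=c
GFlagsClass *flags_class =
    G_FLAGS_CLASS (g_type_class_ref (g_type_from_name ("GstPlayFlags")));
GFlagsValue *flag = g_flags_get_value_by_nick (flags_class, "video");

if (flag)
  g_print ("GST_PLAY_FLAG_VIDEO is 0x%x\n", flag->value);
g_type_class_unref (flags_class);
```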
@@ -386,22 +386,22 @@ and `handle_keyboard` for key strokes, since this tutorial is
introducing a limited amount of interactivity.
We skip over the creation of the pipeline, the instantiation of
`playbin` and pointing it to our test media through the `uri`
property. `playbin` is in itself a pipeline, and in this case it is
the only element in the pipeline, so we skip completely the creation of
the pipeline, and use directly the `playbin` element.
We focus on some of the other properties of `playbin`, though:
``` lang=c
/* Set flags to show Audio and Video but ignore Subtitles */
g_object_get (data.playbin, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO;
flags &= ~GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin, "flags", flags, NULL);
```
`playbin`'s behavior can be changed through its `flags` property, which
can have any combination of `GstPlayFlags`. The most interesting values
are:
@@ -413,7 +413,7 @@ are:
| GST_PLAY_FLAG_VIS | Enable rendering of visualisations when there is no video stream. Playback tutorial 6: Audio visualization goes into more details. |
| GST_PLAY_FLAG_DOWNLOAD | See Basic tutorial 12: Streaming and Playback tutorial 4: Progressive streaming. |
| GST_PLAY_FLAG_BUFFERING | See Basic tutorial 12: Streaming and Playback tutorial 4: Progressive streaming. |
| GST_PLAY_FLAG_DEINTERLACE | If the video content was interlaced, this flag instructs playbin to deinterlace it before displaying it. |
In our case, for demonstration purposes, we are enabling audio and video
and disabling subtitles, leaving the rest of flags to their default
@@ -422,14 +422,14 @@ values (this is why we read the current value of the flags with
`g_object_set()`).
``` lang=c
/* Set connection speed. This will affect some internal decisions of playbin */
g_object_set (data.playbin, "connection-speed", 56, NULL);
```
This property is not really useful in this example.
`connection-speed` informs `playbin` of the maximum speed of our
network connection, so, in case multiple versions of the requested media
are available on the server, `playbin` chooses the most
appropriate one. This is mostly used in combination with streaming
protocols like `mms` or `rtsp`.
@@ -438,7 +438,7 @@ them with a single call to
`g_object_set()`:
``` lang=c
g_object_set (data.playbin, "uri", "http://docs.gstreamer.com/media/sintel_cropped_multilingual.webm", "flags", flags, "connection-speed", 56, NULL);
```
This is why `g_object_set()` requires a NULL as the last parameter.
@@ -487,9 +487,9 @@ static void analyze_streams (CustomData *data) {
guint rate;
/* Read some properties */
g_object_get (data->playbin, "n-video", &data->n_video, NULL);
g_object_get (data->playbin, "n-audio", &data->n_audio, NULL);
g_object_get (data->playbin, "n-text", &data->n_text, NULL);
```
As the comment says, this function just gathers information from the
@@ -501,7 +501,7 @@ subtitle streams is directly available through the `n-video`,
for (i = 0; i < data->n_video; i++) {
tags = NULL;
/* Retrieve the stream's video tags */
g_signal_emit_by_name (data->playbin, "get-video-tags", i, &tags);
if (tags) {
g_print ("video stream %d:\n", i);
gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
@@ -542,7 +542,7 @@ specific element, which then performs an action and returns a result.
They behave like a dynamic function call, in which methods of a class
are identified by their name (the signal's name) instead of their memory
address. These signals are listed in the documentation along with the
regular signals, and are tagged “Action”. See <code>playbin</code>, for
example.
</p>
@@ -555,7 +555,7 @@ example.
</table>
`playbin` defines 3 action signals to retrieve
metadata: `get-video-tags`, `get-audio-tags` and `get-text-tags`. The
names of the tags are standardized, and the list can be found in the
`GstTagList` documentation. In this example we are interested in the
@@ -564,18 +564,18 @@ name if the tags is standardized, and the list can be found in the
text).
``` lang=c
g_object_get (data->playbin, "current-video", &data->current_video, NULL);
g_object_get (data->playbin, "current-audio", &data->current_audio, NULL);
g_object_get (data->playbin, "current-text", &data->current_text, NULL);
```
Once we have extracted all the metadata we want, we get the streams that
are currently selected through 3 more properties of `playbin`:
`current-video`, `current-audio` and `current-text`.
It is advisable to always check the currently selected streams and
never make assumptions. Multiple internal conditions can make
`playbin` behave differently in different executions. Also, the order
in which the streams are listed can change from one run to another, so
checking the metadata to identify one particular stream becomes crucial.
@@ -591,7 +591,7 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
} else {
/* If the input was a valid audio stream index, set the current audio stream */
g_print ("Setting current audio stream to %d\n", index);
g_object_set (data->playbin, "current-audio", index, NULL);
}
}
g_free (str);
@@ -602,14 +602,14 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
Finally, we allow the user to switch the running audio stream. This very
basic function just reads a string from the standard input (the
keyboard), interprets it as a number, and tries to set the
`current-audio` property of `playbin` (which previously we have only
read).
Bear in mind that the switch is not immediate. Some of the previously
decoded audio will still be flowing through the pipeline, while the new
stream becomes active and is decoded. The delay depends on the
particular multiplexing of the streams in the container, and the length
`playbin` has selected for its internal queues (which depends on the
network conditions).
If you execute the tutorial, you will be able to switch from one
@@ -620,7 +620,7 @@ language to another while the movie is running by pressing 0, 1 or 2
This tutorial has shown:
- A few more of `playbin`'s properties: `flags`, `connection-speed`,
`n-video`, `n-audio`, `n-text`, `current-video`, `current-audio` and
`current-text`.


@@ -17,16 +17,16 @@ This will allow us to learn:
We already know (from the previous tutorial) that container files can
hold multiple audio and video streams, and that we can very easily
choose among them by changing the `current-audio` or
`current-video` `playbin` property. Switching subtitles is just as
easy.
It is worth noting that, just as with audio and video,
`playbin` takes care of choosing the right decoder for the subtitles,
and that the plugin structure of GStreamer allows adding support for new
formats as easily as copying a file. Everything is invisible to the
application developer.
Besides subtitles embedded in the container, `playbin` offers the
possibility to add an extra subtitle stream from an external URI.
This tutorial opens a file which already contains 5 subtitle streams,
@@ -44,7 +44,7 @@ it in the SDK installation).
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin; /* Our one and only element */
gint n_video; /* Number of embedded video streams */
gint n_audio; /* Number of embedded audio streams */
@@ -57,7 +57,7 @@ typedef struct _CustomData {
GMainLoop *main_loop; /* GLib's Main Loop */
} CustomData;
/* playbin flags */
typedef enum {
GST_PLAY_FLAG_VIDEO = (1 << 0), /* We want video output */
GST_PLAY_FLAG_AUDIO = (1 << 1), /* We want audio output */
@@ -79,27 +79,27 @@ int main(int argc, char *argv[]) {
gst_init (&argc, &argv);
/* Create the elements */
data.playbin = gst_element_factory_make ("playbin", "playbin");
if (!data.playbin) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.ogv", NULL);
/* Set the subtitle URI to play and some font description */
g_object_set (data.playbin, "suburi", "http://docs.gstreamer.com/media/sintel_trailer_gr.srt", NULL);
g_object_set (data.playbin, "subtitle-font-desc", "Sans, 18", NULL);
/* Set flags to show Audio, Video and Subtitles */
g_object_get (data.playbin, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO | GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin, "flags", flags, NULL);
/* Add a bus watch, so we get notified when a message arrives */
bus = gst_element_get_bus (data.playbin);
gst_bus_add_watch (bus, (GstBusFunc)handle_message, &data);
/* Add a keyboard watch so we get notified of keystrokes */
@ -111,10 +111,10 @@ int main(int argc, char *argv[]) {
g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data); g_io_add_watch (io_stdin, G_IO_IN, (GIOFunc)handle_keyboard, &data);
/* Start playing */ /* Start playing */
ret = gst_element_set_state (data.playbin2, GST_STATE_PLAYING); ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) { if (ret == GST_STATE_CHANGE_FAILURE) {
g_printerr ("Unable to set the pipeline to the playing state.\n"); g_printerr ("Unable to set the pipeline to the playing state.\n");
gst_object_unref (data.playbin2); gst_object_unref (data.playbin);
return -1; return -1;
} }
@ -126,8 +126,8 @@ int main(int argc, char *argv[]) {
g_main_loop_unref (data.main_loop); g_main_loop_unref (data.main_loop);
g_io_channel_unref (io_stdin); g_io_channel_unref (io_stdin);
gst_object_unref (bus); gst_object_unref (bus);
gst_element_set_state (data.playbin2, GST_STATE_NULL); gst_element_set_state (data.playbin, GST_STATE_NULL);
gst_object_unref (data.playbin2); gst_object_unref (data.playbin);
return 0; return 0;
} }
@ -139,9 +139,9 @@ static void analyze_streams (CustomData *data) {
guint rate; guint rate;
/* Read some properties */ /* Read some properties */
g_object_get (data->playbin2, "n-video", &data->n_video, NULL); g_object_get (data->playbin, "n-video", &data->n_video, NULL);
g_object_get (data->playbin2, "n-audio", &data->n_audio, NULL); g_object_get (data->playbin, "n-audio", &data->n_audio, NULL);
g_object_get (data->playbin2, "n-text", &data->n_text, NULL); g_object_get (data->playbin, "n-text", &data->n_text, NULL);
g_print ("%d video stream(s), %d audio stream(s), %d text stream(s)\n", g_print ("%d video stream(s), %d audio stream(s), %d text stream(s)\n",
data->n_video, data->n_audio, data->n_text); data->n_video, data->n_audio, data->n_text);
@ -150,7 +150,7 @@ static void analyze_streams (CustomData *data) {
for (i = 0; i < data->n_video; i++) { for (i = 0; i < data->n_video; i++) {
tags = NULL; tags = NULL;
/* Retrieve the stream's video tags */ /* Retrieve the stream's video tags */
g_signal_emit_by_name (data->playbin2, "get-video-tags", i, &tags); g_signal_emit_by_name (data->playbin, "get-video-tags", i, &tags);
if (tags) { if (tags) {
g_print ("video stream %d:\n", i); g_print ("video stream %d:\n", i);
gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str); gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
@ -164,7 +164,7 @@ static void analyze_streams (CustomData *data) {
for (i = 0; i < data->n_audio; i++) { for (i = 0; i < data->n_audio; i++) {
tags = NULL; tags = NULL;
/* Retrieve the stream's audio tags */ /* Retrieve the stream's audio tags */
g_signal_emit_by_name (data->playbin2, "get-audio-tags", i, &tags); g_signal_emit_by_name (data->playbin, "get-audio-tags", i, &tags);
if (tags) { if (tags) {
g_print ("audio stream %d:\n", i); g_print ("audio stream %d:\n", i);
if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) { if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) {
@ -187,7 +187,7 @@ static void analyze_streams (CustomData *data) {
tags = NULL; tags = NULL;
/* Retrieve the stream's subtitle tags */ /* Retrieve the stream's subtitle tags */
g_print ("subtitle stream %d:\n", i); g_print ("subtitle stream %d:\n", i);
g_signal_emit_by_name (data->playbin2, "get-text-tags", i, &tags); g_signal_emit_by_name (data->playbin, "get-text-tags", i, &tags);
if (tags) { if (tags) {
if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) { if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
g_print (" language: %s\n", str); g_print (" language: %s\n", str);
@ -199,9 +199,9 @@ static void analyze_streams (CustomData *data) {
} }
} }
g_object_get (data->playbin2, "current-video", &data->current_video, NULL); g_object_get (data->playbin, "current-video", &data->current_video, NULL);
g_object_get (data->playbin2, "current-audio", &data->current_audio, NULL); g_object_get (data->playbin, "current-audio", &data->current_audio, NULL);
g_object_get (data->playbin2, "current-text", &data->current_text, NULL); g_object_get (data->playbin, "current-text", &data->current_text, NULL);
g_print ("\n"); g_print ("\n");
g_print ("Currently playing video stream %d, audio stream %d and subtitle stream %d\n", g_print ("Currently playing video stream %d, audio stream %d and subtitle stream %d\n",
@ -230,7 +230,7 @@ static gboolean handle_message (GstBus *bus, GstMessage *msg, CustomData *data)
case GST_MESSAGE_STATE_CHANGED: { case GST_MESSAGE_STATE_CHANGED: {
GstState old_state, new_state, pending_state; GstState old_state, new_state, pending_state;
gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state); gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin2)) { if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
if (new_state == GST_STATE_PLAYING) { if (new_state == GST_STATE_PLAYING) {
/* Once we are in the playing state, analyze the streams */ /* Once we are in the playing state, analyze the streams */
analyze_streams (data); analyze_streams (data);
@ -254,7 +254,7 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa
} else { } else {
/* If the input was a valid subtitle stream index, set the current subtitle stream */ /* If the input was a valid subtitle stream index, set the current subtitle stream */
g_print ("Setting current subtitle stream to %d\n", index); g_print ("Setting current subtitle stream to %d\n", index);
g_object_set (data->playbin2, "current-text", index, NULL); g_object_set (data->playbin, "current-text", index, NULL);
} }
} }
g_free (str); g_free (str);
@@ -291,18 +291,18 @@ static gboolean handle_keyboard (GIOChannel *source, GIOCondition cond, CustomDa

# Walkthrough

This tutorial is copied from [Playback tutorial 1: Playbin
usage](Playback%2Btutorial%2B1%253A%2BPlaybin%2Busage.html) with some
changes, so let's review only the changes.

``` lang=c
/* Set the subtitle URI to play and some font description */
g_object_set (data.playbin, "suburi", "http://docs.gstreamer.com/media/sintel_trailer_gr.srt", NULL);
g_object_set (data.playbin, "subtitle-font-desc", "Sans, 18", NULL);
```

After setting the media URI, we set the `suburi` property, which points
`playbin` to a file containing a subtitle stream. In this case, the
media file already contains multiple subtitle streams, so the one
provided in the `suburi` is added to the list, and will be the currently
selected one.
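Purely as a hedged illustration (not part of this tutorial's code), the
`subtitle-font-desc` property takes a Pango font description string, so
a heavier, larger font could presumably be requested like this:

``` lang=c
/* Hypothetical variation on the tutorial's "Sans, 18": family, style and
 * size are combined in a single Pango font description string. */
g_object_set (data.playbin, "subtitle-font-desc", "Serif Bold 24", NULL);
```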
@@ -351,15 +351,15 @@ Extra-Expanded, Ultra-Expanded

``` lang=c
/* Set flags to show Audio, Video and Subtitles */
g_object_get (data.playbin, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO | GST_PLAY_FLAG_TEXT;
g_object_set (data.playbin, "flags", flags, NULL);
```

We set the `flags` property to allow Audio, Video and Text (Subtitles).

The rest of the tutorial is the same as [Playback tutorial 1: Playbin
usage](Playback%2Btutorial%2B1%253A%2BPlaybin%2Busage.html), except
that the keyboard input changes the `current-text` property instead of
the `current-audio`. As before, keep in mind that stream changes are not
immediate, since there is a lot of information flowing through the
@@ -368,11 +368,11 @@ up.
# Conclusion

This tutorial showed how to handle subtitles from `playbin`, whether
they are embedded in the container or in a different file:

- Subtitles are chosen using the `current-text` and `n-text`
  properties of `playbin`.
- External subtitle files can be selected using the `suburi` property.
@@ -6,16 +6,16 @@
pipeline](Basic%2Btutorial%2B8%253A%2BShort-cutting%2Bthe%2Bpipeline.html) showed
how an application can manually extract or inject data into a pipeline
by using two special elements called `appsrc` and `appsink`.

`playbin` allows using these elements too, but the method to connect
them is different. To connect an `appsink` to `playbin` see [Playback
tutorial 7: Custom playbin
sinks](Playback%2Btutorial%2B7%253A%2BCustom%2Bplaybin%2Bsinks.html).

This tutorial shows:

- How to connect `appsrc` with `playbin`
- How to configure the `appsrc`

# A playbin waveform generator

Copy this code into a text file named `playback-tutorial-3.c`.
@@ -130,7 +130,7 @@ static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  g_main_loop_quit (data->main_loop);
}

/* This function is called when playbin has created the appsrc element, so we have
 * a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
  gchar *audio_caps_text;
@@ -161,8 +161,8 @@ int main(int argc, char *argv[]) {
  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the playbin element */
  data.pipeline = gst_parse_launch ("playbin uri=appsrc://", NULL);
  g_signal_connect (data.pipeline, "source-setup", G_CALLBACK (source_setup), &data);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
@@ -186,14 +186,14 @@ int main(int argc, char *argv[]) {
```

To use an `appsrc` as the source for the pipeline, simply instantiate a
`playbin` and set its URI to `appsrc://`:

``` lang=c
/* Create the playbin element */
data.pipeline = gst_parse_launch ("playbin uri=appsrc://", NULL);
```

`playbin` will create an internal `appsrc` element and fire the
`source-setup` signal to allow the application to configure
it:
@@ -202,12 +202,12 @@ g_signal_connect (data.pipeline, "source-setup", G_CALLBACK (source_setup), &dat
```

In particular, it is important to set the caps property of `appsrc`,
since, once the signal handler returns, `playbin` will instantiate the
next element in the pipeline according to these
caps:

``` lang=c
/* This function is called when playbin has created the appsrc element, so we have
 * a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
  gchar *audio_caps_text;
@@ -236,22 +236,22 @@ pushing data. See [Basic tutorial 8: Short-cutting the
pipeline](Basic%2Btutorial%2B8%253A%2BShort-cutting%2Bthe%2Bpipeline.html)
for more details.

From this point onwards, `playbin` takes care of the rest of the
pipeline, and the application only needs to worry about generating more
data when told to do so.
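To make the hand-off concrete, here is a minimal hedged sketch of the
kind of `need-data` callback an application could register on the
`appsrc` (the `feed_data` name and the silence-filling body are
illustrative assumptions, not this tutorial's actual generator):

``` lang=c
/* Called when appsrc needs more data; push one buffer of the requested size.
 * A real application would synthesize a waveform and set timestamps. */
static void feed_data (GstElement *source, guint size, gpointer user_data) {
  GstBuffer *buffer = gst_buffer_new_allocate (NULL, size, NULL);
  GstFlowReturn ret;

  gst_buffer_memset (buffer, 0, 0, size); /* silence */
  g_signal_emit_by_name (source, "push-buffer", buffer, &ret);
  gst_buffer_unref (buffer);
}

/* Registered from within source_setup(): */
g_signal_connect (source, "need-data", G_CALLBACK (feed_data), NULL);
```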
To learn how data can be extracted from `playbin` using the
`appsink` element, see [Playback tutorial 7: Custom playbin
sinks](Playback%2Btutorial%2B7%253A%2BCustom%2Bplaybin%2Bsinks.html).

# Conclusion

This tutorial applies the concepts shown in [Basic tutorial 8:
Short-cutting the
pipeline](Basic%2Btutorial%2B8%253A%2BShort-cutting%2Bthe%2Bpipeline.html) to
`playbin`. In particular, it has shown:

- How to connect `appsrc` with `playbin` using the special
  URI `appsrc://`
- How to configure the `appsrc` using the `source-setup` signal
@@ -30,7 +30,7 @@ downloaded data stored locally for this contingency. A graphical widget
is also normally used to show how much of the file has already been
downloaded.

`playbin` offers similar functionalities through the `DOWNLOAD` flag
which stores the media in a local temporary file for faster playback of
already-downloaded chunks.
@@ -58,7 +58,7 @@ Copy this code into a text file named `playback-tutorial-4.c`.
#define GRAPH_LENGTH 80

/* playbin flags */
typedef enum {
  GST_PLAY_FLAG_DOWNLOAD = (1 << 7) /* Enable progressive download (on selected formats) */
} GstPlayFlags;
@@ -182,7 +182,7 @@ int main(int argc, char *argv[]) {
  data.buffering_level = 100;

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
  bus = gst_element_get_bus (pipeline);

  /* Set the download flag */
@@ -265,7 +265,7 @@ flags |= GST_PLAY_FLAG_DOWNLOAD;
g_object_set (pipeline, "flags", flags, NULL);
```

By setting this flag, `playbin` instructs its internal queue (a
`queue2` element, actually) to store all downloaded
data.
@@ -274,7 +274,7 @@ g_signal_connect (pipeline, "deep-notify::temp-location", G_CALLBACK (got_locati
```

`deep-notify` signals are emitted by `GstObject` elements (like
`playbin`) when the properties of any of their child elements
change. In this case we want to know when the `temp-location` property
changes, indicating that the `queue2` has decided where to store the
downloaded data.
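The callback connected above can then read the property; a minimal
sketch, assuming its only job is to print the location (`got_location`
matches the name in the `g_signal_connect()` call):

``` lang=c
static void got_location (GstObject *gstobject, GstObject *prop_object,
    GParamSpec *prop, gpointer user_data) {
  gchar *location;

  /* prop_object is the child (the queue2) whose property just changed */
  g_object_get (G_OBJECT (prop_object), "temp-location", &location, NULL);
  g_print ("Temporary file: %s\n", location);
  g_free (location);
}
```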
@@ -340,7 +340,7 @@ static gboolean refresh_ui (CustomData *data) {

The first thing we do in `refresh_ui` is construct a new Buffering
`GstQuery` with `gst_query_new_buffering()` and pass it to the pipeline
(`playbin`) with `gst_element_query()`. In [Basic tutorial 4: Time
management](Basic%2Btutorial%2B4%253A%2BTime%2Bmanagement.html) we have
already seen how to perform simple queries like Position and Duration
using specific methods. More complex queries, like Buffering, need to
@@ -428,12 +428,12 @@ file.

This tutorial has shown:

- How to enable progressive downloading with the
  `GST_PLAY_FLAG_DOWNLOAD` `playbin` flag
- How to know what has been downloaded using a Buffering `GstQuery`
  (see the sketch after this list)
- How to know where it has been downloaded with the
  `deep-notify::temp-location` signal
- How to limit the size of the temporary file with
  the `ring-buffer-max-size` property of `playbin`.
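Here is the promised sketch of such a Buffering query, hedged and
assuming the GStreamer 1.0 query API rather than quoting the tutorial
verbatim:

``` lang=c
GstQuery *query = gst_query_new_buffering (GST_FORMAT_PERCENT);

if (gst_element_query (pipeline, query)) {
  guint n_ranges = gst_query_get_n_buffering_ranges (query);
  guint i;

  for (i = 0; i < n_ranges; i++) {
    gint64 start, stop;

    /* In GST_FORMAT_PERCENT, values run from 0 to GST_FORMAT_PERCENT_MAX */
    gst_query_parse_nth_buffering_range (query, i, &start, &stop);
    g_print ("Downloaded range %u: %" G_GINT64_FORMAT " - %" G_GINT64_FORMAT "\n",
        i, start, stop);
  }
}
gst_query_unref (query);
```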
It has been a pleasure having you here, and see you soon\!
@@ -17,10 +17,10 @@ already explained the concept of GObject interfaces: applications use
them to find out if certain functionality is available, regardless of
the actual element which implements it.

`playbin` implements the Color Balance interface (`gstcolorbalance`),
which allows access to the color balance settings. If any of the
elements in the `playbin` pipeline support this interface,
`playbin` simply forwards it to the application; otherwise, a
colorbalance element is inserted in the pipeline.

This interface allows querying for the available color balance channels
@@ -158,7 +158,7 @@ int main(int argc, char *argv[]) {
    " 'Q' to quit\n");

  /* Build the pipeline */
  data.pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

  /* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
@@ -219,7 +219,7 @@ int main(int argc, char *argv[]) {

# Walkthrough

The `main()` function is fairly simple. A `playbin` pipeline is
instantiated and set to run, and a keyboard watch is installed so
keystrokes can be monitored.
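As a hedged sketch of how the interface is typically used (assuming the
`gst/video/colorbalance.h` API; which channels actually appear depends
on the elements plugged into the pipeline):

``` lang=c
#include <gst/video/colorbalance.h>

static void print_channels (GstElement *pipeline) {
  GstColorBalance *balance = GST_COLOR_BALANCE (pipeline);
  const GList *channels, *l;

  /* Walk every channel the pipeline exposes (CONTRAST, BRIGHTNESS, ...) */
  channels = gst_color_balance_list_channels (balance);
  for (l = channels; l != NULL; l = l->next) {
    GstColorBalanceChannel *channel = (GstColorBalanceChannel *) l->data;
    gint value = gst_color_balance_get_value (balance, channel);

    g_print ("%s: %d (range %d..%d)\n", channel->label, value,
        channel->min_value, channel->max_value);
  }
}
```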
@@ -11,18 +11,18 @@ player, for example. This tutorial shows:

# Introduction

Enabling audio visualization in `playbin` is actually very easy. Just
set the appropriate `playbin` flag and, when an audio-only stream is
found, it will instantiate the necessary elements to create and display
the visualization.

If you want to specify the actual element that you want to use to
generate the visualization, you instantiate it yourself and then tell
`playbin` about it through the `vis-plugin` property.

This tutorial searches the GStreamer registry for all the elements of
the Visualization class, tries to select `goom` (or another one if it is
not available) and passes it to `playbin`.
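A hedged sketch of that registry search (the `filter_vis_features`
helper is an assumed name; matching the element klass metadata against
"Visualization" is the standard way to spot these elements):

``` lang=c
#include <string.h>
#include <gst/gst.h>

/* TRUE for element factories whose klass metadata contains "Visualization" */
static gboolean filter_vis_features (GstPluginFeature *feature, gpointer data) {
  GstElementFactory *factory;
  const gchar *klass;

  if (!GST_IS_ELEMENT_FACTORY (feature))
    return FALSE;
  factory = GST_ELEMENT_FACTORY (feature);
  klass = gst_element_factory_get_metadata (factory, GST_ELEMENT_METADATA_KLASS);
  return (klass != NULL && strstr (klass, "Visualization") != NULL);
}

/* Usage sketch:
 * GList *list = gst_registry_feature_filter (gst_registry_get (),
 *     filter_vis_features, FALSE, NULL); */
```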
# A fancy music player
@@ -42,7 +42,7 @@ Copy this code into a text file named `playback-tutorial-6.c`.

``` lang=c
#include <gst/gst.h>

/* playbin flags */
typedef enum {
  GST_PLAY_FLAG_VIS = (1 << 3) /* Enable rendering of visualizations when there is no video stream. */
} GstPlayFlags;
@@ -103,14 +103,14 @@ int main(int argc, char *argv[]) {
    return -1;

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin uri=http://radio.hbr1.com:19800/ambient.ogg", NULL);

  /* Set the visualization flag */
  g_object_get (pipeline, "flags", &flags, NULL);
  flags |= GST_PLAY_FLAG_VIS;
  g_object_set (pipeline, "flags", flags, NULL);

  /* set vis plugin for playbin */
  g_object_set (pipeline, "vis-plugin", vis_plugin, NULL);

  /* Start playing */
@@ -157,7 +157,7 @@ int main(int argc, char *argv[]) {
# Walkthrough

First off, we indicate to `playbin` that we want an audio visualization by
setting the `GST_PLAY_FLAG_VIS` flag. If the media already contains
video, this flag has no effect.
@@ -168,10 +168,10 @@ flags |= GST_PLAY_FLAG_VIS;
g_object_set (pipeline, "flags", flags, NULL);
```

If no visualization plugin is enforced by the user, `playbin` will use
`goom` (audio visualization will be disabled if `goom` is not
available). The rest of the tutorial shows how to find out the available
visualization elements and enforce one on `playbin`.
``` lang=c
/* Get a list of all visualization plugins */
@@ -243,10 +243,10 @@ if (!vis_plugin)
```

The selected factory is used to instantiate an actual `GstElement` which
is then passed to `playbin` through the `vis-plugin` property:

``` lang=c
/* set vis plugin for playbin */
g_object_set (pipeline, "vis-plugin", vis_plugin, NULL);
```
@@ -256,10 +256,10 @@ And we are done.

This tutorial has shown:

- How to enable Audio Visualization in `playbin` with the
  `GST_PLAY_FLAG_VIS` flag
- How to enforce one particular visualization element with the
  `vis-plugin` `playbin` property

It has been a pleasure having you here, and see you soon\!
@@ -1,26 +1,26 @@
# Playback tutorial 7: Custom playbin sinks

# Goal

`playbin` can be further customized by manually selecting its audio and
video sinks. This allows applications to rely on `playbin` to retrieve
and decode the media and then manage the final render/display
themselves. This tutorial shows:

- How to replace the sinks selected by `playbin`.
- How to use a complex pipeline as a sink.

# Introduction

Two properties of `playbin` allow selecting the desired audio and video
sinks: `audio-sink` and `video-sink` (respectively). The application
only needs to instantiate the appropriate `GstElement` and pass it to
`playbin` through these properties.

This method, though, only allows using a single Element as sink. If a
more complex pipeline is required, for example, an equalizer plus an
audio sink, it needs to be wrapped in a Bin, so it looks to
`playbin` as if it were a single Element.

A Bin (`GstBin`) is a container that encapsulates partial pipelines so
they can be managed as single elements. As an example, the
@@ -35,7 +35,7 @@ forward data from an external Pad to a given Pad on an internal Element.

**Figure 1:** A Bin with two Elements and one Ghost Pad.

`GstBin`s are also a type of `GstElement`, so they can be used wherever
an Element is required, in particular, as sinks for `playbin` (and they
are then known as **sink-bins**).
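A hedged sketch of building such a sink-bin (the `equalizer`, `convert`
and `sink` variables are assumed to exist; the tutorial's own equalizer
code below follows the same pattern):

``` lang=c
/* Wrap equalizer ! convert ! sink in a bin and expose the equalizer's
 * sink pad through a Ghost Pad, so the bin acts as a single sink element. */
GstElement *bin = gst_bin_new ("audio_sink_bin");
GstPad *pad, *ghost_pad;

gst_bin_add_many (GST_BIN (bin), equalizer, convert, sink, NULL);
gst_element_link_many (equalizer, convert, sink, NULL);

pad = gst_element_get_static_pad (equalizer, "sink");
ghost_pad = gst_ghost_pad_new ("sink", pad);
gst_pad_set_active (ghost_pad, TRUE);
gst_element_add_pad (bin, ghost_pad);
gst_object_unref (pad);
```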
# An equalized player
@@ -66,7 +66,7 @@ int main(int argc, char *argv[]) {
  gst_init (&argc, &argv);

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

  /* Create the elements inside the sink bin */
  equalizer = gst_element_factory_make ("equalizer-3bands", "equalizer");
@@ -91,7 +91,7 @@ int main(int argc, char *argv[]) {
  g_object_set (G_OBJECT (equalizer), "band1", (gdouble)-24.0, NULL);
  g_object_set (G_OBJECT (equalizer), "band2", (gdouble)-24.0, NULL);

  /* Set playbin's audio sink to be our sink bin */
  g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);

  /* Start playing */
@@ -190,14 +190,14 @@ Finally, the sink Pad we obtained from the equalizer needs to be release
with `gst_object_unref()`.

At this point, we have a functional sink-bin, which we can use as the
audio sink in `playbin`. We just need to instruct `playbin` to use it:

``` lang=c
/* Set playbin's audio sink to be our sink bin */
g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);
```

It is as simple as setting the `audio-sink` property on `playbin` to
the newly created sink.
``` lang=c
@@ -224,10 +224,10 @@ pipeline fails to link due to incompatible caps.

This tutorial has shown:

- How to set your own sinks to `playbin` using the audio-sink and
  video-sink properties.
- How to wrap a piece of pipeline into a `GstBin` so it can be used as
  a **sink-bin** by `playbin`.

It has been a pleasure having you here, and see you soon\!
@@ -154,18 +154,18 @@ after a VAAPI decoder, a VAAPI sink is the only element that fits.

This all means that, if a particular hardware acceleration API is
present in the system, and the corresponding GStreamer plugin is also
available, auto-plugging elements like `playbin` are free to use
hardware acceleration to build their pipelines; the application does not
need to do anything special to enable it. Almost:

When `playbin` has to choose among different equally valid elements,
like conventional software decoding (through `vp8dec`, for example) or
hardware accelerated decoding (through `vaapidecode`, for example), it
uses their *rank* to decide. The rank is a property of each element that
indicates its priority; `playbin` will simply select the element that
is able to build a complete pipeline and has the highest rank.
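For instance, a hedged sketch of adjusting a rank from application code
(assuming the `vaapidecode` element is installed; dropping the rank to
`GST_RANK_NONE` would instead keep it out of auto-plugged pipelines):

``` lang=c
GstRegistry *registry = gst_registry_get ();
GstPluginFeature *feature =
    gst_registry_lookup_feature (registry, "vaapidecode");

if (feature) {
  /* Rank it above every ordinary decoder so playbin prefers it */
  gst_plugin_feature_set_rank (feature, GST_RANK_PRIMARY + 1);
  gst_object_unref (feature);
}
```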
So, whether `playbin2` will use hardware acceleration or not will depend So, whether `playbin` will use hardware acceleration or not will depend
on the relative ranks of all elements capable of dealing with that media on the relative ranks of all elements capable of dealing with that media
type. Therefore, the easiest way to make sure hardware acceleration is type. Therefore, the easiest way to make sure hardware acceleration is
enabled or disabled is by changing the rank of the associated element, enabled or disabled is by changing the rank of the associated element,
@ -263,7 +263,7 @@ these plugins.
- Can interface directly with Clutter (See [Basic tutorial 15: Clutter - Can interface directly with Clutter (See [Basic tutorial 15: Clutter
integration](Basic%2Btutorial%2B15%253A%2BClutter%2Bintegration.html)), integration](Basic%2Btutorial%2B15%253A%2BClutter%2Bintegration.html)),
so frames do not need to leave the GPU. so frames do not need to leave the GPU.
- Compatible with `playbin2`. - Compatible with `playbin`.
### gst-omx ### gst-omx
@ -330,7 +330,7 @@ these plugins.
- Can interface directly with Clutter (See [Basic tutorial 15: Clutter - Can interface directly with Clutter (See [Basic tutorial 15: Clutter
integration](Basic%2Btutorial%2B15%253A%2BClutter%2Bintegration.html)), integration](Basic%2Btutorial%2B15%253A%2BClutter%2Bintegration.html)),
so frames do not need to leave the GPU. so frames do not need to leave the GPU.
- Compatible with `playbin2`. - Compatible with `playbin`.
# Conclusion # Conclusion
@@ -39,7 +39,7 @@ data. For example, these elements typically accept `audio/x-raw-int` or
system, they may also accept `audio/mpeg`, `audio/x-ac3`,
`audio/x-eac3` or `audio/x-dts`.

Then, when `playbin` builds the decoding pipeline, it realizes that the
audio sink can be directly connected to the encoded data (typically
coming out of a demuxer), so there is no need for a decoder. This
process is automatic and does not need any action from the application.
@@ -78,8 +78,8 @@ enabled, but, unfortunately, this option is not available in all audio
drivers.

Another solution involves using a custom sinkbin (see [Playback
tutorial 7: Custom playbin
sinks](Playback%2Btutorial%2B7%253A%2BCustom%2Bplaybin%2Bsinks.html))
which includes a `capsfilter` element (see [Basic tutorial 14: Handy
elements](Basic%2Btutorial%2B14%253A%2BHandy%2Belements.html)) and an
audio sink. The caps that the external decoder supports are then set in
@@ -97,7 +97,7 @@ only supports raw audio, and will ignore any compressed format.

This tutorial has shown a bit of how GStreamer deals with digital audio.
In particular, it has shown that:

- Applications using `playbin` do not need to do anything special to
  enable digital audio output: it is managed from the audio control
  panel of the operating system.
@@ -103,6 +103,6 @@ QtGStreamer provides access to the underlying C objects, in case you
need them. This is accessible with a simple cast:

``` lang=c
ElementPtr qgstElement = QGst::ElementFactory::make("playbin");
GstElement* gstElement = GST_ELEMENT(qgstElement);
```
@@ -21,13 +21,13 @@ Pages to review:

- Basic+tutorial+15+Clutter+integration.markdown
- Basic+tutorial+16+Platform-specific+elements.markdown
- Playback+tutorials.markdown
- Playback+tutorial+1+Playbin+usage.markdown
- Playback+tutorial+2+Subtitle+management.markdown
- Playback+tutorial+3+Short-cutting+the+pipeline.markdown
- Playback+tutorial+4+Progressive+streaming.markdown
- Playback+tutorial+5+Color+Balance.markdown
- Playback+tutorial+6+Audio+visualization.markdown
- Playback+tutorial+7+Custom+playbin+sinks.markdown
- Playback+tutorial+8+Hardware-accelerated+video+decoding.markdown
- Playback+tutorial+9+Digital+audio+pass-through.markdown
- Android+tutorials.markdown
@@ -29,7 +29,7 @@ concepts is discussed.

- Tools: [Basic tutorial 10: GStreamer tools]
- Threads: [Basic tutorial 7: Multithreading and Pad Availability]

[Playback tutorial 1: Playbin usage]: Playback+tutorial+1+Playbin+usage.markdown
[Basic tutorial 8: Short-cutting the pipeline]: Basic+tutorial+8+Short-cutting+the+pipeline.markdown
[Basic tutorial 2: GStreamer concepts]: Basic+tutorial+2+GStreamer+concepts.markdown
[Basic tutorial 6: Media formats and Pad Capabilities]: Basic+tutorial+6+Media+formats+and+Pad+Capabilities.markdown
@@ -68,7 +68,7 @@ int main(int argc, char *argv[]) {
  memset (&data, 0, sizeof (data));

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
  bus = gst_element_get_bus (pipeline);

  /* Start playing */
@@ -3,7 +3,7 @@
#define GRAPH_LENGTH 80

/* playbin flags */
typedef enum {
  GST_PLAY_FLAG_DOWNLOAD = (1 << 7) /* Enable progressive download (on selected formats) */
} GstPlayFlags;
@@ -127,7 +127,7 @@ int main(int argc, char *argv[]) {
  data.buffering_level = 100;

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
  bus = gst_element_get_bus (pipeline);

  /* Set the download flag */
@@ -97,7 +97,7 @@ static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  g_main_loop_quit (data->main_loop);
}

/* This function is called when playbin has created the appsrc element, so we have
 * a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
  gchar *audio_caps_text;
@@ -128,8 +128,8 @@ int main(int argc, char *argv[]) {
  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the playbin element */
  data.pipeline = gst_parse_launch ("playbin uri=appsrc://", NULL);
  g_signal_connect (data.pipeline, "source-setup", G_CALLBACK (source_setup), &data);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
@@ -97,7 +97,7 @@ static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  g_main_loop_quit (data->main_loop);
}

/* This function is called when playbin has created the appsrc element, so we have
 * a chance to configure it. */
static void source_setup (GstElement *pipeline, GstElement *source, CustomData *data) {
  gchar *audio_caps_text;
@@ -128,8 +128,8 @@ int main(int argc, char *argv[]) {
  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Create the playbin element */
  data.pipeline = gst_parse_launch ("playbin uri=appsrc://", NULL);
  g_signal_connect (data.pipeline, "source-setup", G_CALLBACK (source_setup), &data);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
@@ -113,7 +113,7 @@ int main(int argc, char *argv[]) {
    " 'Q' to quit\n");

  /* Build the pipeline */
  data.pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

  /* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
@@ -1,6 +1,6 @@
#include <gst/gst.h>

/* playbin flags */
typedef enum {
  GST_PLAY_FLAG_VIS = (1 << 3) /* Enable rendering of visualizations when there is no video stream. */
} GstPlayFlags;
@@ -61,14 +61,14 @@ int main(int argc, char *argv[]) {
    return -1;

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin uri=http://radio.hbr1.com:19800/ambient.ogg", NULL);

  /* Set the visualization flag */
  g_object_get (pipeline, "flags", &flags, NULL);
  flags |= GST_PLAY_FLAG_VIS;
  g_object_set (pipeline, "flags", flags, NULL);

  /* set vis plugin for playbin */
  g_object_set (pipeline, "vis-plugin", vis_plugin, NULL);

  /* Start playing */
@@ -10,7 +10,7 @@ int main(int argc, char *argv[]) {
  gst_init (&argc, &argv);

  /* Build the pipeline */
  pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

  /* Create the elements inside the sink bin */
  equalizer = gst_element_factory_make ("equalizer-3bands", "equalizer");
@@ -35,7 +35,7 @@ int main(int argc, char *argv[]) {
  g_object_set (G_OBJECT (equalizer), "band1", (gdouble)-24.0, NULL);
  g_object_set (G_OBJECT (equalizer), "band2", (gdouble)-24.0, NULL);

  /* Set playbin's audio sink to be our sink bin */
  g_object_set (GST_OBJECT (pipeline), "audio-sink", bin, NULL);

  /* Start playing */
@@ -111,8 +111,8 @@ int main(int argc, char *argv[]) {
    " 'Q' to quit\n");

  /* Build the pipeline */
  // data.pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
  data.pipeline = gst_parse_launch ("playbin uri=file:///f:/media/sintel/sintel_trailer-480p.webm", NULL);

  /* Add a keyboard watch so we get notified of keystrokes */
#ifdef _WIN32
@@ -274,9 +274,9 @@ element (here: textoverlay) has multiple sink or source pads.
ffmpegcolorspace \! overlay.video\_sink filesrc location=movie.srt \!
subparse \! overlay.text\_sink**

Play an AVI movie with an external text subtitle stream using playbin

**gst-launch-1.0 playbin uri=<file:///path/to/movie.avi>
suburi=<file:///path/to/movie.srt>**

**Network streaming**
@@ -15,7 +15,7 @@ iOS device. It shows:
It also uses the knowledge gathered in the [Basic
tutorials](Basic%2Btutorials.html) regarding:

- How to use `playbin` to play any kind of media
- How to handle network resilience problems

# Introduction
@@ -23,10 +23,10 @@ tutorials](Basic%2Btutorials.html) regarding:

From the previous tutorials, we already have almost all necessary pieces
to build a media player. The most complex part is assembling a pipeline
which retrieves, decodes and displays the media, but we already know
that the `playbin` element can take care of all that for us. We only
need to replace the manual pipeline we used in [iOS tutorial 3:
Video](iOS%2Btutorial%2B3%253A%2BVideo.html) with a
single-element `playbin` pipeline and we are good to go\!

However, we can do better than that. We will add a [Time
Slider](http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UISlider_Class/Reference/Reference.html),
@@ -301,7 +301,7 @@ this view is collapsed by default. Click here to expand…

Supporting arbitrary media URIs

The `GStreamerBackend` provides the `setUri()` method so we can
indicate the URI of the media to play. Since `playbin` will be taking
care of retrieving the media, we can use local or remote URIs
indistinctly (`file://` or `http://`, for example). From the UI code,
though, we want to keep track of whether the file is local or remote,
@@ -839,7 +839,7 @@ static void state_changed_cb (GstBus *bus, GstMessage *msg, GStreamerBackend *se
  g_main_context_push_thread_default(context);

  /* Build pipeline */
  pipeline = gst_parse_launch("playbin", &error);
  if (error) {
    gchar *message = g_strdup_printf("Unable to build pipeline: %s", error->message);
    g_clear_error (&error);
@@ -921,7 +921,7 @@ one):

We first need to obtain a plain `char *` from within the `NSString *` we
get, using the `UTF8String` method.

`playbin`'s URI is exposed as a common GObject property, so we simply
set it with `g_object_set()`.
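Put together, the call presumably looks like this (`uri_string` is an
illustrative name for the `NSString *` coming from the UI):

``` lang=c
const gchar *char_uri = [uri_string UTF8String]; /* uri_string: assumed NSString * */
g_object_set (pipeline, "uri", char_uri, NULL);
```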
### Reporting media size
@@ -971,7 +971,7 @@ static void check_media_size (GStreamerBackend *self) {
```

We first retrieve the video sink element from the pipeline, using
the `video-sink` property of `playbin`, and then its sink Pad. The
negotiated Caps of this Pad, which we recover using
`gst_pad_get_negotiated_caps()`, are the Caps of the decoded media.
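A hedged sketch of that retrieval (note: in GStreamer 1.0 the negotiated
caps are read with `gst_pad_get_current_caps()`; `width`/`height` are
the usual fields of raw-video caps):

``` lang=c
GstElement *video_sink;
GstPad *video_sink_pad;
GstCaps *caps;
GstStructure *structure;
gint width, height;

g_object_get (pipeline, "video-sink", &video_sink, NULL);
video_sink_pad = gst_element_get_static_pad (video_sink, "sink");
caps = gst_pad_get_current_caps (video_sink_pad); /* the negotiated caps */

structure = gst_caps_get_structure (caps, 0);
if (gst_structure_get_int (structure, "width", &width) &&
    gst_structure_get_int (structure, "height", &height))
  g_print ("Decoded video is %dx%d\n", width, height);

gst_caps_unref (caps);
gst_object_unref (video_sink_pad);
gst_object_unref (video_sink);
```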
@@ -1214,7 +1214,7 @@ the pipeline to the `target_state`.

### Conclusion

This tutorial has shown how to embed a `playbin` pipeline into an iOS
application. This, effectively, turns such an application into a basic
media player, capable of streaming and decoding all the formats
GStreamer understands. More particularly, it has shown:
@@ -24,13 +24,13 @@ Home.markdown
Basic+tutorial+15+Clutter+integration.markdown
Basic+tutorial+16+Platform-specific+elements.markdown
Playback+tutorials.markdown
Playback+tutorial+1+Playbin+usage.markdown
Playback+tutorial+2+Subtitle+management.markdown
Playback+tutorial+3+Short-cutting+the+pipeline.markdown
Playback+tutorial+4+Progressive+streaming.markdown
Playback+tutorial+5+Color+Balance.markdown
Playback+tutorial+6+Audio+visualization.markdown
Playback+tutorial+7+Custom+playbin+sinks.markdown
Playback+tutorial+8+Hardware-accelerated+video+decoding.markdown
Playback+tutorial+9+Digital+audio+pass-through.markdown
Android+tutorials.markdown