Cleaned up up to basic tutorial 5

This commit is contained in:
Olivier Crête 2016-05-17 10:52:21 -04:00
parent 055fc95e04
commit 8074004a6d
52 changed files with 783 additions and 970 deletions

# Basic tutorial 1: Hello world!

## Goal

Nothing better to get a first impression about a software library than
to print “Hello World” on the screen!

But since we are dealing with multimedia frameworks, we are going to
play a video instead.
Without further ado, get ready for your first GStreamer application...

## Hello world

Copy this code into a text file named `basic-tutorial-1.c` (or find it
in the SDK installation).
**basic-tutorial-1.c**
```
#include <gst/gst.h>
int main(int argc, char *argv[]) {
gst_init (&argc, &argv);
/* Build the pipeline */
pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Start playing */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
}
```
Compile it as described in [Installing on
Linux](Installing+on+Linux.markdown), [Installing on Mac OS
X](Installing+on+Mac+OS+X.markdown) or [Installing on
Windows](Installing+on+Windows.markdown). If you get compilation errors,
double-check the instructions given in those sections.
If everything built fine, fire up the executable! You should see a
window pop up, containing a video being played straight from the
Internet, along with audio. Congratulations!

> ![Information](images/icons/emoticons/information.png)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Build), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Build) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Build), or use this specific command on Linux:
>
> ``gcc basic-tutorial-1.c -o basic-tutorial-1 `pkg-config --cflags --libs gstreamer-1.0` ``
>
>If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Run), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Run) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Run).
>
>This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. Also, there is no latency management (buffering), so on slow connections, the movie might stop after a few seconds. See how [Basic tutorial 12: Streaming](Basic+tutorial+12+Streaming.markdown) solves this issue.
>
>Required libraries: `gstreamer-1.0`

## Walkthrough

Let's review these lines of code and see what they do:
```
/* Initialize GStreamer */
gst_init (&argc, &argv);
```
- Executes any command-line option intended for GStreamer
If you always pass your command-line parameters `argc` and `argv` to
`gst_init()`, your application will automatically benefit from the
GStreamer standard command-line options (more on this in [Basic tutorial
10: GStreamer
tools](Basic+tutorial+10+GStreamer+tools.markdown))

```
/* Build the pipeline */
pipeline = gst_parse_launch ("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
```
This line is the heart of this tutorial, and exemplifies **two** key
points: `gst_parse_launch()` and `playbin`.

### gst_parse_launch

GStreamer is a framework designed to handle multimedia flows. Media
travels from the “source” elements (the producers), down to the “sink”
into an actual pipeline, which is very handy. In fact, this function is
so handy there is a tool built completely around it which you will get
very acquainted with (see [Basic tutorial 10: GStreamer
tools](Basic+tutorial+10+GStreamer+tools.markdown) to
learn about `gst-launch-1.0` and the `gst-launch-1.0` syntax).

### playbin

So, what kind of pipeline are we asking `gst_parse_launch()`to build for
us? Here enters the second key point: We are building a pipeline
composed of a single element called `playbin`.

`playbin` is a special element which acts as a source and as a sink,
and is a whole pipeline. Internally, it creates
and connects all the necessary elements to play your media, so you do
not have to worry about it.
but, still, it permits enough customization to suffice for a wide range
of applications. Including this tutorial.
In this example, we are only passing one parameter to `playbin`, which
is the URI of the media we want to play. Try changing it to something
else! Whether it is an `http://` or `file://` URI, `playbin` will
instantiate the appropriate GStreamer source transparently!

If you mistype the URI, or the file does not exist, or you are missing a
plug-in, GStreamer provides several notification mechanisms, but the
only thing we are doing in this example is exiting on error, so do not
expect much feedback.
```
/* Start playing */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
```
suffice to say that playback will not start unless you set the pipeline
to the PLAYING state.
In this line, `gst_element_set_state()` is setting `pipeline` (our only
element, remember) to the PLAYING state, thus initiating playback.
```
/* Wait until error or EOS */
bus = gst_element_get_bus (pipeline);
gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
```
These lines will wait until an error occurs or the end of the stream is
found. `gst_element_get_bus()` retrieves the pipeline's bus, and
`gst_bus_timed_pop_filtered()` will block until you receive either an
ERROR or an EOS (End-Of-Stream) through that bus. Do not worry much
about this line, the GStreamer bus is explained in [Basic tutorial 2:
GStreamer
concepts](Basic+tutorial+2+GStreamer+concepts.markdown).

And that's it! From this point onwards, GStreamer takes care of
everything. Execution will end when the media reaches its end (EOS) or
an error is encountered (try closing the video window, or unplugging the
network cable). The application can always be stopped by pressing
control-C in the console.

### Cleanup

Before terminating the application, though, there are a couple of things
we need to do to tidy up correctly after ourselves.
```
/* Free resources */
if (msg != NULL)
gst_message_unref (msg);
Always read the documentation of the functions you use, to know if you
should free the objects they return after using them.
In this case, `gst_bus_timed_pop_filtered()` returned a message which
needs to be freed with `gst_message_unref()` (more about messages in
[Basic tutorial 2: GStreamer
concepts](Basic+tutorial+2+GStreamer+concepts.markdown)).

`gst_element_get_bus()` added a reference to the bus that must be freed
with `gst_object_unref()`. Setting the pipeline to the NULL state will
make sure it frees any resources it has allocated (More about states in
[Basic tutorial 3: Dynamic
pipelines](Basic+tutorial+3+Dynamic+pipelines.markdown)).

Finally, unreferencing the pipeline will destroy it, and all its
contents.
## Conclusion

And so ends your first tutorial with GStreamer. We hope its brevity
serves as an example of how powerful this framework is!

Let's recap a bit. Today we have learned:
`gst_element_set_state()`.
- How to sit back and relax, while GStreamer takes care of everything,
using `gst_element_get_bus()` and `gst_bus_timed_pop_filtered()`.

The next tutorial will keep introducing more basic GStreamer elements,
and show you how to build a pipeline manually.
It has been a pleasure having you here, and see you soon!

# Basic tutorial 2: GStreamer concepts

## Goal

The previous tutorial showed how to build a pipeline automatically. Now
we are going to build a pipeline manually by instantiating each element
- How to watch the bus for error conditions and extract information
from GStreamer messages.

## Manual Hello World

Copy this code into a text file named `basic-tutorial-2.c` (or find it
in the SDK installation).
**basic-tutorial-2.c**
```
#include <gst/gst.h>
int main(int argc, char *argv[]) {
}
```
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Build), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Build) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-2.c -o basic-tutorial-2 `pkg-config --cflags --libs gstreamer-1.0` ``
>
>If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Run), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Run) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Run).
>
>This tutorial opens a window and displays a test pattern, without audio.
>
>Required libraries: `gstreamer-1.0`
## Walkthrough
The basic construction block of GStreamer are the elements, which
process the data as it flows *downstream* from the source elements (the
producers of data) to the sink elements (the consumers of data), passing
through filter elements.
![](attachments/figure-1.png)

**Figure 1**. Example pipeline

### Element creation

We will skip GStreamer initialization, since it is the same as the
previous tutorial:
```
/* Create the elements */
source = gst_element_factory_make ("videotestsrc", "source");
sink = gst_element_factory_make ("autovideosink", "sink");
```
As seen in this code, new elements can be created
with `gst_element_factory_make()`. The first parameter is the type of
element to create ([Basic tutorial 14: Handy
elements](Basic+tutorial+14+Handy+elements.markdown) shows a
few common types, and [Basic tutorial 10: GStreamer
tools](Basic+tutorial+10+GStreamer+tools.markdown) shows how to
obtain the list of all available types). The second parameter is the
name we want to give to this particular instance. Naming your elements
is useful to retrieve them later if you didn't keep a pointer (and for
more meaningful debug output). If you pass NULL for the name, however,
GStreamer will provide a unique name for you.
For this tutorial we create two elements: a `videotestsrc` and
an `autovideosink`.

`videotestsrc` is a source element (it produces data), which creates a
test video pattern. This element is useful for debugging purposes (and
tutorials) and is not usually found in real applications.
`autovideosink` is a sink element (it consumes data), which displays on
a window the images it receives. There exist several video sinks,
depending on the operating system, with a varying range of capabilities.
`autovideosink` automatically selects and instantiates the best one, so
you do not have to worry with the details, and your code is more
platform-independent.

### Pipeline creation

```
/* Create the empty pipeline */
pipeline = gst_pipeline_new ("test-pipeline");
```
before they can be used, because it takes care of some clocking and
messaging functions. We create the pipeline with `gst_pipeline_new()`.
```
/* Build the pipeline */
gst_bin_add_many (GST_BIN (pipeline), source, sink, NULL);
if (gst_element_link (source, sink) != TRUE) {
A pipeline is a particular type of `bin`, which is the element used to
contain other elements. Therefore all methods which apply to bins also
apply to pipelines. In our case, we call `gst_bin_add_many()` to add the
elements to the pipeline (mind the cast). This function accepts a list
of elements to be added, ending with NULL. Individual elements can be
added with `gst_bin_add()`.

These elements, however, are not linked with each other yet. For this,
we need to use `gst_element_link()`. Its first parameter is the source,
and the second one the destination. The order counts, because links must
be established following the data flow (this is, from source elements to
sink elements). Keep in mind that only elements residing in the same bin
can be linked together, so remember to add them to the pipeline before
trying to link them!

### Properties

```
/* Modify the source's properties */
g_object_set (source, "pattern", 0, NULL);
```
properties) or inquired to find out about the element's internal state
(readable properties).
Properties are read from with `g_object_get()` and written to
with `g_object_set()`.

`g_object_set()` accepts a NULL-terminated list of property-name,
property-value pairs, so multiple properties can be changed in one go
(GStreamer elements are all a particular kind of `GObject`, which is the
entity offering property facilities: This is why the property handling
methods have the `g_` prefix).

The line of code above changes the “pattern” property of `videotestsrc`,
which controls the type of test video the element outputs. Try different
values!

The names and possible values of all the properties an element exposes
can be found using the gst-inspect tool described in [Basic tutorial 10:
GStreamer tools](Basic+tutorial+10+GStreamer+tools.markdown).

### Error checking

At this point, we have the whole pipeline built and set up, and the rest
of the tutorial is very similar to the previous one, but we are going to
add more error checking:
```
/* Start playing */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
We call `gst_element_set_state()`, but this time we check its return
value for errors. Changing states is a delicate process and a few more
details are given in [Basic tutorial 3: Dynamic
pipelines](Basic+tutorial+3+Dynamic+pipelines.markdown).

```
/* Wait until error or EOS */
bus = gst_element_get_bus (pipeline);
msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
}
```
`gst_bus_timed_pop_filtered()` waits for execution to end and returns
with a `GstMessage` which we previously ignored. We
asked `gst_bus_timed_pop_filtered()` to return when GStreamer
encountered either an error condition or an EOS, so we need to check
which one happened, and print a message on screen (Your application will
probably want to undertake more complex actions).
`GstMessage` is a very versatile structure which can deliver virtually
any kind of information. Fortunately, GStreamer provides a series of
parsing functions for each kind of message.
In this case, once we know the message contains an error (by using the
`GST_MESSAGE_TYPE()` macro), we can use
`gst_message_parse_error()` which returns a GLib `GError` error
structure and a string useful for debugging. Examine the code to see how
these are used and freed afterward.

### The GStreamer bus

At this point it is worth introducing the GStreamer bus a bit more
formally. It is the object responsible for delivering to the application
the `GstMessage`s generated by the elements, in order and to the
application thread. This last point is important, because the actual
streaming of media is done in another thread than the application.
Messages can be extracted from the bus synchronously with
`gst_bus_timed_pop_filtered()` and its siblings, or asynchronously,
using signals (shown in the next tutorial). Your application should
always keep an eye on the bus to be notified of errors and other
playback-related issues.
The rest of the code is the cleanup sequence, which is the same as
in [Basic tutorial 1: Hello
world!](Basic+tutorial+1+Hello+world.markdown).

## Exercise

If you feel like practicing, try this exercise: Add a video filter
element in between the source and the sink of this pipeline. Use
`vertigotv` for a nice effect. You will need to create it, add it to the
pipeline, and link it with the other elements.
Depending on your platform and available plugins, you might get a
“negotiation” error, because the sink does not understand what the
filter is producing (more about negotiation in [Basic tutorial 6: Media
formats and Pad
Capabilities](Basic+tutorial+6+Media+formats+and+Pad+Capabilities.markdown)).
In this case, try to add an element called `videoconvert` after the
filter (that is, build a pipeline of 4 elements. More on
`videoconvert` in [Basic tutorial 14: Handy
elements](Basic+tutorial+14+Handy+elements.markdown)).

## Conclusion

This tutorial showed:
- How to create elements with `gst_element_factory_make()`
- How to create an empty pipeline with `gst_pipeline_new()`
- How to add elements to the pipeline with `gst_bin_add_many()`
- How to link the elements with each other with `gst_element_link()`
Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!

# Basic tutorial 3: Dynamic pipelines

## Goal

This tutorial shows the rest of the basic concepts required to use
GStreamer, which allow building the pipeline "on the fly", as
defined at the beginning of your application.
After this tutorial, you will have the necessary knowledge to start the
[Playback tutorials](Playback+tutorials.markdown). The points reviewed
here will be:
- How to attain finer control when linking elements.
- The various states in which an element can be.

## Introduction

As you are about to see, the pipeline in this tutorial is not
completely built before it is set to the playing state. This is OK. If
we did not take further action, data would reach the end of the
pipeline and the pipeline would produce an error message and stop. But
we are going to take further action...

In this example we are opening a file which is multiplexed (or *muxed)*,
this is, audio and video are stored together inside a *container* file.
The elements responsible for opening such containers are called
*demuxers*, and some examples of container formats are Matroska (MKV),
Quick Time (QT, MOV), Ogg, or Advanced Systems Format (ASF, WMV, WMA).
If a container embeds multiple streams (one video and two audio tracks,
elements only contain sink pads, and filter elements contain
both.
![](attachments/src-element.png) ![](attachments/filter-element.png) ![](attachments/sink-element.png)

**Figure 1**. GStreamer elements with their pads.
A demuxer contains one sink pad, through which the muxed data arrives,
and multiple source pads, one for each stream found in the container:
![](attachments/filter-element-multi.png)

**Figure 2**. A demuxer with two source pads.
demuxer and two branches, one for audio and one for video. This is
**NOT** the pipeline that will be built in this example:
![](attachments/simple-player.png)

**Figure 3**. Example pipeline with two branches.
The main complexity when dealing with demuxers is that they cannot
produce any information until they have received some data and have had
a chance to look at the container to see what is inside. This is,
For simplicity, in this example, we will only link to the audio pad and
ignore the video.

## Dynamic Hello World

Copy this code into a text file named `basic-tutorial-3.c` (or find it
in the SDK installation).
**basic-tutorial-3.c**
```
#include <gst/gst.h>
/* Structure to contain all our information, so we can pass it to callbacks */
}
/* Check the new pad's type */
new_pad_caps = gst_pad_query_caps (new_pad, NULL);
new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
new_pad_type = gst_structure_get_name (new_pad_struct);
if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
@ -240,33 +237,22 @@ exit:
}
```
<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-1093416675" class="expand-container">
<div id="expander-control-1093416675" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-1093416675" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc basic-tutorial-3.c -o basic-tutorial-3 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>, <a href="Installing%2Bon%2BMac%2BOS%2BX.html#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a></p>
<p><span>This tutorial only plays audio. The media is fetched from the Internet, so it might take a few seconds to start, depending on your connection speed.</span></p>
<p>Required libraries: <code>gstreamer-0.10</code></p>
</div>
</div></td>
</tr>
</tbody>
</table>
# Walkthrough
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Build), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Build) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Build), or use this specific command on Linux:
> ``gcc basic-tutorial-3.c -o basic-tutorial-3 `pkg-config --cflags --libs gstreamer-1.0` ``
>
>If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Run), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Run) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Run).
>
> This tutorial only plays audio. The media is fetched from the Internet, so it might take a few seconds to start, depending on your connection speed.
>
>Required libraries: `gstreamer-1.0`
``` first-line: 3; theme: Default; brush: cpp; gutter: true
## Walkthrough
```
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
GstElement *pipeline;
@ -277,40 +263,40 @@ typedef struct _CustomData {
```
So far we have kept all the information we needed (pointers
to `GstElement`s, basically) as local variables. Since this tutorial
to `GstElement`s, basically) as local variables. Since this tutorial
(and most real applications) involves callbacks, we will group all our
data in a structure for easier handling.
``` first-line: 11; theme: Default; brush: cpp; gutter: true
```
/* Handler for the pad-added signal */
static void pad_added_handler (GstElement *src, GstPad *pad, CustomData *data);
```
This is a forward reference, to be used later.
``` first-line: 24; theme: Default; brush: cpp; gutter: true
```
/* Create the elements */
data.source = gst_element_factory_make ("uridecodebin", "source");
data.convert = gst_element_factory_make ("audioconvert", "convert");
data.sink = gst_element_factory_make ("autoaudiosink", "sink");
```
We create the elements as usual. `uridecodebin` will internally
We create the elements as usual. `uridecodebin` will internally
instantiate all the necessary elements (sources, demuxers and decoders)
to turn a URI into raw audio and/or video streams. It does half the work
that `playbin2` does. Since it contains demuxers, its source pads are
that `playbin` does. Since it contains demuxers, its source pads are
not initially available and we will need to link to them on the fly.
`audioconvert` is useful for converting between different audio formats,
`audioconvert` is useful for converting between different audio formats,
making sure that this example will work on any platform, since the
format produced by the audio decoder might not be the same that the
audio sink expects.
The `autoaudiosink` is the equivalent of `autovideosink` seen in the
The `autoaudiosink` is the equivalent of `autovideosink` seen in the
previous tutorial, for audio. It will render the audio stream to the
audio card.
``` first-line: 40; theme: Default; brush: cpp; gutter: true
```
if (!gst_element_link (data.convert, data.sink)) {
g_printerr ("Elements could not be linked.\n");
gst_object_unref (data.pipeline);
@ -318,11 +304,11 @@ if (!gst_element_link (data.convert, data.sink)) {
}
```
Here we link the converter element to the sink, but we **DO NOT** link
them with the source, since at this point it contains no source pads. We
Here we link the converter element to the sink, but we **DO NOT** link
them with the source, since at this point it contains no source pads. We
just leave this branch (converter + sink) unlinked, until later on.
``` first-line: 46; theme: Default; brush: cpp; gutter: true
```
/* Set the URI to play */
g_object_set (data.source, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
```
@ -332,30 +318,30 @@ the previous tutorial.
### Signals
``` first-line: 49; theme: Default; brush: cpp; gutter: true
```
/* Connect to the pad-added signal */
g_signal_connect (data.source, "pad-added", G_CALLBACK (pad_added_handler), &data);
```
`GSignals` are a crucial point in GStreamer. They allow you to be
`GSignals` are a crucial point in GStreamer. They allow you to be
notified (by means of a callback) when something interesting has
happened. Signals are identified by a name, and each `GObject` has its
happened. Signals are identified by a name, and each `GObject` has its
own signals.
In this line, we are *attaching* to the “pad-added” signal of our source
(an `uridecodebin` element). To do so, we use `g_signal_connect()` and
(an `uridecodebin` element). To do so, we use `g_signal_connect()` and
provide the callback function to be used (`pad_added_handler`) and a
data pointer. GStreamer does nothing with this data pointer, it just
forwards it to the callback so we can share information with it. In this
case, we pass a pointer to the `CustomData` structure we built specially
case, we pass a pointer to the `CustomData` structure we built specially
for this purpose.
The signals that a `GstElement` generates can be found in its
documentation or using the `gst-inspect` tool as described in [Basic
The signals that a `GstElement` generates can be found in its
documentation or using the `gst-inspect` tool as described in [Basic
tutorial 10: GStreamer
tools](Basic%2Btutorial%2B10%253A%2BGStreamer%2Btools.html).
tools](Basic+tutorial+10+GStreamer+tools.markdown).
We are now ready to go\! Just set the pipeline to the PLAYING state and
We are now ready to go! Just set the pipeline to the PLAYING state and
start listening to the bus for interesting messages (like ERROR or EOS),
just like in the previous tutorials.
@ -366,31 +352,32 @@ producing data, it will create source pads, and trigger the “pad-added”
signal. At this point our callback will be
called:
``` first-line: 110; theme: Default; brush: cpp; gutter: true
```
static void pad_added_handler (GstElement *src, GstPad *new_pad, CustomData *data) {
```
`src` is the `GstElement` which triggered the signal. In this example,
`src` is the `GstElement` which triggered the signal. In this example,
it can only be the `uridecodebin`, since it is the only signal to which
we have attached.
we have attached. The first parameter of a signal handler is always the object
that has triggered it.
`new_pad` is the `GstPad` that has just been added to the `src` element.
`new_pad` is the `GstPad` that has just been added to the `src` element.
This is usually the pad to which we want to link.
`data` is the pointer we provided when attaching to the signal. In this
example, we use it to pass the `CustomData` pointer.
`data` is the pointer we provided when attaching to the signal. In this
example, we use it to pass the `CustomData` pointer.
``` first-line: 111; theme: Default; brush: cpp; gutter: true
```
GstPad *sink_pad = gst_element_get_static_pad (data->convert, "sink");
```
From `CustomData` we extract the converter element, and then retrieve
its sink pad using `gst_element_get_static_pad ()`. This is the pad to
which we want to link `new_pad`. In the previous tutorial we linked
From `CustomData` we extract the converter element, and then retrieve
its sink pad using `gst_element_get_static_pad ()`. This is the pad to
which we want to link `new_pad`. In the previous tutorial we linked
element against element, and let GStreamer choose the appropriate pads.
Now we are going to link the pads directly.
``` first-line: 119; theme: Default; brush: cpp; gutter: true
```
/* If our converter is already linked, we have nothing to do here */
if (gst_pad_is_linked (sink_pad)) {
g_print (" We are already linked. Ignoring.\n");
@ -398,13 +385,13 @@ if (gst_pad_is_linked (sink_pad)) {
}
```
`uridecodebin` can create as many pads as it sees fit, and for each one,
`uridecodebin` can create as many pads as it sees fit, and for each one,
this callback will be called. These lines of code will prevent us from
trying to link to a new pad once we are already linked.
``` first-line: 125; theme: Default; brush: cpp; gutter: true
```
/* Check the new pad's type */
new_pad_caps = gst_pad_get_caps (new_pad);
new_pad_caps = gst_pad_query_caps (new_pad, NULL);
new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
new_pad_type = gst_structure_get_name (new_pad_struct);
if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
@ -416,28 +403,28 @@ if (!g_str_has_prefix (new_pad_type, "audio/x-raw")) {
Now we will check the type of data this new pad is going to output,
because we are only interested in pads producing audio. We have
previously created a piece of pipeline which deals with audio (an
`audioconvert` linked with an `autoaudiosink`), and we will not be able
`audioconvert` linked with an `autoaudiosink`), and we will not be able
to link it to a pad producing video, for example.
`gst_pad_get_caps()` retrieves the *capabilities* of the pad (this is,
the kind of data it supports), wrapped in a `GstCaps` structure. A pad
can offer many capabilities, and hence `GstCaps` can contain many
`gst_pad_query_caps()` retrieves the *capabilities* of the pad (this is,
the kind of data it supports), wrapped in a `GstCaps` structure. A pad
can offer many capabilities, and hence `GstCaps` can contain many
`GstStructure`s, each representing a different capability.
Since, in this case, we know that the pad we want only had one
capability (audio), we retrieve the first `GstStructure` with
capability (audio), we retrieve the first `GstStructure` with
`gst_caps_get_structure()`.
Finally, with `gst_structure_get_name()` we recover the name of the
structure, which contains the main description of the format (its MIME
type, actually).
Finally, with `gst_structure_get_name()` we recover the name of the
structure, which contains the main description of the format (its *media
type*, actually).
If the name does not start with `audio/x-raw`, this is not a decoded
If the name is not `audio/x-raw`, this is not a decoded
audio pad, and we are not interested in it.
Otherwise, attempt the link:
``` first-line: 134; theme: Default; brush: cpp; gutter: true
```
/* Attempt the link */
ret = gst_pad_link (new_pad, sink_pad);
if (GST_PAD_LINK_FAILED (ret)) {
@ -447,17 +434,17 @@ if (GST_PAD_LINK_FAILED (ret)) {
}
```
`gst_pad_link()` tries to link two pads. As it was the case
with `gst_element_link()`, the link must be specified from source to
sink, and both pads must be owned by elements residing in the same bin
`gst_pad_link()` tries to link two pads. As it was the case
with `gst_element_link()`, the link must be specified from source to
sink, and both pads must be owned by elements residing in the same bin
(or pipeline).
And we are done\! When a pad of the right kind appears, it will be
And we are done! When a pad of the right kind appears, it will be
linked to the rest of the audio-processing pipeline and execution will
continue until ERROR or EOS. However, we will squeeze a bit more content
from this tutorial by also introducing the concept of State.
#### GStreamer States
### GStreamer States
We already talked a bit about states when we said that playback does not
start until you bring the pipeline to the PLAYING state. We will
@ -466,21 +453,21 @@ in GStreamer:
<table>
<tbody>
<tr class="odd">
<td><p><code class="western">NULL</code></p></td>
<tr>
<td><p><code>NULL</code></p></td>
<td><p>the NULL state or initial state of an element.</p></td>
</tr>
<tr class="even">
<td><p><code class="western">READY</code></p></td>
<tr>
<td><p><code>READY</code></p></td>
<td><p>the element is ready to go to PAUSED.</p></td>
</tr>
<tr class="odd">
<td><p><code class="western">PAUSED</code></p></td>
<tr>
<td><p><code>PAUSED</code></p></td>
<td><p>the element is PAUSED, it is ready to accept and process data. Sink elements however only accept one buffer and then block.</p></td>
</tr>
<tr class="even">
<td><p><code class="western">PLAYING</code></p></td>
<td><p>the element is PLAYING, the c<span style="text-decoration: none;">lock</span> is running and the data is flowing.</p></td>
<tr>
<td><p><code>PLAYING</code></p></td>
<td><p>the element is PLAYING, the clock is running and the data is flowing.</p></td>
</tr>
</tbody>
</table>
@ -490,7 +477,7 @@ to PLAYING, you have to go through the intermediate READY and PAUSED
states. If you set the pipeline to PLAYING, though, GStreamer will make
the intermediate transitions for you.
``` theme: Default; brush: cpp; gutter: false
```
case GST_MESSAGE_STATE_CHANGED:
/* We are only interested in state-changed messages from the pipeline */
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data.pipeline)) {
@ -504,7 +491,7 @@ case GST_MESSAGE_STATE_CHANGED:
We added this piece of code that listens to bus messages regarding state
changes and prints them on screen to help you understand the
transitions. Every element puts messages on the bus regarding its
transitions. Every element puts messages on the bus regarding its
current state, so we filter them out and only listen to messages coming
from the pipeline.
@ -512,27 +499,27 @@ Most applications only need to worry about going to PLAYING to start
playback, then to PAUSE to perform a pause, and then back to NULL at
program exit to free all resources.
# Exercise
## Exercise
Dynamic pad linking has traditionally been a difficult topic for a lot
of programmers. Prove that you have achieved its mastery by
instantiating an `autovideosink` (probably with an `ffmpegcolorspace` in
instantiating an `autovideosink` (probably with a `videoconvert` in
front) and link it to the demuxer when the right pad appears. Hint: You
are already printing on screen the type of the video pads.
You should now see (and hear) the same movie as in [Basic tutorial 1:
Hello world\!](Basic%2Btutorial%2B1%253A%2BHello%2Bworld%2521.html). In
Hello world!](Basic+tutorial+1+Hello+world.markdown). In
that tutorial you used `playbin`, which is a handy element that
automatically takes care of all the demuxing and pad linking for you.
Most of the [Playback tutorials](Playback%2Btutorials.html) are devoted
to `playbin2`.
Most of the [Playback tutorials](Playback+tutorials.markdown) are devoted
to `playbin`.
# Conclusion
## Conclusion
In this tutorial, you learned:
- How to be notified of events using `GSignals`
- How to connect `GstPad`s directly instead of their parent elements.
- How to be notified of events using `GSignals`
- How to connect `GstPad`s directly instead of their parent elements.
- The various states of a GStreamer element.
You also combined these items to build a dynamic pipeline, which was not
@ -540,27 +527,11 @@ defined at program start, but was created as information regarding the
media was available.
You can now continue with the basic tutorials and learn about performing
seeks and time-related queries in [Basic tutorial 4: Time
management](Basic%2Btutorial%2B4%253A%2BTime%2Bmanagement.html) or move
to the [Playback tutorials](Playback%2Btutorials.html), and gain more
insight about the `playbin2` element.
seeks and time-related queries in [Basic tutorial 4: Time
management](Basic+tutorial+4+Time+management.markdown) or move
to the [Playback tutorials](Playback+tutorials.markdown), and gain more
insight about the `playbin` element.
Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon\!
## Attachments:
![](images/icons/bullet_blue.gif)
[src-element.png](attachments/327784/1540098.png) (image/png)
![](images/icons/bullet_blue.gif)
[filter-element.png](attachments/327784/1540099.png) (image/png)
![](images/icons/bullet_blue.gif)
[sink-element.png](attachments/327784/1540100.png) (image/png)
![](images/icons/bullet_blue.gif)
[filter-element-multi.png](attachments/327784/1540101.png) (image/png)
![](images/icons/bullet_blue.gif)
[simple-player.png](attachments/327784/1540102.png) (image/png)
Document generated by Confluence on Oct 08, 2015 10:27
It has been a pleasure having you here, and see you soon!


@ -1,8 +1,6 @@
# GStreamer SDK documentation : Basic tutorial 4: Time management
# Basic tutorial 4: Time management
This page last changed on Jun 15, 2012 by xartigas.
# Goal
## Goal
This tutorial shows how to use GStreamer time-related facilities. In
particular:
@ -13,9 +11,9 @@ particular:
- How to seek (jump) to a different position (time instant) inside the
stream.
# Introduction
## Introduction
`GstQuery` is a mechanism that allows asking an element or pad for a
`GstQuery` is a mechanism that allows asking an element or pad for a
piece of information. In this example we ask the pipeline if seeking is
allowed (some sources, like live streams, do not allow seeking). If it
is allowed, then, once the movie has been running for ten seconds, we
@ -30,9 +28,9 @@ User Interface on a periodic basis.
Finally, the stream duration is queried and updated whenever it changes.
# Seeking example
## Seeking example
Copy this code into a text file named `basic-tutorial-4.c` (or find it
Copy this code into a text file named `basic-tutorial-4.c` (or find it
in the SDK installation).
**basic-tutorial-4.c**
@ -42,7 +40,7 @@ in the SDK installation).
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin2; /* Our one and only element */
GstElement *playbin; /* Our one and only element */
gboolean playing; /* Are we in the PLAYING state? */
gboolean terminate; /* Should we terminate execution? */
gboolean seek_enabled; /* Is seeking enabled for this media? */
@ -69,26 +67,26 @@ int main(int argc, char *argv[]) {
gst_init (&argc, &argv);
/* Create the elements */
data.playbin2 = gst_element_factory_make ("playbin2", "playbin2");
data.playbin = gst_element_factory_make ("playbin", "playbin");
if (!data.playbin2) {
if (!data.playbin) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin2, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
g_object_set (data.playbin, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Start playing */
ret = gst_element_set_state (data.playbin2, GST_STATE_PLAYING);
ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
g_printerr ("Unable to set the pipeline to the playing state.\n");
gst_object_unref (data.playbin2);
gst_object_unref (data.playbin);
return -1;
}
/* Listen to the bus */
bus = gst_element_get_bus (data.playbin2);
bus = gst_element_get_bus (data.playbin);
do {
msg = gst_bus_timed_pop_filtered (bus, 100 * GST_MSECOND,
GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS | GST_MESSAGE_DURATION_CHANGED);
@ -99,17 +97,16 @@ int main(int argc, char *argv[]) {
} else {
/* We got no message, this means the timeout expired */
if (data.playing) {
GstFormat fmt = GST_FORMAT_TIME;
gint64 current = -1;
/* Query the current position of the stream */
if (!gst_element_query_position (data.playbin2, &fmt, &current)) {
if (!gst_element_query_position (data.playbin, GST_FORMAT_TIME, &current)) {
g_printerr ("Could not query current position.\n");
}
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (data.duration)) {
if (!gst_element_query_duration (data.playbin2, &fmt, &data.duration)) {
if (!gst_element_query_duration (data.playbin, GST_FORMAT_TIME, &data.duration)) {
g_printerr ("Could not query current duration.\n");
}
}
@ -121,7 +118,7 @@ int main(int argc, char *argv[]) {
/* If seeking is enabled, we have not done it yet, and the time is right, seek */
if (data.seek_enabled && !data.seek_done && current > 10 * GST_SECOND) {
g_print ("\nReached 10s, performing seek...\n");
gst_element_seek_simple (data.playbin2, GST_FORMAT_TIME,
gst_element_seek_simple (data.playbin, GST_FORMAT_TIME,
GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, 30 * GST_SECOND);
data.seek_done = TRUE;
}
@ -131,8 +128,8 @@ int main(int argc, char *argv[]) {
/* Free resources */
gst_object_unref (bus);
gst_element_set_state (data.playbin2, GST_STATE_NULL);
gst_object_unref (data.playbin2);
gst_element_set_state (data.playbin, GST_STATE_NULL);
gst_object_unref (data.playbin);
return 0;
}
@ -160,7 +157,7 @@ static void handle_message (CustomData *data, GstMessage *msg) {
case GST_MESSAGE_STATE_CHANGED: {
GstState old_state, new_state, pending_state;
gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin2)) {
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
g_print ("Pipeline state changed from %s to %s:\n",
gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));
@ -172,7 +169,7 @@ static void handle_message (CustomData *data, GstMessage *msg) {
GstQuery *query;
gint64 start, end;
query = gst_query_new_seeking (GST_FORMAT_TIME);
if (gst_element_query (data->playbin2, query)) {
if (gst_element_query (data->playbin, query)) {
gst_query_parse_seeking (query, NULL, &data->seek_enabled, &start, &end);
if (data->seek_enabled) {
g_print ("Seeking is ENABLED from %" GST_TIME_FORMAT " to %" GST_TIME_FORMAT "\n",
@ -197,36 +194,25 @@ static void handle_message (CustomData *data, GstMessage *msg) {
}
```
<table>
<tbody>
<tr class="odd">
<td><img src="images/icons/emoticons/information.png" width="16" height="16" /></td>
<td><div id="expander-1441912910" class="expand-container">
<div id="expander-control-1441912910" class="expand-control">
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-1441912910" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Build">Linux</a> or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc basic-tutorial-4.c -o basic-tutorial-4 `pkg-config --cflags --libs gstreamer-0.10`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing%2Bon%2BLinux.html#InstallingonLinux-Run">Linux</a>  or <a href="Installing%2Bon%2BWindows.html#InstallingonWindows-Run">Windows</a></p>
<p><span>This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. 10 seconds into the movie it skips to a new position.</span></p>
<p><span><span>Required libraries: </span><span> </span><code>gstreamer-0.10</code></span></p>
</div>
</div></td>
</tr>
</tbody>
</table>
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Build), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Build) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Build), or use this specific command on Linux:
>
> ``gcc basic-tutorial-4.c -o basic-tutorial-4 `pkg-config --cflags --libs gstreamer-1.0` ``
>
>If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Run), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Run) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Run).
>
> This tutorial opens a window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. 10 seconds into the movie it skips to a new position.
>
>Required libraries: `gstreamer-1.0`
# Walkthrough
## Walkthrough
``` first-line: 3; theme: Default; brush: cpp; gutter: true
```
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin2; /* Our one and only element */
GstElement *playbin; /* Our one and only element */
gboolean playing; /* Are we in the PLAYING state? */
gboolean terminate; /* Should we terminate execution? */
gboolean seek_enabled; /* Is seeking enabled for this media? */
@ -241,17 +227,17 @@ static void handle_message (CustomData *data, GstMessage *msg);
We start by defining a structure to contain all our information, so we
can pass it around to other functions. In particular, in this example we
move the message handling code to its own function
`handle_message` because it is growing a bit too big.
`handle_message` because it is growing a bit too big.
We would then build a pipeline composed of a single element, a
`playbin2`, which we already saw in [Basic tutorial 1: Hello
world\!](Basic%2Btutorial%2B1%253A%2BHello%2Bworld%2521.html). However,
`playbin2` is in itself a pipeline, and in this case it is the only
element in the pipeline, so we use directly the `playbin2` element. We
will skip the details: the URI of the clip is given to `playbin2` via
`playbin`, which we already saw in [Basic tutorial 1: Hello
world!](Basic+tutorial+1+Hello+world.markdown). However,
`playbin` is in itself a pipeline, and in this case it is the only
element in the pipeline, so we use directly the `playbin` element. We
will skip the details: the URI of the clip is given to `playbin` via
the URI property and the pipeline is set to the playing state.
``` first-line: 53; theme: Default; brush: cpp; gutter: true
```
msg = gst_bus_timed_pop_filtered (bus, 100 * GST_MSECOND,
GST_MESSAGE_STATE_CHANGED | GST_MESSAGE_ERROR | GST_MESSAGE_EOS | GST_MESSAGE_DURATION_CHANGED);
```
@ -261,16 +247,16 @@ Previously we did not provide a timeout to
message was received. Now we use a timeout of 100 milliseconds, so, if
no message is received, 10 times per second the function will return
with a NULL instead of a `GstMessage`. We are going to use this to
update our “UI”. Note that the timeout period is specified in
nanoseconds, so usage of the `GST_SECOND` or `GST_MSECOND` macros is
update our “UI”. Note that the timeout period is specified in
nanoseconds, so usage of the `GST_SECOND` or `GST_MSECOND` macros is
highly recommended.
If we got a message, we process it in the `handle_message`` `function
If we got a message, we process it in the `handle_message` function
(next subsection), otherwise:
#### User interface resfreshing
### User interface refreshing
``` first-line: 60; theme: Default; brush: cpp; gutter: true
```
/* We got no message, this means the timeout expired */
if (data.playing) {
```
@ -280,45 +266,45 @@ anything here, since most queries would fail. Otherwise, it is time to
refresh the screen.
We get here approximately 10 times per second, a good enough refresh
rate for our UI. We are going to print on screen the current media
rate for our UI. We are going to print on screen the current media
position, which we can learn by querying the pipeline. This involves a
few steps that will be shown in the next subsection, but, since position
and duration are common enough queries, `GstElement` offers easier,
and duration are common enough queries, `GstElement` offers easier,
ready-made alternatives:
``` first-line: 65; theme: Default; brush: cpp; gutter: true
```
/* Query the current position of the stream */
if (!gst_element_query_position (data.pipeline, &fmt, &current)) {
if (!gst_element_query_position (data.playbin, GST_FORMAT_TIME, &current)) {
g_printerr ("Could not query current position.\n");
}
```
`gst_element_query_position()` hides the management of the query object
`gst_element_query_position()` hides the management of the query object
and directly provides us with the result.
``` first-line: 70; theme: Default; brush: cpp; gutter: true
```
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (data.duration)) {
if (!gst_element_query_duration (data.pipeline, &fmt, &data.duration)) {
if (!gst_element_query_duration (data.playbin, GST_FORMAT_TIME, &data.duration)) {
g_printerr ("Could not query current duration.\n");
}
}
```
Now is a good moment to know the length of the stream, with
another `GstElement` helper function: `gst_element_query_duration()`
another `GstElement` helper function: `gst_element_query_duration()`
``` first-line: 77; theme: Default; brush: cpp; gutter: true
```
/* Print current position and total duration */
g_print ("Position %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
GST_TIME_ARGS (current), GST_TIME_ARGS (data.duration));
```
Note the usage of the `GST_TIME_FORMAT` and `GST_TIME_ARGS` macros to
Note the usage of the `GST_TIME_FORMAT` and `GST_TIME_ARGS` macros to
provide user-friendly representation of GStreamer
times.
``` first-line: 81; theme: Default; brush: cpp; gutter: true
```
/* If seeking is enabled, we have not done it yet, and the time is right, seek */
if (data.seek_enabled && !data.seek_done && current > 10 * GST_SECOND) {
g_print ("\nReached 10s, performing seek...\n");
@ -329,13 +315,13 @@ if (data.seek_enabled && !data.seek_done && current > 10 * GST_SECOND) {
```
Now we perform the seek, “simply” by
calling `gst_element_seek_simple()` on the pipeline. A lot of the
calling `gst_element_seek_simple()` on the pipeline. A lot of the
intricacies of seeking are hidden in this method, which is a good
thing\!
thing!
Let's review the parameters:
`GST_FORMAT_TIME` indicates that we are specifying the destination in
`GST_FORMAT_TIME` indicates that we are specifying the destination in
time, as opposed to bytes (and other more obscure mechanisms).
Then come the GstSeekFlags, let's review the most common:
`GST_SEEK_FLAG_FLUSH`: This discards all data currently in the pipeline
before doing the seek. Might pause a bit while the pipeline is refilled
and the new data starts to show up, but greatly increases the
“responsiveness” of the application. If this flag is not provided,
“stale” data might be shown for a while until the new position appears
at the end of the pipeline.

`GST_SEEK_FLAG_KEY_UNIT`: With most encoded video streams, seeking to
arbitrary positions is not possible, only to certain frames called Key
Frames. When this flag is used, the seek will actually move to the
closest key frame and start producing data straight away. If this flag
is not used, the pipeline will internally move to the closest key frame,
but then will only output data from that point on until the requested
position is reached. This last alternative is more accurate, but might
take longer.

`GST_SEEK_FLAG_ACCURATE`: Some media clips do not provide enough
indexing information, meaning that seeking to arbitrary positions is
time-consuming. In these cases, GStreamer usually estimates the position
to seek to, and usually works just fine. If this precision is not good
enough for your case (you see seeks not going to the exact time you
asked for), then provide this flag. Be warned that it might take longer
to calculate the seeking position (very long, on some files).
And finally we provide the position to seek to. Since we asked
for `GST_FORMAT_TIME`, this position is in nanoseconds, so we use
the `GST_SECOND` macro for simplicity.
### Message Pump
The `handle_message` function processes all messages received through
the pipeline's bus. ERROR and EOS handling is the same as in previous
tutorials, so we skip to the interesting part:
```
case GST_MESSAGE_DURATION_CHANGED:
  /* The duration has changed, mark the current one as invalid */
  data->duration = GST_CLOCK_TIME_NONE;
  break;
```

This message is posted on the bus whenever the duration of the stream
changes. Here we simply mark the current duration as invalid, so it gets
re-queried later.
```
case GST_MESSAGE_STATE_CHANGED: {
  GstState old_state, new_state, pending_state;
  gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->pipeline)) {
    g_print ("Pipeline state changed from %s to %s:\n",
        gst_element_state_get_name (old_state), gst_element_state_get_name (new_state));

    /* Remember whether we are in the PLAYING state or not */
    data->playing = (new_state == GST_STATE_PLAYING);
  }
} break;
```
Seeks and time queries generally only get a valid reply when in the
PAUSED or PLAYING state, since all elements have had a chance to
receive information and configure themselves. Here we take note of
whether we are in the PLAYING state or not with the `playing`
variable.
Also, if we have just entered the PLAYING state, we do our first query.
We ask the pipeline if seeking is allowed on this stream:
```
if (data->playing) {
  /* We just moved to PLAYING. Check if seeking is possible */
  GstQuery *query;
  gint64 start, end;

  query = gst_query_new_seeking (GST_FORMAT_TIME);
  if (gst_element_query (data->pipeline, query)) {
    gst_query_parse_seeking (query, NULL, &data->seek_enabled, &start, &end);
  }
  gst_query_unref (query);
}
```
`gst_query_new_seeking()` creates a new query object of the "seeking"
type, with `GST_FORMAT_TIME` format. This indicates that we are
interested in seeking by specifying the new time to which we want to
move. We could also ask for `GST_FORMAT_BYTES`, and then seek to a
particular byte position inside the source file, but this is normally
less useful.
This query object is then passed to the pipeline with
`gst_element_query()`. The result is stored in the same query, and can
be easily retrieved with `gst_query_parse_seeking()`. It extracts a
boolean indicating if seeking is allowed, and the range in which seeking
is possible.
Don't forget to unref the query object when you are done with it.
And that's it! With this knowledge a media player can be built which
periodically updates a slider based on the current stream position and
allows seeking by moving the slider!
## Conclusion
This tutorial has shown:
  - How to query the pipeline for information using `GstQuery`

  - How to obtain common information like position and duration
    using `gst_element_query_position()` and `gst_element_query_duration()`

  - How to seek to an arbitrary position in the stream
    using `gst_element_seek_simple()`

  - In which states all these operations can be performed.
The next tutorial shows how to integrate GStreamer with a Graphical User
Interface toolkit.
Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!
# Basic tutorial 5: GUI toolkit integration
## Goal
This tutorial shows how to integrate GStreamer in a Graphical User
Interface (GUI) toolkit like [GTK+](http://www.gtk.org). Basically,
GStreamer takes care of media playback while the GUI toolkit handles
user interaction. The most interesting parts are those in which both
libraries have to interact: Instructing GStreamer to output video to a
GTK+ window and forwarding user actions to GStreamer.

In particular, you will learn:
- A mechanism to subscribe only to the messages you are interested in,
instead of being notified of all of them.
## Introduction
We are going to build a media player using the
[GTK+](http://www.gtk.org/) toolkit, but the concepts apply to other
toolkits like [QT](http://qt-project.org/), for example. A minimum
knowledge of [GTK+](http://www.gtk.org/) will help understand this
tutorial.
The main point is telling GStreamer to output the video to a window of
our choice. The specific mechanism depends on the operating system (or
rather, on the windowing system), but GStreamer provides a layer of
abstraction for the sake of platform independence. This independence
comes through the `GstVideoOverlay` interface, that allows the application to
tell a video sink the handler of the window that should receive the
rendering.
> ![Information](images/icons/emoticons/information.png)
> **GObject interfaces**
>
> A GObject *interface* (which GStreamer uses) is a set of functions that an element can implement. If it does, then it is said to support that particular interface. For example, video sinks usually create their own windows to display video, but, if they are also capable of rendering to an external window, they can choose to implement the `GstVideoOverlay` interface and provide functions to specify this external window. From the application developer point of view, if a certain interface is supported, you can use it and forget about which kind of element is implementing it. Moreover, if you are using `playbin`, it will automatically expose some of the interfaces supported by its internal elements: You can use your interface functions directly on `playbin` without knowing who is implementing them!
Another issue is that GUI toolkits usually only allow manipulation of
the graphical “widgets” through the main (or application) thread,
whereas GStreamer usually spawns multiple threads to take care of
different tasks. Calling [GTK+](http://www.gtk.org/) functions from
within callbacks will usually fail, because callbacks execute in the
calling thread, which does not need to be the main thread. This problem
can be solved by posting a message on the GStreamer bus in the callback:
The messages will be received by the main thread which will then react
accordingly.
Finally, so far we have registered a `handle_message` function that got
called every time a message appeared on the bus, which forced us to
parse every message to see if it was of interest to us. In this tutorial
a different method is used that registers a callback for each kind of
message, so there is less parsing and less code overall.
## A media player in GTK+
Let's write a very simple media player based on playbin, this time,
with a GUI!
Copy this code into a text file named `basic-tutorial-5.c` (or find it
in the SDK installation).
**basic-tutorial-5.c**
```
#include <string.h>
#include <gtk/gtk.h>
#include <gst/gst.h>
#include <gst/video/videooverlay.h>
#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
GstElement *playbin; /* Our one and only pipeline */
GtkWidget *slider; /* Slider widget to keep track of current position */
GtkWidget *streams_list; /* Text widget to display info about the streams */
@ -107,13 +100,13 @@ typedef struct _CustomData {
/* This function is called when the GUI toolkit creates the physical window that will hold the video.
* At this point we can retrieve its handler (which has a different meaning depending on the windowing system)
 * and pass it to GStreamer through the GstVideoOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
GdkWindow *window = gtk_widget_get_window (widget);
guintptr window_handle;
if (!gdk_window_ensure_native (window))
g_error ("Couldn't create native window needed for GstVideoOverlay!");
/* Retrieve window handler from GDK */
#if defined (GDK_WINDOWING_WIN32)
window_handle = (guintptr)GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_QUARTZ)
window_handle = gdk_quartz_window_get_nsview (window);
#elif defined (GDK_WINDOWING_X11)
window_handle = GDK_WINDOW_XID (window);
#endif
/* Pass it to playbin, which implements GstVideoOverlay and will forward it to the video sink */
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->playbin), window_handle);
}
/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}
/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}
/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_READY);
}
/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
  stop_cb (NULL, data);
  gtk_main_quit ();
}
/* This function is called everytime the video window needs to be redrawn (due to damage/exposure,
* rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
* we simply draw a black rectangle to avoid garbage showing up. */
static gboolean draw_cb (GtkWidget *widget, cairo_t *cr, CustomData *data) {
if (data->state < GST_STATE_PAUSED) {
GtkAllocation allocation;
/* Cairo is a 2D graphics library which we use here to clean the video window.
* It is used by GStreamer for other reasons, so it will always be available to us. */
gtk_widget_get_allocation (widget, &allocation);
cairo_set_source_rgb (cr, 0, 0, 0);
cairo_rectangle (cr, 0, 0, allocation.width, allocation.height);
cairo_fill (cr);
  }

  return FALSE;
}

/* This function is called when the slider changes its position. We perform a seek to the
* new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
gst_element_seek_simple (data->playbin, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
(gint64)(value * GST_SECOND));
}
/* This creates all the GTK+ widgets that compose our application, and registers the callbacks */
static void create_ui (CustomData *data) {
  GtkWidget *main_window;  /* The uppermost window, containing all other windows */
  GtkWidget *video_window; /* The drawing area where the video will be shown */
  GtkWidget *main_box;     /* VBox to hold main_hbox and the controls */
  GtkWidget *main_hbox;    /* HBox to hold the video_window and the stream info text widget */
  GtkWidget *controls;     /* HBox to hold the buttons and the slider */
  GtkWidget *play_button, *pause_button, *stop_button; /* Buttons */

  main_window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  g_signal_connect (G_OBJECT (main_window), "delete-event", G_CALLBACK (delete_event_cb), data);
video_window = gtk_drawing_area_new ();
gtk_widget_set_double_buffered (video_window, FALSE);
g_signal_connect (video_window, "realize", G_CALLBACK (realize_cb), data);
g_signal_connect (video_window, "draw", G_CALLBACK (draw_cb), data);
play_button = gtk_button_new_from_stock (GTK_STOCK_MEDIA_PLAY);
g_signal_connect (G_OBJECT (play_button), "clicked", G_CALLBACK (play_cb), data);
  pause_button = gtk_button_new_from_stock (GTK_STOCK_MEDIA_PAUSE);
  g_signal_connect (G_OBJECT (pause_button), "clicked", G_CALLBACK (pause_cb), data);

  stop_button = gtk_button_new_from_stock (GTK_STOCK_MEDIA_STOP);
  g_signal_connect (G_OBJECT (stop_button), "clicked", G_CALLBACK (stop_cb), data);

  data->slider = gtk_hscale_new_with_range (0, 100, 1);
  gtk_scale_set_draw_value (GTK_SCALE (data->slider), 0);
  data->slider_update_signal_id = g_signal_connect (G_OBJECT (data->slider), "value-changed", G_CALLBACK (slider_cb), data);
data->streams_list = gtk_text_view_new ();
gtk_text_view_set_editable (GTK_TEXT_VIEW (data->streams_list), FALSE);
controls = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
gtk_box_pack_start (GTK_BOX (controls), play_button, FALSE, FALSE, 2);
gtk_box_pack_start (GTK_BOX (controls), pause_button, FALSE, FALSE, 2);
gtk_box_pack_start (GTK_BOX (controls), stop_button, FALSE, FALSE, 2);
gtk_box_pack_start (GTK_BOX (controls), data->slider, TRUE, TRUE, 2);
main_hbox = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
gtk_box_pack_start (GTK_BOX (main_hbox), video_window, TRUE, TRUE, 0);
gtk_box_pack_start (GTK_BOX (main_hbox), data->streams_list, FALSE, FALSE, 2);
main_box = gtk_box_new (GTK_ORIENTATION_VERTICAL, 0);
gtk_box_pack_start (GTK_BOX (main_box), main_hbox, TRUE, TRUE, 0);
gtk_box_pack_start (GTK_BOX (main_box), controls, FALSE, FALSE, 0);
gtk_container_add (GTK_CONTAINER (main_window), main_box);
  gtk_window_set_default_size (GTK_WINDOW (main_window), 640, 480);
  gtk_widget_show_all (main_window);
}
/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
gint64 current = -1;
/* We do not want to update anything unless we are in the PAUSED or PLAYING states */
if (data->state < GST_STATE_PAUSED)
  return TRUE;
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME, &data->duration)) {
g_printerr ("Could not query current duration.\n");
} else {
/* Set the range of the slider to the clip duration, in SECONDS */
      gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
}
}
if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
/* Block the "value-changed" signal, so the slider_cb function is not called
* (which would trigger a seek the user has not requested) */
g_signal_handler_block (data->slider, data->slider_update_signal_id);
    /* Set the position of the slider to the current pipeline position, in SECONDS */
    gtk_range_set_value (GTK_RANGE (data->slider), (gdouble)current / GST_SECOND);
    /* Re-enable the signal */
    g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
  }
  return TRUE;
}
/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin, gint stream, CustomData *data) {
/* We are possibly in a GStreamer working thread, so we notify the main
* thread of this event through a message in the bus */
gst_element_post_message (playbin,
gst_message_new_application (GST_OBJECT (playbin),
gst_structure_new_empty ("tags-changed")));
}
/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
  GError *err;
  gchar *debug_info;

  /* Print error details on the screen */
  gst_message_parse_error (msg, &err, &debug_info);
  g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
  g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
  g_clear_error (&err);
g_free (debug_info);
/* Set the pipeline to READY (which stops playback) */
gst_element_set_state (data->playbin, GST_STATE_READY);
}
/* This function is called when an End-Of-Stream message is posted on the bus.
* We just set the pipeline to READY (which stops playback) */
static void eos_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
g_print ("End-Of-Stream reached.\n");
gst_element_set_state (data->playbin, GST_STATE_READY);
}
/* This function is called when the pipeline changes states. We use it to
 * keep track of the current state. */
static void state_changed_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
GstState old_state, new_state, pending_state;
gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
data->state = new_state;
g_print ("State set to %s\n", gst_element_state_get_name (new_state));
if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
      /* For extra responsiveness, we refresh the GUI as soon as we reach the PAUSED state */
      refresh_ui (data);
    }
  }
}

/* Extract metadata from all the streams and write it to the text widget in the GUI */
static void analyze_streams (CustomData *data) {
  gint i;
  GstTagList *tags;
  gchar *str, *total_str;
  guint rate;
  gint n_video, n_audio, n_text;
  GtkTextBuffer *text;

  /* Clean current contents of the widget */
  text = gtk_text_view_get_buffer (GTK_TEXT_VIEW (data->streams_list));
gtk_text_buffer_set_text (text, "", -1);
/* Read some properties */
g_object_get (data->playbin, "n-video", &n_video, NULL);
g_object_get (data->playbin, "n-audio", &n_audio, NULL);
g_object_get (data->playbin, "n-text", &n_text, NULL);
for (i = 0; i < n_video; i++) {
tags = NULL;
/* Retrieve the stream's video tags */
g_signal_emit_by_name (data->playbin, "get-video-tags", i, &tags);
if (tags) {
total_str = g_strdup_printf ("video stream %d:\n", i);
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
      total_str = g_strdup_printf ("  codec: %s\n", str ? str : "unknown");
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      g_free (str);
      gst_tag_list_free (tags);
    }
  }
for (i = 0; i < n_audio; i++) {
tags = NULL;
/* Retrieve the stream's audio tags */
g_signal_emit_by_name (data->playbin, "get-audio-tags", i, &tags);
if (tags) {
total_str = g_strdup_printf ("\naudio stream %d:\n", i);
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) {
        total_str = g_strdup_printf ("  codec: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        total_str = g_strdup_printf ("  language: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      if (gst_tag_list_get_uint (tags, GST_TAG_BITRATE, &rate)) {
        total_str = g_strdup_printf ("  bitrate: %d\n", rate);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
      }
      gst_tag_list_free (tags);
    }
  }
for (i = 0; i < n_text; i++) {
tags = NULL;
/* Retrieve the stream's subtitle tags */
g_signal_emit_by_name (data->playbin, "get-text-tags", i, &tags);
if (tags) {
total_str = g_strdup_printf ("\nsubtitle stream %d:\n", i);
gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
        total_str = g_strdup_printf ("  language: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      gst_tag_list_free (tags);
    }
  }
}
/* This function is called when an "application" message is posted on the bus.
* Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
if (g_strcmp0 (gst_structure_get_name (gst_message_get_structure (msg)), "tags-changed") == 0) {
/* If the message is the "tags-changed" (only one we are currently issuing), update
* the stream info GUI */
analyze_streams (data);
  }
}

int main(int argc, char *argv[]) {
  CustomData data;
  GstStateChangeReturn ret;
  GstBus *bus;

  /* Initialize GTK */
  gtk_init (&argc, &argv);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
data.duration = GST_CLOCK_TIME_NONE;
/* Create the elements */
data.playbin = gst_element_factory_make ("playbin", "playbin");

if (!data.playbin) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
/* Connect to interesting signals in playbin */
g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed", (GCallback) tags_cb, &data);
g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed", (GCallback) tags_cb, &data);
g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed", (GCallback) tags_cb, &data);
/* Create the GUI */
create_ui (&data);
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (data.playbin);
gst_bus_add_signal_watch (bus);
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
gst_object_unref (bus);
/* Start playing */
ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
g_printerr ("Unable to set the pipeline to the playing state.\n");
gst_object_unref (data.playbin);
return -1;
}
/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);

/* Start the GTK main loop. We will not regain control until gtk_main_quit is called. */
gtk_main ();
/* Free resources */
gst_element_set_state (data.playbin, GST_STATE_NULL);
gst_object_unref (data.playbin);
return 0;
}
```
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Build), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Build) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Build), or use this specific command on Linux:
>
> ``gcc basic-tutorial-5.c -o basic-tutorial-5 `pkg-config --cflags --libs gstreamer-video-1.0 gtk+-3.0 gstreamer-1.0``
>
> If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Run), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Run) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Run).
>
> This tutorial opens a GTK+ window and displays a movie, with accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The Window has some GTK+ buttons to Pause, Stop and Play the movie, and a slider to show the current position of the stream, which can be dragged to change it. Also, information about the stream is shown on a column at the right edge of the window.
>
>
> Bear in mind that there is no latency management (buffering), so on slow connections, the movie might stop after a few seconds. See how [Basic tutorial 12: Streaming](Basic+tutorial+12+Streaming.markdown) solves this issue.
>
> Required libraries: `gstreamer-video-1.0 gtk+-3.0 gstreamer-1.0`
## Walkthrough
Regarding this tutorial's structure, we are not going to use forward
function definitions anymore: Functions will be defined before they are
used. Also, for clarity of explanation, the order in which the snippets
of code are presented will not always match the program order. Use the
line numbers to locate the snippets in the complete code.
```
#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
#elif defined (GDK_WINDOWING_WIN32)
#include <gdk/gdkwin32.h>
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartz.h>
#endif
```

The first thing worth noticing is that we are no longer completely
platform-independent, since we need to use the appropriate windowing
system headers. Fortunately, there are not that many supported windowing
systems, so these three lines often suffice: X11 for Linux, Win32 for
Windows and Quartz for Mac OSX.
This tutorial is composed mostly of callback functions, which will be
called from GStreamer or GTK+, so let's review the `main` function,
which registers all these callbacks.
```
int main(int argc, char *argv[]) {
CustomData data;
GstStateChangeReturn ret;
  GstBus *bus;

  /* Initialize GTK */
  gtk_init (&argc, &argv);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
data.duration = GST_CLOCK_TIME_NONE;
/* Create the elements */
data.playbin = gst_element_factory_make ("playbin", "playbin");

if (!data.playbin) {
g_printerr ("Not all elements could be created.\n");
return -1;
}
/* Set the URI to play */
g_object_set (data.playbin, "uri", "http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
```
Standard GStreamer initialization and playbin pipeline creation, along
with GTK+ initialization. Not much new.
```
/* Connect to interesting signals in playbin */
g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed", (GCallback) tags_cb, &data);
g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed", (GCallback) tags_cb, &data);
g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed", (GCallback) tags_cb, &data);
```
We are interested in being notified when new tags (metadata) appears on
the stream. For simplicity, we are going to handle all kinds of tags
(video, audio and text) from the same callback `tags_cb`.
```
/* Create the GUI */
create_ui (&data);
```
All GTK+ widget creation and signal registration happens in this
function. It contains only GTK-related function calls, so we will skip
over its definition. The signals to which it registers convey user
commands, as shown below when reviewing the
callbacks.
```
/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
bus = gst_element_get_bus (data.playbin);
gst_bus_add_signal_watch (bus);
g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
gst_object_unref (bus);
```
In [Playback tutorial 1: Playbin
usage](Playback+tutorial+1+Playbin+usage.markdown), `gst_bus_add_watch()` is
used to register a function that receives every message posted to the
GStreamer bus. We can achieve a finer granularity by using signals
instead, which allow us to register only to the messages we are
interested in. By calling `gst_bus_add_signal_watch()` we instruct the
bus to emit a signal every time it receives a message. This signal has
the name `message::detail` where *`detail`* is the message that
triggered the signal emission. For example, when the bus receives the
EOS message, it emits a signal with the name `message::eos`.
This tutorial uses the signal's details to register only for the
messages we care about. If we had registered to the `message` signal, we
would be notified of every single message, just like
`gst_bus_add_watch()` would do.
Keep in mind that, in order for the bus watches to work (be it a
`gst_bus_add_watch()` or a `gst_bus_add_signal_watch()`), there must be
a GLib main loop running. In this case, it is hidden inside the
[GTK+](http://www.gtk.org/) main loop.
```
/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);
```
Before transferring control to GTK+, we use `g_timeout_add_seconds()`
to register yet another callback, this time with a timeout, so it
gets called every second. We are going to use it to refresh the GUI from
the `refresh_ui` function.
After this, we are done with the setup and can start the GTK+ main loop.
We will regain control from our callbacks when interesting things
happen. Let's review the callbacks. Each callback has a different
signature, depending on who will call it. You can look up the signature
(the meaning of the parameters and the return value) in the
documentation of the signal.
```
/* This function is called when the GUI toolkit creates the physical window that will hold the video.
* At this point we can retrieve its handler (which has a different meaning depending on the windowing system)
* and pass it to GStreamer through the GstVideoOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
GdkWindow *window = gtk_widget_get_window (widget);
guintptr window_handle;
if (!gdk_window_ensure_native (window))
g_error ("Couldn't create native window needed for GstVideoOverlay!");
/* Retrieve window handler from GDK */
#if defined (GDK_WINDOWING_WIN32)
  window_handle = (guintptr)GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_X11)
window_handle = GDK_WINDOW_XID (window);
#endif
/* Pass it to playbin, which implements GstVideoOverlay and will forward it to the video sink */
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->playbin), window_handle);
}
```
The code comments speak for themselves. At this point in the life cycle of
the application, we know the handle (be it an X11 `XID`, a Windows
`HWND` or a Quartz `NSView`) of the window where GStreamer should
render the video. We simply retrieve it from the windowing system and
pass it to `playbin` through the `GstVideoOverlay` interface using
`gst_video_overlay_set_window_handle()`. `playbin` will locate the video
sink and pass the handle to it, so it does not create its own window
but uses this one instead.
Not much more to see here; `playbin` and the `GstVideoOverlay` really simplify
this process a lot!
```
/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}
/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}
/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
gst_element_set_state (data->playbin, GST_STATE_READY);
}
```
These three little callbacks are associated with the PLAY, PAUSE and
STOP buttons in the GUI. They simply set the pipeline to the
corresponding state. Note that in the STOP state we set the pipeline to
`READY`. We could have brought the pipeline all the way down to the
`NULL` state, but the transition would then be a little slower, since some
resources (like the audio device) would need to be released and
re-acquired.
```
/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
stop_cb (NULL, data);
  gtk_main_quit ();
}
```
gtk_main_quit() will eventually cause the call to gtk_main()
in `main` to terminate, which, in this case, finishes the program. Here,
we call it when the main window is closed, after stopping the pipeline
(just for the sake of tidiness).
```
/* This function is called every time the video window needs to be redrawn (due to damage/exposure,
* rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
* we simply draw a black rectangle to avoid garbage showing up. */
static gboolean expose_cb (GtkWidget *widget, GdkEventExpose *event, CustomData *data) {
}
```
When there is data flow (in the `PAUSED` and `PLAYING` states) the video
sink takes care of refreshing the content of the video window. In the
other cases, however, it will not, so we have to do it. In this example,
we just fill the window with a black rectangle.
```
/* This function is called when the slider changes its position. We perform a seek to the
* new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
gst_element_seek_simple (data->playbin, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
(gint64)(value * GST_SECOND));
}
```
This is an example of how a complex GUI element like a seeker bar (or
slider that allows seeking) can be very easily implemented thanks to
GStreamer and GTK+ collaborating. If the slider has been dragged to a
new position, tell GStreamer to seek to that position
with `gst_element_seek_simple()` (as seen in [Basic tutorial 4: Time
management](Basic+tutorial+4+Time+management.markdown)). The
slider has been set up so its value represents seconds.
It is worth mentioning that some performance (and responsiveness) can be
before allowing another one. Otherwise, the application might look
unresponsive if the user drags the slider frantically, which would not
allow any seek to complete before a new one is queued.
```
/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
gint64 current = -1;
/* We do not want to update anything unless we are in the PAUSED or PLAYING states */
```
This function will move the slider to reflect the current position of
the media. First off, if we are not in the `PLAYING` state, we have
nothing to do here (plus, position and duration queries will normally
fail).
```
/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME, &data->duration)) {
g_printerr ("Could not query current duration.\n");
} else {
/* Set the range of the slider to the clip duration, in SECONDS */
    gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
  }
}
```
We recover the duration of the clip if we didn't know it, so we can set
the range for the slider.
```
if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
/* Block the "value-changed" signal, so the slider_cb function is not called
* (which would trigger a seek the user has not requested) */
g_signal_handler_block (data->slider, data->slider_update_signal_id);
    gtk_range_set_value (GTK_RANGE (data->slider), (gdouble)current / GST_SECOND);
    g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
  }
  return TRUE;
}
```
We query the current pipeline position, and set the position of the
slider accordingly. This would trigger the emission of the
`value-changed` signal, which we use to know when the user is dragging
the slider. Since we do not want seeks happening unless the user
requested them, we disable the `value-changed` signal emission during
this operation with `g_signal_handler_block()` and
`g_signal_handler_unblock()`.
Returning TRUE from this function will keep it getting called in the
future. If we return FALSE, the timer will be removed.
```
/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin, gint stream, CustomData *data) {
/* We are possibly in a GStreamer working thread, so we notify the main
* thread of this event through a message in the bus */
gst_element_post_message (playbin,
gst_message_new_application (GST_OBJECT (playbin),
gst_structure_new ("tags-changed", NULL)));
}
```
thread. What we want to do here is to update a GTK+ widget to reflect
this new information, but **GTK+ does not allow operating from threads
other than the main one**.
The solution is to make `playbin` post a message on the bus and return
to the calling thread. When appropriate, the main thread will pick up
this message and update GTK.
`gst_element_post_message()` makes a GStreamer element post the given
message to the bus. `gst_message_new_application()` creates a new
message of the `APPLICATION` type. GStreamer messages have different
types, and this particular type is reserved to the application: it will
go through the bus unaffected by GStreamer. The list of types can be
found in the `GstMessageType` documentation.
Messages can deliver additional information through their embedded
`GstStructure`, which is a very flexible data container. Here, we create
a new structure with `gst_structure_new`, and name it `tags-changed`, to
avoid confusion in case we wanted to send other application messages.
Later, once in the main thread, the bus will receive this message and
emit the `message::application` signal, which we have associated to the
`application_cb` function:
```
/* This function is called when an "application" message is posted on the bus.
* Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
if (g_strcmp0 (gst_structure_get_name (gst_message_get_structure (msg)), "tags-changed") == 0) {
/* If the message is the "tags-changed" (only one we are currently issuing), update
* the stream info GUI */
analyze_streams (data);
  }
}
```
Once we have made sure it is the `tags-changed` message, we call the
`analyze_streams` function, which is also used in [Playback tutorial 1:
Playbin usage](Playback+tutorial+1+Playbin+usage.markdown) and is
more detailed there. It basically recovers the tags from the stream and
writes them in a text widget in the GUI.
The `error_cb`, `eos_cb` and `state_changed_cb` are not really worth
explaining, since they do the same as in all previous tutorials, but
from their own function now.
And this is it! The amount of code in this tutorial might seem daunting
but the required concepts are few and easy. If you have followed the
previous tutorials and have a little knowledge of GTK, you probably
understood this one and can now enjoy your very own media player!
![](attachments/basic-tutorial-5.png)
## Exercise
If this media player is not good enough for you, try to change the text
widget that displays the information about the streams into a proper
list view (or tree view). Then, when the user selects a different
stream, make GStreamer switch streams! To switch streams, you will need
to read [Playback tutorial 1: Playbin
usage](Playback+tutorial+1+Playbin+usage.markdown).
## Conclusion
This tutorial has shown:
- How to output the video to a particular window handle
using `gst_video_overlay_set_window_handle()`.
- How to refresh the GUI periodically by registering a timeout
callback with `g_timeout_add_seconds ()`.
- How to convey information to the main thread by means of application
messages through the bus with `gst_element_post_message()`.
- How to be notified only of interesting messages by making the bus
emit signals with `gst_bus_add_signal_watch()` and discriminating
among all message types using the signal details.
This allows you to build a somewhat complete media player with a proper
Graphical User Interface.
The following basic tutorials keep focusing on other individual
GStreamer topics.
It has been a pleasure having you here, and see you soon!
# Basic tutorial 6: Media formats and Pad Capabilities
## Goal
Pad Capabilities are a fundamental element of GStreamer, although most
of the time they are invisible because the framework handles them
automatically. This somewhat theoretical tutorial shows:
- Why you need to know about them.
## Introduction
### Pads
As it has already been shown, Pads allow information to enter and leave
an element. The *Capabilities* (or *Caps*, for short) of a Pad, then,
specify what kind of information can travel through the Pad. For
example, “RGB video with a resolution of 320x200 pixels and 30 frames
per second”, or “16-bits per sample audio, 5.1 channels at 44100 samples
per second”, or even compressed formats like mp3 or h264.
Pads can support multiple Capabilities (for example, a video sink can
support video in different types of RGB or YUV formats) and Capabilities can be
specified as *ranges* (for example, an audio sink can support samples
rates from 1 to 48000 samples per second). However, the actual
information traveling from Pad to Pad must have only one well-specified
type, which is decided through a process known as *negotiation*. If two
linked Pads cannot agree on a common type, linking fails with an error.
### Pad templates
Pads are created from *Pad Templates*, which indicate all possible
Capabilities a Pad could ever have. Templates are useful to create several
similar Pads, and also allow early refusal of connections between
elements: If the Capabilities of their Pad Templates do not have a
common subset (their *intersection* is empty), there is no need to
negotiate further.
Pad Templates can be viewed as the first step in the negotiation
process. As the process evolves, actual Pads are instantiated and their
Capabilities refined until they are fixed (or negotiation fails).
### Capabilities examples
```
SINK template: 'sink'
Availability: Always
Capabilities:
audio/x-raw
format: S16LE
rate: [ 1, 2147483647 ]
channels: [ 1, 2 ]
audio/x-raw
format: U8
rate: [ 1, 2147483647 ]
channels: [ 1, 2 ]
```
This pad is a sink which is always available on the element (we will not
talk about availability for now). It supports two kinds of media, both
raw audio in integer format (`audio/x-raw`): signed, 16-bit little endian and
unsigned 8-bit. The square brackets indicate a range: for instance, the
number of channels varies from 1 to 2.
```
SRC template: 'src'
Availability: Always
Capabilities:
video/x-raw
width: [ 1, 2147483647 ]
height: [ 1, 2147483647 ]
framerate: [ 0/1, 2147483647/1 ]
format: { I420, NV12, NV21, YV12, YUY2, Y42B, Y444, YUV9, YVU9, Y41B, Y800, Y8, GREY, Y16 , UYVY, YVYU, IYU1, v308, AYUV, A420 }
```
`video/x-raw` indicates that this source pad outputs raw video. It
supports a wide range of dimensions and framerates, and a set of YUV
formats (The curly braces indicate a *list*). All these formats
indicate different packing and subsampling of the image planes.
### Last remarks
You can use the `gst-inspect-1.0` tool described in [Basic tutorial 10:
GStreamer tools](Basic+tutorial+10+GStreamer+tools.markdown) to
learn about the Caps of any GStreamer element.
Bear in mind that some elements query the underlying hardware for
to play. On each state change, the Capabilities of the sink element's
Pad are shown, so you can observe how the negotiation proceeds until the
Pad Caps are fixed.
## A trivial Pad Capabilities Example
Copy this code into a text file named `basic-tutorial-6.c` (or find it
in the SDK installation).
**basic-tutorial-6.c**
```
#include <gst/gst.h>
/* Functions below print the Capabilities in a human-friendly format */
static void print_pad_templates_information (GstElementFactory * factory) {
GstStaticPadTemplate *padtemplate;
g_print ("Pad Templates for %s:\n", gst_element_factory_get_longname (factory));
if (!gst_element_factory_get_num_pad_templates (factory)) {
g_print (" none\n");
return;
}
pads = gst_element_factory_get_static_pad_templates (factory);
while (pads) {
padtemplate = pads->data;
pads = g_list_next (pads);
if (padtemplate->direction == GST_PAD_SRC)
g_print (" Availability: UNKNOWN!!!\n");
if (padtemplate->static_caps.string) {
GstCaps *caps;
g_print (" Capabilities:\n");
caps = gst_static_caps_get (&padtemplate->static_caps);
print_caps (caps, " ");
gst_caps_unref (caps);
}
g_print ("\n");
}
/* Retrieve negotiated caps (or acceptable caps if negotiation is not finished yet) */
caps = gst_pad_get_current_caps (pad);
if (!caps)
caps = gst_pad_query_caps (pad, NULL);
/* Print and free */
g_print ("Caps for the %s pad:\n", pad_name);
}
```
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Build), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Build) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Build), or use this specific command on Linux:
>
> `` gcc basic-tutorial-6.c -o basic-tutorial-6 `pkg-config --cflags --libs gstreamer-1.0` ``
>
>If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Run), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Run) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Run).
>
> This tutorial simply displays information regarding the Pad Capabilities in different time instants.
>
> Required libraries: `gstreamer-1.0`
## Walkthrough
The `print_field`, `print_caps` and `print_pad_templates` simply
display, in a human-friendly format, the capabilities structures. If you
want to learn about the internal organization of the
`GstCaps` structure, read the `GStreamer Documentation` regarding Pad
Caps.
```
/* Shows the CURRENT capabilities of the requested pad in the given element */
static void print_pad_capabilities (GstElement *element, gchar *pad_name) {
GstPad *pad = NULL;
  GstCaps *caps = NULL;

  /* Retrieve the requested pad; bail out if the element lacks it */
  pad = gst_element_get_static_pad (element, pad_name);
  if (!pad) {
    g_printerr ("Could not retrieve pad '%s'\n", pad_name);
    return;
}
/* Retrieve negotiated caps (or acceptable caps if negotiation is not finished yet) */
caps = gst_pad_get_current_caps (pad);
if (!caps)
caps = gst_pad_query_caps (pad, NULL);
/* Print and free */
g_print ("Caps for the %s pad:\n", pad_name);
  print_caps (caps, "      ");
  gst_caps_unref (caps);
  gst_object_unref (pad);
}
```
`gst_element_get_static_pad()` retrieves the named Pad from the given
element. This Pad is *static* because it is always present in the
element. To know more about Pad availability read the `GStreamer
documentation` about Pads.
Then we call `gst_pad_get_current_caps()` to retrieve the Pad's
current Capabilities, which can be fixed or not, depending on the state
of the negotiation process. They could even be non-existent, in which
case, we call `gst_pad_query_caps()` to retrieve the currently
acceptable Pad Capabilities. The currently acceptable Caps will be the
Pad Template's Caps in the NULL state, but might change in later states,
as the actual hardware Capabilities might be queried.
We then print these Capabilities.
```
/* Create the element factories */
source_factory = gst_element_factory_find ("audiotestsrc");
sink_factory = gst_element_factory_find ("autoaudiosink");
source = gst_element_factory_create (source_factory, "source");
sink = gst_element_factory_create (sink_factory, "sink");
```
In the previous tutorials we created the elements directly using
`gst_element_factory_make()` and skipped talking about factories, but we
will do so now. A `GstElementFactory` is in charge of instantiating a
particular type of element, identified by its factory name.
You can use `gst_element_factory_find()` to create a factory of type
“videotestsrc”, and then use it to instantiate multiple “videotestsrc”
elements using `gst_element_factory_create()`.
`gst_element_factory_make()` is really a shortcut for
`gst_element_factory_find()`+ `gst_element_factory_create()`.
The Pad Templates can already be accessed through the factories, so they
are printed as soon as the factories are created.
We skip the pipeline creation and start, and go to the State-Changed
message handling:
```
case GST_MESSAGE_STATE_CHANGED:
/* We are only interested in state-changed messages from the pipeline */
if (GST_MESSAGE_SRC (msg) == GST_OBJECT (pipeline)) {
```

This simply prints the current Pad Caps every time the state of the
pipeline changes. You should see, in the output, how the initial caps
(the Pad Template's Caps) are progressively refined until they are
completely fixed (they contain a single type with no ranges).
## Conclusion
This tutorial has shown:
- What are Pad Capabilities and Pad Template Capabilities.
- How to retrieve them
with `gst_pad_get_current_caps()` or `gst_pad_query_caps()`.
- That they have different meaning depending on the state of the
pipeline (initially they indicate all the possible Capabilities,
    and later they indicate the Caps currently negotiated for the Pad).
- That Pad Caps are important to know beforehand if two elements can
be linked together.
- That Pad Caps can be found using the `gst-inspect` tool described
in [Basic tutorial 10: GStreamer
tools](Basic+tutorial+10+GStreamer+tools.markdown).
Next tutorial shows how data can be manually injected into and extracted
from the GStreamer pipeline.
Remember that attached to this page you should find the complete source
code of the tutorial and any accessory files needed to build it.
It has been a pleasure having you here, and see you soon!
# Basic tutorial 7: Multithreading and Pad Availability
## Goal
GStreamer handles multithreading automatically, but, under some
circumstances, you might need to decouple threads manually. This
about Pad Availability. More precisely, this document explains:
- How to replicate streams
## Introduction
### Multithreading
explicitly that a *branch* (a part of the pipeline) runs on a different
thread (for example, to have the audio and video decoders executing
simultaneously).
This is accomplished using the `queue` element, which works as follows.
The sink pad just enqueues data and returns control. On a different
thread, data is dequeued and pushed downstream. This element is also
used for buffering, as seen later in the streaming tutorials. The size
of the queue can be controlled through properties.
This example builds the following pipeline:
![](attachments/basic-tutorial-7.png)
The source is a synthetic audio signal (a continuous tone) which is
split using a `tee` element (it sends through its source pads everything
it receives through its sink pad). One branch then sends the signal to
the audio card, and the other renders a video of the waveform and sends
it to the screen.
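As a rough sketch (element names taken from the walkthrough below; the exact conversion elements needed may vary by platform), the same pipeline can be written in the pipeline-description syntax understood by `gst_parse_launch()` and `gst-launch-1.0`, where `tee name=t` names the element and `t.` refers back to it:

```
audiotestsrc freq=215 ! tee name=t
    t. ! queue ! audioconvert ! audioresample ! autoaudiosink
    t. ! queue ! wavescope ! videoconvert ! autovideosink
```

The `queue` at the head of each branch is what gives that branch its own thread, which is exactly what the C code in this tutorial builds programmatically.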
there is only one thread, being blocked by the first sink.
### Request pads
In [Basic tutorial 3: Dynamic
pipelines](Basic+tutorial+3+Dynamic+pipelines.markdown) we saw
an element (`uridecodebin`) which had no pads to begin with, and they
appeared as data started to flow and the element learned about the
media. These are called **Sometimes Pads**, and contrast with the
regular pads which are always available and are called **Always Pads**.
The third kind of pad is the **Request Pad**, which is created on
demand. The classical example is the `tee` element, which has one sink
pad and no initial source pads: they need to be requested and then
`tee` adds them. In this way, an input stream can be replicated any
number of times. The disadvantage is that linking elements with Request
Pads is not as automatic as linking Always Pads, as the walkthrough for
this example will show.
READY states, though.
Without further delay, let's see the code.
## Simple multithreaded example
Copy this code into a text file named `basic-tutorial-7.c` (or find it
in the SDK installation).
**basic-tutorial-7.c**
```
audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
video_queue = gst_element_factory_make ("queue", "video_queue");
visual = gst_element_factory_make ("wavescope", "visual");
video_convert = gst_element_factory_make ("videoconvert", "csp");
video_sink = gst_element_factory_make ("autovideosink", "video_sink");
/* Create the empty pipeline */
}
/* Manually link the Tee, which has "Request" pads */
tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (tee), "src_%d");
tee_audio_pad = gst_element_request_pad (tee, tee_src_pad_template, NULL, NULL);
g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
queue_audio_pad = gst_element_get_static_pad (audio_queue, "sink");
}
```
> ![Information](images/icons/emoticons/information.png)
> Need help?
>
> If you need help to compile this code, refer to the **Building the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Build), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Build) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Build), or use this specific command on Linux:
>
> ``gcc basic-tutorial-7.c -o basic-tutorial-7 `pkg-config --cflags --libs gstreamer-1.0` ``
>
>If you need help to run this code, refer to the **Running the tutorials** section for your platform: [Linux](Installing+on+Linux.markdown#InstallingonLinux-Run), [Mac OS X](Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Run) or [Windows](Installing+on+Windows.markdown#InstallingonWindows-Run).
>
> This tutorial plays an audible tone through the audio card and opens a window with a waveform representation of the tone. The waveform should be a sinusoid, but due to the refreshing of the window might not appear so.
>
> Required libraries: `gstreamer-1.0`
## Walkthrough
```
/* Create the elements */
audio_source = gst_element_factory_make ("audiotestsrc", "audio_source");
tee = gst_element_factory_make ("tee", "tee");
audio_convert = gst_element_factory_make ("audioconvert", "audio_convert");
audio_sink = gst_element_factory_make ("autoaudiosink", "audio_sink");
video_queue = gst_element_factory_make ("queue", "video_queue");
visual = gst_element_factory_make ("wavescope", "visual");
video_convert = gst_element_factory_make ("videoconvert", "video_convert");
video_sink = gst_element_factory_make ("autovideosink", "video_sink");
```
All the elements in the above picture are instantiated here:
`audiotestsrc` produces a synthetic tone. `wavescope` consumes an audio
signal and renders a waveform as if it was an (admittedly cheap)
oscilloscope. We have already worked with the `autoaudiosink` and
`autovideosink`.
The conversion elements (`audioconvert`, `audioresample` and
`videoconvert`) are necessary to guarantee that the pipeline can be
linked. Indeed, the Capabilities of the audio and video sinks depend on
the hardware, and you do not know at design time if they will match the
Caps produced by the `audiotestsrc` and `wavescope`. If the Caps
matched, though, these elements act in “pass-through” mode and do not
modify the signal, having negligible impact on performance.
```
/* Configure elements */
g_object_set (audio_source, "freq", 215.0f, NULL);
g_object_set (visual, "shader", 0, "style", 1, NULL);
```
Small adjustments for better demonstration: The “freq” property of
`audiotestsrc` controls the frequency of the wave (215Hz makes the wave
appear almost stationary in the window), and this style and shader for
`wavescope` make the wave continuous. Use the `gst-inspect-1.0` tool
described in [Basic tutorial 10: GStreamer
tools](Basic+tutorial+10+GStreamer+tools.markdown) to learn all
the properties of these
elements.
```
/* Link all elements that can be automatically linked because they have "Always" pads */
gst_bin_add_many (GST_BIN (pipeline), audio_source, tee, audio_queue, audio_convert, audio_sink,
video_queue, visual, video_convert, video_sink, NULL);
```
This code block adds all elements to the pipeline and then links the
ones that can be automatically linked (the ones with Always Pads, as the
comment says).
> ![Warning](images/icons/emoticons/warning.png)
> `gst_element_link_many()` can actually link elements with Request Pads. It internally requests the Pads so you do not have worry about the elements being linked having Always or Request Pads. Strange as it might seem, this is actually inconvenient, because you still need to release the requested Pads afterwards, and, if the Pad was requested automatically by `gst_element_link_many()`, it is easy to forget. Stay out of trouble by always requesting Request Pads manually, as shown in the next code block.
```
/* Manually link the Tee, which has "Request" pads */
tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (tee), "src_%d");
tee_audio_pad = gst_element_request_pad (tee, tee_src_pad_template, NULL, NULL);
g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
queue_audio_pad = gst_element_get_static_pad (audio_queue, "sink");
```
To link Request Pads, they need to be obtained by “requesting” them to
the element. An element might be able to produce different kinds of
Request Pads, so, when requesting them, the desired Pad Template must be
provided. Pad templates are obtained with
`gst_element_class_get_pad_template()` and are identified by their name.
In the documentation for the `tee` element we see that it has two pad
templates named “sink” (for its sink Pads) and “src_%d” (for the Request
Pads).
Once we have the Pad template, we request two Pads from the tee (for the
Request Pads need to be linked. These are normal Always Pads, so we
obtain them with `gst_element_get_static_pad()`.
Finally, we link the pads with `gst_pad_link()`. This is the function
that `gst_element_link()` and `gst_element_link_many()` use internally.
The sink Pads we have obtained need to be released with
`gst_object_unref()`. The Request Pads will be released when we no
We then set the pipeline to playing as usual, and wait until an error
message or an EOS is produced. The only thing left to do is release the
requested Pads:
```
/* Release the request pads from the Tee, and unref them */
gst_element_release_request_pad (tee, tee_audio_pad);
gst_element_release_request_pad (tee, tee_video_pad);
gst_object_unref (tee_audio_pad);
gst_object_unref (tee_video_pad);
```
`gst_element_release_request_pad()` releases the pad from the `tee`, but
it still needs to be unreferenced (freed) with `gst_object_unref()`.
## Conclusion
This tutorial has shown:
- How to make parts of a pipeline run on a different thread by using
`queue` elements.
- What is a Request Pad and how to link elements with request pads,
with `gst_element_class_get_pad_template()`, `gst_element_request_pad()`, `gst_pad_link()` and
`gst_element_release_request_pad()`.
- How to have the same stream available in different branches by using
`tee` elements.
The next tutorial builds on top of this one to show how data can be
manually injected into and extracted from a running pipeline.
It has been a pleasure having you here, and see you soon!
# Basic tutorial 8: Short-cutting the pipeline
## Goal
Pipelines constructed with GStreamer do not need to be completely
closed. Data can be injected into the pipeline and extracted from it at
any time, in a variety of ways. This tutorial shows:
- How to access and manipulate this data.
[Playback tutorial 3: Short-cutting the
pipeline](Playback+tutorial+3+Short-cutting+the+pipeline.markdown) explains
how to achieve the same goals in a playbin-based pipeline.
## Introduction
Applications can interact with the data flowing through a GStreamer
pipeline in several ways. This tutorial describes the easiest one, since
it uses elements that have been created for this sole purpose.
The element used to inject application data into a GStreamer pipeline is
`appsrc`, and its counterpart, used to extract GStreamer data back to
the application is `appsink`. To avoid confusing the names, think of it
from GStreamer's point of view: `appsrc` is just a regular source, that
provides data magically fallen from the sky (provided by the
application, actually). `appsink` is a regular sink, where the data
flowing through a GStreamer pipeline goes to die (it is recovered by the
application, actually).
`appsrc` and `appsink` are so versatile that they offer their own API
(see their documentation), which can be accessed by linking against the
`gstreamer-app` library. In this tutorial, however, we will use a
simpler approach and control them through signals.
`appsrc` can work in a variety of modes: in **pull** mode, it requests
data from the application every time it needs it. In **push** mode, the
application pushes data at its own pace. Furthermore, in push mode, the
application can choose to be blocked in the push function when enough
data has already been provided, or it can listen to the
`enough-data` and `need-data` signals to control flow. This example
implements the latter approach. Information regarding the other methods
can be found in the `appsrc` documentation.
### Buffers
Since this example produces and consumes data, we need to know about
`GstBuffer`s.
Source Pads produce buffers, that are consumed by Sink Pads; GStreamer
takes these buffers and passes them from element to element.
A buffer simply represents a unit of data; do not assume that all
buffers will have the same size, or represent the same amount of time.
Neither should you assume that if a single buffer enters an element, a
single buffer will come out. Elements are free to do with the received
buffers as they please. `GstBuffer`s may also contain more than one
actual memory buffer. Actual memory buffers are abstracted away using
`GstMemory` objects, and a `GstBuffer` can contain multiple `GstMemory` objects.
Every buffer has attached time-stamps and a duration that describe the
moment at which the content of the buffer should be decoded, rendered or
displayed. Time stamping is a very complex and delicate subject, but
this simplified vision should suffice for now.
As an example, a `filesrc` (a GStreamer element that reads files)
produces buffers with the “ANY” caps and no time-stamping information.
After demuxing (see [Basic tutorial 3: Dynamic
pipelines](Basic+tutorial+3+Dynamic+pipelines.markdown))
buffers can have some specific caps, for example “video/x-h264”. After
decoding, each buffer will contain a single video frame with raw caps
(for example, “video/x-raw”) and very precise time stamps indicating
when should that frame be displayed.
### This tutorial
This tutorial expands [Basic tutorial 7: Multithreading and Pad
Availability](Basic+tutorial+7+Multithreading+and+Pad+Availability.markdown) in
two ways: firstly, the `audiotestsrc` is replaced by an `appsrc` that
will generate the audio data. Secondly, a new branch is added to the
`tee` so data going into the audio sink and the wave display is also
replicated into an `appsink`. The `appsink` uploads the information back
into the application, which then just notifies the user that data has
been received, but it could obviously perform more complex tasks.
![](attachments/basic-tutorial-8.png)
## A crude waveform generator
Copy this code into a text file named `basic-tutorial-8.c` (or find it
in the SDK installation).
```
#include <gst/gst.h>
#include <gst/audio/audio.h>
#include <string.h>
#define CHUNK_SIZE 1024 /* Amount of bytes we are sending in each buffer */
#define SAMPLE_RATE 44100 /* Samples per second we are sending */
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct _CustomData {
static gboolean push_data (CustomData *data) {
GstBuffer *buffer;
GstFlowReturn ret;
int i;
GstMapInfo map;
gint16 *raw;
gint num_samples = CHUNK_SIZE / 2; /* Because each sample is 16 bits */
gfloat freq;
GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (CHUNK_SIZE, GST_SECOND, SAMPLE_RATE);
/* Generate some psychodelic waveforms */
gst_buffer_map (buffer, &map, GST_MAP_WRITE);
raw = (gint16 *)map.data;
data->c += data->d;
data->d -= data->c / 1000;
freq = 1100 + 1000 * data->d;
data->b -= data->a / freq;
raw[i] = (gint16)(500 * data->a);
}
gst_buffer_unmap (buffer, &map);
data->num_samples += num_samples;
/* Push the buffer into the appsrc */
}
/* The appsink has received a buffer */
static void new_sample (GstElement *sink, CustomData *data) {
GstSample *sample;
/* Retrieve the buffer */
g_signal_emit_by_name (sink, "pull-sample", &sample);
if (sample) {
/* The only thing we do in this example is print a * to indicate a received buffer */
g_print ("*");
gst_sample_unref (sample);
}
}
int main(int argc, char *argv[]) {
GstPadTemplate *tee_src_pad_template;
GstPad *tee_audio_pad, *tee_video_pad, *tee_app_pad;
GstPad *queue_audio_pad, *queue_video_pad, *queue_app_pad;
GstAudioInfo info;
GstCaps *audio_caps;
GstBus *bus;
data.video_queue = gst_element_factory_make ("queue", "video_queue");
data.audio_convert2 = gst_element_factory_make ("audioconvert", "audio_convert2");
data.visual = gst_element_factory_make ("wavescope", "visual");
data.video_convert = gst_element_factory_make ("videoconvert", "csp");
data.video_sink = gst_element_factory_make ("autovideosink", "video_sink");
data.app_queue = gst_element_factory_make ("queue", "app_queue");
data.app_sink = gst_element_factory_make ("appsink", "app_sink");
g_object_set (data.visual, "shader", 0, "style", 0, NULL);
/* Configure appsrc */
gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, SAMPLE_RATE, 1, NULL);
audio_caps = gst_audio_info_to_caps (&info);
g_object_set (data.app_source, "caps", audio_caps, "format", GST_FORMAT_TIME, NULL);
g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);
/* Configure appsink */
g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
gst_caps_unref (audio_caps);
}
/* Manually link the Tee, which has "Request" pads */
tee_src_pad_template = gst_element_class_get_pad_template (GST_ELEMENT_GET_CLASS (data.tee), "src_%d");
tee_audio_pad = gst_element_request_pad (data.tee, tee_src_pad_template, NULL, NULL);
g_print ("Obtained request pad %s for audio branch.\n", gst_pad_get_name (tee_audio_pad));
queue_audio_pad = gst_element_get_static_pad (data.audio_queue, "sink");
<span class="expand-control-icon"><img src="images/icons/grey_arrow_down.gif" class="expand-control-image" /></span><span class="expand-control-text">Need help? (Click to expand)</span>
</div>
<div id="expander-content-1760171459" class="expand-content">
<p>If you need help to compile this code, refer to the <strong>Building the tutorials</strong> section for your platform: <a href="Installing+on+Linux.markdown#InstallingonLinux-Build">Linux</a>, <a href="Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Build">Mac OS X</a> or <a href="Installing+on+Windows.markdown#InstallingonWindows-Build">Windows</a>, or use this specific command on Linux:</p>
<div class="panel" style="border-width: 1px;">
<div class="panelContent">
<p><code>gcc basic-tutorial-8.c -o basic-tutorial-8 `pkg-config --cflags --libs gstreamer-1.0 gstreamer-audio-1.0`</code></p>
</div>
</div>
<p>If you need help to run this code, refer to the <strong>Running the tutorials</strong> section for your platform: <a href="Installing+on+Linux.markdown#InstallingonLinux-Run">Linux</a>, <a href="Installing+on+Mac+OS+X.markdown#InstallingonMacOSX-Run">Mac OS X</a> or <a href="Installing+on+Windows.markdown#InstallingonWindows-Run">Windows</a></p>
<p><span>This tutorial plays an audible tone of varying frequency through the audio card and opens a window with a waveform representation of the tone. The waveform should be a sinusoid, but due to the refreshing of the window might not appear so.</span></p>
<p>Required libraries: <code>gstreamer-1.0 gstreamer-audio-1.0</code></p>
</div>
</tbody>
</table>
## Walkthrough
The code to create the pipeline (Lines 131 to 205) is an enlarged
version of [Basic tutorial 7: Multithreading and Pad
Availability](Basic+tutorial+7+Multithreading+and+Pad+Availability.markdown).
It involves instantiating all the elements, linking the elements with
Always Pads, and manually linking the Request Pads of the `tee` element.
Regarding the configuration of the `appsrc` and `appsink` elements:
```
/* Configure appsrc */
gst_audio_info_set_format (&info, GST_AUDIO_FORMAT_S16, SAMPLE_RATE, 1, NULL);
audio_caps = gst_audio_info_to_caps (&info);
g_object_set (data.app_source, "caps", audio_caps, "format", GST_FORMAT_TIME, NULL);
g_signal_connect (data.app_source, "need-data", G_CALLBACK (start_feed), &data);
g_signal_connect (data.app_source, "enough-data", G_CALLBACK (stop_feed), &data);
```
The first property that needs to be set on the `appsrc` is `caps`. It
specifies the kind of data that the element is going to produce, so
GStreamer can check if linking with downstream elements is possible
(this is, if the downstream elements will understand this kind of data).
This property must be a `GstCaps` object, which in this example is built
from a `GstAudioInfo` structure with `gst_audio_info_to_caps()` (a caps
string could also be converted with `gst_caps_from_string()`).
We then connect to the `need-data` and `enough-data` signals. These are
fired by `appsrc` when its internal queue of data is running low or
almost full, respectively. We will use these signals to start and stop
(respectively) our signal generation process.
```
/* Configure appsink */
g_object_set (data.app_sink, "emit-signals", TRUE, "caps", audio_caps, NULL);
g_signal_connect (data.app_sink, "new-sample", G_CALLBACK (new_sample), &data);
gst_caps_unref (audio_caps);
```
Regarding the `appsink` configuration, we connect to the
`new-sample` signal, which is emitted every time the sink receives a
buffer. Also, the signal emission needs to be enabled through the
`emit-signals` property, because, by default, it is disabled.
Starting the pipeline, waiting for messages and final cleanup is done as
usual. Let's review the callbacks we have just
```
/* This signal callback triggers when appsrc needs data. Here, we add an idle handler
 * to the mainloop to start pushing data into the appsrc */
static void start_feed (GstElement *source, guint size, CustomData *data) {
  if (data->sourceid == 0) {
    g_print ("Start feeding\n");
    data->sourceid = g_idle_add ((GSourceFunc) push_data, data);
  }
}
```
This function is called when the internal queue of `appsrc` is about to
starve (run out of data). The only thing we do here is register a GLib
idle function with `g_idle_add()` that feeds data to `appsrc` until it
is full again. A GLib idle function is a method that GLib will call from
its main loop whenever it is “idle”, this is, when it has no
higher-priority tasks to perform. It requires a GLib `GMainLoop` to be
instantiated and running, obviously.
This is only one of the multiple approaches that `appsrc` allows. In
particular, buffers do not need to be fed into `appsrc` from the main
thread using GLib, and you do not need to use the `need-data` and
`enough-data` signals to synchronize with `appsrc` (although this is
allegedly the most convenient).
We take note of the sourceid that `g_idle_add()` returns, so we can
disable it
later.
```
/* This callback triggers when appsrc has enough data and we can stop sending.
 * We remove the idle handler from the mainloop */
static void stop_feed (GstElement *source, CustomData *data) {
  if (data->sourceid != 0) {
    g_print ("Stop feeding\n");
    g_source_remove (data->sourceid);
    data->sourceid = 0;
  }
}
```
This function is called when the internal queue of `appsrc` is full
enough so we stop pushing data. Here we simply remove the idle function
by using `g_source_remove()` (The idle function is implemented as a
`GSource`).
```
static gboolean push_data (CustomData *data) {
  GstBuffer *buffer;
  GstFlowReturn ret;

  /* Create a new empty buffer */
  buffer = gst_buffer_new_and_alloc (CHUNK_SIZE);

  /* Set its timestamp and duration */
  GST_BUFFER_TIMESTAMP (buffer) = gst_util_uint64_scale (data->num_samples, GST_SECOND, SAMPLE_RATE);
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale (CHUNK_SIZE / 2, GST_SECOND, SAMPLE_RATE);

  /* Waveform generation into GST_BUFFER_DATA (buffer) omitted, see the
   * full source */
  ...
```

This is the function that feeds `appsrc`. It will be called by GLib at
times and rates which are out of our control, but we know that we will
disable it when its job is done (when the queue in `appsrc` is full).

Its first task is to create a new buffer with a given size (in this
example, it is arbitrarily set to 1024 bytes) with
`gst_buffer_new_and_alloc()`.

We count the number of samples that we have generated so far with the
`CustomData.num_samples` variable, so we can time-stamp this buffer
using the `GST_BUFFER_TIMESTAMP` macro in `GstBuffer`.

Since we are producing buffers of the same size, their duration is the
same and is set using the `GST_BUFFER_DURATION` macro in `GstBuffer`.
`gst_util_uint64_scale()` is a utility function that scales (multiplies
and divides) numbers which can be large, without fear of overflows.

The bytes that form the buffer can be accessed with `GST_BUFFER_DATA` in
`GstBuffer` (be careful not to write past the end of the buffer: you
allocated it, so you know its size).

We will skip over the waveform generation, since it is outside the scope
of this tutorial.

```
/* Push the buffer into the appsrc */
g_signal_emit_by_name (data->app_source, "push-buffer", buffer, &ret);

/* Free the buffer now that we are done with it */
gst_buffer_unref (buffer);
```
Once we have the buffer ready, we pass it to `appsrc` with the
`push-buffer` action signal (see information box at the end of [Playback
tutorial 1: Playbin2
usage](Playback+tutorial+1+Playbin2+usage.markdown)), and then
`gst_buffer_unref()` it since we no longer need it.
```
/* The appsink has received a buffer */
static void new_buffer (GstElement *sink, CustomData *data) {
  GstBuffer *buffer;

  /* Retrieve the buffer */
  g_signal_emit_by_name (sink, "pull-buffer", &buffer);
  if (buffer) {
    /* The only thing we do in this example is print a * to indicate a
     * received buffer */
    g_print ("*");
    gst_buffer_unref (buffer);
  }
}
```
Finally, this is the function that gets called when the
`appsink` receives a buffer. We use the `pull-buffer` action signal to
retrieve the buffer and then just print some indicator on the screen. We
can retrieve the data pointer using the `GST_BUFFER_DATA` macro and the
data size using the `GST_BUFFER_SIZE` macro in `GstBuffer`. Remember
that this buffer does not have to match the buffer that we produced in
the `push_data` function; any element in the path could have altered the
buffers in any way (not in this example: there is only a `tee` in the
path between `appsrc` and `appsink`, and it does not change the content
of the buffers).
We then `gst_buffer_unref()` the buffer, and this tutorial is done.
## Conclusion
This tutorial has shown how applications can:
- Inject data into a pipeline using the `appsrc` element.
- Retrieve data from a pipeline using the `appsink` element.
- Manipulate this data by accessing the `GstBuffer`.
In a playbin2-based pipeline, the same goals are achieved in a slightly
different way. [Playback tutorial 3: Short-cutting the
pipeline](Playback+tutorial+3+Short-cutting+the+pipeline.markdown) shows
how to do it.
It has been a pleasure having you here, and see you soon!
# Basic tutorials
## Welcome to the GStreamer SDK Basic tutorials
These tutorials describe general topics required to understand the rest
of the tutorials in the GStreamer SDK.
# Tutorials
## Welcome to the GStreamer SDK Tutorials!
The following sections introduce a series of tutorials designed to help
you learn how to use GStreamer, the multi-platform, modular,
open-source, media streaming framework.
Before following these tutorials, you need to set up your development
environment according to your platform. If you have not done so yet,
follow the appropriate link for [Linux](Installing+on+Linux.markdown),
[Mac OS X](Installing+on+Mac+OS+X.markdown) or
[Windows](Installing+on+Windows.markdown) and come back here
afterwards.
The tutorials are currently written only in the C programming language,
so you need to be comfortable with it. Even though C is not an
Object-Oriented (OO) language per se, the GStreamer framework uses
`GObject`s, so some knowledge of OO concepts will come in handy.
Knowledge of the `GObject` and `GLib` libraries is not mandatory, but
will make the trip easier.
### Source code
The source code for these tutorials is included in the
SDK, as explained in the installation instructions.
### A short note on GObject and GLib
GStreamer is built on top of the `GObject` (for object orientation) and
`GLib` (for common algorithms) libraries, which means that every now and
then you will have to call functions of these libraries. Even though the
tutorials will make sure that deep knowledge of these libraries is not
required, familiarity with them will certainly ease the process of
learning GStreamer.
You can always tell which library you are calling because all GStreamer
functions, structures and types have the `gst_` prefix, whereas GLib and
GObject use `g_`.
### Sources of documentation
You have the `GObject` and `GLib` reference guides, and, of course, the
upstream [GStreamer
documentation](http://gstreamer.freedesktop.org/documentation/).
When accessing external documentation, be careful to check that the
version of the documented library matches the versions used in this SDK
(which you can find on the [Releases](Releases.markdown) page).
### Structure
The tutorials are organized in sections, revolving around a common theme:
- [Basic tutorials](Basic+tutorials.markdown): Describe general topics
required to understand the rest of tutorials in the GStreamer SDK.
- [Playback tutorials](Playback+tutorials.markdown): Explain everything
you need to know to produce a media playback application using
GStreamer.
- [Android tutorials](Android+tutorials.markdown): Tutorials dealing
with the few Android-specific topics you need to know.
- [iOS tutorials](iOS+tutorials.markdown): Tutorials dealing with the
few iOS-specific topics you need to know.
If you cannot remember in which tutorial a certain GStreamer concept is
explained, use the following:
- [Table of Concepts](Table+of+Concepts.markdown)
### Sample media
The audio and video clips used throughout these tutorials are all
publicly available and the copyright remains with their respective
authors. In some cases they have been re-encoded for demonstration
purposes.
- [Sintel, the Durian Open Movie Project](http://www.sintel.org/)
{
  "add_anchors": true,
  "command": "conf",
  "index": "Home.markdown",
  "output": "built_doc",
  "project_name": "gstdotcom",
  "sitemap": "sitemap.txt"
}
Home.markdown
Building+from+source+using+Cerbero.markdown
Tutorials.markdown
Basic+tutorials.markdown
Basic+tutorial+1+Hello+world.markdown
Basic+tutorial+2+GStreamer+concepts.markdown
Basic+tutorial+3+Dynamic+pipelines.markdown
Basic+tutorial+4+Time+management.markdown
Basic+tutorial+5+GUI+toolkit+integration.markdown
Basic+tutorial+6+Media+formats+and+Pad+Capabilities.markdown
Basic+tutorial+7+Multithreading+and+Pad+Availability.markdown
Basic+tutorial+8+Short-cutting+the+pipeline.markdown
Basic+tutorial+9+Media+information+gathering.markdown
Basic+tutorial+10+GStreamer+tools.markdown
Basic+tutorial+11+Debugging+tools.markdown
Basic+tutorial+12+Streaming.markdown
Basic+tutorial+13+Playback+speed.markdown
Basic+tutorial+14+Handy+elements.markdown
Basic+tutorial+15+Clutter+integration.markdown
Basic+tutorial+16+Platform-specific+elements.markdown
Playback+tutorials.markdown
Playback+tutorial+1+Playbin2+usage.markdown
Playback+tutorial+2+Subtitle+management.markdown
Playback+tutorial+3+Short-cutting+the+pipeline.markdown
Playback+tutorial+4+Progressive+streaming.markdown
Playback+tutorial+5+Color+Balance.markdown
Playback+tutorial+6+Audio+visualization.markdown
Playback+tutorial+7+Custom+playbin2+sinks.markdown
Playback+tutorial+8+Hardware-accelerated+video+decoding.markdown
Playback+tutorial+9+Digital+audio+pass-through.markdown
Android+tutorials.markdown
Android+tutorial+1+Link+against+GStreamer.markdown
Android+tutorial+2+A+running+pipeline.markdown
Android+tutorial+3+Video.markdown
Android+tutorial+4+A+basic+media+player.markdown
Android+tutorial+5+A+Complete+media+player.markdown
iOS+tutorials.markdown
iOS+tutorial+1+Link+against+GStreamer.markdown
iOS+tutorial+2+A+running+pipeline.markdown
iOS+tutorial+3+Video.markdown
iOS+tutorial+4+A+basic+media+player.markdown
iOS+tutorial+5+A+Complete+media+player.markdown
Qt+tutorials.markdown
Basic+Media+Player.markdown
QtGStreamer+vs+C+GStreamer.markdown
Using+appsink%2Fappsrc+in+Qt.markdown
Table+of+Concepts.markdown
Upcoming+tutorials.markdown
Deploying+your+application.markdown
Mac+OS+X+deployment.markdown
Windows+deployment.markdown