docs: remove app dev manual and plugin writer's guide

They have moved to gst-docs and will be maintained there in future.
Tim-Philipp Müller 2016-11-01 17:35:18 +00:00
parent b6e263d753
commit 81a69d956d
101 changed files with 2 additions and 48620 deletions


@@ -1057,7 +1057,6 @@ tests/examples/adapter/Makefile
tests/examples/controller/Makefile
tests/examples/stepping/Makefile
tests/examples/helloworld/Makefile
tests/examples/manual/Makefile
tests/examples/memory/Makefile
tests/examples/netclock/Makefile
tests/examples/ptp/Makefile
@@ -1073,8 +1072,6 @@ docs/gst/Makefile
docs/gst/gstreamer.types
docs/libs/Makefile
docs/plugins/Makefile
docs/manual/Makefile
docs/pwg/Makefile
docs/slides/Makefile
docs/xsl/Makefile
docs/version.entities


@@ -1,5 +1,5 @@
if ENABLE_DOCBOOK
SUBDIRS_DOCBOOK = faq manual pwg
SUBDIRS_DOCBOOK = faq
else
SUBDIRS_DOCBOOK =
endif
@@ -17,7 +17,7 @@ endif
BUILT_SOURCES = version.entities
SUBDIRS = design gst libs $(PLUGIN_DOCS_DIRS) $(SUBDIRS_DOCBOOK)
DIST_SUBDIRS = design gst libs plugins faq manual pwg slides xsl
DIST_SUBDIRS = design gst libs plugins faq slides xsl
EXTRA_DIST = \
manuals.mak htmlinstall.mak \


@@ -1,7 +0,0 @@
Makefile
Makefile.in
.deps
build
html
*.pdf
*.ps


@@ -1,41 +0,0 @@
### this is the part you can customize if you need to
# parallel builds don't work, probably due to temporary files
MAKEFLAGS = -j1
# base name of doc
DOC = manual
# formats defined for upload-doc.mak
FORMATS=html ps pdf
# main xml file
MAIN = $(DOC).xml
# all xml sources
XML = $(notdir $(wildcard $(srcdir)/*.xml))
# base style sheet
CSS = base.css
# image sources
PNG_SRC = $(notdir $(wildcard $(srcdir)/*.png))
FIG_SRC = $(notdir $(wildcard $(srcdir)/*.fig))
# extra sources to copy in build directory
EXTRA_SRC =
### this is the generic bit and you shouldn't need to change this
# get the generic docbuilding Makefile stuff
include $(srcdir)/../manuals.mak
# get the generic upload target
include $(top_srcdir)/common/upload-doc.mak
### this is standard automake stuff
# package up all the source
EXTRA_DIST = $(SRC) README
# install documentation
manualdir = $(docdir)/$(DOC)
manual_DATA = $(PDF_DAT) $(PS_DAT)
include $(srcdir)/../htmlinstall.mak


@@ -1,61 +0,0 @@
Current requirements for building the docs :
--------------------------------------------
libxslt >= 1.0.6
libxml2 >= 2.4.12
These are not included with RH72. They are in debian. GDE has good rpms.
To build pdf's from xslt stuff, you need xmltex and (on redhat)
passivetex. They are not known to have been built on either redhat or
debian yet though.
Wingo's new comments on the doc building :
------------------------------------------
* Well he should add them soon here since he overhauled it. And did a good
job on it too ;)
Thomas's new comments on the doc building :
-------------------------------------------
* originally the manual was written with DocBook 3.0 in mind, which
supported the graphic tag. That is now deprecated, so I changed it to
the new mediaobject tag set.
* eps files in images/ should be generated from the makefile. You need to
have fig2dev installed for that.
Ensonic's comments on the doc build system :
--------------------------------------------
In case you'd like to share files between the manual and the pwg: it's
not trivial.
Before anything is done, the build system copies all xml files into the build
subdir, and this breaks including shared docs via entities.
The examples should be updated in the xml. We run a Perl script in
tests/examples/manual that extracts them.
Wtay's original comments :
--------------------------
For now use:
db2html gstreamer-manual
(On debian, db2html is in the cygnus-stylesheets package)
You will need the png support for docbook (see GNOME documentation project)
convert the fig images to png with:
fig2dev -L png -s 16 fig/<input file>.fig images/<input file>.png
Put a link in the gstreamer-manual directory with
ln -s ../images gstreamer-manual/images
point your browser to gstreamer-manual/gstreamer.html
Fix typing errors and correct bad English.
Let me know about the stuff that needs some more explanation.
Let me know about the structure of the document.


@@ -1,242 +0,0 @@
<chapter id="chapter-autoplugging">
<title>Autoplugging</title>
<para>
In <xref linkend="chapter-helloworld"/>, you've learned to build a
simple media player for Ogg/Vorbis files. By using alternative elements,
you are able to build media players for other media types, such as
Ogg/Speex, MP3 or even video formats. However, you would rather
build an application that can automatically detect the media type
of a stream and generate the best possible pipeline
by looking at all the elements available on a system. This process is called
autoplugging, and &GStreamer; contains high-quality autopluggers. If
you're looking for an autoplugger, don't read any further and go to
<xref linkend="chapter-playback-components"/>. This chapter will explain the
<emphasis>concept</emphasis> of autoplugging and typefinding. It will
explain what systems &GStreamer; includes to dynamically detect the
type of a media stream, and how to generate a pipeline of decoder
elements to play back this media. The same principles can also be used
for transcoding. Because this concept is fully dynamic,
&GStreamer; can automatically be extended to support new media types
without any adaptations to its autopluggers.
</para>
<para>
We will first introduce the concept of media types as a dynamic and
extensible way of identifying media streams. After that, we will introduce
the concept of typefinding to find the type of a media stream. Lastly,
we will explain how autoplugging and the &GStreamer; registry can be
used to set up a pipeline that will convert media from one media type to
another, for example for media decoding.
</para>
<sect1 id="section-media">
<title>Media types as a way to identify streams</title>
<para>
We have previously introduced the concept of capabilities as a way
for elements (or, rather, pads) to agree on a media type when
streaming data from one element to the next (see <xref
linkend="section-caps"/>). We have explained that a capability is
a combination of a media type and a set of properties. For most
container formats (those are the files that you will find on your
hard disk; Ogg, for example, is a container format), no properties
are needed to describe the stream. Only a media type is needed. A
full list of media types and accompanying properties can be found
in <ulink type="http"
url="http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/section-types-definitions.html">the
Plugin Writer's Guide</ulink>.
</para>
<para>
An element must associate a media type with its source and sink pads
when it is loaded into the system. &GStreamer; knows about the
different elements and what type of data they expect and emit through
the &GStreamer; registry. This allows for very dynamic and extensible
element creation as we will see.
</para>
<para>
In <xref linkend="chapter-helloworld"/>, we've learned to build a
music player for Ogg/Vorbis files. Let's look at the media types
associated with each pad in this pipeline. <xref
linkend="section-mime-img"/> shows what media type belongs to each
pad in this pipeline.
</para>
<figure float="1" id="section-mime-img">
<title>The Hello world pipeline with media types</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/mime-world.&image;" format="&IMAGE;"/>
</imageobject>
</mediaobject>
</figure>
<para>
Now that we have an idea how &GStreamer; identifies known media
streams, we can look at methods &GStreamer; uses to setup pipelines
for media handling and for media type detection.
</para>
</sect1>
<sect1 id="section-typefinding">
<title>Media stream type detection</title>
<para>
Usually, when loading a media stream, the type of the stream is not
known. This means that before we can choose a pipeline to decode the
stream, we first need to detect the stream type. &GStreamer; uses the
concept of typefinding for this. Typefinding is a normal part of a
pipeline: it will read data for as long as the type of a stream is
unknown. During this period, it will provide data to all plugins
that implement a typefinder. When one of the typefinders recognizes
the stream, the typefind element will emit a signal and act as a
passthrough module from that point on. If no type was found, it will
emit an error and further media processing will stop.
</para>
<para>
Once the typefind element has found a type, the application can
use this to plug together a pipeline to decode the media stream.
This will be discussed in the next section.
</para>
<para>
Plugins in &GStreamer; can, as mentioned before, implement typefinder
functionality. A plugin implementing this functionality will submit
a media type, optionally a set of file extensions commonly used for this
media type, and a typefind function. Once this typefind function inside
the plugin is called, the plugin will check whether the data in this media
stream matches a specific pattern that identifies that media type. If it
does, it will notify the typefind element of
this fact, telling it which media type was recognized and how certain we
are that this stream is indeed of that media type. Once this run has been
completed for all plugins implementing typefind functionality, the
typefind element will tell the application what kind of media stream
it thinks it has recognized.
</para>
<para>
The following code should explain how to use the typefind element.
It will print the detected media type, or report that no media type
could be detected. The next section will introduce more useful behaviours,
such as plugging together a decoding pipeline.
</para>
<programlisting><!-- example-begin typefind.c a -->
#include &lt;gst/gst.h&gt;
<!-- example-end typefind.c a -->
[.. my_bus_callback goes here ..]<!-- example-begin typefind.c b --><!--
static gboolean
my_bus_callback (GstBus *bus,
    GstMessage *message,
    gpointer data)
{
  GMainLoop *loop = data;

  switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_ERROR: {
      GError *err;
      gchar *debug;

      gst_message_parse_error (message, &amp;err, &amp;debug);
      g_print ("Error: %s\n", err-&gt;message);
      g_error_free (err);
      g_free (debug);

      g_main_loop_quit (loop);
      break;
    }
    case GST_MESSAGE_EOS:
      /* end-of-stream */
      g_main_loop_quit (loop);
      break;
    default:
      break;
  }

  /* remove from queue */
  return TRUE;
}
--><!-- example-end typefind.c b -->
<!-- example-begin typefind.c c -->
static gboolean
idle_exit_loop (gpointer data)
{
  g_main_loop_quit ((GMainLoop *) data);

  /* once */
  return FALSE;
}

static void
cb_typefound (GstElement *typefind,
    guint probability,
    GstCaps *caps,
    gpointer data)
{
  GMainLoop *loop = data;
  gchar *type;

  type = gst_caps_to_string (caps);
  g_print ("Media type %s found, probability %d%%\n", type, probability);
  g_free (type);

  /* since we connect to a signal in the pipeline thread context, we need
   * to set an idle handler to exit the main loop in the mainloop context.
   * Normally, your app should not need to worry about such things. */
  g_idle_add (idle_exit_loop, loop);
}

gint
main (gint argc,
    gchar *argv[])
{
  GMainLoop *loop;
  GstElement *pipeline, *filesrc, *typefind, *fakesink;
  GstBus *bus;

  /* init GStreamer */
  gst_init (&amp;argc, &amp;argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* check args */
  if (argc != 2) {
    g_print ("Usage: %s &lt;filename&gt;\n", argv[0]);
    return -1;
  }

  /* create a new pipeline to hold the elements */
  pipeline = gst_pipeline_new ("pipe");
  bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
  gst_bus_add_watch (bus, my_bus_callback, loop);
  gst_object_unref (bus);

  /* create file source and typefind element */
  filesrc = gst_element_factory_make ("filesrc", "source");
  g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
  typefind = gst_element_factory_make ("typefind", "typefinder");
  g_signal_connect (typefind, "have-type", G_CALLBACK (cb_typefound), loop);
  fakesink = gst_element_factory_make ("fakesink", "sink");

  /* setup */
  gst_bin_add_many (GST_BIN (pipeline), filesrc, typefind, fakesink, NULL);
  gst_element_link_many (filesrc, typefind, fakesink, NULL);
  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
  g_main_loop_run (loop);

  /* unset */
  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (pipeline));

  return 0;
}
<!-- example-end typefind.c c --></programlisting>
<para>
Once a media type has been detected, you can connect an element (e.g. a
demuxer or decoder) to the source pad of the typefind element, and
decoding of the media stream will start right after.
</para>
</sect1>
<sect1 id="section-dynamic">
<title>Dynamically autoplugging a pipeline</title>
<para>
See <xref linkend="chapter-playback-components"/> for using the high
level object that you can use to dynamically construct pipelines.
</para>
</sect1>
</chapter>


@@ -1,450 +0,0 @@
<chapter id="chapter-buffering">
<title>Buffering</title>
<para>
The purpose of buffering is to accumulate enough data in a pipeline so that
playback can occur smoothly and without interruptions. It is typically done
when reading from a (slow), non-live network source but can also be
used for live sources.
</para>
<para>
&GStreamer; provides support for the following use cases:
<itemizedlist>
<listitem>
<para>
Buffering up to a specific amount of data, in memory, before starting
playback so that network fluctuations are minimized.
See <xref linkend="section-buffering-stream"/>.
</para>
</listitem>
<listitem>
<para>
Download of the network file to a local disk with fast seeking in the
downloaded data. This is similar to the QuickTime/YouTube players.
See <xref linkend="section-buffering-download"/>.
</para>
</listitem>
<listitem>
<para>
Caching of (semi)-live streams to a local, on disk, ringbuffer with
seeking in the cached area. This is similar to TiVo-like timeshifting.
See <xref linkend="section-buffering-timeshift"/>.
</para>
</listitem>
</itemizedlist>
</para>
<para>
&GStreamer; can provide the application with progress reports about the
current buffering state as well as let the application decide on how
to buffer and when the buffering stops.
</para>
<para>
In the simplest case, the application has to listen for BUFFERING
messages on the bus. If the percent indicator inside the BUFFERING message
is smaller than 100, the pipeline is buffering. When a message is
received with 100 percent, buffering is complete. In the buffering state,
the application should keep the pipeline in the PAUSED state. When buffering
completes, it can put the pipeline (back) in the PLAYING state.
</para>
<para>
What follows is an example of how the message handler could deal with
the BUFFERING messages. We will see more advanced methods in
<xref linkend="section-buffering-strategies"/>.
</para>
<programlisting>
<![CDATA[
[...]
  switch (GST_MESSAGE_TYPE (message)) {
    case GST_MESSAGE_BUFFERING:{
      gint percent;

      /* no state management needed for live pipelines */
      if (is_live)
        break;

      gst_message_parse_buffering (message, &percent);

      if (percent == 100) {
        /* a 100% message means buffering is done */
        buffering = FALSE;
        /* if the desired state is playing, go back */
        if (target_state == GST_STATE_PLAYING) {
          gst_element_set_state (pipeline, GST_STATE_PLAYING);
        }
      } else {
        /* buffering busy */
        if (!buffering && target_state == GST_STATE_PLAYING) {
          /* we were not buffering but PLAYING, PAUSE the pipeline. */
          gst_element_set_state (pipeline, GST_STATE_PAUSED);
        }
        buffering = TRUE;
      }
      break;
    }
    case ...
[...]
]]>
</programlisting>
<sect1 id="section-buffering-stream">
<title>Stream buffering </title>
<programlisting>
  +---------+     +---------+     +-------+
  | httpsrc |     | buffer  |     | demux |
  |        src - sink      src - sink     ....
  +---------+     +---------+     +-------+
</programlisting>
<para>
In this case we are reading from a slow network source into a buffer
element (such as queue2).
</para>
<para>
The buffer element has a low and high watermark expressed in bytes. The
buffer uses the watermarks as follows:
</para>
<itemizedlist>
<listitem>
<para>
The buffer element will post BUFFERING messages until the high
watermark is hit. This instructs the application to keep the pipeline
PAUSED, which will eventually block the srcpad from pushing while
data is prerolled in the sinks.
</para>
</listitem>
<listitem>
<para>
When the high watermark is hit, a BUFFERING message with 100% will be
posted, which instructs the application to continue playback.
</para>
</listitem>
<listitem>
<para>
When the low watermark is hit during playback, the queue will start
posting BUFFERING messages again, making the application PAUSE the
pipeline until the high watermark is hit again. This is called
the rebuffering stage.
</para>
</listitem>
<listitem>
<para>
During playback, the queue level will fluctuate between the high and
the low watermark as a way to compensate for network irregularities.
</para>
</listitem>
</itemizedlist>
<para>
This buffering method is usable when the demuxer operates in push mode.
Seeking in the stream requires the seek to happen in the network source.
It is mostly desirable when the total duration of the file is not known,
such as in live streaming or when efficient seeking is not
possible/required.
</para>
<para>
The problem is configuring a good low and high watermark. Here are some
ideas:
</para>
<itemizedlist>
<listitem>
<para>
It is possible to measure the network bandwidth and configure the
low/high watermarks in such a way that buffering takes a fixed
amount of time.
</para>
<para>
The queue2 element in &GStreamer; core has the max-size-time property
that, together with the use-rate-estimate property, does exactly
that. Also the playbin buffer-duration property uses the rate estimate
to scale the amount of data that is buffered.
</para>
</listitem>
<listitem>
<para>
Based on the codec bitrate, it is also possible to set the watermarks
in such a way that a fixed amount of data is buffered before playback
starts. Normally, the buffering element doesn't know about the
bitrate of the stream but it can get this with a query.
</para>
</listitem>
<listitem>
<para>
Start with a fixed amount of bytes, measure the time between
rebuffering and increase the queue size until the time between
rebuffering is within the application's chosen limits.
</para>
</listitem>
</itemizedlist>
<para>
The buffering element can be inserted anywhere in the pipeline. You could,
for example, insert the buffering element before a decoder. This would
make it possible to set the low/high watermarks based on time.
</para>
<para>
The buffering flag on playbin performs buffering on the parsed data.
Another advantage of doing the buffering at a later stage is that you can
let the demuxer operate in pull mode. When reading data from a slow
network drive (with filesrc) this can be an interesting way to buffer.
</para>
</sect1>
<sect1 id="section-buffering-download">
<title>Download buffering </title>
<programlisting>
  +---------+     +---------+     +-------+
  | httpsrc |     | buffer  |     | demux |
  |        src - sink      src - sink     ....
  +---------+     +----|----+     +-------+
                       V
                      file
</programlisting>
<para>
If we know the server is streaming a fixed-length file to the client,
the application can choose to download the entire file on disk. The
buffer element will provide a push or pull based srcpad to the demuxer
to navigate in the downloaded file.
</para>
<para>
This mode is only suitable when the client can determine the length of
the file on the server.
</para>
<para>
In this case, buffering messages will be emitted as usual when the
requested range is not within the downloaded area + buffersize. The
buffering message will also contain an indication that incremental
download is being performed. This flag can be used to let the application
control the buffering in a more intelligent way, using the BUFFERING
query, for example. See <xref linkend="section-buffering-strategies"/>.
</para>
</sect1>
<sect1 id="section-buffering-timeshift">
<title>Timeshift buffering </title>
<programlisting>
  +---------+     +---------+     +-------+
  | httpsrc |     | buffer  |     | demux |
  |        src - sink      src - sink     ....
  +---------+     +----|----+     +-------+
                       V
                file-ringbuffer
</programlisting>
<para>
In this mode, a fixed size ringbuffer is kept to download the server
content. This allows for seeking in the buffered data. Depending on the
size of the ringbuffer one can seek further back in time.
</para>
<para>
This mode is suitable for all live streams. As with the incremental
download mode, buffering messages are emitted along with an indication
that timeshifting download is in progress.
</para>
</sect1>
<sect1 id="section-buffering-live">
<title>Live buffering </title>
<para>
In live pipelines we usually introduce some fixed latency between the
capture and the playback elements. This latency can be introduced by
a queue (such as a jitterbuffer) or by other means (in the audiosink).
</para>
<para>
Buffering messages can be emitted in those live pipelines as well and
serve as an indication to the user of the latency buffering. The
application usually does not react to these buffering messages with a
state change.
</para>
</sect1>
<sect1 id="section-buffering-strategies">
<title>Buffering strategies </title>
<para>
What follows are some ideas for implementing different buffering
strategies based on the buffering messages and buffering query.
</para>
<sect2 id="section-buffering-norebuffer">
<title>No-rebuffer strategy </title>
<para>
We would like to buffer enough data in the pipeline so that playback
continues without interruptions. What we need to know to implement
this is the total remaining playback time in the file and the
total remaining download time. If the buffering time is less than the
playback time, we can start playback without interruptions.
</para>
<para>
We have all this information available with the DURATION, POSITION and
BUFFERING queries. We need to periodically execute the buffering query
to get the current buffering status. We also need to have a large
enough buffer to hold the complete file, worst case. It is best to
use this buffering strategy with download buffering (see
<xref linkend="section-buffering-download"/>).
</para>
<para>
This is what the code would look like:
</para>
<programlisting>
<!-- example-begin norebuffer.c a -->
<![CDATA[
#include <gst/gst.h>

GstState target_state;
static gboolean is_live;
static gboolean is_buffering;

static gboolean
buffer_timeout (gpointer data)
{
  GstElement *pipeline = data;
  GstQuery *query;
  gboolean busy;
  gint percent;
  gint64 estimated_total;
  gint64 position, duration;
  guint64 play_left;

  query = gst_query_new_buffering (GST_FORMAT_TIME);

  if (!gst_element_query (pipeline, query)) {
    gst_query_unref (query);
    return TRUE;
  }

  gst_query_parse_buffering_percent (query, &busy, &percent);
  gst_query_parse_buffering_range (query, NULL, NULL, NULL, &estimated_total);
  gst_query_unref (query);

  if (estimated_total == -1)
    estimated_total = 0;

  /* calculate the remaining playback time */
  if (!gst_element_query_position (pipeline, GST_FORMAT_TIME, &position))
    position = -1;
  if (!gst_element_query_duration (pipeline, GST_FORMAT_TIME, &duration))
    duration = -1;

  if (duration != -1 && position != -1)
    play_left = GST_TIME_AS_MSECONDS (duration - position);
  else
    play_left = 0;

  g_message ("play_left %" G_GUINT64_FORMAT ", estimated_total %"
      G_GINT64_FORMAT ", percent %d", play_left, estimated_total, percent);

  /* we are buffering or the estimated download time is bigger than the
   * remaining playback time. We keep buffering. */
  is_buffering = (busy || estimated_total * 1.1 > play_left);

  if (!is_buffering)
    gst_element_set_state (pipeline, target_state);

  return is_buffering;
}

static void
on_message_buffering (GstBus *bus, GstMessage *message, gpointer user_data)
{
  GstElement *pipeline = user_data;
  gint percent;

  /* no state management needed for live pipelines */
  if (is_live)
    return;

  gst_message_parse_buffering (message, &percent);

  if (percent < 100) {
    /* buffering busy */
    if (!is_buffering) {
      is_buffering = TRUE;
      if (target_state == GST_STATE_PLAYING) {
        /* we were not buffering but PLAYING, PAUSE the pipeline. */
        gst_element_set_state (pipeline, GST_STATE_PAUSED);
      }
    }
  }
}

static void
on_message_async_done (GstBus *bus, GstMessage *message, gpointer user_data)
{
  GstElement *pipeline = user_data;

  if (!is_buffering)
    gst_element_set_state (pipeline, target_state);
  else
    g_timeout_add (500, buffer_timeout, pipeline);
}

gint
main (gint argc,
    gchar *argv[])
{
  GstElement *pipeline;
  GMainLoop *loop;
  GstBus *bus;
  GstStateChangeReturn ret;

  /* init GStreamer */
  gst_init (&argc, &argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* make sure we have a URI */
  if (argc != 2) {
    g_print ("Usage: %s <URI>\n", argv[0]);
    return -1;
  }

  /* set up */
  pipeline = gst_element_factory_make ("playbin", "pipeline");
  g_object_set (G_OBJECT (pipeline), "uri", argv[1], NULL);
  g_object_set (G_OBJECT (pipeline), "flags", 0x697, NULL);

  bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
  gst_bus_add_signal_watch (bus);
  g_signal_connect (bus, "message::buffering",
      (GCallback) on_message_buffering, pipeline);
  g_signal_connect (bus, "message::async-done",
      (GCallback) on_message_async_done, pipeline);
  gst_object_unref (bus);

  is_buffering = FALSE;
  target_state = GST_STATE_PLAYING;
  ret = gst_element_set_state (pipeline, GST_STATE_PAUSED);

  switch (ret) {
    case GST_STATE_CHANGE_SUCCESS:
      is_live = FALSE;
      break;
    case GST_STATE_CHANGE_FAILURE:
      g_warning ("failed to PAUSE");
      return -1;
    case GST_STATE_CHANGE_NO_PREROLL:
      is_live = TRUE;
      break;
    default:
      break;
  }

  /* now run */
  g_main_loop_run (loop);

  /* also clean up */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (pipeline));
  g_main_loop_unref (loop);

  return 0;
}
]]>
<!-- example-end norebuffer.c a -->
</programlisting>
<para>
See how we set the pipeline to the PAUSED state first. We will receive
buffering messages during the preroll state when buffering is needed.
When we are prerolled (on_message_async_done), we check whether buffering is
going on; if not, we start playback. If buffering was going on, we start
a timeout to poll the buffering state. If the estimated time to download
is less than the remaining playback time, we start playback.
</para>
</sect2>
</sect1>
</chapter>


@@ -1,295 +0,0 @@
<chapter id="chapter-clocks">
<title>Clocks and synchronization in &GStreamer;</title>
<para>
When playing complex media, each sound and video sample must be played in a
specific order at a specific time. For this purpose, GStreamer provides a
synchronization mechanism.
</para>
<para>
&GStreamer; provides support for the following use cases:
<itemizedlist>
<listitem>
<para>
Non-live sources with access faster than playback rate. This is
the case where one is reading media from a file and playing it
back in a synchronized fashion. In this case, multiple streams need
to be synchronized, like audio, video and subtitles.
</para>
</listitem>
<listitem>
<para>
Capture and synchronized muxing/mixing of media from multiple live
sources. This is a typical use case where you record audio and
video from a microphone/camera and mux it into a file for
storage.
</para>
</listitem>
<listitem>
<para>
Streaming from (slow) network streams with buffering. This is the
typical web streaming case where you access content from a streaming
server with http.
</para>
</listitem>
<listitem>
<para>
Capture from a live source and playback to a live sink with
configurable latency. This is used when, for example, capturing from
a camera, applying an effect and displaying the result. It is also used
when streaming low-latency content over a network with UDP.
</para>
</listitem>
<listitem>
<para>
Simultaneous live capture and playback from prerecorded content.
This is used in audio recording cases where you play previously
recorded audio and record new samples; the purpose is to have the
new audio perfectly in sync with the previously recorded data.
</para>
</listitem>
</itemizedlist>
</para>
<para>
&GStreamer; uses a <classname>GstClock</classname> object, buffer
timestamps and a SEGMENT event to synchronize streams in a pipeline
as we will see in the next sections.
</para>
<sect1 id="section-clock-time-types" xreflabel="Clock running-time">
<title>Clock running-time </title>
<para>
In a typical computer, there are many sources that can be used as a
time source, e.g., the system time, soundcards, CPU performance
counters, ... For this reason, there are many
<classname>GstClock</classname> implementations available in &GStreamer;.
The clock time doesn't always start from 0 or from some known value.
Some clocks start counting from some known start date, others start
counting from the last reboot, etc.
</para>
<para>
A <classname>GstClock</classname> returns the
<emphasis role="strong">absolute-time</emphasis>
according to that clock with <function>gst_clock_get_time ()</function>.
The absolute-time (or clock time) of a clock is monotonically increasing.
From the absolute-time, a <emphasis role="strong">running-time</emphasis>
is calculated as the difference between the absolute-time and a previous
snapshot of the absolute-time, called the
<emphasis role="strong">base-time</emphasis>. So:
</para>
<para>
running-time = absolute-time - base-time
</para>
<para>
A &GStreamer; <classname>GstPipeline</classname> object maintains a
<classname>GstClock</classname> object and a base-time when it goes
to the PLAYING state. The pipeline gives a handle to the selected
<classname>GstClock</classname> to each element in the pipeline along
with selected base-time. The pipeline will select a base-time in such
a way that the running-time reflects the total time spent in the
PLAYING state. As a result, when the pipeline is PAUSED, the
running-time stands still.
</para>
<para>
Because all objects in the pipeline have the same clock and base-time,
they can thus all calculate the running-time according to the pipeline
clock.
</para>
</sect1>
<sect1 id="section-buffer-running-time" xreflabel="Buffer running-time">
<title>Buffer running-time</title>
<para>
To calculate a buffer running-time, we need a buffer timestamp and
the SEGMENT event that preceded the buffer. First we can convert
the SEGMENT event into a <classname>GstSegment</classname> object
and then we can use the
<function>gst_segment_to_running_time ()</function> function to
perform the calculation of the buffer running-time.
</para>
<para>
Synchronization is now a matter of making sure that a buffer with a
certain running-time is played when the clock reaches the same
running-time. Usually this task is done by sink elements. Sink also
have to take into account the latency configured in the pipeline and
add this to the buffer running-time before synchronizing to the
pipeline clock.
</para>
<para>
Non-live sources timestamp buffers with a running-time starting
from 0. After a flushing seek, they will produce buffers again
from a running-time of 0.
</para>
<para>
Live sources need to timestamp buffers with a running-time matching
the pipeline running-time when the first byte of the buffer was
captured.
</para>
</sect1>
<sect1 id="section-buffer-stream-time" xreflabel="Buffer stream-time">
<title>Buffer stream-time</title>
<para>
The buffer stream-time, also known as the position in the stream,
is calculated from the buffer timestamps and the preceding SEGMENT
event. It represents the time inside the media as a value between
0 and the total duration of the media.
</para>
<para>
The stream-time is used to:
<itemizedlist>
<listitem>
<para>
Report the current position in the stream with the POSITION
query.
</para>
</listitem>
<listitem>
<para>
Specify the position in seek events and queries.
</para>
</listitem>
<listitem>
<para>
Synchronize controlled values.
</para>
</listitem>
</itemizedlist>
</para>
<para>
The stream-time is never used to synchronize streams; this is only
done with the running-time.
</para>
</sect1>
<sect1 id="section-time-overview" xreflabel="Time overview">
<title>Time overview</title>
<para>
Here is an overview of the various timelines used in &GStreamer;.
</para>
<para>
The image below represents the different times in the pipeline when
playing a 100ms sample and repeating the part between 50ms and
100ms.
</para>
<figure float="1" id="chapter-clock-img">
<title>&GStreamer; clock and various times</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/clocks.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
You can see how the running-time of a buffer always increments
monotonically along with the clock-time. Buffers are played when their
running-time is equal to the clock-time - base-time. The stream-time
represents the position in the stream and jumps backwards when
repeating.
</para>
</sect1>
<sect1 id="section-clocks-providers">
<title>Clock providers</title>
<para>
A clock provider is an element in the pipeline that can provide
a <classname>GstClock</classname> object. The clock object needs to
report an absolute-time that is monotonically increasing when the
element is in the PLAYING state. It is allowed to pause the clock
while the element is PAUSED.
</para>
<para>
Clock providers exist because they play back media at some rate, and
this rate is not necessarily the same as the system clock rate. For
      example, a soundcard may play back at 44.1 kHz, but that doesn't mean
      that after <emphasis>exactly</emphasis> 1 second <emphasis>according
      to the system clock</emphasis>, the soundcard has played back 44,100
      samples. This is only true by approximation. In fact, the audio
device has an internal clock based on the number of samples played
that we can expose.
</para>
<para>
If an element with an internal clock needs to synchronize, it needs
to estimate when a time according to the pipeline clock will take
place according to the internal clock. To estimate this, it needs
to slave its clock to the pipeline clock.
</para>
<para>
If the pipeline clock is exactly the internal clock of an element,
the element can skip the slaving step and directly use the pipeline
clock to schedule playback. This can be both faster and more
accurate.
Therefore, generally, elements with an internal clock like audio
input or output devices will be a clock provider for the pipeline.
</para>
<para>
When the pipeline goes to the PLAYING state, it will go over all
elements in the pipeline from sink to source and ask each element
      if it can provide a clock. The last element that can provide a
clock will be used as the clock provider in the pipeline.
This algorithm prefers a clock from an audio sink in a typical
playback pipeline and a clock from source elements in a typical
capture pipeline.
</para>
<para>
There exist some bus messages to let you know about the clock and
clock providers in the pipeline. You can see what clock is selected
in the pipeline by looking at the NEW_CLOCK message on the bus.
When a clock provider is removed from the pipeline, a CLOCK_LOST
message is posted and the application should go to PAUSED and back
to PLAYING to select a new clock.
</para>
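    <para>
      In a bus handler, reacting to a lost clock could look roughly like
      this (a sketch; <function>pipeline</function> is assumed to be your
      pipeline element):
    </para>
    <programlisting>
case GST_MESSAGE_CLOCK_LOST:
  /* the current clock became unusable; briefly going to PAUSED and
   * back to PLAYING makes the pipeline select a new clock */
  gst_element_set_state (pipeline, GST_STATE_PAUSED);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  break;
    </programlisting>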
</sect1>
<sect1 id="section-clocks-latency">
<title>Latency</title>
<para>
The latency is the time it takes for a sample captured at timestamp X
to reach the sink. This time is measured against the clock in the
pipeline. For pipelines where the only elements that synchronize against
the clock are the sinks, the latency is always 0 since no other element
is delaying the buffer.
</para>
<para>
For pipelines with live sources, a latency is introduced, mostly because
of the way a live source works. Consider an audio source, it will start
capturing the first sample at time 0. If the source pushes buffers with
      44100 samples at a time at 44100 Hz, it will have collected the buffer at
second 1. Since the timestamp of the buffer is 0 and the time of the
clock is now >= 1 second, the sink will drop this buffer because it is
too late. Without any latency compensation in the sink, all buffers will
be dropped.
</para>
<sect2 id="section-latency-compensation">
<title>Latency compensation</title>
<para>
Before the pipeline goes to the PLAYING state, it will, in addition to
selecting a clock and calculating a base-time, calculate the latency
in the pipeline. It does this by doing a LATENCY query on all the sinks
in the pipeline. The pipeline then selects the maximum latency in the
pipeline and configures this with a LATENCY event.
</para>
<para>
All sink elements will delay playback by the value in the LATENCY event.
      Since all sinks delay by the same amount of time, they will be
      in sync relative to each other.
</para>
</sect2>
<sect2 id="section-latency-dynamic">
<title>Dynamic Latency</title>
<para>
Adding/removing elements to/from a pipeline or changing element
properties can change the latency in a pipeline. An element can
request a latency change in the pipeline by posting a LATENCY
message on the bus. The application can then decide to query and
redistribute a new latency or not. Changing the latency in a
pipeline might cause visual or audible glitches and should
therefore only be done by the application when it is allowed.
</para>
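      <para>
        A typical way for an application to honour such a request is to let
        the pipeline recalculate and redistribute the latency (a sketch,
        inside a bus handler):
      </para>
      <programlisting>
case GST_MESSAGE_LATENCY:
  /* an element changed its latency requirements; query the new
   * latency and redistribute it with a LATENCY event */
  gst_bin_recalculate_latency (GST_BIN (pipeline));
  break;
      </programlisting>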
</sect2>
</sect1>
</chapter>

File diff suppressed because it is too large


@@ -1,102 +0,0 @@
<chapter id="chapter-dparams">
<title>Dynamic Controllable Parameters</title>
<sect1 id="section-dparams-getting-started">
<title>Getting Started</title>
<para>
The controller subsystem offers a lightweight way to adjust gobject
properties over stream-time. Normally these properties are changed using
<function>g_object_set()</function>. Timing those calls reliably so that
the changes affect certain stream times is close to impossible. The
controller takes time into account. It works by attaching control-sources
to properties using control-bindings. Control-sources provide values for a
given time-stamp that are usually in the range of 0.0 to 1.0.
Control-bindings map the control-value to a gobject property they are bound to
- converting the type and scaling to the target property value range.
      At run-time the elements continuously pull value changes for the current
stream-time to update the gobject properties. GStreamer includes a few
different control-sources and control-bindings already, but applications can
define their own by sub-classing from the respective base classes.
</para>
<para>
      Most parts of the controller mechanism are implemented in GstObject. Also the
base classes for control-sources and control-bindings are included in the core
library. The existing implementations are contained within the
<filename>gstcontroller</filename> library.
You need to include the header in your application's source file:
</para>
<programlisting>
...
#include &lt;gst/gst.h&gt;
#include &lt;gst/controller/gstinterpolationcontrolsource.h&gt;
#include &lt;gst/controller/gstdirectcontrolbinding.h&gt;
...
</programlisting>
<para>
Your application should link to the shared library
      <filename>gstreamer-controller</filename>. One can get the required flags
      for compiler and linker by using pkg-config for gstreamer-controller-1.0.
</para>
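    <para>
      For example, for a single-file program (the file name is just an
      illustration):
    </para>
    <programlisting>
gcc -o app app.c `pkg-config --cflags --libs gstreamer-controller-1.0`
    </programlisting>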
</sect1>
<sect1 id="section-dparams-parameters">
<title>Setting up parameter control</title>
<para>
If we have our pipeline set up and want to control some parameters, we first
      need to create a control-source. Let's use an interpolation control-source:
</para>
<programlisting>
csource = gst_interpolation_control_source_new ();
g_object_set (csource, "mode", GST_INTERPOLATION_MODE_LINEAR, NULL);
</programlisting>
<para>
Now we need to attach the control-source to the gobject property. This is done
with a control-binding. One control source can be attached to several object
properties (even in different objects) using separate control-bindings.
</para>
<programlisting>
gst_object_add_control_binding (object, gst_direct_control_binding_new (object, "prop1", csource));
</programlisting>
<para>
      This type of control-source takes new property values from a list of
      time-stamped parameter changes. The source can e.g. fill gaps by smoothing
      parameter changes. This behavior can be configured by setting the mode
      property of the control-source. Other control-sources produce a stream of
      values, e.g. by calling the <function>sin()</function> function; they have
      parameters to control e.g. the frequency. As control-sources are GstObjects
      too, one can attach control-sources to these properties as well.
</para>
<para>
Now we can set some control points. These are time-stamped gdouble values and
are usually in the range of 0.0 to 1.0. A value of 1.0 is later mapped to the
      maximum value in the target property's value range.
      The values become active when their timestamp is reached, but they stay
      in the list: if the pipeline runs a loop (e.g. using a segment seek),
      the control-curve gets repeated as well.
</para>
<programlisting>
GstTimedValueControlSource *tv_csource = (GstTimedValueControlSource *)csource;
gst_timed_value_control_source_set (tv_csource, 0 * GST_SECOND, 0.0);
gst_timed_value_control_source_set (tv_csource, 1 * GST_SECOND, 1.0);
</programlisting>
<para>
      Now everything is ready to play. If the control-source is e.g. bound to a
      volume property, we will hear a fade-in over 1 second. One word of caution:
      the volume element that comes with &GStreamer; has a value range of 0.0 to 4.0
      on its volume property. If the above control-source is attached to that
      property, the volume will ramp up to 400%!
</para>
<para>
      One final note: the controller subsystem has a built-in live-mode. Even though
      a property has a control-source assigned, one can still change the GObject
      property through <function>g_object_set()</function>.
This is highly useful when binding the GObject properties to GUI widgets.
When the user adjusts the value with the widget, one can set the GObject
property and this remains active until the next programmed control-source
value overrides it. This also works with smoothed parameters. It does not
work for control-sources that constantly update the property (e.g. the
lfo_control_source).
</para>
</sect1>
</chapter>


@@ -1,82 +0,0 @@
<chapter id="chapter-interfaces">
<title>Interfaces</title>
<para>
In <xref linkend="section-elements-properties"/>, you have learned how
to use <classname>GObject</classname> properties as a simple way to do
interaction between applications and elements. This method suffices for
    simple, straightforward settings, but fails for anything more complicated
than a getter and setter. For the more complicated use cases, &GStreamer;
uses interfaces based on the GObject <ulink type="http"
url="http://library.gnome.org/devel/gobject/stable/gtype-non-instantiable-classed.html"><classname>GTypeInterface</classname></ulink>
type.
</para>
<para>
Most of the interfaces handled here will not contain any example code.
See the API references for details. Here, we will just describe the
scope and purpose of each interface.
</para>
<sect1 id="section-interfaces-uri">
<title>The URI interface</title>
<para>
In all examples so far, we have only supported local files through the
<quote>filesrc</quote> element. &GStreamer;, obviously, supports many
more location sources. However, we don't want applications to need to
know any particular element implementation details, such as element
names for particular network source types and so on. Therefore, there
is a URI interface, which can be used to get the source element that
supports a particular URI type. There is no strict rule for URI naming,
but in general we follow naming conventions that others use, too. For
example, assuming you have the correct plugins installed, &GStreamer;
supports <quote>file:///&lt;path&gt;/&lt;file&gt;</quote>,
<quote>http://&lt;host&gt;/&lt;path&gt;/&lt;file&gt;</quote>,
<quote>mms://&lt;host&gt;/&lt;path&gt;/&lt;file&gt;</quote>, and so on.
</para>
<para>
In order to get the source or sink element supporting a particular URI,
use <function>gst_element_make_from_uri ()</function>, with the URI
type being either <classname>GST_URI_SRC</classname> for a source
element, or <classname>GST_URI_SINK</classname> for a sink element.
</para>
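    <para>
      For example (a sketch; the URI is illustrative and error handling is
      omitted):
    </para>
    <programlisting>
GstElement *source;
GError *error = NULL;

source = gst_element_make_from_uri (GST_URI_SRC,
    "file:///path/to/file.ogg", "source", &amp;error);
    </programlisting>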
<para>
You can convert filenames to and from URIs using GLib's
<function>g_filename_to_uri ()</function> and
      <function>g_filename_from_uri ()</function>.
</para>
</sect1>
<sect1 id="section-interfaces-colorbalance">
<title>The Color Balance interface</title>
<para>
The colorbalance interface is a way to control video-related properties
      on an element, such as brightness, contrast and so on. Its sole
reason for existence is that, as far as its authors know, there's no
way to dynamically register properties using
<classname>GObject</classname>.
</para>
<para>
The colorbalance interface is implemented by several plugins, including
xvimagesink and the Video4linux2 elements.
</para>
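    <para>
      As a sketch, assuming <function>element</function> implements the
      interface, the available channels can be listed and adjusted like this:
    </para>
    <programlisting>
#include &lt;gst/video/colorbalance.h&gt;

GstColorBalance *balance = GST_COLOR_BALANCE (element);
const GList *channels, *l;

/* list the available channels (brightness, contrast, ...) and set
 * each one to the middle of its range */
channels = gst_color_balance_list_channels (balance);
for (l = channels; l != NULL; l = l-&gt;next) {
  GstColorBalanceChannel *channel = l-&gt;data;

  gst_color_balance_set_value (balance, channel,
      (channel-&gt;min_value + channel-&gt;max_value) / 2);
}
    </programlisting>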
</sect1>
<sect1 id="section-interfaces-videooverlay">
<title>The Video Overlay interface</title>
<para>
The Video Overlay interface was created to solve the problem of embedding
      video streams in an application window. The application provides a
      window handle to the element implementing this interface, and
      the element will then draw on that window rather than creating
      a new toplevel window. This is useful to embed video in video players.
</para>
<para>
This interface is implemented by, amongst others, the Video4linux2
elements and by ximagesink, xvimagesink and sdlvideosink.
</para>
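    <para>
      Once you have a native window handle from your GUI toolkit, handing it
      to the element is a single call (a sketch; obtaining the handle is
      toolkit-specific):
    </para>
    <programlisting>
#include &lt;gst/video/videooverlay.h&gt;

/* window_handle is e.g. an X11 Window or a Win32 HWND obtained
 * from the toolkit before the pipeline starts rendering */
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (videosink),
    window_handle);
    </programlisting>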
</sect1>
</chapter>


@@ -1,196 +0,0 @@
<chapter id="chapter-metadata">
<title>Metadata</title>
<para>
&GStreamer; makes a clear distinction between two types of metadata, and
has support for both types. The first is stream tags, which describe the
content of a stream in a non-technical way. Examples include the author
of a song, the title of that very same song or the album it is a part of.
The other type of metadata is stream-info, which is a somewhat technical
description of the properties of a stream. This can include video size,
audio samplerate, codecs used and so on. Tags are handled using the
&GStreamer; tagging system. Stream-info can be retrieved from a
<classname>GstPad</classname> by getting the current (negotiated)
<classname>GstCaps</classname> for that pad.
</para>
<sect1 id="section-tags-read">
<title>Metadata reading</title>
<para>
Stream information can most easily be read by reading it from a
<classname>GstPad</classname>. This has already been discussed before
in <xref linkend="section-caps-metadata"/>. Therefore, we will skip
it here. Note that this requires access to all pads of which you
want stream information.
</para>
<para>
Tag reading is done through a bus in &GStreamer;, which has been
discussed previously in <xref linkend="chapter-bus"/>. You can
listen for <classname>GST_MESSAGE_TAG</classname> messages and handle
them as you wish.
</para>
<para>
Note, however, that the <classname>GST_MESSAGE_TAG</classname>
message may be fired multiple times in the pipeline. It is the
application's responsibility to put all those tags together and
display them to the user in a nice, coherent way. Usually, using
<function>gst_tag_list_merge ()</function> is a good enough way
of doing this; make sure to empty the cache when loading a new song,
or after every few minutes when listening to internet radio. Also,
make sure you use <classname>GST_TAG_MERGE_PREPEND</classname> as
merging mode, so that a new title (which came in later) has a
preference over the old one for display.
</para>
<para>
The following example will extract tags from a file and print them:
</para>
<programlisting>
/* compile with:
* gcc -o tags tags.c `pkg-config --cflags --libs gstreamer-1.0` */
#include &lt;gst/gst.h&gt;
static void
print_one_tag (const GstTagList * list, const gchar * tag, gpointer user_data)
{
int i, num;
num = gst_tag_list_get_tag_size (list, tag);
for (i = 0; i &lt; num; ++i) {
const GValue *val;
/* Note: when looking for specific tags, use the gst_tag_list_get_xyz() API,
* we only use the GValue approach here because it is more generic */
val = gst_tag_list_get_value_index (list, tag, i);
if (G_VALUE_HOLDS_STRING (val)) {
g_print ("\t%20s : %s\n", tag, g_value_get_string (val));
} else if (G_VALUE_HOLDS_UINT (val)) {
g_print ("\t%20s : %u\n", tag, g_value_get_uint (val));
} else if (G_VALUE_HOLDS_DOUBLE (val)) {
g_print ("\t%20s : %g\n", tag, g_value_get_double (val));
} else if (G_VALUE_HOLDS_BOOLEAN (val)) {
g_print ("\t%20s : %s\n", tag,
(g_value_get_boolean (val)) ? "true" : "false");
} else if (GST_VALUE_HOLDS_BUFFER (val)) {
GstBuffer *buf = gst_value_get_buffer (val);
guint buffer_size = gst_buffer_get_size (buf);
g_print ("\t%20s : buffer of size %u\n", tag, buffer_size);
} else if (GST_VALUE_HOLDS_DATE_TIME (val)) {
GstDateTime *dt = g_value_get_boxed (val);
gchar *dt_str = gst_date_time_to_iso8601_string (dt);
g_print ("\t%20s : %s\n", tag, dt_str);
g_free (dt_str);
} else {
g_print ("\t%20s : tag of type '%s'\n", tag, G_VALUE_TYPE_NAME (val));
}
}
}
static void
on_new_pad (GstElement * dec, GstPad * pad, GstElement * fakesink)
{
GstPad *sinkpad;
sinkpad = gst_element_get_static_pad (fakesink, "sink");
if (!gst_pad_is_linked (sinkpad)) {
if (gst_pad_link (pad, sinkpad) != GST_PAD_LINK_OK)
g_error ("Failed to link pads!");
}
gst_object_unref (sinkpad);
}
int
main (int argc, char ** argv)
{
GstElement *pipe, *dec, *sink;
GstMessage *msg;
gchar *uri;
gst_init (&amp;argc, &amp;argv);
if (argc &lt; 2)
g_error ("Usage: %s FILE or URI", argv[0]);
if (gst_uri_is_valid (argv[1])) {
uri = g_strdup (argv[1]);
} else {
uri = gst_filename_to_uri (argv[1], NULL);
}
pipe = gst_pipeline_new ("pipeline");
dec = gst_element_factory_make ("uridecodebin", NULL);
g_object_set (dec, "uri", uri, NULL);
gst_bin_add (GST_BIN (pipe), dec);
sink = gst_element_factory_make ("fakesink", NULL);
gst_bin_add (GST_BIN (pipe), sink);
g_signal_connect (dec, "pad-added", G_CALLBACK (on_new_pad), sink);
gst_element_set_state (pipe, GST_STATE_PAUSED);
while (TRUE) {
GstTagList *tags = NULL;
msg = gst_bus_timed_pop_filtered (GST_ELEMENT_BUS (pipe),
GST_CLOCK_TIME_NONE,
GST_MESSAGE_ASYNC_DONE | GST_MESSAGE_TAG | GST_MESSAGE_ERROR);
if (GST_MESSAGE_TYPE (msg) != GST_MESSAGE_TAG) /* error or async_done */
break;
gst_message_parse_tag (msg, &amp;tags);
g_print ("Got tags from element %s:\n", GST_OBJECT_NAME (msg-&gt;src));
gst_tag_list_foreach (tags, print_one_tag, NULL);
g_print ("\n");
gst_tag_list_unref (tags);
gst_message_unref (msg);
}
if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_ERROR) {
GError *err = NULL;
gst_message_parse_error (msg, &amp;err, NULL);
g_printerr ("Got error: %s\n", err->message);
g_error_free (err);
}
gst_message_unref (msg);
gst_element_set_state (pipe, GST_STATE_NULL);
gst_object_unref (pipe);
g_free (uri);
return 0;
}
</programlisting>
</sect1>
<sect1 id="section-tags-write">
<title>Tag writing</title>
<para>
Tag writing is done using the <ulink type="http"
url="&URLAPI;GstTagSetter.html"><classname>GstTagSetter</classname></ulink>
interface. All that's required is a tag-set-supporting element in
your pipeline. In order to see if any of the elements in your
pipeline supports tag writing, you can use the function
<function>gst_bin_iterate_all_by_interface (pipeline,
GST_TYPE_TAG_SETTER)</function>. On the resulting element, usually
      an encoder or muxer, you can use <function>gst_tag_setter_merge_tags
      ()</function> (with a taglist) or <function>gst_tag_setter_add_tags
      ()</function> (with individual tags) to set tags on it.
</para>
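    <para>
      As a sketch, setting tags on an encoder before starting the pipeline
      could look like this (the tag values are illustrative):
    </para>
    <programlisting>
GstTagSetter *setter = GST_TAG_SETTER (encoder);

gst_tag_setter_add_tags (setter, GST_TAG_MERGE_REPLACE,
    GST_TAG_ARTIST, "Some Artist",
    GST_TAG_TITLE, "Some Title",
    NULL);
    </programlisting>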
<para>
A nice extra feature in &GStreamer; tag support is that tags are
preserved in pipelines. This means that if you transcode one file
containing tags into another media type, and that new media type
supports tags too, then the tags will be handled as part of the
data stream and be merged into the newly written media file, too.
</para>
</sect1>
</chapter>


@@ -1,235 +0,0 @@
<chapter id="chapter-queryevents">
<title>Position tracking and seeking</title>
<para>
So far, we've looked at how to create a pipeline to do media processing
and how to make it run. Most application developers will be interested
in providing feedback to the user on media progress. Media players, for
example, will want to show a slider showing the progress in the song,
and usually also a label indicating stream length. Transcoding
applications will want to show a progress bar on how much percent of
the task is done. &GStreamer; has built-in support for doing all this
using a concept known as <emphasis>querying</emphasis>. Since seeking
is very similar, it will be discussed here as well. Seeking is done
using the concept of <emphasis>events</emphasis>.
</para>
<sect1 id="section-querying">
<title>Querying: getting the position or length of a stream</title>
<para>
Querying is defined as requesting a specific stream property related
to progress tracking. This includes getting the length of a stream (if
available) or getting the current position. Those stream properties
can be retrieved in various formats such as time, audio samples, video
frames or bytes. The function most commonly used for this is
<function>gst_element_query ()</function>, although some convenience
wrappers are provided as well (such as
<function>gst_element_query_position ()</function> and
<function>gst_element_query_duration ()</function>). You can generally
query the pipeline directly, and it'll figure out the internal details
for you, like which element to query.
</para>
<para>
Internally, queries will be sent to the sinks, and
      <quote>dispatched</quote> backwards until one element can handle them;
that result will be sent back to the function caller. Usually, that
is the demuxer, although with live sources (from a webcam), it is the
source itself.
</para>
<programlisting>
<!-- example-begin query.c a -->
#include &lt;gst/gst.h&gt;
<!-- example-end query.c a -->
<!-- example-begin query.c b --><!--
static void
my_bus_message_cb (GstBus *bus,
GstMessage *message,
gpointer data)
{
GMainLoop *loop = (GMainLoop *) data;
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gst_message_parse_error (message, &amp;err, &amp;debug);
g_print ("Error: %s\n", err-&gt;message);
g_error_free (err);
g_free (debug);
g_main_loop_quit (loop);
break;
}
case GST_MESSAGE_EOS:
/* end-of-stream */
g_main_loop_quit (loop);
break;
default:
break;
}
}
-->
<!-- example-end query.c b -->
<!-- example-begin query.c c -->
static gboolean
cb_print_position (GstElement *pipeline)
{
gint64 pos, len;
if (gst_element_query_position (pipeline, GST_FORMAT_TIME, &amp;pos)
&amp;&amp; gst_element_query_duration (pipeline, GST_FORMAT_TIME, &amp;len)) {
g_print ("Time: %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
GST_TIME_ARGS (pos), GST_TIME_ARGS (len));
}
/* call me again */
return TRUE;
}
gint
main (gint argc,
gchar *argv[])
{
GstElement *pipeline;
<!-- example-end query.c c -->
[..]<!-- example-begin query.c d --><!--
GstStateChangeReturn ret;
GMainLoop *loop;
GError *err = NULL;
GstBus *bus;
gchar *l;
/* init */
gst_init (&amp;argc, &amp;argv);
/* args */
if (argc != 2) {
g_print ("Usage: %s &lt;filename&gt;\n", argv[0]);
return -1;
}
loop = g_main_loop_new (NULL, FALSE);
/* build pipeline, the easy way */
l = g_strdup_printf ("filesrc location=\"%s\" ! oggdemux ! vorbisdec ! "
"audioconvert ! audioresample ! alsasink",
argv[1]);
pipeline = gst_parse_launch (l, &amp;err);
if (pipeline == NULL || err != NULL) {
g_printerr ("Cannot build pipeline: %s\n", err->message);
g_error_free (err);
g_free (l);
if (pipeline)
gst_object_unref (pipeline);
return -1;
}
g_free (l);
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_signal_watch (bus);
g_signal_connect (bus, "message", G_CALLBACK (my_bus_message_cb), loop);
gst_object_unref (bus);
/* play */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE)
g_error ("Failed to set pipeline to PLAYING.\n");
--><!-- example-end query.c d -->
<!-- example-begin query.c e -->
/* run pipeline */
g_timeout_add (200, (GSourceFunc) cb_print_position, pipeline);
g_main_loop_run (loop);
<!-- example-end query.c e -->
[..]<!-- example-begin query.c f --><!--
/* clean up */
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (GST_OBJECT (pipeline));
return 0;
--><!-- example-end query.c f -->
<!-- example-begin query.c g -->
}
<!-- example-end query.c g --></programlisting>
</sect1>
<sect1 id="section-eventsseek">
<title>Events: seeking (and more)</title>
<para>
Events work in a very similar way as queries. Dispatching, for
example, works exactly the same for events (and also has the same
limitations), and they can similarly be sent to the toplevel pipeline
and it will figure out everything for you. Although there are more
ways in which applications and elements can interact using events,
we will only focus on seeking here. This is done using the seek-event.
A seek-event contains a playback rate, a seek offset format (which is
the unit of the offsets to follow, e.g. time, audio samples, video
frames or bytes), optionally a set of seeking-related flags (e.g.
whether internal buffers should be flushed), a seek method (which
indicates relative to what the offset was given), and seek offsets.
The first offset (cur) is the new position to seek to, while
the second offset (stop) is optional and specifies a position where
      streaming is supposed to stop. Usually it is fine to just specify
      GST_SEEK_TYPE_NONE and -1 as the stop type and stop offset. The behaviour
      of a seek is also wrapped in the <function>gst_element_seek ()</function>
      convenience function.
</para>
<programlisting>
static void
seek_to_time (GstElement *pipeline,
gint64 time_nanoseconds)
{
if (!gst_element_seek (pipeline, 1.0, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH,
GST_SEEK_TYPE_SET, time_nanoseconds,
GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE)) {
g_print ("Seek failed!\n");
}
}
</programlisting>
<para>
      Seeks with the GST_SEEK_FLAG_FLUSH flag should be done when the pipeline
      is in the PAUSED or PLAYING state. The pipeline will automatically preroll
      again on the new data that arrives after the seek. Once the pipeline is
      prerolled, it will go back to the state
(PAUSED or PLAYING) it was in when the seek was executed. You can wait
(blocking) for the seek to complete with
<function>gst_element_get_state()</function> or by waiting for the
ASYNC_DONE message to appear on the bus.
</para>
<para>
Seeks without the GST_SEEK_FLAG_FLUSH should only be done when the
pipeline is in the PLAYING state. Executing a non-flushing seek in the
PAUSED state might deadlock because the pipeline streaming threads might
be blocked in the sinks.
</para>
<para>
It is important to realise that seeks will not happen instantly in the
sense that they are finished when the function
<function>gst_element_seek ()</function> returns. Depending on the
specific elements involved, the actual seeking might be done later in
another thread (the streaming thread), and it might take a short time
until buffers from the new seek position will reach downstream elements
such as sinks (if the seek was non-flushing then it might take a bit
longer).
</para>
<para>
It is possible to do multiple seeks in short time-intervals, such as
a direct response to slider movement. After a seek, internally, the
pipeline will be paused (if it was playing), the position will be
re-set internally, the demuxers and decoders will decode from the new
position onwards and this will continue until all sinks have data
again. If it was playing originally, it will be set to playing again,
too. Since the new position is immediately available in a video output,
you will see the new frame, even if your pipeline is not in the playing
state.
</para>
</sect1>
</chapter>


@@ -1,481 +0,0 @@
<chapter id="chapter-threads">
<title>Threads</title>
<para>
&GStreamer; is inherently multi-threaded, and is fully thread-safe.
Most threading internals are hidden from the application, which should
make application development easier. However, in some cases, applications
may want to have influence on some parts of those. &GStreamer; allows
applications to force the use of multiple threads over some parts of
a pipeline.
See <xref linkend="section-threads-uses"/>.
</para>
<para>
&GStreamer; can also notify you when threads are created so that you can
configure things such as the thread priority or the threadpool to use.
See <xref linkend="section-threads-status"/>.
</para>
<sect1 id="section-threads-scheduling">
<title>Scheduling in &GStreamer;</title>
<para>
Each element in the &GStreamer; pipeline decides how it is going to
be scheduled. Elements can choose if their pads are to be scheduled
push-based or pull-based. An element can, for example, choose to start
a thread to start pulling from the sink pad or/and start pushing on
the source pad. An element can also choose to use the upstream or
downstream thread for its data processing in push and pull mode
respectively. &GStreamer; does not pose any restrictions on how the
      element chooses to be scheduled. See the Plugin Writer's Guide for more
details.
</para>
<para>
What will happen in any case is that some elements will start a thread
for their data processing, called the <quote>streaming threads</quote>.
The streaming threads, or <classname>GstTask</classname> objects, are
created from a <classname>GstTaskPool</classname> when the element
needs to make a streaming thread. In the next section we see how we
can receive notifications of the tasks and pools.
</para>
</sect1>
<sect1 id="section-threads-status">
<title>Configuring Threads in &GStreamer;</title>
<para>
A STREAM_STATUS message is posted on the bus to inform you about the
status of the streaming threads. You will get the following information
from the message:
<itemizedlist>
<listitem>
<para>
When a new thread is about to be created, you will be notified
of this with a GST_STREAM_STATUS_TYPE_CREATE type. It is then
possible to configure a <classname>GstTaskPool</classname> in
the <classname>GstTask</classname>. The custom taskpool will
provide custom threads for the task to implement the streaming
threads.
</para>
<para>
This message needs to be handled synchronously if you want to
configure a custom taskpool. If you don't configure the taskpool
on the task when this message returns, the task will use its
default pool.
</para>
</listitem>
<listitem>
<para>
When a thread is entered or left. This is the moment where you
could configure thread priorities. You also get a notification
when a thread is destroyed.
</para>
</listitem>
<listitem>
<para>
You get messages when the thread starts, pauses and stops. This
could be used to visualize the status of streaming threads in
            a GUI application.
</para>
</listitem>
</itemizedlist>
</para>
<para>
We will now look at some examples in the next sections.
</para>
<sect2 id="section-threads-rt">
<title>Boost priority of a thread</title>
<programlisting>
.----------.    .----------.
| fakesrc  |    | fakesink |
|         src->sink        |
'----------'    '----------'
</programlisting>
<para>
Let's look at the simple pipeline above. We would like to boost
the priority of the streaming thread.
It will be the fakesrc element that starts the streaming thread for
        generating the fake data and pushing it to the peer fakesink.
The flow for changing the priority would go like this:
</para>
<itemizedlist>
<listitem>
<para>
When going from READY to PAUSED state, fakesrc will require a
streaming thread for pushing data into the fakesink. It will
post a STREAM_STATUS message indicating its requirement for a
streaming thread.
</para>
</listitem>
<listitem>
<para>
The application will react to the STREAM_STATUS messages with a
sync bus handler. It will then configure a custom
<classname>GstTaskPool</classname> on the
<classname>GstTask</classname> inside the message. The custom
taskpool is responsible for creating the threads. In this
example we will make a thread with a higher priority.
</para>
</listitem>
<listitem>
<para>
            Alternatively, since the sync message is called in the thread
            context, you can use thread ENTER/LEAVE notifications to
            change the priority or scheduling policy of the current thread.
</para>
</listitem>
</itemizedlist>
<para>
        As a first step we need to implement a custom
        <classname>GstTaskPool</classname> that we can configure on the task.
        Below is the implementation of a <classname>GstTaskPool</classname>
        subclass that uses pthreads to create a SCHED_RR real-time thread.
        Note that creating real-time threads might require extra privileges.
</para>
<programlisting>
<!-- example-begin testrtpool.c a -->
<!--
#include <gst/gst.h>
#define TEST_TYPE_RT_POOL (test_rt_pool_get_type ())
#define TEST_RT_POOL(pool) (G_TYPE_CHECK_INSTANCE_CAST ((pool), TEST_TYPE_RT_POOL, TestRTPool))
#define TEST_IS_RT_POOL(pool) (G_TYPE_CHECK_INSTANCE_TYPE ((pool), TEST_TYPE_RT_POOL))
#define TEST_RT_POOL_CLASS(pclass) (G_TYPE_CHECK_CLASS_CAST ((pclass), TEST_TYPE_RT_POOL, TestRTPoolClass))
#define TEST_IS_RT_POOL_CLASS(pclass) (G_TYPE_CHECK_CLASS_TYPE ((pclass), TEST_TYPE_RT_POOL))
#define TEST_RT_POOL_GET_CLASS(pool) (G_TYPE_INSTANCE_GET_CLASS ((pool), TEST_TYPE_RT_POOL, TestRTPoolClass))
#define TEST_RT_POOL_CAST(pool) ((TestRTPool*)(pool))
typedef struct _TestRTPool TestRTPool;
typedef struct _TestRTPoolClass TestRTPoolClass;
struct _TestRTPool {
GstTaskPool object;
};
struct _TestRTPoolClass {
GstTaskPoolClass parent_class;
};
GType test_rt_pool_get_type (void);
GstTaskPool * test_rt_pool_new (void);
-->
<!-- example-end testrtpool.c a-->
<!-- example-begin testrtpool.c b -->
<![CDATA[
#include <pthread.h>
typedef struct
{
pthread_t thread;
} TestRTId;
G_DEFINE_TYPE (TestRTPool, test_rt_pool, GST_TYPE_TASK_POOL);
static void
default_prepare (GstTaskPool * pool, GError ** error)
{
/* we don't do anything here. We could construct a pool of threads here that
* we could reuse later but we don't */
}
static void
default_cleanup (GstTaskPool * pool)
{
}
static gpointer
default_push (GstTaskPool * pool, GstTaskPoolFunction func, gpointer data,
GError ** error)
{
TestRTId *tid;
gint res;
pthread_attr_t attr;
struct sched_param param;
tid = g_slice_new0 (TestRTId);
pthread_attr_init (&attr);
  if ((res = pthread_attr_setschedpolicy (&attr, SCHED_RR)) != 0)
    g_warning ("setschedpolicy: failure: %s", g_strerror (res));

  param.sched_priority = 50;
  if ((res = pthread_attr_setschedparam (&attr, &param)) != 0)
    g_warning ("setschedparam: failure: %s", g_strerror (res));

  if ((res = pthread_attr_setinheritsched (&attr, PTHREAD_EXPLICIT_SCHED)) != 0)
    g_warning ("setinheritsched: failure: %s", g_strerror (res));
res = pthread_create (&tid->thread, &attr, (void *(*)(void *)) func, data);
if (res != 0) {
g_set_error (error, G_THREAD_ERROR, G_THREAD_ERROR_AGAIN,
"Error creating thread: %s", g_strerror (res));
g_slice_free (TestRTId, tid);
tid = NULL;
}
return tid;
}
static void
default_join (GstTaskPool * pool, gpointer id)
{
TestRTId *tid = (TestRTId *) id;
pthread_join (tid->thread, NULL);
g_slice_free (TestRTId, tid);
}
static void
test_rt_pool_class_init (TestRTPoolClass * klass)
{
GstTaskPoolClass *gsttaskpool_class;
gsttaskpool_class = (GstTaskPoolClass *) klass;
gsttaskpool_class->prepare = default_prepare;
gsttaskpool_class->cleanup = default_cleanup;
gsttaskpool_class->push = default_push;
gsttaskpool_class->join = default_join;
}
static void
test_rt_pool_init (TestRTPool * pool)
{
}
GstTaskPool *
test_rt_pool_new (void)
{
GstTaskPool *pool;
pool = g_object_new (TEST_TYPE_RT_POOL, NULL);
return pool;
}
]]>
<!-- example-end testrtpool.c b -->
</programlisting>
<para>
        The important function to implement when writing a taskpool is the
<quote>push</quote> function. The implementation should start a thread
that calls the given function. More involved implementations might
want to keep some threads around in a pool because creating and
destroying threads is not always the fastest operation.
</para>
<para>
        As a next step we need to actually configure the custom taskpool when
the fakesrc needs it. For this we intercept the STREAM_STATUS messages
with a sync handler.
</para>
<programlisting>
<!-- example-begin testrtpool.c c -->
<![CDATA[
static GMainLoop* loop;
static void
on_stream_status (GstBus *bus,
GstMessage *message,
gpointer user_data)
{
GstStreamStatusType type;
GstElement *owner;
const GValue *val;
GstTask *task = NULL;
gst_message_parse_stream_status (message, &type, &owner);
val = gst_message_get_stream_status_object (message);
/* see if we know how to deal with this object */
if (G_VALUE_TYPE (val) == GST_TYPE_TASK) {
task = g_value_get_object (val);
}
switch (type) {
case GST_STREAM_STATUS_TYPE_CREATE:
if (task) {
GstTaskPool *pool;
pool = test_rt_pool_new();
gst_task_set_pool (task, pool);
}
break;
default:
break;
}
}
static void
on_error (GstBus *bus,
GstMessage *message,
gpointer user_data)
{
g_message ("received ERROR");
g_main_loop_quit (loop);
}
static void
on_eos (GstBus *bus,
GstMessage *message,
gpointer user_data)
{
g_main_loop_quit (loop);
}
int
main (int argc, char *argv[])
{
GstElement *bin, *fakesrc, *fakesink;
GstBus *bus;
GstStateChangeReturn ret;
gst_init (&argc, &argv);
/* create a new bin to hold the elements */
bin = gst_pipeline_new ("pipeline");
g_assert (bin);
/* create a source */
fakesrc = gst_element_factory_make ("fakesrc", "fakesrc");
g_assert (fakesrc);
g_object_set (fakesrc, "num-buffers", 50, NULL);
/* and a sink */
fakesink = gst_element_factory_make ("fakesink", "fakesink");
g_assert (fakesink);
/* add objects to the main pipeline */
gst_bin_add_many (GST_BIN (bin), fakesrc, fakesink, NULL);
/* link the elements */
gst_element_link (fakesrc, fakesink);
loop = g_main_loop_new (NULL, FALSE);
/* get the bus, we need to install a sync handler */
bus = gst_pipeline_get_bus (GST_PIPELINE (bin));
gst_bus_enable_sync_message_emission (bus);
gst_bus_add_signal_watch (bus);
g_signal_connect (bus, "sync-message::stream-status",
(GCallback) on_stream_status, NULL);
g_signal_connect (bus, "message::error",
(GCallback) on_error, NULL);
g_signal_connect (bus, "message::eos",
(GCallback) on_eos, NULL);
/* start playing */
ret = gst_element_set_state (bin, GST_STATE_PLAYING);
if (ret != GST_STATE_CHANGE_SUCCESS) {
g_message ("failed to change state");
return -1;
}
/* Run event loop listening for bus messages until EOS or ERROR */
g_main_loop_run (loop);
/* stop the bin */
gst_element_set_state (bin, GST_STATE_NULL);
gst_object_unref (bus);
g_main_loop_unref (loop);
return 0;
}
]]>
<!-- example-end testrtpool.c c -->
</programlisting>
<para>
Note that this program likely needs root permissions in order to
create real-time threads. When the thread can't be created, the
state change function will fail, which we catch in the application
above.
</para>
<para>
When there are multiple threads in the pipeline, you will receive
multiple STREAM_STATUS messages. You should use the owner of the
message, which is likely the pad or the element that starts the
thread, to figure out what the function of this thread is in the
context of the application.
</para>
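      <para>
        As a sketch of this (not part of the original example), the owner
        returned by <function>gst_message_parse_stream_status ()</function>
        can be used to decide which threads to configure. The element name
        <quote>fakesrc</quote> checked here is just an assumption for the
        pipeline above:
      </para>
      <programlisting>
<![CDATA[
#include <gst/gst.h>

static void
on_stream_status_filtered (GstBus *bus, GstMessage *message, gpointer user_data)
{
  GstStreamStatusType type;
  GstElement *owner;
  gchar *name;

  gst_message_parse_stream_status (message, &type, &owner);

  /* the owner is the element (or pad) that starts the thread; its name
   * tells us which thread in the pipeline this message is about */
  name = gst_object_get_name (GST_OBJECT (owner));
  if (type == GST_STREAM_STATUS_TYPE_CREATE &&
      g_str_has_prefix (name, "fakesrc")) {
    /* only this thread would get the custom taskpool or priority boost */
  }
  g_free (name);
}
]]>
      </programlisting>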
</sect2>
</sect1>
<sect1 id="section-threads-uses">
<title>When would you want to force a thread?</title>
<para>
We have seen that threads are created by elements but it is also
possible to insert elements in the pipeline for the sole purpose of
forcing a new thread in the pipeline.
</para>
<para>
There are several reasons to force the use of threads. However,
for performance reasons, you never want to use one thread for every
element out there, since that will create some overhead.
Let's now list some situations where threads can be particularly
useful:
</para>
<itemizedlist>
<listitem>
<para>
          Data buffering, for example when dealing with network streams or
          when recording data from a live stream such as a video or audio
          card. Short hiccups elsewhere in the pipeline will not cause data
          loss. See also <xref linkend="section-buffering-stream"/> about network
          buffering with queue2.
</para>
<figure float="1" id="section-thread-buffering-img">
<title>Data buffering, from a networked source</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/thread-buffering.&image;" format="&IMAGE;"/>
</imageobject>
</mediaobject>
</figure>
</listitem>
<listitem>
<para>
Synchronizing output devices, e.g. when playing a stream containing
both video and audio data. By using threads for both outputs, they
will run independently and their synchronization will be better.
</para>
<figure float="1" id="section-thread-synchronizing-img">
<title>Synchronizing audio and video sinks</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/thread-synchronizing.&image;" format="&IMAGE;"/>
</imageobject>
</mediaobject>
</figure>
</listitem>
</itemizedlist>
<para>
Above, we've mentioned the <quote>queue</quote> element several times
now. A queue is the thread boundary element through which you can
      force the use of threads. It does so by using the classic
      producer/consumer model, as taught in threading classes at
      universities all around the world. By doing this, it acts both as a
means to make data throughput between threads threadsafe, and it can
also act as a buffer. Queues have several <classname>GObject</classname>
properties to be configured for specific uses. For example, you can set
lower and upper thresholds for the element. If there's less data than
the lower threshold (default: disabled), it will block output. If
there's more data than the upper threshold, it will block input or
(if configured to do so) drop data.
</para>
<para>
To use a queue (and therefore force the use of two distinct threads
in the pipeline), one can simply create a <quote>queue</quote> element
and put this in as part of the pipeline. &GStreamer; will take care of
all threading details internally.
</para>
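    <para>
      As an illustration (a minimal sketch, not taken from a real
      application; the element choices are assumptions), the following
      code builds such a two-thread pipeline:
    </para>
    <programlisting>
<![CDATA[
#include <gst/gst.h>

static GstElement *
build_threaded_pipeline (void)
{
  GstElement *pipeline, *src, *queue, *sink;

  pipeline = gst_pipeline_new ("pipeline");
  src = gst_element_factory_make ("audiotestsrc", NULL);
  queue = gst_element_factory_make ("queue", NULL);
  sink = gst_element_factory_make ("autoaudiosink", NULL);

  /* limit the queue to at most 100 buffers; a full queue blocks input */
  g_object_set (queue, "max-size-buffers", 100, NULL);

  gst_bin_add_many (GST_BIN (pipeline), src, queue, sink, NULL);
  gst_element_link_many (src, queue, sink, NULL);

  /* the source now streams into the queue in its own thread; everything
   * after the queue runs in a second thread created by the queue */
  return pipeline;
}
]]>
    </programlisting>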
</sect1>
</chapter>

<chapter id="chapter-checklist-element">
<title>Things to check when writing an application</title>
<para>
This chapter contains a fairly random selection of things that can be
useful to keep in mind when writing &GStreamer;-based applications. It's
up to you how much you're going to use the information provided here.
We will shortly discuss how to debug pipeline problems using &GStreamer;
applications. Also, we will touch upon how to acquire knowledge about
plugins and elements and how to test simple pipelines before building
applications around them.
</para>
<sect1 id="section-checklist-programming">
<title>Good programming habits</title>
<itemizedlist>
<listitem>
<para>
Always add a <classname>GstBus</classname> handler to your
pipeline. Always report errors in your application, and try
to do something with warnings and information messages, too.
</para>
</listitem>
<listitem>
<para>
          Always check the return values of &GStreamer; functions. In particular,
check return values of <function>gst_element_link ()</function>
and <function>gst_element_set_state ()</function>.
</para>
</listitem>
<listitem>
<para>
          Unref return values of all functions returning a non-base
          type, such as <function>gst_element_get_static_pad ()</function>. Also,
always free non-const string returns, such as
<function>gst_object_get_name ()</function>.
</para>
</listitem>
<listitem>
<para>
Always use your pipeline object to keep track of the current state
of your pipeline. Don't keep private variables in your application.
Also, don't update your user interface if a user presses the
<quote>play</quote> button. Instead, listen for the
<quote>state-changed</quote> message on the
<classname>GstBus</classname> and only update the user interface
whenever this message is received.
</para>
</listitem>
<listitem>
<para>
Report all bugs that you find in &GStreamer; bugzilla at
<ulink type="http"
url="http://bugzilla.gnome.org">http://bugzilla.gnome.org/</ulink>.
</para>
</listitem>
</itemizedlist>
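    <para>
      The habits above can be combined into a small skeleton. This is only
      a sketch under assumed element setup; the error and state-changed
      callbacks are stubs you would flesh out in a real application:
    </para>
    <programlisting>
<![CDATA[
#include <gst/gst.h>

static void
on_error (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  GError *err = NULL;
  gchar *dbg = NULL;

  /* always report errors in your application */
  gst_message_parse_error (msg, &err, &dbg);
  g_printerr ("ERROR: %s\n", err->message);
  g_error_free (err);
  g_free (dbg);
}

static void
on_state_changed (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  /* update the user interface here, instead of tracking the state in
   * private application variables */
}

static gboolean
start_pipeline (GstElement *pipeline, GstElement *src, GstElement *sink)
{
  GstBus *bus;

  /* always check the return value of gst_element_link () */
  if (!gst_element_link (src, sink)) {
    g_printerr ("Failed to link elements\n");
    return FALSE;
  }

  /* always add a bus handler to your pipeline */
  bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
  gst_bus_add_signal_watch (bus);
  g_signal_connect (bus, "message::error", (GCallback) on_error, NULL);
  g_signal_connect (bus, "message::state-changed",
      (GCallback) on_state_changed, NULL);
  gst_object_unref (bus);

  /* always check the return value of gst_element_set_state () */
  if (gst_element_set_state (pipeline, GST_STATE_PLAYING) ==
      GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Failed to start pipeline\n");
    return FALSE;
  }
  return TRUE;
}
]]>
    </programlisting>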
</sect1>
<sect1 id="section-checklist-debug">
<title>Debugging</title>
<para>
Applications can make use of the extensive &GStreamer; debugging system
to debug pipeline problems. Elements will write output to this system
to log what they're doing. It's not used for error reporting, but it
is very useful for tracking what an element is doing exactly, which
can come in handy when debugging application issues (such as failing
seeks, out-of-sync media, etc.).
</para>
<para>
Most &GStreamer;-based applications accept the commandline option
<option>--gst-debug=LIST</option> and related family members. The
list consists of a comma-separated list of category/level pairs,
which can set the debugging level for a specific debugging category.
For example, <option>--gst-debug=oggdemux:5</option> would turn
on debugging for the Ogg demuxer element. You can use wildcards as
well. A debugging level of 0 will turn off all debugging, and a level
of 9 will turn on all debugging. Intermediate values only turn on
some debugging (based on message severity; 2, for example, will only
display errors and warnings). Here's a list of all available options:
</para>
<para>
<itemizedlist>
<listitem>
<para>
<option>--gst-debug-help</option> will print available debug
categories and exit.
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug-level=<replaceable>LEVEL</replaceable></option>
will set the default debug level (which can range from 0 (no
output) to 9 (everything)).
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug=<replaceable>LIST</replaceable></option>
takes a comma-separated list of category_name:level pairs to
set specific levels for the individual categories. Example:
<option>GST_AUTOPLUG:5,avidemux:3</option>. Alternatively, you
can also set the <classname>GST_DEBUG</classname> environment
variable, which has the same effect.
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug-no-color</option> will disable color debugging.
You can also set the GST_DEBUG_NO_COLOR environment variable to 1
if you want to disable colored debug output permanently. Note that
if you are disabling color purely to avoid messing up your pager
output, try using <command>less -R</command>.
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug-color-mode=<replaceable>MODE</replaceable></option>
will change debug log coloring mode. <replaceable>MODE</replaceable>
can be one of the following: <option>on</option>,
<option>off</option>, <option>auto</option>,
<option>disable</option>, <option>unix</option>.
You can also set the GST_DEBUG_COLOR_MODE environment variable
if you want to change colored debug output permanently. Note that
if you are disabling color purely to avoid messing up your pager
output, try using <command>less -R</command>.
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug-disable</option> disables debugging altogether.
</para>
</listitem>
<listitem>
<para>
<option>--gst-plugin-spew</option> enables printout of errors while
loading &GStreamer; plugins.
</para>
</listitem>
</itemizedlist>
</para>
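    <para>
      The same debug settings can also be made programmatically from
      within an application. The sketch below shows rough equivalents of
      some of the options above; the <quote>oggdemux</quote> category is
      just an example:
    </para>
    <programlisting>
<![CDATA[
#include <gst/gst.h>

static void
setup_debugging (void)
{
  /* like --gst-debug-level=2: only errors and warnings by default */
  gst_debug_set_default_threshold (GST_LEVEL_WARNING);

  /* like --gst-debug=oggdemux:5 */
  gst_debug_set_threshold_for_name ("oggdemux", GST_LEVEL_DEBUG);

  /* like --gst-debug-no-color */
  gst_debug_set_colored (FALSE);
}
]]>
    </programlisting>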
</sect1>
<sect1 id="section-checklist-conversion">
<title>Conversion plugins</title>
<para>
&GStreamer; contains a bunch of conversion plugins that most
applications will find useful. Specifically, those are videoscalers
(videoscale), colorspace convertors (videoconvert), audio format
convertors and channel resamplers (audioconvert) and audio samplerate
convertors (audioresample). Those convertors don't do anything when not
      required; they simply act in passthrough mode. They will activate when
the hardware doesn't support a specific request, though. All
applications are recommended to use those elements.
</para>
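    <para>
      For example, an application that wants to display decoded video on
      an Xv window could insert the convertors as sketched below (the
      choice of <quote>xvimagesink</quote> and the already-created
      <function>decoder</function> element are assumptions for this
      sketch):
    </para>
    <programlisting>
<![CDATA[
#include <gst/gst.h>

static gboolean
link_video_output (GstElement *pipeline, GstElement *decoder)
{
  GstElement *conv, *scale, *sink;

  conv = gst_element_factory_make ("videoconvert", NULL);
  scale = gst_element_factory_make ("videoscale", NULL);
  sink = gst_element_factory_make ("xvimagesink", NULL);

  gst_bin_add_many (GST_BIN (pipeline), conv, scale, sink, NULL);

  /* the convertors stay in passthrough mode unless the sink really
   * needs a different colorspace or size */
  return gst_element_link_many (decoder, conv, scale, sink, NULL);
}
]]>
    </programlisting>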
</sect1>
<sect1 id="section-checklist-applications">
<title>Utility applications provided with &GStreamer;</title>
<para>
&GStreamer; comes with a default set of command-line utilities that
can help in application development. We will discuss only
<command>gst-launch</command> and <command>gst-inspect</command> here.
</para>
<sect2 id="section-applications-launch">
<title><command>gst-launch</command></title>
<para>
<command>gst-launch</command> is a simple script-like commandline
application that can be used to test pipelines. For example, the
command <command>gst-launch audiotestsrc ! audioconvert !
audio/x-raw,channels=2 ! alsasink</command> will run
a pipeline which generates a sine-wave audio stream and plays it
to your ALSA audio card. <command>gst-launch</command> also allows
      the use of threads (these will be used automatically as required, or as
      queue elements are inserted in the pipeline) and bins (using brackets:
      <quote>(</quote> and <quote>)</quote>). You can use dots to specify
      pad names on elements,
      or even omit the pad name to automatically select a pad. Using
all this, the pipeline
<command>gst-launch filesrc location=file.ogg ! oggdemux name=d
d. ! queue ! theoradec ! videoconvert ! xvimagesink
d. ! queue ! vorbisdec ! audioconvert ! audioresample ! alsasink
</command> will play an Ogg file
containing a Theora video-stream and a Vorbis audio-stream. You can
also use autopluggers such as decodebin on the commandline. See the
manual page of <command>gst-launch</command> for more information.
</para>
</sect2>
<sect2 id="section-applications-inspect">
<title><command>gst-inspect</command></title>
<para>
<command>gst-inspect</command> can be used to inspect all properties,
signals, dynamic parameters and the object hierarchy of an element.
This can be very useful to see which <classname>GObject</classname>
properties or which signals (and using what arguments) an element
supports. Run <command>gst-inspect fakesrc</command> to get an idea
of what it does. See the manual page of <command>gst-inspect</command>
for more information.
</para>
</sect2>
</sect1>
</chapter>

<chapter id="chapter-compiling">
<title>Compiling</title>
<para>
This section talks about the different things you can do when building
and shipping your applications and plugins.
</para>
<sect1 id="section-compiling-embedding">
<title>Embedding static elements in your application</title>
<para>
The <ulink type="http"
url="http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/index.html">Plugin
Writer's Guide</ulink> describes in great detail how to write elements
for the &GStreamer; framework. In this section, we will solely discuss
how to embed such elements statically in your application. This can be
useful for application-specific elements that have no use elsewhere in
&GStreamer;.
</para>
<para>
Dynamically loaded plugins contain a structure that's defined using
<function>GST_PLUGIN_DEFINE ()</function>. This structure is loaded
when the plugin is loaded by the &GStreamer; core. The structure
contains an initialization function (usually called
<function>plugin_init</function>) that will be called right after that.
      Its purpose is to register the elements provided by the plugin with
the &GStreamer; framework.
If you want to embed elements directly in
your application, the only thing you need to do is to replace
<function>GST_PLUGIN_DEFINE ()</function> with a call to
<function>gst_plugin_register_static ()</function>. As soon as you
call <function>gst_plugin_register_static ()</function>, the elements
will from then on be available like any other element, without them
having to be dynamically loadable libraries. In the example below, you
would be able to call <function>gst_element_factory_make
("my-element-name", "some-name")</function> to create an instance of the
element.
</para>
<programlisting>
<![CDATA[
/*
* Here, you would write the actual plugin code.
*/
[..]
static gboolean
register_elements (GstPlugin *plugin)
{
return gst_element_register (plugin, "my-element-name",
GST_RANK_NONE, MY_PLUGIN_TYPE);
}
static void
my_code_init (void)
{
...
gst_plugin_register_static (
GST_VERSION_MAJOR,
GST_VERSION_MINOR,
"my-private-plugins",
"Private elements of my application",
register_elements,
VERSION,
"LGPL",
"my-application-source",
"my-application",
"http://www.my-application.net/")
...
}
]]>
</programlisting>
</sect1>
</chapter>

<chapter id="chapter-intgration">
<title>Integration</title>
<para>
&GStreamer; tries to integrate closely with operating systems (such
as Linux and UNIX-like operating systems, OS X or Windows) and desktop
environments (such as GNOME or KDE). In this chapter, we'll mention
some specific techniques to integrate your application with your
operating system or desktop environment of choice.
</para>
<!-- ####################################################################### -->
<!-- ####################################################################### -->
<!-- ####################################################################### -->
<sect1 id="section-integration-nix">
<title>Linux and UNIX-like operating systems</title>
<para>
&GStreamer; provides a basic set of elements that are useful when
integrating with Linux or a UNIX-like operating system.
</para>
<itemizedlist>
<listitem>
<para>
For audio input and output, &GStreamer; provides input and
output elements for several audio subsystems. Amongst others,
&GStreamer; includes elements for ALSA (alsasrc,
          alsasink), OSS (osssrc, osssink), Pulseaudio (pulsesrc, pulsesink)
and Sun audio (sunaudiosrc, sunaudiomixer, sunaudiosink).
</para>
</listitem>
<listitem>
<para>
For video input, &GStreamer; contains source elements for
Video4linux2 (v4l2src, v4l2element, v4l2sink).
</para>
</listitem>
<listitem>
<para>
For video output, &GStreamer; provides elements for output
to X-windows (ximagesink), Xv-windows (xvimagesink; for
hardware-accelerated video), direct-framebuffer (dfbimagesink)
          and OpenGL contexts (glimagesink).
</para>
</listitem>
</itemizedlist>
</sect1>
<!-- ####################################################################### -->
<!-- ####################################################################### -->
<!-- ####################################################################### -->
<sect1 id="section-integration-gnome">
<title>GNOME desktop</title>
<para>
&GStreamer; has been the media backend of the <ulink type="http"
url="http://www.gnome.org/">GNOME</ulink> desktop since GNOME-2.2
onwards. Nowadays, a whole bunch of GNOME applications make use of
&GStreamer; for media-processing, including (but not limited to)
<ulink type="http" url="http://www.rhythmbox.org/">Rhythmbox</ulink>,
<ulink type="http" url="https://wiki.gnome.org/Apps/Videos">Videos</ulink>
and <ulink type="http"
url="https://wiki.gnome.org/Apps/SoundJuicer">Sound
Juicer</ulink>.
</para>
<para>
Most of these GNOME applications make use of some specific techniques
to integrate as closely as possible with the GNOME desktop:
</para>
<itemizedlist>
<listitem>
<para>
GNOME applications usually call <function>gtk_init ()</function>
to parse command-line options and initialize GTK. &GStreamer;
applications would normally call <function>gst_init ()</function>
to do the same for GStreamer.
This would mean that only one of the two can parse command-line
options. To work around this issue, &GStreamer; can provide a
          GLib <classname>GOptionGroup</classname> which can be added to a
          <classname>GOptionContext</classname> along with the GTK option
          group, as the following example shows. It requires GTK 2.6 or
          newer, since previous GTK versions do not support command-line
          parsing via GOption.
        </para>
<programlisting><!-- example-begin gnome.c a -->
#include &lt;gtk/gtk.h&gt;
#include &lt;gst/gst.h&gt;
static gchar **cmd_filenames = NULL;
static GOptionEntries cmd_options[] = {
/* here you can add command line options for your application. Check
* the GOption section in the GLib API reference for a more elaborate
* example of how to add your own command line options here */
/* at the end we have a special option that collects all remaining
* command line arguments (like filenames) for us. If you don&apos;t
* need this, you can safely remove it */
{ G_OPTION_REMAINING, 0, 0, G_OPTION_ARG_FILENAME_ARRAY, &amp;cmd_filenames,
"Special option that collects any remaining arguments for us" },
/* mark the end of the options array with a NULL option */
{ NULL, }
};
/* this should usually be defined in your config.h */
#define VERSION "0.0.1"
gint
main (gint argc, gchar **argv)
{
GOptionContext *context;
GOptionGroup *gstreamer_group, *gtk_group;
GError *err = NULL;
context = g_option_context_new ("gtk-demo-app");
/* get command line options from GStreamer and add them to the group */
gstreamer_group = gst_init_get_option_group ();
g_option_context_add_group (context, gstreamer_group);
gtk_group = gtk_get_option_group (TRUE);
g_option_context_add_group (context, gtk_group);
/* add our own options. If you are using gettext for translation of your
* strings, use GETTEXT_PACKAGE here instead of NULL */
g_option_context_add_main_entries (context, cmd_options, NULL);
/* now parse the commandline options, note that this already
* calls gtk_init() and gst_init() */
  if (!g_option_context_parse (context, &amp;argc, &amp;argv, &amp;err)) {
    g_print ("Error initializing: %s\n", err->message);
    g_clear_error (&amp;err);
    g_option_context_free (context);
    exit (1);
  }
  g_option_context_free (context);
/* any filenames we got passed on the command line? parse them! */
if (cmd_filenames != NULL) {
guint i, num;
num = g_strv_length (cmd_filenames);
for (i = 0; i &lt; num; ++i) {
/* do something with the filename ... */
g_print ("Adding to play queue: %s\n", cmd_filenames[i]);
}
g_strfreev (cmd_filenames);
cmd_filenames = NULL;
}
<!-- example-end gnome.c a -->
[..]<!-- example-begin gnome.c b --><!--
return 0;
--><!-- example-end gnome.c b -->
<!-- example-begin gnome.c c -->
}
<!-- example-end gnome.c c --></programlisting>
</listitem>
<listitem>
<para>
          GNOME uses Pulseaudio for audio; use the pulsesrc and
pulsesink elements to have access to all the features.
</para>
</listitem>
<listitem>
<para>
&GStreamer; provides data input/output elements for use with the
GIO VFS system. These elements are called <quote>giosrc</quote>
and <quote>giosink</quote>.
The deprecated GNOME-VFS system is supported too but shouldn't be
used for any new applications.
</para>
</listitem>
</itemizedlist>
</sect1>
<!-- ####################################################################### -->
<!-- ####################################################################### -->
<!-- ####################################################################### -->
<sect1 id="section-integration-kde">
<title>KDE desktop</title>
<para>
&GStreamer; has been proposed for inclusion in KDE-4.0. Currently,
&GStreamer; is included as an optional component, and it's used by
several KDE applications, including <ulink type="http"
url="http://amarok.kde.org/">AmaroK</ulink>,
<ulink type="http"
url="http://www.xs4all.nl/~jjvrieze/kmplayer.html">KMPlayer</ulink> and
<ulink type="http"
url="http://kaffeine.sourceforge.net/">Kaffeine</ulink>.
</para>
<para>
Although not yet as complete as the GNOME integration bits, there
are already some KDE integration specifics available. This list will
probably grow as &GStreamer; starts to be used in KDE-4.0:
</para>
<itemizedlist>
<listitem>
<para>
AmaroK contains a kiosrc element, which is a source element that
integrates with the KDE VFS subsystem KIO.
</para>
</listitem>
</itemizedlist>
</sect1>
<!-- ####################################################################### -->
<!-- ####################################################################### -->
<!-- ####################################################################### -->
<sect1 id="section-integration-osx">
<title>OS X</title>
<para>
&GStreamer; provides native video and audio output elements for OS X.
It builds using the standard development tools for OS X.
</para>
</sect1>
<!-- ####################################################################### -->
<!-- ####################################################################### -->
<!-- ####################################################################### -->
<sect1 id="section-integration-win32">
<title>Windows</title>
<warning>
<para>
Note: this section is out of date. GStreamer-1.0 has much better
support for win32 than previous versions though and should usually compile
and work out-of-the-box both using MSYS/MinGW or Microsoft compilers. The
<ulink url="http://gstreamer.freedesktop.org">GStreamer web site</ulink> and the
<ulink url="http://news.gmane.org/gmane.comp.video.gstreamer.devel">mailing list
archives</ulink> are a good place to check the latest win32-related news.
</para>
</warning>
<para>
&GStreamer; builds using Microsoft Visual C .NET 2003 and using Cygwin.
</para>
<sect2 id="section-win32-build">
<title>Building <application>GStreamer</application> under Win32</title>
<para>There are different makefiles that can be used to build GStreamer with the usual Microsoft
compiling tools.</para>
<para>The Makefile is meant to be used with the GNU make program and the free
version of the Microsoft compiler (<ulink url="http://msdn.microsoft.com/visualc/vctoolkit2003/">http://msdn.microsoft.com/visualc/vctoolkit2003/</ulink>). You also
have to modify your system environment variables to use it from the command-line. You will also
need a working Platform SDK for Windows that is available for free from Microsoft.</para>
      <para>The projects/makefiles will automatically generate some source files needed to compile
      GStreamer. This requires that some GNU tools are installed on your system and that they are
      available in your system PATH.</para>
      <para>The GStreamer project depends on other libraries, namely:</para>
<itemizedlist>
<listitem><para>GLib</para></listitem>
<listitem><para>libxml2</para></listitem>
<listitem><para>libintl</para></listitem>
<listitem><para>libiconv</para></listitem>
</itemizedlist>
      <para>Work is being done to provide pre-compiled GStreamer-1.0 libraries as
      packages for win32. Check the <ulink url="http://gstreamer.freedesktop.org">
GStreamer web site</ulink> and check our
<ulink url="http://news.gmane.org/gmane.comp.video.gstreamer.devel">mailing list
</ulink> for the latest developments in this respect.</para>
<note>
<title>Notes</title>
        <para>The following GNU tools are needed; you can find them at <ulink url="http://gnuwin32.sourceforge.net/">http://gnuwin32.sourceforge.net/</ulink>:</para>
<itemizedlist>
<listitem><para>GNU flex (tested with 2.5.4)</para></listitem>
<listitem><para>GNU bison (tested with 1.35)</para></listitem>
</itemizedlist>
<para>and <ulink url="http://www.mingw.org/">http://www.mingw.org/</ulink></para>
<itemizedlist>
<listitem><para>GNU make (tested with 3.80)</para></listitem>
</itemizedlist>
<para>the generated files from the -auto makefiles will be available soon separately on the net
for convenience (people who don't want to install GNU tools).</para>
</note>
</sect2>
<sect2 id="section-win32-install">
<title>Installation on the system</title>
<para>FIXME: This section needs be updated for GStreamer-1.0.</para>
<!--
<para>By default, GStreamer needs a registry. You have to generate it using "gst-register.exe". It will create
the file in c:\gstreamer\registry.xml that will hold all the plugins you can use.</para>
<para>You should install the GStreamer core in c:\gstreamer\bin and the plugins in c:\gstreamer\plugins. Both
directories should be added to your system PATH. The library dependencies should be installed in c:\usr</para>
<para>For example, my current setup is :</para>
<itemizedlist>
<listitem><para><filename>c:\gstreamer\registry.xml</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gst-inspect.exe</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gst-launch.exe</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gst-register.exe</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstbytestream.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstelements.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstoptimalscheduler.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstspider.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\libgtreamer-0.8.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\plugins\gst-libs.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\plugins\gstmatroska.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\iconv.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\intl.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libglib-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libgmodule-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libgobject-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libgthread-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libxml2.dll</filename></para></listitem>
</itemizedlist>
-->
</sect2>
</sect1>
</chapter>

<chapter id="chapter-licensing">
<title>Licensing advisory</title>
<sect1 id="section-application-licensing">
<title>How to license the applications you build with <application>GStreamer</application></title>
<para>
The licensing of GStreamer is no different from a lot of other libraries
out there like GTK+ or glibc: we use the LGPL. What complicates things
with regards to GStreamer is its plugin-based design and the heavily
patented and proprietary nature of many multimedia codecs. While patents
on software are currently only allowed in a small minority of countries
worldwide (the US and Australia being the most important of those), the
problem is that, due to the central place the US holds in the world economy
and the computing industry, software patents are hard to ignore wherever
you are.
Due to this situation, many companies, including major GNU/Linux
distributions, get trapped in a situation where they either get bad
reviews due to lacking out-of-the-box media playback capabilities (and
attempts to educate the reviewers have met with little success so far), or
go against their own - and the free software movement's - wish to avoid
proprietary software. Due to competitive pressure, most choose to add some
support. Doing that through pure free software solutions would have them
risk heavy litigation and punishment from patent owners. So when the
decision is made to include support for patented codecs, it leaves them
the choice of either using special proprietary applications, or trying to
integrate the support for these codecs through proprietary plugins into
the multimedia infrastructure provided by GStreamer. Faced with one of
these two evils, the GStreamer community of course prefers the second option.
</para>
<para>
The problem which arises is that most free software and open source
applications developed use the GPL as their license. While this is
generally a good thing, it creates a dilemma for people who want to put
together a distribution. The dilemma they face is that if they include
proprietary plugins in GStreamer to support patented formats in a way that
is legal for them, they do risk running afoul of the GPL license of the
applications. We have gotten some conflicting reports from lawyers on
whether this is actually a problem, but the official stance of the FSF is
that it is a problem. We view the FSF as an authority on this matter, so
we are inclined to follow their interpretation of the GPL license.
</para>
<para>
So what does this mean for you as an application developer? Well, it means
you have to make an active decision on whether you want your application
to be used together with proprietary plugins or not. What you decide here
will also influence the chances of commercial distributions and Unix
vendors shipping your application. The GStreamer community suggests you
license your software using a license that will allow proprietary plugins
to be bundled with GStreamer and your applications, in order to make sure
that as many vendors as possible go with GStreamer instead of less free
solutions. We hope and believe that this will, in turn, make GStreamer a
vehicle for wider use of free formats like the Xiph.org formats.
</para>
<para>
If you do decide that you want to allow for non-free plugins to be used
with your application you have a variety of choices. One of the simplest
is using licenses like LGPL, MPL or BSD for your application instead of
the GPL. Or you can add an exception clause to your GPL license stating
that you exempt GStreamer plugins from the obligations of the GPL.
</para>
<para>
A good example of such a GPL exception clause, using the
Totem video player project as an example, would be:
<quote>The authors of the Totem video player project hereby grant permission
for non-GPL-compatible GStreamer plugins to be used and distributed
together with GStreamer and Totem. This permission goes above and beyond
the permissions granted by the GPL license by which Totem is covered.</quote>
</para>
<para>
Our suggestion among these choices is to use the LGPL license, as it is
what resembles the GPL most and it makes it a good licensing fit with the
major GNU/Linux desktop projects like GNOME and KDE. It also allows you to
share code more openly with projects that have compatible licenses.
Obviously, pure GPL code without the above-mentioned clause is not usable
in your application as such. By choosing the LGPL, there is no need for an
exception clause and thus code can be shared more freely.
</para>
<para>
I have above outlined the practical reasons for why the GStreamer
community suggests you allow non-free plugins to be used with your
applications. We feel that in the multimedia arena, the free software
community is still not strong enough to set the agenda and that blocking
non-free plugins to be used in our infrastructure hurts us more than it
hurts the patent owners and their ilk.
</para>
<para>
This view is not shared by everyone. The Free Software Foundation urges
you to use an unmodified GPL for your applications, so as to push back
against the temptation to use non-free plug-ins. They say that since not
everyone else has the strength to reject them because they are unethical,
they ask your help to give them a legal reason to do so.
</para>
<para>
This advisory is part of a bigger advisory with a FAQ, which you can find
on the <ulink url="http://gstreamer.freedesktop.org/documentation/licensing.html">GStreamer website</ulink>.
</para>
</sect1>
</chapter>

<chapter id="chapter-porting">
<title>Porting 0.8 applications to 0.10</title>
<para>
This section of the appendix briefly discusses the changes needed to
quickly and conveniently port most
applications from &GStreamer;-0.8 to &GStreamer;-0.10, with references
to the relevant sections in this Application Development Manual
where needed. With this list, it should be possible to port simple
applications to &GStreamer;-0.10 in less than a day.
</para>
<sect1 id="section-porting-objects">
<title>List of changes</title>
<itemizedlist>
<listitem>
<para>
Most functions returning an object or an object property have
been changed to return their own reference rather than a constant
reference to the one owned by the object itself. The reason for
this change is primarily thread safety. This means, effectively,
that return values of functions such as
<function>gst_element_get_pad ()</function>,
<function>gst_pad_get_name ()</function> and many more like these
have to be freed or unreferenced after use. Check the API
references of each function to know for sure whether return
values should be freed or not. It is important that all objects
derived from GstObject are ref'ed/unref'ed using gst_object_ref()
and gst_object_unref() respectively (instead of g_object_ref/unref).
</para>
</listitem>
<listitem>
<para>
Applications should no longer use signal handlers to be notified
of errors, end-of-stream and other similar pipeline events.
Instead, they should use the <classname>GstBus</classname>, which
has been discussed in <xref linkend="chapter-bus"/>. The bus will
take care that the messages will be delivered in the context of a
main loop, which is almost certainly the application's main thread.
The big advantage of this is that applications no longer need to
be thread-aware; they don't need to use <function>g_idle_add
()</function> in the signal handler and do the actual real work
in the idle-callback. &GStreamer; now does all that internally.
</para>
</listitem>
<listitem>
<para>
Related to this, <function>gst_bin_iterate ()</function> has been
removed. Pipelines will iterate in their own thread, and applications
can simply run a <classname>GMainLoop</classname> (or call the
mainloop of their UI toolkit, such as <function>gtk_main
()</function>).
</para>
</listitem>
<listitem>
<para>
State changes can be delayed (ASYNC). Due to the new fully threaded
nature of GStreamer-0.10, state changes are not always immediate,
in particular changes including the transition from READY to PAUSED
state. This means two things in the context of porting applications:
first of all, it is no longer always possible to do
<function>gst_element_set_state ()</function> and check for a return
value of GST_STATE_CHANGE_SUCCESS, as the state change might be
delayed (ASYNC) and the result will not be known until later. You
should still check for GST_STATE_CHANGE_FAILURE right away, it is
just no longer possible to assume that everything that is not SUCCESS
means failure. Secondly, state changes might not be immediate, so
your code needs to take that into account. You can wait for a state
change to complete if you use GST_CLOCK_TIME_NONE as timeout interval
with <function>gst_element_get_state ()</function>.
</para>
</listitem>
<listitem>
<para>
In 0.8, events and queries had to manually be sent to sinks in
pipelines (unless you were using playbin). This is no longer
the case in 0.10. In 0.10, queries and events can be sent to
toplevel pipelines, and the pipeline will do the dispatching
internally for you. This means less bookkeeping in your
application. For a short code example, see <xref
linkend="chapter-queryevents"/>. Related, seeking is now
threadsafe, and your video output will show the new video
position's frame while seeking, providing a better user
experience.
</para>
</listitem>
<listitem>
<para>
The <classname>GstThread</classname> object has been removed.
Applications can now simply put elements in a pipeline with
optionally some <quote>queue</quote> elements in between for
buffering, and &GStreamer; will take care of creating threads
internally. It is still possible to have parts of a pipeline
run in different threads than others, by using the
<quote>queue</quote> element. See <xref linkend="chapter-threads"/>
for details.
</para>
</listitem>
<listitem>
<para>
Filtered caps have been replaced by the capsfilter element (the
pipeline syntax for gst-launch has not changed, though).
</para>
</listitem>
<listitem>
<para>
libgstgconf-0.10.la does not exist. Use the
<quote>gconfvideosink</quote> and <quote>gconfaudiosink</quote>
elements instead, which will do live-updates and require no library
linking.
</para>
</listitem>
<listitem>
<para>
The <quote>new-pad</quote> and <quote>state-change</quote> signals on
<classname>GstElement</classname> were renamed to
<quote>pad-added</quote> and <quote>state-changed</quote>.
</para>
</listitem>
<listitem>
<para>
<function>gst_init_get_popt_table ()</function> has been removed
in favour of the new GOption command line option API that was
added to GLib 2.6. <function>gst_init_get_option_group ()</function>
is the new GOption-based equivalent to
<function>gst_init_get_popt_table ()</function>.
</para>
</listitem>
</itemizedlist>
</sect1>
</chapter>
<chapter id="chapter-porting-1.0">
<title>Porting 0.10 applications to 1.0</title>
<para>This section outlines some of the changes necessary to port
applications from &GStreamer;-0.10 to &GStreamer;-1.0. For a
comprehensive and up-to-date list, see the separate <ulink
type="http"
url="http://cgit.freedesktop.org/gstreamer/gstreamer/plain/docs/random/porting-to-1.0.txt">
Porting to 1.0</ulink> document.
</para>
<para>
It should be possible to port simple applications to
&GStreamer;-1.0 in less than a day.
</para>
<sect1 id="section-porting-objects-1.0">
<title>List of changes</title>
<itemizedlist>
<listitem>
<para>
All deprecated methods were removed. Recompile against 0.10 with
GST_DISABLE_DEPRECATED defined (such as by adding
-DGST_DISABLE_DEPRECATED to the compiler flags) and fix issues
before attempting to port to 1.0.
</para>
</listitem>
<listitem>
<para>
"playbin2" has been renamed to "playbin", with a similar API.
</para>
</listitem>
<listitem>
<para>
"decodebin2" has been renamed to "decodebin", with a similar API. Note
that there is no longer a "new-decoded-pad" signal, just use GstElement's
"pad-added" signal instead (but don't forget to remove the 'gboolean last'
argument from your old signal callback function signature).
</para>
</listitem>
<listitem>
<para>
the names of some "formatted" pad templates have been changed from e.g.
"src%d" to "src%u" or "src_%u" or similar, since we don't want to see
negative numbers in pad names. This mostly affects applications that
create request pads from elements.
</para>
</listitem>
<listitem>
<para>
some elements that used to have a single dynamic source pad now have
an always-present source pad. Examples: wavparse, id3demux, icydemux, apedemux.
(This does not affect applications using decodebin or playbin.)
</para>
</listitem>
<listitem>
<para>
playbin now proxies the GstVideoOverlay (formerly GstXOverlay) interface,
so most applications can just remove the sync bus handler where they
would set the window ID, and instead just set the window ID on playbin
from the application thread before starting playback.
</para>
<para>
playbin also proxies the GstColorBalance and GstNavigation interfaces,
so applications that use these don't need to go fishing for elements
that may implement them any more, but can just use them on playbin
unconditionally.
</para>
</listitem>
<listitem>
<para>
For multifdsink, tcpclientsink, tcpclientsrc and tcpserversrc, the
protocol property has been removed; use gdppay and gdpdepay instead.
</para>
</listitem>
<listitem>
<para>
XML serialization was removed.
</para>
</listitem>
<listitem>
<para>
Probes and pad blocking were merged into new pad probes.
</para>
</listitem>
<listitem>
<para>
Position, duration and convert functions no longer use an inout parameter
for the destination format.
</para>
</listitem>
<listitem>
<para>
Video and audio caps were simplified. audio/x-raw-int and audio/x-raw-float
are now all under the audio/x-raw media type. Similarly, video/x-raw-rgb
and video/x-raw-yuv are now video/x-raw.
</para>
</listitem>
<listitem>
<para>
ffmpegcolorspace was removed and replaced with videoconvert.
</para>
</listitem>
<listitem>
<para>
GstMixerInterface / GstTunerInterface were removed without replacement.
</para>
</listitem>
<listitem>
<para>
The GstXOverlay interface was renamed to GstVideoOverlay, and is now part
of the video library in gst-plugins-base, as the interfaces library
no longer exists.
</para>
<para>
The name of the GstXOverlay "prepare-xwindow-id" message has changed
to "prepare-window-handle" (and GstXOverlay has been renamed to
GstVideoOverlay). Code that checks for the string directly should be
changed to use gst_is_video_overlay_prepare_window_handle_message(message)
instead.
</para>
</listitem>
<listitem>
<para>
The GstPropertyProbe interface was removed. There is no replacement
for it in GStreamer 1.0.x and 1.2.x, but since version 1.4 there is
a more featureful replacement for device discovery and feature
querying provided by GstDeviceMonitor, GstDevice, and friends. See
the <ulink type="http"
url="http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/gstreamer-device-probing.html">
"GStreamer Device Discovery and Device Probing" documentation</ulink>.
</para>
</listitem>
<listitem>
<para>
gst_uri_handler_get_uri() and the get_uri vfunc now return a copy of
the URI string.
</para>
<para>
gst_uri_handler_set_uri() and the set_uri vfunc now take an additional
GError argument so the handler can notify the caller why it didn't
accept a particular URI.
</para>
<para>
gst_uri_handler_set_uri() now checks if the protocol of the URI passed
is one of the protocols advertised by the uri handler, so set_uri vfunc
implementations no longer need to check that as well.
</para>
</listitem>
<listitem>
<para>
GstTagList is now an opaque mini object instead of being typedefed to a
GstStructure. While it was previously okay (and in some cases required because of
missing taglist API) to cast a GstTagList to a GstStructure or use
gst_structure_* API on taglists, you can no longer do that. Doing so will
cause crashes.
</para>
<para>
Also, tag lists are refcounted now, and can therefore not be freely
modified any longer. Make sure to call gst_tag_list_make_writable (taglist)
before adding, removing or changing tags in the taglist.
</para>
<para>
GST_TAG_IMAGE, GST_TAG_PREVIEW_IMAGE, GST_TAG_ATTACHMENT: many tags that
used to be of type GstBuffer are now of type GstSample (which is basically
a struct containing a buffer alongside caps and some other info).
</para>
</listitem>
<listitem>
<para>
GstController has now been merged into GstObject. It no longer exists as an
individual object. In addition, core contains a GstControlSource base
class and GstControlBinding. The actual control sources are in the controller
library, as before. The second big change is that control sources generate
a sequence of gdouble values which are mapped to the property type and
value range by GstControlBindings.
</para>
<para>
The whole gst_controller_* API is gone and its functionality is now
available in simplified form under gst_object_*. Control sources are now
attached to properties via a GstControlBinding. GValue arguments are no
longer used when programming control sources.
</para>
</listitem>
</itemizedlist>
</sect1>
</chapter>

<chapter id="chapter-programs">
<title>Programs</title>
<para>
</para>
<sect1 id="section-programs-gst-launch">
<title><command>gst-launch</command></title>
<para>
This is a tool that will construct pipelines based on a command-line
syntax.
</para>
<para>
A simple commandline looks like:
<screen>
gst-launch filesrc location=hello.mp3 ! mad ! audioresample ! osssink
</screen>
A more complex pipeline looks like:
<screen>
gst-launch filesrc location=redpill.vob ! dvddemux name=demux \
demux.audio_00 ! queue ! a52dec ! audioconvert ! audioresample ! osssink \
demux.video_00 ! queue ! mpeg2dec ! videoconvert ! xvimagesink
</screen>
</para>
<para>
You can also use the parser in your own
code. <application>GStreamer</application> provides a function
gst_parse_launch () that you can use to construct a pipeline.
The following program lets you create an MP3 pipeline using the
gst_parse_launch () function:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;
int
main (int argc, char *argv[])
{
GstElement *pipeline;
GstElement *filesrc;
GstMessage *msg;
GstBus *bus;
GError *error = NULL;
gst_init (&amp;argc, &amp;argv);
if (argc != 2) {
g_print ("usage: %s &lt;filename&gt;\n", argv[0]);
return -1;
}
pipeline = gst_parse_launch ("filesrc name=my_filesrc ! mad ! osssink", &amp;error);
if (!pipeline) {
g_print ("Parse error: %s\n", error->message);
exit (1);
}
filesrc = gst_bin_get_by_name (GST_BIN (pipeline), "my_filesrc");
g_object_set (filesrc, "location", argv[1], NULL);
gst_object_unref (filesrc);
gst_element_set_state (pipeline, GST_STATE_PLAYING);
bus = gst_element_get_bus (pipeline);
/* wait until we either get an EOS or an ERROR message. Note that in a real
* program you would probably not use gst_bus_poll(), but rather set up an
* async signal watch on the bus and run a main loop and connect to the
* bus's signals to catch certain messages or all messages */
msg = gst_bus_poll (bus, GST_MESSAGE_EOS | GST_MESSAGE_ERROR, -1);
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_EOS: {
g_print ("EOS\n");
break;
}
case GST_MESSAGE_ERROR: {
GError *err = NULL; /* error to show to users */
gchar *dbg = NULL; /* additional debug string for developers */
gst_message_parse_error (msg, &amp;err, &amp;dbg);
if (err) {
g_printerr ("ERROR: %s\n", err-&gt;message);
g_error_free (err);
}
if (dbg) {
g_printerr ("[Debug details: %s]\n", dbg);
g_free (dbg);
}
break;
}
default:
g_printerr ("Unexpected message of type %d\n", GST_MESSAGE_TYPE (msg));
break;
}
gst_message_unref (msg);
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
gst_object_unref (bus);
return 0;
}
</programlisting>
<para>
Note how we can retrieve the filesrc element from the constructed bin using the
element name.
</para>
<sect2>
<title>Grammar Reference</title>
<para>
The <command>gst-launch</command> syntax is processed by a flex/bison parser. This section
is intended to provide a full specification of the grammar; any deviation from this
specification is considered a bug.
</para>
<sect3>
<title>Elements</title>
<screen>
... mad ...
</screen>
<para>
A bare identifier (a string beginning with a letter and containing
only letters, numbers, dashes, underscores, percent signs, or colons)
will create an element from a given element factory. In this example,
an instance of the "mad" MP3 decoding plugin will be created.
</para>
</sect3>
<sect3>
<title>Links</title>
<screen>
... !sink ...
</screen>
<para>
An exclamation point, optionally having a qualified pad name (the name of the pad,
optionally preceded by the name of the element) on both sides, will link two pads. If
the source pad is not specified, a source pad from the immediately preceding element
will be automatically chosen. If the sink pad is not specified, a sink pad from the next
element to be constructed will be chosen. An attempt will be made to find compatible
pads. Pad names may be preceded by an element name, as in
<computeroutput>my_element_name.sink_pad</computeroutput>.
</para>
</sect3>
<sect3>
<title>Properties</title>
<screen>
... location="http://gstreamer.net" ...
</screen>
<para>
The name of a property, optionally qualified with an element name, and a value,
separated by an equals sign, will set a property on an element. If the element is not
specified, the previous element is assumed. Strings can optionally be enclosed in
quotation marks. Characters in strings may be escaped with the backslash
(<literal>\</literal>). If the right-hand side is all digits, it is considered to be an
integer. If it is all digits and a decimal point, it is a double. If it is "true",
"false", "TRUE", or "FALSE" it is considered to be boolean. Otherwise, it is parsed as a
string. The type of the property is determined later on in the parsing, and the value is
converted to the target type. This conversion is not guaranteed to work; it relies on
the g_value_convert routines. No error message will be displayed on an invalid
conversion, due to limitations in the value conversion API.
</para>
</sect3>
<sect3>
<title>Bins, Threads, and Pipelines</title>
<screen>
( ... )
</screen>
<para>
A pipeline description between parentheses is placed into a bin. The open paren may be
preceded by a type name, as in <computeroutput>jackbin.( ... )</computeroutput> to make
a bin of a specified type. Square brackets make pipelines, and curly braces make
threads. The default toplevel bin type is a pipeline, although putting the whole
description within parentheses or braces can override this default.
</para>
</sect3>
</sect2>
</sect1>
<sect1 id="section-programs-gst-inspect">
<title><command>gst-inspect</command></title>
<para>
This is a tool to query a plugin or an element about its properties.
</para>
<para>
To query the information about the element mad, you would specify:
</para>
<screen>
gst-inspect mad
</screen>
<para>
Below is the output of a query for the osssink element:
</para>
<screen>
<![CDATA[
Factory Details:
Rank: secondary (128)
Long-name: Audio Sink (OSS)
Klass: Sink/Audio
Description: Output to a sound card via OSS
Author: Erik Walthinsen <omega@cse.ogi.edu>, Wim Taymans <wim.taymans@chello.be>
Plugin Details:
Name: ossaudio
Description: OSS (Open Sound System) support for GStreamer
Filename: /home/wim/gst/head/gst-plugins-good/sys/oss/.libs/libgstossaudio.so
Version: 1.0.0.1
License: LGPL
Source module: gst-plugins-good
Source release date: 2012-09-25 12:52 (UTC)
Binary package: GStreamer Good Plug-ins git
Origin URL: Unknown package origin
GObject
+----GInitiallyUnowned
+----GstObject
+----GstElement
+----GstBaseSink
+----GstAudioBaseSink
+----GstAudioSink
+----GstOssSink
Pad Templates:
SINK template: 'sink'
Availability: Always
Capabilities:
audio/x-raw
format: { S16LE, U16LE, S8, U8 }
layout: interleaved
rate: [ 1, 2147483647 ]
channels: 1
audio/x-raw
format: { S16LE, U16LE, S8, U8 }
layout: interleaved
rate: [ 1, 2147483647 ]
channels: 2
channel-mask: 0x0000000000000003
Element Flags:
no flags set
Element Implementation:
Has change_state() function: gst_audio_base_sink_change_state
Clocking Interaction:
element is supposed to provide a clock but returned NULL
Element has no indexing capabilities.
Element has no URI handling capabilities.
Pads:
SINK: 'sink'
Implementation:
Has chainfunc(): gst_base_sink_chain
Has custom eventfunc(): gst_base_sink_event
Has custom queryfunc(): gst_base_sink_sink_query
Has custom iterintlinkfunc(): gst_pad_iterate_internal_links_default
Pad Template: 'sink'
Element Properties:
name : The name of the object
flags: readable, writable
String. Default: "osssink0"
parent : The parent of the object
flags: readable, writable
Object of type "GstObject"
sync : Sync on the clock
flags: readable, writable
Boolean. Default: true
max-lateness : Maximum number of nanoseconds that a buffer can be late before it is dropped (-1 unlimited)
flags: readable, writable
Integer64. Range: -1 - 9223372036854775807 Default: -1
qos : Generate Quality-of-Service events upstream
flags: readable, writable
Boolean. Default: false
async : Go asynchronously to PAUSED
flags: readable, writable
Boolean. Default: true
ts-offset : Timestamp offset in nanoseconds
flags: readable, writable
Integer64. Range: -9223372036854775808 - 9223372036854775807 Default: 0
enable-last-sample : Enable the last-sample property
flags: readable, writable
Boolean. Default: false
last-sample : The last sample received in the sink
flags: readable
Boxed pointer of type "GstSample"
blocksize : Size in bytes to pull per buffer (0 = default)
flags: readable, writable
Unsigned Integer. Range: 0 - 4294967295 Default: 4096
render-delay : Additional render delay of the sink in nanoseconds
flags: readable, writable
Unsigned Integer64. Range: 0 - 18446744073709551615 Default: 0
throttle-time : The time to keep between rendered buffers
flags: readable, writable
Unsigned Integer64. Range: 0 - 18446744073709551615 Default: 0
buffer-time : Size of audio buffer in microseconds, this is the minimum latency that the sink reports
flags: readable, writable
Integer64. Range: 1 - 9223372036854775807 Default: 200000
latency-time : The minimum amount of data to write in each iteration in microseconds
flags: readable, writable
Integer64. Range: 1 - 9223372036854775807 Default: 10000
provide-clock : Provide a clock to be used as the global pipeline clock
flags: readable, writable
Boolean. Default: true
slave-method : Algorithm to use to match the rate of the masterclock
flags: readable, writable
Enum "GstAudioBaseSinkSlaveMethod" Default: 1, "skew"
(0): resample - GST_AUDIO_BASE_SINK_SLAVE_RESAMPLE
(1): skew - GST_AUDIO_BASE_SINK_SLAVE_SKEW
(2): none - GST_AUDIO_BASE_SINK_SLAVE_NONE
can-activate-pull : Allow pull-based scheduling
flags: readable, writable
Boolean. Default: false
alignment-threshold : Timestamp alignment threshold in nanoseconds
flags: readable, writable
Unsigned Integer64. Range: 1 - 18446744073709551614 Default: 40000000
drift-tolerance : Tolerance for clock drift in microseconds
flags: readable, writable
Integer64. Range: 1 - 9223372036854775807 Default: 40000
discont-wait : Window of time in nanoseconds to wait before creating a discontinuity
flags: readable, writable
Unsigned Integer64. Range: 0 - 18446744073709551614 Default: 1000000000
device : OSS device (usually /dev/dspN)
flags: readable, writable
String. Default: "/dev/dsp"
]]>
</screen>
<para>
To query the information about a plugin, you would do:
</para>
<screen>
gst-inspect gstelements
</screen>
</sect1>
</chapter>

<chapter id="chapter-quotes">
<title>Quotes from the Developers</title>
<para>
As well as being a cool piece of software,
<application>GStreamer</application> is a lively project, with
developers from around the globe very actively contributing.
We often hang out on the #gstreamer IRC channel on
irc.freenode.net: the following are a selection of amusing<footnote>
<para>No guarantee of sense of humour compatibility is given.</para>
</footnote> quotes from our conversations.
</para>
<variablelist>
<varlistentry>
<term>6 Mar 2006</term>
<listitem>
<para>
When I opened my eyes I was in a court room. There were masters McIlroy and
Thompson sitting in the jury and master Kernighan too. There were the GStreamer
developers standing in the defendant's place, accused of violating several laws
of Unix philosophy and customer lock-down via running on a proprietary
pipeline, different from that of the Unix systems. I heard Eric Raymond
whispering "got to add this case to my book".
</para>
<para><emphasis>behdad's blog</emphasis></para>
</listitem>
</varlistentry>
<varlistentry>
<term>22 May 2007</term>
<listitem>
<para><emphasis>&lt;__tim&gt;</emphasis>
Uraeus: amusing, isn't it?
</para>
<para><emphasis>&lt;Uraeus&gt;</emphasis>
__tim: I wrote that :)
</para>
<para><emphasis>&lt;__tim&gt;</emphasis>
Uraeus: of course you did; your refusal to surrender to the oppressive regime
of the third-person-singular-rule is so unique in its persistence that it's
hard to miss :)
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>12 Sep 2005</term>
<listitem>
<para><emphasis>&lt;wingo&gt;</emphasis>
we just need to get rid of that mmap stuff
</para>
<para><emphasis>&lt;wingo&gt;</emphasis>
i think gnomevfssrc is faster for files even
</para>
<para><emphasis>&lt;BBB&gt;</emphasis>
wingo, no
</para>
<para><emphasis>&lt;BBB&gt;</emphasis>
and no
</para>
<para><emphasis>&lt;wingo&gt;</emphasis>
good points ronald
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>23 Jun 2005</term>
<listitem>
<para><emphasis>* wingo</emphasis> back</para>
<para><emphasis>* thomasvs</emphasis> back</para>
<para>--- You are now known as everybody</para>
<para><emphasis>* everybody</emphasis> back back</para>
<para><emphasis>&lt;everybody&gt;</emphasis> now break it down</para>
<para>--- You are now known as thomasvs</para>
<para><emphasis>* bilboed</emphasis> back</para>
<para>--- bilboed is now known as john-sebastian</para>
<para><emphasis>* john-sebastian</emphasis> bach</para>
<para>--- john-sebastian is now known as bilboed</para>
<para>--- You are now known as scratch_my</para>
<para><emphasis>* scratch_my</emphasis> back</para>
<para>--- bilboed is now known as Illbe</para>
<para>--- You are now known as thomasvs</para>
<para><emphasis>* Illbe</emphasis> back</para>
<para>--- Illbe is now known as bilboed</para>
</listitem>
</varlistentry>
<varlistentry>
<term>20 Apr 2005</term>
<listitem>
<para>
<emphasis>thomas</emphasis>:
jrb, somehow his screenshotsrc grabs whatever X is showing and makes it
available as a stream of frames
</para>
<para>
<emphasis>jrb</emphasis>:
thomas: so, is the point that the screenshooter takes a video?
but won't the dialog be in the video? oh, nevermind. I'll just send mail...
</para>
<para>
<emphasis>thomas</emphasis>:
jrb, well, it would shoot first and ask questions later
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>2 Nov 2004</term>
<listitem>
<para>
<emphasis>zaheerm</emphasis>:
wtay: unfair u fixed the bug i was using as a feature!
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>14 Oct 2004</term>
<listitem>
<para>
<emphasis>* zaheerm</emphasis>
wonders how he can break gstreamer today :)
</para>
<para>
<emphasis>ensonic</emphasis>:
zaheerm, spider is always a good starting point
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>14 Jun 2004</term>
<listitem>
<para>
<emphasis>teuf</emphasis>: ok, things work much better when I don't write incredibly stupid and buggy code
</para>
<para>
<emphasis>thaytan</emphasis>: I find that too
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>23 Nov 2003</term>
<listitem>
<para>
<emphasis>Uraeus</emphasis>: ah yes, the sleeping part, my mind
is not multitasking so I was still thinking about exercise
</para>
<para>
<emphasis>dolphy</emphasis>: Uraeus: your mind is multitasking
</para>
<para>
<emphasis>dolphy</emphasis>: Uraeus: you just miss low latency patches
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>14 Sep 2002</term>
<listitem>
<para>
--- <emphasis>wingo-party</emphasis> is now known as
<emphasis>wingo</emphasis>
</para>
<para>
* <emphasis>wingo</emphasis> holds head
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>4 Jun 2001</term>
<listitem>
<para><emphasis>taaz:</emphasis> you witchdoctors and your voodoo mpeg2 black magic... </para>
<para><emphasis>omega_:</emphasis> um. I count three, no four different cults there &lt;g&gt; </para>
<para><emphasis>ajmitch:</emphasis> hehe </para>
<para><emphasis>omega_:</emphasis> witchdoctors, voodoo, black magic, </para>
<para><emphasis>omega_:</emphasis> and mpeg </para>
</listitem>
</varlistentry>
<varlistentry>
<term>16 Feb 2001</term>
<listitem>
<para>
<emphasis>wtay:</emphasis>
I shipped a few commerical products to &gt;40000 people now but
GStreamer is way more exciting...
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>16 Feb 2001</term>
<listitem>
<para>
*
<emphasis>tool-man</emphasis>
is a gstreamer groupie
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>14 Jan 2001</term>
<listitem>
<para>
<emphasis>Omega:</emphasis>
did you run ldconfig? maybe it talks to init?
</para>
<para>
<emphasis>wtay:</emphasis>
not sure, don't think so...
I did run gstreamer-register though :-)
</para>
<para>
<emphasis>Omega:</emphasis>
ah, that did it then ;-)
</para>
<para>
<emphasis>wtay:</emphasis>
right
</para>
<para>
<emphasis>Omega:</emphasis>
probably not, but in case GStreamer starts turning into an OS, someone please let me know?
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>9 Jan 2001</term>
<listitem>
<para>
<emphasis>wtay:</emphasis>
me tar, you rpm?
</para>
<para>
<emphasis>wtay:</emphasis>
hehe, forgot &quot;zan&quot;
</para>
<para>
<emphasis>Omega:</emphasis>
?
</para>
<para>
<emphasis>wtay:</emphasis>
me tar&quot;zan&quot;, you ...
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>7 Jan 2001</term>
<listitem>
<para>
<emphasis>Omega:</emphasis>
that means probably building an agreggating, cache-massaging
queue to shove N buffers across all at once, forcing cache
transfer.
</para>
<para>
<emphasis>wtay:</emphasis>
never done that before...
</para>
<para>
<emphasis>Omega:</emphasis>
nope, but it's easy to do in gstreamer &lt;g&gt;
</para>
<para>
<emphasis>wtay:</emphasis>
sure, I need to rewrite cp with gstreamer too, someday :-)
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>7 Jan 2001</term>
<listitem>
<para>
<emphasis>wtay:</emphasis>
GStreamer; always at least one developer is awake...
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>5/6 Jan 2001</term>
<listitem>
<para>
<emphasis>wtay:</emphasis>
we need to cut down the time to create an mp3 player down to
seconds...
</para>
<para>
<emphasis>richardb:</emphasis>
:)
</para>
<para>
<emphasis>Omega:</emphasis>
I'm wanting to something more interesting soon, I did the "draw an mp3
player in 15sec" back in October '99.
</para>
<para>
<emphasis>wtay:</emphasis>
by the time Omega gets his hands on the editor, you'll see a
complete audio mixer in the editor :-)
</para>
<para>
<emphasis>richardb:</emphasis>
Well, it clearly has the potential...
</para>
<para>
<emphasis>Omega:</emphasis>
Working on it... ;-)
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>28 Dec 2000</term>
<listitem>
<para>
<emphasis>MPAA:</emphasis>
We will sue you now, you have violated our IP rights!
</para>
<para>
<emphasis>wtay:</emphasis>
hehehe
</para>
<para>
<emphasis>MPAA:</emphasis>
How dare you laugh at us? We have lawyers! We have Congressmen! We have <emphasis>LARS</emphasis>!
</para>
<para>
<emphasis>wtay:</emphasis>
I'm so sorry your honor
</para>
<para>
<emphasis>MPAA:</emphasis>
Hrumph.
</para>
<para>
*
<emphasis>wtay</emphasis>
bows before thy
</para>
</listitem>
</varlistentry>
</variablelist>
</chapter>


@ -1,3 +0,0 @@
pre.programlisting {
background: #E8E8FF;
}


@ -1,185 +0,0 @@
<chapter id="chapter-bins">
<title>Bins</title>
<para>
A bin is a container element. You can add elements to a bin. Since a
bin is an element itself, a bin can be handled in the same way as any
other element. Therefore, the whole previous chapter (<xref
linkend="chapter-elements"/>) applies to bins as well.
</para>
<sect1 id="section-bins">
<title>What are bins</title>
<para>
Bins allow you to combine a group of linked elements into one
logical element. You do not deal with the individual elements
anymore but with just one element, the bin. We will see that
this is extremely powerful when you are going to construct
complex pipelines since it allows you to break up the pipeline
in smaller chunks.
</para>
<para>
The bin will also manage the elements contained in it. It will
perform state changes on the elements as well as collect and
forward bus messages.
</para>
<figure float="1" id="section-bin-img">
<title>Visualisation of a bin with some elements in it</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/bin-element.&image;" format="&IMAGE;"/>
</imageobject>
</mediaobject>
</figure>
<para>
There is one specialized type of bin available to the
&GStreamer; programmer:
</para>
<itemizedlist>
<listitem>
<para>
A pipeline: a generic container that manages the synchronization
and bus messages of the contained elements. The toplevel bin has
to be a pipeline, every application thus needs at least one of
these.
</para>
</listitem>
</itemizedlist>
</sect1>
<sect1 id="section-bin-create">
<title>Creating a bin</title>
<para>
Bins are created in the same way that other elements are created,
i.e. using an element factory. There are also convenience functions
available (<function>gst_bin_new ()</function> and
<function>gst_pipeline_new ()</function>).
To add elements to a bin or remove elements from a
bin, you can use <function>gst_bin_add ()</function> and
<function>gst_bin_remove ()</function>. Note that the bin that you
add an element to will take ownership of that element. If you
destroy the bin, the element will be dereferenced with it. If you
remove an element from a bin, it will be dereferenced automatically.
</para>
<programlisting><!-- example-begin bin.c a -->
#include &lt;gst/gst.h&gt;
int
main (int argc,
char *argv[])
{
GstElement *bin, *pipeline, *source, *sink;
/* init */
gst_init (&amp;argc, &amp;argv);
/* create */
pipeline = gst_pipeline_new ("my_pipeline");
bin = gst_bin_new ("my_bin");
source = gst_element_factory_make ("fakesrc", "source");
sink = gst_element_factory_make ("fakesink", "sink");
/* First add the elements to the bin */
gst_bin_add_many (GST_BIN (bin), source, sink, NULL);
/* add the bin to the pipeline */
gst_bin_add (GST_BIN (pipeline), bin);
/* link the elements */
gst_element_link (source, sink);
<!-- example-end bin.c a -->
[..]<!-- example-begin bin.c b --><!--
return 0;
--><!-- example-end bin.c b -->
<!-- example-begin bin.c c -->
}
<!-- example-end bin.c c --></programlisting>
<para>
There are various functions to lookup elements in a bin. The most
commonly used are <function>gst_bin_get_by_name ()</function> and
<function>gst_bin_get_by_interface ()</function>. You can also
iterate over all elements that a bin contains using the function
<function>gst_bin_iterate_elements ()</function>. See the API references
of <ulink type="http"
url="&URLAPI;GstBin.html"><classname>GstBin</classname></ulink>
for details.
</para>
</sect1>
<sect1 id="section-bin-custom">
<title>Custom bins</title>
<para>
The application programmer can create custom bins packed with elements
to perform a specific task. This allows you, for example, to write
an Ogg/Vorbis decoder with just the following lines of code:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

int
main (int argc,
char *argv[])
{
GstElement *player;
/* init */
gst_init (&amp;argc, &amp;argv);
/* create player */
player = gst_element_factory_make ("oggvorbisplayer", "player");
/* set the source audio file */
g_object_set (player, "location", "helloworld.ogg", NULL);
/* start playback */
gst_element_set_state (GST_ELEMENT (player), GST_STATE_PLAYING);
[..]
}
</programlisting>
<para>
(This is a contrived example, of course; a much more powerful and
versatile custom bin like this already exists: the playbin element.)
</para>
<para>
Custom bins can be created with a plugin or from the application. You
will find more information about creating custom bins in the <ulink
type="http"
url="http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/index.html">Plugin
Writers Guide</ulink>.
</para>
<para>
Examples of such custom bins are the playbin and uridecodebin elements from <ulink
type="http"
url="http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/index.html">
gst-plugins-base</ulink>.
</para>
</sect1>
<sect1 id="section-bin-state-change-handling">
<title>Bins manage states of their children</title>
<para>
Bins manage the state of all elements contained in them. If you set
a bin (or a pipeline, which is a special top-level type of bin) to
a certain target state using <function>gst_element_set_state ()</function>,
it will make sure all elements contained within it will also be set
to this state. This means it's usually only necessary to set the state
of the top-level pipeline to start up the pipeline or shut it down.
</para>
<para>
The bin will perform the state changes on all its children from the
sink element to the source element. This ensures that the downstream
element is ready to receive data when the upstream element is brought
to PAUSED or PLAYING. Similarly when shutting down, the sink elements
will be set to READY or NULL first, which will cause the upstream
elements to receive a FLUSHING error and stop the streaming threads
before the elements are set to the READY or NULL state.
</para>
<para>
Note, however, that if elements are added to a bin or pipeline that's
already running, e.g. from within a "pad-added"
signal callback, their state will not automatically be brought in line
with the current state or target state of the bin or pipeline they were
added to. Instead, you have to set such an element to the desired
target state yourself using <function>gst_element_set_state ()</function>
or <function>gst_element_sync_state_with_parent ()</function>.
</para>
</sect1>
</chapter>


@ -1,289 +0,0 @@
<chapter id="chapter-bus">
<title>Bus</title>
<para>
A bus is a simple system that takes care of forwarding messages from
the streaming threads to an application in its own thread context. The
advantage of a bus is that an application does not need to be
thread-aware in order to use &GStreamer;, even though &GStreamer;
itself is heavily threaded.
</para>
<para>
Every pipeline contains a bus by default, so applications do not need
to create a bus or anything. The only thing applications should do is
set a message handler on a bus, which is similar to a signal handler
to an object. When the mainloop is running, the bus will periodically
be checked for new messages, and the callback will be called when any
message is available.
</para>
<sect1 id="section-bus-howto">
<title>How to use a bus</title>
<para>
There are two different ways to use a bus:
<itemizedlist>
<listitem>
<para>
Run a GLib/Gtk+ main loop (or iterate the default GLib main
context yourself regularly) and attach some kind of watch to the
bus. This way the GLib main loop will check the bus for new
messages and notify you whenever there are messages.
</para>
<para>
Typically you would use <function>gst_bus_add_watch ()</function>
or <function>gst_bus_add_signal_watch ()</function> in this case.
</para>
<para>
To use a bus, attach a message handler to the bus of a pipeline
using <function>gst_bus_add_watch ()</function>. This handler will
be called whenever the pipeline emits a message to the bus. In this
handler, check the signal type (see next section) and do something
accordingly. Return TRUE from the handler to keep it attached to
the bus, or FALSE to remove it.
</para>
</listitem>
<listitem>
<para>
Check for messages on the bus yourself. This can be done using
<function>gst_bus_peek ()</function> and/or
<function>gst_bus_poll ()</function>.
</para>
</listitem>
</itemizedlist>
</para>
<programlisting><!-- example-begin bus.c a -->
#include &lt;gst/gst.h&gt;
static GMainLoop *loop;
static gboolean
my_bus_callback (GstBus *bus,
GstMessage *message,
gpointer data)
{
g_print ("Got %s message\n", GST_MESSAGE_TYPE_NAME (message));
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gst_message_parse_error (message, &amp;err, &amp;debug);
g_print ("Error: %s\n", err-&gt;message);
g_error_free (err);
g_free (debug);
g_main_loop_quit (loop);
break;
}
case GST_MESSAGE_EOS:
/* end-of-stream */
g_main_loop_quit (loop);
break;
default:
/* unhandled message */
break;
}
/* we want to be notified again the next time there is a message
 * on the bus, so we return TRUE (FALSE means we want to stop watching
 * for messages on the bus and our callback should not be called again)
 */
return TRUE;
}
gint
main (gint argc,
gchar *argv[])
{
GstElement *pipeline;
GstBus *bus;
guint bus_watch_id;
/* init */
gst_init (&amp;argc, &amp;argv);
/* create pipeline, add handler */
pipeline = gst_pipeline_new ("my_pipeline");
/* adds a watch for new messages on our pipeline&apos;s message bus to
* the default GLib main context, which is the main context that our
* GLib main loop is attached to below
*/
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
bus_watch_id = gst_bus_add_watch (bus, my_bus_callback, NULL);
gst_object_unref (bus);
<!-- example-end bus.c a -->
[..]<!-- example-begin bus.c b --><!-- example-end bus.c b -->
<!-- example-begin bus.c c -->
/* create a mainloop that runs/iterates the default GLib main context
* (context NULL), in other words: makes the context check if anything
* it watches for has happened. When a message has been posted on the
* bus, the default main context will automatically call our
* my_bus_callback() function to notify us of that message.
* The main loop will be run until someone calls g_main_loop_quit()
*/
loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (loop);
/* clean up */
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
g_source_remove (bus_watch_id);
g_main_loop_unref (loop);
return 0;
}
<!-- example-end bus.c c -->
</programlisting>
<para>
It is important to know that the handler will be called in the thread
context of the mainloop. This means that the interaction between the
pipeline and application over the bus is
<emphasis>asynchronous</emphasis>, and thus not suited for some
real-time purposes, such as cross-fading between audio tracks, doing
(theoretically) gapless playback or video effects. All such things
should be done in the pipeline context, which is easiest by writing
a &GStreamer; plug-in. It is very useful for its primary purpose,
though: passing messages from pipeline to application.
The advantage of this approach is that all the threading that
&GStreamer; does internally is hidden from the application and the
application developer does not have to worry about thread issues at
all.
</para>
<para>
Note that if you're using the default GLib mainloop integration, you
can, instead of attaching a watch, connect to the <quote>message</quote>
signal on the bus. This way you don't have to
<function>switch()</function>
on all possible message types; just connect to the interesting signals
in form of <quote>message::&lt;type&gt;</quote>, where &lt;type&gt;
is a specific message type (see the next section for an explanation of
message types).
</para>
<para>
The above snippet could then also be written as:
</para>
<programlisting>
GstBus *bus;
[..]
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_signal_watch (bus);
g_signal_connect (bus, "message::error", G_CALLBACK (cb_message_error), NULL);
g_signal_connect (bus, "message::eos", G_CALLBACK (cb_message_eos), NULL);
[..]
</programlisting>
<para>
If you aren't using the GLib mainloop, the asynchronous message signals won't
be available by default. You can however install a custom sync handler
that wakes up the custom mainloop and that uses
<function>gst_bus_async_signal_func ()</function> to emit the signals.
(see also <ulink type="http"
url="&URLAPI;GstBus.html">documentation</ulink> for details)
</para>
</sect1>
<sect1 id="section-bus-message-types">
<title>Message types</title>
<para>
&GStreamer; has a few pre-defined message types that can be passed
over the bus. The messages are extensible, however. Plug-ins can
define additional messages, and applications can decide to either
have specific code for those or ignore them. All applications are
strongly recommended to at least handle error messages by providing
visual feedback to the user.
</para>
<para>
All messages have a message source, type and timestamp. The message
source can be used to see which element emitted the message. For some
message types, only those emitted by the top-level pipeline will be
interesting to most applications (e.g. state-change
notifications). Below is a list of all messages and a short explanation
of what they do and how to parse message-specific content.
</para>
<itemizedlist>
<listitem>
<para>
Error, warning and information notifications: those are used
by elements if a message should be shown to the user about the
state of the pipeline. Error messages are fatal and terminate
the data-passing. The error should be repaired to resume pipeline
activity. Warnings are not fatal, but imply a problem nevertheless.
Information messages are for non-problem notifications. All those
messages contain a <classname>GError</classname> with the main
error type and message, and optionally a debug string. These
can be extracted using <function>gst_message_parse_error
()</function>, <function>gst_message_parse_warning ()</function> and
<function>gst_message_parse_info ()</function>. The error and debug
strings should be freed after use.
</para>
</listitem>
<listitem>
<para>
End-of-stream notification: this is emitted when the stream has
ended. The state of the pipeline will not change, but further
media handling will stall. Applications can use this to skip to
the next song in their playlist. After end-of-stream, it is also
possible to seek back in the stream. Playback will then continue
automatically. This message has no specific arguments.
</para>
</listitem>
<listitem>
<para>
Tags: emitted when metadata was found in the stream. This can be
emitted multiple times for a pipeline (e.g. once for descriptive
metadata such as artist name or song title, and another one for
stream-information, such as samplerate and bitrate). Applications
should cache metadata internally. <function>gst_message_parse_tag
()</function> should be used to parse the taglist, which should
be <function>gst_tag_list_unref ()</function>'ed when no longer
needed.
</para>
</listitem>
<listitem>
<para>
State-changes: emitted after a successful state change.
<function>gst_message_parse_state_changed ()</function> can be
used to parse the old and new state of this transition.
</para>
</listitem>
<listitem>
<para>
Buffering: emitted during caching of network-streams. One can
manually extract the progress (in percent) from the message by
extracting the <quote>buffer-percent</quote> property from the
structure returned by <function>gst_message_get_structure
()</function>. See also <xref linkend="chapter-buffering"/>.
</para>
</listitem>
<listitem>
<para>
Element messages: these are special messages that are unique to
certain elements and usually represent additional features. The
element's documentation should mention in detail which
element messages a particular element may send. As an example,
the 'qtdemux' QuickTime demuxer element may send a 'redirect'
element message on certain occasions if the stream contains a
redirect instruction.
</para>
</listitem>
<listitem>
<para>
Application-specific messages: any information on those can
be extracted by getting the message structure (see above) and
reading its fields. Usually these messages can safely be ignored.
</para>
<para>
Application messages are primarily meant for internal
use in applications in case the application needs to marshal
information from some thread into the main thread. This is
particularly useful when the application is making use of element
signals (as those signals will be emitted in the context of the
streaming thread).
</para>
</listitem>
</itemizedlist>
</sect1>
</chapter>


@ -1,101 +0,0 @@
<chapter id="chapter-data">
<title>Buffers and Events</title>
<para>
The data flowing through a pipeline consists of a combination of
buffers and events. Buffers contain the actual media data. Events
contain control information, such as seeking information and
end-of-stream notifiers. All this will flow through the pipeline
automatically when it's running. This chapter is mostly meant to
explain the concept to you; you don't need to do anything for this.
</para>
<sect1 id="section-buffers">
<title>Buffers</title>
<para>
Buffers contain the data that will flow through the pipeline you have
created. A source element will typically create a new buffer and pass
it through a pad to the next element in the chain. When using the
GStreamer infrastructure to create a media pipeline you will not have
to deal with buffers yourself; the elements will do that for you.
</para>
<para>
A buffer consists, amongst others, of:
</para>
<itemizedlist>
<listitem>
<para>
Pointers to memory objects. Memory objects encapsulate a region
in the memory.
</para>
</listitem>
<listitem>
<para>
A timestamp for the buffer.
</para>
</listitem>
<listitem>
<para>
A refcount that indicates how many elements are using this
buffer. This refcount will be used to destroy the buffer when no
element has a reference to it.
</para>
</listitem>
<listitem>
<para>
Buffer flags.
</para>
</listitem>
</itemizedlist>
<para>
The simple case is that a buffer is created, memory allocated, data
put in it, and passed to the next element. That element reads the
data, does something (like creating a new buffer and decoding into
it), and unreferences the buffer. This causes the data to be freed
and the buffer to be destroyed. A typical video or audio decoder
works like this.
</para>
<para>
There are more complex scenarios, though. Elements can modify buffers
in-place, i.e. without allocating a new one. Elements can also write
to hardware memory (such as from video-capture sources) or memory
allocated from the X-server (using XShm). Buffers can be read-only,
and so on.
</para>
</sect1>
<sect1 id="section-events">
<title>Events</title>
<para>
Events are control particles that are sent both up- and downstream in
a pipeline along with buffers. Downstream events notify fellow elements
of stream states. Possible events include seeking, flushes,
end-of-stream notifications and so on. Upstream events are used both
in application-element interaction as well as element-element interaction
to request changes in stream state, such as seeks. For applications,
only upstream events are important. Downstream events are just
explained to get a more complete picture of the data concept.
</para>
<para>
Since most applications seek in time units, our example below does so
too:
</para>
<programlisting>
static void
seek_to_time (GstElement *element,
guint64 time_ns)
{
GstEvent *event;
event = gst_event_new_seek (1.0, GST_FORMAT_TIME,
GST_SEEK_FLAG_NONE,
GST_SEEK_TYPE_SET, time_ns,
GST_SEEK_TYPE_NONE, G_GUINT64_CONSTANT (0));
gst_element_send_event (element, event);
}
</programlisting>
<para>
The function <function>gst_element_seek ()</function> is a shortcut
for this; the explicit version above is mostly meant to show how it
all works.
</para>
</sect1>
</chapter>


@ -1,567 +0,0 @@
<chapter id="chapter-elements" xreflabel="Elements">
<title>Elements</title>
<para>
The most important object in &GStreamer; for the application programmer
is the <ulink type="http"
url="&URLAPI;GstElement.html"><classname>GstElement</classname></ulink>
object. An element is the basic building block for a media pipeline. All
the different high-level components you will use are derived from
<classname>GstElement</classname>. Every decoder, encoder, demuxer, video
or audio output is in fact a <classname>GstElement</classname>
</para>
<sect1 id="section-elements-design" xreflabel="What are elements?">
<title>What are elements?</title>
<para>
For the application programmer, elements are best visualized as black
boxes. On the one end, you might put something in, the element does
something with it and something else comes out at the other side. For
a decoder element, for example, you'd put in encoded data, and the
element would output decoded data. In the next chapter (see <xref
linkend="chapter-pads"/>), you will learn more about data input and
output in elements, and how you can set that up in your application.
</para>
<sect2 id="section-elements-src">
<title>Source elements</title>
<para>
Source elements generate data for use by a pipeline, for example
reading from disk or from a sound card. <xref
linkend="section-element-srcimg"/> shows how we will visualise
a source element. We always draw a source pad to the right of
the element.
</para>
<figure float="1" id="section-element-srcimg">
<title>Visualisation of a source element</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/src-element.&image;"
format="&IMAGE;"/>
</imageobject>
</mediaobject>
</figure>
<para>
Source elements do not accept data, they only generate data. You can
see this in the figure because it only has a source pad (on the
right). A source pad can only generate data.
</para>
</sect2>
<sect2 id="section-elements-filter">
<title>Filters, convertors, demuxers, muxers and codecs</title>
<para>
Filters and filter-like elements have both input and output pads.
They operate on data that they receive on their input (sink) pads,
and will provide data on their output (source) pads. Examples of
such elements are a volume element (filter), a video scaler
(convertor), an Ogg demuxer or a Vorbis decoder.
</para>
<para>
Filter-like elements can have any number of source or sink pads. A
video demuxer, for example, would have one sink pad and several
(1-N) source pads, one for each elementary stream contained in the
container format. Decoders, on the other hand, will typically have
just one sink pad and one source pad.
</para>
<figure float="1" id="section-element-filterimg">
<title>Visualisation of a filter element</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/filter-element.&image;"
format="&IMAGE;"/>
</imageobject>
</mediaobject>
</figure>
<para>
<xref linkend="section-element-filterimg"/> shows how we will
visualise a filter-like element. This specific element has one source
pad and one sink pad. Sink pads, receiving input data, are depicted
at the left of the element; source pads are still on the right.
</para>
<figure float="1" id="section-element-multifilterimg">
<title>Visualisation of a filter element with
more than one output pad</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/filter-element-multi.&image;"
format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
<xref linkend="section-element-multifilterimg"/> shows another
filter-like element, this one having more than one output (source)
pad. An example of one such element could, for example, be an Ogg
demuxer for an Ogg stream containing both audio and video. One
source pad will contain the elementary video stream, another will
contain the elementary audio stream. Demuxers will generally fire
signals when a new pad is created. The application programmer can
then handle the new elementary stream in the signal handler.
</para>
</sect2>
<sect2 id="section-elements-sink">
<title>Sink elements</title>
<para>
Sink elements are end points in a media pipeline. They accept
data but do not produce anything. Disk writing, soundcard playback,
and video output would all be implemented by sink elements.
<xref linkend="section-element-sinkimg"/> shows a sink element.
</para>
<figure float="1" id="section-element-sinkimg">
<title>Visualisation of a sink element</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/sink-element.&image;"
format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
</sect2>
</sect1>
<sect1 id="section-elements-create">
<title>Creating a <classname>GstElement</classname></title>
<para>
The simplest way to create an element is to use <ulink type="http"
url="&URLAPI;GstElementFactory.html#gst-element-factory-make"><function>gst_element_factory_make
()</function></ulink>. This function takes a factory name and an
element name for the newly created element. The name of the element
is something you can use later on to look up the element in a bin,
for example. The name will also be used in debug output. You can
pass <symbol>NULL</symbol> as the name argument to get a unique,
default name.
</para>
<para>
When you don't need the element anymore, you need to unref it using
<ulink type="http"
url="&URLAPI;GstObject.html#gst-object-unref"><function>gst_object_unref
()</function></ulink>. This decreases the reference count for the
element by 1. An element has a refcount of 1 when it gets created.
An element gets destroyed completely when the refcount is decreased
to 0.
</para>
<para>
The following example &EXAFOOT; shows how to create an element named
<emphasis>source</emphasis> from the element factory named
<emphasis>fakesrc</emphasis>. It checks if the creation succeeded.
After checking, it unrefs the element.
</para>
<programlisting><!-- example-begin elementmake.c --><![CDATA[
#include <gst/gst.h>
int
main (int argc,
char *argv[])
{
GstElement *element;
/* init GStreamer */
gst_init (&argc, &argv);
/* create element */
element = gst_element_factory_make ("fakesrc", "source");
if (!element) {
g_print ("Failed to create element of type 'fakesrc'\n");
return -1;
}
gst_object_unref (GST_OBJECT (element));
return 0;
}
]]><!-- example-end elementmake.c --></programlisting>
<para>
<function>gst_element_factory_make</function> is actually a shorthand
for a combination of two functions. A <ulink type="http"
url="&URLAPI;GstElement.html"><classname>GstElement</classname></ulink>
object is created from a factory. To create the element, you have to
get access to a <ulink type="http"
url="&URLAPI;GstElementFactory.html"><classname>GstElementFactory</classname></ulink>
object using a unique factory name. This is done with <ulink type="http"
url="&URLAPI;GstElementFactory.html#gst-element-factory-find"><function>gst_element_factory_find
()</function></ulink>.
</para>
<para>
The following code fragment is used to get a factory that can be used
to create the <emphasis>fakesrc</emphasis> element, a fake data source.
The function <ulink type="http"
url="&URLAPI;GstElementFactory.html#gst-element-factory-create"><function>gst_element_factory_create
()</function></ulink> will use the element factory to create an
element with the given name.
</para>
<programlisting><!-- example-begin elementcreate.c --><![CDATA[
#include <gst/gst.h>
int
main (int argc,
char *argv[])
{
GstElementFactory *factory;
GstElement * element;
/* init GStreamer */
gst_init (&argc, &argv);
/* create element, method #2 */
factory = gst_element_factory_find ("fakesrc");
if (!factory) {
g_print ("Failed to find factory of type 'fakesrc'\n");
return -1;
}
element = gst_element_factory_create (factory, "source");
if (!element) {
g_print ("Failed to create element, even though its factory exists!\n");
return -1;
}
gst_object_unref (GST_OBJECT (element));
return 0;
}
]]><!-- example-end elementcreate.c --></programlisting>
</sect1>
<sect1 id="section-elements-properties">
<title>Using an element as a <classname>GObject</classname></title>
<para>
A <ulink type="http"
url="&URLAPI;GstElement.html"><classname>GstElement</classname></ulink>
can have several properties which are implemented using standard
<classname>GObject</classname> properties. The usual
<classname>GObject</classname> methods to query, set and get
property values and <classname>GParamSpecs</classname> are
therefore supported.
</para>
<para>
Every <classname>GstElement</classname> inherits at least one
property from its parent <classname>GstObject</classname>: the
"name" property. This is the name you provide to the functions
<function>gst_element_factory_make ()</function> or
<function>gst_element_factory_create ()</function>. You can get
and set this property using the functions
<function>gst_object_set_name</function> and
<function>gst_object_get_name</function> or use the
<classname>GObject</classname> property mechanism as shown below.
</para>
<programlisting><!-- example-begin elementget.c --><![CDATA[
#include <gst/gst.h>
int
main (int argc,
char *argv[])
{
GstElement *element;
gchar *name;
/* init GStreamer */
gst_init (&argc, &argv);
/* create element */
element = gst_element_factory_make ("fakesrc", "source");
/* get name */
g_object_get (G_OBJECT (element), "name", &name, NULL);
g_print ("The name of the element is '%s'.\n", name);
g_free (name);
gst_object_unref (GST_OBJECT (element));
return 0;
}
]]><!-- example-end elementget.c --></programlisting>
<para>
Most plugins provide additional properties to provide more information
about their configuration or to configure the element.
<command>gst-inspect</command> is a useful tool to query the properties
of a particular element, it will also use property introspection to give
a short explanation about the function of the property and about the
parameter types and ranges it supports. See
<xref linkend="section-applications-inspect"/>
in the appendix for details about <command>gst-inspect</command>.
</para>
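<para>
As an illustration, the following sketch sets the
<quote>num-buffers</quote> property of a <emphasis>fakesrc</emphasis>
element (one of the properties <command>gst-inspect</command> lists
for it) and reads it back using the standard
<classname>GObject</classname> mechanism:
</para>
<programlisting><![CDATA[
#include <gst/gst.h>
int
main (int argc,
      char *argv[])
{
  GstElement *element;
  gint num_buffers;
  /* init GStreamer */
  gst_init (&argc, &argv);
  /* create element */
  element = gst_element_factory_make ("fakesrc", "source");
  /* set a property, then read it back */
  g_object_set (G_OBJECT (element), "num-buffers", 16, NULL);
  g_object_get (G_OBJECT (element), "num-buffers", &num_buffers, NULL);
  g_print ("fakesrc will produce %d buffers\n", num_buffers);
  gst_object_unref (GST_OBJECT (element));
  return 0;
}
]]></programlisting>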
<para>
For more information about <classname>GObject</classname>
properties we recommend you read the <ulink
url="http://developer.gnome.org/gobject/stable/rn01.html"
type="http">GObject manual</ulink> and an introduction to <ulink
url="http://developer.gnome.org/gobject/stable/pt01.html" type="http">
The Glib Object system</ulink>.
</para>
<para>
A <ulink type="http" url="&URLAPI;GstElement.html">
<classname>GstElement</classname></ulink> also provides various
<classname>GObject</classname> signals that can be used as a flexible
callback mechanism. Here, too, you can use <command>gst-inspect</command>
to see which signals a specific element supports. Together, signals
and properties are the most basic way in which elements and
applications interact.
</para>
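<para>
As a small sketch, the <emphasis>fakesrc</emphasis> element emits a
<quote>handoff</quote> signal for every buffer it produces, provided
its <quote>signal-handoffs</quote> property is enabled. Connecting to
an element signal uses the ordinary
<function>g_signal_connect ()</function> mechanism:
</para>
<programlisting><![CDATA[
/* called for every buffer that fakesrc pushes out */
static void
cb_handoff (GstElement *src,
            GstBuffer  *buffer,
            GstPad     *pad,
            gpointer    user_data)
{
  g_print ("fakesrc produced a buffer\n");
}

static void
setup_handoff_signal (GstElement *fakesrc)
{
  /* the "handoff" signal only fires if signal-handoffs is TRUE */
  g_object_set (fakesrc, "signal-handoffs", TRUE, NULL);
  g_signal_connect (fakesrc, "handoff", G_CALLBACK (cb_handoff), NULL);
}
]]></programlisting>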
</sect1>
<sect1 id="section-elements-factories">
<title>More about element factories</title>
<para>
In the previous section, we briefly introduced the <ulink type="http"
url="&URLAPI;GstElementFactory.html"><classname>GstElementFactory</classname></ulink>
object already as a way to create instances of an element. Element
factories, however, are much more than just that. Element factories
are the basic types retrieved from the &GStreamer; registry, they
describe all plugins and elements that &GStreamer; can create. This
means that element factories are useful for automated element
instancing, such as what autopluggers do, and for creating lists
of available elements.
</para>
<sect2 id="section-elements-factories-details">
<title>Getting information about an element using a factory</title>
<para>
Tools like <command>gst-inspect</command> will provide some generic
information about an element, such as the person that wrote the
plugin, a descriptive name (and a shortname), a rank and a category.
The category can be used to get the type of the element that can
be created using this element factory. Examples of categories include
<classname>Codec/Decoder/Video</classname> (video decoder),
<classname>Codec/Encoder/Video</classname> (video encoder),
<classname>Source/Video</classname> (a video generator),
<classname>Sink/Video</classname> (a video output), and all these
exist for audio as well, of course. Then, there's also
<classname>Codec/Demuxer</classname> and
<classname>Codec/Muxer</classname> and a whole lot more.
<command>gst-inspect</command> will give a list of all factories, and
<command>gst-inspect &lt;factory-name&gt;</command> will list all
of the above information, and a lot more.
</para>
<programlisting><!-- example-begin elementfactory.c --><![CDATA[
#include <gst/gst.h>
int
main (int argc,
char *argv[])
{
GstElementFactory *factory;
/* init GStreamer */
gst_init (&argc, &argv);
/* get factory */
factory = gst_element_factory_find ("fakesrc");
if (!factory) {
g_print ("You don't have the 'fakesrc' element installed!\n");
return -1;
}
/* display information */
g_print ("The '%s' element is a member of the category %s.\n"
"Description: %s\n",
gst_plugin_feature_get_name (GST_PLUGIN_FEATURE (factory)),
gst_element_factory_get_metadata (factory, GST_ELEMENT_METADATA_KLASS),
gst_element_factory_get_metadata (factory, GST_ELEMENT_METADATA_DESCRIPTION));
return 0;
}
]]><!-- example-end elementfactory.c --></programlisting>
<para>
You can use <function>gst_registry_get_feature_list (gst_registry_get (),
GST_TYPE_ELEMENT_FACTORY)</function> to get a list of all the element
factories that &GStreamer; knows about.
</para>
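<para>
A minimal sketch of listing all element factories, assuming the
default registry returned by <function>gst_registry_get ()</function>:
</para>
<programlisting><![CDATA[
#include <gst/gst.h>
int
main (int argc,
      char *argv[])
{
  GList *features, *l;
  /* init GStreamer */
  gst_init (&argc, &argv);
  /* ask the registry for all element factories it knows about */
  features = gst_registry_get_feature_list (gst_registry_get (),
      GST_TYPE_ELEMENT_FACTORY);
  for (l = features; l != NULL; l = l->next) {
    GstPluginFeature *feature = GST_PLUGIN_FEATURE (l->data);
    g_print ("%s\n", gst_plugin_feature_get_name (feature));
  }
  gst_plugin_feature_list_free (features);
  return 0;
}
]]></programlisting>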
</sect2>
<sect2 id="section-elements-factories-padtemplates">
<title>Finding out what pads an element can contain</title>
<para>
Perhaps the most powerful feature of element factories is that
they contain a full description of the pads that the element
can generate, and the capabilities of those pads (in layman's terms:
what types of media can stream over those pads), without actually
having to load those plugins into memory. This can be used
to provide a codec selection list for encoders, or it can be used
for autoplugging purposes for media players. All current
&GStreamer;-based media players and autopluggers work this way.
We'll look closer at these features as we learn about
<classname>GstPad</classname> and <classname>GstCaps</classname>
in the next chapter: <xref linkend="chapter-pads"/>.
</para>
</sect2>
</sect1>
<sect1 id="section-elements-link" xreflabel="Linking elements">
<title>Linking elements</title>
<para>
By linking a source element with zero or more filter-like
elements and finally a sink element, you set up a media
pipeline. Data will flow through the elements. This is the
basic concept of media handling in &GStreamer;.
</para>
<figure float="1" id="section-link">
<title>Visualisation of three linked elements</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/linked-elements.&image;"
format="&IMAGE;"/>
</imageobject>
</mediaobject>
</figure>
<para>
By linking these three elements, we have created a very simple
chain of elements. The effect of this will be that the output of
the source element (<quote>element1</quote>) will be used as input
for the filter-like element (<quote>element2</quote>). The
filter-like element will do something with the data and send the
result to the final sink element (<quote>element3</quote>).
</para>
<para>
Imagine the above graph as a simple Ogg/Vorbis audio decoder. The
source is a disk source which reads the file from disc. The second
element is an Ogg/Vorbis audio decoder. The sink element is your
soundcard, playing back the decoded audio data. We will use this
simple graph to construct an Ogg/Vorbis player later in this manual.
</para>
<para>
In code, the above graph is written like this:
</para>
<programlisting><!-- example-begin elementlink.c a -->
#include &lt;gst/gst.h&gt;
int
main (int argc,
char *argv[])
{
GstElement *pipeline;
GstElement *source, *filter, *sink;
/* init */
gst_init (&amp;argc, &amp;argv);
/* create pipeline */
pipeline = gst_pipeline_new ("my-pipeline");
/* create elements */
source = gst_element_factory_make ("fakesrc", "source");
filter = gst_element_factory_make ("identity", "filter");
sink = gst_element_factory_make ("fakesink", "sink");
/* must add elements to pipeline before linking them */
gst_bin_add_many (GST_BIN (pipeline), source, filter, sink, NULL);
/* link */
if (!gst_element_link_many (source, filter, sink, NULL)) {
g_warning ("Failed to link elements!");
}
<!-- example-end elementlink.c a -->
[..]<!-- example-begin elementlink.c b --><!--
return 0;
--><!-- example-end elementlink.c b -->
<!-- example-begin elementlink.c c -->
}
<!-- example-end elementlink.c c --></programlisting>
<para>
For more specific behaviour, there are also the functions
<function>gst_element_link ()</function> and
<function>gst_element_link_pads ()</function>. You can also obtain
references to individual pads and link those using various
<function>gst_pad_link_* ()</function> functions. See the API
references for more details.
</para>
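<para>
For instance, <function>gst_element_link_pads ()</function> links two
elements by pad name, which is useful when an element has several pads
and automatic pad selection would be ambiguous. A short sketch:
</para>
<programlisting><![CDATA[
static gboolean
link_by_pad_name (GstElement *source, GstElement *demux)
{
  /* link the "src" pad of the source to the "sink" pad of the
   * demuxer by name; both elements must already be in the same bin */
  return gst_element_link_pads (source, "src", demux, "sink");
}
]]></programlisting>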
<para>
Important: you must add elements to a bin or pipeline
<emphasis>before</emphasis> linking them, since adding an element to
a bin will disconnect any already existing links. Also, you cannot
directly link elements that are not in the same bin or pipeline; if
you want to link elements or pads at different hierarchy levels, you
will need to use ghost pads (more about ghost pads later,
see <xref linkend="section-pads-ghost"/>).
</para>
</sect1>
<sect1 id="section-elements-states">
<title>Element States</title>
<para>
After being created, an element will not actually perform any actions
yet. You need to change an element's state to make it do something.
&GStreamer; knows four element states, each with a very specific
meaning. Those four states are:
</para>
<itemizedlist>
<listitem>
<para>
<classname>GST_STATE_NULL</classname>: this is the default state.
No resources are allocated in this state, so, transitioning to it
will free all resources. The element must be in this state when
its refcount reaches 0 and it is freed.
</para>
</listitem>
<listitem>
<para>
<classname>GST_STATE_READY</classname>: in the ready state, an
element has allocated all of its global resources, that is,
resources that can be kept across streams. You can think of
opening devices, allocating buffers and so on. However, the
stream is not opened in this state, so the stream position is
automatically zero. If a stream was previously opened, it should
be closed in this state, and position, properties and such should
be reset.
</para>
</listitem>
<listitem>
<para>
<classname>GST_STATE_PAUSED</classname>: in this state, an
element has opened the stream, but is not actively processing
it. An element is allowed to modify a stream's position, read
and process data and such to prepare for playback as soon as
state is changed to PLAYING, but it is <emphasis>not</emphasis>
allowed to play the data which would make the clock run.
In summary, PAUSED is the same as PLAYING but without a running
clock.
</para>
<para>
Elements going into the PAUSED state should prepare themselves
for moving over to the PLAYING state as soon as possible. Video
or audio outputs would, for example, wait for data to arrive and
queue it so they can play it right after the state change. Also,
video sinks can already play the first frame (since this does
not affect the clock yet). Autopluggers could use this same
state transition to already plug together a pipeline. Most other
elements, such as codecs or filters, do not need to explicitly
do anything in this state, however.
</para>
</listitem>
<listitem>
<para>
<classname>GST_STATE_PLAYING</classname>: in the PLAYING state,
an element does exactly the same as in the PAUSED state, except
that the clock now runs.
</para>
</listitem>
</itemizedlist>
<para>
You can change the state of an element using the function
<function>gst_element_set_state ()</function>. If you set an element
to another state, &GStreamer; will internally traverse all intermediate
states. So if you set an element from NULL to PLAYING, &GStreamer;
will internally set the element to READY and PAUSED in between.
</para>
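<para>
Note that <function>gst_element_set_state ()</function> returns a
<classname>GstStateChangeReturn</classname>, so the application can
check whether the state change succeeded, failed, or will complete
asynchronously. A minimal sketch:
</para>
<programlisting><![CDATA[
static gboolean
start_pipeline (GstElement *pipeline)
{
  GstStateChangeReturn ret;

  ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_printerr ("Failed to set pipeline to PLAYING!\n");
    return FALSE;
  }
  /* GST_STATE_CHANGE_ASYNC means the transition completes later;
   * the result is then posted on the pipeline's bus */
  return TRUE;
}
]]></programlisting>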
<para>
When moved to <classname>GST_STATE_PLAYING</classname>, pipelines
will process data automatically. They do not need to be iterated in
any form. Internally, &GStreamer; will start threads that perform this
task. &GStreamer; will also take care of switching
messages from the pipeline's thread into the application's own
thread, by using a <ulink type="http"
url="&URLAPI;GstBus.html"><classname>GstBus</classname></ulink>. See
<xref linkend="chapter-bus"/> for details.
</para>
<para>
When you set a bin or pipeline to a certain target state, it will usually
propagate the state change to all elements within the bin or pipeline
automatically, so it's usually only necessary to set the state of the
top-level pipeline to start up the pipeline or shut it down. However,
when adding elements dynamically to an already-running pipeline, e.g.
from within a "pad-added" signal callback, you need to set the state
of the newly-added elements to the desired target state yourself using
<function>gst_element_set_state ()</function> or
<function>gst_element_sync_state_with_parent ()</function>.
</para>
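<para>
A minimal sketch of adding an element to a pipeline that is already
running:
</para>
<programlisting><![CDATA[
static void
add_to_running_pipeline (GstElement *pipeline,
                         GstElement *element)
{
  /* the newly created element is still in the NULL state */
  gst_bin_add (GST_BIN (pipeline), element);

  /* bring it to the same state as its new parent */
  if (!gst_element_sync_state_with_parent (element))
    g_warning ("Failed to sync element state with pipeline!");
}
]]></programlisting>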
</sect1>
</chapter>

<chapter id="chapter-helloworld">
<title>Your first application</title>
<para>
This chapter will summarize everything you've learned in the previous
chapters. It describes all aspects of a simple &GStreamer; application,
including initializing libraries, creating elements, packing elements
together in a pipeline and playing this pipeline. By doing all this,
you will be able to build a simple Ogg/Vorbis audio player.
</para>
<sect1 id="section-helloworld">
<title>Hello world</title>
<para>
We're going to create a simple first application, a simple Ogg/Vorbis
command-line audio player. For this, we will use only standard
&GStreamer; components. The player will read a file specified on
the command-line. Let's get started!
</para>
<para>
We've learned, in <xref linkend="chapter-init"/>, that the first thing
to do in your application is to initialize &GStreamer; by calling
<function>gst_init ()</function>. Also, make sure that the application
includes <filename>gst/gst.h</filename> so all function names and
objects are properly defined. Use <function>#include
&lt;gst/gst.h&gt;</function> to do that.
</para>
<para>
Next, you'll want to create the different elements using
<function>gst_element_factory_make ()</function>. For an Ogg/Vorbis
audio player, we'll need a source element that reads files from a
disk. &GStreamer; includes this element under the name
<quote>filesrc</quote>. Next, we'll need something to parse the
file and decode it into raw audio. &GStreamer; has two elements
for this: the first parses Ogg streams into elementary streams (video,
audio) and is called <quote>oggdemux</quote>. The second is a Vorbis
audio decoder, it's conveniently called <quote>vorbisdec</quote>.
Since <quote>oggdemux</quote> creates dynamic pads for each elementary
stream, you'll need to set a <quote>pad-added</quote> event handler
on the <quote>oggdemux</quote> element, like you've learned in
<xref linkend="section-pads-dynamic"/>, to link the Ogg demuxer and
the Vorbis decoder elements together. At last, we'll also need an
audio output element, we will use <quote>autoaudiosink</quote>, which
automatically detects your audio device.
</para>
<para>
The last thing left to do is to add all elements into a container
element, a <classname>GstPipeline</classname>, and wait until
we've played the whole song. We've previously
learned how to add elements to a container bin in <xref
linkend="chapter-bins"/>, and we've learned about element states
in <xref linkend="section-elements-states"/>. We will also attach
a message handler to the pipeline bus so we can retrieve errors
and detect the end-of-stream.
</para>
<para>
Let's now add all the code together to get our very first audio
player:
</para>
<programlisting>
<!-- example-begin helloworld.c -->
#include &lt;gst/gst.h&gt;
#include &lt;glib.h&gt;
static gboolean
bus_call (GstBus *bus,
GstMessage *msg,
gpointer data)
{
GMainLoop *loop = (GMainLoop *) data;
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_EOS:
g_print ("End of stream\n");
g_main_loop_quit (loop);
break;
case GST_MESSAGE_ERROR: {
gchar *debug;
GError *error;
gst_message_parse_error (msg, &amp;error, &amp;debug);
g_free (debug);
g_printerr ("Error: %s\n", error->message);
g_error_free (error);
g_main_loop_quit (loop);
break;
}
default:
break;
}
return TRUE;
}
static void
on_pad_added (GstElement *element,
GstPad *pad,
gpointer data)
{
GstPad *sinkpad;
GstElement *decoder = (GstElement *) data;
/* We can now link this pad with the vorbis-decoder sink pad */
g_print ("Dynamic pad created, linking demuxer/decoder\n");
sinkpad = gst_element_get_static_pad (decoder, "sink");
gst_pad_link (pad, sinkpad);
gst_object_unref (sinkpad);
}
int
main (int argc,
char *argv[])
{
GMainLoop *loop;
GstElement *pipeline, *source, *demuxer, *decoder, *conv, *sink;
GstBus *bus;
guint bus_watch_id;
/* Initialisation */
gst_init (&amp;argc, &amp;argv);
loop = g_main_loop_new (NULL, FALSE);
/* Check input arguments */
if (argc != 2) {
g_printerr ("Usage: %s &lt;Ogg/Vorbis filename&gt;\n", argv[0]);
return -1;
}
/* Create gstreamer elements */
pipeline = gst_pipeline_new ("audio-player");
source = gst_element_factory_make ("filesrc", "file-source");
demuxer = gst_element_factory_make ("oggdemux", "ogg-demuxer");
decoder = gst_element_factory_make ("vorbisdec", "vorbis-decoder");
conv = gst_element_factory_make ("audioconvert", "converter");
sink = gst_element_factory_make ("autoaudiosink", "audio-output");
if (!pipeline || !source || !demuxer || !decoder || !conv || !sink) {
g_printerr ("One element could not be created. Exiting.\n");
return -1;
}
/* Set up the pipeline */
/* we set the input filename to the source element */
g_object_set (G_OBJECT (source), "location", argv[1], NULL);
/* we add a message handler */
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
bus_watch_id = gst_bus_add_watch (bus, bus_call, loop);
gst_object_unref (bus);
/* we add all elements into the pipeline */
/* file-source | ogg-demuxer | vorbis-decoder | converter | audio-output */
gst_bin_add_many (GST_BIN (pipeline),
source, demuxer, decoder, conv, sink, NULL);
/* we link the elements together */
/* file-source -&gt; ogg-demuxer ~&gt; vorbis-decoder -&gt; converter -&gt; audio-output */
gst_element_link (source, demuxer);
gst_element_link_many (decoder, conv, sink, NULL);
g_signal_connect (demuxer, "pad-added", G_CALLBACK (on_pad_added), decoder);
/* note that the demuxer will be linked to the decoder dynamically.
The reason is that Ogg may contain various streams (for example
audio and video). The source pad(s) will be created at run time,
by the demuxer when it detects the amount and nature of streams.
Therefore we connect a callback function which will be executed
when the "pad-added" signal is emitted. */
/* Set the pipeline to "playing" state*/
g_print ("Now playing: %s\n", argv[1]);
gst_element_set_state (pipeline, GST_STATE_PLAYING);
/* Iterate */
g_print ("Running...\n");
g_main_loop_run (loop);
/* Out of the main loop, clean up nicely */
g_print ("Returned, stopping playback\n");
gst_element_set_state (pipeline, GST_STATE_NULL);
g_print ("Deleting pipeline\n");
gst_object_unref (GST_OBJECT (pipeline));
g_source_remove (bus_watch_id);
g_main_loop_unref (loop);
return 0;
}
<!-- example-end helloworld.c -->
</programlisting>
<para>
We now have created a complete pipeline. We can visualise the
pipeline as follows:
</para>
<figure float="1" id="section-hello-img">
<title>The "hello world" pipeline</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/hello-world.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
</sect1>
<sect1 id="section-helloworld-compilerun">
<title>Compiling and Running helloworld.c</title>
<para>
To compile the helloworld example, use: <command>gcc -Wall
helloworld.c -o helloworld
$(pkg-config --cflags --libs gstreamer-&GST_API_VERSION;)</command>.
&GStreamer; makes use of <command>pkg-config</command> to get compiler
and linker flags needed to compile this application.
</para>
<para>
If you're running a non-standard installation (i.e. you've installed
GStreamer from source yourself instead of using pre-built packages),
make sure the <classname>PKG_CONFIG_PATH</classname> environment variable
is set to the correct location (<filename>$libdir/pkgconfig</filename>).
</para>
<para>
In the unlikely case that you are using an uninstalled GStreamer
setup (i.e. gst-uninstalled), you will need to use libtool to build the
hello world program, like this: <command>libtool --mode=link gcc -Wall
helloworld.c -o helloworld
$(pkg-config --cflags --libs gstreamer-&GST_API_VERSION;)</command>.
</para>
<para>
You can run this example application with <command>./helloworld
file.ogg</command>. Substitute <filename>file.ogg</filename>
with your favourite Ogg/Vorbis file.
</para>
</sect1>
<sect1 id="section-hello-world-conclusion">
<title>Conclusion</title>
<para>
This concludes our first example. As you see, setting up a pipeline
is very low-level but powerful. You will see later in this manual how
you can create a more powerful media player with even less effort
using higher-level interfaces. We will discuss all that in <xref
linkend="part-highlevel"/>. We will first, however, go more in-depth
into more advanced &GStreamer; internals.
</para>
<para>
It should be clear from the example that we can very easily replace
the <quote>filesrc</quote> element with some other element that
reads data from a network, or some other data source element that
is better integrated with your desktop environment. Also, you can
use other decoders and parsers/demuxers to support other media types. You
can use another audio sink if you're not running Linux, but Mac OS X,
Windows or FreeBSD, or you can instead use a filesink to write audio
files to disk instead of playing them back. By using an audio card
source, you can even do audio capture instead of playback. All this
shows the reusability of &GStreamer; elements, which is its greatest
advantage.
</para>
</sect1>
</chapter>

<chapter id="chapter-init">
<title>Initializing &GStreamer;</title>
<para>
When writing a &GStreamer; application, you can simply include
<filename>gst/gst.h</filename> to get access to the library
functions. Besides that, you will also need to initialize the
&GStreamer; library.
</para>
<sect1 id="section-init-c">
<title>Simple initialization</title>
<para>
Before the &GStreamer; libraries can be used,
<function>gst_init</function> has to be called from the main
application. This call will perform the necessary initialization
of the library as well as parse the &GStreamer;-specific command
line options.
</para>
<para>
A typical program &EXAFOOT; would have code to initialize
&GStreamer; that looks like this:
</para>
<example id="ex-init-c">
<title>Initializing GStreamer</title>
<programlisting>
<!-- example-begin init.c -->
#include &lt;stdio.h&gt;
#include &lt;gst/gst.h&gt;
int
main (int argc,
char *argv[])
{
const gchar *nano_str;
guint major, minor, micro, nano;
gst_init (&amp;argc, &amp;argv);
gst_version (&amp;major, &amp;minor, &amp;micro, &amp;nano);
if (nano == 1)
nano_str = "(GIT)";
else if (nano == 2)
nano_str = "(Prerelease)";
else
nano_str = "";
printf ("This program is linked against GStreamer %d.%d.%d %s\n",
major, minor, micro, nano_str);
return 0;
}
<!-- example-end init.c -->
</programlisting>
</example>
<para>
Use the <symbol>GST_VERSION_MAJOR</symbol>,
<symbol>GST_VERSION_MINOR</symbol> and <symbol>GST_VERSION_MICRO</symbol>
macros to get the &GStreamer; version you are building against, or
use the function <function>gst_version</function> to get the version
your application is linked against. &GStreamer; currently uses a
scheme where versions with the same major and minor versions are
API-/ and ABI-compatible.
</para>
<para>
It is also possible to call the <function>gst_init</function> function
with two <symbol>NULL</symbol> arguments, in which case no command line
options will be parsed by <application>GStreamer</application>.
</para>
</sect1>
<sect1>
<title>The GOption interface</title>
<para>
You can also use a GOption table to initialize your own parameters as
shown in the next example:
</para>
<example id="ex-goption-c">
<title>Initialisation using the GOption interface</title>
<programlisting>
<!-- example-begin goption.c -->
#include &lt;gst/gst.h&gt;
int
main (int argc,
char *argv[])
{
gboolean silent = FALSE;
gchar *savefile = NULL;
GOptionContext *ctx;
GError *err = NULL;
GOptionEntry entries[] = {
{ "silent", 's', 0, G_OPTION_ARG_NONE, &amp;silent,
"do not output status information", NULL },
{ "output", 'o', 0, G_OPTION_ARG_STRING, &amp;savefile,
"save xml representation of pipeline to FILE and exit", "FILE" },
{ NULL }
};
ctx = g_option_context_new ("- Your application");
g_option_context_add_main_entries (ctx, entries, NULL);
g_option_context_add_group (ctx, gst_init_get_option_group ());
if (!g_option_context_parse (ctx, &amp;argc, &amp;argv, &amp;err)) {
g_print ("Failed to initialize: %s\n", err->message);
g_clear_error (&amp;err);
g_option_context_free (ctx);
return 1;
}
g_option_context_free (ctx);
printf ("Run me with --help to see the Application options appended.\n");
return 0;
}
<!-- example-end goption.c -->
</programlisting>
</example>
<para>
As shown in this fragment, you can use a <ulink
url="http://developer.gnome.org/glib/stable/glib-Commandline-option-parser.html"
type="http">GOption</ulink> table to define your application-specific
command line options, and pass this table to the GLib initialization
function along with the option group returned from the
function <function>gst_init_get_option_group</function>. Your
application options will be parsed in addition to the standard
<application>GStreamer</application> options.
</para>
</sect1>
</chapter>

<chapter id="chapter-pads" xreflabel="Pads and capabilities">
<title>Pads and capabilities</title>
<para>
As we have seen in <xref linkend="chapter-elements"/>, the pads are
the element's interface to the outside world. Data streams from one
element's source pad to another element's sink pad. The specific
type of media that the element can handle will be exposed by the
pad's capabilities. We will talk more on capabilities later in this
chapter (see <xref linkend="section-caps"/>).
</para>
<sect1 id="section-pads">
<title>Pads</title>
<para>
A pad type is defined by two properties: its direction and its
availability. As we've mentioned before, &GStreamer; defines two
pad directions: source pads and sink pads. This terminology is
defined from the view of within the element: elements receive data
on their sink pads and generate data on their source pads.
Schematically, sink pads are drawn on the left side of an element,
whereas source pads are drawn on the right side of an element. In
such graphs, data flows from left to right.
<footnote>
<para>
In reality, there is no objection to data flowing from a
source pad to the sink pad of an element upstream (to the
left of this element in drawings). Data will, however, always
flow from a source pad of one element to the sink pad of
another.
</para>
</footnote>
</para>
<para>
Pad directions are very simple compared to pad availability. A pad
can have any of three availabilities: always, sometimes and on
request. The meaning of those three types is exactly as it says:
always pads always exist, sometimes pads exist only in certain
cases (and can disappear randomly), and on-request pads appear
only if explicitly requested by applications.
</para>
<sect2 id="section-pads-dynamic">
<title>Dynamic (or sometimes) pads</title>
<para>
Some elements might not have all of their pads when the element is
created. This can happen, for example, with an Ogg demuxer element.
The element will read the Ogg stream and create dynamic pads for
each contained elementary stream (vorbis, theora) when it detects
such a stream in the Ogg stream. Likewise, it will delete the pad
when the stream ends. This principle is very useful for demuxer
elements, for example.
</para>
<para>
Running <application>gst-inspect oggdemux</application> will show
that the element has only one pad: a sink pad called 'sink'. The
other pads are <quote>dormant</quote>. You can see this in the pad
template because there is an <quote>Availability: Sometimes</quote>
property. Depending on the type of Ogg file you play, the pads will
be created. We will see that this is very important when you are
going to create dynamic pipelines. You can attach a signal handler
to an element to inform you when the element has created a new pad
from one of its <quote>sometimes</quote> pad templates. The
following piece of code is an example of how to do this:
</para>
<programlisting><!-- example-begin pad.c a -->
#include &lt;gst/gst.h&gt;
static void
cb_new_pad (GstElement *element,
GstPad *pad,
gpointer data)
{
gchar *name;
name = gst_pad_get_name (pad);
g_print ("A new pad %s was created\n", name);
g_free (name);
/* here, you would setup a new pad link for the newly created pad */
<!-- example-end pad.c a -->[..]
<!-- example-begin pad.c b -->
}
int
main (int argc,
char *argv[])
{
GstElement *pipeline, *source, *demux;
GMainLoop *loop;
/* init */
gst_init (&amp;argc, &amp;argv);
/* create elements */
pipeline = gst_pipeline_new ("my_pipeline");
source = gst_element_factory_make ("filesrc", "source");
g_object_set (source, "location", argv[1], NULL);
demux = gst_element_factory_make ("oggdemux", "demuxer");
/* you would normally check that the elements were created properly */
/* put together a pipeline */
gst_bin_add_many (GST_BIN (pipeline), source, demux, NULL);
gst_element_link_pads (source, "src", demux, "sink");
/* listen for newly created pads */
g_signal_connect (demux, "pad-added", G_CALLBACK (cb_new_pad), NULL);
/* start the pipeline */
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
loop = g_main_loop_new (NULL, FALSE);
g_main_loop_run (loop);
<!--example-end pad.c b -->
[..]<!-- example-begin pad.c c --><!--
return 0;
--><!-- example-end pad.c c -->
<!-- example-begin pad.c d -->
}
<!-- example-end pad.c d --></programlisting>
<para>
It is not uncommon to add elements to the pipeline only from within
the "pad-added" callback. If you do this, don't
forget to set the state of the newly-added elements to the target
state of the pipeline using
<function>gst_element_set_state ()</function> or
<function>gst_element_sync_state_with_parent ()</function>.
</para>
</sect2>
<sect2 id="section-pads-request">
<title>Request pads</title>
<para>
An element can also have request pads. These pads are not created
automatically but are only created on demand. This is very useful
for multiplexers, aggregators and tee elements. Aggregators are
elements that merge the content of several input streams together
into one output stream. Tee elements are the reverse: they are
elements that have one input stream and copy this stream to each
of their output pads, which are created on request. Whenever an
application needs another copy of the stream, it can simply request
a new output pad from the tee element.
</para>
<para>
The following piece of code shows how you can request a new output
pad from a <quote>tee</quote> element:
</para>
<programlisting>
static void
some_function (GstElement *tee)
{
GstPad * pad;
gchar *name;
  pad = gst_element_get_request_pad (tee, "src_%u");
name = gst_pad_get_name (pad);
g_print ("A new pad %s was created\n", name);
g_free (name);
/* here, you would link the pad */
[..]
/* and, after doing that, free our reference */
gst_object_unref (GST_OBJECT (pad));
}
</programlisting>
<para>
The <function>gst_element_get_request_pad ()</function> method
can be used to get a pad from the element based on the name of
the pad template. It is also possible to request a pad that is
compatible with another pad template. This is very useful if
you want to link an element to a multiplexer element and you
need to request a pad that is compatible. The method
<function>gst_element_get_compatible_pad ()</function> can be
used to request a compatible pad, as shown in the next example.
      It requests a pad from an Ogg multiplexer that is compatible
      with the given source pad.
</para>
<programlisting>
static void
link_to_multiplexer (GstPad *tolink_pad,
GstElement *mux)
{
GstPad *pad;
gchar *srcname, *sinkname;
srcname = gst_pad_get_name (tolink_pad);
pad = gst_element_get_compatible_pad (mux, tolink_pad);
  gst_pad_link (tolink_pad, pad);
sinkname = gst_pad_get_name (pad);
gst_object_unref (GST_OBJECT (pad));
g_print ("A new pad %s was created and linked to %s\n", sinkname, srcname);
g_free (sinkname);
g_free (srcname);
}
</programlisting>
</sect2>
</sect1>
<sect1 id="section-caps">
<title>Capabilities of a pad</title>
<para>
Since the pads play a very important role in how the element is
viewed by the outside world, a mechanism is implemented to describe
the data that can flow or currently flows through the pad by using
capabilities. Here, we will briefly describe what capabilities are
and how to use them, enough to get an understanding of the concept.
For an in-depth look into capabilities and a list of all capabilities
defined in &GStreamer;, see the <ulink type="http"
url="http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/index.html">Plugin
Writers Guide</ulink>.
</para>
<para>
Capabilities are attached to pad templates and to pads. For pad
templates, it will describe the types of media that may stream
over a pad created from this template. For pads, it can either
be a list of possible caps (usually a copy of the pad template's
capabilities), in which case the pad is not yet negotiated, or it
is the type of media that currently streams over this pad, in
which case the pad has been negotiated already.
</para>
<sect2 id="section-caps-structure">
<title>Dissecting capabilities</title>
<para>
A pad's capabilities are described in a <classname>GstCaps</classname>
object. Internally, a <ulink type="http"
url="&URLAPI;gstreamer-GstCaps.html"><classname>GstCaps</classname></ulink>
will contain one or more <ulink type="http"
url="&URLAPI;gstreamer-GstStructure.html"><classname>GstStructure</classname></ulink>
that will describe one media type. A negotiated pad will have
capabilities set that contain exactly <emphasis>one</emphasis>
structure. Also, this structure will contain only
<emphasis>fixed</emphasis> values. These constraints are not
true for unnegotiated pads or pad templates.
</para>
<para>
As an example, below is a dump of the capabilities of the
<quote>vorbisdec</quote> element, which you will get by running
<command>gst-inspect vorbisdec</command>. You will see two pads:
a source and a sink pad. Both of these pads are always available,
and both have capabilities attached to them. The sink pad will
accept vorbis-encoded audio data, with the media type
<quote>audio/x-vorbis</quote>. The source pad will be used
to send raw (decoded) audio samples to the next element, with
a raw audio media type (in this case,
<quote>audio/x-raw</quote>). The source pad will also
contain properties for the audio samplerate and the amount of
channels, plus some more that you don't need to worry about
for now.
</para>
<programlisting>
Pad Templates:
SRC template: 'src'
Availability: Always
Capabilities:
audio/x-raw
format: F32LE
rate: [ 1, 2147483647 ]
channels: [ 1, 256 ]
SINK template: 'sink'
Availability: Always
Capabilities:
audio/x-vorbis
</programlisting>
</sect2>
<sect2 id="section-caps-props">
<title>Properties and values</title>
<para>
Properties are used to describe extra information for
capabilities. A property consists of a key (a string) and
a value. There are different possible value types that can be used:
</para>
<itemizedlist>
<listitem>
<para>
          Basic types: these can be pretty much any
          <classname>GType</classname> registered with GLib. These
          properties indicate a specific, non-dynamic value for the
          property. Examples include:
</para>
<itemizedlist>
<listitem>
<para>
An integer value (<classname>G_TYPE_INT</classname>):
the property has this exact value.
</para>
</listitem>
<listitem>
<para>
A boolean value (<classname>G_TYPE_BOOLEAN</classname>):
the property is either TRUE or FALSE.
</para>
</listitem>
<listitem>
<para>
A float value (<classname>G_TYPE_FLOAT</classname>):
the property has this exact floating point value.
</para>
</listitem>
<listitem>
<para>
A string value (<classname>G_TYPE_STRING</classname>):
the property contains a UTF-8 string.
</para>
</listitem>
<listitem>
<para>
A fraction value (<classname>GST_TYPE_FRACTION</classname>):
contains a fraction expressed by an integer numerator and
denominator.
</para>
</listitem>
</itemizedlist>
</listitem>
<listitem>
<para>
Range types are <classname>GType</classname>s registered by
&GStreamer; to indicate a range of possible values. They are
used for indicating allowed audio samplerate values or
supported video sizes. The two types defined in &GStreamer;
are:
</para>
<itemizedlist>
<listitem>
<para>
An integer range value
(<classname>GST_TYPE_INT_RANGE</classname>): the property
denotes a range of possible integers, with a lower and an
upper boundary. The <quote>vorbisdec</quote> element, for
example, has a rate property that can be between 8000 and
50000.
</para>
</listitem>
<listitem>
<para>
A float range value
(<classname>GST_TYPE_FLOAT_RANGE</classname>): the property
denotes a range of possible floating point values, with a
lower and an upper boundary.
</para>
</listitem>
<listitem>
<para>
A fraction range value
(<classname>GST_TYPE_FRACTION_RANGE</classname>): the property
denotes a range of possible fraction values, with a
lower and an upper boundary.
</para>
</listitem>
</itemizedlist>
</listitem>
<listitem>
<para>
A list value (<classname>GST_TYPE_LIST</classname>): the
property can take any value from a list of basic values
given in this list.
</para>
<para>
          Example: caps expressing that either a sample rate of
          44100 Hz or a sample rate of 48000 Hz is supported would
          use a list of integer values, with one value being 44100
          and the other being 48000.
</para>
</listitem>
<listitem>
<para>
An array value (<classname>GST_TYPE_ARRAY</classname>): the
property is an array of values. Each value in the array is a
          full value on its own, too. All values in the array should be
          of the same elementary type. This means that an array can
          contain any combination of integers, lists of integers and
          integer ranges, and likewise for floats or strings, but it
          cannot mix floats and integers.
</para>
<para>
Example: for audio where there are more than two channels involved
the channel layout needs to be specified (for one and two channel
audio the channel layout is implicit unless stated otherwise in the
caps). So the channel layout would be an array of integer enum
values where each enum value represents a loudspeaker position.
Unlike a <classname>GST_TYPE_LIST</classname>, the values in an
array will be interpreted as a whole.
</para>
</listitem>
</itemizedlist>
</sect2>
</sect1>
<sect1 id="section-caps-api">
<title>What capabilities are used for</title>
<para>
Capabilities (short: caps) describe the type of data that is streamed
between two pads, or that one pad (template) supports. This makes them
very useful for various purposes:
</para>
<itemizedlist>
<listitem>
<para>
Autoplugging: automatically finding elements to link to a
pad based on its capabilities. All autopluggers use this
method.
</para>
</listitem>
<listitem>
<para>
Compatibility detection: when two pads are linked, &GStreamer;
can verify if the two pads are talking about the same media
type. The process of linking two pads and checking if they
are compatible is called <quote>caps negotiation</quote>.
</para>
</listitem>
<listitem>
<para>
Metadata: by reading the capabilities from a pad, applications
can provide information about the type of media that is being
streamed over the pad, which is information about the stream
that is currently being played back.
</para>
</listitem>
<listitem>
<para>
Filtering: an application can use capabilities to limit the
possible media types that can stream between two pads to a
specific subset of their supported stream types. An application
can, for example, use <quote>filtered caps</quote> to set a
specific (fixed or non-fixed) video size that should stream
between two pads. You will see an example of filtered caps
later in this manual, in <xref linkend="section-data-spoof"/>.
You can do caps filtering by inserting a capsfilter element into
your pipeline and setting its <quote>caps</quote> property. Caps
filters are often placed after converter elements like audioconvert,
audioresample, videoconvert or videoscale to force those
converters to convert data to a specific output format at a
certain point in a stream.
</para>
</listitem>
</itemizedlist>
<sect2 id="section-caps-metadata">
<title>Using capabilities for metadata</title>
<para>
A pad can have a set (i.e. one or more) of capabilities attached
to it. Capabilities (<classname>GstCaps</classname>) are represented
as an array of one or more <classname>GstStructure</classname>s, and
each <classname>GstStructure</classname> is an array of fields where
each field consists of a field name string (e.g. "width") and a
typed value (e.g. <classname>G_TYPE_INT</classname> or
<classname>GST_TYPE_INT_RANGE</classname>).
</para>
<para>
Note that there is a distinct difference between the
      <emphasis>possible</emphasis> capabilities of a pad (i.e. usually what
you find as caps of pad templates as they are shown in gst-inspect),
the <emphasis>allowed</emphasis> caps of a pad (can be the same as
the pad's template caps or a subset of them, depending on the possible
caps of the peer pad) and lastly <emphasis>negotiated</emphasis> caps
(these describe the exact format of a stream or buffer and contain
exactly one structure and have no variable bits like ranges or lists,
      i.e. they are fixed caps).
</para>
<para>
You can get values of properties in a set of capabilities
by querying individual properties of one structure. You can get
a structure from a caps using
<function>gst_caps_get_structure ()</function> and the number of
structures in a <classname>GstCaps</classname> using
<function>gst_caps_get_size ()</function>.
</para>
<para>
Caps are called <emphasis>simple caps</emphasis> when they contain
only one structure, and <emphasis>fixed caps</emphasis> when they
contain only one structure and have no variable field types (like
ranges or lists of possible values). Two other special types of caps
are <emphasis>ANY caps</emphasis> and <emphasis>empty caps</emphasis>.
</para>
<para>
Here is an example of how to extract the width and height from
a set of fixed video caps:
<programlisting>
static void
read_video_props (GstCaps *caps)
{
gint width, height;
const GstStructure *str;
g_return_if_fail (gst_caps_is_fixed (caps));
str = gst_caps_get_structure (caps, 0);
if (!gst_structure_get_int (str, "width", &amp;width) ||
!gst_structure_get_int (str, "height", &amp;height)) {
g_print ("No width/height available\n");
return;
}
g_print ("The video size of this set of capabilities is %dx%d\n",
width, height);
}
</programlisting>
</para>
</sect2>
<sect2 id="section-caps-filter">
<title>Creating capabilities for filtering</title>
<para>
While capabilities are mainly used inside a plugin to describe the
media type of the pads, the application programmer often also has
to have basic understanding of capabilities in order to interface
with the plugins, especially when using filtered caps. When you're
using filtered caps or fixation, you're limiting the allowed types of
media that can stream between two pads to a subset of their supported
media types. You do this using a <classname>capsfilter</classname>
element in your pipeline. In order to do this, you also need to
create your own <classname>GstCaps</classname>. The easiest way to
do this is by using the convenience function
<function>gst_caps_new_simple ()</function>:
</para>
<para>
<programlisting>
static gboolean
link_elements_with_filter (GstElement *element1, GstElement *element2)
{
gboolean link_ok;
GstCaps *caps;
caps = gst_caps_new_simple ("video/x-raw",
"format", G_TYPE_STRING, "I420",
"width", G_TYPE_INT, 384,
"height", G_TYPE_INT, 288,
"framerate", GST_TYPE_FRACTION, 25, 1,
NULL);
link_ok = gst_element_link_filtered (element1, element2, caps);
gst_caps_unref (caps);
if (!link_ok) {
g_warning ("Failed to link element1 and element2!");
}
return link_ok;
}
</programlisting>
This will force the data flow between those two elements to
a certain video format, width, height and framerate (or the linking
will fail if that cannot be achieved in the context of the elements
involved). Keep in mind that when you use <function>
gst_element_link_filtered ()</function> it will automatically create
a <classname>capsfilter</classname> element for you and insert it into
your bin or pipeline between the two elements you want to connect (this
is important if you ever want to disconnect those elements because then
you will have to disconnect both elements from the capsfilter instead).
</para>
<para>
In some cases, you will want to create a more elaborate set of
capabilities to filter a link between two pads. Then, this function
is too simplistic and you'll want to use the method
<function>gst_caps_new_full ()</function>:
</para>
<programlisting>
static gboolean
link_elements_with_filter (GstElement *element1, GstElement *element2)
{
gboolean link_ok;
GstCaps *caps;
caps = gst_caps_new_full (
gst_structure_new ("video/x-raw",
"width", G_TYPE_INT, 384,
"height", G_TYPE_INT, 288,
"framerate", GST_TYPE_FRACTION, 25, 1,
NULL),
gst_structure_new ("video/x-bayer",
"width", G_TYPE_INT, 384,
"height", G_TYPE_INT, 288,
"framerate", GST_TYPE_FRACTION, 25, 1,
NULL),
NULL);
link_ok = gst_element_link_filtered (element1, element2, caps);
gst_caps_unref (caps);
if (!link_ok) {
g_warning ("Failed to link element1 and element2!");
}
return link_ok;
}
</programlisting>
<para>
See the API references for the full API of
<ulink type="http"
url="&URLAPI;gstreamer-GstStructure.html"><classname>GstStructure</classname></ulink>
and <ulink type="http"
url="&URLAPI;gstreamer-GstCaps.html"><classname>GstCaps</classname></ulink>.
</para>
</sect2>
</sect1>
<sect1 id="section-pads-ghost">
<title>Ghost pads</title>
<para>
You can see from <xref linkend="section-bin-noghost-img"/> how a bin
has no pads of its own. This is where "ghost pads" come into play.
</para>
<figure float="1" id="section-bin-noghost-img">
<title>Visualisation of a <ulink type="http"
url="&URLAPI;GstBin.html"><classname>GstBin</classname></ulink>
element without ghost pads</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/bin-element-noghost.&image;"
format="&IMAGE;"/>
</imageobject>
</mediaobject>
</figure>
<para>
A ghost pad is a pad from some element in the bin that can be
accessed directly from the bin as well. Compare it to a symbolic
link in UNIX filesystems. Using ghost pads on bins, the bin also
has a pad and can transparently be used as an element in other
parts of your code.
</para>
<figure float="1" id="section-bin-ghost-img">
<title>Visualisation of a <ulink type="http"
url="&URLAPI;GstBin.html"><classname>GstBin</classname></ulink>
element with a ghost pad</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/bin-element-ghost.&image;"
format="&IMAGE;"/>
</imageobject>
</mediaobject>
</figure>
<para>
<xref linkend="section-bin-ghost-img"/> is a representation of a
ghost pad. The sink pad of element one is now also a pad of the bin.
      Because ghost pads look and work like any other pads, they can be added
      to any type of element, not just to a <classname>GstBin</classname>.
</para>
<para>
A ghostpad is created using the function
<function>gst_ghost_pad_new ()</function>:
</para>
<programlisting><!-- example-begin ghostpad.c a -->
#include &lt;gst/gst.h&gt;
int
main (int argc,
char *argv[])
{
GstElement *bin, *sink;
GstPad *pad;
/* init */
gst_init (&amp;argc, &amp;argv);
/* create element, add to bin */
sink = gst_element_factory_make ("fakesink", "sink");
bin = gst_bin_new ("mybin");
gst_bin_add (GST_BIN (bin), sink);
/* add ghostpad */
pad = gst_element_get_static_pad (sink, "sink");
gst_element_add_pad (bin, gst_ghost_pad_new ("sink", pad));
gst_object_unref (GST_OBJECT (pad));
<!-- example-end ghostpad.c a -->
[..]<!-- example-begin ghostpad.c b --><!--
return 0;
--><!-- example-end ghostpad.c b -->
<!-- example-begin ghostpad.c c -->
}
<!-- example-end ghostpad.c c --></programlisting>
<para>
In the above example, the bin now also has a pad: the pad called
<quote>sink</quote> of the given element. The bin can, from here
on, be used as a substitute for the sink element. You could, for
example, link another element to the bin.
</para>
</sect1>
</chapter>
@ -1,85 +0,0 @@
<chapter id="chapter-plugins">
<title>Plugins</title>
<!-- FIXME: introduce type definitions before this chapter -->
<para>
A plugin is a shared library that contains at least one of the following
items:
</para>
<itemizedlist>
<listitem>
<para>
one or more element factories
</para>
</listitem>
<listitem>
<para>
one or more type definitions
</para>
</listitem>
<listitem>
<para>
one or more auto-pluggers
</para>
</listitem>
<listitem>
<para>
exported symbols for use in other plugins
</para>
</listitem>
</itemizedlist>
<para>
All plugins should implement one function, <function>plugin_init</function>,
that creates all the element factories and registers all the type
definitions contained in the plugin.
Without this function, a plugin cannot be registered.
</para>
<para>
The plugins are maintained in the plugin system. Optionally, the
type definitions and the element factories can be saved into an XML
representation so that the plugin system does not have to load all
available plugins in order to know their definition.
</para>
<para>
The basic plugin structure has the following fields:
</para>
<programlisting>
typedef struct _GstPlugin GstPlugin;
struct _GstPlugin {
gchar *name; /* name of the plugin */
gchar *longname; /* long name of plugin */
gchar *filename; /* filename it came from */
GList *types; /* list of types provided */
gint numtypes;
GList *elements; /* list of elements provided */
gint numelements;
GList *autopluggers; /* list of autopluggers provided */
gint numautopluggers;
gboolean loaded; /* if the plugin is in memory */
};
</programlisting>
<para>
You can query a <classname>GList</classname> of available plugins with the
function <function>gst_registry_pool_plugin_list</function> as this example
shows:
</para>
<programlisting>
GList *plugins;
plugins = gst_registry_pool_plugin_list ();
while (plugins) {
GstPlugin *plugin = (GstPlugin *)plugins-&gt;data;
g_print ("plugin: %s\n", gst_plugin_get_name (plugin));
plugins = g_list_next (plugins);
}
</programlisting>
</chapter>

@ -1,617 +0,0 @@
<chapter id="chapter-playback-components">
<title>Playback Components</title>
<para>
&GStreamer; includes several higher-level components to simplify an
application developer's life. All of the components discussed here (for now) are
    targeted at media playback. The idea of each of these components is
to integrate as closely as possible with a &GStreamer; pipeline, but
to hide the complexity of media type detection and several other
rather complex topics that have been discussed in <xref
linkend="part-advanced"/>.
</para>
<para>
We currently recommend people to use either playbin (see <xref
linkend="section-components-playbin"/>) or decodebin (see <xref
linkend="section-components-decodebin"/>), depending on their needs.
Playbin is the recommended solution for everything related to simple
playback of media that should just work. Decodebin is a more flexible
autoplugger that could be used to add more advanced features, such
as playlist support, crossfading of audio tracks and so on. Its
programming interface is more low-level than that of playbin, though.
</para>
<sect1 id="section-components-playbin">
<title>Playbin</title>
<para>
Playbin is an element that can be created using the standard &GStreamer;
API (e.g. <function>gst_element_factory_make ()</function>). The factory
is conveniently called <quote>playbin</quote>. By being a
<classname>GstPipeline</classname> (and thus a
<classname>GstElement</classname>), playbin automatically supports all
of the features of this class, including error handling, tag support,
state handling, getting stream positions, seeking, and so on.
</para>
<para>
Setting up a playbin pipeline is as simple as creating an instance of
the playbin element, setting a file location using the
<quote>uri</quote> property on playbin, and then setting the element
to the <classname>GST_STATE_PLAYING</classname> state (the location has to be a valid
URI, so <quote>&lt;protocol&gt;://&lt;location&gt;</quote>, e.g.
file:///tmp/my.ogg or http://www.example.org/stream.ogg). Internally,
playbin will set up a pipeline to playback the media location.
</para>
<programlisting><!-- example-begin playbin.c a -->
#include &lt;gst/gst.h&gt;
<!-- example-end playbin.c a -->
[.. my_bus_callback goes here ..]<!-- example-begin playbin.c b --><!--
static gboolean
my_bus_callback (GstBus *bus,
GstMessage *message,
gpointer data)
{
GMainLoop *loop = data;
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gst_message_parse_error (message, &amp;err, &amp;debug);
g_print ("Error: %s\n", err-&gt;message);
g_error_free (err);
g_free (debug);
g_main_loop_quit (loop);
break;
}
case GST_MESSAGE_EOS:
/* end-of-stream */
g_main_loop_quit (loop);
break;
default:
/* unhandled message */
break;
}
/* remove message from the queue */
return TRUE;
}
--><!-- example-end playbin.c b -->
<!-- example-begin playbin.c c -->
gint
main (gint argc,
gchar *argv[])
{
GMainLoop *loop;
GstElement *play;
GstBus *bus;
/* init GStreamer */
gst_init (&amp;argc, &amp;argv);
loop = g_main_loop_new (NULL, FALSE);
/* make sure we have a URI */
if (argc != 2) {
g_print ("Usage: %s &lt;URI&gt;\n", argv[0]);
return -1;
}
/* set up */
play = gst_element_factory_make ("playbin", "play");
g_object_set (G_OBJECT (play), "uri", argv[1], NULL);
bus = gst_pipeline_get_bus (GST_PIPELINE (play));
gst_bus_add_watch (bus, my_bus_callback, loop);
gst_object_unref (bus);
gst_element_set_state (play, GST_STATE_PLAYING);
/* now run */
g_main_loop_run (loop);
/* also clean up */
gst_element_set_state (play, GST_STATE_NULL);
gst_object_unref (GST_OBJECT (play));
return 0;
}
<!-- example-end playbin.c c --></programlisting>
<para>
Playbin has several features that have been discussed previously:
</para>
<itemizedlist>
<listitem>
<para>
Settable video and audio output (using the <quote>video-sink</quote>
and <quote>audio-sink</quote> properties).
</para>
</listitem>
<listitem>
<para>
Mostly controllable and trackable as a
<classname>GstElement</classname>, including error handling, eos
handling, tag handling, state handling (through the
<classname>GstBus</classname>), media position handling and
seeking.
</para>
</listitem>
<listitem>
<para>
          Buffers network sources, with buffer fullness notifications being
passed through the <classname>GstBus</classname>.
</para>
</listitem>
<listitem>
<para>
Supports visualizations for audio-only media.
</para>
</listitem>
<listitem>
<para>
Supports subtitles, both in the media as well as from separate
files. For separate subtitle files, use the <quote>suburi</quote>
property.
</para>
</listitem>
<listitem>
<para>
Supports stream selection and disabling. If your media has
multiple audio or subtitle tracks, you can dynamically choose
which one to play back, or decide to turn it off altogether
(which is especially useful to turn off subtitles). For each
of those, use the <quote>current-text</quote> and other related
properties.
</para>
</listitem>
</itemizedlist>
<para>
For convenience, it is possible to test <quote>playbin</quote> on
the commandline, using the command <quote>gst-launch-1.0 playbin
uri=file:///path/to/file</quote>.
</para>
</sect1>
<sect1 id="section-components-decodebin">
<title>Decodebin</title>
<para>
Decodebin is the actual autoplugger backend of playbin, which was
discussed in the previous section. Decodebin will, in short, accept
input from a source that is linked to its sinkpad and will try to
detect the media type contained in the stream, and set up decoder
routines for each of those. It will automatically select decoders.
For each decoded stream, it will emit the <quote>pad-added</quote>
signal, to let the client know about the newly found decoded stream.
For unknown streams (which might be the whole stream), it will emit
the <quote>unknown-type</quote> signal. The application is then
responsible for reporting the error to the user.
</para>
<programlisting><!-- example-begin decodebin.c a -->
<![CDATA[
#include <gst/gst.h>
]]>
<!-- example-end decodebin.c a -->
[.. my_bus_callback goes here ..]<!-- example-begin decodebin.c b -->
<!--
static gboolean
my_bus_callback (GstBus *bus,
GstMessage *message,
gpointer data)
{
GMainLoop *loop = data;
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gst_message_parse_error (message, &amp;err, &amp;debug);
g_print ("Error: %s\n", err-&gt;message);
g_error_free (err);
g_free (debug);
g_main_loop_quit (loop);
break;
}
case GST_MESSAGE_EOS:
/* end-of-stream */
g_main_loop_quit (loop);
break;
default:
/* unhandled message */
break;
}
/* remove message from the queue */
return TRUE;
}
--><!-- example-end decodebin.c b -->
<!-- example-begin decodebin.c c -->
<![CDATA[
GstElement *pipeline, *audio;
static void
cb_newpad (GstElement *decodebin,
GstPad *pad,
gpointer data)
{
GstCaps *caps;
GstStructure *str;
GstPad *audiopad;
/* only link once */
audiopad = gst_element_get_static_pad (audio, "sink");
if (GST_PAD_IS_LINKED (audiopad)) {
g_object_unref (audiopad);
return;
}
/* check media type */
caps = gst_pad_query_caps (pad, NULL);
str = gst_caps_get_structure (caps, 0);
if (!g_strrstr (gst_structure_get_name (str), "audio")) {
gst_caps_unref (caps);
gst_object_unref (audiopad);
return;
}
gst_caps_unref (caps);
/* link'n'play */
gst_pad_link (pad, audiopad);
g_object_unref (audiopad);
}
gint
main (gint argc,
gchar *argv[])
{
GMainLoop *loop;
GstElement *src, *dec, *conv, *sink;
GstPad *audiopad;
GstBus *bus;
/* init GStreamer */
gst_init (&argc, &argv);
loop = g_main_loop_new (NULL, FALSE);
/* make sure we have input */
if (argc != 2) {
g_print ("Usage: %s <filename>\n", argv[0]);
return -1;
}
/* setup */
pipeline = gst_pipeline_new ("pipeline");
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_watch (bus, my_bus_callback, loop);
gst_object_unref (bus);
src = gst_element_factory_make ("filesrc", "source");
g_object_set (G_OBJECT (src), "location", argv[1], NULL);
dec = gst_element_factory_make ("decodebin", "decoder");
g_signal_connect (dec, "pad-added", G_CALLBACK (cb_newpad), NULL);
gst_bin_add_many (GST_BIN (pipeline), src, dec, NULL);
gst_element_link (src, dec);
/* create audio output */
audio = gst_bin_new ("audiobin");
conv = gst_element_factory_make ("audioconvert", "aconv");
audiopad = gst_element_get_static_pad (conv, "sink");
sink = gst_element_factory_make ("alsasink", "sink");
gst_bin_add_many (GST_BIN (audio), conv, sink, NULL);
gst_element_link (conv, sink);
gst_element_add_pad (audio,
gst_ghost_pad_new ("sink", audiopad));
gst_object_unref (audiopad);
gst_bin_add (GST_BIN (pipeline), audio);
/* run */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
g_main_loop_run (loop);
/* cleanup */
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (GST_OBJECT (pipeline));
return 0;
}
]]>
<!-- example-end decodebin.c c --></programlisting>
<para>
Decodebin, similar to playbin, supports the following features:
</para>
<itemizedlist>
<listitem>
<para>
Can decode an unlimited number of contained streams to decoded
output pads.
</para>
</listitem>
<listitem>
<para>
Is handled as a <classname>GstElement</classname> in all ways,
including tag or error forwarding and state handling.
</para>
</listitem>
</itemizedlist>
<para>
Although decodebin is a good autoplugger, there's a whole lot of
things that it does not do and is not intended to do:
</para>
<itemizedlist>
<listitem>
<para>
Taking care of input streams with a known media type (e.g. a DVD,
an audio-CD or such).
</para>
</listitem>
<listitem>
<para>
Selection of streams (e.g. which audio track to play in case of
multi-language media streams).
</para>
</listitem>
<listitem>
<para>
Overlaying subtitles over a decoded video stream.
</para>
</listitem>
</itemizedlist>
<para>
Decodebin can be easily tested on the commandline, e.g. by using the
command <command>gst-launch-1.0 filesrc location=file.ogg ! decodebin
! audioconvert ! audioresample ! autoaudiosink</command>.
</para>
</sect1>
<sect1 id="section-components-uridecodebin">
<title>URIDecodebin</title>
<para>
      The uridecodebin element is very similar to decodebin, except that
      it also automatically plugs a source element based on the protocol
      of the URI given.
</para>
<para>
Uridecodebin will also automatically insert buffering elements when
the uri is a slow network source. The buffering element will post
BUFFERING messages that the application needs to handle as explained
in <xref linkend="chapter-buffering"/>.
The following properties can be used to configure the buffering method:
</para>
<itemizedlist>
<listitem>
<para>
The buffer-size property allows you to configure a maximum size in
bytes for the buffer element.
</para>
</listitem>
<listitem>
<para>
The buffer-duration property allows you to configure a maximum size
in time for the buffer element. The time will be estimated based on
the bitrate of the network.
</para>
</listitem>
<listitem>
<para>
With the download property you can enable the download buffering
method as described in <xref linkend="section-buffering-download"/>.
Setting this option to TRUE will only enable download buffering
for selected formats such as quicktime, flash video, avi and
webm.
</para>
</listitem>
<listitem>
<para>
          You can also enable buffering on the parsed/demuxed data with the
          use-buffering property. This is useful for enabling buffering on
          slower random-access media such as a network file server.
</para>
</listitem>
</itemizedlist>
<para>
URIDecodebin can be easily tested on the commandline, e.g. by using the
      command <command>gst-launch-1.0 uridecodebin uri=file:///file.ogg !
      audioconvert ! audioresample ! autoaudiosink</command>.
</para>
</sect1>
<sect1 id="section-components-playsink">
<title>Playsink</title>
<para>
The playsink element is a powerful sink element. It has request pads
for raw decoded audio, video and text and it will configure itself to
play the media streams. It has the following features:
</para>
<itemizedlist>
<listitem>
<para>
It exposes GstStreamVolume, GstVideoOverlay, GstNavigation and
GstColorBalance interfaces and automatically plugs software
elements to implement the interfaces when needed.
</para>
</listitem>
<listitem>
<para>
It will automatically plug conversion elements.
</para>
</listitem>
<listitem>
<para>
Can optionally render visualizations when there is no video input.
</para>
</listitem>
<listitem>
<para>
Configurable sink elements.
</para>
</listitem>
<listitem>
<para>
Configurable audio/video sync offset to fine-tune synchronization
in badly muxed files.
</para>
</listitem>
<listitem>
<para>
Support for taking a snapshot of the last video frame.
</para>
</listitem>
</itemizedlist>
<para>
Below is an example of how you can use playsink. We use a uridecodebin
element to decode into raw audio and video streams which we then link
to the playsink request pads. We only link the first audio and video
pads; you could use an input-selector to link all pads.
</para>
<programlisting>
<!-- example-begin playsink.c a -->
<![CDATA[
#include <gst/gst.h>
]]>
<!-- example-end playsink.c a -->
[.. my_bus_callback goes here ..]
<!-- example-begin playsink.c b -->
<!--
static gboolean
my_bus_callback (GstBus *bus,
GstMessage *message,
gpointer data)
{
GMainLoop *loop = data;
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gst_message_parse_error (message, &amp;err, &amp;debug);
g_print ("Error: %s\n", err-&gt;message);
g_error_free (err);
g_free (debug);
g_main_loop_quit (loop);
break;
}
case GST_MESSAGE_EOS:
/* end-of-stream */
g_main_loop_quit (loop);
break;
default:
/* unhandled message */
break;
}
/* remove message from the queue */
return TRUE;
}
-->
<!-- example-end playsink.c b -->
<!-- example-begin playsink.c c -->
<![CDATA[
GstElement *pipeline, *sink;
static void
cb_pad_added (GstElement *dec,
GstPad *pad,
gpointer data)
{
GstCaps *caps;
GstStructure *str;
const gchar *name;
GstPadTemplate *templ;
GstElementClass *klass;
/* check media type */
caps = gst_pad_query_caps (pad, NULL);
str = gst_caps_get_structure (caps, 0);
name = gst_structure_get_name (str);
klass = GST_ELEMENT_GET_CLASS (sink);
if (g_str_has_prefix (name, "audio")) {
templ = gst_element_class_get_pad_template (klass, "audio_sink");
} else if (g_str_has_prefix (name, "video")) {
templ = gst_element_class_get_pad_template (klass, "video_sink");
} else if (g_str_has_prefix (name, "text")) {
templ = gst_element_class_get_pad_template (klass, "text_sink");
} else {
templ = NULL;
}
gst_caps_unref (caps);
if (templ) {
GstPad *sinkpad;
sinkpad = gst_element_request_pad (sink, templ, NULL, NULL);
if (!gst_pad_is_linked (sinkpad))
gst_pad_link (pad, sinkpad);
gst_object_unref (sinkpad);
}
}
gint
main (gint argc,
gchar *argv[])
{
GMainLoop *loop;
GstElement *dec;
GstBus *bus;
/* init GStreamer */
gst_init (&argc, &argv);
loop = g_main_loop_new (NULL, FALSE);
/* make sure we have input */
if (argc != 2) {
g_print ("Usage: %s <uri>\n", argv[0]);
return -1;
}
/* setup */
pipeline = gst_pipeline_new ("pipeline");
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_watch (bus, my_bus_callback, loop);
gst_object_unref (bus);
dec = gst_element_factory_make ("uridecodebin", "source");
g_object_set (G_OBJECT (dec), "uri", argv[1], NULL);
g_signal_connect (dec, "pad-added", G_CALLBACK (cb_pad_added), NULL);
/* create audio output */
sink = gst_element_factory_make ("playsink", "sink");
gst_util_set_object_arg (G_OBJECT (sink), "flags",
"soft-colorbalance+soft-volume+vis+text+audio+video");
gst_bin_add_many (GST_BIN (pipeline), dec, sink, NULL);
/* run */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
g_main_loop_run (loop);
/* cleanup */
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (GST_OBJECT (pipeline));
return 0;
}
]]>
<!-- example-end playsink.c c -->
</programlisting>
<para>
This example will show audio and video depending on what you
give it. Try this example on an audio file and you will see that
it shows visualizations. You can change the visualization at runtime by
changing the vis-plugin property.
</para>
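<para>
For example, the visualization could be switched at runtime with
something like the following fragment, using the global
<symbol>sink</symbol> variable from the example above (this assumes
the <quote>goom</quote> visualization plugin is available on your
system):
</para>
<programlisting>
<![CDATA[
GstElement *vis;

/* replace the current visualization with goom */
vis = gst_element_factory_make ("goom", NULL);
g_object_set (G_OBJECT (sink), "vis-plugin", vis, NULL);
]]>
</programlisting>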
</sect1>
</chapter>


@ -1,17 +0,0 @@
<chapter id="chapter-xml">
<title>XML in <application>GStreamer</application> (deprecated)</title>
<para>
<application>GStreamer</application> used to provide functions to
save pipeline definitions into XML format and later restore them
again from XML.
</para>
<para>
This never really worked properly for all but the most simple use cases
though, and is also pretty much impossible to make work correctly in a
useful way due to the dynamic nature of almost all non-trivial GStreamer
pipelines. Consequently, this API has been deprecated and will be
removed at some point. Don't use it.
</para>
</chapter>


@ -1,2 +0,0 @@
*.eps
*.png


@ -1,159 +0,0 @@
<chapter id="chapter-intro-basics">
<title>Foundations</title>
<para><!-- synchronize with PWG -->
This chapter of the guide introduces the basic concepts of &GStreamer;.
Understanding these concepts will be important in reading any of the
rest of this guide, all of them assume understanding of these basic
concepts.
</para>
<sect1 id="section-intro-basics-elements">
<title>Elements</title>
<para>
An <emphasis>element</emphasis> is the most important class of objects
in &GStreamer;. You will usually create a chain of elements linked
together and let data flow through this chain of elements. An element
has one specific function, which can be the reading of data from a
file, decoding of this data or outputting this data to your sound
card (or anything else). By chaining together several such elements,
you create a <emphasis>pipeline</emphasis> that can do a specific task,
for example media playback or capture. &GStreamer; ships with a large
collection of elements by default, making the development of a large
variety of media applications possible. If needed, you can also write
new elements. That topic is explained in great detail in the &GstPWG;.
</para>
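<para>
As a first taste (a sketch only, assuming a previously created
<symbol>pipeline</symbol>; element creation and linking are covered in
detail later in this guide), creating two elements and chaining them
together looks like this:
</para>
<programlisting>
<![CDATA[
GstElement *source, *sink;

/* create a test audio source and an automatically selected audio sink */
source = gst_element_factory_make ("audiotestsrc", "source");
sink = gst_element_factory_make ("autoaudiosink", "sink");

/* elements must be added to a common bin before they can be linked */
gst_bin_add_many (GST_BIN (pipeline), source, sink, NULL);
gst_element_link (source, sink);
]]>
</programlisting>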
</sect1>
<sect1 id="section-intro-basics-pads">
<title>Pads</title>
<para>
<emphasis>Pads</emphasis> are an element's inputs and outputs, where
you can connect other elements. They are used to negotiate links and
data flow
between elements in &GStreamer;. A pad can be viewed as a
<quote>plug</quote> or <quote>port</quote> on an element where
links may be made with other elements, and through which data can
flow to or from those elements. Pads have specific data handling
capabilities: a pad can restrict the type of data that flows
through it. Links are only allowed between two pads when the
allowed data types of the two pads are compatible. Data types are
negotiated between pads using a process called <emphasis>caps
negotiation</emphasis>. Data types are described as a
<classname>GstCaps</classname>.
</para>
<para>
An analogy may be helpful here. A pad is similar to a plug or jack on a
physical device. Consider, for example, a home theater system consisting
of an amplifier, a DVD player, and a (silent) video projector. Linking
the DVD player to the amplifier is allowed because both devices have audio
jacks, and linking the projector to the DVD player is allowed because
both devices have compatible video jacks. Links between the
projector and the amplifier may not be made because the projector and
amplifier have different types of jacks. Pads in &GStreamer; serve the
same purpose as the jacks in the home theater system.
</para>
<para>
For the most part, all data in &GStreamer; flows one way through a link
between elements. Data flows out of one element through one or more
<emphasis>source pads</emphasis>, and elements accept incoming data
through one or more <emphasis>sink pads</emphasis>. Source and sink
elements have only source and sink pads, respectively. Data usually
means buffers (described by the <ulink type="http"
url="&URLAPI;gstreamer-GstBuffer.html">
<classname>GstBuffer</classname></ulink> object) and events (described
by the <ulink type="http" url="&URLAPI;gstreamer-GstEvent.html">
<classname>GstEvent</classname></ulink> object).
</para>
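<para>
As an illustration (a fragment, assuming <symbol>element</symbol> has
already been created), the capabilities of a pad can be inspected like
this:
</para>
<programlisting>
<![CDATA[
GstPad *pad;
GstCaps *caps;
gchar *str;

/* query the data types currently allowed on the element's source pad */
pad = gst_element_get_static_pad (element, "src");
caps = gst_pad_query_caps (pad, NULL);
str = gst_caps_to_string (caps);
g_print ("caps: %s\n", str);
g_free (str);
gst_caps_unref (caps);
gst_object_unref (pad);
]]>
</programlisting>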
</sect1>
<sect1 id="section-intro-basics-bins">
<title>Bins and pipelines</title>
<para>
A <emphasis>bin</emphasis> is a container for a collection of elements.
Since bins are subclasses of elements
themselves, you can mostly control a bin as if it were an element,
thereby abstracting away a lot of complexity for your application. You
can, for example change state on all elements in a bin by changing the
state of that bin itself. Bins also forward bus messages from their
contained children (such as error messages, tag messages or EOS messages).
</para>
<para>
A <emphasis>pipeline</emphasis> is a top-level bin. It provides a bus for
the application and manages the synchronization for its children.
As you set it to PAUSED or PLAYING state, data flow will start and media
processing will take place. Once started, pipelines will run in a
separate thread until you stop them or the end
of the data stream is reached.
</para>
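<para>
A minimal sketch of this: create a pipeline, put an element inside it,
and start data flow by setting the state of the pipeline as a whole:
</para>
<programlisting>
<![CDATA[
GstElement *pipeline, *bin;

pipeline = gst_pipeline_new ("my-pipeline");
bin = gst_bin_new ("my-bin");
gst_bin_add (GST_BIN (pipeline), bin);

/* the state change is propagated to all elements contained in the pipeline */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
]]>
</programlisting>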
<figure float="1" id="section-pipeline-img">
<title>&GStreamer; pipeline for a simple ogg player</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/simple-player.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
</sect1>
<sect1 id="section-intro-basics-communication">
<title>Communication</title>
<para>
&GStreamer; provides several mechanisms for communication and data exchange
between the <emphasis>application</emphasis> and the <emphasis>pipeline</emphasis>.
</para>
<itemizedlist>
<listitem>
<para>
<emphasis>buffers</emphasis> are objects for passing streaming data
between elements in the pipeline. Buffers always travel from sources
to sinks (downstream).
</para>
</listitem>
<listitem>
<para>
<emphasis>events</emphasis> are objects sent between elements or from
the application to elements. Events can travel upstream and downstream.
Downstream events can be synchronised to the data flow.
</para>
</listitem>
<listitem>
<para>
<emphasis>messages</emphasis> are objects posted by elements on
the pipeline's message bus, where they will be held for collection
by the application. Messages can be intercepted synchronously from
the streaming thread context of the element posting the message, but
are usually handled asynchronously by the application from the
application's main thread. Messages are used to transmit information
such as errors, tags, state changes, buffering state, redirects etc.
from elements to the application in a thread-safe way.
</para>
</listitem>
<listitem>
<para>
<emphasis>queries</emphasis> allow applications to request information
such as duration or current playback position from the pipeline.
Queries are always answered synchronously. Elements can also use
queries to request information from their peer elements (such as the
file size or duration). They can be used both ways within a pipeline,
but upstream queries are more common.
</para>
</listitem>
</itemizedlist>
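<para>
For instance (a fragment, assuming a running <symbol>pipeline</symbol>),
a position query from the application could look like this:
</para>
<programlisting>
<![CDATA[
gint64 pos, dur;

/* queries are answered synchronously; GST_FORMAT_TIME values are in nanoseconds */
if (gst_element_query_position (pipeline, GST_FORMAT_TIME, &pos) &&
    gst_element_query_duration (pipeline, GST_FORMAT_TIME, &dur)) {
  g_print ("position: %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\n",
      GST_TIME_ARGS (pos), GST_TIME_ARGS (dur));
}
]]>
</programlisting>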
<figure float="1" id="section-communication-img">
<title>&GStreamer; pipeline with different communication flows</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/communication.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
</sect1>
</chapter>


@ -1,99 +0,0 @@
<chapter id="chapter-gstreamer">
<title>What is &GStreamer;?</title>
<!-- ############ sect1 ############# -->
<!-- <sect1 id="section-intro-what"> --><!-- synchronize with PWG -->
<para>
&GStreamer; is a framework for creating streaming media applications.
The fundamental design comes from the video pipeline at Oregon Graduate
Institute, as well as some ideas from DirectShow.
</para>
<para>
&GStreamer;'s development framework makes it possible to write any
type of streaming multimedia application. The &GStreamer; framework
is designed to make it easy to write applications that handle audio
or video or both. It isn't restricted to audio and video, and can
process any kind of data flow.
The pipeline design is made to have little overhead above what the
applied filters induce. This makes &GStreamer; a good framework for
designing even high-end audio applications which put high demands on
latency.
</para>
<para>
One of the most obvious uses of &GStreamer; is using it to build
a media player. &GStreamer; already includes components for building a
media player that can support a very wide variety of formats, including
MP3, Ogg/Vorbis, MPEG-1/2, AVI, Quicktime, mod, and more. &GStreamer;,
however, is much more than just another media player. Its main advantages
are that the pluggable components can be mixed and matched into arbitrary
pipelines so that it's possible to write a full-fledged video or audio
editing application.
</para>
<para>
The framework is based on plugins that will provide the various codec
and other functionality. The plugins can be linked and arranged in
a pipeline. This pipeline defines the flow of the data. Pipelines can
also be edited with a GUI editor and saved as XML so that pipeline
libraries can be made with a minimum of effort.
</para>
<para>
The &GStreamer; core function is to provide a framework for plugins,
data flow and media type handling/negotiation. It also provides an
API to write applications using the various plugins.
</para>
<para>
Specifically, &GStreamer; provides
<itemizedlist>
<listitem><para>an API for multimedia applications</para></listitem>
<listitem><para>a plugin architecture</para></listitem>
<listitem><para>a pipeline architecture</para></listitem>
<listitem><para>a mechanism for media type handling/negotiation</para></listitem>
<listitem><para>a mechanism for synchronization</para></listitem>
<listitem><para>over 250 plug-ins providing more than 1000 elements</para></listitem>
<listitem><para>a set of tools</para></listitem>
</itemizedlist>
</para>
<para>
&GStreamer; plug-ins could be classified into
<itemizedlist>
<listitem><para>protocols handling</para></listitem>
<listitem><para>sources: for audio and video (involves protocol plugins)</para></listitem>
<listitem><para>formats: parsers, formatters, muxers, demuxers, metadata, subtitles</para></listitem>
<listitem><para>codecs: coders and decoders</para></listitem>
<listitem><para>filters: converters, mixers, effects, ...</para></listitem>
<listitem><para>sinks: for audio and video (involves protocol plugins)</para></listitem>
</itemizedlist>
</para>
<figure float="1" id="section-gstreamer-img">
<title>&GStreamer; overview</title>
<mediaobject>
<imageobject>
<imagedata scale="75" fileref="images/gstreamer-overview.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
&GStreamer; is packaged into
<itemizedlist>
<listitem><para>gstreamer: the core package</para></listitem>
<listitem><para>gst-plugins-base: an essential exemplary set of elements</para></listitem>
<listitem><para>gst-plugins-good: a set of good-quality plug-ins under LGPL</para></listitem>
<listitem><para>gst-plugins-ugly: a set of good-quality plug-ins that might pose distribution problems</para></listitem>
<listitem><para>gst-plugins-bad: a set of plug-ins that need more quality</para></listitem>
<listitem><para>gst-libav: a set of plug-ins that wrap libav for decoding and encoding</para></listitem>
<listitem><para>a few other packages</para></listitem>
</itemizedlist>
</para>
</chapter>


@ -1,300 +0,0 @@
<chapter id="chapter-motivation">
<title>Design principles</title>
<!--
<para>
Linux has historically lagged behind other operating systems in the
multimedia arena. Microsoft's <trademark>Windows</trademark> and
Apple's <trademark>MacOS</trademark> both have strong support for
multimedia devices, multimedia content creation, playback, and
realtime processing. Linux, on the other hand, has a poorly integrated
collection of multimedia utilities and applications available, which
can hardly compete with the professional level of software available
for MS Windows and MacOS.
</para>
<para>
GStreamer was designed to provide a solution to the current Linux media
problems.
</para>
<sect1 id="section-motivation-problems">
<title>Current problems</title>
<para>
We describe the typical problems in today's media handling on Linux.
</para>
<sect2 id="section-motivation-duplicate">
<title>Multitude of duplicate code</title>
<para>
The Linux user who wishes to hear a sound file must hunt through
their collection of sound file players in order to play the tens
of sound file formats in wide use today. Most of these players
basically reimplement the same code over and over again.
</para>
<para>
The Linux developer who wishes to embed a video clip in their
application must use crude hacks to run an external video player.
There is no library available that a developer can use to create
a custom media player.
</para>
</sect2>
<sect2 id="section-motivation-goal">
<title>'One goal' media players/libraries</title>
<para>
Your typical MPEG player was designed to play MPEG video and audio.
Most of these players have implemented a complete infrastructure
focused on achieving their only goal: playback. No provisions were
made to add filters or special effects to the video or audio data.
</para>
<para>
If you want to convert an MPEG-2 video stream into an AVI file,
your best option would be to take all of the MPEG-2 decoding
algorithms out of the player and duplicate them into your own
AVI encoder. These algorithms cannot easily be shared across
applications.
</para>
<para>
Attempts have been made to create libraries for handling various
media types. Because they focus on a very specific media type
(avifile, libmpeg2, ...), significant work is needed to integrate
them due to a lack of a common API. &GStreamer; allows you to
wrap these libraries with a common API, which significantly
simplifies integration and reuse.
</para>
</sect2>
<sect2 id="section-motivation-plugin">
<title>Non unified plugin mechanisms</title>
<para>
Your typical media player might have a plugin for different media
types. Two media players will typically implement their own plugin
mechanism so that the codecs cannot be easily exchanged. The plugin
system of the typical media player is also very tailored to the
specific needs of the application.
</para>
<para>
The lack of a unified plugin mechanism also seriously hinders the
creation of binary only codecs. No company is willing to port their
code to all the different plugin mechanisms.
</para>
<para>
While &GStreamer; also uses its own plugin system, it offers a very rich
framework for the plugin developer and ensures the plugin can be used
in a wide range of applications, transparently interacting with other
plugins. The framework that &GStreamer; provides for the plugins is
flexible enough to host even the most demanding plugins.
</para>
</sect2>
<sect2 id="section-motivation-experience">
<title>Poor user experience</title>
<para>
Because of the problems mentioned above, application authors have
so far often been urged to spend a considerable amount of time in
writing their own backends, plugin mechanisms and so on. The result
has often been, unfortunately, that both the backend as well as the
user interface were only half-finished. Demotivated, the application
authors would start rewriting the whole thing and complete the circle.
This leads to a <emphasis>poor end user experience</emphasis>.
</para>
</sect2>
<sect2 id="section-motivation-network">
<title>Provision for network transparency</title>
<para>
No infrastructure is present to allow network transparent media
handling. A distributed MPEG encoder will typically duplicate the
same encoder algorithms found in a non-distributed encoder.
</para>
<para>
No provisions have been made for use by and use of technologies such
as the <ulink url="http://gnome.org/" type="http">GNOME</ulink>
desktop platform. Because the wheel is re-invented all the time,
it's hard to properly integrate multimedia into the bigger whole of
user's environment.
</para>
<para>
The &GStreamer; core does not use network transparent technologies
at the lowest level as it only adds overhead for the local case.
That said, it shouldn't be hard to create a wrapper around the
core components. There are tcp plugins now that implement a
&GStreamer; Data Protocol that allows pipelines to be split over
TCP. These are located in the gst-plugins module directory gst/tcp.
</para>
</sect2>
<sect2 id="section-motivation-catchup">
<title>Catch up with the <trademark>Windows</trademark> world</title>
<para>
We need solid media handling if we want to see Linux succeed on
the desktop.
</para>
<para>
We must clear the road for commercially backed codecs and multimedia
applications so that Linux can become an option for doing multimedia.
</para>
</sect2>
</sect1>
<sect1 id="section-goals-design">
<title>The design goals</title>
<para>
We describe what we try to achieve with &GStreamer;.
</para>
-->
<section id="section-goals-clean">
<title>Clean and powerful</title>
<para>
&GStreamer; provides a clean interface to:
</para>
<itemizedlist>
<listitem>
<para>
The application programmer who wants to build a media pipeline.
The programmer can use an extensive set of powerful tools to create
media pipelines without writing a single line of code. Performing
complex media manipulations becomes very easy.
</para>
</listitem>
<listitem>
<para>
The plugin programmer. Plugin programmers are provided a clean and
simple API to create self-contained plugins. An extensive debugging
and tracing mechanism has been integrated. GStreamer also comes with
an extensive set of real-life plugins that serve as examples too.
</para>
</listitem>
</itemizedlist>
</section>
<section id="section-goals-object">
<title>Object oriented</title>
<para>
&GStreamer; adheres to GObject, the GLib 2.0 object model. A programmer
familiar with GLib 2.0 or GTK+ will be
comfortable with &GStreamer;.
</para>
<para>
&GStreamer; uses the mechanism of signals and object properties.
</para>
<para>
All objects can be queried at runtime for their various properties and
capabilities.
</para>
<para>
&GStreamer; intends to be similar in programming methodology to GTK+.
This applies to the object model, ownership of objects, reference
counting, etc.
</para>
</section>
<section id="section-goals-extensible">
<title>Extensible</title>
<para>
All &GStreamer; Objects can be extended using the GObject
inheritance methods.
</para>
<para>
All plugins are loaded dynamically and can be extended and upgraded
independently.
</para>
</section>
<section id="section-goals-binary">
<title>Allow binary-only plugins</title>
<para>
Plugins are shared libraries that are loaded at runtime. Since all
the properties of the plugin can be set using the GObject properties,
there is no need (and in fact no way) to have any header files
installed for the plugins.
</para>
<para>
Special care has been taken to make plugins completely self-contained.
All relevant aspects of plugins can be queried at run-time.
</para>
</section>
<section id="section-goals-performance">
<title>High performance</title>
<para>
High performance is obtained by:
</para>
<itemizedlist>
<listitem>
<para>
using GLib's <classname>GSlice</classname> allocator
</para>
</listitem>
<listitem>
<para>
extremely light-weight links between plugins. Data can travel
the pipeline with minimal overhead. Data passing between
plugins only involves a pointer dereference in a typical
pipeline.
</para>
</listitem>
<listitem>
<para>
providing a mechanism to directly work on the target memory.
A plugin can for example directly write to the X server's
shared memory space. Buffers can also point to arbitrary
memory, such as a sound card's internal hardware buffer.
</para>
</listitem>
<listitem>
<para>
refcounting and copy on write minimize usage of memcpy.
Sub-buffers efficiently split buffers into manageable pieces.
</para>
</listitem>
<listitem>
<para>
dedicated streaming threads, with scheduling handled by the kernel.
</para>
</listitem>
<listitem>
<para>
allowing hardware acceleration by using specialized plugins.
</para>
</listitem>
<listitem>
<para>
using a plugin registry with the specifications of the plugins so
that the plugin loading can be delayed until the plugin is actually
used.
</para>
</listitem>
</itemizedlist>
</section>
<section id="section-goals-separation">
<title>Clean core/plugins separation</title>
<para>
The core of &GStreamer; is essentially media-agnostic. It only knows
about bytes and blocks, and only contains basic elements.
The core of &GStreamer; is functional enough to even implement
low-level system tools, like cp.
</para>
<para>
All of the media handling functionality is provided by plugins
external to the core. These tell the core how to handle specific
types of media.
</para>
</section>
<section id="section-goals-testbed">
<title>Provide a framework for codec experimentation</title>
<para>
&GStreamer; also wants to be an easy framework where codec
developers can experiment with different algorithms, speeding up the
development of open and free multimedia codecs like those developed
by the <ulink url="http://www.xiph.org" type="http">Xiph.Org
Foundation</ulink> (such as Theora and Vorbis).
</para>
</section>
<!--
</sect1>
-->
</chapter>


@ -1,93 +0,0 @@
<!-- ############ sect1 ############# -->
<sect1 id="section-intro-who" xreflabel="Who Should Read This Manual?">
<title>Who should read this manual?</title>
<para>
This book is about &GStreamer; from an application developer's point of view; it
describes how to write a &GStreamer; application using the &GStreamer;
libraries and tools. For an explanation about writing plugins, we
suggest the <ulink type="http"
url="http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/index.html">Plugin
Writers Guide</ulink>.
</para>
<para>
Also check out the other documentation available on the <ulink type="http"
url="http://gstreamer.freedesktop.org/documentation/">&GStreamer; web site</ulink>.
</para>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-intro-reading" xreflabel="Preliminary Reading">
<title>Preliminary reading</title>
<para><!-- synchronize with PWG -->
In order to understand this manual, you need to have a basic
understanding of the <emphasis>C language</emphasis>.
</para>
<para>
Since &GStreamer; adheres to the GObject programming model, this guide
also assumes that you understand the basics of <ulink type="http"
url="http://library.gnome.org/devel/gobject/stable/">GObject</ulink> and <ulink type="http"
url="http://library.gnome.org/devel/glib/stable/">GLib</ulink> programming.
In particular,
<itemizedlist>
<listitem><para>GObject instantiation</para></listitem>
<listitem><para>GObject properties (set/get)</para></listitem>
<listitem><para>GObject casting</para></listitem>
<listitem><para>GObject referencing/dereferencing</para></listitem>
<listitem><para>GLib memory management</para></listitem>
<listitem><para>GLib signals and callbacks</para></listitem>
<listitem><para>GLib main loop</para></listitem>
</itemizedlist>
</para>
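<para>
For reference, the GObject idioms listed above look like this in
&GStreamer; code (a sketch; the <quote>fakesink</quote> element is
just an example):
</para>
<programlisting>
<![CDATA[
GstElement *e;
gchar *name;

e = gst_element_factory_make ("fakesink", "sink");  /* instantiation */
g_object_set (G_OBJECT (e), "silent", TRUE, NULL);  /* property set */
g_object_get (G_OBJECT (e), "name", &name, NULL);   /* property get */
g_free (name);
gst_object_unref (GST_OBJECT (e));                  /* unreferencing */
]]>
</programlisting>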
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-intro-structure">
<title>Structure of this manual</title>
<para>
To help you navigate through this guide, it is divided into several large
parts. Each part addresses a particular broad topic concerning &GStreamer;
application development. The parts of this guide are laid out in the following
order:
</para>
<para>
<xref linkend="part-introduction"/> gives you an overview of &GStreamer;,
its design principles and foundations.
</para>
<para>
<xref linkend="part-building"/> covers the basics of &GStreamer;
application programming. At the end of this part, you should be
able to build your own audio player using &GStreamer;.
</para>
<para>
In <xref linkend="part-advanced"/>, we will move on to advanced
subjects which make &GStreamer; stand out from its competitors. We
will discuss application-pipeline interaction using dynamic parameters
and interfaces, we will discuss threading and threaded pipelines,
scheduling and clocks (and synchronization). Most of those topics are
not just there to introduce you to their API, but primarily to give
a deeper insight in solving application programming problems with
&GStreamer; and understanding their concepts.
</para>
<para>
Next, in <xref linkend="part-highlevel"/>, we will go into higher-level
programming APIs for &GStreamer;. You don't exactly need to know all
the details from the previous parts to understand this, but you will
need to understand basic &GStreamer; concepts nevertheless. We will,
amongst others, discuss XML, playbin and autopluggers.
</para>
<para>
Finally in <xref linkend="part-appendices"/>, you will find some random
information on integrating with GNOME, KDE, OS X or Windows, some
debugging help and general tips to improve and simplify &GStreamer;
programming.
</para>
</sect1>

Binary file not shown.

Before

Width:  |  Height:  |  Size: 7.5 KiB


@ -1,269 +0,0 @@
<?xml version='1.0'?>
<!DOCTYPE book PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd" [
<!ENTITY % image-entities SYSTEM "image.entities">
%image-entities;
<!ENTITY % version-entities SYSTEM "version.entities">
%version-entities;
<!ENTITY % url-entities SYSTEM "url.entities">
%url-entities;
<!ENTITY EXAFOOT "
<footnote>
<para>
The code for this example is automatically extracted from
the documentation and built under <filename>tests/examples/manual</filename>
in the GStreamer tarball.
</para>
</footnote>
">
<!ENTITY TITLEPAGE SYSTEM "titlepage.xml">
<!-- Part 1: Overview -->
<!ENTITY INTRO SYSTEM "intro-preface.xml">
<!ENTITY GSTREAMER SYSTEM "intro-gstreamer.xml">
<!ENTITY MOTIVATION SYSTEM "intro-motivation.xml">
<!ENTITY CONCEPTS SYSTEM "intro-basics.xml">
<!-- Part 2: Basic Concepts -->
<!ENTITY INIT SYSTEM "basics-init.xml">
<!ENTITY ELEMENTS SYSTEM "basics-elements.xml">
<!ENTITY BINS SYSTEM "basics-bins.xml">
<!ENTITY BUS SYSTEM "basics-bus.xml">
<!ENTITY PADS SYSTEM "basics-pads.xml">
<!ENTITY DATA SYSTEM "basics-data.xml">
<!ENTITY HELLOWORLD SYSTEM "basics-helloworld.xml">
<!-- Part 3: Advanced Concepts -->
<!ENTITY QUERYEVENTS SYSTEM "advanced-position.xml">
<!ENTITY METADATA SYSTEM "advanced-metadata.xml">
<!ENTITY INTERFACES SYSTEM "advanced-interfaces.xml">
<!ENTITY CLOCKS SYSTEM "advanced-clocks.xml">
<!ENTITY BUFFERING SYSTEM "advanced-buffering.xml">
<!ENTITY DPARAMS SYSTEM "advanced-dparams.xml">
<!ENTITY THREADS SYSTEM "advanced-threads.xml">
<!ENTITY AUTOPLUGGING SYSTEM "advanced-autoplugging.xml">
<!ENTITY DATAACCESS SYSTEM "advanced-dataaccess.xml">
<!-- Part 4: Higher-level interfaces -->
<!ENTITY PLAYBACK SYSTEM "highlevel-playback.xml">
<!-- Appendices -->
<!ENTITY PROGRAMS SYSTEM "appendix-programs.xml">
<!ENTITY COMPILING SYSTEM "appendix-compiling.xml">
<!ENTITY CHECKLIST SYSTEM "appendix-checklist.xml">
<!ENTITY PORTING SYSTEM "appendix-porting.xml">
<!ENTITY INTEGRATION SYSTEM "appendix-integration.xml">
<!ENTITY LICENSING SYSTEM "appendix-licensing.xml">
<!ENTITY QUOTES SYSTEM "appendix-quotes.xml">
<!ENTITY GStreamer "<application>GStreamer</application>">
<!ENTITY GstPWG "<emphasis>GStreamer Plugin Writer's Guide</emphasis>">
]>
<book id="index">
&TITLEPAGE;
<!-- ############# Introduction ############### -->
<preface><title>Foreword</title>
<para><!-- synchronize with PWG -->
&GStreamer; is an extremely powerful and versatile framework for
creating streaming media applications. Many of the virtues of the
&GStreamer; framework come from its modularity: &GStreamer; can
seamlessly incorporate new plugin modules. But because modularity
and power often come at a cost of greater complexity, writing new
applications is not always easy.
</para>
<para>
This guide is intended to help you understand the &GStreamer;
framework (version &GST_VERSION;) so you can develop applications
based on it. The first chapters will focus on development of a
simple audio player, with much effort going into helping you
understand &GStreamer; concepts. Later chapters will go into
more advanced topics related to media playback, but will also
cover other forms of media processing (capture, editing, etc.).
</para>
</preface>
<preface><title>Introduction</title>
&INTRO;
</preface>
<!-- ############# Overview - part ############### -->
<part id="part-introduction">
<title>About GStreamer</title>
<partintro>
<para>
This part gives you an overview of the technologies described in
this book.
</para>
</partintro>
&GSTREAMER;
&MOTIVATION;
&CONCEPTS;
</part>
<!-- ############ Basic concepts - part ############# -->
<part id="part-building">
<title>Building an Application</title>
<partintro>
<para>
In these chapters, we will discuss the basic concepts of &GStreamer;
and the most-used objects, such as elements, pads and buffers. We
will use a visual representation of these objects so that we can
visualize the more complex pipelines you will learn to build later
on. You will get a first glance at the &GStreamer; API, which should
be enough for building elementary applications. Later on in this
part, you will also learn to build a basic command-line application.
</para>
<para>
Note that this part focuses on the low-level API and
concepts of &GStreamer;. Once you start building applications of your
own, you might want to use higher-level APIs instead. Those will be discussed
later on in this manual.
</para>
</partintro>
&INIT;
&ELEMENTS;
&BINS;
&BUS;
&PADS;
&DATA;
&HELLOWORLD;
</part>
<!-- ############ Advanced GStreamer - part ############# -->
<part id="part-advanced">
<title>Advanced &GStreamer; concepts</title>
<partintro>
<para>
In this part we will cover the more advanced features of &GStreamer;.
With the basics you learned in the previous part you should be
able to create a <emphasis>simple</emphasis> application. However,
&GStreamer; provides much more candy than just the basics of playing
back audio files. In this part, you will learn more of the
low-level features and internals of &GStreamer;.
</para>
<para>
Some chapters in this part serve mostly as an explanation of
how &GStreamer; works internally; they are not strictly needed for
day-to-day application development. This includes chapters such as the
ones covering scheduling, autoplugging and synchronization. Other
chapters, however, discuss more advanced ways of
pipeline-application interaction, and can turn out to be very useful
for certain applications. This includes the chapters on metadata,
querying and events, interfaces, dynamic parameters and pipeline
data manipulation.
</para>
</partintro>
&QUERYEVENTS;
&METADATA;
&INTERFACES;
&CLOCKS;
&BUFFERING;
&DPARAMS;
&THREADS;
&AUTOPLUGGING;
&DATAACCESS;
</part>
<!-- ############ Higher-level APIs in GStreamer - part ############# -->
<part id="part-highlevel">
<title>Higher-level interfaces for &GStreamer; applications</title>
<partintro>
<para>
In the previous two parts, you have learned many of the internals
and their corresponding low-level interfaces to &GStreamer;
application programming. Many people will, however, not need this
much control (or this much code), but will prefer to use a standard
playback interface that handles most of the difficult internals for
them. In this part, we will introduce you to the concepts of
autopluggers, playback managing elements and other such things.
Those higher-level interfaces are intended to
simplify &GStreamer;-based application programming. They do, however,
also reduce flexibility. It is up to the application developer
to choose which interface to use.
</para>
</partintro>
&PLAYBACK;
</part>
<!-- ############ Appendices - part ############# -->
<part id="part-appendices">
<title>Appendices</title>
<partintro>
<para>
By now, you've learned all about the internals of &GStreamer; and
application programming using the &GStreamer; framework. This part
will cover assorted topics that are useful to know if you're
going to use &GStreamer; for serious application programming. It
will touch upon integration with the popular desktop
environments that we run on (GNOME, KDE, OS X, Windows), it will
briefly explain how the applications included with &GStreamer; can
help make your life easier, and it will give some pointers on debugging.
</para>
<para>
In addition, we also provide a porting guide which explains
how to port &GStreamer;-0.10 applications to &GStreamer;-1.0.
</para>
</partintro>
<!--
Idea:
* Debugging and error handling
- 'error' signal in pipelines
- checking return values and how to handle them
- using signals for pipeline states
- gst-debug
- programs
* Desktop integration
- Linux/UNIX
. {x,xv}imagesink
. {oss,alsa}sink
. {v4l,v4l2,oss,alsa}src
- GNOME
. GConf ({video,audio}{src,sink})
. gnomevfssrc, gnomevfssink
. popt
. app examples (RB, Totem, gnome-media, ...)
- KDE
. kiosrc
. app examples (JuK, AmaroK)
. ask Scott/Mark
- Mac OS X
. native video/audio sink
- Windows
. build etc.
* Quotes from devs
- table please...
-->
&PROGRAMS;
&COMPILING;
&CHECKLIST;
&PORTING;
&INTEGRATION;
&LICENSING;
&QUOTES;
</part>
</book>

Binary file not shown.

Before

Width:  |  Height:  |  Size: 48 KiB


@ -1,92 +0,0 @@
Overview
Introduction
(creating multimedia apps)
(pipeline/plugin based)
Motivation
(multitude of duplicate code)
(mostly focused on one goal)
(reinvent plugin mechanisms)
(network transparency?)
(catch up with Windows(tm) world)
Goals
(clean and powerful)
(building graphs)
(building plugins)
(object oriented)
(using GTK+ object model)
(extensible)
(allow binary-only plugins)
(allow high performance)
(HW acceleration)
(efficient memory use)
(kernel buffers etc..)
Basic concepts
elements
(what is it)
(types) sink, src, filter
(have pads)
linking elements
bin
(can contain elements)
pipeline (a complete graph)
thread (threaded operation)
buffers
(pass between elements)
(contains data)
(can carry metadata)
(use refcounting)
element states
(null)
(ready)
(paused)
(playing)
Building apps
helloworld
(fdsrc->mp3decoder->audiosink)
(step by step explanation)
More on factories
problems with helloworld
MIME types
GStreamer types
Basic types
Your second application
advanced concepts
threads
queues
cothreads
dynamic pipeline construction
ghost pads
type detection
utility functions
XML in GStreamer
(saving)
(loading a pipeline)
Plugin development
plugin types
chain based
loop based
buffers
metadata
subbuffers
adding pads
libraries
plugin registry
types
type detection
QoS messages
clocks
GStreamer programs
editor
gstplay


@ -1,233 +0,0 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="311.62961"
height="322.74072"
id="svg2"
version="1.1"
inkscape:version="0.48.3.1 r9886"
sodipodi:docname="New document 1">
<defs
id="defs4" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="1.4"
inkscape:cx="161.52909"
inkscape:cy="161.37036"
inkscape:document-units="px"
inkscape:current-layer="layer1"
showgrid="false"
borderlayer="true"
fit-margin-top="0"
fit-margin-left="0"
fit-margin-right="0"
fit-margin-bottom="0"
inkscape:window-width="1600"
inkscape:window-height="841"
inkscape:window-x="0"
inkscape:window-y="27"
inkscape:window-maximized="1" />
<metadata
id="metadata7">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title></dc:title>
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Layer 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(-219.1852,-376.7061)">
<g
id="g3006"
style="fill:none;stroke-width:0.025in"
transform="matrix(0.07407407,0,0,0.07407407,119.44446,321.40981)">
<!-- Circle -->
<circle
id="circle3008"
style="stroke:#000000;stroke-width:7"
r="480"
cy="1950"
cx="3600"
sodipodi:cx="3600"
sodipodi:cy="1950"
sodipodi:rx="480"
sodipodi:ry="480"
d="m 4080,1950 c 0,265.0967 -214.9033,480 -480,480 -265.0967,0 -480,-214.9033 -480,-480 0,-265.0967 214.9033,-480 480,-480 265.0967,0 480,214.9033 480,480 z" />
<!-- Circle -->
<circle
id="circle3010"
style="stroke:#000000;stroke-width:7"
r="480"
cy="3150"
cx="3600"
sodipodi:cx="3600"
sodipodi:cy="3150"
sodipodi:rx="480"
sodipodi:ry="480"
d="m 4080,3150 c 0,265.0967 -214.9033,480 -480,480 -265.0967,0 -480,-214.9033 -480,-480 0,-265.0967 214.9033,-480 480,-480 265.0967,0 480,214.9033 480,480 z" />
<!-- Circle -->
<circle
id="circle3012"
style="stroke:#000000;stroke-width:7"
r="480"
cy="4350"
cx="3600"
sodipodi:cx="3600"
sodipodi:cy="4350"
sodipodi:rx="480"
sodipodi:ry="480"
d="m 4080,4350 c 0,265.0967 -214.9033,480 -480,480 -265.0967,0 -480,-214.9033 -480,-480 0,-265.0967 214.9033,-480 480,-480 265.0967,0 480,214.9033 480,480 z" />
<!-- Circle -->
<circle
id="circle3014"
style="stroke:#000000;stroke-width:7"
r="480"
cy="4350"
cx="4875"
sodipodi:cx="4875"
sodipodi:cy="4350"
sodipodi:rx="480"
sodipodi:ry="480"
d="m 5355,4350 c 0,265.0967 -214.9033,480 -480,480 -265.0967,0 -480,-214.9033 -480,-480 0,-265.0967 214.9033,-480 480,-480 265.0967,0 480,214.9033 480,480 z" />
<!-- Line -->
<polyline
id="polyline3016"
style="stroke:#000000;stroke-width:7;stroke-linecap:butt;stroke-linejoin:miter"
points="3600,900 3600,1380 " />
<!-- Arrowhead on XXXpoint 3600 900 - 3600 1515-->
<polygon
id="polygon3018"
style="fill:#000000;stroke:#000000;stroke-width:7;stroke-miterlimit:8"
points="3600,1499 3630,1379 3570,1379 3570,1379 " />
<!-- Line: box -->
<rect
id="rect3020"
style="stroke:#ffffff;stroke-width:7;stroke-linecap:butt;stroke-linejoin:miter"
rx="0"
height="4350"
width="4200"
y="750"
x="1350" />
<!-- Line -->
<polyline
id="polyline3022"
style="stroke:#000000;stroke-width:7;stroke-linecap:butt;stroke-linejoin:bevel"
points="3150,1875 3149,1876 3145,1878 3139,1881 3130,1887 3118,1895 3101,1905 3082,1917 3060,1931 3035,1948 3009,1966 2981,1985 2953,2006 2925,2028 2897,2051 2869,2075 2843,2101 2818,2127 2794,2155 2773,2184 2753,2215 2735,2248 2721,2283 2710,2321 2703,2360 2700,2400 2703,2440 2710,2479 2721,2517 2735,2552 2753,2585 2773,2616 2794,2645 2818,2673 2843,2699 2869,2725 2897,2749 2925,2772 2953,2794 2981,2815 3009,2834 3035,2852 3060,2869 3082,2883 3101,2895 3118,2905 3130,2913 3047,2863 " />
<!-- Arrowhead on XXXpoint 3130 2913 - 3162 2932-->
<polygon
id="polygon3024"
style="fill:#000000;stroke:#000000;stroke-width:7;stroke-miterlimit:8"
points="3148,2924 3060,2836 3029,2888 3029,2888 " />
<!-- Line -->
<polyline
id="polyline3026"
style="stroke:#000000;stroke-width:7;stroke-linecap:butt;stroke-linejoin:bevel"
points="3150,3150 3149,3151 3145,3153 3139,3156 3130,3162 3118,3170 3101,3180 3082,3192 3060,3206 3035,3223 3009,3241 2981,3260 2953,3281 2925,3303 2897,3326 2869,3350 2843,3376 2818,3402 2794,3430 2773,3459 2753,3490 2735,3523 2721,3558 2710,3596 2703,3635 2700,3675 2703,3715 2710,3754 2721,3792 2735,3827 2753,3860 2773,3891 2794,3920 2818,3948 2843,3974 2869,4000 2897,4024 2925,4047 2953,4069 2981,4090 3009,4109 3035,4127 3060,4144 3082,4158 3101,4170 3118,4180 3130,4188 3047,4138 " />
<!-- Arrowhead on XXXpoint 3130 4188 - 3162 4207-->
<polygon
id="polygon3028"
style="fill:#000000;stroke:#000000;stroke-width:7;stroke-miterlimit:8"
points="3148,4199 3060,4111 3029,4163 3029,4163 " />
<!-- Line -->
<polyline
id="polyline3030"
style="stroke:#000000;stroke-width:7;stroke-linecap:butt;stroke-linejoin:bevel"
points="3750,3900 3751,3899 3755,3896 3760,3891 3769,3884 3781,3875 3796,3863 3814,3850 3834,3834 3857,3818 3881,3802 3907,3785 3935,3768 3964,3752 3995,3737 4027,3723 4062,3710 4099,3699 4139,3689 4182,3682 4228,3677 4275,3675 4322,3677 4368,3682 4411,3689 4451,3699 4488,3710 4523,3723 4555,3737 4586,3752 4615,3768 4643,3785 4669,3802 4693,3818 4716,3834 4736,3850 4754,3863 4769,3875 4781,3884 4708,3823 " />
<!-- Arrowhead on XXXpoint 4781 3884 - 4811 3909-->
<polygon
id="polygon3032"
style="fill:#000000;stroke:#000000;stroke-width:7;stroke-miterlimit:8"
points="4799,3899 4726,3799 4687,3845 4687,3845 " />
<!-- Line -->
<polyline
id="polyline3034"
style="stroke:#000000;stroke-width:7;stroke-linecap:butt;stroke-linejoin:bevel"
points="4800,4800 4799,4801 4795,4804 4790,4809 4781,4816 4769,4825 4754,4837 4736,4850 4716,4866 4693,4882 4669,4898 4643,4915 4615,4932 4586,4948 4555,4963 4523,4977 4488,4990 4451,5001 4411,5011 4368,5018 4322,5023 4275,5025 4228,5023 4182,5018 4139,5011 4099,5001 4062,4990 4027,4977 3995,4963 3964,4948 3935,4932 3907,4915 3881,4898 3857,4882 3834,4866 3814,4850 3796,4837 3781,4825 3769,4816 3841,4876 " />
<!-- Arrowhead on XXXpoint 3769 4816 - 3738 4790-->
<polygon
id="polygon3036"
style="fill:#000000;stroke:#000000;stroke-width:7;stroke-miterlimit:8"
points="3750,4800 3822,4900 3861,4854 3861,4854 " />
<!-- Line -->
<polyline
id="polyline3038"
style="stroke:#000000;stroke-width:7;stroke-linecap:butt;stroke-linejoin:bevel"
points="5175,3975 5175,3974 5176,3972 5177,3967 5179,3960 5182,3951 5186,3937 5190,3921 5196,3901 5202,3877 5209,3849 5217,3819 5226,3785 5235,3748 5244,3708 5253,3666 5263,3622 5272,3577 5280,3530 5289,3481 5296,3432 5303,3382 5309,3331 5314,3280 5318,3228 5320,3175 5320,3121 5319,3067 5317,3011 5312,2955 5305,2897 5295,2839 5283,2779 5267,2719 5249,2658 5228,2596 5203,2535 5175,2475 5142,2414 5107,2356 5069,2301 5030,2250 4990,2203 4949,2160 4908,2120 4866,2083 4824,2050 4781,2019 4738,1990 4695,1964 4652,1940 4609,1917 4566,1896 4522,1877 4479,1859 4437,1842 4395,1827 4354,1812 4315,1799 4277,1787 4241,1776 4208,1766 4178,1758 4150,1750 4126,1744 4106,1739 4089,1734 4075,1731 4065,1729 4165,1755 " />
<!-- Arrowhead on XXXpoint 4065 1729 - 4035 1721-->
<polygon
id="polygon3040"
style="fill:#000000;stroke:#000000;stroke-width:7;stroke-miterlimit:8"
points="4051,1725 4159,1784 4174,1726 4174,1726 " />
<!-- Line -->
<polyline
id="polyline3042"
style="stroke:#000000;stroke-width:7;stroke-linecap:butt;stroke-linejoin:bevel"
points="3225,4650 3224,4650 3222,4649 3217,4647 3210,4644 3201,4640 3187,4635 3170,4629 3149,4620 3125,4610 3096,4598 3063,4585 3026,4570 2985,4553 2942,4534 2895,4514 2845,4492 2793,4469 2739,4444 2683,4418 2626,4391 2568,4363 2510,4334 2452,4303 2393,4272 2336,4240 2278,4207 2222,4173 2167,4137 2113,4101 2060,4064 2009,4025 1960,3985 1913,3943 1867,3900 1824,3856 1783,3809 1745,3760 1710,3710 1678,3657 1650,3602 1626,3545 1605,3486 1590,3426 1580,3363 1575,3300 1576,3236 1583,3173 1596,3110 1613,3049 1634,2988 1660,2930 1689,2873 1721,2817 1757,2764 1795,2712 1835,2661 1878,2612 1924,2564 1971,2517 2019,2471 2070,2427 2122,2383 2175,2340 2229,2299 2285,2258 2341,2217 2397,2178 2454,2140 2510,2103 2567,2066 2622,2032 2676,1998 2729,1966 2779,1935 2828,1907 2874,1880 2916,1855 2956,1833 2992,1812 3024,1794 3052,1779 3076,1765 3097,1754 3113,1745 3126,1738 3136,1733 3046,1784 " />
<!-- Arrowhead on XXXpoint 3136 1733 - 3163 1717-->
<polygon
id="polygon3044"
style="fill:#000000;stroke:#000000;stroke-width:7;stroke-miterlimit:8"
points="3149,1725 3030,1760 3061,1811 3061,1811 " />
<!-- Text -->
<text
id="text3046"
font-size="144"
font-weight="normal"
font-style="normal"
y="2025"
x="3375"
xml:space="preserve"
style="font-size:144px;font-style:normal;font-weight:normal;text-anchor:start;fill:#000000;font-family:Times">NULL</text>
<!-- Text -->
<text
id="text3048"
font-size="144"
font-weight="normal"
font-style="normal"
y="3225"
x="3300"
xml:space="preserve"
style="font-size:144px;font-style:normal;font-weight:normal;text-anchor:start;fill:#000000;font-family:Times">READY</text>
<!-- Text -->
<text
id="text3050"
font-size="144"
font-weight="normal"
font-style="normal"
y="4425"
x="3225"
xml:space="preserve"
style="font-size:144px;font-style:normal;font-weight:normal;text-anchor:start;fill:#000000;font-family:Times">PLAYING</text>
<!-- Text -->
<text
id="text3052"
font-size="144"
font-weight="normal"
font-style="normal"
y="4425"
x="4500"
xml:space="preserve"
style="font-size:144px;font-style:normal;font-weight:normal;text-anchor:start;fill:#000000;font-family:Times">PAUSED</text>
</g>
</g>
</svg>


@ -1,69 +0,0 @@
<bookinfo>
<authorgroup>
<author>
<firstname>Wim</firstname>
<surname>Taymans</surname>
<authorblurb>
<para>
<email>wim.taymans@chello.be</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Steve</firstname>
<surname>Baker</surname>
<authorblurb>
<para>
<email>stevebaker_org@yahoo.co.uk</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Andy</firstname>
<surname>Wingo</surname>
<authorblurb>
<para>
<email>wingo@pobox.com</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Ronald</firstname>
<othername>S.</othername>
<surname>Bultje</surname>
<authorblurb>
<para>
<email>rbultje@ronald.bitfreak.net</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Stefan</firstname>
<surname>Kost</surname>
<authorblurb>
<para>
<email>ensonic@users.sf.net</email>
</para>
</authorblurb>
</author>
</authorgroup>
<legalnotice id="misc-legalnotice">
<para>
This material may be distributed only subject to the terms and
conditions set forth in the Open Publication License, v1.0 or later (the
latest version is presently available at <ulink url="
http://www.opencontent.org/opl.shtml"
type="http">http://www.opencontent.org/opl.shtml</ulink>).
</para>
</legalnotice>
<title>&GStreamer; Application Development Manual (&GST_VERSION;)</title>
</bookinfo>

docs/pwg/.gitignore vendored

@ -1,7 +0,0 @@
Makefile
Makefile.in
.deps
build
html
*.pdf
*.ps


@ -1,38 +0,0 @@
### this is the part you can customize if you need to
# base name of doc
DOC = pwg
# formats defined for upload-doc.mak
FORMATS=html ps pdf
# main xml file
MAIN = $(DOC).xml
# all xml sources
XML = $(notdir $(wildcard $(srcdir)/*.xml))
# base style sheet
CSS = base.css
# image sources
PNG_SRC =
FIG_SRC = $(notdir $(wildcard $(srcdir)/*.fig))
# extra sources to copy in build directory
EXTRA_SRC =
### this is the generic bit and you shouldn't need to change this
# get the generic docbuilding Makefile stuff
include $(srcdir)/../manuals.mak
# get the generic upload target
include $(top_srcdir)/common/upload-doc.mak
### this is standard automake stuff
# package up all the source
EXTRA_DIST = $(SRC)
# install documentation
pwgdir = $(docdir)/$(DOC)
pwg_DATA = $(PDF_DAT) $(PS_DAT)
include $(srcdir)/../htmlinstall.mak


@ -1,817 +0,0 @@
<chapter id="chapter-allocation" xreflabel="Memory allocation">
<title>Memory allocation</title>
<para>
Memory allocation and management is a very important topic in
multimedia. High definition video uses many megabytes to store
one single frame of video. It is important to reuse the memory
when possible instead of constantly allocating and freeing
the memory.
</para>
<para>
Multimedia systems usually use special-purpose chips, such as
DSPs or GPUs, to perform the heavy lifting (especially for video).
These special-purpose chips usually have strict requirements
for the memory that they can operate on and how that memory
is accessed.
</para>
<para>
This chapter talks about the memory management features that
&GStreamer; plugins can use. We will first talk about the
low-level <classname>GstMemory</classname> object that manages
access to a piece of memory. We then continue with
<classname>GstBuffer</classname> that is used to exchange data
between plugins (and the application) and that uses
<classname>GstMemory</classname>. We talk about
<classname>GstMeta</classname> that can be placed on buffers to
give extra info about the buffer and its memory.
For efficiently managing buffers of the same size, we take a
look at <classname>GstBufferPool</classname>. To conclude this
chapter we take a look at the GST_QUERY_ALLOCATION query that
is used to negotiate memory management options between elements.
</para>
<sect1 id="section-allocation-memory" xreflabel="GstMemory">
<title>GstMemory</title>
<para>
<classname>GstMemory</classname> is an object that manages a region
of memory. The memory object points to a region of memory of
<quote>maxsize</quote>. The area in this memory starting at
<quote>offset</quote> and for <quote>size</quote> bytes is the
accessible region in the memory. The maxsize of the memory can
never be changed after the object is created; the offset
and size, however, can be changed.
</para>
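<para>
As an illustrative sketch (using the default system allocator; error
checking omitted), the accessible region can be inspected with
<function>gst_memory_get_sizes ()</function> and narrowed with
<function>gst_memory_resize ()</function>:
</para>
<programlisting>
<![CDATA[
[...]
GstMemory *mem;
gsize offset, maxsize, size;

/* allocate 100 bytes */
mem = gst_allocator_alloc (NULL, 100, NULL);

/* query the current offset, size and maxsize */
size = gst_memory_get_sizes (mem, &offset, &maxsize);

/* expose 80 bytes starting 10 bytes further into the region;
 * the first argument adjusts the offset relative to its current
 * value, and offset + size must stay within maxsize */
gst_memory_resize (mem, 10, 80);

gst_memory_unref (mem);
[...]
]]>
</programlisting>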
<sect2 id="section-allocation-allocator" xreflabel="GstAllocator">
<title>GstAllocator</title>
<para>
<classname>GstMemory</classname> objects are created by a
<classname>GstAllocator</classname> object. Most allocators implement the
default <function>gst_allocator_alloc()</function> method but some allocators
might implement a different method, for example when additional parameters
are needed to allocate the specific memory.
</para>
<para>
Different allocators exist for, for example, system memory, shared memory
and memory backed by a DMAbuf file descriptor. To implement support for a
new kind of memory type, you must implement a new allocator object as shown
below.
</para>
</sect2>
<sect2 id="section-allocation-memory-ex" xreflabel="GstMemory-ex">
<title>GstMemory API example</title>
<para>
Data access to the memory wrapped by the <classname>GstMemory</classname>
object is always protected with a <function>gst_memory_map()</function>
and <function>gst_memory_unmap()</function> pair. An access mode
(read/write) must be given when mapping memory. The map
function returns a pointer to the valid memory region that can
then be accessed according to the requested access mode.
</para>
<para>
Below is an example of making a <classname>GstMemory</classname>
object and using the <function>gst_memory_map()</function> to
access the memory region.
</para>
<programlisting>
<![CDATA[
[...]
GstMemory *mem;
GstMapInfo info;
gint i;
/* allocate 100 bytes */
mem = gst_allocator_alloc (NULL, 100, NULL);
/* get access to the memory in write mode */
gst_memory_map (mem, &info, GST_MAP_WRITE);
/* fill with pattern */
for (i = 0; i < info.size; i++)
info.data[i] = i;
/* release memory */
gst_memory_unmap (mem, &info);
[...]
]]>
</programlisting>
</sect2>
<sect2 id="section-allocation-allocator-ex" xreflabel="GstAllocator-ex">
<title>Implementing a GstAllocator</title>
<para>
WRITEME
</para>
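<para>
Until this section is written, the following is a rough, untested
sketch of the usual structure (all the <function>my_*</function>
names are placeholders; consult the
<classname>GstAllocator</classname> API documentation for details).
A custom allocator subclasses <classname>GstAllocator</classname>,
implements the <function>alloc</function> and
<function>free</function> class methods, provides the memory
map/unmap implementations on the instance, and registers itself
under a name:
</para>
<programlisting>
<![CDATA[
typedef struct _MyAllocator {
  GstAllocator parent;
} MyAllocator;

typedef struct _MyAllocatorClass {
  GstAllocatorClass parent_class;
} MyAllocatorClass;

static GstMemory *
my_alloc (GstAllocator * allocator, gsize size,
    GstAllocationParams * params)
{
  /* allocate the backing storage and a GstMemory-derived structure,
   * then initialize the latter with gst_memory_init () */
  [...]
}

static void
my_free (GstAllocator * allocator, GstMemory * mem)
{
  /* release the backing storage */
  [...]
}

static void
my_allocator_class_init (MyAllocatorClass * klass)
{
  GstAllocatorClass *allocator_class = (GstAllocatorClass *) klass;

  allocator_class->alloc = my_alloc;
  allocator_class->free = my_free;
}

static void
my_allocator_init (MyAllocator * allocator)
{
  GstAllocator *alloc = GST_ALLOCATOR (allocator);

  /* the allocator instance provides the map/unmap/copy/share
   * implementations for the memory it creates */
  alloc->mem_map = my_mem_map;
  alloc->mem_unmap = my_mem_unmap;
  [...]
}

/* somewhere in plugin or application init: make the allocator
 * available under a well-known name */
gst_allocator_register ("my-allocator",
    g_object_new (my_allocator_get_type (), NULL));
]]>
</programlisting>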
</sect2>
</sect1>
<sect1 id="section-allocation-buffer" xreflabel="GstBuffer">
<title>GstBuffer</title>
<para>
A <classname>GstBuffer</classname> is a lightweight object that
is passed from an upstream to a downstream element and contains
memory and metadata. It represents the multimedia content that
is pushed or pulled downstream by elements.
</para>
<para>
The buffer contains one or more <classname>GstMemory</classname>
objects that represent the data in the buffer.
</para>
<para>
Metadata in the buffer consists of:
</para>
<itemizedlist mark="opencircle">
<listitem>
<para>
DTS and PTS timestamps. These represent the decoding and
presentation timestamps of the buffer content and are used by
synchronizing elements to schedule buffers. Both these timestamps
can be GST_CLOCK_TIME_NONE when unknown/undefined.
</para>
</listitem>
<listitem>
<para>
The duration of the buffer contents. This duration can be
GST_CLOCK_TIME_NONE when unknown/undefined.
</para>
</listitem>
<listitem>
<para>
Media specific offsets and offset_end. For video this is the
frame number in the stream and for audio the sample number. Other
definitions for other media exist.
</para>
</listitem>
<listitem>
<para>
Arbitrary structures via <classname>GstMeta</classname>, see below.
</para>
</listitem>
</itemizedlist>
<sect2 id="section-allocation-writability" xreflabel="GstBuffer-write">
<title>GstBuffer writability</title>
<para>
A buffer is writable when the refcount of the object is exactly 1, meaning
that only one object is holding a ref to the buffer. You can only
modify anything in the buffer when the buffer is writable. This means
that you need to call <function>gst_buffer_make_writable()</function>
before changing the timestamps, offsets, metadata or adding and
removing memory blocks.
</para>
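<para>
A minimal sketch of the usual pattern (assuming
<varname>buffer</varname> holds a reference obtained elsewhere):
</para>
<programlisting>
<![CDATA[
[...]
/* gst_buffer_make_writable () takes ownership of the passed
 * reference and returns a writable buffer, which will be a copy
 * when the refcount was higher than 1 */
buffer = gst_buffer_make_writable (buffer);

/* now it is safe to change the metadata */
GST_BUFFER_PTS (buffer) = 0;
[...]
]]>
</programlisting>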
</sect2>
<sect2 id="section-allocation-buffer-ex" xreflabel="GstBuffer-ex">
<title>GstBuffer API examples</title>
<para>
You can create a buffer with <function>gst_buffer_new ()</function>
and then add memory objects to it or you can use a convenience function
<function>gst_buffer_new_allocate ()</function> which combines the
two. It's also possible to wrap existing memory with
<function>gst_buffer_new_wrapped_full () </function> where you can
give the function to call when the memory should be freed.
</para>
<para>
You can access the memory of the buffer by getting and mapping the
<classname>GstMemory</classname> objects individually or by using
<function>gst_buffer_map ()</function>. The latter merges all the
memory into one big block and then gives you a pointer to this block.
</para>
<para>
Below is an example of how to create a buffer and access its memory.
</para>
<programlisting>
<![CDATA[
[...]
GstBuffer *buffer;
GstMemory *mem;
GstMapInfo info;
/* make empty buffer */
buffer = gst_buffer_new ();
/* make memory holding 100 bytes */
mem = gst_allocator_alloc (NULL, 100, NULL);
/* add the buffer */
gst_buffer_append_memory (buffer, mem);
[...]
/* get WRITE access to the memory and fill with 0xff */
gst_buffer_map (buffer, &info, GST_MAP_WRITE);
memset (info.data, 0xff, info.size);
gst_buffer_unmap (buffer, &info);
[...]
/* free the buffer */
gst_buffer_unref (buffer);
[...]
]]>
</programlisting>
</sect2>
</sect1>
<sect1 id="section-allocation-meta" xreflabel="GstMeta">
<title>GstMeta</title>
<para>
With the <classname>GstMeta</classname> system you can add arbitrary
structures on buffers. These structures describe extra properties
of the buffer such as cropping, stride, region of interest etc.
</para>
<para>
The metadata system separates API specification (what the metadata
and its API look like) from the implementation (how it works). This makes
it possible to have different implementations of the same API,
for example, depending on the hardware you are running on.
</para>
<sect2 id="section-allocation-meta-ex" xreflabel="GstMeta-ex">
<title>GstMeta API example</title>
<para>
After allocating a new buffer, you can add metadata to the buffer
with the metadata specific API. This means that you will need to
link to the header file where the metadata is defined to use
its API.
</para>
<para>
By convention, a metadata API with name <classname>FooBar</classname>
should provide two methods, a
<function>gst_buffer_add_foo_bar_meta ()</function> and a
<function>gst_buffer_get_foo_bar_meta ()</function>. Both functions
should return a pointer to a <classname>FooBarMeta</classname>
structure that contains the metadata fields. Some of the
<function>_add_*_meta ()</function> functions can have extra parameters that
will usually be used to configure the metadata structure for you.
</para>
<para>
Let's have a look at the metadata that is used to specify a cropping
region for video frames.
</para>
<programlisting>
<![CDATA[
#include <gst/video/gstvideometa.h>
[...]
GstVideoCropMeta *meta;
/* buffer points to a video frame, add some cropping metadata */
meta = gst_buffer_add_video_crop_meta (buffer);
/* configure the cropping metadata */
meta->x = 8;
meta->y = 8;
meta->width = 120;
meta->height = 80;
[...]
]]>
</programlisting>
<para>
An element can then use the metadata on the buffer when rendering
the frame like this:
</para>
<programlisting>
<![CDATA[
#include <gst/video/gstvideometa.h>
[...]
GstVideoCropMeta *meta;
/* buffer points to a video frame, get the cropping metadata */
meta = gst_buffer_get_video_crop_meta (buffer);
if (meta) {
/* render frame with cropping */
_render_frame_cropped (buffer, meta->x, meta->y, meta->width, meta->height);
} else {
/* render frame */
_render_frame (buffer);
}
[...]
]]>
</programlisting>
</sect2>
<sect2 id="section-allocation-meta-new" xreflabel="GstMeta-new">
<title>Implementing new GstMeta</title>
<para>
In the next sections we show how you can add new metadata to the
system and use it on buffers.
</para>
<sect3 id="section-allocation-meta-api" xreflabel="GstMeta-api">
<title>Define the metadata API</title>
<para>
First we need to define what our API will look like and we
will have to register this API to the system. This is important
because this API definition will be used when elements negotiate
what kind of metadata they will exchange. The API definition
also contains arbitrary tags that give hints about what the
metadata contains. This is important when we see how metadata
is preserved when buffers pass through the pipeline.
</para>
<para>
If you are making a new implementation of an existing API,
you can skip this step and move on to the implementation step.
</para>
<para>
First we start with making the
<filename>my-example-meta.h</filename> header file that will contain
the definition of the API and structure for our metadata.
</para>
<programlisting>
<![CDATA[
#include <gst/gst.h>
typedef struct _MyExampleMeta MyExampleMeta;
struct _MyExampleMeta {
GstMeta meta;
gint age;
gchar *name;
};
GType my_example_meta_api_get_type (void);
#define MY_EXAMPLE_META_API_TYPE (my_example_meta_api_get_type())
#define gst_buffer_get_my_example_meta(b) \
((MyExampleMeta*)gst_buffer_get_meta((b),MY_EXAMPLE_META_API_TYPE))
]]>
</programlisting>
<para>
The metadata API definition consists of the definition of the
structure that holds a gint and a string. The first field in
the structure must be <classname>GstMeta</classname>.
</para>
<para>
We also define a <function>my_example_meta_api_get_type ()</function>
function that will register our metadata API definition. We
also define a convenience macro
<function>gst_buffer_get_my_example_meta ()</function> that simply
finds and returns the metadata with our new API.
</para>
<para>
Next let's have a look at how the
<function>my_example_meta_api_get_type ()</function> function is
implemented in the <filename>my-example-meta.c</filename> file.
</para>
<programlisting>
<![CDATA[
#include "my-example-meta.h"
GType
my_example_meta_api_get_type (void)
{
static volatile GType type;
static const gchar *tags[] = { "foo", "bar", NULL };
if (g_once_init_enter (&type)) {
GType _type = gst_meta_api_type_register ("MyExampleMetaAPI", tags);
g_once_init_leave (&type, _type);
}
return type;
}
]]>
</programlisting>
<para>
As you can see, it simply uses the
<function>gst_meta_api_type_register ()</function> function to
register a name for the api and some tags. The result is a
new pointer GType that defines the newly registered API.
</para>
</sect3>
<sect3 id="section-allocation-meta-impl" xreflabel="GstMeta-impl">
<title>Implementing a metadata API</title>
<para>
Next we can make an implementation for a registered metadata
API GType. The implementation details of a metadata API
are kept in a <classname>GstMetaInfo</classname> structure
that you will make available to the users of your metadata
API implementation with a <function>my_example_meta_get_info ()</function>
function and a convenience <function>MY_EXAMPLE_META_INFO</function>
macro. You will also make a method to add your metadata
implementation to a <classname>GstBuffer</classname>.
Your <filename>my-example-meta.h</filename> header file will
need these additions:
</para>
<programlisting>
<![CDATA[
[...]
/* implementation */
const GstMetaInfo *my_example_meta_get_info (void);
#define MY_EXAMPLE_META_INFO (my_example_meta_get_info())
MyExampleMeta * gst_buffer_add_my_example_meta (GstBuffer *buffer,
gint age,
const gchar *name);
]]>
</programlisting>
<para>
Let's have a look at how these functions are
implemented in the <filename>my-example-meta.c</filename> file.
</para>
<programlisting>
<![CDATA[
[...]
static gboolean
my_example_meta_init (GstMeta * meta, gpointer params, GstBuffer * buffer)
{
MyExampleMeta *emeta = (MyExampleMeta *) meta;
emeta->age = 0;
emeta->name = NULL;
return TRUE;
}
static gboolean
my_example_meta_transform (GstBuffer * transbuf, GstMeta * meta,
GstBuffer * buffer, GQuark type, gpointer data)
{
MyExampleMeta *emeta = (MyExampleMeta *) meta;
/* we always copy no matter what transform */
gst_buffer_add_my_example_meta (transbuf, emeta->age, emeta->name);
return TRUE;
}
static void
my_example_meta_free (GstMeta * meta, GstBuffer * buffer)
{
MyExampleMeta *emeta = (MyExampleMeta *) meta;
g_free (emeta->name);
emeta->name = NULL;
}
const GstMetaInfo *
my_example_meta_get_info (void)
{
static const GstMetaInfo *meta_info = NULL;
if (g_once_init_enter (&meta_info)) {
const GstMetaInfo *mi = gst_meta_register (MY_EXAMPLE_META_API_TYPE,
"MyExampleMeta",
sizeof (MyExampleMeta),
my_example_meta_init,
my_example_meta_free,
my_example_meta_transform);
g_once_init_leave (&meta_info, mi);
}
return meta_info;
}
MyExampleMeta *
gst_buffer_add_my_example_meta (GstBuffer *buffer,
gint age,
const gchar *name)
{
MyExampleMeta *meta;
g_return_val_if_fail (GST_IS_BUFFER (buffer), NULL);
meta = (MyExampleMeta *) gst_buffer_add_meta (buffer,
MY_EXAMPLE_META_INFO, NULL);
meta->age = age;
meta->name = g_strdup (name);
return meta;
}
]]>
</programlisting>
<para>
<function>gst_meta_register ()</function> registers the implementation
details, like the API that you implement and the size of the
metadata structure along with methods to initialize and free the
memory area. You can also implement a transform function that will
be called when a certain transformation (identified by the quark and
quark specific data) is performed on a buffer.
</para>
<para>
Lastly, you implement a <function>gst_buffer_add_*_meta()</function>
that adds the metadata implementation to a buffer and sets the
values of the metadata.
</para>
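<para>
As a sketch of how an element could use this metadata API
(the buffer and the values here are purely illustrative):
</para>
<programlisting>
<![CDATA[
GstBuffer *buffer;
MyExampleMeta *meta;

buffer = gst_buffer_new_allocate (NULL, 1024, NULL);

/* attach our metadata with some example values */
gst_buffer_add_my_example_meta (buffer, 42, "example");

/* later, possibly in another element, find it again */
meta = gst_buffer_get_my_example_meta (buffer);
if (meta != NULL)
  g_print ("age %d, name %s\n", meta->age, meta->name);

gst_buffer_unref (buffer);
]]>
</programlisting>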
</sect3>
</sect2>
</sect1>
<sect1 id="section-allocation-bufferpool" xreflabel="GstBufferPool">
<title>GstBufferPool</title>
<para>
The <classname>GstBufferPool</classname> object provides a convenient
base class for managing lists of reusable buffers. Essential for this
object is that all the buffers have the same properties such as size,
padding, metadata and alignment.
</para>
<para>
A bufferpool object can be configured to manage a minimum and maximum
amount of buffers of a specific size. A bufferpool can also be
configured to use a specific <classname>GstAllocator</classname> for
the memory of the buffers. There is support in the bufferpool to enable
bufferpool specific options, such as adding <classname>GstMeta</classname>
to the buffers in the pool or enabling specific padding on
the memory in the buffers.
</para>
<para>
A bufferpool can be inactive or active. In the inactive state,
you can configure the pool. In the active state, you can't change
the configuration anymore but you can acquire and release buffers
from/to the pool.
</para>
<para>
In the following sections we take a look at how you can use
a bufferpool.
</para>
<sect2 id="section-allocation-pool-ex" xreflabel="GstBufferPool-ex">
<title>GstBufferPool API example</title>
<para>
Many different bufferpool implementations can exist; they are all
subclasses of the base class <classname>GstBufferPool</classname>.
For this example, we will assume we somehow have access to a
bufferpool, either because we created it ourselves or because
we were given one as a result of the ALLOCATION query as we will
see below.
</para>
<para>
The bufferpool is initially in the inactive state so that we can
configure it. Trying to configure a bufferpool that is not in the
inactive state will fail. Likewise, trying to activate a bufferpool
that is not configured will fail.
</para>
<programlisting>
<![CDATA[
GstStructure *config;
[...]
/* get config structure */
config = gst_buffer_pool_get_config (pool);
/* set caps, size, minimum and maximum buffers in the pool */
gst_buffer_pool_config_set_params (config, caps, size, min, max);
/* configure allocator and parameters */
gst_buffer_pool_config_set_allocator (config, allocator, &params);
/* store the updated configuration again */
gst_buffer_pool_set_config (pool, config);
[...]
]]>
</programlisting>
<para>
The configuration of the bufferpool is maintained in a generic
<classname>GstStructure</classname> that can be obtained with
<function>gst_buffer_pool_get_config()</function>. Convenience
methods exist to get and set the configuration options in this
structure. After updating the structure, it is set as the current
configuration in the bufferpool again with
<function>gst_buffer_pool_set_config()</function>.
</para>
<para>
The following options can be configured on a bufferpool:
</para>
<itemizedlist mark="opencircle">
<listitem>
<para>
The caps of the buffers to allocate.
</para>
</listitem>
<listitem>
<para>
The size of the buffers. This is the suggested size of the
buffers in the pool. The pool might decide to allocate larger
buffers to add padding.
</para>
</listitem>
<listitem>
<para>
The minimum and maximum amount of buffers in the pool. When
minimum is set to > 0, the bufferpool will pre-allocate this
amount of buffers. When maximum is not 0, the bufferpool
will allocate up to the maximum amount of buffers.
</para>
</listitem>
<listitem>
<para>
The allocator and parameters to use. Some bufferpools might
ignore the allocator and use their internal one.
</para>
</listitem>
<listitem>
<para>
Other arbitrary bufferpool options identified with a string.
A bufferpool lists the supported options with
<function>gst_buffer_pool_get_options()</function> and you
can ask if an option is supported with
<function>gst_buffer_pool_has_option()</function>. The option
can be enabled by adding it to the configuration structure
with <function>gst_buffer_pool_config_add_option ()</function>.
These options are used to enable things like letting the
pool set metadata on the buffers or to add extra configuration
options for padding, for example.
</para>
</listitem>
</itemizedlist>
<para>
After the configuration is set on the bufferpool, the pool can
be activated with
<function>gst_buffer_pool_set_active (pool, TRUE)</function>. From
that point on you can use
<function>gst_buffer_pool_acquire_buffer ()</function> to retrieve
a buffer from the pool, like this:
</para>
<programlisting>
<![CDATA[
[...]
GstFlowReturn ret;
GstBuffer *buffer;
ret = gst_buffer_pool_acquire_buffer (pool, &buffer, NULL);
if (G_UNLIKELY (ret != GST_FLOW_OK))
goto pool_failed;
[...]
]]>
</programlisting>
<para>
It is important to check the return value of the acquire function
because it is possible that it fails: When your
element shuts down, it will deactivate the bufferpool and then
all calls to acquire will return GST_FLOW_FLUSHING.
</para>
<para>
All buffers that are acquired from the pool will have their pool
member set to the original pool. When the last ref is decremented
on the buffer, &GStreamer; will automatically call
<function>gst_buffer_pool_release_buffer()</function> to release
the buffer back to the pool. You (or any other downstream element)
don't need to know if a buffer came from a pool; you can just
unref it.
</para>
</sect2>
<sect2 id="section-allocation-pool-impl" xreflabel="GstBufferPool-impl">
<title>Implementing a new GstBufferPool</title>
<para>
WRITEME
</para>
</sect2>
</sect1>
<sect1 id="section-allocation-query" xreflabel="GST_QUERY_ALLOCATION">
<title>GST_QUERY_ALLOCATION</title>
<para>
The ALLOCATION query is used to negotiate
<classname>GstMeta</classname>, <classname>GstBufferPool</classname>
and <classname>GstAllocator</classname> between elements. Negotiation
of the allocation strategy is always initiated and decided by a srcpad
after it has negotiated a format and before it decides to push buffers.
A sinkpad can suggest an allocation strategy but it is ultimately the
source pad that will decide based on the suggestions of the downstream
sink pad.
</para>
<para>
The source pad will do a GST_QUERY_ALLOCATION with the negotiated caps
as a parameter. This is needed so that the downstream element knows
what media type is being handled. A downstream sink pad can answer the
allocation query with the following results:
</para>
<itemizedlist mark="opencircle">
<listitem>
<para>
An array of possible <classname>GstBufferPool</classname> suggestions
with suggested size, minimum and maximum amount of buffers.
</para>
</listitem>
<listitem>
<para>
An array of GstAllocator objects along with suggested allocation
parameters such as flags, prefix, alignment and padding. These
allocators can also be configured in a bufferpool when this is
supported by the bufferpool.
</para>
</listitem>
<listitem>
<para>
An array of supported <classname>GstMeta</classname> implementations
along with metadata specific parameters.
It is important that the upstream element knows what kind of
metadata is supported downstream before it places that metadata
on buffers.
</para>
</listitem>
</itemizedlist>
<para>
When the GST_QUERY_ALLOCATION returns, the source pad will select
from the available bufferpools, allocators and metadata how it will
allocate buffers.
</para>
<sect2 id="section-allocation-query-ex" xreflabel="Allocation-ex">
<title>ALLOCATION query example</title>
<para>
Below is an example of the ALLOCATION query.
</para>
<programlisting>
<![CDATA[
#include <gst/video/video.h>
#include <gst/video/gstvideometa.h>
#include <gst/video/gstvideopool.h>
GstCaps *caps;
GstQuery *query;
GstStructure *structure;
GstBufferPool *pool;
GstStructure *config;
guint size, min, max;
[...]
/* find a pool for the negotiated caps now */
query = gst_query_new_allocation (caps, TRUE);
if (!gst_pad_peer_query (scope->srcpad, query)) {
/* query failed, not a problem, we use the query defaults */
}
if (gst_query_get_n_allocation_pools (query) > 0) {
/* we got configuration from our peer, parse them */
gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max);
} else {
pool = NULL;
size = 0;
min = max = 0;
}
if (pool == NULL) {
/* we did not get a pool, make one ourselves then */
pool = gst_video_buffer_pool_new ();
}
config = gst_buffer_pool_get_config (pool);
gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META);
gst_buffer_pool_config_set_params (config, caps, size, min, max);
gst_buffer_pool_set_config (pool, config);
/* and activate */
gst_buffer_pool_set_active (pool, TRUE);
[...]
]]>
</programlisting>
<para>
This particular implementation will make a custom
<classname>GstVideoBufferPool</classname> object that is specialized
in allocating video buffers. You can also enable the pool to
put <classname>GstVideoMeta</classname> metadata on the buffers from
the pool by calling
<function>gst_buffer_pool_config_add_option (config,
GST_BUFFER_POOL_OPTION_VIDEO_META)</function>.
</para>
</sect2>
<sect2 id="section-allocation-query-base" xreflabel="Allocation-base">
<title>The ALLOCATION query in base classes</title>
<para>
In many baseclasses you will see the following virtual methods for
influencing the allocation strategy:
</para>
<itemizedlist>
<listitem>
<para>
<function>propose_allocation ()</function> should suggest
allocation parameters for the upstream element.
</para>
</listitem>
<listitem>
<para>
<function>decide_allocation ()</function> should decide the
allocation parameters from the suggestions received from
downstream.
</para>
</listitem>
</itemizedlist>
<para>
Implementors of these methods should modify the given
<classname>GstQuery</classname> object by updating the pool options
and allocation options.
</para>
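<para>
As a sketch, a <function>decide_allocation ()</function> implementation
in a video element could make sure a pool with video metadata support
is configured in the query (the element name and the base class used
here are illustrative):
</para>
<programlisting>
<![CDATA[
static gboolean
gst_my_element_decide_allocation (GstBaseTransform * trans, GstQuery * query)
{
  GstBufferPool *pool = NULL;
  guint size = 0, min = 0, max = 0;
  GstStructure *config;

  if (gst_query_get_n_allocation_pools (query) > 0)
    gst_query_parse_nth_allocation_pool (query, 0, &pool, &size, &min, &max);

  if (pool == NULL)
    pool = gst_video_buffer_pool_new ();

  /* enable video metadata on the buffers from this pool */
  config = gst_buffer_pool_get_config (pool);
  gst_buffer_pool_config_add_option (config, GST_BUFFER_POOL_OPTION_VIDEO_META);
  gst_buffer_pool_set_config (pool, config);

  /* store the updated pool back into the query */
  if (gst_query_get_n_allocation_pools (query) > 0)
    gst_query_set_nth_allocation_pool (query, 0, pool, size, min, max);
  else
    gst_query_add_allocation_pool (query, pool, size, min, max);

  gst_object_unref (pool);
  return TRUE;
}
]]>
</programlisting>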
</sect2>
</sect1>
</chapter>

<chapter id="chapter-advanced-clock">
<title>Clocking</title>
<para>
When playing complex media, each sound and video sample must be played in a
specific order at a specific time. For this purpose, GStreamer provides a
synchronization mechanism.
</para>
<sect1 id="section-clocks" xreflabel="Clocks">
<title>Clocks</title>
<para>
Time in &GStreamer; is defined as the value returned from a particular
<classname>GstClock</classname> object from the method
<function>gst_clock_get_time ()</function>.
</para>
<para>
In a typical computer, there are many sources that can be used as a
time source, e.g., the system time, soundcards, CPU performance
counters, ... For this reason, there are many
<classname>GstClock</classname> implementations available in &GStreamer;.
The clock time doesn't always start from 0 or from some known value.
Some clocks start counting from some known start date, other clocks start
counting since last reboot, etc...
</para>
<para>
As clocks return an absolute measure of time, they are not usually used
directly. Instead, differences between two clock times are used to
measure elapsed time according to a clock.
</para>
</sect1>
<sect1 id="section-clock-time-types" xreflabel="Clock running-time">
<title> Clock running-time </title>
<para>
A clock returns the <emphasis role="strong">absolute-time</emphasis>
according to that clock with <function>gst_clock_get_time ()</function>.
From the absolute-time, a <emphasis role="strong">running-time</emphasis>
is calculated, which is simply the difference between the absolute-time
and a previous snapshot of the absolute-time called the
<emphasis role="strong">base-time</emphasis>. So:
</para>
<para>
running-time = absolute-time - base-time
</para>
<para>
A &GStreamer; <classname>GstPipeline</classname> object maintains a
<classname>GstClock</classname> object and a base-time when it goes
to the PLAYING state. The pipeline gives a handle to the selected
<classname>GstClock</classname> to each element in the pipeline along
with selected base-time. The pipeline will select a base-time in such
a way that the running-time reflects the total time spent in the
PLAYING state. As a result, when the pipeline is PAUSED, the
running-time stands still.
</para>
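<para>
Expressed in code, the relation looks like this (a sketch; the clock
and the base-time would normally be obtained from the pipeline):
</para>
<programlisting>
<![CDATA[
GstClockTime absolute_time, running_time;

/* absolute-time according to the clock */
absolute_time = gst_clock_get_time (clock);

/* running-time is the time elapsed since the base-time snapshot */
running_time = absolute_time - base_time;
]]>
</programlisting>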
<para>
Because all objects in the pipeline have the same clock and base-time,
they can thus all calculate the running-time according to the pipeline
clock.
</para>
</sect1>
<sect1 id="section-buffer-time-types" xreflabel="Buffer running-time">
<title> Buffer running-time </title>
<para>
To calculate a buffer running-time, we need a buffer timestamp and
the SEGMENT event that preceded the buffer. First we can convert
the SEGMENT event into a <classname>GstSegment</classname> object
and then we can use the
<function>gst_segment_to_running_time ()</function> function to
perform the calculation of the buffer running-time.
</para>
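<para>
In code, the calculation could look like this (a sketch, assuming
<varname>event</varname> is the SEGMENT event that preceded
<varname>buffer</varname>):
</para>
<programlisting>
<![CDATA[
GstSegment segment;
GstClockTime running_time;

/* convert the SEGMENT event into a GstSegment */
gst_event_copy_segment (event, &segment);

/* calculate the running-time of the buffer timestamp */
running_time = gst_segment_to_running_time (&segment,
    GST_FORMAT_TIME, GST_BUFFER_PTS (buffer));
]]>
</programlisting>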
<para>
Synchronization is now a matter of making sure that a buffer with a
certain running-time is played when the clock reaches the same
running-time. Usually this task is done by sink elements. Sinks also
have to take into account the latency configured in the pipeline and
add this to the buffer running-time before synchronizing to the
pipeline clock.
</para>
</sect1>
<sect1 id="section-clock-obligations-of-each-element" xreflabel="Obligations
of each element">
<title>
Obligations of each element.
</title>
<para>
Let us clarify the contract between GStreamer and each element in the
pipeline.
</para>
<sect2>
<title>Non-live source elements </title>
<para>
Non-live source elements must place a timestamp in each buffer that
they deliver when this is possible. They must choose the timestamps
and the values of the SEGMENT event in such a way that the
running-time of the buffer starts from 0.
</para>
<para>
Some sources, such as filesrc, are not able to generate timestamps
on all buffers. They can and must, however, create a timestamp on the
first buffer (with a running-time of 0).
</para>
<para>
The source then pushes out the SEGMENT event followed by the
timestamped buffers.
</para>
</sect2>
<sect2>
<title>Live source elements </title>
<para>
Live source elements must place a timestamp in each buffer that
they deliver. They must choose the timestamps and the values of the
SEGMENT event in such a way that the running-time of the buffer
matches exactly the running-time of the pipeline clock when the first
byte in the buffer was captured.
</para>
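<para>
A live source could, for example, timestamp a captured buffer like this
(a simplified sketch; <varname>src</varname> and the capture details
are illustrative):
</para>
<programlisting>
<![CDATA[
GstClock *clock;
GstClockTime base_time, now;

/* take the pipeline clock and base-time from our element */
clock = gst_element_get_clock (GST_ELEMENT (src));
base_time = gst_element_get_base_time (GST_ELEMENT (src));

/* timestamp with the running-time of the moment of capture */
now = gst_clock_get_time (clock);
GST_BUFFER_PTS (buffer) = now - base_time;

gst_object_unref (clock);
]]>
</programlisting>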
</sect2>
<sect2>
<title>Parser/Decoder/Encoder elements </title>
<para>
Parser/Decoder elements must use the incoming timestamps and transfer
those to the resulting output buffers. They are allowed to interpolate
or reconstruct timestamps on missing input buffers when they can.
</para>
</sect2>
<sect2>
<title>Demuxer elements </title>
<para>
Demuxer elements can usually set the timestamps stored inside the media
file onto the outgoing buffers. They need to make sure that outgoing
buffers that are to be played at the same time have the same
running-time. Demuxers also need to take into account the incoming
timestamps on buffers and use that to calculate an offset on the outgoing
buffer timestamps.
</para>
</sect2>
<sect2>
<title>Muxer elements</title>
<para>
Muxer elements should use the incoming buffer running-time to mux the
different streams together. They should copy the incoming running-time
to the outgoing buffers.
</para>
</sect2>
<sect2>
<title>Sink elements</title>
<para>
If the element is intended to emit samples at a specific time (real time
playing), the element should require a clock, and thus implement the
method <function>set_clock</function>.
</para>
<para>
The sink should then make sure that the sample with running-time is played
exactly when the pipeline clock reaches that running-time + latency.
Some elements might use the clock API such as
<function>gst_clock_id_wait()</function>
to perform this action. Other sinks might need to use other means of
scheduling timely playback of the data.
</para>
</sect2>
</sect1>
</chapter>

<!-- ############ chapter ############# -->
<chapter id="chapter-dparams">
<title>Supporting Dynamic Parameters</title>
<para>
Warning, this part describes 0.10 and is outdated.
</para>
<para>
Sometimes object properties are not powerful enough to control the
parameters that affect the behaviour of your element.
When this is the case you can mark these parameters as being Controllable.
Aware applications can use the controller subsystem to dynamically adjust
the property values over time.
</para>
<sect1 id="section-dparam-start">
<title>Getting Started</title>
<para>
The controller subsystem is contained within the
<filename>gstcontroller</filename> library. You need to include the header in
your element's source file:
</para>
<programlisting>
...
#include &lt;gst/gst.h&gt;
#include &lt;gst/controller/gstcontroller.h&gt;
...
</programlisting>
<para>
Even though the <filename>gstcontroller</filename> library may be linked into
the host application, you should make sure it is initialized in your
<filename>plugin_init</filename> function:
</para>
<programlisting>
static gboolean
plugin_init (GstPlugin *plugin)
{
...
/* initialize library */
gst_controller_init (NULL, NULL);
...
}
</programlisting>
<para>
It makes no sense for all GObject parameters to be real-time controlled.
Therefore the next step is to mark controllable parameters.
This is done by using the special flag <constant>GST_PARAM_CONTROLLABLE</constant>
when setting up GObject params in the <function>_class_init</function> method.
</para>
<programlisting>
g_object_class_install_property (gobject_class, PROP_FREQ,
g_param_spec_double ("freq", "Frequency", "Frequency of test signal",
0.0, 20000.0, 440.0,
G_PARAM_READWRITE | GST_PARAM_CONTROLLABLE | G_PARAM_STATIC_STRINGS));
</programlisting>
</sect1>
<sect1 id="chapter-dparam-loop">
<title>The Data Processing Loop</title>
<para>
In the last section we learned how to mark GObject params as controllable.
Application developers can then queue parameter changes for these parameters.
The approach the controller subsystem takes is to make plugins responsible
for pulling the changes in. This requires just one action:
</para>
<programlisting>
gst_object_sync_values(element,timestamp);
</programlisting>
<para>
This call makes all parameter-changes for the given timestamp active by
adjusting the GObject properties of the element. It's up to the element to
determine the synchronisation rate.
</para>
<sect2 id="chapter-dparam-loop-video">
<title>The Data Processing Loop for Video Elements</title>
<para>
For video processing elements it is best to synchronise on every frame.
That means one would add the <function>gst_object_sync_values()</function>
call described in the previous section to the data processing function of
the element.
</para>
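<para>
For example, a 0.10-era chain function of a video element could
synchronise like this (a sketch; element and pad names are
illustrative):
</para>
<programlisting>
<![CDATA[
static GstFlowReturn
gst_my_filter_chain (GstPad * pad, GstBuffer * buf)
{
  GstMyFilter *filter = GST_MY_FILTER (GST_PAD_PARENT (pad));

  /* activate all queued parameter changes for this frame */
  gst_object_sync_values (G_OBJECT (filter), GST_BUFFER_TIMESTAMP (buf));

  /* ... process the frame ... */

  return gst_pad_push (filter->srcpad, buf);
}
]]>
</programlisting>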
</sect2>
<sect2 id="chapter-dparam-loop-audio">
<title>The Data Processing Loop for Audio Elements</title>
<para>
For audio processing elements the case is not as easy as for video
processing elements. The problem here is that audio has a much higher rate.
For PAL video one will e.g. process 25 full frames per second, but for
standard audio it will be 44100 samples.
It is rarely useful to synchronise controllable parameters that often.
The easiest solution is also to have just one synchronisation call per
buffer processing. This makes the control-rate depend on the buffer
size.
</para>
<para>
Elements that need a specific control-rate need to break their data
processing loop to synchronise every n-samples.
</para>
</sect2>
</sect1>
</chapter>

<chapter id="chapter-advanced-events">
<title>Events: Seeking, Navigation and More</title>
<para>
There are many different event types but only two ways they can travel in
the pipeline: downstream or upstream. It is very important to understand
how both of these methods work because if one element in the pipeline is not
handling them correctly the whole event system of the pipeline is broken.
We will try to explain here how these methods work and how elements are
supposed to implement them.
</para>
<sect1 id="section-events-downstream" xreflabel="Downstream events">
<title>Downstream events</title>
<para>
Downstream events are received through the sink pad's event handler,
as set using <function>gst_pad_set_event_function ()</function> when
the pad was created.
</para>
<para>
Downstream events can travel in two ways: they can be in-band (serialised
with the buffer flow) or out-of-band (travelling through the pipeline
instantly, possibly not in the same thread as the streaming thread that
is processing the buffers, skipping ahead of buffers being processed
or queued in the pipeline). The most common downstream events
(SEGMENT, CAPS, TAG, EOS) are all serialised with the buffer flow.
</para>
<para>
Here is a typical event function:
</para>
<programlisting>
static gboolean
gst_my_filter_sink_event (GstPad *pad, GstObject * parent, GstEvent * event)
{
GstMyFilter *filter;
gboolean ret;
filter = GST_MY_FILTER (parent);
...
switch (GST_EVENT_TYPE (event)) {
case GST_EVENT_SEGMENT:
/* maybe save and/or update the current segment (e.g. for output
* clipping) or convert the event into one in a different format
* (e.g. BYTES to TIME) or drop it and set a flag to send a segment
* event in a different format later */
ret = gst_pad_push_event (filter-&gt;src_pad, event);
break;
case GST_EVENT_EOS:
/* end-of-stream, we should close down all stream leftovers here */
gst_my_filter_stop_processing (filter);
ret = gst_pad_push_event (filter-&gt;src_pad, event);
break;
case GST_EVENT_FLUSH_STOP:
gst_my_filter_clear_temporary_buffers (filter);
ret = gst_pad_push_event (filter-&gt;src_pad, event);
break;
default:
ret = gst_pad_event_default (pad, parent, event);
break;
}
...
return ret;
}
</programlisting>
<para>
If your element is chain-based, you will almost always have to implement
a sink event function, since that is how you are notified about
segments, caps and the end of the stream.
</para>
<para>
If your element is exclusively loop-based, you may or may not want a
sink event function (since the element is driving the pipeline it will
know the length of the stream in advance or be notified by the flow
return value of <function>gst_pad_pull_range()</function>). In some cases
even loop-based element may receive events from upstream though (for
example audio decoders with an id3demux or apedemux element in front of
them, or demuxers that are being fed input from sources that send
additional information about the stream in custom events, as DVD sources
do).
</para>
</sect1>
<sect1 id="section-events-upstream" xreflabel="Upstream events">
<title>Upstream events</title>
<para>
Upstream events are generated by an element somewhere downstream in
the pipeline (example: a video sink may generate navigation
events that inform upstream elements about the current position of
the mouse pointer). This may also happen indirectly on request of the
application, for example when the application executes a seek on a
pipeline this seek request will be passed on to a sink element which
will then in turn generate an upstream seek event.
</para>
<para>
The most common upstream events are seek events, Quality-of-Service
(QoS) and reconfigure events.
</para>
<para>
An upstream event can be sent using the
<function>gst_pad_send_event</function> function. This
function simply calls the default event handler of that pad. The default
event handler of pads is <function>gst_pad_event_default</function>, and
it basically sends the event to the peer of the internally linked pad.
So upstream events always arrive on the src pad of your element and are
handled by the default event handler except if you override that handler
to handle it yourself. There are some specific cases where you have to
do that:
</para>
<itemizedlist mark="opencircle">
<listitem>
<para>
If you have multiple sink pads in your element. In that case you will
have to decide which one of the sink pads you will send the event to
(if not all of them).
</para>
</listitem>
<listitem>
<para>
If you need to handle that event locally. For example a navigation
event that you will want to convert before sending it upstream, or
a QoS event that you want to handle.
</para>
</listitem>
</itemizedlist>
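<para>
A src pad event function handling one of these cases could look like
this sketch (<function>gst_my_filter_handle_qos ()</function> is a
hypothetical helper):
</para>
<programlisting>
<![CDATA[
static gboolean
gst_my_filter_src_event (GstPad * pad, GstObject * parent, GstEvent * event)
{
  GstMyFilter *filter = GST_MY_FILTER (parent);
  gboolean ret;

  switch (GST_EVENT_TYPE (event)) {
    case GST_EVENT_QOS:
      /* handle the QoS event locally and don't forward it */
      gst_my_filter_handle_qos (filter, event);
      gst_event_unref (event);
      ret = TRUE;
      break;
    default:
      /* let the default handler forward everything else upstream */
      ret = gst_pad_event_default (pad, parent, event);
      break;
  }
  return ret;
}
]]>
</programlisting>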
<para>
The processing you will do in that event handler does not really matter
but there are important rules you absolutely have to respect, because
a single broken event handler breaks event handling for the whole
pipeline. Here they are:
</para>
<itemizedlist mark="opencircle">
<listitem>
<para>
Always pass events you do not handle yourself to the default
<function>gst_pad_event_default</function> method. Depending on
the event, this method will forward the event or drop it.
</para>
</listitem>
<listitem>
<para>
If you are generating some new event based on the one you received
don't forget to gst_event_unref the event you received.
</para>
</listitem>
<listitem>
<para>
Event handler functions are supposed to return TRUE or FALSE,
indicating whether the event has been handled or not. Never
simply return TRUE in that handler unless you really know
that you have handled that event.
</para>
</listitem>
<listitem>
<para>
Remember that the event handler might be called from a different
thread than the streaming thread, so make sure you use
appropriate locking everywhere.
</para>
</listitem>
</itemizedlist>
</sect1>
<sect1 id="section-events-definitions" xreflabel="All Events Together">
<title>All Events Together</title>
<para>
In this chapter follows a list of all defined events that are currently
being used, plus how they should be used/interpreted. You can check
what type a certain event is using the GST_EVENT_TYPE macro (or if you
need a string for debugging purposes you can use GST_EVENT_TYPE_NAME).
</para>
<para>
In this chapter, we will discuss the following events:
</para>
<itemizedlist>
<listitem><para><xref linkend="section-events-stream-start"/></para></listitem>
<listitem><para><xref linkend="section-events-caps"/></para></listitem>
<listitem><para><xref linkend="section-events-segment"/></para></listitem>
<listitem><para><xref linkend="section-events-tag"/></para></listitem>
<listitem><para><xref linkend="section-events-eos"/></para></listitem>
<listitem><para><xref linkend="section-events-toc"/></para></listitem>
<listitem><para><xref linkend="section-events-gap"/></para></listitem>
<listitem><para><xref linkend="section-events-flush-start"/></para></listitem>
<listitem><para><xref linkend="section-events-flush-stop"/></para></listitem>
<listitem><para><xref linkend="section-events-qos"/></para></listitem>
<listitem><para><xref linkend="section-events-seek"/></para></listitem>
<listitem><para><xref linkend="section-events-nav"/></para></listitem>
</itemizedlist>
<para>
For more comprehensive information about events and how they should be
used correctly in various circumstances please consult the GStreamer
design documentation. This section only gives a general overview.
</para>
<sect2 id="section-events-stream-start" xreflabel="Stream Start">
<title>Stream Start</title>
<para>
WRITEME
</para>
</sect2>
<sect2 id="section-events-caps" xreflabel="Caps">
<title>Caps</title>
<para>
The CAPS event contains the format description of the following
buffers. See <xref linkend="chapter-negotiation"/> for more
information about negotiation.
</para>
</sect2>
<sect2 id="section-events-segment" xreflabel="Segment">
<title>Segment</title>
<para>
A segment event is sent downstream to announce the range of valid
timestamps in the stream and how they should be transformed into
running-time and stream-time. A segment event must always be sent
before the first buffer of data and after a flush (see above).
</para>
<para>
The first segment event is created by the element driving the
pipeline, like a source operating in push-mode or a demuxer/decoder
operating pull-based. This segment event then travels down the
pipeline and may be transformed on the way (a decoder, for example,
might receive a segment event in BYTES format and might transform
this into a segment event in TIMES format based on the average
bitrate).
</para>
<para>
Depending on the element type, the event can simply be forwarded using
<function>gst_pad_event_default ()</function>, or it should be parsed
and a modified event should be sent on. The latter is true for demuxers,
which generally have a byte-to-time conversion concept. Their input
is usually byte-based, so the incoming event will have an offset in
byte units (<symbol>GST_FORMAT_BYTES</symbol>), too. Elements
downstream, however, expect segment events in time units, so that
these can be used to synchronize against the pipeline clock. Therefore,
demuxers and similar elements should not forward the event, but parse
it, free it and send a segment event (in time units,
<symbol>GST_FORMAT_TIME</symbol>) further downstream.
</para>
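<para>
As a sketch, a demuxer could replace an incoming segment event in
<symbol>GST_FORMAT_BYTES</symbol> with one in
<symbol>GST_FORMAT_TIME</symbol> like this (<varname>duration</varname>
is a hypothetical total duration in nanoseconds):
</para>
<programlisting>
<![CDATA[
GstSegment segment;

/* drop the BYTES segment event we received from upstream */
gst_event_unref (event);

/* send a new segment in time format instead */
gst_segment_init (&segment, GST_FORMAT_TIME);
segment.start = 0;
segment.stop = duration;

gst_pad_push_event (demux->srcpad, gst_event_new_segment (&segment));
]]>
</programlisting>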
<para>
The segment event is created using the function
<function>gst_event_new_segment ()</function>. See the API
reference and design document for details about its parameters.
</para>
<para>
Elements parsing this event can use gst_event_parse_segment()
to extract the event details. Elements may find the GstSegment
API useful to keep track of the current segment (if they want to use
it for output clipping, for example).
</para>
</sect2>
<sect2 id="section-events-tag" xreflabel="Tag (metadata)">
<title>Tag (metadata)</title>
<para>
Tagging events are being sent downstream to indicate the tags as parsed
from the stream data. This is currently used to preserve tags during
stream transcoding from one format to the other. Tags are discussed
extensively in <xref linkend="chapter-advanced-tagging"/>. Most
elements will simply forward the event by calling
<function>gst_pad_event_default ()</function>.
</para>
<para>
The tag event is created using the function
<function>gst_event_new_tag ()</function>, but more often elements will
send a tag event downstream that will be converted into a message
on the bus by sink elements.
The function requires a filled-in taglist as
argument, which it will take ownership of.
</para>
<para>
Elements parsing this event can use the function
<function>gst_event_parse_tag ()</function> to acquire the
taglist that the event contains.
</para>
</sect2>
<sect2 id="section-events-eos" xreflabel="End of Stream (EOS)">
<title>End of Stream (EOS)</title>
<para>
End-of-stream events are sent if the stream that an element sends out
is finished. An element receiving this event (from upstream, so it
receives it on its sinkpad) will generally just process any buffered
data (if there is any) and then forward the event further downstream.
The <function>gst_pad_event_default ()</function> function takes care of
all this, so most elements do not need to support this event. Exceptions are
elements that explicitly need to close a resource down on EOS, and
N-to-1 elements. Note that the stream itself is <emphasis>not</emphasis>
a resource that should be closed down on EOS! Applications might seek
back to a point before EOS and continue playing again.
</para>
<para>
The EOS event has no properties, which makes it one of the simplest
events in &GStreamer;. It is created using the
<function>gst_event_new_eos()</function> function.
</para>
<para>
It is important to note that <emphasis>only elements driving the
pipeline should ever send an EOS event</emphasis>. If your element
is chain-based, it is not driving the pipeline. Chain-based elements
should just return GST_FLOW_EOS from their chain function at
the end of the stream (or the configured segment); the upstream
element that is driving the pipeline will then take care of
sending the EOS event (or alternatively post a SEGMENT_DONE message
on the bus, depending on the mode of operation). If you are implementing
your own source element, you never need to manually send
an EOS event either; you should just return GST_FLOW_EOS in
your create or fill function (assuming your element derives from
GstBaseSrc or GstPushSrc).
</para>
</sect2>
<sect2 id="section-events-toc" xreflabel="Table Of Contents">
<title>Table Of Contents</title>
<para>
WRITEME
</para>
</sect2>
<sect2 id="section-events-gap" xreflabel="Gap">
<title>Gap</title>
<para>
WRITEME
</para>
</sect2>
<sect2 id="section-events-flush-start" xreflabel="Flush Start">
<title>Flush Start</title>
<para>
The flush start event is sent downstream (in push mode) or upstream
(in pull mode) if all buffers and caches in the pipeline should be
emptied. <quote>Queue</quote> elements will
empty their internal list of buffers when they receive this event, for
example. File sink elements (e.g. <quote>filesink</quote>) will flush
the kernel-to-disk cache (<function>fdatasync ()</function> or
<function>fflush ()</function>) when they receive this event. Normally,
elements receiving this event will simply forward it, since most
filter or filter-like elements don't have an internal cache of data.
<function>gst_pad_event_default ()</function> does just that, so for
most elements, it is enough to forward the event using the default
event handler.
</para>
<para>
As a side-effect of flushing all data from the pipeline, this event
unblocks the streaming thread by making all pads reject data until
they receive a <xref linkend="section-events-flush-stop"/> event
(elements trying to push data will get a FLUSHING flow return
and stop processing data).
</para>
<para>
The flush-start event is created with the
<function>gst_event_new_flush_start ()</function>.
Like the EOS event, it has no properties. This event is usually
only created by elements driving the pipeline, like source elements
operating in push-mode or pull-range based demuxers/decoders.
</para>
</sect2>
<sect2 id="section-events-flush-stop" xreflabel="Flush Stop">
<title>Flush Stop</title>
<para>
The flush-stop event is sent by an element driving the pipeline
after a flush-start and tells pads and elements downstream that
they should accept events and buffers again (though there will be at
least a SEGMENT event before any buffers arrive).
</para>
<para>
If your element keeps temporary caches of stream data, it should
clear them when it receives a FLUSH-STOP event (and also whenever
its chain function receives a buffer with the DISCONT flag set).
</para>
<para>
The flush-stop event is created with
<function>gst_event_new_flush_stop ()</function>. It has one
parameter that controls whether the running-time of the pipeline should
be reset to 0 or not. Normally, after a flushing seek, the
running-time is set back to 0.
</para>
</sect2>
<sect2 id="section-events-qos" xreflabel="Quality Of Service (QOS)">
<title>Quality Of Service (QOS)</title>
<para>
The QOS event contains a report about the current real-time
performance of the stream. See more info in
<xref linkend="chapter-advanced-qos"/>.
</para>
</sect2>
<sect2 id="section-events-seek" xreflabel="Seek Request">
<title>Seek Request</title>
<para>
Seek events are meant to request a new stream position to elements.
This new position can be set in several formats (time, bytes or
<quote>default units</quote> [a term indicating frames for video,
channel-independent samples for audio, etc.]). Seeking can be done with
respect to the end-of-file or start-of-file, and
usually happens in upstream direction (downstream seeking is done by
sending a SEGMENT event with the appropriate offsets for elements
that support that, like filesink).
</para>
<para>
Elements receiving seek events should, depending on the element type,
either just forward it upstream (filters, decoders), change the
format in which the event is given and then forward it (demuxers),
or handle the event by changing the file pointer in their internal
stream resource (file sources, demuxers/decoders driving the pipeline
in pull-mode) or something else.
</para>
<para>
Seek events are built up using positions in specified formats (time,
bytes, units). They are created using the function
<function>gst_event_new_seek ()</function>. Note that many plugins do
not support seeking from the end of the stream.
An element not driving the pipeline and forwarding a seek
request should not assume that the seek succeeded or actually happened;
it should operate based on the SEGMENT events it receives.
</para>
<para>
Elements parsing this event can do this using
<function>gst_event_parse_seek()</function>.
</para>
</sect2>
<sect2 id="section-events-nav" xreflabel="Navigation">
<title>Navigation</title>
<para>
Navigation events are sent upstream by video sinks to inform upstream
elements of where the mouse pointer is, if and where mouse pointer
clicks have happened, or if keys have been pressed or released.
</para>
<para>
All this information is contained in the event structure which can
be obtained with <function>gst_event_get_structure ()</function>.
</para>
<para>
Check out the navigationtest element in gst-plugins-good for an idea
of how to extract navigation information from this event.
</para>
</sect2>
</sect1>
</chapter>
<chapter id="chapter-advanced-interfaces">
<title>Interfaces</title>
<para>
Previously, in the chapter <xref linkend="chapter-building-args"/>, we
introduced the concept of GObject properties for controlling an element's
behaviour. This is very powerful, but it has two big disadvantages:
first of all, it is too generic, and second, it isn't dynamic.
</para>
<para>
The first disadvantage is related to the customizability of the end-user
interface that will be built to control the element. Some properties are
more important than others. Some integer properties are better shown in a
spin-button widget, whereas others would be better represented by a slider
widget. Such distinctions are not possible because the properties carry
no UI meaning for the application: a UI widget that represents a bitrate
property is the same as a UI widget that represents the size of a video,
as long as both are of the same <classname>GParamSpec</classname> type.
Another problem is that things like parameter grouping, function
grouping, or parameter coupling are not really possible.
</para>
<para>
The second problem with parameters is that they are not dynamic. In
many cases, the allowed values for a property are not fixed, but depend
on things that can only be detected at runtime. The names of inputs for
a TV card in a video4linux source element, for example, can only be
retrieved from the kernel driver when we've opened the device; this only
happens when the element goes into the READY state. This means that we
cannot create an enum property type to show this to the user.
</para>
<para>
The solution to those problems is to create very specialized types of
controls for certain often-used controls. We use the concept of interfaces
to achieve this. The basis of all this is the GLib
<classname>GTypeInterface</classname> type. For each case where we think
it's useful, we've created interfaces which can be implemented by elements
at their own will.
</para>
<para>
One important note: interfaces do <emphasis>not</emphasis> replace
properties. Rather, interfaces should be built <emphasis>next to</emphasis>
properties. There are two important reasons for this. First of all,
properties can be more easily introspected. Second, properties can be
specified on the commandline (<filename>gst-launch</filename>).
</para>
<sect1 id="section-iface-general" xreflabel="How to Implement Interfaces">
<title>How to Implement Interfaces</title>
<para>
Implementing interfaces is initiated in the <function>_get_type ()</function>
of your element. You can register one or more interfaces after having
registered the type itself. Some interfaces have dependencies on other
interfaces or can only be registered by certain types of elements. If
you get this wrong, the element will quit with failed assertions when
it is used, and those assertions will explain what went wrong. In that
case, you need to register support for <emphasis>that</emphasis>
interface before registering support for the interface that you
want to support. The example below explains how to add support for a
simple interface with no further dependencies.
</para>
<programlisting>
static void gst_my_filter_some_interface_init (GstSomeInterface *iface);

GType
gst_my_filter_get_type (void)
{
  static GType my_filter_type = 0;

  if (!my_filter_type) {
    static const GTypeInfo my_filter_info = {
      sizeof (GstMyFilterClass),
      NULL,
      NULL,
      (GClassInitFunc) gst_my_filter_class_init,
      NULL,
      NULL,
      sizeof (GstMyFilter),
      0,
      (GInstanceInitFunc) gst_my_filter_init
    };
    static const GInterfaceInfo some_interface_info = {
      (GInterfaceInitFunc) gst_my_filter_some_interface_init,
      NULL,
      NULL
    };

    my_filter_type =
        g_type_register_static (GST_TYPE_ELEMENT,
            "GstMyFilter",
            &amp;my_filter_info, 0);
    g_type_add_interface_static (my_filter_type,
        GST_TYPE_SOME_INTERFACE,
        &amp;some_interface_info);
  }

  return my_filter_type;
}

static void
gst_my_filter_some_interface_init (GstSomeInterface *iface)
{
  /* here, you would set virtual function pointers in the interface */
}
</programlisting>
<para>
Or more conveniently:
</para>
<programlisting>
static void gst_my_filter_some_interface_init (GstSomeInterface *iface);

G_DEFINE_TYPE_WITH_CODE (GstMyFilter, gst_my_filter, GST_TYPE_ELEMENT,
    G_IMPLEMENT_INTERFACE (GST_TYPE_SOME_INTERFACE,
        gst_my_filter_some_interface_init));
</programlisting>
</sect1>
<sect1 id="section-iface-uri" xreflabel="URI interface">
<title>URI interface</title>
<para>
WRITEME
</para>
</sect1>
<sect1 id="section-iface-colorbalance" xreflabel="Color Balance Interface">
<title>Color Balance Interface</title>
<para>
WRITEME
</para>
</sect1>
<sect1 id="section-iface-xoverlay" xreflabel="Video Overlay Interface">
<title>Video Overlay Interface</title>
<para>
The #GstVideoOverlay interface is used for two main purposes:
<itemizedlist>
<listitem>
<para>
To get a grab on the Window where the video sink element is going to render.
This is achieved by either being informed about the Window identifier that
the video sink element generated, or by forcing the video sink element to use
a specific Window identifier for rendering.
</para>
</listitem>
<listitem>
<para>
To force a redrawing of the latest video frame the video sink element
displayed on the Window. Indeed if the #GstPipeline is in #GST_STATE_PAUSED
state, moving the Window around will damage its content. Application
developers will want to handle the Expose events themselves and force the
video sink element to refresh the Window's content.
</para>
</listitem>
</itemizedlist>
</para>
<para>
A plugin drawing video output in a video window will need to have that
window at one stage or another. Passive mode simply means that no window
has been given to the plugin before that stage, so the plugin created the
window by itself. In that case the plugin is responsible for destroying
that window when it's not needed any more, and it has to tell the
application that a window has been created so that the application can
use it. This is done using the <classname>have-window-handle</classname>
message that can be posted from the plugin with the
<function>gst_video_overlay_got_window_handle</function> method.
</para>
<para>
As you have probably guessed already, active mode just means sending a video
window to the plugin so that video output goes there. This is done using
the <function>gst_video_overlay_set_window_handle</function> method.
</para>
<para>
It is possible to switch from one mode to another at any moment, so the
plugin implementing this interface has to handle all cases. There are only
two methods that plugin writers have to implement, and they will most
probably look like this:
</para>
<programlisting><![CDATA[
static void
gst_my_filter_set_window_handle (GstVideoOverlay *overlay, guintptr handle)
{
  GstMyFilter *my_filter = GST_MY_FILTER (overlay);

  if (my_filter->window)
    gst_my_filter_destroy_window (my_filter->window);
  my_filter->window = handle;
}

static void
gst_my_filter_xoverlay_init (GstVideoOverlayClass *iface)
{
  iface->set_window_handle = gst_my_filter_set_window_handle;
}
]]></programlisting>
<para>
You will also need to use the interface methods to post messages when
needed such as when receiving a CAPS event where you will know the video
geometry and maybe create the window.
</para>
<programlisting><![CDATA[
static MyFilterWindow *
gst_my_filter_window_create (GstMyFilter *my_filter, gint width, gint height)
{
  MyFilterWindow *window = g_new (MyFilterWindow, 1);
  ...
  gst_video_overlay_got_window_handle (GST_VIDEO_OVERLAY (my_filter), window->win);
}

/* called from the event handler for CAPS events */
static gboolean
gst_my_filter_sink_set_caps (GstMyFilter *my_filter, GstCaps *caps)
{
  gint width, height;
  gboolean ret;
  ...
  ret = gst_structure_get_int (structure, "width", &width);
  ret &= gst_structure_get_int (structure, "height", &height);
  if (!ret)
    return FALSE;

  gst_video_overlay_prepare_window_handle (GST_VIDEO_OVERLAY (my_filter));

  if (!my_filter->window)
    my_filter->window = gst_my_filter_create_window (my_filter, width, height);
  ...
}
]]></programlisting>
</sect1>
<sect1 id="section-iface-navigation" xreflabel="Navigation Interface">
<title>Navigation Interface</title>
<para>
WRITEME
</para>
</sect1>
</chapter>
<chapter id="chapter-negotiation" xreflabel="Caps negotiation">
<title>Caps negotiation</title>
<para>
Caps negotiation is the act of finding a media format (GstCaps) between
elements that they can handle. This process in &GStreamer; can in most
cases find an optimal solution for the complete pipeline. In this section
we explain how this works.
</para>
<sect1 id="section-nego-basics">
<title>Caps negotiation basics</title>
<para>
In &GStreamer;, negotiation of the media format always follows the
following simple rules:
</para>
<itemizedlist>
<listitem>
<para>
A downstream element suggests a format on its sinkpad and places the
suggestion in the result of the CAPS query performed on the sinkpad.
See also <xref linkend="section-nego-getcaps"/>.
</para>
</listitem>
<listitem>
<para>
An upstream element decides on a format. It sends the selected media
format downstream on its source pad with a CAPS event. Downstream
elements reconfigure themselves to handle the media type in the CAPS
event on the sinkpad.
</para>
</listitem>
<listitem>
<para>
A downstream element can inform upstream that it would like to
suggest a new format by sending a RECONFIGURE event upstream. The
RECONFIGURE event simply instructs an upstream element to restart
the negotiation phase. Because the element that sent out the
RECONFIGURE event is now suggesting another format, the format
in the pipeline might change.
</para>
</listitem>
</itemizedlist>
<para>
In addition to the CAPS and RECONFIGURE events and the CAPS query, there
is an ACCEPT_CAPS query to quickly check whether certain caps can
be accepted by an element.
</para>
<para>
All negotiation follows these simple rules. Let's take a look at some
typical use cases and how negotiation happens.
</para>
</sect1>
<sect1 id="section-nego-usecases">
<title>Caps negotiation use cases</title>
<para>
In what follows we will look at some use cases for push-mode scheduling.
The pull-mode scheduling negotiation phase is discussed in
<xref linkend="section-nego-pullmode"/> and is actually similar, as we
will see.
</para>
<para>
Since the sink pads only suggest formats and the source pads need to
decide, the most complicated work is done in the source pads.
We can identify 3 caps negotiation use cases for the source pads:
</para>
<itemizedlist>
<listitem>
<para>
Fixed negotiation. An element can output one format only.
See <xref linkend="section-nego-fixed"/>.
</para>
</listitem>
<listitem>
<para>
Transform negotiation. There is a (fixed) transform between the
input and output format of the element, usually based on some
element property. The caps that the element will produce depend
on the upstream caps and the caps that the element can accept
depend on the downstream caps.
See <xref linkend="section-nego-transform"/>.
</para>
</listitem>
<listitem>
<para>
Dynamic negotiation. An element can output many formats.
See <xref linkend="section-nego-dynamic"/>.
</para>
</listitem>
</itemizedlist>
<sect2 id="section-nego-fixed">
<title>Fixed negotiation</title>
<para>
In this case, the source pad can only produce a fixed format. Usually
this format is encoded inside the media. No downstream element can
ask for a different format; the only way that the source pad will
renegotiate is when the element decides to change the caps itself.
</para>
<para>
Elements that could implement fixed caps (on their source pads) are,
in general, all elements that are not renegotiable. Examples include:
</para>
<itemizedlist>
<listitem>
<para>
A typefinder, since the type found is part of the actual data stream
and can thus not be re-negotiated. The typefinder will look at the
stream of bytes, figure out the type, send a CAPS event with the
caps and then push buffers of the type.
</para>
</listitem>
<listitem>
<para>
Pretty much all demuxers, since the contained elementary data
streams are defined in the file headers, and thus not
renegotiable.
</para>
</listitem>
<listitem>
<para>
Some decoders, where the format is embedded in the data stream
and not part of the peercaps <emphasis>and</emphasis> where the
decoder itself is not reconfigurable either.
</para>
</listitem>
<listitem>
<para>
Some sources that produce a fixed format.
</para>
</listitem>
</itemizedlist>
<para>
<function>gst_pad_use_fixed_caps()</function> is used on the source
pad with fixed caps. As long as the pad is not negotiated, the default
CAPS query will return the caps presented in the padtemplate. As soon
as the pad is negotiated, the CAPS query will return the negotiated
caps (and nothing else). These are the relevant code snippets for fixed
caps source pads.
</para>
<programlisting>
<![CDATA[
[..]
pad = gst_pad_new_from_static_template (..);
gst_pad_use_fixed_caps (pad);
[..]
]]>
</programlisting>
<para>
The fixed caps can then be set on the pad by calling
<function>gst_pad_set_caps ()</function>.
</para>
<programlisting>
<![CDATA[
[..]
  caps = gst_caps_new_simple ("audio/x-raw",
      "format", G_TYPE_STRING, GST_AUDIO_NE(F32),
      "rate", G_TYPE_INT, <samplerate>,
      "channels", G_TYPE_INT, <num-channels>, NULL);
  if (!gst_pad_set_caps (pad, caps)) {
    GST_ELEMENT_ERROR (element, CORE, NEGOTIATION, (NULL),
        ("Some debug information here"));
    return GST_FLOW_ERROR;
  }
[..]
]]>
</programlisting>
<para>
These types of elements also don't have a relation between the input
format and the output format; the input caps simply don't contain the
information needed to produce the output caps.
</para>
<para>
All other elements that need to be configured for the format should
implement full caps negotiation, which will be explained in the next
few sections.
</para>
</sect2>
<sect2 id="section-nego-transform">
<title>Transform negotiation</title>
<para>
In this negotiation technique, there is a fixed transform between
the element input caps and the output caps. This transformation
could be parameterized by element properties but not by the
content of the stream (see <xref linkend="section-nego-fixed"/>
for that use-case).
</para>
<para>
The caps that the element can accept depend on the (fixed
transformation) downstream caps. The caps that the element can
produce depend on the (fixed transformation of) the upstream
caps.
</para>
<para>
This type of element can usually set caps on its source pad from
the <function>_event()</function> function on the sink pad when
it received the CAPS event. This means that the caps transform
function transforms a fixed caps into another fixed caps.
Examples of elements include:
</para>
<itemizedlist>
<listitem>
<para>
Videobox. It adds a configurable border around a video frame
depending on object properties.
</para>
</listitem>
<listitem>
<para>
Identity elements. All elements that don't change the format
of the data, only the content. Video and audio effects are an
example. Other examples include elements that inspect the
stream.
</para>
</listitem>
<listitem>
<para>
Some decoders and encoders, where the output format is defined
by input format, like mulawdec and mulawenc. These decoders
usually have no headers that define the content of the stream.
They are usually more like conversion elements.
</para>
</listitem>
</itemizedlist>
<para>
Below is an example of the negotiation steps of a typical transform
element. In the sink pad CAPS event handler, we compute the caps
for the source pad and set those.
</para>
<programlisting>
<![CDATA[
[...]

static gboolean
gst_my_filter_setcaps (GstMyFilter *filter,
    GstCaps *caps)
{
  GstStructure *structure;
  int rate, channels;
  gboolean ret;
  GstCaps *outcaps;

  structure = gst_caps_get_structure (caps, 0);
  ret = gst_structure_get_int (structure, "rate", &rate);
  ret = ret && gst_structure_get_int (structure, "channels", &channels);
  if (!ret)
    return FALSE;

  outcaps = gst_caps_new_simple ("audio/x-raw",
      "format", G_TYPE_STRING, GST_AUDIO_NE(S16),
      "rate", G_TYPE_INT, rate,
      "channels", G_TYPE_INT, channels, NULL);
  ret = gst_pad_set_caps (filter->srcpad, outcaps);
  gst_caps_unref (outcaps);

  return ret;
}

static gboolean
gst_my_filter_sink_event (GstPad *pad,
    GstObject *parent,
    GstEvent *event)
{
  gboolean ret;
  GstMyFilter *filter = GST_MY_FILTER (parent);

  switch (GST_EVENT_TYPE (event)) {
    case GST_EVENT_CAPS:
    {
      GstCaps *caps;

      gst_event_parse_caps (event, &caps);
      ret = gst_my_filter_setcaps (filter, caps);
      break;
    }
    default:
      ret = gst_pad_event_default (pad, parent, event);
      break;
  }
  return ret;
}

[...]
]]>
</programlisting>
</sect2>
<sect2 id="section-nego-dynamic">
<title>Dynamic negotiation</title>
<para>
The last negotiation method, dynamic negotiation, is the most complex
and powerful.
</para>
<para>
Like with the transform negotiation in
<xref linkend="section-nego-transform"/>, dynamic negotiation will
perform a transformation on the downstream/upstream caps. Unlike the
transform negotiation, this transform will convert fixed caps to
unfixed caps. This means that the sink pad input caps can be converted
into unfixed (multiple) formats. The source pad will have to choose a
format from all the possibilities. It will usually choose a
format that requires the least amount of effort to produce, but it does
not have to. The selection of the format should also depend on the
caps that can be accepted downstream (see the QUERY_CAPS function in
<xref linkend="section-nego-getcaps"/>).
</para>
<para>
A typical flow goes like this:
</para>
<itemizedlist>
<listitem>
<para>
Caps are received on the sink pad of the element.
</para>
</listitem>
<listitem>
<para>
If the element prefers to operate in passthrough mode, check
if downstream accepts the caps with the ACCEPT_CAPS query. If it
does, we can complete negotiation and we can operate in
passthrough mode.
</para>
</listitem>
<listitem>
<para>
Calculate the possible caps for the source pad.
</para>
</listitem>
<listitem>
<para>
Query the downstream peer pad for the list of possible
caps.
</para>
</listitem>
<listitem>
<para>
Select from the downstream list the first caps that you can
transform to and set this as the output caps. You might have to
fixate the caps to some reasonable defaults to construct
fixed caps.
</para>
</listitem>
</itemizedlist>
<para>
Examples of this type of elements include:
</para>
<itemizedlist>
<listitem>
<para>
Converter elements such as videoconvert, audioconvert, audioresample,
videoscale, ...
</para>
</listitem>
<listitem>
<para>
Source elements such as audiotestsrc, videotestsrc, v4l2src,
pulsesrc, ...
</para>
</listitem>
</itemizedlist>
<para>
Let's look at the example of an element that can convert between
samplerates, so where input and output samplerate don't have to be
the same:
</para>
<programlisting>
<![CDATA[
static gboolean
gst_my_filter_setcaps (GstMyFilter *filter,
    GstCaps *caps)
{
  if (gst_pad_set_caps (filter->srcpad, caps)) {
    filter->passthrough = TRUE;
  } else {
    GstCaps *othercaps, *newcaps;
    GstStructure *s = gst_caps_get_structure (caps, 0), *others;

    /* no passthrough, setup internal conversion */
    gst_structure_get_int (s, "channels", &filter->channels);
    othercaps = gst_pad_get_allowed_caps (filter->srcpad);
    others = gst_caps_get_structure (othercaps, 0);
    gst_structure_set (others,
        "channels", G_TYPE_INT, filter->channels, NULL);

    /* now, the samplerate value can optionally have multiple values, so
     * we "fixate" it, which means that one fixed value is chosen */
    newcaps = gst_caps_copy_nth (othercaps, 0);
    gst_caps_unref (othercaps);
    newcaps = gst_caps_fixate (newcaps);
    if (!gst_pad_set_caps (filter->srcpad, newcaps))
      return FALSE;

    /* we are now set up, configure internally */
    filter->passthrough = FALSE;
    gst_structure_get_int (s, "rate", &filter->from_samplerate);
    others = gst_caps_get_structure (newcaps, 0);
    gst_structure_get_int (others, "rate", &filter->to_samplerate);
  }

  return TRUE;
}

static gboolean
gst_my_filter_sink_event (GstPad *pad,
    GstObject *parent,
    GstEvent *event)
{
  gboolean ret;
  GstMyFilter *filter = GST_MY_FILTER (parent);

  switch (GST_EVENT_TYPE (event)) {
    case GST_EVENT_CAPS:
    {
      GstCaps *caps;

      gst_event_parse_caps (event, &caps);
      ret = gst_my_filter_setcaps (filter, caps);
      break;
    }
    default:
      ret = gst_pad_event_default (pad, parent, event);
      break;
  }
  return ret;
}

static GstFlowReturn
gst_my_filter_chain (GstPad *pad,
    GstObject *parent,
    GstBuffer *buf)
{
  GstMyFilter *filter = GST_MY_FILTER (parent);
  GstBuffer *out;

  /* push on if in passthrough mode */
  if (filter->passthrough)
    return gst_pad_push (filter->srcpad, buf);

  /* convert, push */
  out = gst_my_filter_convert (filter, buf);
  gst_buffer_unref (buf);

  return gst_pad_push (filter->srcpad, out);
}
]]>
</programlisting>
</sect2>
</sect1>
<sect1 id="section-nego-upstream" xreflabel="Upstream caps (re)negotiation">
<title>Upstream caps (re)negotiation</title>
<para>
Upstream negotiation's primary use is to renegotiate (part of) an
already-negotiated pipeline to a new format. Some practical examples
include selecting a different video size because the size of the video
window changed and the video output itself is not capable of rescaling,
or because the audio channel configuration changed.
</para>
<para>
Upstream caps renegotiation is requested by sending a GST_EVENT_RECONFIGURE
event upstream. The idea is that it will instruct the upstream element
to reconfigure its caps by doing a new query for the allowed caps and then
choosing new caps. The element that sends out the RECONFIGURE event
would influence the selection of the new caps by returning the new
preferred caps from its GST_QUERY_CAPS query function. The RECONFIGURE
event will set the GST_PAD_FLAG_NEED_RECONFIGURE flag on all pads that it
travels over.
</para>
<para>
It is important to note that different elements actually have
different responsibilities here:
</para>
<itemizedlist>
<listitem>
<para>
Elements that want to propose a new format upstream need to first
check if the new caps are acceptable upstream with an ACCEPT_CAPS
query. Then they would send a RECONFIGURE event and be prepared to
answer the CAPS query with the new preferred format. It should be
noted that when there is no upstream element that can (or wants to)
renegotiate, the element needs to deal with the currently
configured format.
</para>
</listitem>
<listitem>
<para>
Elements that operate in transform negotiation according to
<xref linkend="section-nego-transform"/> pass the RECONFIGURE
event upstream. Because these elements simply do a fixed transform
based on the upstream caps, they need to send the event upstream
so that it can select a new format.
</para>
</listitem>
<listitem>
<para>
Elements that operate in fixed negotiation
(<xref linkend="section-nego-fixed"/>) drop the RECONFIGURE event.
These elements can't reconfigure and their output caps don't depend
on the upstream caps so the event can be dropped.
</para>
</listitem>
<listitem>
<para>
Elements that can be reconfigured on the source pad (source pads
implementing dynamic negotiation in
<xref linkend="section-nego-dynamic"/>) should check their
NEED_RECONFIGURE flag with
<function>gst_pad_check_reconfigure ()</function> and should
start renegotiation when the function returns TRUE.
</para>
</listitem>
</itemizedlist>
</sect1>
<sect1 id="section-nego-getcaps" xreflabel="Implementing a CAPS query function">
<title>Implementing a CAPS query function</title>
<para>
A <function>_query ()</function>-function with the GST_QUERY_CAPS query
type is called when a peer element would like to know which formats
this pad supports, and in what order of preference. The return value
should be all formats that this element supports, taking into account
limitations of peer elements further downstream or upstream, sorted by
order of preference, highest preference first.
</para>
<programlisting>
<![CDATA[
static gboolean
gst_my_filter_query (GstPad *pad, GstObject * parent, GstQuery * query)
{
  gboolean ret;
  GstMyFilter *filter = GST_MY_FILTER (parent);

  switch (GST_QUERY_TYPE (query)) {
    case GST_QUERY_CAPS:
    {
      GstPad *otherpad;
      GstCaps *temp, *caps, *filt, *tcaps;
      gint i;

      otherpad = (pad == filter->srcpad) ? filter->sinkpad :
                                           filter->srcpad;
      caps = gst_pad_get_allowed_caps (otherpad);

      gst_query_parse_caps (query, &filt);

      /* We support *any* samplerate, indifferent from the samplerate
       * supported by the linked elements on both sides. */
      for (i = 0; i < gst_caps_get_size (caps); i++) {
        GstStructure *structure = gst_caps_get_structure (caps, i);

        gst_structure_remove_field (structure, "rate");
      }

      /* make sure we only return results that intersect our
       * padtemplate */
      tcaps = gst_pad_get_pad_template_caps (pad);
      if (tcaps) {
        temp = gst_caps_intersect (caps, tcaps);
        gst_caps_unref (caps);
        gst_caps_unref (tcaps);
        caps = temp;
      }
      /* filter against the query filter when needed */
      if (filt) {
        temp = gst_caps_intersect (caps, filt);
        gst_caps_unref (caps);
        caps = temp;
      }
      gst_query_set_caps_result (query, caps);
      gst_caps_unref (caps);
      ret = TRUE;
      break;
    }
    default:
      ret = gst_pad_query_default (pad, parent, query);
      break;
  }
  return ret;
}
]]>
</programlisting>
</sect1>
<sect1 id="section-nego-pullmode">
<title>Pull-mode Caps negotiation</title>
<para>
WRITEME, the mechanism of pull-mode negotiation is not yet fully
understood.
</para>
<para>
Using all the knowledge you've acquired by reading this chapter, you
should be able to write an element that does correct caps negotiation.
If in doubt, look at other elements of the same type in our git
repository to get an idea of how they do what you want to do.
</para>
</sect1>
</chapter>
<chapter id="chapter-advanced-qos">
<title>Quality Of Service (QoS)</title>
<para>
Quality of Service in &GStreamer; is about measuring and adjusting
the real-time performance of a pipeline. The real-time performance is
always measured relative to the pipeline clock and typically happens in
the sinks when they synchronize buffers against the clock.
</para>
<para>
When buffers arrive late in the sink, i.e. when their running-time is
smaller than that of the clock, we say that the pipeline is having a
quality of service problem. There are a few possible reasons:
</para>
<itemizedlist>
<listitem>
<para>
High CPU load, there is not enough CPU power to handle the stream,
causing buffers to arrive late in the sink.
</para>
</listitem>
<listitem>
<para>
Network problems
</para>
</listitem>
<listitem>
<para>
Other resource problems such as disk load, memory bottlenecks etc
</para>
</listitem>
</itemizedlist>
<para>
The measurements result in QOS events that aim to adjust the datarate
in one or more upstream elements. Two types of adjustments can be
made:
</para>
<itemizedlist>
<listitem>
<para>
Short term "emergency" corrections based on the latest observation
in the sinks.
</para>
</listitem>
<listitem>
<para>
Long term rate corrections based on trends observed in the sinks.
</para>
</listitem>
</itemizedlist>
<para>
It is also possible for the application to artificially introduce delay
between synchronized buffers; this is called throttling. It can be used
to limit or reduce the framerate, for example.
</para>
<sect1 id="section-measuring">
<title>Measuring QoS</title>
<para>
Elements that synchronize buffers on the pipeline clock will usually
measure the current QoS. They will also need to keep some statistics
in order to generate the QOS event.
</para>
<para>
For each buffer that arrives in the sink, the element needs to calculate
how late or how early it was. This is called the jitter. Negative jitter
values mean that the buffer was early, positive values mean that the
buffer was late. The jitter value gives an indication of how early or late
a buffer was.
</para>
<para>
A synchronizing element will also need to calculate how much time
elapsed between receiving two consecutive buffers. We call this the
processing time because that is the amount of time it takes for the
upstream element to produce/process the buffer. We can compare this
processing time to the duration of the buffer to have a measurement
of how fast upstream can produce data, called the proportion.
If, for example, upstream can produce a buffer that is 1 second long in
0.5 seconds, it is operating at twice the required speed. If, on the
other hand, it takes 2 seconds to produce a buffer with 1 second's worth
of data, upstream is producing buffers too slowly and we won't be able to
keep synchronization. Usually, a running average is kept of the
proportion.
</para>
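<para>
To make the bookkeeping above concrete, here is a minimal, self-contained
C sketch of the jitter and proportion calculations. The type and helper
names (<function>calculate_jitter ()</function>,
<function>update_proportion ()</function>) are hypothetical and for
illustration only; they are not GStreamer API:
</para>

```c
#include <assert.h>
#include <stdint.h>

typedef int64_t ClockTime;      /* nanoseconds, like GstClockTime */

typedef struct {
  double avg_proportion;        /* running average of the proportion */
} QosStats;

/* Jitter: negative means the buffer was early, positive means late. */
static ClockTime
calculate_jitter (ClockTime buffer_running_time, ClockTime clock_time)
{
  return clock_time - buffer_running_time;
}

/* Proportion: upstream processing time divided by buffer duration.
 * A value < 1.0 means upstream produces faster than real-time;
 * > 1.0 means it is too slow to keep synchronization. */
static double
update_proportion (QosStats * stats, ClockTime processing_time,
    ClockTime duration)
{
  double proportion = (double) processing_time / (double) duration;

  /* keep a simple exponential running average of the proportion */
  stats->avg_proportion = 0.9 * stats->avg_proportion + 0.1 * proportion;
  return proportion;
}
```

<para>
For example, a buffer scheduled at 1 second that renders when the clock
reads 1.02 seconds has a jitter of 20 milliseconds, and producing 1 second
of data in 0.5 seconds yields a proportion of 0.5.
</para>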
<para>
A synchronizing element also needs to measure its own performance in
order to figure out if the performance problem is upstream of itself.
</para>
<para>
These measurements are used to construct a QOS event that is sent
upstream. Note that a QoS event is sent for each buffer that arrives
in the sink.
</para>
</sect1>
<sect1 id="section-handling">
<title>Handling QoS</title>
<para>
An element will have to install an event function on its source pads
in order to receive QOS events. Usually, the element will need to
store the values of the QoS event and use them in the data processing
function. The element will need to use a lock to protect these QoS
values as shown in the example below. Also make sure to pass the
QoS event upstream.
</para>
<programlisting>
<![CDATA[
[...]
case GST_EVENT_QOS:
{
GstQOSType type;
gdouble proportion;
GstClockTimeDiff diff;
GstClockTime timestamp;
gst_event_parse_qos (event, &type, &proportion, &diff, &timestamp);
GST_OBJECT_LOCK (decoder);
priv->qos_proportion = proportion;
priv->qos_timestamp = timestamp;
priv->qos_diff = diff;
GST_OBJECT_UNLOCK (decoder);
res = gst_pad_push_event (decoder->sinkpad, event);
break;
}
[...]
]]>
</programlisting>
<para>
With the QoS values, there are two types of corrections that an element
can do:
</para>
<sect2 id="section-handling-short">
<title>Short term correction</title>
<para>
The timestamp and the jitter value in the QOS event can be used to
perform a short term correction. If the jitter is positive, the
previous buffer arrived late and we can be sure that a buffer with
a timestamp &lt; timestamp + jitter is also going to be late. We
can thus drop all buffers with a timestamp less than timestamp +
jitter.
</para>
<para>
If the buffer duration is known, a better estimate for the next
likely timestamp is: timestamp + 2 * jitter + duration.
</para>
<para>
A possible algorithm typically looks like this:
</para>
<programlisting>
<![CDATA[
[...]
GST_OBJECT_LOCK (dec);
qos_proportion = priv->qos_proportion;
qos_timestamp = priv->qos_timestamp;
qos_diff = priv->qos_diff;
GST_OBJECT_UNLOCK (dec);
/* calculate the earliest valid timestamp */
if (G_LIKELY (GST_CLOCK_TIME_IS_VALID (qos_timestamp))) {
if (G_UNLIKELY (qos_diff > 0)) {
earliest_time = qos_timestamp + 2 * qos_diff + frame_duration;
} else {
earliest_time = qos_timestamp + qos_diff;
}
} else {
earliest_time = GST_CLOCK_TIME_NONE;
}
/* compare earliest_time to running-time of next buffer */
if (earliest_time > timestamp)
goto drop_buffer;
[...]
]]>
</programlisting>
</sect2>
<sect2 id="section-handling-long">
<title>Long term correction</title>
<para>
Long term corrections are a bit more difficult to perform. They
rely on the value of the proportion in the QOS event. Elements should
reduce the amount of resources they consume by the proportion
field in the QoS message.
</para>
<para>
Here are some possible strategies to achieve this:
</para>
<itemizedlist>
<listitem>
<para>
Permanently dropping frames or reducing the CPU or bandwidth
requirements of the element. Some decoders might be able to
skip decoding of B frames.
</para>
</listitem>
<listitem>
<para>
Switch to lower quality processing or reduce the algorithmic
complexity. Care should be taken that this doesn't introduce
disturbing visual or audible glitches.
</para>
</listitem>
<listitem>
<para>
Switch to a lower quality source to reduce network bandwidth.
</para>
</listitem>
<listitem>
<para>
Assign more CPU cycles to critical parts of the pipeline. This
could, for example, be done by increasing the thread priority.
</para>
</listitem>
</itemizedlist>
<para>
In all cases, elements should be prepared to go back to their normal
processing rate when the proportion member in the QOS event approaches
the ideal proportion of 1.0 again.
</para>
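<para>
As an illustration of such a policy, the sketch below (plain C, a
hypothetical helper, not GStreamer API) maps the observed proportion to a
processing quality level; level 0 restores normal processing once the
proportion is at or below the ideal 1.0 again. The thresholds are
arbitrary example values:
</para>

```c
#include <assert.h>

/* Map the proportion from the QOS event to a processing quality
 * level: 0 = full quality, higher = cheaper processing (e.g. a
 * decoder skipping B frames at level 1). */
static int
choose_quality_level (double proportion)
{
  if (proportion <= 1.0)
    return 0;                   /* keeping up: normal processing */
  else if (proportion <= 1.5)
    return 1;                   /* slightly too slow: degrade a bit */
  else
    return 2;                   /* far too slow: cheapest processing */
}
```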
</sect2>
</sect1>
<sect1 id="section-throttle">
<title>Throttling</title>
<para>
Elements synchronizing to the clock should expose a property to configure
them in throttle mode. In throttle mode, the time distance between buffers
is kept to a configurable throttle interval. This means that effectively
the buffer rate is limited to 1 buffer per throttle interval. This can be
used to limit the framerate, for example.
</para>
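<para>
Since throttling limits rendering to one buffer per throttle interval, the
effective maximum buffer rate follows directly from the interval. A small
sketch (a hypothetical helper, not GStreamer API):
</para>

```c
#include <assert.h>
#include <stdint.h>

/* With a throttle interval of t nanoseconds, at most one buffer is
 * rendered per interval, so the maximum rate is 1e9 / t buffers
 * per second. */
static double
max_buffer_rate (int64_t throttle_interval_ns)
{
  return 1000000000.0 / (double) throttle_interval_ns;
}
```

<para>
A throttle interval of 500 milliseconds, for example, limits rendering to
2 buffers per second.
</para>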
<para>
When an element is configured in throttling mode (this is usually only
implemented on sinks) it should produce QoS events upstream with the jitter
field set to the throttle interval. This should instruct upstream elements to
skip or drop the remaining buffers in the configured throttle interval.
</para>
<para>
The proportion field is set to the desired slowdown needed to get the
desired throttle interval. Implementations can use the QoS Throttle type,
the proportion and the jitter member to tune their implementations.
</para>
<para>
The default sink base class has a <quote>throttle-time</quote>
property for this feature. You can test this with:
<command>gst-launch-1.0 videotestsrc !
xvimagesink throttle-time=500000000</command>
</para>
</sect1>
<sect1 id="section-messages">
<title>QoS Messages</title>
<para>
In addition to the QOS events that are sent between elements in the
pipeline, there are also QOS messages posted on the pipeline bus to
inform the application of QoS decisions. The QOS message contains
the timestamps of when something was dropped along with the amount
of dropped vs processed items. Elements must post a QOS
message under these conditions:
</para>
<itemizedlist>
<listitem>
<para>
The element dropped a buffer because of QoS reasons.
</para>
</listitem>
<listitem>
<para>
An element changes its processing strategy because of QoS reasons
(quality). This could include a decoder that decides to drop every
B frame to increase its processing speed or an effect element
switching to a lower quality algorithm.
</para>
</listitem>
</itemizedlist>
</sect1>
</chapter>
<chapter id="chapter-advanced-request">
<title>Request and Sometimes pads</title>
<para>
Until now, we've only dealt with pads that are always available. However,
there are also pads that are only created in some cases, or only if
the application requests the pad. The first kind is called a
<emphasis>sometimes</emphasis> pad; the second is called a
<emphasis>request</emphasis> pad. The availability of a pad (always,
sometimes or request) can be seen in a pad's template. This chapter will
discuss when each of the two is useful, how they are created and when
they should be disposed.
</para>
<sect1 id="section-reqpad-sometimes" xreflabel="Sometimes pads">
<title>Sometimes pads</title>
<para>
A <quote>sometimes</quote> pad is a pad that is created under certain
conditions, but not in all cases. This mostly depends on stream content:
demuxers will generally parse the stream header, decide what elementary
(video, audio, subtitle, etc.) streams are embedded inside the system
stream, and will then create a sometimes pad for each of those elementary
streams. At its own choice, it can also create more than one instance of
each of those per element instance. The only limitation is that each
newly created pad should have a unique name. Sometimes pads are disposed
when the stream data is disposed, too (i.e. when going from PAUSED to the
READY state). You should <emphasis>not</emphasis> dispose the pad on EOS,
because someone might re-activate the pipeline and seek back to before
the end-of-stream point. The stream should still stay valid after EOS, at
least until the stream data is disposed. In any case, the element is
always the owner of such a pad.
</para>
<para>
The example code below will parse a text file, where the first line is
a number (n). The next lines all start with a number (0 to n-1), which
is the number of the source pad over which the data should be sent.
</para>
<programlisting>
3
0: foo
1: bar
0: boo
2: bye
</programlisting>
<para>
The code to parse this file and create the dynamic <quote>sometimes</quote>
pads, looks like this:
</para>
<programlisting>
<![CDATA[
typedef struct _GstMyFilter {
[..]
gboolean firstrun;
GList *srcpadlist;
} GstMyFilter;
static GstStaticPadTemplate src_factory =
GST_STATIC_PAD_TEMPLATE (
"src_%u",
GST_PAD_SRC,
GST_PAD_SOMETIMES,
GST_STATIC_CAPS ("ANY")
);
static void
gst_my_filter_class_init (GstMyFilterClass *klass)
{
GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
[..]
gst_element_class_add_pad_template (element_class,
gst_static_pad_template_get (&src_factory));
[..]
}
static void
gst_my_filter_init (GstMyFilter *filter)
{
[..]
filter->firstrun = TRUE;
filter->srcpadlist = NULL;
}
/*
* Get one line of data - without newline.
*/
static GstBuffer *
gst_my_filter_getline (GstMyFilter *filter)
{
guint8 *data;
gint n, num;
/* max. line length is 512 characters - for safety */
for (n = 0; n < 512; n++) {
num = gst_bytestream_peek_bytes (filter->bs, &data, n + 1);
if (num != n + 1)
return NULL;
/* newline? */
if (data[n] == '\n') {
GstBuffer *buf = gst_buffer_new_allocate (NULL, n + 1, NULL);
gst_bytestream_peek_bytes (filter->bs, &data, n);
gst_buffer_fill (buf, 0, data, n);
gst_buffer_memset (buf, n, '\0', 1);
gst_bytestream_flush_fast (filter->bs, n + 1);
return buf;
}
}
/* no newline found within the first 512 bytes */
return NULL;
}
static void
gst_my_filter_loopfunc (GstElement *element)
{
GstMyFilter *filter = GST_MY_FILTER (element);
GstBuffer *buf;
GstPad *pad;
GstMapInfo map;
gint num, n;
/* parse header */
if (filter->firstrun) {
gchar *padname;
guint8 id;
if (!(buf = gst_my_filter_getline (filter))) {
GST_ELEMENT_ERROR (element, STREAM, READ, (NULL),
("Stream contains no header"));
return;
}
gst_buffer_extract (buf, 0, &id, 1);
num = id - '0';   /* id holds a single ASCII digit */
gst_buffer_unref (buf);
/* for each of the streams, create a pad */
for (n = 0; n < num; n++) {
padname = g_strdup_printf ("src_%u", n);
pad = gst_pad_new_from_static_template (src_factory, padname);
g_free (padname);
/* here, you would set _event () and _query () functions */
/* need to activate the pad before adding */
gst_pad_set_active (pad, TRUE);
gst_element_add_pad (element, pad);
filter->srcpadlist = g_list_append (filter->srcpadlist, pad);
}
}
/* and now, simply parse each line and push over */
if (!(buf = gst_my_filter_getline (filter))) {
GstEvent *event = gst_event_new_eos ();
GList *padlist;
for (padlist = filter->srcpadlist;
padlist != NULL; padlist = g_list_next (padlist)) {
pad = GST_PAD (padlist->data);
gst_pad_push_event (pad, gst_event_ref (event));
}
gst_event_unref (event);
/* pause the task here */
return;
}
/* parse stream number and go beyond the ':' in the data */
gst_buffer_map (buf, &map, GST_MAP_READ);
num = map.data[0] - '0';   /* first byte is an ASCII digit */
if (num >= 0 && num < g_list_length (filter->srcpadlist)) {
pad = GST_PAD (g_list_nth_data (filter->srcpadlist, num));
/* magic buffer parsing foo */
for (n = 0; map.data[n] != ':' &&
map.data[n] != '\0'; n++) ;
if (map.data[n] != '\0') {
GstBuffer *sub;
/* create region copy that starts right past the space. The reason
* that we don't just forward the data pointer is because the
* pointer is no longer the start of an allocated block of memory,
* but just a pointer to a position somewhere in the middle of it.
* That cannot be freed upon disposal, so we'd either crash or have
* a memleak. Creating a region copy is a simple way to solve that. */
sub = gst_buffer_copy_region (buf, GST_BUFFER_COPY_ALL,
n + 1, map.size - n - 1);
gst_pad_push (pad, sub);
}
}
gst_buffer_unmap (buf, &map);
gst_buffer_unref (buf);
}
]]>
</programlisting>
<para>
Note that we use a lot of checks everywhere to make sure that the content
in the file is valid. This has two purposes: first, the file could be
erroneous, in which case we prevent a crash. The second and most important
reason is that - in extreme cases - the file could be used maliciously to
cause undefined behaviour in the plugin, which might lead to security
issues. <emphasis>Always</emphasis> assume that the file could be used to
do bad things.
</para>
</sect1>
<sect1 id="section-reqpad-request" xreflabel="Request pads">
<title>Request pads</title>
<para>
<quote>Request</quote> pads are similar to sometimes pads, except that
request pads are created on demand by something outside of the element
rather than by something inside the element. This concept is often used
in muxers,
where - for each elementary stream that is to be placed in the output
system stream - one sink pad will be requested. It can also be used in
elements with a variable number of input or output pads, such as the
<classname>tee</classname> (multi-output) or
<classname>input-selector</classname> (multi-input) elements.
</para>
<para>
To implement request pads, you need to provide a padtemplate with a
GST_PAD_REQUEST presence and implement the
<function>request_new_pad</function> virtual method in
<classname>GstElement</classname>.
To clean up, you will need to implement the
<function>release_pad</function> virtual method.
</para>
<programlisting>
<![CDATA[
static GstPad * gst_my_filter_request_new_pad (GstElement *element,
GstPadTemplate *templ,
const gchar *name,
const GstCaps *caps);
static void gst_my_filter_release_pad (GstElement *element,
GstPad *pad);
static GstStaticPadTemplate sink_factory =
GST_STATIC_PAD_TEMPLATE (
"sink_%u",
GST_PAD_SINK,
GST_PAD_REQUEST,
GST_STATIC_CAPS ("ANY")
);
static void
gst_my_filter_class_init (GstMyFilterClass *klass)
{
GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
[..]
gst_element_class_add_pad_template (element_class,
gst_static_pad_template_get (&sink_factory));
[..]
element_class->request_new_pad = gst_my_filter_request_new_pad;
element_class->release_pad = gst_my_filter_release_pad;
}
static GstPad *
gst_my_filter_request_new_pad (GstElement *element,
GstPadTemplate *templ,
const gchar *name,
const GstCaps *caps)
{
GstPad *pad;
GstMyFilterInputContext *context;
context = g_new0 (GstMyFilterInputContext, 1);
pad = gst_pad_new_from_template (templ, name);
gst_pad_set_element_private (pad, context);
/* normally, you would set _chain () and _event () functions here */
gst_element_add_pad (element, pad);
return pad;
}
static void
gst_my_filter_release_pad (GstElement *element,
GstPad *pad)
{
GstMyFilterInputContext *context;
context = gst_pad_get_element_private (pad);
g_free (context);
gst_element_remove_pad (element, pad);
}
]]>
</programlisting>
</sect1>
</chapter>
<chapter id="chapter-scheduling" xreflabel="Different scheduling modes">
<title>Different scheduling modes</title>
<para>
The scheduling mode of a pad defines how data is retrieved from (source)
or given to (sink) pads. &GStreamer; can operate in two scheduling
modes, called push-mode and pull-mode. &GStreamer; supports elements
with pads in either scheduling mode; not all pads of an element need
to operate in the same mode.
</para>
<para>
So far, we have only discussed <function>_chain ()</function>-operating
elements, i.e. elements that have a chain-function set on their sink pad
and push buffers on their source pad(s). We call this the push-mode
because a peer element will use <function>gst_pad_push ()</function> on
a srcpad, which will cause our <function>_chain ()</function>-function
to be called, which in turn causes our element to push out a buffer on
the source pad. The initiative to start the dataflow happens somewhere
upstream when it pushes out a buffer and all downstream elements get
scheduled when their <function>_chain ()</function>-functions are
called in turn.
</para>
<para>
Before we explain pull-mode scheduling, let's first understand how the
different scheduling modes are selected and activated on a pad.
</para>
<sect1 id="section-scheduling-activation"
xreflabel="The pad activation stage">
<title>The pad activation stage</title>
<para>
During the element state change of READY->PAUSED, the pads of an
element will be activated. This happens first on the source pads and
then on the sink pads of the element. &GStreamer; calls the
<function>_activate ()</function> of a pad. By default this function
will activate the pad in push-mode by calling
<function>gst_pad_activate_mode ()</function> with the GST_PAD_MODE_PUSH
scheduling mode.
It is possible to override the <function>_activate ()</function> of a pad
and decide on a different scheduling mode. You can find out in what
scheduling mode a pad was activated by overriding the
<function>_activate_mode ()</function>-function.
</para>
<para>
&GStreamer; allows the different pads of an element to operate in
different scheduling modes. This allows for many different possible
use-cases. What follows is an overview of some typical use-cases.
</para>
<itemizedlist>
<listitem>
<para>
If all pads of an element are activated in push-mode scheduling,
the element as a whole is operating in push-mode.
For source elements this means that they will have to start a
task that pushes out buffers on the source pad to the downstream
elements.
Downstream elements will have data pushed to them by upstream elements
using the sinkpad's <function>_chain ()</function>-function, which will
push out buffers on the source pads.
Prerequisites for this scheduling mode are that a chain-function was
set for each sinkpad using <function>gst_pad_set_chain_function ()</function>
and that all downstream elements operate in the same mode.
</para>
</listitem>
<listitem>
<para>
Alternatively, sinkpads can be the driving force behind a pipeline
by operating in pull-mode, while the sourcepads
of the element still operate in push-mode. In order to be the
driving force, those pads start a <classname>GstTask</classname>
when they are activated. This task is a thread, which
will call a function specified by the element. When called, this
function will have random data access (through
<function>gst_pad_pull_range ()</function>) over all sinkpads, and
can push data over the sourcepads, which effectively means that
this element controls data flow in the pipeline. Prerequisites for
this mode are that all downstream elements can act in push
mode, and that all upstream elements operate in pull-mode (see below).
</para>
<para>
Source pads can be activated in PULL mode by a downstream element
when they return GST_PAD_MODE_PULL from the GST_QUERY_SCHEDULING
query. Prerequisites for this scheduling mode are that a
getrange-function was set for the source pad using
<function>gst_pad_set_getrange_function ()</function>.
</para>
</listitem>
<listitem>
<para>
Lastly, all pads in an element can be activated in PULL-mode.
However, contrary to the above, this does not mean that they
start a task of their own. Rather, it means that they act as a pull
slave for the downstream element, and have to provide random data
access to it from their <function>_get_range ()</function>-function.
Requirements are that a <function>_get_range
()</function>-function was set on this pad using the function
<function>gst_pad_set_getrange_function ()</function>. Also, if
the element has any sinkpads, all those pads (and thereby their
peers) need to operate in PULL access mode, too.
</para>
<para>
When a sink element is activated in PULL mode, it should start a
task that calls <function>gst_pad_pull_range ()</function> on its
sinkpad. It can only do this when the upstream SCHEDULING query
returns support for the GST_PAD_MODE_PULL scheduling mode.
</para>
</listitem>
</itemizedlist>
<para>
In the next two sections, we will go closer into pull-mode scheduling
(elements/pads driving the pipeline, and elements/pads providing random
access), and some specific use cases will be given.
</para>
</sect1>
<sect1 id="section-scheduling-loop" xreflabel="Pads driving the pipeline">
<title>Pads driving the pipeline</title>
<para>
Sinkpads operating in pull-mode, with the sourcepads operating in
push-mode (or with no sourcepads at all in the case of a sink element), can start a task
that will drive the pipeline data flow.
Within this task function, you have random access over all of the sinkpads,
and push data over the sourcepads.
This can come in useful for several different kinds of elements:
</para>
<itemizedlist>
<listitem>
<para>
Demuxers, parsers and certain kinds of decoders where data comes
in unparsed (such as MPEG-audio or video streams), since those will
prefer byte-exact (random) access from their input. If possible,
however, such elements should be prepared to operate in push-mode,
too.
</para>
</listitem>
<listitem>
<para>
Certain kind of audio outputs, which require control over their
input data flow, such as the Jack sound server.
</para>
</listitem>
</itemizedlist>
<para>
First you need to perform a SCHEDULING query to check if the upstream
element(s) support pull-mode scheduling. If that is possible, you
can activate the sinkpad in pull-mode. Inside the activate_mode
function you can then start the task.
</para>
<programlisting><!-- example-begin task.c a -->
#include "filter.h"
#include &lt;string.h&gt;
static gboolean gst_my_filter_activate (GstPad * pad,
GstObject * parent);
static gboolean gst_my_filter_activate_mode (GstPad * pad,
GstObject * parent,
GstPadMode mode,
gboolean active);
static void gst_my_filter_loop (GstMyFilter * filter);
G_DEFINE_TYPE (GstMyFilter, gst_my_filter, GST_TYPE_ELEMENT);
<!-- example-end task.c a -->
<!-- example-begin task.c c -->
static void
gst_my_filter_init (GstMyFilter * filter)
{
<!-- example-end task.c c -->
[..]<!-- example-begin task.c d --><!--
--><!-- example-end task.c d -->
<!-- example-begin task.c e -->
gst_pad_set_activate_function (filter-&gt;sinkpad, gst_my_filter_activate);
gst_pad_set_activatemode_function (filter-&gt;sinkpad,
gst_my_filter_activate_mode);
<!-- example-end task.c e -->
<!-- example-begin task.c f --><!--
gst_element_add_pad (GST_ELEMENT (filter), filter-&gt;sinkpad);
gst_element_add_pad (GST_ELEMENT (filter), filter-&gt;srcpad);
--><!-- example-end task.c f -->
[..]<!-- example-begin task.c g -->
}
<!-- example-end task.c g -->
[..]<!-- example-begin task.c h --><!--
#include "caps.func"
--><!-- example-end task.c h -->
<!-- example-begin task.c i -->
static gboolean
gst_my_filter_activate (GstPad * pad, GstObject * parent)
{
GstQuery *query;
gboolean pull_mode;
/* first check what upstream scheduling is supported */
query = gst_query_new_scheduling ();
if (!gst_pad_peer_query (pad, query)) {
gst_query_unref (query);
goto activate_push;
}
/* see if pull-mode is supported */
pull_mode = gst_query_has_scheduling_mode_with_flags (query,
GST_PAD_MODE_PULL, GST_SCHEDULING_FLAG_SEEKABLE);
gst_query_unref (query);
if (!pull_mode)
goto activate_push;
/* now we can activate in pull-mode. GStreamer will also
* activate the upstream peer in pull-mode */
return gst_pad_activate_mode (pad, GST_PAD_MODE_PULL, TRUE);
activate_push:
{
/* something not right, we fallback to push-mode */
return gst_pad_activate_mode (pad, GST_PAD_MODE_PUSH, TRUE);
}
}
static gboolean
gst_my_filter_activate_mode (GstPad * pad,
GstObject * parent,
GstPadMode mode,
gboolean active)
{
gboolean res;
GstMyFilter *filter = GST_MY_FILTER (parent);
switch (mode) {
case GST_PAD_MODE_PUSH:
res = TRUE;
break;
case GST_PAD_MODE_PULL:
if (active) {
filter->offset = 0;
res = gst_pad_start_task (pad,
(GstTaskFunction) gst_my_filter_loop, filter, NULL);
} else {
res = gst_pad_stop_task (pad);
}
break;
default:
/* unknown scheduling mode */
res = FALSE;
break;
}
return res;
}
<!-- example-end task.c i --></programlisting>
<para>
Once started, your task has full control over input and output. The
most simple case of a task function is one that reads input and pushes
that over its source pad. It's not all that useful, but provides some
more flexibility than the old push-mode case that we've been looking
at so far.
</para>
<programlisting><!-- example-begin task.c j -->
#define BLOCKSIZE 2048
static void
gst_my_filter_loop (GstMyFilter * filter)
{
GstFlowReturn ret;
gint64 len;
GstFormat fmt = GST_FORMAT_BYTES;
GstBuffer *buf = NULL;
if (!gst_pad_query_duration (filter-&gt;sinkpad, fmt, &amp;len)) {
GST_DEBUG_OBJECT (filter, "failed to query duration, pausing");
goto stop;
}
if (filter-&gt;offset >= len) {
GST_DEBUG_OBJECT (filter, "at end of input, sending EOS, pausing");
gst_pad_push_event (filter-&gt;srcpad, gst_event_new_eos ());
goto stop;
}
/* now, read BLOCKSIZE bytes from byte offset filter-&gt;offset */
ret = gst_pad_pull_range (filter-&gt;sinkpad, filter-&gt;offset,
BLOCKSIZE, &amp;buf);
if (ret != GST_FLOW_OK) {
GST_DEBUG_OBJECT (filter, "pull_range failed: %s", gst_flow_get_name (ret));
goto stop;
}
/* now push buffer downstream */
ret = gst_pad_push (filter-&gt;srcpad, buf);
buf = NULL; /* gst_pad_push() took ownership of buffer */
if (ret != GST_FLOW_OK) {
GST_DEBUG_OBJECT (filter, "pad_push failed: %s", gst_flow_get_name (ret));
goto stop;
}
/* everything is fine, increase offset and wait for us to be called again */
filter-&gt;offset += BLOCKSIZE;
return;
stop:
GST_DEBUG_OBJECT (filter, "pausing task");
gst_pad_pause_task (filter-&gt;sinkpad);
}
<!-- example-end task.c j -->
<!-- example-begin task.c k --><!--
#include "register.func"
--><!-- example-end task.c k --></programlisting>
</sect1>
<sect1 id="section-scheduling-randomxs" xreflabel="Providing random access">
<title>Providing random access</title>
<para>
In the previous section, we have talked about how elements (or pads)
that are activated to drive the pipeline using their own task, must use
pull-mode scheduling on their sinkpads. This means that all pads linked
to those pads need to be activated in pull-mode.
Source pads activated in pull-mode must implement a
<function>_get_range ()</function>-function set using
<function>gst_pad_set_getrange_function ()</function>, and
that function will be called when the peer pad requests some data with
<function>gst_pad_pull_range ()</function>.
The element is then responsible for seeking to the right offset and
providing the requested data. Several elements can implement random
access:
</para>
<itemizedlist>
<listitem>
<para>
Data sources, such as a file source, that can provide data from any
offset with reasonably low latency.
</para>
</listitem>
<listitem>
<para>
Filters that would like to provide a pull-mode scheduling
over the whole pipeline.
</para>
</listitem>
<listitem>
<para>
Parsers that can easily provide this by skipping a small part of
their input, thus essentially "forwarding" getrange
requests without any processing of their own. Examples
include tag readers (e.g. ID3) or single-output parsers, such as
a WAVE parser.
</para>
</listitem>
</itemizedlist>
<para>
The following example will show how a <function>_get_range
()</function>-function can be implemented in a source element:
</para>
<programlisting><!-- example-begin range.c a -->
#include "filter.h"
static GstFlowReturn
gst_my_filter_get_range (GstPad * pad,
GstObject * parent,
guint64 offset,
guint length,
GstBuffer ** buf);
G_DEFINE_TYPE (GstMyFilter, gst_my_filter, GST_TYPE_ELEMENT);
<!-- example-end range.c a -->
<!-- example-begin range.c b --><!--
static void
gst_my_filter_class_init (gpointer klass)
{
GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
static GstElementDetails my_filter_details = {
"An example plugin",
"Example/FirstExample",
"Shows the basic structure of a plugin",
"your name <your.name@your.isp>"
};
static GstStaticPadTemplate src_factory =
GST_STATIC_PAD_TEMPLATE (
"src",
GST_PAD_SRC,
GST_PAD_ALWAYS,
GST_STATIC_CAPS ("ANY")
);
gst_element_class_set_details (element_class, &my_filter_details);
gst_element_class_add_pad_template (element_class,
gst_static_pad_template_get (&src_factory));
}
static void
gst_my_filter_class_init (GstMyFilterClass * klass)
{
}
--><!-- example-end range.c b -->
<!-- example-begin range.c c -->
static void
gst_my_filter_init (GstMyFilter * filter)
{
<!-- example-end task.c c --><!--
GstElementClass *klass = GST_ELEMENT_GET_CLASS (filter);
filter-&gt;srcpad = gst_pad_new_from_template (
gst_element_class_get_pad_template (klass, "src"), "src");
-->
[..]<!-- example-begin task.c d --><!--
--><!-- example-end task.c d -->
<!-- example-begin task.c e -->
gst_pad_set_getrange_function (filter-&gt;srcpad,
gst_my_filter_get_range);
<!-- example-end range.c c --><!--
gst_element_add_pad (GST_ELEMENT (filter), filter-&gt;srcpad);
-->
[..]<!-- example-begin range.c d -->
}
static GstFlowReturn
gst_my_filter_get_range (GstPad * pad,
GstObject * parent,
guint64 offset,
guint length,
GstBuffer ** buf)
{
<!-- example-end range.c d -->
GstMyFilter *filter = GST_MY_FILTER (parent);
[.. here, you would fill *buf ..]
<!-- example-begin range.c e -->
return GST_FLOW_OK;
}
<!-- example-end range.c e -->
<!-- example-begin range.c f --><!--
#include "register.func"
--><!-- example-end range.c f --></programlisting>
<para>
    In practice, many elements that could theoretically do random access
    are often activated in push-mode scheduling anyway, since there is no
    downstream element able to start its own task. Such elements should
    therefore implement both a
    <function>_get_range ()</function>-function and a <function>_chain
    ()</function>-function (for filters and parsers), or a <function>_get_range
    ()</function>-function and be prepared to start their own task by
    providing <function>_activate_* ()</function>-functions (for
    source elements).
</para>
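<para>
  As an illustration of that advice, a filter supporting both modes
  might register both functions on its pads in its
  <function>_init ()</function> function. This is only a sketch; the
  <function>gst_my_filter_chain</function> name is an assumption and not
  part of the example above:
</para>
<programlisting><![CDATA[
```c
/* Sketch only: register a chain function for push-mode scheduling and a
 * getrange function for pull-mode scheduling; which one is used depends
 * on how the pads end up being activated. */
gst_pad_set_chain_function (filter->sinkpad,
    GST_DEBUG_FUNCPTR (gst_my_filter_chain));
gst_pad_set_getrange_function (filter->srcpad,
    GST_DEBUG_FUNCPTR (gst_my_filter_get_range));
```
]]></programlisting>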
</sect1>
</chapter>


@ -1,241 +0,0 @@
<chapter id="chapter-advanced-tagging">
<title>Tagging (Metadata and Streaminfo)</title>
<sect1 id="section-tagging-overview" xreflabel="Overview">
<title>Overview</title>
<para>
Tags are pieces of information stored in a stream that are not the content
itself, but they rather <emphasis>describe</emphasis> the content. Most
media container formats support tagging in one way or another. Ogg uses
VorbisComment for this, MP3 uses ID3, AVI and WAV use RIFF's INFO list
chunk, etc. GStreamer provides a general way for elements to read tags from
the stream and expose this to the user. The tags (at least the metadata)
will be part of the stream inside the pipeline. The consequence of this is
that transcoding of files from one format to another will automatically
preserve tags, as long as the input and output format elements both support
tagging.
</para>
<para>
    Tags are separated into two categories in &GStreamer;, even though
    applications won't notice the difference. The first are called
    <emphasis>metadata</emphasis>, the second
    <emphasis>streaminfo</emphasis>. Metadata are tags
that describe the non-technical parts of stream content. They can be
changed without needing to re-encode the stream completely. Examples are
<quote>author</quote>, <quote>title</quote> or <quote>album</quote>. The
container format might still need to be re-written for the tags to fit in,
though. Streaminfo, on the other hand, are tags that describe the stream
contents technically. To change them, the stream needs to be re-encoded.
Examples are <quote>codec</quote> or <quote>bitrate</quote>. Note that some
container formats (like ID3) store various streaminfo tags as metadata in
the file container, which means that they can be changed so that they don't
match the content in the file any more. Still, they are called metadata
because <emphasis>technically</emphasis>, they can be changed without
re-encoding the whole stream, even though that makes them invalid. Files
with such metadata tags will have the same tag twice: once as metadata,
once as streaminfo.
</para>
<para>
There is no special name for tag reading elements in &GStreamer;. There are
specialised elements (e.g. id3demux) that do nothing besides tag reading,
but any &GStreamer; element may extract tags while processing data, and
most decoders, demuxers and parsers do.
</para>
<para>
A tag writer is called <ulink type="http"
url="../../gstreamer/html/GstTagSetter.html"><classname>TagSetter</classname></ulink>.
An element supporting both can be used in a tag editor for quick tag
changing (note: in-place tag editing is still poorly supported at the time
of writing and usually requires tag extraction/stripping and remuxing of
the stream with new tags).
</para>
</sect1>
<sect1 id="section-tagging-read" xreflabel="Reading Tags from Streams">
<title>Reading Tags from Streams</title>
<para>
The basic object for tags is a <ulink type="http"
url="../../gstreamer/html/GstTagList.html"><classname>GstTagList
</classname></ulink>. An element that is reading tags from a stream should
    create an empty taglist and fill this with individual tags. Empty tag
    lists can be created with <function>gst_tag_list_new_empty ()</function>. Then,
the element can fill the list using <function>gst_tag_list_add ()
</function> or <function>gst_tag_list_add_values ()</function>.
Note that elements often read metadata as strings, but the
values in the taglist might not necessarily be strings - they need to be
of the type the tag was registered as (the API documentation for each
predefined tag should contain the type). Be sure to use functions like
<function>gst_value_transform ()</function>
to make sure that your data is of the right type.
After data reading, you can send the tags downstream with the TAG event.
When the TAG event reaches the sink, it will post the TAG message on
the pipeline's GstBus for the application to pick up.
</para>
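<para>
  As a minimal sketch (not taken from a real element; the pad name and
  the tag value are assumptions), an element that has just parsed a
  title string from the stream could do something like this:
</para>
<programlisting><![CDATA[
```c
GstTagList *taglist;

/* create an empty list and add the tag we found in the stream */
taglist = gst_tag_list_new_empty ();
gst_tag_list_add (taglist, GST_TAG_MERGE_APPEND,
    GST_TAG_TITLE, "Some Title", NULL);

/* send the tags downstream; the TAG event takes ownership of the list */
gst_pad_push_event (filter->srcpad, gst_event_new_tag (taglist));
```
]]></programlisting>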
<para>
    We currently require the core to know the GType of tags before they
    are used, so all tags must be registered first. You can add new tags
to the list of known tags using <function>gst_tag_register ()</function>.
If you think the tag will be useful in more cases than just your own
    element, it might be a good idea to add it to <filename>gsttaglist.c</filename>
instead. That's up to you to decide. If you want to do it in your own
element, it's easiest to register the tag in one of your class init
functions, preferably <function>_class_init ()</function>.
</para>
<programlisting>
<![CDATA[
static void
gst_my_filter_class_init (GstMyFilterClass *klass)
{
[..]
gst_tag_register ("my_tag_name", GST_TAG_FLAG_META,
G_TYPE_STRING,
_("my own tag"),
_("a tag that is specific to my own element"),
NULL);
[..]
}
]]>
</programlisting>
</sect1>
<sect1 id="section-tagging-write" xreflabel="Writing Tags to Streams">
<title>Writing Tags to Streams</title>
<para>
Tag writers are the opposite of tag readers. Tag writers only take
metadata tags into account, since that's the only type of tags that have
to be written into a stream. Tag writers can receive tags in three ways:
internal, application and pipeline. Internal tags are tags read by the
element itself, which means that the tag writer is - in that case - a tag
reader, too. Application tags are tags provided to the element via the
TagSetter interface (which is just a layer). Pipeline tags are tags
provided to the element from within the pipeline. The element receives
such tags via the <symbol>GST_EVENT_TAG</symbol> event, which means
that tags writers should implement an event handler. The tag writer is
responsible for combining all these three into one list and writing them
to the output stream.
</para>
<para>
The example below will receive tags from both application and pipeline,
combine them and write them to the output stream. It implements the tag
setter so applications can set tags, and retrieves pipeline tags from
incoming events.
</para>
<para>
    Warning: this example is outdated and no longer works with the 1.0
    version of &GStreamer;.
</para>
<programlisting>
<![CDATA[
GType
gst_my_filter_get_type (void)
{
[..]
static const GInterfaceInfo tag_setter_info = {
NULL,
NULL,
NULL
};
[..]
g_type_add_interface_static (my_filter_type,
GST_TYPE_TAG_SETTER,
&tag_setter_info);
[..]
}
static void
gst_my_filter_init (GstMyFilter *filter)
{
[..]
}
/*
* Write one tag.
*/
static void
gst_my_filter_write_tag (const GstTagList *taglist,
const gchar *tagname,
gpointer data)
{
  GstMyFilter *filter = GST_MY_FILTER (data);
  GstBuffer *buf;
  guint num_values = gst_tag_list_get_tag_size (taglist, tagname), n;
  const GValue *from;
  GValue to = { 0 };

  g_value_init (&to, G_TYPE_STRING);

  for (n = 0; n < num_values; n++) {
    gchar *str;
    gsize size;

    from = gst_tag_list_get_value_index (taglist, tagname, n);
    g_value_transform (from, &to);

    str = g_strdup_printf ("%s:%s", tagname,
        g_value_get_string (&to));
    size = strlen (str);
    buf = gst_buffer_new_wrapped (str, size);
    gst_pad_push (filter->srcpad, buf);
  }

  g_value_unset (&to);
}
static void
gst_my_filter_task_func (GstElement *element)
{
GstMyFilter *filter = GST_MY_FILTER (element);
GstTagSetter *tagsetter = GST_TAG_SETTER (element);
GstData *data;
GstEvent *event;
gboolean eos = FALSE;
GstTagList *taglist = gst_tag_list_new ();
while (!eos) {
data = gst_pad_pull (filter->sinkpad);
/* We're not very much interested in data right now */
if (GST_IS_BUFFER (data))
gst_buffer_unref (GST_BUFFER (data));
event = GST_EVENT (data);
switch (GST_EVENT_TYPE (event)) {
case GST_EVENT_TAG:
gst_tag_list_insert (taglist, gst_event_tag_get_list (event),
GST_TAG_MERGE_PREPEND);
gst_event_unref (event);
break;
case GST_EVENT_EOS:
eos = TRUE;
gst_event_unref (event);
break;
default:
gst_pad_event_default (filter->sinkpad, event);
break;
}
}
/* merge tags with the ones retrieved from the application */
  if (gst_tag_setter_get_tag_list (tagsetter)) {
gst_tag_list_insert (taglist,
gst_tag_setter_get_tag_list (tagsetter),
gst_tag_setter_get_tag_merge_mode (tagsetter));
}
/* write tags */
gst_tag_list_foreach (taglist, gst_my_filter_write_tag, filter);
/* signal EOS */
gst_pad_push (filter->srcpad, gst_event_new (GST_EVENT_EOS));
}
]]>
</programlisting>
<para>
Note that normally, elements would not read the full stream before
processing tags. Rather, they would read from each sinkpad until they've
received data (since tags usually come in before the first data buffer)
and process that.
</para>
</sect1>
</chapter>

File diff suppressed because it is too large


@ -1,195 +0,0 @@
<chapter id="chapter-checklist-element">
<title>Things to check when writing an element</title>
<para>
This chapter contains a fairly random selection of things to take care
of when writing an element. It's up to you how far you're going to stick
to those guidelines. However, keep in mind that when you're writing an
element and hope for it to be included in the mainstream &GStreamer;
distribution, it <emphasis>has to</emphasis> meet those requirements.
As far as possible, we will try to explain why those requirements are
set.
</para>
<sect1 id="section-checklist-states">
<title>About states</title>
<itemizedlist>
<listitem>
<para>
        Make sure the state of an element gets reset when going to
        <classname>NULL</classname>. Ideally, this should set all
        object properties to their original state. The function that
        performs this reset should also be called from
        <function>_init ()</function>.
</para>
</listitem>
<listitem>
<para>
Make sure an element forgets <emphasis>everything</emphasis>
about its contained stream when going from
<classname>PAUSED</classname> to <classname>READY</classname>. In
<classname>READY</classname>, all stream states are reset. An
element that goes from <classname>PAUSED</classname> to
<classname>READY</classname> and back to
<classname>PAUSED</classname> should start reading the
stream from the start again.
</para>
</listitem>
<listitem>
<para>
        People who use <command>gst-launch</command> for testing tend
        not to care about cleaning up. This is
        <emphasis>wrong</emphasis>. An element should be tested using
various applications, where testing not only means to <quote>make
sure it doesn't crash</quote>, but also to test for memory leaks
using tools such as <command>valgrind</command>. Elements have to
be reusable in a pipeline after having been reset.
</para>
</listitem>
</itemizedlist>
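<para>
  A sketch of what such a reset helper could look like (the field names
  are assumptions borrowed from the example element elsewhere in this
  guide):
</para>
<programlisting><![CDATA[
```c
/* Hypothetical helper: called from _init () and on the PAUSED to READY
 * transition, so the element starts from a clean slate every time. */
static void
gst_my_filter_reset (GstMyFilter * filter)
{
  /* stream-specific state */
  filter->offset = 0;

  /* object properties back to their defaults */
  filter->silent = FALSE;
}
```
]]></programlisting>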
</sect1>
<sect1 id="section-checklist-debug">
<title>Debugging</title>
<itemizedlist>
<listitem>
<para>
Elements should <emphasis>never</emphasis> use their standard
output for debugging (using functions such as <function>printf
()</function> or <function>g_print ()</function>). Instead,
elements should use the logging functions provided by &GStreamer;,
named <function>GST_DEBUG ()</function>,
<function>GST_LOG ()</function>, <function>GST_INFO ()</function>,
<function>GST_WARNING ()</function> and
<function>GST_ERROR ()</function>. The various logging levels can
be turned on and off at runtime and can thus be used for solving
issues as they turn up. Instead of <function>GST_LOG ()</function>
(as an example), you can also use <function>GST_LOG_OBJECT
()</function> to print the object that you're logging output for.
</para>
</listitem>
<listitem>
<para>
Ideally, elements should use their own debugging category. Most
elements use the following code to do that:
</para>
<programlisting>
GST_DEBUG_CATEGORY_STATIC (myelement_debug);
#define GST_CAT_DEFAULT myelement_debug
[..]
static void
gst_myelement_class_init (GstMyelementClass *klass)
{
[..]
GST_DEBUG_CATEGORY_INIT (myelement_debug, "myelement",
0, "My own element");
}
</programlisting>
<para>
At runtime, you can turn on debugging using the commandline
option <command>--gst-debug=myelement:5</command>.
</para>
</listitem>
<listitem>
<para>
Elements should use GST_DEBUG_FUNCPTR when setting pad functions or
overriding element class methods, for example:
<programlisting>
gst_pad_set_event_func (myelement->srcpad,
GST_DEBUG_FUNCPTR (my_element_src_event));
</programlisting>
This makes debug output much easier to read later on.
</para>
</listitem>
<listitem>
<para>
Elements that are aimed for inclusion into one of the GStreamer
modules should ensure consistent naming of the element name,
structures and function names. For example, if the element type is
GstYellowFooDec, functions should be prefixed with
gst_yellow_foo_dec_ and the element should be registered
        as 'yellowfoodec'. Separate words should remain separate in this scheme,
so it should be GstFooDec and gst_foo_dec, and not GstFoodec and
gst_foodec.
</para>
</listitem>
</itemizedlist>
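<para>
  The naming rule above is mechanical enough to sketch in plain C. The
  helper below is purely illustrative and not part of &GStreamer;: it
  derives the function prefix from a type name by inserting an
  underscore before every upper-case letter and lower-casing the result,
  so <quote>GstYellowFooDec</quote> becomes
  <quote>gst_yellow_foo_dec</quote>:
</para>
<programlisting><![CDATA[
```c
#include <ctype.h>
#include <stddef.h>

/* Hypothetical helper, for illustration only: turn a GObject type name
 * such as "GstYellowFooDec" into the matching function prefix
 * "gst_yellow_foo_dec". */
static void
type_name_to_prefix (const char *type_name, char *out, size_t out_size)
{
  size_t i, j = 0;

  for (i = 0; type_name[i] != '\0' && j + 2 < out_size; i++) {
    if (isupper ((unsigned char) type_name[i])) {
      if (i > 0)
        out[j++] = '_';                     /* word boundary */
      out[j++] = (char) tolower ((unsigned char) type_name[i]);
    } else {
      out[j++] = type_name[i];
    }
  }
  out[j] = '\0';
}
```
]]></programlisting>
<para>
  The element name is then simply this prefix with the
  <quote>gst_</quote> part and the underscores removed
  (<quote>yellowfoodec</quote>).
</para>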
</sect1>
<sect1 id="section-checklist-query">
<title>Querying, events and the like</title>
<itemizedlist>
<listitem>
<para>
All elements to which it applies (sources, sinks, demuxers)
should implement query functions on their pads, so that
applications and neighbour elements can request the current
position, the stream length (if known) and so on.
</para>
</listitem>
<listitem>
<para>
Elements should make sure they forward events they do not
handle with gst_pad_event_default (pad, parent, event) instead of
just dropping them. Events should never be dropped unless
specifically intended.
</para>
</listitem>
<listitem>
<para>
Elements should make sure they forward queries they do not
handle with gst_pad_query_default (pad, parent, query) instead of
just dropping them.
</para>
</listitem>
</itemizedlist>
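<para>
  A sketch of such an event handler (the function name is made up for
  illustration): everything the element does not handle itself is passed
  to <function>gst_pad_event_default ()</function> rather than dropped:
</para>
<programlisting><![CDATA[
```c
static gboolean
gst_my_filter_sink_event (GstPad * pad, GstObject * parent, GstEvent * event)
{
  switch (GST_EVENT_TYPE (event)) {
    case GST_EVENT_EOS:
      /* ... flush any pending data here ... */
      /* then forward the event like everything else: fall through */
    default:
      /* never just drop events we do not understand */
      return gst_pad_event_default (pad, parent, event);
  }
}
```
]]></programlisting>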
</sect1>
<sect1 id="section-checklist-testing">
<title>Testing your element</title>
<itemizedlist>
<listitem>
<para>
<command>gst-launch</command> is <emphasis>not</emphasis> a good
tool to show that your element is finished. Applications such as
Rhythmbox and Totem (for GNOME) or AmaroK (for KDE)
<emphasis>are</emphasis>. <command>gst-launch</command> will not
test various things such as proper clean-up on reset, event
handling, querying and so on.
</para>
</listitem>
<listitem>
<para>
Parsers and demuxers should make sure to check their input. Input
cannot be trusted. Prevent possible buffer overflows and the like.
Feel free to error out on unrecoverable stream errors. Test your
demuxer using stream corruption elements such as
<classname>breakmydata</classname> (included in gst-plugins). It
will randomly insert, delete and modify bytes in a stream, and is
therefore a good test for robustness. If your element crashes
when adding this element, your element needs fixing. If it errors
out properly, it's good enough. Ideally, it'd just continue to
work and forward data as much as possible.
</para>
</listitem>
<listitem>
<para>
Demuxers should not assume that seeking works. Be prepared to
work with unseekable input streams (e.g. network sources) as
well.
</para>
</listitem>
<listitem>
<para>
        Sources and sinks should be prepared to be assigned a clock other
        than the one they expose themselves. Always use the provided clock
        for synchronization, otherwise you'll get A/V sync issues.
</para>
</listitem>
</itemizedlist>
</sect1>
</chapter>


@ -1,38 +0,0 @@
<chapter id="chapter-licensing-advisory">
<title>GStreamer licensing</title>
<sect1 id="section-application-licensing">
<title>How to license the code you write for <application>GStreamer</application></title>
<para>
GStreamer is a plugin-based framework licensed under the LGPL. The reason
for this choice in licensing is to ensure that everyone can use GStreamer
to build applications using licenses of their choice.
</para>
<para>
To keep this policy viable, the GStreamer community has made a few
licensing rules for code to be included in GStreamer's core or GStreamer's
official modules, like our plugin packages. We require that all code going
into our core package is LGPL. For the plugin code, we require the use of
the LGPL for all plugins written from scratch or linking to external
libraries. The only exception to this is when plugins contain older code
under more liberal licenses (like the MPL or BSD). They can use those
licenses instead and will still be considered for inclusion. We do not
accept GPL code to be added to our plugins module, but we do accept
LGPL-licensed plugins using an external GPL library. The reason for
demanding plugins be licensed under the LGPL, even when using a GPL
library, is that other developers might want to use the plugin code as a
template for plugins linking to non-GPL libraries.
</para>
<para>
We also plan on splitting out the plugins using GPL libraries into a
separate package eventually and implement a system which makes sure an
application will not be able to access these plugins unless it uses some
special code to do so. The point of this is not to block GPL-licensed
plugins from being used and developed, but to make sure people are not
unintentionally violating the GPL license of said plugins.
</para>
<para>
    This advisory is part of a larger licensing advisory with a FAQ, which
    you can find on the <ulink url="http://gstreamer.freedesktop.org/documentation/licensing.html">GStreamer website</ulink>.
</para>
</sect1>
</chapter>


@ -1,196 +0,0 @@
<chapter id="chapter-porting">
<title>Porting 0.8 plug-ins to 0.10</title>
<para>
This section of the appendix will discuss shortly what changes to
plugins will be needed to quickly and conveniently port most
applications from &GStreamer;-0.8 to &GStreamer;-0.10, with references
to the relevant sections in this Plugin Writer's Guide where needed.
With this list, it should be possible to port most plugins to
&GStreamer;-0.10 in less than a day. Exceptions are elements that will
require a base class in 0.10 (sources, sinks), in which case it may take
a lot longer, depending on the coder's skills (however, when using the
<classname>GstBaseSink</classname> and <classname>GstBaseSrc</classname>
base-classes, it shouldn't be all too bad), and elements requiring
the deprecated bytestream interface, which should take 1-2 days with
random access. The scheduling parts of muxers will also need a rewrite,
which will take about the same amount of time.
</para>
<sect1 id="section-porting-objects">
<title>List of changes</title>
<itemizedlist>
<listitem>
<para>
Discont events have been replaced by newsegment events. In 0.10, it is
essential that you send a newsegment event downstream before you send
your first buffer (in 0.8 the scheduler would invent discont events if
you forgot them, in 0.10 this is no longer the case).
</para>
</listitem>
<listitem>
<para>
In 0.10, buffers have caps attached to them. Elements should allocate
new buffers with <function>gst_pad_alloc_buffer ()</function>. See
<xref linkend="chapter-negotiation"/> for more details.
</para>
</listitem>
<listitem>
<para>
Most functions returning an object or an object property have
been changed to return its own reference rather than a constant
reference of the one owned by the object itself. The reason for
this change is primarily thread-safety. This means effectively
that return values of functions such as
<function>gst_element_get_pad ()</function>,
<function>gst_pad_get_name ()</function>,
<function>gst_pad_get_parent ()</function>,
<function>gst_object_get_parent ()</function>,
and many more like these
have to be free'ed or unreferenced after use. Check the API
references of each function to know for sure whether return
values should be free'ed or not.
</para>
</listitem>
<listitem>
<para>
In 0.8, scheduling could happen in any way. Source elements could
be <function>_get ()</function>-based or <function>_loop
()</function>-based, and any other element could be <function>_chain
()</function>-based or <function>_loop ()</function>-based, with
no limitations. Scheduling in 0.10 is simpler for the scheduler,
and the element is expected to do some more work. Pads get
assigned a scheduling mode, based on which they can either
operate in random access-mode, in pipeline driving mode or in
        push-mode. All this is documented in detail in <xref
linkend="chapter-scheduling"/>. As a result of this, the bytestream
object no longer exists. Elements requiring byte-level access should
now use random access on their sinkpads.
</para>
</listitem>
<listitem>
<para>
Negotiation is asynchronous. This means that downstream negotiation
is done as data comes in and upstream negotiation is done whenever
renegotiation is required. All details are described in
<xref linkend="chapter-negotiation"/>.
</para>
</listitem>
<listitem>
<para>
For as far as possible, elements should try to use existing base
classes in 0.10. Sink and source elements, for example, could derive
from <classname>GstBaseSrc</classname> and
<classname>GstBaseSink</classname>. Audio sinks or sources could even
derive from audio-specific base classes. All existing base classes
have been discussed in <xref linkend="chapter-other-base"/> and the
next few chapters.
</para>
</listitem>
<listitem>
<para>
In 0.10, event handling and buffers are separated once again. This
means that in order to receive events, one no longer has to set the
<classname>GST_FLAG_EVENT_AWARE</classname> flag, but can simply
set an event handling function on the element's sinkpad(s), using
the function <function>gst_pad_set_event_function ()</function>. The
<function>_chain ()</function>-function will only receive buffers.
</para>
</listitem>
<listitem>
<para>
Although core will wrap most threading-related locking for you (e.g.
it takes the stream lock before calling your data handling
functions), you are still responsible for locking around certain
functions, e.g. object properties. Be sure to lock properly here,
since applications will change those properties in a different thread
than the thread which does the actual data passing! You can use the
<function>GST_OBJECT_LOCK ()</function> and <function>GST_OBJECT_UNLOCK
        ()</function> helpers in most cases, fortunately, which grab the
        default property lock of the element.
</para>
</listitem>
<listitem>
<para>
<classname>GstValueFixedList</classname> and all
<function>*_fixed_list_* ()</function> functions were renamed to
<classname>GstValueArray</classname> and <function>*_array_*
()</function>.
</para>
</listitem>
<listitem>
<para>
The semantics of <symbol>GST_STATE_PAUSED</symbol> and
<symbol>GST_STATE_PLAYING</symbol> have changed for elements that
are not sink elements. Non-sink elements need to be able to accept
and process data already in the <symbol>GST_STATE_PAUSED</symbol>
state now (i.e. when prerolling the pipeline). More details can be
found in <xref linkend="chapter-statemanage-states"/>.
</para>
</listitem>
<listitem>
<para>
If your plugin's state change function hasn't been superseded by
virtual start() and stop() methods of one of the new base classes,
then your plugin's state change functions may need to be changed in
order to safely handle concurrent access by multiple threads. Your
typical state change function will now first handle upwards state
changes, then chain up to the state change function of the parent
class (usually GstElementClass in these cases), and only then handle
downwards state changes. See the vorbis decoder plugin in
gst-plugins-base for an example.
</para>
<para>
The reason for this is that in the case of downwards state changes
you don't want to destroy allocated resources while your plugin's
chain function (for example) is still accessing those resources in
another thread. Whether your chain function might be running or not
depends on the state of your plugin's pads, and the state of those
pads is closely linked to the state of the element. Pad states are
handled in the GstElement class's state change function, including
proper locking, that's why it is essential to chain up before
destroying allocated resources.
</para>
<para>
As already mentioned above, you should really rewrite your plugin
to derive from one of the new base classes though, so you don't have
to worry about these things, as the base class will handle it for you.
There are no base classes for decoders and encoders yet, so the above
paragraphs about state changes definitively apply if your plugin is a
decoder or an encoder.
</para>
</listitem>
<listitem>
<para>
<function>gst_pad_set_link_function ()</function>, which used to set
a function that would be called when a format was negotiated between
two <classname>GstPad</classname>s, now sets a function that is
called when two elements are linked together in an application. For
all practical purposes, you most likely want to use the function
<function>gst_pad_set_setcaps_function ()</function>, nowadays, which
sets a function that is called when the format streaming over a pad
changes (so similar to <function>_set_link_function ()</function> in
&GStreamer;-0.8).
</para>
<para>
          If the element is derived from one of the <classname>GstBase*</classname>
          classes, then override the <function>set_caps ()</function> virtual
          method instead.
</para>
</listitem>
<listitem>
<para>
<function>gst_pad_use_explicit_caps ()</function> has been replaced by
<function>gst_pad_use_fixed_caps ()</function>. You can then set the
fixed caps to use on a pad with <function>gst_pad_set_caps ()</function>.
</para>
</listitem>
</itemizedlist>
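<para>
  The recommended ordering for such a state change function can be
  sketched as follows (the <function>gst_my_filter_allocate ()</function>
  and <function>gst_my_filter_free ()</function> helpers are
  hypothetical):
</para>
<programlisting><![CDATA[
```c
static GstStateChangeReturn
gst_my_filter_change_state (GstElement * element, GstStateChange transition)
{
  GstMyFilter *filter = GST_MY_FILTER (element);
  GstStateChangeReturn ret;

  /* upwards state changes: set up resources before streaming starts */
  switch (transition) {
    case GST_STATE_CHANGE_NULL_TO_READY:
      if (!gst_my_filter_allocate (filter))
        return GST_STATE_CHANGE_FAILURE;
      break;
    default:
      break;
  }

  /* chain up: the parent class deactivates the pads on downward
   * changes, so the streaming thread has stopped when it returns */
  ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition);
  if (ret == GST_STATE_CHANGE_FAILURE)
    return ret;

  /* downwards state changes: only free resources after chaining up */
  switch (transition) {
    case GST_STATE_CHANGE_READY_TO_NULL:
      gst_my_filter_free (filter);
      break;
    default:
      break;
  }
  return ret;
}
```
]]></programlisting>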
</sect1>
</chapter>
<chapter id="chapter-porting-1_0">
<title>Porting 0.10 plug-ins to 1.0</title>
<para>
You can find the list of changes in the
<ulink url="http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/random/porting-to-1.0.txt">Porting to 1.0</ulink> document.
</para>
</chapter>



@ -1,466 +0,0 @@
<!-- ############ chapter ############# -->
<chapter id="chapter-building-boiler" xreflabel="Constructing the Boilerplate">
<title>Constructing the Boilerplate</title>
<para>
In this chapter you will learn how to construct the bare minimum code for a
new plugin. Starting from ground zero, you will see how to get the
&GStreamer; template source. Then you will learn how to use a few basic
tools to copy and modify a template plugin to create a new plugin. If you
follow the examples here, then by the end of this chapter you will have a
functional audio filter plugin that you can compile and use in &GStreamer;
applications.
</para>
<!-- ############ sect1 ############# -->
<sect1 id="section-boiler-source" xreflabel="Getting the GStreamer Plugin Templates">
<title>Getting the GStreamer Plugin Templates</title>
<para>
There are currently two ways to develop a new plugin for &GStreamer;: You
can write the entire plugin by hand, or you can copy an existing plugin
template and write the plugin code you need. The second method is by far
the simpler of the two, so the first method will not even be described
here. (Errm, that is, <quote>it is left as an exercise to the
reader.</quote>)
</para>
<para>
The first step is to check out a copy of the
<filename>gst-template</filename> git module to get an important tool and
the source code template for a basic &GStreamer; plugin. To check out the
<filename>gst-template</filename> module, make sure you are connected to
the internet, and type the following commands at a command console:
</para>
<screen>
<prompt>shell $ </prompt><userinput>git clone git://anongit.freedesktop.org/gstreamer/gst-template.git</userinput>
Initialized empty Git repository in /some/path/gst-template/.git/
remote: Counting objects: 373, done.
remote: Compressing objects: 100% (114/114), done.
remote: Total 373 (delta 240), reused 373 (delta 240)
Receiving objects: 100% (373/373), 75.16 KiB | 78 KiB/s, done.
Resolving deltas: 100% (240/240), done.
</screen>
<para>
This command will check out a series of files and directories into
<filename class="directory">gst-template</filename>. The template you
will be using is in the
<filename class="directory">gst-template/gst-plugin/</filename>
directory. You should look over the files in that directory to get a
general idea of the structure of a source tree for a plugin.
</para>
<para>
If for some reason you can't access the git repository, you can also
<ulink type="http"
url="http://cgit.freedesktop.org/gstreamer/gst-template/commit/">
download a snapshot of the latest revision</ulink> via the cgit web
interface.
</para>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-boiler-project-stamp" xreflabel="Using the Project Stamp">
<title>Using the Project Stamp</title>
<para>
The first thing to do when making a new element is to specify some basic
details about it: what its name is, who wrote it, what version number it
is, etc. We also need to define an object to represent the element and to
store the data the element needs. These details are collectively known as
the <emphasis>boilerplate</emphasis>.
</para>
<para>
The standard way of defining the boilerplate is simply to write some code,
and fill in some structures. As mentioned in the previous section, the
easiest way to do this is to copy a template and add functionality
according to your needs. To help you do so, there is a tool in the
<filename class="directory">./gst-plugin/tools/</filename> directory.
This tool, <filename>make_element</filename>, is a command line utility
that creates the boilerplate code for you.
</para>
<para>
To use <command>make_element</command>, first open up a terminal window.
Change to the <filename class="directory">gst-template/gst-plugin/src</filename>
directory, and then run the <command>make_element</command> command. The
arguments to the <command>make_element</command> are:
</para>
<orderedlist>
<listitem>
<para>the name of the plugin, and</para>
</listitem>
<listitem>
<para>
the source file that the tool will use. By default,
<filename>gstplugin</filename> is used.
</para>
</listitem>
</orderedlist>
<para>
For example, the following commands create the MyFilter plugin based on
the plugin template and put the output files in the
<filename class="directory">gst-template/gst-plugin/src</filename>
directory:
</para>
<screen>
<prompt>shell $ </prompt><userinput>cd gst-template/gst-plugin/src</userinput>
<prompt>shell $ </prompt><userinput>../tools/make_element MyFilter</userinput>
</screen>
<note>
<para>
Capitalization is important for the name of the plugin. Keep in mind
that under some operating systems, capitalization is also important
when specifying directory and file names in general.
</para>
</note>
<para>
The last command creates two files:
<filename>gstmyfilter.c</filename> and
<filename>gstmyfilter.h</filename>.
</para>
<note>
<para>
It is recommended that you create a copy of the <filename
class="directory">gst-plugin</filename>
directory before continuing.
</para>
</note>
<para>
Now one needs to adjust the <filename>Makefile.am</filename> to use the
new filenames and run <filename>autogen.sh</filename> from the parent
directory to bootstrap the build environment. After that, the project
can be built and installed using the well known
<userinput>make &amp;&amp; sudo make install</userinput> commands.
</para>
<note>
<para>
Be aware that by default <filename>autogen.sh</filename> and
<filename>configure</filename> would choose <filename class="directory">/usr/local</filename>
as a default location. One would need to add
<filename class="directory">/usr/local/lib/gstreamer-1.0</filename>
to <symbol>GST_PLUGIN_PATH</symbol> in order to make the new plugin
      show up in a &GStreamer; that has been installed from packages.
</para>
</note>
<note>
<para>
FIXME: this section is slightly outdated. gst-template is still useful
as an example for a minimal plugin build system skeleton. However, for
creating elements the tool gst-element-maker from gst-plugins-bad is
recommended these days.
</para>
</note>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-boiler-examine">
<title>Examining the Basic Code</title>
<para>
First we will examine the code you would be likely to place in a header
file (although since the interface to the code is entirely defined by the
plugin system, and doesn't depend on reading a header file, this is not
      crucial).
</para>
<example id="ex-boiler-examine-h">
<title>Example Plugin Header File</title>
<programlisting><!-- example-begin filter.h a -->
#include &lt;gst/gst.h&gt;
/* Definition of structure storing data for this element. */
typedef struct _GstMyFilter {
GstElement element;
GstPad *sinkpad, *srcpad;
gboolean silent;
<!-- example-end filter.h a -->
<!-- example-begin filter.h b --><!--
gint samplerate, channels;
gint from_samplerate, to_samplerate;
gboolean passthrough;
guint64 offset;
--><!-- example-end filter.h b -->
<!-- example-begin filter.h c -->
} GstMyFilter;
/* Standard definition defining a class for this element. */
typedef struct _GstMyFilterClass {
GstElementClass parent_class;
} GstMyFilterClass;
/* Standard macros for defining types for this element. */
#define GST_TYPE_MY_FILTER (gst_my_filter_get_type())
#define GST_MY_FILTER(obj) \
(G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_MY_FILTER,GstMyFilter))
#define GST_MY_FILTER_CLASS(klass) \
(G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_MY_FILTER,GstMyFilterClass))
#define GST_IS_MY_FILTER(obj) \
(G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_MY_FILTER))
#define GST_IS_MY_FILTER_CLASS(klass) \
(G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_MY_FILTER))
/* Standard function returning type information. */
GType gst_my_filter_get_type (void);
<!-- example-end filter.h c --></programlisting>
</example>
<para>
Using this header file, you can use the following macro to setup
the <classname>GObject</classname> basics in your source file so
that all functions will be called appropriately:
</para>
<programlisting><!-- example-begin boilerplate.c a -->
#include "filter.h"
G_DEFINE_TYPE (GstMyFilter, gst_my_filter, GST_TYPE_ELEMENT);
<!-- example-end boilerplate.c a --></programlisting>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-boiler-details">
<title>Element metadata</title>
<para>
The Element metadata provides extra element information. It is configured
with <function>gst_element_class_set_metadata</function> or
<function>gst_element_class_set_static_metadata</function> which takes the
following parameters:
</para>
<itemizedlist>
<listitem><para>
A long, English, name for the element.
</para></listitem><listitem><para>
The type of the element, see the docs/design/draft-klass.txt document
in the GStreamer core source tree for details and examples.
</para></listitem><listitem><para>
A brief description of the purpose of the element.
</para></listitem><listitem><para>
The name of the author of the element, optionally followed by a contact
email address in angle brackets.
</para></listitem>
</itemizedlist>
<para>
For example:
</para>
<programlisting>
gst_element_class_set_static_metadata (klass,
"An example plugin",
"Example/FirstExample",
"Shows the basic structure of a plugin",
"your name &lt;your.name@your.isp&gt;");
</programlisting>
<para>
The element details are registered with the plugin during
the <function>_class_init ()</function> function, which is part of
the GObject system. The <function>_class_init ()</function> function
should be set for this GObject in the function where you register
the type with GLib.
</para>
<programlisting><!-- example-begin boilerplate.c c -->
static void
gst_my_filter_class_init (GstMyFilterClass * klass)
{
GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
<!-- example-end boilerplate.c c -->
[..]<!-- example-begin boilerplate.c d -->
  gst_element_class_set_static_metadata (element_class,
"An example plugin",
"Example/FirstExample",
"Shows the basic structure of a plugin",
"your name &lt;your.name@your.isp&gt;");
<!-- example-end boilerplate.c d -->
}
</programlisting>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-boiler-padtemplates">
<title>GstStaticPadTemplate</title>
<para>
A GstStaticPadTemplate is a description of a pad that the element will
(or might) create and use. It contains:
</para>
<itemizedlist>
<listitem>
<para>A short name for the pad.</para>
</listitem>
<listitem>
<para>Pad direction.</para>
</listitem>
<listitem>
<para>
Existence property. This indicates whether the pad exists always (an
<quote>always</quote> pad), only in some cases (a
<quote>sometimes</quote> pad) or only if the application requested
such a pad (a <quote>request</quote> pad).
</para>
</listitem>
<listitem>
<para>Supported types by this element (capabilities).</para>
</listitem>
</itemizedlist>
<para>
For example:
</para>
<programlisting><!-- example-begin boilerplate.c e -->
static GstStaticPadTemplate sink_factory =
GST_STATIC_PAD_TEMPLATE (
"sink",
GST_PAD_SINK,
GST_PAD_ALWAYS,
GST_STATIC_CAPS ("ANY")
);
<!-- example-end boilerplate.c e -->
<!-- example-begin boilerplate.c f --><!--
static GstStaticPadTemplate src_factory =
GST_STATIC_PAD_TEMPLATE (
"src",
GST_PAD_SRC,
GST_PAD_ALWAYS,
GST_STATIC_CAPS ("ANY")
);
--><!-- example-end boilerplate.c f -->
</programlisting>
<para>
      Those pad templates are registered during the
      <function>_class_init ()</function> function with
      <function>gst_element_class_add_pad_template ()</function>. For this
      function you need a handle to the <classname>GstPadTemplate</classname>,
      which you can create from the static pad template with
      <function>gst_static_pad_template_get ()</function>. See below for more
      details.
</para>
<para>
Pads are created from these static templates in the element's
<function>_init ()</function> function using
<function>gst_pad_new_from_static_template ()</function>.
In order to create a new pad from this
template using <function>gst_pad_new_from_static_template ()</function>, you
will need to declare the pad template as a global variable. More on
this subject in <xref linkend="chapter-building-pads"/>.
</para>
<programlisting>
static GstStaticPadTemplate sink_factory = [..],
src_factory = [..];
static void
gst_my_filter_class_init (GstMyFilterClass * klass)
{
GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
[..]
<!-- example-begin boilerplate.c g -->
gst_element_class_add_pad_template (element_class,
gst_static_pad_template_get (&amp;src_factory));
gst_element_class_add_pad_template (element_class,
gst_static_pad_template_get (&amp;sink_factory));
}
<!-- example-end boilerplate.c g -->
<!-- example-begin boilerplate.c h --><!--
static void
gst_my_filter_init (GstMyFilter * filter)
{
}
#include "register.func"
--><!-- example-end boilerplate.c h --></programlisting>
<para>
The last argument in a template is its type
or list of supported types. In this example, we use 'ANY', which means
that this element will accept all input. In real-life situations, you
would set a media type and optionally a set of properties to make sure
that only supported input will come in. This representation should be
      a string that starts with a media type, then a set of comma-separated
      properties with their supported values. For an audio filter that
supports raw integer 16-bit audio, mono or stereo at any samplerate, the
correct template would look like this:
</para>
<programlisting>
<![CDATA[
static GstStaticPadTemplate sink_factory =
GST_STATIC_PAD_TEMPLATE (
"sink",
GST_PAD_SINK,
GST_PAD_ALWAYS,
GST_STATIC_CAPS (
"audio/x-raw, "
"format = (string) " GST_AUDIO_NE (S16) ", "
"channels = (int) { 1, 2 }, "
"rate = (int) [ 8000, 96000 ]"
)
);
]]>
</programlisting>
<para>
Values surrounded by curly brackets (<quote>{</quote> and
<quote>}</quote>) are lists, values surrounded by square brackets
(<quote>[</quote> and <quote>]</quote>) are ranges.
Multiple sets of types are supported too, and should be separated by
a semicolon (<quote>;</quote>). Later, in the chapter on pads, we will
see how to use types to know the exact format of a stream:
<xref linkend="chapter-building-pads"/>.
</para>
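    <para>
      As an illustrative sketch (the second format set is invented here for
      the example, it is not part of the filter developed in this guide), a
      template that accepts either 16-bit integer or 32-bit float raw audio
      would combine two sets separated by a semicolon:
    </para>
    <programlisting>
<![CDATA[
static GstStaticPadTemplate sink_factory =
GST_STATIC_PAD_TEMPLATE (
  "sink",
  GST_PAD_SINK,
  GST_PAD_ALWAYS,
  GST_STATIC_CAPS (
    "audio/x-raw, "
      "format = (string) S16LE, "
      "channels = (int) { 1, 2 }, "
      "rate = (int) [ 8000, 96000 ]; "
    "audio/x-raw, "
      "format = (string) F32LE, "
      "channels = (int) { 1, 2 }, "
      "rate = (int) [ 8000, 96000 ]"
  )
);
]]>
    </programlisting>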
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-boiler-constructors">
<title>Constructor Functions</title>
<para>
Each element has two functions which are used for construction of an
element. The <function>_class_init()</function> function,
which is used to initialise the class only once (specifying what signals,
arguments and virtual functions the class has and setting up global
state); and the <function>_init()</function> function, which is used to
initialise a specific instance of this type.
</para>
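    <para>
      A minimal sketch of what these two functions look like for our example
      filter (the <symbol>silent</symbol> default value is carried over from
      the header shown earlier):
    </para>
    <programlisting>
<![CDATA[
static void
gst_my_filter_class_init (GstMyFilterClass * klass)
{
  /* one-time class setup: virtual functions, signals,
   * properties and pad templates go here */
}

static void
gst_my_filter_init (GstMyFilter * filter)
{
  /* per-instance setup: create pads and set
   * default property values here */
  filter->silent = FALSE;
}
]]>
    </programlisting>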
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-boiler-plugininit">
<title>The plugin_init function</title>
<para>
Once we have written code defining all the parts of the plugin, we need to
write the plugin_init() function. This is a special function, which is
      called as soon as the plugin is loaded, and should return TRUE or FALSE
      depending on whether it initialized any dependencies correctly.
Also, in this function, any supported element type in the plugin should
be registered.
</para>
<programlisting>
<!-- example-begin register.func -->
<![CDATA[
static gboolean
plugin_init (GstPlugin *plugin)
{
return gst_element_register (plugin, "my_filter",
GST_RANK_NONE,
GST_TYPE_MY_FILTER);
}
GST_PLUGIN_DEFINE (
GST_VERSION_MAJOR,
GST_VERSION_MINOR,
my_filter,
"My filter plugin",
plugin_init,
VERSION,
"LGPL",
"GStreamer",
"http://gstreamer.net/"
)
]]>
<!-- example-end register.func -->
</programlisting>
<para>
Note that the information returned by the plugin_init() function will be
cached in a central registry. For this reason, it is important that the
same information is always returned by the function: for example, it
must not make element factories available based on runtime conditions.
      If an element can only work under certain conditions (for example, if
      the soundcard is not being used by some other process), this must be
      reflected by the element failing to enter the READY state when it is
      unavailable, rather than the plugin attempting to deny the existence
      of the element.
</para>
</sect1>
</chapter>

<!-- ############ chapter ############# -->
<chapter id="chapter-building-chainfn">
<title>The chain function</title>
<para>
The chain function is the function in which all data processing takes
place. In the case of a simple filter, <function>_chain ()</function>
    functions are mostly linear: for each incoming buffer,
    one buffer goes out, too. Below is a very simple implementation of
a chain function:
</para>
<programlisting><!-- example-begin chain.c a --><!--
#include "init.func"
#include "caps.func"
static gboolean
gst_my_filter_event (GstPad * pad, GstObject * parent, GstEvent * event)
{
return gst_pad_event_default (pad, parent, event);
}
--><!-- example-end chain.c a -->
<!-- example-begin chain.c b -->
static GstFlowReturn gst_my_filter_chain (GstPad *pad,
GstObject *parent,
GstBuffer *buf);
[..]
static void
gst_my_filter_init (GstMyFilter * filter)
{
[..]
/* configure chain function on the pad before adding
* the pad to the element */
gst_pad_set_chain_function (filter-&gt;sinkpad,
gst_my_filter_chain);
[..]
}
static GstFlowReturn
gst_my_filter_chain (GstPad *pad,
GstObject *parent,
GstBuffer *buf)
{
GstMyFilter *filter = GST_MY_FILTER (parent);
if (!filter->silent)
g_print ("Have data of size %" G_GSIZE_FORMAT" bytes!\n",
gst_buffer_get_size (buf));
return gst_pad_push (filter->srcpad, buf);
}
<!-- example-end chain.c b -->
<!-- example-begin chain.c c --><!--
static GstStateChangeReturn
gst_my_filter_change_state (GstElement * element, GstStateChange transition)
{
return GST_CALL_PARENT_WITH_DEFAULT (GST_ELEMENT_CLASS,
change_state, (element, transition), GST_STATE_CHANGE_SUCCESS);
}
#include "register.func"
--><!-- example-end chain.c c --></programlisting>
<para>
    Obviously, the above doesn't do anything very useful. Instead of printing
    that data has arrived, you would normally process the data there.
    Remember, however, that buffers are not always writable.
</para>
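  <para>
    If you want to modify the buffer data in place, first make sure you own
    a writable buffer. A sketch of how a chain function might do this with
    the standard <classname>GstBuffer</classname> mapping API:
  </para>
  <programlisting>
<![CDATA[
static GstFlowReturn
gst_my_filter_chain (GstPad *pad,
                     GstObject *parent,
                     GstBuffer *buf)
{
  GstMyFilter *filter = GST_MY_FILTER (parent);
  GstMapInfo map;

  /* returns a writable buffer, copying the data if it was shared */
  buf = gst_buffer_make_writable (buf);

  if (gst_buffer_map (buf, &map, GST_MAP_WRITE)) {
    /* ... process map.data / map.size in place here ... */
    gst_buffer_unmap (buf, &map);
  }

  return gst_pad_push (filter->srcpad, buf);
}
]]>
  </programlisting>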
<para>
In more advanced elements (the ones that do event processing), you may want
to additionally specify an event handling function, which will be called
when stream-events are sent (such as caps, end-of-stream, newsegment, tags, etc.).
</para>
<programlisting>
static void
gst_my_filter_init (GstMyFilter * filter)
{
[..]
gst_pad_set_event_function (filter-&gt;sinkpad,
gst_my_filter_sink_event);
[..]
}
<!-- example-begin chain2.c a --><!--
#include "init.func"
#include "caps.func"
#include "chain.func"
--><!-- example-end chain2.c a -->
<!-- example-begin chain.func a --><!--
static void
gst_my_filter_stop_processing (GstMyFilter * filter)
{
}
static GstBuffer *
gst_my_filter_process_data (GstMyFilter * filter, const GstBuffer * buf)
{
return NULL;
}
--><!-- example-end chain.func a -->
<!-- example-begin chain.func b -->
static gboolean
gst_my_filter_sink_event (GstPad *pad,
GstObject *parent,
GstEvent *event)
{
GstMyFilter *filter = GST_MY_FILTER (parent);
switch (GST_EVENT_TYPE (event)) {
case GST_EVENT_CAPS:
/* we should handle the format here */
break;
case GST_EVENT_EOS:
/* end-of-stream, we should close down all stream leftovers here */
gst_my_filter_stop_processing (filter);
break;
default:
break;
}
return gst_pad_event_default (pad, parent, event);
}
static GstFlowReturn
gst_my_filter_chain (GstPad *pad,
GstObject *parent,
GstBuffer *buf)
{
GstMyFilter *filter = GST_MY_FILTER (parent);
GstBuffer *outbuf;
outbuf = gst_my_filter_process_data (filter, buf);
gst_buffer_unref (buf);
if (!outbuf) {
/* something went wrong - signal an error */
GST_ELEMENT_ERROR (GST_ELEMENT (filter), STREAM, FAILED, (NULL), (NULL));
return GST_FLOW_ERROR;
}
return gst_pad_push (filter->srcpad, outbuf);
}
<!-- example-end chain.func b -->
<!-- example-begin chain2.c b --><!--
static GstStateChangeReturn
gst_my_filter_change_state (GstElement * element, GstStateChange transition)
{
return GST_CALL_PARENT_WITH_DEFAULT (GST_ELEMENT_CLASS,
change_state, (element, transition), GST_STATE_CHANGE_SUCCESS);
}
#include "register.func"
--><!-- example-end chain2.c b --></programlisting>
<para>
In some cases, it might be useful for an element to have control over the
input data rate, too. In that case, you probably want to write a so-called
<emphasis>loop-based</emphasis> element. Source elements (with only source
pads) can also be <emphasis>get-based</emphasis> elements. These concepts
will be explained in the advanced section of this guide, and in the section
that specifically discusses source pads.
</para>
</chapter>

<!-- ############ chapter ############# -->
<chapter id="chapter-building-eventfn">
<title>The event function</title>
<para>
The event function notifies you of special events that happen in
the datastream (such as caps, end-of-stream, newsegment, tags, etc.).
Events can travel both upstream and downstream, so you can receive them
on sink pads as well as source pads.
</para>
<para>
Below follows a very simple event function that we install on the sink
pad of our element.
</para>
<programlisting>
<![CDATA[
static gboolean gst_my_filter_sink_event (GstPad *pad,
GstObject *parent,
GstEvent *event);
[..]
static void
gst_my_filter_init (GstMyFilter * filter)
{
[..]
/* configure event function on the pad before adding
* the pad to the element */
gst_pad_set_event_function (filter->sinkpad,
gst_my_filter_sink_event);
[..]
}
static gboolean
gst_my_filter_sink_event (GstPad *pad,
GstObject *parent,
GstEvent *event)
{
gboolean ret;
GstMyFilter *filter = GST_MY_FILTER (parent);
switch (GST_EVENT_TYPE (event)) {
case GST_EVENT_CAPS:
/* we should handle the format here */
/* push the event downstream */
ret = gst_pad_push_event (filter->srcpad, event);
break;
case GST_EVENT_EOS:
/* end-of-stream, we should close down all stream leftovers here */
gst_my_filter_stop_processing (filter);
ret = gst_pad_event_default (pad, parent, event);
break;
default:
/* just call the default handler */
ret = gst_pad_event_default (pad, parent, event);
break;
}
return ret;
}
]]>
</programlisting>
<para>
It is a good idea to call the default event handler
<function>gst_pad_event_default ()</function> for unknown events.
Depending on the event type, the default handler will forward
the event or simply unref it. The CAPS event is by default not
forwarded so we need to do this in the event handler ourselves.
</para>
</chapter>

<!-- ############ chapter ############# -->
<chapter id="chapter-building-pads">
<title>Specifying the pads</title>
<para>
As explained before, pads are the port through which data goes in and out
of your element, and that makes them a very important item in the process
of element creation. In the boilerplate code, we have seen how static pad
templates take care of registering pad templates with the element class.
Here, we will see how to create actual elements, use an <function>_event
()</function>-function to configure for a particular format and how to
register functions to let data flow through the element.
</para>
<para>
In the element <function>_init ()</function> function, you create the pad
from the pad template that has been registered with the element class in
the <function>_class_init ()</function> function. After creating the pad,
you have to set a <function>_chain ()</function> function pointer that will
receive and process the input data on the sinkpad.
You can optionally also set an <function>_event ()</function> function
pointer and a <function>_query ()</function> function pointer.
Alternatively, pads can also operate in looping mode, which means that they
can pull data themselves. More on this topic later. After that, you have
to register the pad with the element. This happens like this:
</para>
<programlisting><!-- example-begin init.func a --><!--
#include "filter.h"
#include &lt;string.h&gt;
static GstStateChangeReturn
gst_my_filter_change_state (GstElement * element, GstStateChange transition);
G_DEFINE_TYPE (GstMyFilter, gst_my_filter, GST_TYPE_ELEMENT);
static void
gst_my_filter_class_init (gpointer klass)
{
GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
static GstStaticPadTemplate sink_template =
GST_STATIC_PAD_TEMPLATE (
"sink",
GST_PAD_SINK,
GST_PAD_ALWAYS,
GST_STATIC_CAPS ("ANY")
);
static GstStaticPadTemplate src_template =
GST_STATIC_PAD_TEMPLATE (
"src",
GST_PAD_SRC,
GST_PAD_ALWAYS,
GST_STATIC_CAPS ("ANY")
);
gst_element_class_set_static_metadata (element_class,
"An example plugin",
"Example/FirstExample",
"Shows the basic structure of a plugin",
"your name <your.name@your.isp>");
gst_element_class_add_pad_template (element_class,
gst_static_pad_template_get (&src_template));
gst_element_class_add_pad_template (element_class,
gst_static_pad_template_get (&sink_template));
}
static void
gst_my_filter_class_init (GstMyFilterClass * klass)
{
GST_ELEMENT_CLASS (klass)->change_state = gst_my_filter_change_state;
}
--><!-- example-end init.func a -->
<!-- example-begin init.func c --><!--
static GstFlowReturn gst_my_filter_chain (GstPad *pad,
GstObject *parent,
GstBuffer *buf);
static gboolean gst_my_filter_sink_event (GstPad *pad,
GstObject *parent,
GstEvent *event);
static gboolean gst_my_filter_src_query (GstPad *pad,
GstObject *parent,
GstQuery *query);
static gboolean gst_my_filter_sink_query (GstPad *pad,
GstObject *parent,
GstQuery *query);
--><!-- example-end init.func c -->
<!-- example-begin init.func d -->
static void
gst_my_filter_init (GstMyFilter *filter)
{
/* pad through which data comes in to the element */
filter-&gt;sinkpad = gst_pad_new_from_static_template (
&amp;sink_template, "sink");
/* pads are configured here with gst_pad_set_*_function () */
<!-- example-end init.func d -->
<!-- example-begin init.func e --><!--
gst_pad_set_chain_function (filter-&gt;sinkpad, gst_my_filter_chain);
gst_pad_set_event_function (filter-&gt;sinkpad, gst_my_filter_sink_event);
gst_pad_set_query_function (filter-&gt;sinkpad, gst_my_filter_sink_query);
--><!-- example-end init.func e -->
<!-- example-begin init.func f -->
gst_element_add_pad (GST_ELEMENT (filter), filter-&gt;sinkpad);
/* pad through which data goes out of the element */
filter-&gt;srcpad = gst_pad_new_from_static_template (
&amp;src_template, "src");
/* pads are configured here with gst_pad_set_*_function () */
<!-- example-end init.func f -->
<!-- example-begin init.func g --><!--
gst_pad_set_query_function (filter-&gt;srcpad, gst_my_filter_src_query);
--><!-- example-end init.func g -->
<!-- example-begin init.func h -->
gst_element_add_pad (GST_ELEMENT (filter), filter-&gt;srcpad);
/* properties initial value */
filter->silent = FALSE;
}
<!-- example-end init.func h --></programlisting>
<!-- example-begin pads.c --><!--
#include "init.func"
static gboolean
gst_my_filter_event (GstPad * pad, GstObject * parent, GstEvent * event)
{
return gst_pad_event_default (pad, parent, event);
}
static GstFlowReturn
gst_my_filter_chain (GstPad * pad, GstObject * parent, GstBuffer * buf)
{
return gst_pad_push (GST_MY_FILTER (parent)->srcpad, buf);
}
static GstStateChangeReturn
gst_my_filter_change_state (GstElement * element, GstStateChange transition)
{
return GST_CALL_PARENT_WITH_DEFAULT (GST_ELEMENT_CLASS,
change_state, (element, transition), GST_STATE_CHANGE_SUCCESS);
}
#include "register.func"
--><!-- example-end pads.c -->
</chapter>

<!-- ############ chapter ############# -->
<chapter id="chapter-building-args" xreflabel="Adding Properties">
<title>Adding Properties</title>
<para>
The primary and most important way of controlling how an element behaves,
is through GObject properties. GObject properties are defined in the
<function>_class_init ()</function> function. The element optionally
implements a <function>_get_property ()</function> and a
<function>_set_property ()</function> function. These functions will be
notified if an application changes or requests the value of a property,
and can then fill in the value or take action required for that property
to change value internally.
</para>
<para>
You probably also want to keep an instance variable around
with the currently configured value of the property that you use in the
get and set functions.
Note that <classname>GObject</classname> will not automatically set your
instance variable to the default value, you will have to do that in the
<function>_init ()</function> function of your element.
</para>
<programlisting><!-- example-begin properties.c a --><!--
#include "filter.h"
G_DEFINE_TYPE (GstMyFilter, gst_my_filter, GST_TYPE_ELEMENT);
static void
gst_my_filter_class_init (gpointer klass)
{
}
static void
gst_my_filter_init (GstMyFilter * filter)
{
}
--><!-- example-end properties.c a -->
<!-- example-begin properties.c b -->
/* properties */
enum {
PROP_0,
PROP_SILENT
/* FILL ME */
};
static void gst_my_filter_set_property (GObject *object,
guint prop_id,
const GValue *value,
GParamSpec *pspec);
static void gst_my_filter_get_property (GObject *object,
guint prop_id,
GValue *value,
GParamSpec *pspec);
static void
gst_my_filter_class_init (GstMyFilterClass *klass)
{
GObjectClass *object_class = G_OBJECT_CLASS (klass);
/* define virtual function pointers */
object_class->set_property = gst_my_filter_set_property;
object_class->get_property = gst_my_filter_get_property;
/* define properties */
g_object_class_install_property (object_class, PROP_SILENT,
g_param_spec_boolean ("silent", "Silent",
"Whether to be very verbose or not",
FALSE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));
}
static void
gst_my_filter_set_property (GObject *object,
guint prop_id,
const GValue *value,
GParamSpec *pspec)
{
GstMyFilter *filter = GST_MY_FILTER (object);
switch (prop_id) {
case PROP_SILENT:
filter->silent = g_value_get_boolean (value);
g_print ("Silent argument was changed to %s\n",
filter->silent ? "true" : "false");
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
break;
}
}
static void
gst_my_filter_get_property (GObject *object,
guint prop_id,
GValue *value,
GParamSpec *pspec)
{
GstMyFilter *filter = GST_MY_FILTER (object);
switch (prop_id) {
case PROP_SILENT:
g_value_set_boolean (value, filter->silent);
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
break;
}
}
<!-- example-end properties.c b -->
<!-- example-begin properties.c c --><!--
#include "register.func"
--><!-- example-end properties.c c --></programlisting>
<para>
The above is a very simple example of how properties are used. Graphical
applications will use these properties and will display a
user-controllable widget with which these properties can be changed.
    This means that, for the property to be as user-friendly
    as possible, you should be as exact as possible in the definition of the
property. Not only in defining ranges in between which valid properties
can be located (for integers, floats, etc.), but also in using very
descriptive (better yet: internationalized) strings in the definition of
the property, and if possible using enums and flags instead of integers.
The GObject documentation describes these in a very complete way, but
below, we'll give a short example of where this is useful. Note that using
integers here would probably completely confuse the user, because they
    make no sense in this context. The example is taken from videotestsrc.
</para>
<programlisting>
typedef enum {
GST_VIDEOTESTSRC_SMPTE,
GST_VIDEOTESTSRC_SNOW,
GST_VIDEOTESTSRC_BLACK
} GstVideotestsrcPattern;
[..]
#define GST_TYPE_VIDEOTESTSRC_PATTERN (gst_videotestsrc_pattern_get_type ())
static GType
gst_videotestsrc_pattern_get_type (void)
{
static GType videotestsrc_pattern_type = 0;
if (!videotestsrc_pattern_type) {
static GEnumValue pattern_types[] = {
{ GST_VIDEOTESTSRC_SMPTE, "SMPTE 100% color bars", "smpte" },
{ GST_VIDEOTESTSRC_SNOW, "Random (television snow)", "snow" },
{ GST_VIDEOTESTSRC_BLACK, "0% Black", "black" },
{ 0, NULL, NULL },
};
videotestsrc_pattern_type =
g_enum_register_static ("GstVideotestsrcPattern",
pattern_types);
}
return videotestsrc_pattern_type;
}
[..]
static void
gst_videotestsrc_class_init (GstvideotestsrcClass *klass)
{
[..]
g_object_class_install_property (G_OBJECT_CLASS (klass), PROP_PATTERN,
g_param_spec_enum ("pattern", "Pattern",
"Type of test pattern to generate",
GST_TYPE_VIDEOTESTSRC_PATTERN, GST_VIDEOTESTSRC_SMPTE,
G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));
[..]
}
</programlisting>
</chapter>

<!-- ############ chapter ############# -->
<chapter id="chapter-building-queryfn">
<title>The query function</title>
<para>
Through the query function, your element will receive queries that it
has to reply to. These are queries like position, duration but also
about the supported formats and scheduling modes your element supports.
Queries can travel both upstream and downstream, so you can receive them
on sink pads as well as source pads.
</para>
<para>
Below follows a very simple query function that we install on the source
pad of our element.
</para>
<programlisting>
<![CDATA[
static gboolean gst_my_filter_src_query (GstPad *pad,
GstObject *parent,
GstQuery *query);
[..]
static void
gst_my_filter_init (GstMyFilter * filter)
{
[..]
/* configure event function on the pad before adding
* the pad to the element */
gst_pad_set_query_function (filter->srcpad,
gst_my_filter_src_query);
[..]
}
static gboolean
gst_my_filter_src_query (GstPad *pad,
GstObject *parent,
GstQuery *query)
{
gboolean ret;
GstMyFilter *filter = GST_MY_FILTER (parent);
switch (GST_QUERY_TYPE (query)) {
case GST_QUERY_POSITION:
/* we should report the current position */
[...]
break;
case GST_QUERY_DURATION:
/* we should report the duration here */
[...]
break;
case GST_QUERY_CAPS:
/* we should report the supported caps here */
[...]
break;
default:
/* just call the default handler */
ret = gst_pad_query_default (pad, parent, query);
break;
}
return ret;
}
]]>
</programlisting>
<para>
It is a good idea to call the default query handler
<function>gst_pad_query_default ()</function> for unknown queries.
Depending on the query type, the default handler will forward
the query or simply unref it.
</para>
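  <para>
    To actually answer a query, you parse it and set a reply on the same
    query object. As a sketch, the position case above might be filled in
    like this (using <symbol>filter-&gt;offset</symbol> as a byte counter is
    an assumption made for this example):
  </para>
  <programlisting>
<![CDATA[
    case GST_QUERY_POSITION: {
      GstFormat format;

      gst_query_parse_position (query, &format, NULL);
      if (format == GST_FORMAT_BYTES) {
        /* filter->offset is assumed to track the bytes pushed so far */
        gst_query_set_position (query, GST_FORMAT_BYTES, filter->offset);
        ret = TRUE;
      } else {
        /* fall back for formats we cannot answer ourselves */
        ret = gst_pad_query_default (pad, parent, query);
      }
      break;
    }
]]>
  </programlisting>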
</chapter>

<!-- ############ chapter ############# -->
<chapter id="chapter-building-signals">
<title>Signals</title>
<para>
GObject signals can be used to notify applications of events specific
to this object. Note, however, that the application needs to be aware
of signals and their meaning, so if you're looking for a generic way
for application-element interaction, signals are probably not what
you're looking for. In many cases, however, signals can be very useful.
See the <ulink type="http"
url="http://library.gnome.org/devel/gobject/stable/">GObject
documentation</ulink> for all internals about signals.
</para>
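  <para>
    As a brief, hedged sketch (the signal name and its parameter are
    invented for illustration, not part of the example filter), a signal is
    registered in the <function>_class_init ()</function> function with
    <function>g_signal_new ()</function> and emitted later with
    <function>g_signal_emit ()</function>:
  </para>
  <programlisting>
<![CDATA[
static guint gst_my_filter_signals[1] = { 0 };

static void
gst_my_filter_class_init (GstMyFilterClass * klass)
{
  /* hypothetical signal emitted whenever a buffer has been processed,
   * passing the current byte offset to the application */
  gst_my_filter_signals[0] =
      g_signal_new ("buffer-processed", G_TYPE_FROM_CLASS (klass),
          G_SIGNAL_RUN_LAST, 0, NULL, NULL, NULL,
          G_TYPE_NONE, 1, G_TYPE_UINT64);
}

/* ... later, e.g. in the chain function:
 * g_signal_emit (filter, gst_my_filter_signals[0], 0, offset);
 */
]]>
  </programlisting>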
</chapter>

<chapter id="chapter-statemanage-states">
<title>What are states?</title>
<para>
A state describes whether the element instance is initialized, whether it
is ready to transfer data and whether it is currently handling data. There
are four states defined in &GStreamer;:
</para>
<itemizedlist>
<listitem>
<para>
<symbol>GST_STATE_NULL</symbol>
</para>
</listitem>
<listitem>
<para>
<symbol>GST_STATE_READY</symbol>
</para>
</listitem>
<listitem>
<para>
<symbol>GST_STATE_PAUSED</symbol>
</para>
</listitem>
<listitem>
<para>
<symbol>GST_STATE_PLAYING</symbol>
</para>
</listitem>
</itemizedlist>
<para>
which will from now on be referred to simply as <quote>NULL</quote>,
<quote>READY</quote>, <quote>PAUSED</quote> and <quote>PLAYING</quote>.
</para>
<para>
<symbol>GST_STATE_NULL</symbol> is the default state of an element. In this state, it
has not allocated any runtime resources, it has not loaded any runtime
  libraries, and it obviously cannot handle data.
</para>
<para>
<symbol>GST_STATE_READY</symbol> is the next state that an element can be in. In the
READY state, an element has all default resources (runtime-libraries,
runtime-memory) allocated. However, it has not yet allocated or defined
anything that is stream-specific. When going from NULL to READY state
(<symbol>GST_STATE_CHANGE_NULL_TO_READY</symbol>), an element should
allocate any non-stream-specific resources and should load runtime-loadable
libraries (if any). When going the other way around (from READY to NULL,
<symbol>GST_STATE_CHANGE_READY_TO_NULL</symbol>), an element should unload
these libraries and free all allocated resources. Examples of such
resources are hardware devices. Note that files are generally streams,
and these should thus be considered as stream-specific resources; therefore,
they should <emphasis>not</emphasis> be allocated in this state.
</para>
<para>
<symbol>GST_STATE_PAUSED</symbol> is the state in which an element is
ready to accept and handle data. For most elements this state is the same
  as PLAYING. The only exception to this rule is sink elements. Sink
  elements accept a single buffer of data and then block. At this
point the pipeline is 'prerolled' and ready to render data immediately.
</para>
<para>
<symbol>GST_STATE_PLAYING</symbol> is the highest state that an element
can be in. For most elements this state is exactly the same as PAUSED,
they accept and process events and buffers with data. Only sink elements
need to differentiate between PAUSED and PLAYING state. In PLAYING state,
sink elements actually render incoming data, e.g. output audio to a sound
card or render video pictures to an image sink.
</para>
<sect1 id="section-statemanage-filters">
<title>Managing filter state</title>
<para>
If at all possible, your element should derive from one of the new base
classes (<xref linkend="chapter-other-base"/>). There are ready-made
general purpose base classes for different types of sources, sinks and
filter/transformation elements. In addition to those, specialised base
classes exist for audio and video elements and others.
</para>
<para>
If you use a base class, you will rarely have to handle state changes
yourself. All you have to do is override the base class's start() and
stop() virtual functions (might be called differently depending on the
base class) and the base class will take care of everything for you.
</para>
<para>
If, however, you do not derive from a ready-made base class, but from
GstElement or some other class not built on top of a base class, you
will most likely have to implement your own state change function to
be notified of state changes. This is definitely necessary if your
plugin is a demuxer or a muxer, as there are no base classes for
muxers or demuxers yet.
</para>
<para>
An element can be notified of state changes through a virtual function
pointer. Inside this function, the element can initialize any sort of
specific data needed by the element, and it can optionally fail to
go from one state to another.
</para>
<para>
Do not g_assert for unhandled state changes; this is taken care of by
the GstElement base class.
</para>
<programlisting>
static GstStateChangeReturn
gst_my_filter_change_state (GstElement *element, GstStateChange transition);
static void
gst_my_filter_class_init (GstMyFilterClass *klass)
{
GstElementClass *element_class = GST_ELEMENT_CLASS (klass);
element_class->change_state = gst_my_filter_change_state;
}
<!-- example-begin state.c a --><!--
#include "init.func"
#include "caps.func"
#include "chain.func"
#include "state.func"
--><!-- example-end state.c a -->
<!-- example-begin state.func a --><!--
static gboolean
gst_my_filter_allocate_memory (GstMyFilter * filter)
{
return TRUE;
}
static void
gst_my_filter_free_memory (GstMyFilter * filter)
{
}
--><!-- example-end state.func a -->
<!-- example-begin state.func b -->
static GstStateChangeReturn
gst_my_filter_change_state (GstElement *element, GstStateChange transition)
{
GstStateChangeReturn ret = GST_STATE_CHANGE_SUCCESS;
GstMyFilter *filter = GST_MY_FILTER (element);
switch (transition) {
case GST_STATE_CHANGE_NULL_TO_READY:
if (!gst_my_filter_allocate_memory (filter))
return GST_STATE_CHANGE_FAILURE;
break;
default:
break;
}
ret = GST_ELEMENT_CLASS (parent_class)-&gt;change_state (element, transition);
if (ret == GST_STATE_CHANGE_FAILURE)
return ret;
switch (transition) {
case GST_STATE_CHANGE_READY_TO_NULL:
gst_my_filter_free_memory (filter);
break;
default:
break;
}
return ret;
}
<!-- example-end state.func b -->
<!-- example-begin state.c b --><!--
#include "register.func"
--><!-- example-end state.c b --></programlisting>
<para>
Note that upwards (NULL=&gt;READY, READY=&gt;PAUSED, PAUSED=&gt;PLAYING)
and downwards (PLAYING=&gt;PAUSED, PAUSED=&gt;READY, READY=&gt;NULL) state
changes are handled in two separate blocks with the downwards state change
handled only after we have chained up to the parent class's state
change function. This is necessary in order to safely handle concurrent
access by multiple threads.
</para>
<para>
The reason for this is that in the case of downwards state changes
you don't want to destroy allocated resources while your plugin's
chain function (for example) is still accessing those resources in
another thread. Whether your chain function might be running or not
depends on the state of your plugin's pads, and the state of those
pads is closely linked to the state of the element. Pad states are
handled in the GstElement class's state change function, including
proper locking, that's why it is essential to chain up before
destroying allocated resources.
</para>
</sect1>
</chapter>
<!-- ############ chapter ############# -->
<chapter id="chapter-building-testapp">
<title>Building a Test Application</title>
<para>
Often, you will want to test your newly written plugin in as small
a setting as possible. Usually, <filename>gst-launch-1.0</filename> is a
good first step for testing a plugin. If you have not installed your
plugin in a directory that GStreamer searches, then you will need to
set the plugin path. Either set GST_PLUGIN_PATH to the directory
containing your plugin, or use the command-line option --gst-plugin-path.
If you based your plugin on the gst-plugin template, then this
will look something like
<command>
gst-launch-1.0 --gst-plugin-path=$HOME/gst-template/gst-plugin/src/.libs TESTPIPELINE
</command>
However, you will often need more
testing features than gst-launch-1.0 can provide, such as seeking, events
and interactivity. Writing your own small testing program is the
easiest way to accomplish this. This section briefly explains how to
do that. For a complete application development guide, see the
<ulink type="http" url="../../manual/html/index.html">Application Development
Manual</ulink>.
</para>
<para>
At the start, you need to initialize the &GStreamer; core library by
calling <function>gst_init ()</function>. You can alternatively call
<function>gst_init_get_option_group ()</function>, which will return
a pointer to GOptionGroup. You can then use GOption to handle the
initialization, and this will finish the &GStreamer; initialization.
</para>
<para>
You can create elements using <function>gst_element_factory_make ()</function>,
where the first argument is the element type that you want to create,
and the second argument is a free-form name. The example at the end uses
a simple filesource - decoder - soundcard output pipeline, but you can
use specific debugging elements if that's necessary. For example, an
<classname>identity</classname> element can be used in the middle of
the pipeline to act as a data-to-application transmitter. This can be
used to check the data for misbehaviours or correctness in your test
application. Also, you can use a <classname>fakesink</classname>
element at the end of the pipeline to dump your data to stdout
(in order to do this, set the <function>dump</function> property to
TRUE). Lastly, you can use valgrind to check for memory errors.
</para>
<para>
During linking, your test application can use filtered caps
as a way to drive a specific type of data to or from your element. This
is a very simple and effective way of checking multiple types of input
and output in your element.
</para>
<para>
Note that while the pipeline is running, you should listen for at least the
<quote>error</quote> and <quote>eos</quote> messages on the bus
and/or your plugin/element to check that these are handled correctly. Also,
you should add events into the pipeline and make sure your plugin handles
these correctly (with respect to clocking, internal caching, etc.).
</para>
<para>
Never forget to clean up memory in your plugin or your test application.
When going to the NULL state, your element should clean up allocated
memory and caches. Also, it should release any references held to
possible support libraries. Your application should <function>unref ()</function>
the pipeline and make sure it doesn't crash.
</para>
<programlisting><!-- example-begin test.c -->
#include &lt;gst/gst.h&gt;
static gboolean
bus_call (GstBus *bus,
GstMessage *msg,
gpointer data)
{
GMainLoop *loop = data;
switch (GST_MESSAGE_TYPE (msg)) {
case GST_MESSAGE_EOS:
g_print ("End-of-stream\n");
g_main_loop_quit (loop);
break;
case GST_MESSAGE_ERROR: {
gchar *debug = NULL;
GError *err = NULL;
gst_message_parse_error (msg, &amp;err, &amp;debug);
g_print ("Error: %s\n", err->message);
g_error_free (err);
if (debug) {
g_print ("Debug details: %s\n", debug);
g_free (debug);
}
g_main_loop_quit (loop);
break;
}
default:
break;
}
return TRUE;
}
gint
main (gint argc,
gchar *argv[])
{
GstStateChangeReturn ret;
GstElement *pipeline, *filesrc, *decoder, *filter, *sink;
GstElement *convert1, *convert2, *resample;
GMainLoop *loop;
GstBus *bus;
guint watch_id;
/* initialization */
gst_init (&amp;argc, &amp;argv);
loop = g_main_loop_new (NULL, FALSE);
if (argc != 2) {
g_print ("Usage: %s &lt;mp3 filename&gt;\n", argv[0]);
    return -1;
}
/* create elements */
pipeline = gst_pipeline_new ("my_pipeline");
/* watch for messages on the pipeline's bus (note that this will only
* work like this when a GLib main loop is running) */
bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
watch_id = gst_bus_add_watch (bus, bus_call, loop);
gst_object_unref (bus);
filesrc = gst_element_factory_make ("filesrc", "my_filesource");
decoder = gst_element_factory_make ("mad", "my_decoder");
/* putting an audioconvert element here to convert the output of the
* decoder into a format that my_filter can handle (we are assuming it
* will handle any sample rate here though) */
convert1 = gst_element_factory_make ("audioconvert", "audioconvert1");
/* use "identity" here for a filter that does nothing */
filter = gst_element_factory_make ("my_filter", "my_filter");
/* there should always be audioconvert and audioresample elements before
* the audio sink, since the capabilities of the audio sink usually vary
* depending on the environment (output used, sound card, driver etc.) */
convert2 = gst_element_factory_make ("audioconvert", "audioconvert2");
resample = gst_element_factory_make ("audioresample", "audioresample");
sink = gst_element_factory_make ("pulsesink", "audiosink");
if (!sink || !decoder) {
g_print ("Decoder or output could not be found - check your install\n");
return -1;
} else if (!convert1 || !convert2 || !resample) {
g_print ("Could not create audioconvert or audioresample element, "
"check your installation\n");
return -1;
} else if (!filter) {
g_print ("Your self-written filter could not be found. Make sure it "
"is installed correctly in $(libdir)/gstreamer-1.0/ or "
"~/.gstreamer-1.0/plugins/ and that gst-inspect-1.0 lists it. "
"If it doesn't, check with 'GST_DEBUG=*:2 gst-inspect-1.0' for "
"the reason why it is not being loaded.");
return -1;
}
g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
gst_bin_add_many (GST_BIN (pipeline), filesrc, decoder, convert1, filter,
convert2, resample, sink, NULL);
/* link everything together */
if (!gst_element_link_many (filesrc, decoder, convert1, filter, convert2,
resample, sink, NULL)) {
g_print ("Failed to link one or more elements!\n");
return -1;
}
/* run */
ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
if (ret == GST_STATE_CHANGE_FAILURE) {
GstMessage *msg;
g_print ("Failed to start up pipeline!\n");
/* check if there is an error message with details on the bus */
msg = gst_bus_poll (bus, GST_MESSAGE_ERROR, 0);
if (msg) {
GError *err = NULL;
gst_message_parse_error (msg, &amp;err, NULL);
g_print ("ERROR: %s\n", err-&gt;message);
g_error_free (err);
gst_message_unref (msg);
}
return -1;
}
g_main_loop_run (loop);
/* clean up */
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
g_source_remove (watch_id);
g_main_loop_unref (loop);
return 0;
}
<!-- example-end test.c --></programlisting>
</chapter>
<!-- ############ chapter ############# -->
<chapter id="chapter-intro-basics" xreflabel="Foundations">
<title>Foundations</title>
<para><!-- synchronize with AppDevMan -->
This chapter of the guide introduces the basic concepts of &GStreamer;.
Understanding these concepts will help you grok the issues involved in
extending &GStreamer;. Many of these concepts are explained in greater
detail in the &GstAppDevMan;; the basic concepts presented here serve mainly
to refresh your memory.
</para>
<!-- ############ sect1 ############# -->
<sect1 id="section-basics-elements" xreflabel="Elements and Plugins">
<title>Elements and Plugins</title>
<para>
Elements are at the core of &GStreamer;. In the context of plugin
development, an <emphasis>element</emphasis> is an object derived from the
<ulink type="http" url="../../gstreamer/html/GstElement.html"><classname>
GstElement</classname></ulink> class. Elements provide some sort of
functionality when linked with other elements: For example, a source
element provides data to a stream, and a filter element acts on the data
in a stream. Without elements, &GStreamer; is just a bunch of conceptual
pipe fittings with nothing to link. A large number of elements ship
with &GStreamer;, but extra elements can also be written.
</para>
<para>
Just writing a new element is not entirely enough, however: You will need
to encapsulate your element in a <emphasis>plugin</emphasis> to enable
&GStreamer; to use it. A plugin is essentially a loadable block of code,
usually called a shared object file or a dynamically linked library. A
single plugin may contain the implementation of several elements, or just
a single one. For simplicity, this guide concentrates primarily on plugins
containing one element.
</para>
<para>
A <emphasis>filter</emphasis> is an important type of element that
processes a stream of data. Producers and consumers of data are called
<emphasis>source</emphasis> and <emphasis>sink</emphasis> elements,
respectively. <emphasis>Bin</emphasis> elements contain other elements.
One type of bin is responsible for synchronizing the elements it
contains so that data flows smoothly. Another type of bin, the
<emphasis>autoplugger</emphasis> element, automatically adds other
elements to the bin and links them together so that they act as a
filter between two arbitrary stream types.
</para>
<para>
The plugin mechanism is used everywhere in &GStreamer;, even if only the
standard packages are being used. A few very basic functions reside in the
core library, and all others are implemented in plugins. A plugin registry
is used to store the details of the plugins in a binary registry file.
This way, a program using &GStreamer; does not have to load all plugins to
determine which are needed. Plugins are only loaded when their provided
elements are requested.
</para>
<para>
See the &GstLibRef; for the current implementation details of <ulink
type="http"
url="../../gstreamer/html/GstElement.html"><classname>GstElement</classname></ulink>
and <ulink type="http"
url="../../gstreamer/html/GstPlugin.html"><classname>GstPlugin</classname></ulink>.
</para>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-basics-pads" xreflabel="Pads">
<title>Pads</title>
<para>
<emphasis>Pads</emphasis> are used to negotiate links and data flow
between elements in &GStreamer;. A pad can be viewed as a
<quote>place</quote> or <quote>port</quote> on an element where
links may be made with other elements, and through which data can
flow to or from those elements. Pads have specific data handling
capabilities: A pad can restrict the type of data that flows
through it. Links are only allowed between two pads when the
allowed data types of the two pads are compatible.
</para>
<para>
An analogy may be helpful here. A pad is similar to a plug or jack on a
physical device. Consider, for example, a home theater system consisting
of an amplifier, a DVD player, and a (silent) video projector. Linking
the DVD player to the amplifier is allowed because both devices have audio
jacks, and linking the projector to the DVD player is allowed because
both devices have compatible video jacks. Links between the
projector and the amplifier may not be made because the projector and
amplifier have different types of jacks. Pads in &GStreamer; serve the
same purpose as the jacks in the home theater system.
</para>
<para>
For the most part, all data in &GStreamer; flows one way through a link
between elements. Data flows out of one element through one or more
<emphasis>source pads</emphasis>, and elements accept incoming data through
one or more <emphasis>sink pads</emphasis>. Source and sink elements have
only source and sink pads, respectively.
</para>
<para>
See the &GstLibRef; for the current implementation details of a <ulink
type="http"
url="../../gstreamer/html/GstPad.html"><classname>GstPad</classname></ulink>.
</para>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-basics-data" xreflabel="Data, Buffers and Events">
<title>GstMiniObject, Buffers and Events</title>
<para>
All streams of data in &GStreamer; are chopped up into chunks that are
passed from a source pad on one element to a sink pad on another element.
<emphasis>GstMiniObject</emphasis> is the structure used to hold these
chunks of data.
</para>
<para>
GstMiniObject contains the following important types:
<itemizedlist>
<listitem>
<para>
An exact type indicating what type of data (event, buffer, ...)
this GstMiniObject is.
</para>
</listitem>
<listitem>
<para>
A reference count indicating the number of elements currently
holding a reference to the miniobject. When the reference count
falls to zero, the miniobject will be disposed and its memory
freed or returned to its underlying allocator (see below for more details).
</para>
</listitem>
</itemizedlist>
</para>
<para>
For data transport, there are two types of GstMiniObject defined:
events (control) and buffers (content).
</para>
<para>
Buffers may contain any sort of data that the two linked pads
know how to handle. Normally, a buffer contains a chunk of some sort of
audio or video data that flows from one element to another.
</para>
<para>
Buffers also contain metadata describing the buffer's contents. Some of
the important types of metadata are:
<itemizedlist>
<listitem>
<para>
Pointers to one or more GstMemory objects. GstMemory objects are
refcounted objects that encapsulate a region of memory.
</para>
</listitem>
<listitem>
<para>
A timestamp indicating the preferred display timestamp of the
content in the buffer.
</para>
</listitem>
</itemizedlist>
</para>
<para>
Events
contain information on the state of the stream flowing between the two
linked pads. Events will only be sent if the element explicitly supports
them, else the core will (try to) handle the events automatically. Events
are used to indicate, for example, a media type, the end of a
media stream or that the cache should be flushed.
</para>
<para>
Events may contain several of the following items:
<itemizedlist>
<listitem>
<para>
A subtype indicating the type of the contained event.
</para>
</listitem>
<listitem>
<para>
The other contents of the event depend on the specific event type.
</para>
</listitem>
</itemizedlist>
</para>
<para>
Events will be discussed extensively in <xref linkend="chapter-advanced-events"/>.
Until then, the only event that will be used is the <emphasis>EOS</emphasis>
event, which is used to indicate the end-of-stream (usually end-of-file).
</para>
<para>
See the &GstLibRef; for the current implementation details of a <ulink
type="http"
url="../../gstreamer/html/gstreamer-GstMiniObject.html"><classname>GstMiniObject</classname></ulink>, <ulink type="http"
url="../../gstreamer/html/GstBuffer.html"><classname>GstBuffer</classname></ulink> and <ulink type="http"
url="../../gstreamer/html/GstEvent.html"><classname>GstEvent</classname></ulink>.
</para>
<sect2 id="sect2-buffer-allocation" xreflabel="Buffer Allocation">
<title>Buffer Allocation</title>
<para>
Buffers are able to store chunks of memory of several different
types. The most generic type of buffer contains memory allocated
by malloc(). Such buffers, although convenient, are not always
very fast, since data often needs to be specifically copied into
the buffer.
</para>
<para>
Many specialized elements create buffers that point to special
memory. For example, the filesrc element usually
maps a file into the address space of the application (using mmap()),
and creates buffers that point into that address range. These
buffers created by filesrc act exactly like generic buffers, except
that they are read-only. The buffer freeing code automatically
determines the correct method of freeing the underlying memory.
Downstream elements that receive these kinds of buffers do not
need to do anything special to handle or unreference them.
</para>
<para>
Another way an element might get specialized buffers is to
request them from a downstream peer through a GstBufferPool or
GstAllocator. Elements can request a GstBufferPool or GstAllocator
from the downstream peer element. If downstream is able to provide
one of these objects, upstream can use it to allocate buffers.
See more in <xref linkend="chapter-allocation"/>.
</para>
<para>
Many sink elements have accelerated methods for copying data
to hardware, or have direct access to hardware. It is common
for these elements to be able to create a GstBufferPool or
GstAllocator for their upstream peers. One such example is
ximagesink. It creates buffers that contain XImages. Thus,
when an upstream peer copies data into the buffer, it is copying
directly into the XImage, enabling ximagesink to draw the
image directly to the screen instead of having to copy data
into an XImage first.
</para>
<para>
Filter elements often have the opportunity to either work on
a buffer in-place, or work while copying from a source buffer
to a destination buffer. It is optimal to implement both
algorithms, since the &GStreamer; framework can choose the
fastest algorithm as appropriate. Naturally, this only makes
sense for strict filters -- elements that have exactly the
same format on source and sink pads.
</para>
</sect2>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-basics-types" xreflabel="Types and Properties">
<title>Media types and Properties</title>
<para>
&GStreamer; uses a type system to ensure that the data passed between
elements is in a recognized format. The type system is also important
for ensuring that the parameters required to fully specify a format match
up correctly when linking pads between elements. Each link that is
made between elements has a specified type and optionally a set of
properties. See more about caps negotiation in
<xref linkend="chapter-negotiation"/>.
</para>
<!-- ############ sect2 ############# -->
<sect2 id="sect2-types-basictypes" xreflabel="Basic Types">
<title>The Basic Types</title>
<para>
&GStreamer; already supports many basic media types. Following is a
table of a few of the basic types used for buffers in
&GStreamer;. The table contains the name ("media type") and a
description of the type, the properties associated with the type, and
the meaning of each property. A full list of supported types is
included in <xref linkend="section-types-definitions"/>.
</para>
<table frame="all" id="table-basictypes" xreflabel="Table of Example Types">
<title>Table of Example Types</title>
<tgroup cols="6" align="left" colsep="1" rowsep="1">
<thead>
<row>
<entry>Media Type</entry>
<entry>Description</entry>
<entry>Property</entry>
<entry>Property Type</entry>
<entry>Property Values</entry>
<entry>Property Description</entry>
</row>
</thead>
<tbody valign="top">
<!-- ############ type ############# -->
<row>
<entry morerows="1">audio/*</entry>
<entry morerows="1">
<emphasis>All audio types</emphasis>
</entry>
<entry>rate</entry>
<entry>integer</entry>
<entry>greater than 0</entry>
<entry>
The sample rate of the data, in samples (per channel) per second.
</entry>
</row>
<row>
<entry>channels</entry>
<entry>integer</entry>
<entry>greater than 0</entry>
<entry>
The number of channels of audio data.
</entry>
</row>
<!-- ############ type ############# -->
<row>
<entry>audio/x-raw</entry>
<entry>
Unstructured and uncompressed raw integer audio data.
</entry>
<entry>format</entry>
<entry>string</entry>
<entry>
S8 U8 S16LE S16BE U16LE U16BE S24_32LE S24_32BE U24_32LE U24_32BE S32LE S32BE U32LE U32BE
S24LE S24BE U24LE U24BE S20LE S20BE U20LE U20BE S18LE S18BE U18LE U18BE F32LE F32BE F64LE F64BE
</entry>
<entry>
The format of the sample data.
</entry>
</row>
<!-- ############ type ############# -->
<row>
<entry morerows="3">audio/mpeg</entry>
<entry morerows="3">
Audio data compressed using the MPEG audio encoding scheme.
</entry>
<entry>mpegversion</entry>
<entry>integer</entry>
<entry>1, 2 or 4</entry>
<entry>
The MPEG-version used for encoding the data. The value 1 refers
to MPEG-1, -2 and -2.5 layer 1, 2 or 3. The values 2 and 4 refer
to the MPEG-AAC audio encoding schemes.
</entry>
</row>
<row>
<entry>framed</entry>
<entry>boolean</entry>
<entry>0 or 1</entry>
<entry>
A true value indicates that each buffer contains exactly one
frame. A false value indicates that frames and buffers do not
necessarily match up.
</entry>
</row>
<row>
<entry>layer</entry>
<entry>integer</entry>
<entry>1, 2, or 3</entry>
<entry>
The compression scheme layer used to compress the data
<emphasis>(only if mpegversion=1)</emphasis>.
</entry>
</row>
<row>
<entry>bitrate</entry>
<entry>integer</entry>
<entry>greater than 0</entry>
<entry>
The bitrate, in bits per second. For VBR (variable bitrate)
MPEG data, this is the average bitrate.
</entry>
</row>
<!-- ############ type ############# -->
<row>
<entry>audio/x-vorbis</entry>
<entry>Vorbis audio data</entry>
<entry></entry>
<entry></entry>
<entry></entry>
<entry>
There are currently no specific properties defined for this type.
</entry>
</row>
</tbody>
</tgroup>
</table>
</sect2>
</sect1>
</chapter>
<!-- ############ chapter ############# -->
<chapter id="chapter-intro-preface" xreflabel="Preface">
<title>Preface</title>
<!-- ############ sect1 ############# -->
<sect1 id="section-intro-what"><!-- synchronize with AppDevMan -->
<title>What is &GStreamer;?</title>
<para>
&GStreamer; is a framework for creating streaming media applications.
The fundamental design comes from the video pipeline at Oregon Graduate
Institute, as well as some ideas from DirectShow.
</para>
<para>
&GStreamer;'s development framework makes it possible to write any
type of streaming multimedia application. The &GStreamer; framework
is designed to make it easy to write applications that handle audio
or video or both. It isn't restricted to audio and video, and can
process any kind of data flow.
The pipeline design is made to have little overhead above what the
applied filters induce. This makes &GStreamer; a good framework for
designing even high-end audio applications which put high demands on
latency or performance.
</para>
<para>
One of the most obvious uses of &GStreamer; is using it to build
a media player. &GStreamer; already includes components for building a
media player that can support a very wide variety of formats, including
MP3, Ogg/Vorbis, MPEG-1/2, AVI, Quicktime, mod, and more. &GStreamer;,
however, is much more than just another media player. Its main advantages
are that the pluggable components can be mixed and matched into arbitrary
pipelines so that it's possible to write a full-fledged video or audio
editing application.
</para>
<para>
The framework is based on plugins that will provide the various codec
and other functionality. The plugins can be linked and arranged in
a pipeline. This pipeline defines the flow of the data.
</para>
<para>
The &GStreamer; core provides a framework for plugins,
data flow, synchronization and media type handling/negotiation. It
also provides an API to write applications using the various plugins.
</para>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-preface-who" xreflabel="Who Should Read This Guide?">
<title>Who Should Read This Guide?</title>
<para>
This guide explains how to write new modules for &GStreamer;. The guide is
relevant to several groups of people:
</para>
<itemizedlist>
<listitem>
<para>
Anyone who wants to add support for new ways of processing data in
&GStreamer;. For example, a person in this group might want to create
a new data format converter, a new visualization tool, or a new
decoder or encoder.
</para>
</listitem>
<listitem>
<para>
Anyone who wants to add support for new input and output devices. For
example, people in this group might want to add the ability to write
to a new video output system or read data from a digital camera or
special microphone.
</para>
</listitem>
<listitem>
<para>
Anyone who wants to extend &GStreamer; in any way. You need to have an
understanding of how the plugin system works before you can understand
the constraints that the plugin system places on the rest of the code.
Also, you might be surprised after reading this at how much can be
done with plugins.
</para>
</listitem>
</itemizedlist>
<para>
This guide is not relevant to you if you only want to use the existing
functionality of &GStreamer;, or if you just want to use an application
that uses &GStreamer;. If you are only interested in using existing
plugins to write a new application - and there are quite a lot of
plugins already - you might want to check the &GstAppDevMan;. If you
are just trying to get help with a &GStreamer; application, then you
should check with the user manual for that particular application.
</para>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-preface-reading" xreflabel="Preliminary Reading">
<title>Preliminary Reading</title>
<para>
This guide assumes that you are somewhat familiar with the basic workings
of &GStreamer;. For a gentle introduction to programming concepts in
&GStreamer;, you may wish to read the &GstAppDevMan; first.
Also check out the other documentation available on the <ulink type="http"
url="http://gstreamer.freedesktop.org/documentation/">&GStreamer; web site</ulink>.
</para>
<para><!-- synchronize with AppDevMan -->
In order to understand this manual, you will need to have a basic
understanding of the C language.
Since &GStreamer; adheres to the GObject programming model, this guide
also assumes that you understand the basics of <ulink type="http"
url="http://developer.gnome.org/gobject/stable/pt01.html">GObject</ulink>
programming.
You may also want to have a look
at Eric Harlow's book <emphasis>Developing Linux Applications with
GTK+ and GDK</emphasis>.
</para>
</sect1>
<!-- ############ sect1 ############# -->
<sect1 id="section-preface-structure" xreflabel="Structure of This Guide">
<title>Structure of This Guide</title>
<para>
To help you navigate through this guide, it is divided into several large
parts. Each part addresses a particular broad topic concerning &GStreamer;
plugin development. The parts of this guide are laid out in the following
order:
</para>
<itemizedlist>
<listitem>
<para>
<xref linkend="part-building"/> -
Introduction to the structure of a plugin, using an example audio
filter for illustration.
</para>
<para>
This part covers all the basic steps you generally need to perform
to build a plugin, such as registering the element with &GStreamer;
and setting up the basics so it can receive data from and send data
to neighbour elements. The discussion begins by giving examples of
generating the basic structures and registering an element in
<xref linkend="chapter-building-boiler"/>. Then, you will learn how
to write the code to get a basic filter plugin working in <xref
linkend="chapter-building-pads"/>, <xref
linkend="chapter-building-chainfn"/> and <xref
linkend="chapter-statemanage-states"/>.
</para>
<para>
After that, we will show some of the GObject concepts on how to
make an element configurable for applications and how to do
application-element interaction in
<xref linkend="chapter-building-args"/> and <xref
linkend="chapter-building-signals"/>. Next, you will learn to build
a quick test application to test all that you've just learned in
<xref linkend="chapter-building-testapp"/>. We will just touch upon
basics here. For full-blown application development, you should
look at <ulink type="http"
url="http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/index.html">the
Application Development Manual</ulink>.
</para>
</listitem>
<listitem>
<para>
<xref linkend="part-advanced"/> -
Information on advanced features of &GStreamer; plugin development.
</para>
<para>
After learning about the basic steps, you should be able to create a
functional audio or video filter plugin with some nice features.
However, &GStreamer; offers more for plugin writers. This part of the
guide includes chapters on more advanced topics, such as scheduling,
media type definitions in &GStreamer;, clocks, interfaces and
tagging. Since these features are purpose-specific, you can read them
in any order, most of them don't require knowledge from other
sections.
</para>
<para>
The first chapter, named <xref linkend="chapter-scheduling"/>,
will explain some of the basics of element scheduling. It is not
very in-depth; it is mostly an introduction to why
other things work as they do. Read this chapter if you're interested
in &GStreamer; internals. Next, we will apply this knowledge and
discuss another type of data transmission than what you learned in
<xref linkend="chapter-building-chainfn"/>: <xref
linkend="chapter-scheduling"/>. Loop-based elements will give
you more control over input rate. This is useful when writing, for
example, muxers or demuxers.
</para>
<para>
Next, we will discuss media identification in &GStreamer; in <xref
linkend="chapter-building-types"/>. You will learn how to define
new media types and get to know a list of standard media types
defined in &GStreamer;.
</para>
<para>
In the next chapter, you will learn the concept of request- and
sometimes-pads, which are pads that are created dynamically, either
because the application asked for it (request) or because the media
stream requires it (sometimes). This will be in <xref
linkend="chapter-advanced-request"/>.
</para>
<para>
The next chapter, <xref linkend="chapter-advanced-clock"/>, will
explain the concept of clocks in &GStreamer;. You need this
information when you want to know how elements should achieve
audio/video synchronization.
</para>
<para>
The next few chapters will discuss advanced ways of doing
application-element interaction. Previously, we learned about the
GObject ways of doing this in <xref linkend="chapter-building-args"/>
and <xref linkend="chapter-building-signals"/>. We will discuss
dynamic parameters, which are a way of defining element behaviour
over time in advance, in <xref linkend="chapter-dparams"/>. Next,
you will learn about interfaces in <xref
linkend="chapter-advanced-interfaces"/>. Interfaces are very
target-specific ways of application-element interaction, based on GObject's
GInterface. Lastly, you will learn about how metadata is handled in
&GStreamer; in <xref linkend="chapter-advanced-tagging"/>.
</para>
<para>
The last chapter, <xref linkend="chapter-advanced-events"/>, will
discuss the concept of events in &GStreamer;. Events are, on the
one hand, another way of doing application-element interaction. It
takes care of seeking, for example. On the other hand, it is also
a way in which elements interact with each other, such as letting
each other know about media stream discontinuities, forwarding tags
inside a pipeline and so on.
</para>
</listitem>
<listitem>
<para>
<xref linkend="part-other"/> - Explanation
of writing other plugin types.
</para>
<para>
Because the first two parts of the guide use an audio filter as an
example, the concepts introduced apply to filter plugins. But many of
the concepts apply equally to other plugin types, including sources,
sinks, and autopluggers. This part of the guide presents the issues
that arise when working on these more specialized plugin types. The
chapter starts with a special focus on elements that can be written
using a base-class (<xref linkend="chapter-other-base"/>), and
later also goes into writing special types of elements in
<xref linkend="chapter-other-oneton"/>, <xref
linkend="chapter-other-ntoone"/> and <xref
linkend="chapter-other-manager"/>.
</para>
</listitem>
<listitem>
<para>
<xref linkend="part-appendix"/> - Further
information for plugin developers.
</para>
<para>
The appendices contain some information that stubbornly refuses
to fit cleanly in other sections of the guide. Most of this section
is not yet finished.
</para>
</listitem>
</itemizedlist>
<para>
The remainder of this introductory part of the guide presents a short
overview of the basic concepts involved in &GStreamer; plugin development.
Topics covered include <xref linkend="section-basics-elements"/>, <xref
linkend="section-basics-pads"/>, <xref linkend="section-basics-data"/> and
<xref linkend="section-basics-types"/>. If you are already familiar with
this information, you can use this short overview to refresh your memory,
or you can skip to <xref linkend="part-building"/>.
</para>
<para>
As you can see, there is a lot to learn, so let's get started!
</para>
<itemizedlist>
<listitem>
<para>
Creating compound and complex elements by extending from a GstBin.
This will allow you to create plugins that have other plugins embedded
in them.
</para>
</listitem>
<listitem>
<para>
Adding new media types to the registry along with typedetect functions.
This will allow your plugin to operate on a completely new media type.
</para>
</listitem>
</itemizedlist>
</sect1>
</chapter>

<chapter id="chapter-other-base" xreflabel="Pre-made base classes">
<title>Pre-made base classes</title>
<para>
So far, we've been looking at low-level concepts of creating any type of
&GStreamer; element. Now, let's assume that all you want is to create an
simple audiosink that works exactly the same as, say,
<quote>esdsink</quote>, or a filter that simply normalizes audio volume.
Such elements are very general in concept and since they do nothing
special, they should be easier to code than to provide your own scheduler
activation functions and doing complex caps negotiation. For this purpose,
&GStreamer; provides base classes that simplify some types of elements.
Those base classes will be discussed in this chapter.
</para>
<sect1 id="section-base-sink" xreflabel="Writing a sink">
<title>Writing a sink</title>
<para>
Sinks are special elements in &GStreamer;. This is because sink elements
have to take care of <emphasis>preroll</emphasis>, which is the process
that ensures that elements going into the
<classname>GST_STATE_PAUSED</classname> state will have buffers ready
after the state change. The result of this is that such elements can
start processing data immediately after going into the
<classname>GST_STATE_PLAYING</classname> state, without needing
time to initialize outputs or set up decoders; all of that is already
done before the state-change to <classname>GST_STATE_PAUSED</classname>
successfully completes.
</para>
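The preroll contract can be modelled with a tiny standalone sketch (all names below are invented for illustration; this is not the actual GstBaseSink API): the sink holds on to the first buffer it receives while paused, and renders it the moment it switches to playing.

```c
typedef enum { MY_PAUSED, MY_PLAYING } MyState;

typedef struct {
  MyState state;
  int preroll_buffer;   /* -1 means no buffer has arrived yet */
  int rendered;         /* number of buffers actually output */
} MySink;

void my_sink_init (MySink *s)
{
  s->state = MY_PAUSED;
  s->preroll_buffer = -1;
  s->rendered = 0;
}

/* Receive a buffer: while PAUSED we merely queue it (preroll is
 * then complete); while PLAYING we render it immediately. */
void my_sink_chain (MySink *s, int buf)
{
  if (s->state == MY_PAUSED)
    s->preroll_buffer = buf;
  else
    s->rendered++;
}

/* On PAUSED->PLAYING the prerolled buffer is output at once, which
 * is why playback can start without further start-up delay. */
void my_sink_set_playing (MySink *s)
{
  s->state = MY_PLAYING;
  if (s->preroll_buffer != -1) {
    s->rendered++;
    s->preroll_buffer = -1;
  }
}
```

The real base class additionally blocks the streaming thread during preroll and handles flushing and state-change interactions, which this toy model ignores.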
<para>
Preroll, however, is a complex process that would require the same
code in many elements. Therefore, sink elements can derive from the
<classname>GstBaseSink</classname> base-class, which does preroll and
a few other utility functions automatically. The derived class only
needs to implement a bunch of virtual functions and will work
automatically.
</para>
<para>
The base class implements much of the synchronization logic that a
sink has to perform.
</para>
<para>
The <classname>GstBaseSink</classname> base-class specifies some
limitations on elements, though:
</para>
<itemizedlist>
<listitem>
<para>
It requires that the sink has only one sinkpad. Sink elements that
need more than one sinkpad must make a manager element with
multiple GstBaseSink elements inside.
</para>
</listitem>
</itemizedlist>
<para>
Sink elements can derive from <classname>GstBaseSink</classname> using
the usual <classname>GObject</classname> convenience macro
<function>G_DEFINE_TYPE ()</function>:
</para>
<programlisting>
G_DEFINE_TYPE (GstMySink, gst_my_sink, GST_TYPE_BASE_SINK);
[..]
static void
gst_my_sink_class_init (GstMySinkClass * klass)
{
  GstBaseSinkClass *base_sink_class = GST_BASE_SINK_CLASS (klass);

  base_sink_class->set_caps = [..];
  base_sink_class->render = [..];
  [..]
}
</programlisting>
<para>
The advantages of deriving from <classname>GstBaseSink</classname> are
numerous:
</para>
<itemizedlist>
<listitem>
<para>
Derived implementations barely need to be aware of preroll, and do
not need to know anything about the technical implementation
requirements of preroll. The base-class does all the hard work.
</para>
<para>
Less code to write in the derived class, shared code (and thus
shared bugfixes).
</para>
</listitem>
</itemizedlist>
<para>
There are also specialized base classes for audio and video, let's look
at those a bit.
</para>
<sect2 id="section-base-audiosink" xreflabel="Writing an audio sink">
<title>Writing an audio sink</title>
<para>
Essentially, audio sink implementations are just a special case of a
general sink. An audio sink has the added complexity that it needs to
schedule playback of samples. It must match the clock selected in the
pipeline against the clock of the audio device and calculate and
compensate for drift and jitter.
</para>
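A rough sketch of the drift estimate such a sink keeps track of (a simplified standalone function, not GStreamer API; the function name and units are invented for illustration):

```c
#include <stdint.h>

/* Given the pipeline clock's running time and the number of samples
 * the audio device has actually consumed, estimate the drift in
 * samples. A positive result means the device clock runs fast
 * relative to the pipeline clock, a negative one that it lags; the
 * sink compensates by dropping samples or inserting silence. */
int64_t
audio_drift_samples (int64_t running_time_ns, int64_t samples_played,
    int rate)
{
  /* samples the device should have consumed by now */
  int64_t expected = running_time_ns * rate / 1000000000;

  return samples_played - expected;
}
```

In practice the base classes also low-pass filter this estimate, since instantaneous readings are dominated by jitter.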
<para>
There are two audio base classes that you can choose to
derive from, depending on your needs:
<classname>GstAudioBaseSink</classname> and
<classname>GstAudioSink</classname>. The audiobasesink provides full
control over how synchronization and scheduling is handled, by using
a ringbuffer that the derived class controls and provides. The
audiosink base-class is a derived class of the audiobasesink; it
implements a standard ringbuffer with default synchronization and
provides a standard audio-sample clock. Derived
classes of this base class merely need to provide a <function>_open
()</function>, <function>_close ()</function> and a <function>_write
()</function> function implementation, and some optional functions.
This should suffice for many sound-server output elements and even
most interfaces. More demanding audio systems, such as Jack, would
want to derive from the <classname>GstAudioBaseSink</classname>
base-class directly.
</para>
<para>
The <classname>GstAudioBaseSink</classname> has few to no
limitations and should fit virtually every implementation, but is
hard to implement. The <classname>GstAudioSink</classname>, on the
other hand, only fits those systems with a simple <function>open
()</function> / <function>close ()</function> / <function>write
()</function> API (which practically means pretty much all of them),
but has the advantage that it is a lot easier to implement. The
benefits of this second base class are large:
</para>
<itemizedlist>
<listitem>
<para>
Automatic synchronization, without any code in the derived class.
</para>
</listitem>
<listitem>
<para>
Also automatically provides a clock, so that other sinks (e.g. in
case of audio/video playback) are synchronized.
</para>
</listitem>
<listitem>
<para>
Features can be added to all audiosinks by making a change in the
base class, which makes maintenance easy.
</para>
</listitem>
<listitem>
<para>
Derived classes require only three small functions, plus some
<classname>GObject</classname> boilerplate code.
</para>
</listitem>
</itemizedlist>
<para>
In addition to implementing the audio base-class virtual functions,
derived classes can (should) also implement the
<classname>GstBaseSink</classname> <function>set_caps ()</function> and
<function>get_caps ()</function> virtual functions for negotiation.
</para>
</sect2>
<sect2 id="section-base-videosink" xreflabel="Writing a video sink">
<title>Writing a video sink</title>
<para>
Writing a videosink can be done using the
<classname>GstVideoSink</classname> base-class, which derives from
<classname>GstBaseSink</classname> internally. Currently, it does
nothing yet but add another compile dependency, so derived classes
will need to implement all base-sink virtual functions. When they do
this correctly, this will have some positive effects on the end user
experience with the videosink:
</para>
<itemizedlist>
<listitem>
<para>
Because of preroll (and the <function>preroll ()</function> virtual
function), it is possible to display a video frame already when
going into the <classname>GST_STATE_PAUSED</classname> state.
</para>
</listitem>
<listitem>
<para>
By adding new features to <classname>GstVideoSink</classname>, it
will be possible to add extensions to videosinks that affect all of
them, but only need to be coded once, which is a huge maintenance
benefit.
</para>
</listitem>
</itemizedlist>
</sect2>
</sect1>
<sect1 id="section-base-src" xreflabel="Writing a source">
<title>Writing a source</title>
<para>
In the previous part, particularly <xref
linkend="section-scheduling-randomxs"/>, we have learned that some types
of elements can provide random access. This applies most definitely to
source elements reading from a randomly seekable location, such as file
sources. However, other source elements may be better described as
live source elements, such as a camera source or an audio card source;
those are not seekable and do not provide byte-exact access. For
all such use cases, &GStreamer; provides two base classes:
<classname>GstBaseSrc</classname> for the basic source functionality, and
<classname>GstPushSrc</classname>, which is a non-byte-exact source
base-class. The pushsource base class itself derives from basesource as
well, and thus all statements about the basesource apply to the
pushsource, too.
</para>
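The difference boils down to the contract of the create() virtual function. As a self-contained stand-in (invented types; a real source fills GstBuffers rather than raw byte arrays), a random-access source serves any (offset, size) slice and performs a short read at end-of-stream:

```c
#include <stddef.h>
#include <string.h>

typedef struct {
  const unsigned char *data;
  size_t length;
} MemSrc;

/* Serve an arbitrary slice of the underlying data, as a randomly
 * seekable source such as a file source would. Returns the number
 * of bytes copied into out, clamped at end-of-stream. */
size_t
mem_src_create (const MemSrc *src, size_t offset, size_t size,
    unsigned char *out)
{
  if (offset >= src->length)
    return 0;                       /* past EOS */
  if (offset + size > src->length)
    size = src->length - offset;    /* short read at EOS */
  memcpy (out, src->data + offset, size);
  return size;
}
```

A push-based source, by contrast, would ignore the offset and simply hand out the next chunk of a live stream.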
<para>
The basesrc class does several things automatically for derived classes,
so they no longer have to worry about it:
</para>
<itemizedlist>
<listitem>
<para>
Fixes to <classname>GstBaseSrc</classname> apply to all derived
classes automatically.
</para>
</listitem>
<listitem>
<para>
Automatic pad activation handling, and task-wrapping in case we get
assigned to start a task ourselves.
</para>
</listitem>
</itemizedlist>
<para>
The <classname>GstBaseSrc</classname> may not be suitable for all cases,
though; it has limitations:
</para>
<itemizedlist>
<listitem>
<para>
There is one and only one sourcepad. Source elements requiring
multiple sourcepads must implement a manager bin and use multiple
source elements internally or make a manager element that uses
a source element and a demuxer inside.
</para>
</listitem>
</itemizedlist>
<para>
It is possible to use special memory, such as X server memory pointers
or <function>mmap ()</function>'ed memory areas, as data pointers in
buffers returned from the <function>create()</function> virtual function.
</para>
<sect2 id="section-base-audiosrc" xreflabel="Writing an audio source">
<title>Writing an audio source</title>
<para>
An audio source is nothing more than a special case of a pushsource.
Audio sources would be anything that reads audio, such as a source
reading from a soundserver, a kernel interface (such as ALSA) or a
test sound / signal generator. &GStreamer; provides two base classes,
similar to the two audiosinks described in <xref
linkend="section-base-audiosink"/>; one, <classname>GstAudioBaseSrc</classname>,
is ringbuffer-based and requires the derived class to take care of its
own scheduling, synchronization and such. The other,
<classname>GstAudioSrc</classname>, is based on it and provides a simple
<function>open ()</function>, <function>close ()</function> and
<function>read ()</function> interface, which is rather simple to
implement and will suffice for most soundserver sources and audio
interfaces (e.g. ALSA or OSS) out there.
</para>
<para>
The <classname>GstAudioSrc</classname> base-class has several benefits
for derived classes, on top of the benefits of the
<classname>GstPushSrc</classname> base-class that it is based on:
</para>
<itemizedlist>
<listitem>
<para>
Does synchronization and provides a clock.
</para>
</listitem>
<listitem>
<para>
New features can be added to it and will apply to all derived
classes automatically.
</para>
</listitem>
</itemizedlist>
</sect2>
</sect1>
<sect1 id="section-base-transform"
xreflabel="Writing a transformation element">
<title>Writing a transformation element</title>
<para>
A third base-class that &GStreamer; provides is the
<classname>GstBaseTransform</classname>. This is a base class for
elements with one sourcepad and one sinkpad which act as a filter
of some sort, such as volume changing, audio resampling, audio format
conversion, and so on. There is quite a lot of bookkeeping
that such elements need to do in order for things such as buffer
allocation forwarding, passthrough, in-place processing and such to all
work correctly. This base class does all that for you, so that you just
need to do the actual processing.
</para>
<para>
Since the <classname>GstBaseTransform</classname> is based on the 1-to-1
model for filters, it may not apply well to elements such as decoders,
which may have to parse properties from the stream. Also, it will not
work for elements requiring more than one sourcepad or sinkpad.
</para>
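To make the division of labour concrete, the processing a derived class supplies can be as small as the following standalone function; a volume element's in-place transform virtual method would wrap something like it (the code below is an illustration, not actual GstBaseTransform API):

```c
#include <stddef.h>
#include <stdint.h>

/* Scale 16-bit samples in place by a fixed-point volume factor,
 * clamping to the valid sample range. volume is in 1/256 units,
 * so 256 means unity gain and 512 means a 2x boost. */
void
volume_process_ip (int16_t *samples, size_t n, int volume)
{
  size_t i;

  for (i = 0; i < n; i++) {
    int32_t v = ((int32_t) samples[i] * volume) >> 8;

    if (v > 32767)
      v = 32767;
    else if (v < -32768)
      v = -32768;
    samples[i] = (int16_t) v;
  }
}
```

At unity gain the element could skip this work entirely; detecting that case is exactly the kind of passthrough decision the base class helps manage.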
</sect1>
</chapter>

<chapter id="chapter-other-manager" xreflabel="Writing a Manager">
<title>Writing a Manager</title>
<para>
Managers are elements that add a function or unify the function of
another element or series of elements. Managers are generally a
<classname>GstBin</classname> with one or more ghostpads; inside it
are the actual elements that do the work. There are several cases where
this is useful. For example:
</para>
<itemizedlist>
<listitem>
<para>
To add support for private events with custom event handling to
another element.
</para>
</listitem>
<listitem>
<para>
To add support for custom pad <function>_query ()</function>
or <function>_convert ()</function> handling to another element.
</para>
</listitem>
<listitem>
<para>
To add custom data handling before or after another element's data
handler function (generally its <function>_chain ()</function>
function).
</para>
</listitem>
<listitem>
<para>
To embed an element, or a series of elements, into something that
looks and works like a simple element to the outside world. This
is particularly handy for implementing source and sink elements with
multiple pads.
</para>
</listitem>
</itemizedlist>
<para>
Making a manager is about as simple as it gets. You can derive from a
<classname>GstBin</classname>, and in most cases, you can embed the
required elements in the <function>_init ()</function> already, including
setup of ghostpads. If you need any custom data handlers, you can connect
signals or embed a second element which you control.
</para>
</chapter>

<chapter id="chapter-other-ntoone"
xreflabel="Writing a N-to-1 Element or Muxer">
<title>Writing a N-to-1 Element or Muxer</title>
<para>
N-to-1 elements have been previously mentioned and discussed in both
<xref linkend="chapter-advanced-request"/> and in
<xref linkend="chapter-scheduling"/>. The main noteworthy thing
about N-to-1 elements is that each pad is push-based in its own thread,
and the N-to-1 element synchronizes those streams by
expected-timestamp-based logic. This means it lets all streams wait
except for the one that provides the earliest next-expected timestamp.
When that stream has passed one buffer, the next
earliest-expected-timestamp is calculated, and we start back where we
were, until all streams have reached EOS. There is a helper base class,
called <classname>GstCollectPads</classname>, that will help you to do
this.
</para>
<para>
Note, however, that this helper class will only help you with grabbing
a buffer from each input and giving you the one with earliest timestamp.
If you need anything more difficult, such as "don't-grab-a-new-buffer
until a given timestamp" or something like that, you'll need to do this
yourself.
</para>
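The earliest-timestamp selection itself is straightforward. A standalone sketch (invented function, not the GstCollectPads API; -1 marks an input that has reached EOS) might look like this:

```c
#include <stdint.h>

/* Out of the next pending timestamp on each input, pick the input
 * whose buffer must be consumed first. Returns the input's index,
 * or -1 once every input has reached EOS. */
int
select_earliest_input (const int64_t *next_ts, int n_inputs)
{
  int i, best = -1;

  for (i = 0; i < n_inputs; i++) {
    if (next_ts[i] < 0)
      continue;                     /* this stream reached EOS */
    if (best < 0 || next_ts[i] < next_ts[best])
      best = i;
  }
  return best;
}
```

All other inputs block until the chosen one has delivered its buffer, which is what keeps the muxed output monotonic in time.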
<!--
Note: I'd like to have something like this in the final text, but since
the code below doesn't work and this is all 0.8'y, I commented it for now.
<sect1 id="section-muxer-negotiation" xreflabel="Negotiation">
<title>Negotiation</title>
<para>
Most container formats will have a fair amount of issues with
changing content on an elementary stream. Therefore, you should
not allow caps to be changed once you've started using data from
them. The easiest way to achieve this is by using explicit caps,
which have been explained before. However, we're going to use them
in a slightly different way then what you're used to, having the
core do all the work for us.
</para>
<para>
The idea is that, as long as the stream/file headers have not been
written yet and no data has been processed yet, a stream is allowed
to renegotiate. After that point, the caps should be fixed, because
we can only use a stream once. Caps may then only change within an
allowed range (think MPEG, where changes in FPS are allowed), or
sometimes not at all (such as AVI audio). In order to do that, we
will, after data retrieval and header writing, set an explicit caps
on each sink pad, that is the minimal caps describing the properties
of the format that may not change. As an example, for MPEG audio
inside an MPEG system stream, this would mean a wide caps of
audio/mpeg with mpegversion=1 and layer=[1,2]. For the same audio type
in MPEG, though, the samplerate, bitrate, layer and number of channels
would become static, too. Since the (request) pads will be removed
when the stream ends, the static caps will cease to exist too, then.
While the explicit caps exist, the <function>_link ()</function>-
function will not be called, since the core will do all necessary
checks for us. Note that the property of using explicit caps should
be added along with the actual explicit caps, not any earlier.
</para>
<para>
Below here follows the simple example of an AVI muxer's audio caps
negotiation. The <function>_link ()</function>-function is fairly
normal, but the <function>-Loop ()</function>-function does some of
the tricks mentioned above. There is no <function>_getcaps ()</function>-
function since the pad template contains all that information already
(not shown).
</para>
<programlisting>
static GstPadLinkReturn
gst_avi_mux_audio_link (GstPad *pad,
const GstCaps *caps)
{
GstAviMux *mux = GST_AVI_MUX (gst_pad_get_parent (pad));
GstStructure *str = gst_caps_get_structure (caps, 0);
const gchar *media = gst_structure_get_name (str);
if (!strcmp (str, "audio/mpeg")) {
/* get version, make sure it's 1, get layer, make sure it's 1-3,
* then create the 2-byte audio tag (0x0055) and fill an audio
* stream structure (strh/strf). */
[..]
return GST_PAD_LINK_OK;
} else if !strcmp (str, "audio/x-raw")) {
/* See above, but now with the raw audio tag (0x0001). */
[..]
return GST_PAD_LINK_OK;
} else [..]
[..]
}
static void
gst_avi_mux_loop (GstElement *element)
{
GstAviMux *mux = GST_AVI_MUX (element);
[..]
/* As we get here, we should have written the header if we hadn't done
* that before yet, and we're supposed to have an internal buffer from
* each pad, also from the audio one. So here, we check again whether
* this is the first run and if so, we set static caps. */
if (mux->first_cycle) {
const GList *padlist = gst_element_get_pad_list (element);
GList *item;
for (item = padlist; item != NULL; item = item->next) {
GstPad *pad = item->data;
GstCaps *caps;
if (!GST_PAD_IS_SINK (pad))
continue;
/* set static caps here */
if (!strncmp (gst_pad_get_name (pad), "audio_", 6)) {
/* the strf is the struct you filled in the _link () function. */
switch (strf->format) {
case 0x0055: /* mp3 */
caps = gst_caps_new_simple ("audio/mpeg",
"mpegversion", G_TYPE_INT, 1,
"layer", G_TYPE_INT, 3,
"bitrate", G_TYPE_INT, strf->av_bps,
"rate", G_TYPE_INT, strf->rate,
"channels", G_TYPE_INT, strf->channels,
NULL);
break;
case 0x0001: /* pcm */
caps = gst_caps_new_simple ("audio/x-raw",
[..]);
break;
[..]
}
} else if (!strncmp (gst_pad_get_name (pad), "video_", 6)) {
[..]
} else {
g_warning ("oi!");
continue;
}
/* set static caps */
gst_pad_use_explicit_caps (pad);
gst_pad_set_explicit_caps (pad, caps);
}
}
[..]
/* Next runs will never be the first again. */
mux->first_cycle = FALSE;
}
</programlisting>
<para>
Note that there are other ways to achieve that, which might be useful
for more complex cases. This will do for the simple cases, though.
This method is provided to simplify negotiation and renegotiation in
muxers, it is not a complete solution, nor is it a pretty one.
</para>
</sect1>
-->
</chapter>

<chapter id="chapter-other-oneton" xreflabel="Writing a Demuxer or Parser">
<title>Writing a Demuxer or Parser</title>
<para>
Demuxers are the 1-to-N elements that need very special care.
They are responsible for splitting raw, unparsed data into
elementary video or audio streams and timestamping them, and there are
many things that you can optimize or do wrong. Here, several common
pitfalls will be mentioned and solutions will be offered. Parsers are
demuxers with only one source pad; they only cut the stream into
buffers and don't touch the data otherwise.
</para>
<para>
As mentioned previously in <xref linkend="chapter-negotiation"/>,
demuxers should use fixed caps, since their data type will not change.
</para>
<para>
As discussed in <xref linkend="chapter-scheduling"/>, demuxer elements
can be written in multiple ways:
</para>
<itemizedlist>
<listitem>
<para>
They can be the driving force of the pipeline, by running their own
task. This works particularly well for elements that need random
access, for example an AVI demuxer.
</para>
</listitem>
<listitem>
<para>
They can also run in push-based mode, which means that an upstream
element drives the pipeline. This works particularly well for streams
that may come from network, such as Ogg.
</para>
</listitem>
</itemizedlist>
<para>
In addition, audio parsers with one output can, in theory, also be written
in random access mode. Although simple playback will mostly work if your
element only accepts one mode, it may be required to implement multiple
modes to work in combination with all sorts of applications, such as
editing. Also, performance may become better if you implement multiple
modes. See <xref linkend="chapter-scheduling"/> to see how an element
can accept multiple scheduling modes.
</para>
</chapter>

<!-- ############ chapter ############# -->
<chapter id="chapter-other-sink" xreflabel="Writing a Sink">
<title>Writing a Sink</title>
<para>
Sinks are output elements that, opposite to sources, have no source
pads and one or more (usually one) sink pad. They can be sound card
outputs, disk writers, etc. This chapter will discuss the basic
implementation of sink elements.
</para>
<sect1 id="other-sink-processing" xreflabel="Data processing, events, synchronization and clocks">
<title>Data processing, events, synchronization and clocks</title>
<para>
Except for corner cases, sink elements will be <function>_chain
()</function>-based elements. The concept of such elements has
been discussed before in detail, so that will be skipped here. What
is very important in sink elements, specifically in real-time audio
and video sources (such as <classname>osssink</classname> or
<classname>ximagesink</classname>), is event handling in the
<function>_chain ()</function>-function, because most elements rely
on EOS-handling of the sink element, and because A/V synchronization
can only be perfect if the element takes this into account.
</para>
<para>
How to achieve synchronization between streams depends on whether
you're a clock-providing or a clock-receiving element. If you're
the clock provider, you can do with time whatever you want. Correct
handling would mean that you check whether the end of the previous
buffer (if any) and the start of the current buffer are the same.
If so, there's no gap between the two and you can continue playing
right away. If there is a gap, then you'll need to wait for your
clock to reach that time. How to do that depends on the element
type. In the case of audio output elements, you would output silence
for a while. In the case of video, you would show background color.
In case of subtitles, show no subtitles at all.
</para>
<para>
In the case that the provided clock and the received clock are not
the same (or in the case where your element provides no clock, which
is the same), you simply wait for the clock to reach the timestamp of
the current buffer and then you handle the data in it.
</para>
<para>
A simple data handling function would look like this:
</para>
<programlisting>
static void
gst_my_sink_chain (GstPad *pad,
GstData *data)
{
GstMySink *sink = GST_MY_SINK (gst_pad_get_parent (pad));
GstBuffer *buf;
GstClockTime time;
/* only needed if the element is GST_EVENT_AWARE */
if (GST_IS_EVENT (data)) {
GstEvent *event = GST_EVENT (data);
switch (GST_EVENT_TYPE (event)) {
case GST_EVENT_EOS:
[ if your element provides a clock, disable (inactivate) it here ]
/* pass-through */
default:
/* the default handler handles discontinuities, even if your
* element provides a clock! */
gst_pad_event_default (pad, event);
break;
}
return;
}
buf = GST_BUFFER (data);
if (GST_BUFFER_TIME_IS_VALID (buf))
time = GST_BUFFER_TIMESTAMP (buf);
else
time = sink->expected_next_time;
/* Synchronization - the property is only useful in case the
* element has the option of not syncing. So it is not useful
* for hardware-sync (clock-providing) elements. */
if (sink->sync) {
/* This check is only needed if you provide a clock. Else,
* you can always execute the 'else' clause. */
if (sink->provided_clock == sink->received_clock) {
/* GST_SECOND / 10 is 0.1 sec, an arbitrary value. The
* casts are needed because otherwise the difference would be
* unsigned and we would not detect negative values. */
if (llabs ((gint64) sink->expected_next_time - (gint64) time) >
(GST_SECOND / 10)) {
/* so are we ahead or behind? */
if (time > sink->expected_next_time) {
/* we need to wait a while... In case of audio, output
* silence. In case of video, output background color.
* In case of subtitles, display nothing. */
[..]
} else {
/* Drop data. */
[..]
}
}
} else {
/* You could do more sophisticated things here, but we'll
* keep it simple for the purpose of the example. */
gst_element_wait (GST_ELEMENT (sink), time);
}
}
/* And now handle the data. */
[..]
}
</programlisting>
</sect1>
<sect1 id="other-sink-buffers" xreflabel="Special memory">
<title>Special memory</title>
<para>
Like source elements, sink elements can sometimes provide externally
allocated (such as X-provided or DMA'able) memory to elements earlier
in the pipeline, and thereby prevent the need for
<function>memcpy ()</function> for incoming data. We do this by
providing a pad-allocate-buffer function.
</para>
<programlisting>
static GstBuffer * gst_my_sink_buffer_allocate (GstPad *pad,
guint64 offset,
guint size);
static void
gst_my_sink_init (GstMySink *sink)
{
[..]
gst_pad_set_bufferalloc_function (sink->sinkpad,
gst_my_sink_buffer_allocate);
}
static void
gst_my_sink_buffer_free (GstBuffer *buf)
{
GstMySink *sink = GST_MY_SINK (GST_BUFFER_PRIVATE (buf));
/* Do whatever is needed here. */
[..]
}
static GstBuffer *
gst_my_sink_buffer_allocate (GstPad *pad,
guint64 offset,
guint size)
{
GstBuffer *buf = gst_buffer_new ();
/* So here it's up to you to wrap your private buffers and
* return that. */
GST_BUFFER_FREE_DATA_FUNC (buf) = gst_my_sink_buffer_free;
GST_BUFFER_PRIVATE (buf) = sink;
GST_BUFFER_FLAG_SET (buf, GST_BUFFER_DONTFREE);
[..]
return buf;
}
</programlisting>
</sect1>
</chapter>

<!-- ############ chapter ############# -->
<chapter id="chapter-other-source" xreflabel="Writing a Source">
<title>Writing a Source</title>
<para>
Source elements are the start of a data streaming pipeline. Source
elements have no sink pads and have one or more source pads. We will
focus on single-sourcepad elements here, but the concepts apply equally
well to multi-sourcepad elements. This chapter will explain the essentials
of source elements, which features it should implement and which it
doesn't have to, and how source elements will interact with other
elements in a pipeline.
</para>
<sect1 id="section-source-getfn" xreflabel="The get()-function">
<title>The get()-function</title>
<para>
Source elements have the special option of having a
<function>_get ()</function>-function rather than a
<function>_loop ()</function>- or <function>_chain
()</function>-function. A <function>_get ()</function>-function is
called by the scheduler every time the next element needs data. Apart
from corner cases, every source element will want to be <function>_get
()</function>-based.
</para>
<programlisting>
static GstData * gst_my_source_get (GstPad *pad);
static void
gst_my_source_init (GstMySource *src)
{
[..]
gst_pad_set_get_function (src->srcpad, gst_my_source_get);
}
static GstData *
gst_my_source_get (GstPad *pad)
{
GstBuffer *buf;
buf = gst_buffer_new ();
GST_BUFFER_DATA (buf) = g_strdup ("hello pipeline!");
GST_BUFFER_SIZE (buf) = strlen (GST_BUFFER_DATA (buf));
/* leave room for the terminating '\0' */
GST_BUFFER_MAXSIZE (buf) = GST_BUFFER_SIZE (buf) + 1;
return GST_DATA (buf);
}
</programlisting>
</sect1>
<sect1 id="section-source-padfn" xreflabel="Events, querying and converting">
<title>Events, querying and converting</title>
<para>
One of the most important functions of source elements is to
implement correct query, convert and event handling functions.
Those will continuously describe the current state of the stream.
Query functions can be used to get stream properties such as current
position and length. This can be used by fellow elements to convert
this same value into a different unit, or by applications to provide
information about the length/position of the stream to the user.
Conversion functions are used to convert such values from one unit
to another. Lastly, events are mostly used to seek to positions
inside the stream. Each of these functions is essentially optional, but the
element should try to provide as much information as it knows. Note
that elements providing an event function should also list their
supported events in an <function>_get_event_mask ()</function>
function. Elements supporting query operations should list the
supported operations in a <function>_get_query_types
()</function> function. Elements supporting either conversion
or query operations should also implement a <function>_get_formats
()</function> function.
</para>
<para>
An example source element could be one that continuously
generates a wave tone at 44.1 kHz, mono, 16-bit. Such an
element will generate 44100 audio samples per second, i.e. 88.2 kB/s.
This information can be used to implement functions like these:
</para>
<programlisting>
static const GstFormat * gst_my_source_format_list (GstPad *pad);
static const GstQueryType * gst_my_source_query_list (GstPad *pad);
static gboolean gst_my_source_convert (GstPad *pad,
GstFormat from_fmt,
gint64 from_val,
GstFormat *to_fmt,
gint64 *to_val);
static gboolean gst_my_source_query (GstPad *pad,
GstQueryType type,
GstFormat *to_fmt,
gint64 *to_val);
static void
gst_my_source_init (GstMySource *src)
{
[..]
gst_pad_set_convert_function (src->srcpad, gst_my_source_convert);
gst_pad_set_formats_function (src->srcpad, gst_my_source_format_list);
gst_pad_set_query_function (src->srcpad, gst_my_source_query);
gst_pad_set_query_type_function (src->srcpad, gst_my_source_query_list);
}
/*
* This function returns an enumeration of supported GstFormat
* types in the query() or convert() functions. See gst/gstformat.h
* for a full list.
*/
static const GstFormat *
gst_my_source_format_list (GstPad *pad)
{
static const GstFormat formats[] = {
GST_FORMAT_TIME,
GST_FORMAT_DEFAULT, /* means "audio samples" */
GST_FORMAT_BYTES,
0
};
return formats;
}
/*
* This function returns an enumeration of the supported query()
* operations. Since we generate audio internally, we only provide
* an indication of how many samples we've played so far. File sources
* or such elements could also provide GST_QUERY_TOTAL for the total
* stream length, or other things. See gst/gstquery.h for details.
*/
static const GstQueryType *
gst_my_source_query_list (GstPad *pad)
{
static const GstQueryType query_types[] = {
GST_QUERY_POSITION,
0,
};
return query_types;
}
/*
* And below are the logical implementations.
*/
static gboolean
gst_my_source_convert (GstPad *pad,
GstFormat from_fmt,
gint64 from_val,
GstFormat *to_fmt,
gint64 *to_val)
{
gboolean res = TRUE;
GstMySource *src = GST_MY_SOURCE (gst_pad_get_parent (pad));
switch (from_fmt) {
case GST_FORMAT_TIME:
switch (*to_fmt) {
case GST_FORMAT_TIME:
/* nothing */
break;
case GST_FORMAT_BYTES:
*to_val = from_val / (GST_SECOND / (44100 * 2));
break;
case GST_FORMAT_DEFAULT:
*to_val = from_val / (GST_SECOND / 44100);
break;
default:
res = FALSE;
break;
}
break;
case GST_FORMAT_BYTES:
switch (*to_fmt) {
case GST_FORMAT_TIME:
*to_val = from_val * (GST_SECOND / (44100 * 2));
break;
case GST_FORMAT_BYTES:
/* nothing */
break;
case GST_FORMAT_DEFAULT:
*to_val = from_val / 2;
break;
default:
res = FALSE;
break;
}
break;
case GST_FORMAT_DEFAULT:
switch (*to_fmt) {
case GST_FORMAT_TIME:
*to_val = from_val * (GST_SECOND / 44100);
break;
case GST_FORMAT_BYTES:
*to_val = from_val * 2;
break;
case GST_FORMAT_DEFAULT:
/* nothing */
break;
default:
res = FALSE;
break;
}
break;
default:
res = FALSE;
break;
}
return res;
}
static gboolean
gst_my_source_query (GstPad *pad,
GstQueryType type,
GstFormat *to_fmt,
gint64 *to_val)
{
GstMySource *src = GST_MY_SOURCE (gst_pad_get_parent (pad));
gboolean res = TRUE;
switch (type) {
case GST_QUERY_POSITION:
res = gst_pad_convert (pad, GST_FORMAT_BYTES, src->total_bytes,
to_fmt, to_val);
break;
default:
res = FALSE;
break;
}
return res;
}
</programlisting>
<para>
Be sure to increase src->total_bytes after each call to your
<function>_get ()</function> function.
</para>
<para>
Event handling has already been explained previously in the events
chapter.
</para>
</sect1>
<sect1 id="section-source-sync" xreflabel="Time, clocking and synchronization">
<title>Time, clocking and synchronization</title>
<para>
The above example does not provide any timing info, but will suffice
for elementary data sources such as a file source or network data
source element. Things become slightly more complicated, but still
very simple, if we create artificial video or audio data sources,
such as a video test image source or an artificial audio source (e.g.
<classname>audiotestsrc</classname>).
It will become more complicated if we want the element to be a
realtime capture source, such as a video4linux source (for reading
video frames from a TV card) or an ALSA source (for reading data
from soundcards supported by an ALSA-driver). Here, we will need to
make the element aware of timing and clocking.
</para>
<para>
Timestamps can essentially be generated from all the information
given above without any difficulty. We could add a very small amount
of code to generate perfectly timestamped buffers from our
<function>_get ()</function>-function:
</para>
<programlisting>
static void
gst_my_source_init (GstMySource *src)
{
[..]
src->total_bytes = 0;
}
static GstData *
gst_my_source_get (GstPad *pad)
{
GstMySource *src = GST_MY_SOURCE (gst_pad_get_parent (pad));
GstBuffer *buf;
GstFormat fmt = GST_FORMAT_TIME;
[..]
GST_BUFFER_DURATION (buf) = GST_BUFFER_SIZE (buf) * (GST_SECOND / (44100 * 2));
GST_BUFFER_TIMESTAMP (buf) = src->total_bytes * (GST_SECOND / (44100 * 2));
src->total_bytes += GST_BUFFER_SIZE (buf);
return GST_DATA (buf);
}
static GstStateChangeReturn
gst_my_source_change_state (GstElement *element, GstStateChange transition)
{
GstStateChangeReturn ret = GST_STATE_CHANGE_SUCCESS;
GstMySource *src = GST_MY_SOURCE (element);
/* First, handle upwards state changes */
switch (transition) {
case GST_STATE_READY_TO_PAUSED:
/* do something */
break;
default:
break;
}
ret = GST_ELEMENT_CLASS (parent_class)-&gt;change_state (element, transition);
if (ret == GST_STATE_CHANGE_FAILURE)
return ret;
/* Now handle downwards state changes after chaining up */
switch (transition) {
case GST_STATE_PAUSED_TO_READY:
src->total_bytes = 0;
break;
default:
break;
}
return ret;
}
</programlisting>
<para>
That wasn't too hard. Now, let's look at real-time elements. Those
can either have hardware timing, in which case we can rely on backends
to provide sync for us (in which case you probably want to provide a
clock), or we will have to emulate it internally (e.g. to acquire
sync in artificial data elements such as
<classname>audiotestsrc</classname>).
Let's first look at the second option (software sync). The first option
(hardware sync + providing a clock) does not require any special code
with respect to timing, and the clocking section already explained how
to provide a clock.
</para>
<programlisting>
enum {
ARG_0,
[..]
ARG_SYNC,
[..]
};
static void
gst_my_source_class_init (GstMySourceClass *klass)
{
GObjectClass *object_class = G_OBJECT_CLASS (klass);
[..]
g_object_class_install_property (object_class, ARG_SYNC,
g_param_spec_boolean ("sync", "Sync", "Synchronize to clock",
FALSE, G_PARAM_READWRITE |
G_PARAM_STATIC_STRINGS));
[..]
}
static void
gst_my_source_init (GstMySource *src)
{
[..]
src->sync = FALSE;
}
static GstData *
gst_my_source_get (GstPad *pad)
{
GstMySource *src = GST_MY_SOURCE (gst_pad_get_parent (pad));
GstBuffer *buf;
[..]
if (src->sync) {
/* wait on clock */
gst_element_wait (GST_ELEMENT (src), GST_BUFFER_TIMESTAMP (buf));
}
return GST_DATA (buf);
}
static void
gst_my_source_get_property (GObject *object,
guint prop_id,
GValue *value,
GParamSpec *pspec)
{
GstMySource *src = GST_MY_SOURCE (object);
switch (prop_id) {
[..]
case ARG_SYNC:
g_value_set_boolean (value, src->sync);
break;
[..]
}
}
static void
gst_my_source_set_property (GObject *object,
guint prop_id,
const GValue *value,
GParamSpec *pspec)
{
GstMySource *src = GST_MY_SOURCE (object);
switch (prop_id) {
[..]
case ARG_SYNC:
src->sync = g_value_get_boolean (value);
break;
[..]
}
}
</programlisting>
<para>
Most of this is GObject wrapping code. The actual code to do
software-sync (in the <function>_get ()</function>-function)
is relatively small.
</para>
</sect1>
<sect1 id="section-source-buffers" xreflabel="Using special memory">
<title>Using special memory</title>
<para>
In some cases, it might be useful to use specially allocated memory
(e.g. <function>mmap ()</function>'ed DMA'able memory) in
your buffers, and those will require special handling when they are
being dereferenced. For this, &GStreamer; uses the concept of
buffer-free functions. Those are special functions pointers that an
element can set on buffers that it created itself. The given function
will be called when the buffer has been dereferenced, so that the
element can clean up or re-use memory internally rather than using
the default implementation (which simply calls
<function>g_free ()</function> on the data pointer).
</para>
<programlisting>
static void
gst_my_source_buffer_free (GstBuffer *buf)
{
GstMySource *src = GST_MY_SOURCE (GST_BUFFER_PRIVATE (buf));
/* do useful things here, like re-queueing the buffer which
* makes it available for DMA again. The default handler will
* not free this buffer because of the GST_BUFFER_DONTFREE
* flag. */
}
static GstData *
gst_my_source_get (GstPad *pad)
{
GstMySource *src = GST_MY_SOURCE (gst_pad_get_parent (pad));
GstBuffer *buf;
[..]
buf = gst_buffer_new ();
GST_BUFFER_FREE_DATA_FUNC (buf) = gst_my_source_buffer_free;
GST_BUFFER_PRIVATE (buf) = src;
GST_BUFFER_FLAG_SET (buf, GST_BUFFER_READONLY | GST_BUFFER_DONTFREE);
[..]
return GST_DATA (buf);
}
</programlisting>
<para>
Note that this concept should <emphasis>not</emphasis> be used to
decrease the number of calls made to functions such as
<function>g_malloc ()</function> inside your element. We
have better ways of doing that elsewhere (&GStreamer; core, GLib,
glibc, Linux kernel, etc.).
</para>
</sect1>
</chapter>

@ -1,198 +0,0 @@
<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE book PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN"
"http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd" [
<!ENTITY % image-entities SYSTEM "image.entities">
%image-entities;
<!ENTITY % version-entities SYSTEM "version.entities">
%version-entities;
<!ENTITY TITLEPAGE SYSTEM "titlepage.xml">
<!-- Part 1: Introduction -->
<!ENTITY INTRO_PREFACE SYSTEM "intro-preface.xml">
<!ENTITY INTRO_BASICS SYSTEM "intro-basics.xml">
<!-- Part 2: Building a Plugin -->
<!ENTITY BUILDING_BOILER SYSTEM "building-boiler.xml">
<!ENTITY BUILDING_DEBUG SYSTEM "building-debug.xml">
<!ENTITY BUILDING_PADS SYSTEM "building-pads.xml">
<!ENTITY BUILDING_CHAINFN SYSTEM "building-chainfn.xml">
<!ENTITY BUILDING_EVENTFN SYSTEM "building-eventfn.xml">
<!ENTITY BUILDING_QUERYFN SYSTEM "building-queryfn.xml">
<!ENTITY BUILDING_STATE SYSTEM "building-state.xml">
<!ENTITY BUILDING_PROPS SYSTEM "building-props.xml">
<!ENTITY BUILDING_SIGNALS SYSTEM "building-signals.xml">
<!ENTITY BUILDING_TESTAPP SYSTEM "building-testapp.xml">
<!-- Part 3: Advanced Filter Concepts -->
<!ENTITY ADVANCED_REQUEST SYSTEM "advanced-request.xml">
<!ENTITY ADVANCED_SCHEDULING SYSTEM "advanced-scheduling.xml">
<!ENTITY ADVANCED_NEGOTIATION SYSTEM "advanced-negotiation.xml">
<!ENTITY ADVANCED_ALLOCATION SYSTEM "advanced-allocation.xml">
<!ENTITY ADVANCED_TYPES SYSTEM "advanced-types.xml">
<!ENTITY ADVANCED_EVENTS SYSTEM "advanced-events.xml">
<!ENTITY ADVANCED_CLOCK SYSTEM "advanced-clock.xml">
<!ENTITY ADVANCED_QOS SYSTEM "advanced-qos.xml">
<!ENTITY ADVANCED_DPARAMS SYSTEM "advanced-dparams.xml">
<!ENTITY ADVANCED_INTERFACES SYSTEM "advanced-interfaces.xml">
<!ENTITY ADVANCED_TAGGING SYSTEM "advanced-tagging.xml">
<!-- Part 4: Creating special element types -->
<!ENTITY OTHER_BASE SYSTEM "other-base.xml">
<!ENTITY OTHER_ONETON SYSTEM "other-oneton.xml">
<!ENTITY OTHER_NTOONE SYSTEM "other-ntoone.xml">
<!ENTITY OTHER_MANAGER SYSTEM "other-manager.xml">
<!-- Appendices -->
<!ENTITY APPENDIX_CHECKLIST SYSTEM "appendix-checklist.xml">
<!ENTITY APPENDIX_PORTING SYSTEM "appendix-porting.xml">
<!ENTITY APPENDIX_LICENSING SYSTEM "appendix-licensing.xml">
<!ENTITY APPENDIX_PYTHON SYSTEM "appendix-python.xml">
<!ENTITY GStreamer "<application>GStreamer</application>">
<!ENTITY GstAppDevMan "<emphasis>GStreamer Application Development Manual</emphasis>">
<!ENTITY GstLibRef "<emphasis>GStreamer Library Reference</emphasis>">
]>
<book id="index">
&TITLEPAGE;
<!-- ############# Introduction - part ############### -->
<part id="part-introduction" xreflabel="Introduction">
<title>Introduction</title>
<partintro>
<para><!-- synchronize with AppDevMan -->
&GStreamer; is an extremely powerful and versatile framework for creating
streaming media applications. Many of the virtues of the &GStreamer;
framework come from its modularity: &GStreamer; can seamlessly
incorporate new plugin modules. But because modularity and power often
come at a cost of greater complexity (consider, for example, <ulink
type="http" url="http://www.omg.org/">CORBA</ulink>), writing new
plugins is not always easy.
</para>
<para>
This guide is intended to help you understand the &GStreamer; framework
(version &GST_VERSION;) so you can develop new plugins to extend the
existing functionality. The guide addresses most issues by following the
development of an example plugin - an audio filter plugin -
written in C. However, the later parts of the guide also present some
issues involved in writing other types of plugins, and the end of the
guide describes some of the Python bindings for &GStreamer;.
</para>
</partintro>
&INTRO_PREFACE;
&INTRO_BASICS;
</part>
<!-- ############ Building a Plugin - part ############# -->
<part id="part-building" xreflabel="Building a Plugin">
<title>Building a Plugin</title>
<partintro>
<para>
You are now ready to learn how to build a plugin. In this part of the
guide, you will learn how to apply basic &GStreamer;
programming concepts to write a simple plugin. The previous parts of the
guide have contained no explicit example code, perhaps making things a
bit abstract and difficult to understand. In contrast, this section will
present both applications and code by following the development of an
example audio filter plugin called <quote>MyFilter</quote>.
</para>
<para>
The example filter element will begin with a single input pad and a
single
output pad. The filter will, at first, simply pass media and event data
from its sink pad to its source pad without modification. But by the end
of this part of the guide, you will learn to add some more interesting
functionality, including properties and signal handlers. And after
reading the next part of the guide, <xref linkend="part-advanced"/>, you
will be able to add even more functionality to your plugins.
</para>
</partintro>
&BUILDING_BOILER;
&BUILDING_PADS;
&BUILDING_CHAINFN;
&BUILDING_EVENTFN;
&BUILDING_QUERYFN;
&BUILDING_STATE;
&BUILDING_PROPS;
&BUILDING_SIGNALS;
&BUILDING_TESTAPP;
</part>
<!-- ############ Advanced Filter Concepts - part ############# -->
<part id="part-advanced" xreflabel="Advanced Filter Concepts">
<title>Advanced Filter Concepts</title>
<partintro>
<para>
By now, you should be able to create basic filter elements that can
receive and send data. This is the simple model that &GStreamer; stands
for. But &GStreamer; can do much more than only this! In this chapter,
various advanced topics will be discussed, such as scheduling, special
pad types, clocking, events, interfaces, tagging and more. These topics
are the sugar that makes &GStreamer; so easy to use for applications.
</para>
</partintro>
&ADVANCED_REQUEST;
&ADVANCED_SCHEDULING;
&ADVANCED_NEGOTIATION;
&ADVANCED_ALLOCATION;
&ADVANCED_TYPES;
&ADVANCED_EVENTS;
&ADVANCED_CLOCK;
&ADVANCED_QOS;
&ADVANCED_DPARAMS;
&ADVANCED_INTERFACES;
&ADVANCED_TAGGING;
<!-- FIXME: add querying, event handling and conversion -->
</part>
<!-- ############ Creating special element types - part ############# -->
<part id="part-other" xreflabel="Creating special element types">
<title>Creating special element types</title>
<partintro>
<para>
By now, we have looked at pretty much every feature that can be embedded
into a &GStreamer; element. Most of this has been fairly low-level and
has given deep insight into how &GStreamer; works internally. Fortunately,
&GStreamer; contains some easier-to-use interfaces to create such
elements. In order to do that, we will look closer at the element
types for which &GStreamer; provides base classes (sources, sinks and
transformation elements). We will also look closer at some types of
elements that require no specific coding for scheduling interaction
or data passing, but rather require specific pipeline control (e.g.
N-to-1 elements and managers).
</para>
</partintro>
&OTHER_BASE;
&OTHER_ONETON;
&OTHER_NTOONE;
&OTHER_MANAGER;
</part>
<!-- ############ Appendices - part ############# -->
<part id="part-appendix" xreflabel="Appendices">
<title>Appendices</title>
<partintro>
<para>
This part of the guide contains material that doesn't belong anywhere else.
</para>
</partintro>
&APPENDIX_CHECKLIST;
&APPENDIX_PORTING;
&APPENDIX_LICENSING;
&APPENDIX_PYTHON;
</part>
</book>

@ -1,98 +0,0 @@
<bookinfo>
<authorgroup>
<author>
<firstname>Richard</firstname>
<othername>John</othername>
<surname>Boulton</surname>
<authorblurb>
<para>
<email>richard-gst@tartarus.org</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Erik</firstname>
<surname>Walthinsen</surname>
<authorblurb>
<para>
<email>omega@temple-baptist.com</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Steve</firstname>
<surname>Baker</surname>
<authorblurb>
<para>
<email>stevebaker_org@yahoo.co.uk</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Leif</firstname>
<surname>Johnson</surname>
<authorblurb>
<para>
<email>leif@ambient.2y.net</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Ronald</firstname>
<othername>S.</othername>
<surname>Bultje</surname>
<authorblurb>
<para>
<email>rbultje@ronald.bitfreak.net</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Stefan</firstname>
<surname>Kost</surname>
<authorblurb>
<para>
<email>ensonic@users.sf.net</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Tim-Philipp</firstname>
<surname>Müller</surname>
<authorblurb>
<para>
<email>tim centricular . net</email>
</para>
</authorblurb>
</author>
<author>
<firstname>Wim</firstname>
<surname>Taymans</surname>
<authorblurb>
<para>
<email>wim.taymans@gmail.com</email>
</para>
</authorblurb>
</author>
</authorgroup>
<legalnotice id="misc-legalnotice">
<para>
This material may be distributed only subject to the terms and
conditions set forth in the Open Publication License, v1.0 or later (the
latest version is presently available at <ulink
url="http://www.opencontent.org/openpub/"
type="http">http://www.opencontent.org/openpub/</ulink>).
</para>
</legalnotice>
<title>&GStreamer; Plugin Writer's Guide (&GST_VERSION;)</title>
</bookinfo>

@ -8,7 +8,6 @@ endif
always_dirs = \
controller \
helloworld \
manual \
memory \
netclock \
ptp \

@ -1,50 +0,0 @@
Makefile
Makefile.in
*.c
*.o
*.lo
*.la
.deps
.libs
appsink
appsrc
blockprobe
dynformat
elementget
elementmake
gnome
helloworld
helloworld2
init
popt
queue
threads
bin
decodebin
dynamic
elementcreate
elementfactory
elementlink
ghostpad
pad
playbin
playsink
norebuffer
probe
query
fakesrc
typefind
effectswitch
testrtpool
xml-mp3
xml
xmlTest.gst
README
*.bb
*.bbg
*.da
test-registry.*

@ -1,136 +0,0 @@
# if HAVE_LIBGNOMEUI
# GNOME = gnome
# else
GNOME =
# endif
# gnome_LDADD = $(GST_OBJ_LIBS) $(LIBGNOMEUI_LIBS)
# gnome_CFLAGS = $(GST_OBJ_CFLAGS) $(LIBGNOMEUI_CFLAGS)
CHECK_REGISTRY = $(top_builddir)/tests/examples/manual/test-registry.reg
REGISTRY_ENVIRONMENT = \
GST_REGISTRY=$(CHECK_REGISTRY)
AM_TESTS_ENVIRONMENT = \
$(REGISTRY_ENVIRONMENT) \
GST_PLUGIN_SCANNER_1_0=$(top_builddir)/libs/gst/helpers/gst-plugin-scanner \
GST_PLUGIN_SYSTEM_PATH_1_0= \
GST_PLUGIN_PATH_1_0=$(top_builddir)/plugins
EXTRA_DIST = extract.pl
EXAMPLES = \
$(GNOME) \
elementcreate \
elementmake \
elementfactory \
elementget \
elementlink \
bin \
pad \
ghostpad \
helloworld \
init \
query \
typefind \
blockprobe \
probe \
appsrc \
appsink \
dynformat \
effectswitch \
norebuffer \
playbin \
decodebin \
playsink
BUILT_SOURCES = \
elementmake.c elementcreate.c elementget.c elementlink.c elementfactory.c \
bin.c \
pad.c ghostpad.c \
gnome.c \
helloworld.c \
init.c \
query.c \
typefind.c \
blockprobe.c \
probe.c \
appsrc.c \
appsink.c \
dynformat.c \
effectswitch.c \
norebuffer.c \
playbin.c decodebin.c \
playsink.c
if HAVE_PTHREAD
BUILT_SOURCES += testrtpool.c
EXAMPLES += testrtpool
endif
CLEANFILES = core core.* test-registry.* *.gcno *.gcda $(BUILT_SOURCES)
AM_CFLAGS = $(GST_OBJ_CFLAGS)
LDADD = $(top_builddir)/libs/gst/base/libgstbase-@GST_API_VERSION@.la \
$(GST_OBJ_LIBS)
elementmake.c elementcreate.c elementget.c elementlink.c elementfactory.c: $(top_srcdir)/docs/manual/basics-elements.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
bin.c : $(top_srcdir)/docs/manual/basics-bins.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
pad.c ghostpad.c: $(top_srcdir)/docs/manual/basics-pads.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
gnome.c: $(top_srcdir)/docs/manual/appendix-integration.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
helloworld.c: $(top_srcdir)/docs/manual/basics-helloworld.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
init.c: $(top_srcdir)/docs/manual/basics-init.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
query.c: $(top_srcdir)/docs/manual/advanced-position.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
typefind.c: $(top_srcdir)/docs/manual/advanced-autoplugging.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
blockprobe.c: $(top_srcdir)/docs/manual/advanced-dataaccess.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
probe.c: $(top_srcdir)/docs/manual/advanced-dataaccess.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
appsrc.c: $(top_srcdir)/docs/manual/advanced-dataaccess.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
appsink.c: $(top_srcdir)/docs/manual/advanced-dataaccess.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
dynformat.c: $(top_srcdir)/docs/manual/advanced-dataaccess.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
effectswitch.c: $(top_srcdir)/docs/manual/advanced-dataaccess.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
norebuffer.c: $(top_srcdir)/docs/manual/advanced-buffering.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
playbin.c decodebin.c playsink.c: $(top_srcdir)/docs/manual/highlevel-playback.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
testrtpool.c: $(top_srcdir)/docs/manual/advanced-threads.xml
$(PERL_PATH) $(srcdir)/extract.pl $@ $<
TESTS = bin \
elementcreate elementfactory elementget elementlink elementmake \
ghostpad init
noinst_PROGRAMS = $(EXAMPLES)
testrtpool_LDADD = $(GST_OBJ_LIBS) $(PTHREAD_LIBS)
testrtpool_CFLAGS = $(GST_OBJ_CFLAGS) $(PTHREAD_CFLAGS)
