docs/manual/: First try at rewriting the ADM. Needs lotsamore work, but some parts might already be somewhat useful.

Original commit message from CVS:
* docs/manual/advanced-autoplugging.xml:
* docs/manual/advanced-clocks.xml:
* docs/manual/advanced-dparams.xml:
* docs/manual/advanced-interfaces.xml:
* docs/manual/advanced-metadata.xml:
* docs/manual/advanced-position.xml:
* docs/manual/advanced-schedulers.xml:
* docs/manual/advanced-threads.xml:
* docs/manual/appendix-debugging.xml:
* docs/manual/appendix-gnome.xml:
* docs/manual/appendix-programs.xml:
* docs/manual/appendix-quotes.xml:
* docs/manual/appendix-win32.xml:
* docs/manual/autoplugging.xml:
* docs/manual/basics-bins.xml:
* docs/manual/basics-data.xml:
* docs/manual/basics-elements.xml:
* docs/manual/basics-helloworld.xml:
* docs/manual/basics-init.xml:
* docs/manual/basics-pads.xml:
* docs/manual/basics-plugins.xml:
* docs/manual/bins-api.xml:
* docs/manual/bins.xml:
* docs/manual/buffers-api.xml:
* docs/manual/buffers.xml:
* docs/manual/clocks.xml:
* docs/manual/components.xml:
* docs/manual/cothreads.xml:
* docs/manual/debugging.xml:
* docs/manual/dparams-app.xml:
* docs/manual/dynamic.xml:
* docs/manual/elements-api.xml:
* docs/manual/elements.xml:
* docs/manual/factories.xml:
* docs/manual/gnome.xml:
* docs/manual/goals.xml:
* docs/manual/helloworld.xml:
* docs/manual/helloworld2.xml:
* docs/manual/highlevel-components.xml:
* docs/manual/highlevel-xml.xml:
* docs/manual/init-api.xml:
* docs/manual/intro-motivation.xml:
* docs/manual/intro-preface.xml:
* docs/manual/intro.xml:
* docs/manual/links-api.xml:
* docs/manual/links.xml:
* docs/manual/manual.xml:
* docs/manual/motivation.xml:
* docs/manual/pads-api.xml:
* docs/manual/pads.xml:
* docs/manual/plugins-api.xml:
* docs/manual/plugins.xml:
* docs/manual/programs.xml:
* docs/manual/queues.xml:
* docs/manual/quotes.xml:
* docs/manual/schedulers.xml:
* docs/manual/states-api.xml:
* docs/manual/states.xml:
* docs/manual/threads.xml:
* docs/manual/typedetection.xml:
* docs/manual/win32.xml:
* docs/manual/xml.xml:
First try at rewriting the ADM. Needs lotsamore work, but some
parts might already be somewhat useful.
* docs/pwg/advanced-interfaces.xml:
Remove properties interface, it never actually existed (except for
on my HD...).
This commit is contained in:
Ronald S. Bultje 2004-12-14 23:35:09 +00:00
parent 7569c4e6ee
commit 5700e439fa
50 changed files with 554 additions and 6107 deletions


@ -1,3 +1,73 @@
2004-12-15 Ronald S. Bultje <rbultje@ronald.bitfreak.net>
* docs/manual/advanced-autoplugging.xml:
* docs/manual/advanced-clocks.xml:
* docs/manual/advanced-dparams.xml:
* docs/manual/advanced-interfaces.xml:
* docs/manual/advanced-metadata.xml:
* docs/manual/advanced-position.xml:
* docs/manual/advanced-schedulers.xml:
* docs/manual/advanced-threads.xml:
* docs/manual/appendix-debugging.xml:
* docs/manual/appendix-gnome.xml:
* docs/manual/appendix-programs.xml:
* docs/manual/appendix-quotes.xml:
* docs/manual/appendix-win32.xml:
* docs/manual/autoplugging.xml:
* docs/manual/basics-bins.xml:
* docs/manual/basics-data.xml:
* docs/manual/basics-elements.xml:
* docs/manual/basics-helloworld.xml:
* docs/manual/basics-init.xml:
* docs/manual/basics-pads.xml:
* docs/manual/basics-plugins.xml:
* docs/manual/bins-api.xml:
* docs/manual/bins.xml:
* docs/manual/buffers-api.xml:
* docs/manual/buffers.xml:
* docs/manual/clocks.xml:
* docs/manual/components.xml:
* docs/manual/cothreads.xml:
* docs/manual/debugging.xml:
* docs/manual/dparams-app.xml:
* docs/manual/dynamic.xml:
* docs/manual/elements-api.xml:
* docs/manual/elements.xml:
* docs/manual/factories.xml:
* docs/manual/gnome.xml:
* docs/manual/goals.xml:
* docs/manual/helloworld.xml:
* docs/manual/helloworld2.xml:
* docs/manual/highlevel-components.xml:
* docs/manual/highlevel-xml.xml:
* docs/manual/init-api.xml:
* docs/manual/intro-motivation.xml:
* docs/manual/intro-preface.xml:
* docs/manual/intro.xml:
* docs/manual/links-api.xml:
* docs/manual/links.xml:
* docs/manual/manual.xml:
* docs/manual/motivation.xml:
* docs/manual/pads-api.xml:
* docs/manual/pads.xml:
* docs/manual/plugins-api.xml:
* docs/manual/plugins.xml:
* docs/manual/programs.xml:
* docs/manual/queues.xml:
* docs/manual/quotes.xml:
* docs/manual/schedulers.xml:
* docs/manual/states-api.xml:
* docs/manual/states.xml:
* docs/manual/threads.xml:
* docs/manual/typedetection.xml:
* docs/manual/win32.xml:
* docs/manual/xml.xml:
First try at rewriting the ADM. Needs lotsamore work, but some
parts might already be somewhat useful.
* docs/pwg/advanced-interfaces.xml:
Remove properties interface, it never actually existed (except for
on my HD...).
2004-12-13 David Schleef <ds@schleef.org>
* gst/gstpad.c: (gst_pad_set_explicit_caps): Allow caps to

common

@ -1 +1 @@
Subproject commit 8404d8841f5fd58fe31de09090867115e97c5261
Subproject commit b2638c100721f67b280c3b43b21f1ce1c9b5e316


@ -0,0 +1,112 @@
<chapter id="chapter-interfaces">
<title>Interfaces</title>
<para>
In <xref linkend="section-elements-properties"/>, you learned how to use
<classname>GObject</classname> properties as a simple way for applications
and elements to interact. This method suffices for simple, straightforward
settings, but fails for anything more complicated than a getter and setter.
For those more complicated use cases, &GStreamer; uses interfaces based on
the GLib <classname>GInterface</classname> type.
</para>
<para>
Most of the interfaces described here will not be accompanied by much
example code; see the API references for details. Here, we will mostly
describe the scope and purpose of each interface.
</para>
<sect1 id="section-interfaces-mixer">
<title>The Mixer interface</title>
<para>
The mixer interface provides a uniform way to control the volume on a
hardware (or software) mixer. The interface is primarily intended to
be implemented by elements for audio inputs and outputs that talk
directly to the hardware (e.g. OSS or ALSA plugins).
</para>
<para>
Using this interface, it is possible to control a list of tracks
(such as Line-in, Microphone, etc.) from a mixer element. They can
be muted, their volume can be changed and, for input tracks, their
record flag can be set as well.
</para>
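<para>
As a minimal sketch, assuming the <filename>gst/mixer/mixer.h</filename>
header from the gst-plugins module (check the GstMixer API reference for
the exact functions and signatures), setting the volume on the first track
of a mixer element could look roughly like this:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;
#include &lt;gst/mixer/mixer.h&gt;   /* assumed header location */

static void
set_first_track_volume (GstElement *element, gint volume)
{
  GstMixer *mixer = GST_MIXER (element);
  const GList *tracks = gst_mixer_list_tracks (mixer);

  if (tracks != NULL) {
    GstMixerTrack *track = GST_MIXER_TRACK (tracks-&gt;data);
    gint *volumes = g_new (gint, track-&gt;num_channels);
    gint i;

    for (i = 0; i &lt; track-&gt;num_channels; i++)
      volumes[i] = volume;

    /* unmute the track and set the volume on all of its channels */
    gst_mixer_set_mute (mixer, track, FALSE);
    gst_mixer_set_volume (mixer, track, volumes);
    g_free (volumes);
  }
}
</programlisting>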
<para>
Example plugins implementing this interface include the OSS elements
(osssrc, osssink, ossmixer) and the ALSA plugins (alsasrc, alsasink
and alsamixer).
</para>
</sect1>
<sect1 id="section-interfaces-tuner">
<title>The Tuner interface</title>
<para>
The tuner interface is a uniform way to control inputs and outputs
on a multi-input selection device. This is primarily used for input
selection on elements for TV- and capture-cards.
</para>
<para>
Using this interface, it is possible to select one track from a list
of tracks supported by that tuner element. The tuner will then select
that track for media processing internally. This can, for example, be
used to switch inputs on a TV card (e.g. from Composite to S-video).
</para>
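<para>
A minimal sketch, assuming the <filename>gst/tuner/tuner.h</filename>
header from the gst-plugins module (check the GstTuner API reference for
the exact functions), could switch inputs by label like this:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;
#include &lt;gst/tuner/tuner.h&gt;   /* assumed header location */

/* switch a tuner element (e.g. a Video4linux source) to the channel with
 * the given label, such as "Composite" or "S-Video" */
static void
select_input (GstElement *element, const gchar *label)
{
  GstTuner *tuner = GST_TUNER (element);
  const GList *item;

  for (item = gst_tuner_list_channels (tuner); item != NULL;
       item = item-&gt;next) {
    GstTunerChannel *channel = GST_TUNER_CHANNEL (item-&gt;data);

    if (g_ascii_strcasecmp (channel-&gt;label, label) == 0) {
      gst_tuner_set_channel (tuner, channel);
      break;
    }
  }
}
</programlisting>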
<para>
This interface is currently only implemented by the Video4linux and
Video4linux2 elements.
</para>
</sect1>
<sect1 id="section-interfaces-colorbalance">
<title>The Color Balance interface</title>
<para>
The colorbalance interface is a way to control video-related properties
on an element, such as brightness, contrast and so on. Its sole
reason for existence is that, as far as its authors know, there is no
way to dynamically register properties using
<classname>GObject</classname>.
</para>
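<para>
A minimal sketch, assuming the
<filename>gst/colorbalance/colorbalance.h</filename> header from the
gst-plugins module (see the GstColorBalance API reference for the exact
functions), could set the brightness channel to its maximum like this:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;
#include &lt;gst/colorbalance/colorbalance.h&gt;   /* assumed header location */

static void
set_brightness_to_max (GstElement *element)
{
  GstColorBalance *balance = GST_COLOR_BALANCE (element);
  const GList *item;

  for (item = gst_color_balance_list_channels (balance); item != NULL;
       item = item-&gt;next) {
    GstColorBalanceChannel *channel = GST_COLOR_BALANCE_CHANNEL (item-&gt;data);

    /* channel labels are element-specific; "BRIGHTNESS" is an assumption */
    if (g_ascii_strcasecmp (channel-&gt;label, "BRIGHTNESS") == 0)
      gst_color_balance_set_value (balance, channel, channel-&gt;max_value);
  }
}
</programlisting>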
<para>
The colorbalance interface is implemented by several plugins, including
xvimagesink and the Video4linux and Video4linux2 elements.
</para>
</sect1>
<sect1 id="section-interfaces-proprobe">
<title>The Property Probe interface</title>
<para>
The property probe is a way to autodetect allowed values for a
<classname>GObject</classname> property. Its primary use (and
the only thing that we currently use it for) is to autodetect
devices in several elements. For example, the OSS elements use
this interface to detect all OSS devices on a system. Applications
can then <quote>probe</quote> this property and get a list of
detected devices. Given the overlap between HAL and the practical
implementations of this interface, this might in time be deprecated
in favour of HAL.
</para>
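<para>
A minimal sketch, assuming the
<filename>gst/propertyprobe/propertyprobe.h</filename> header from the
gst-plugins module and a string-valued <quote>device</quote> property
(see the GstPropertyProbe API reference for the exact functions), could
list the detected devices like this:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;
#include &lt;gst/propertyprobe/propertyprobe.h&gt;   /* assumed header location */

static void
print_devices (GstElement *element)
{
  GstPropertyProbe *probe = GST_PROPERTY_PROBE (element);
  GValueArray *values;
  guint i;

  /* probe the "device" property and get the list of detected values */
  values = gst_property_probe_probe_and_get_values_name (probe, "device");
  if (values == NULL)
    return;

  for (i = 0; i &lt; values-&gt;n_values; i++) {
    GValue *value = g_value_array_get_nth (values, i);

    g_print ("device: %s\n", g_value_get_string (value));
  }
  g_value_array_free (values);
}
</programlisting>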
<para>
This interface is currently implemented by many elements, including
the ALSA, OSS, Video4linux and Video4linux2 elements.
</para>
</sect1>
<sect1 id="section-interfaces-xoverlay">
<title>The X Overlay interface</title>
<para>
The X Overlay interface was created to solve the problem of embedding
video streams in an application window. The application provides an
X window to the element implementing this interface, and the element
will draw into that window rather than creating a new toplevel window
of its own. This is useful for embedding video in video players.
</para>
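<para>
A minimal sketch for a GTK+ application, assuming the
<filename>gst/xoverlay/xoverlay.h</filename> header from the gst-plugins
module and an already-realized widget (see the GstXOverlay API reference
for the exact functions), could look like this:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;
#include &lt;gst/xoverlay/xoverlay.h&gt;   /* assumed header location */
#include &lt;gtk/gtk.h&gt;
#include &lt;gdk/gdkx.h&gt;

/* tell a video sink implementing the X Overlay interface to render into
 * an existing (realized) GTK+ widget instead of opening its own window */
static void
embed_video (GstElement *videosink, GtkWidget *video_widget)
{
  gst_x_overlay_set_xwindow_id (GST_X_OVERLAY (videosink),
      GDK_WINDOW_XWINDOW (video_widget-&gt;window));
}
</programlisting>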
<para>
This interface is implemented by, amongst others, the Video4linux and
Video4linux2 elements and by ximagesink, xvimagesink and sdlvideosink.
</para>
</sect1>
</chapter>


@ -0,0 +1,54 @@
<chapter id="chapter-metadata">
<title>Metadata</title>
<para>
&GStreamer; makes a clear distinction between two types of metadata, and
has support for both types. The first is stream tags, which describe the
content of a stream in a non-technical way. Examples include the author
of a song, the title of that song, or the album it is part of.
The other type of metadata is stream-info, which is a somewhat technical
description of the properties of a stream. This can include video size,
audio samplerate, codecs used and so on. Tags are handled using the
&GStreamer; tagging system. Stream-info can be retrieved from a
<classname>GstPad</classname>.
</para>
<sect1 id="section-streaminfo">
<title>Stream information</title>
<para>
Stream information is most easily read from a
<classname>GstPad</classname>. This has already been discussed in
<xref linkend="section-caps-metadata"/>, so we will skip it here.
</para>
</sect1>
<sect1 id="section-tags-read">
<title>Tag reading</title>
<para>
Tag reading is remarkably simple in &GStreamer;. Every element supports
the <quote>found-tag</quote> signal, which is fired each time the
element reads tags from the stream. A <classname>GstBin</classname>
will conveniently forward tags found by its children. Therefore, in most
applications, you only need to connect to the
<quote>found-tag</quote> signal on the top-most bin in your pipeline,
and you will automatically receive all tags from the stream.
</para>
<para>
Note, however, that the <quote>found-tag</quote> signal might be fired
multiple times and by multiple elements in the pipeline. It is the
application's responsibility to put all those tags together and
display them to the user in a coherent way.
</para>
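<para>
A minimal sketch of such a connection could look like this (the handler
signature is an assumption based on the signal providing the element that
found the tags and the tag list; only string tags are printed here):
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

static void
print_one_tag (const GstTagList *list, const gchar *tag, gpointer data)
{
  gchar *str;

  /* only handle string-typed tags in this simple sketch */
  if (gst_tag_get_type (tag) == G_TYPE_STRING &amp;&amp;
      gst_tag_list_get_string (list, tag, &amp;str)) {
    g_print ("%s: %s\n", tag, str);
    g_free (str);
  }
}

static void
found_tag_cb (GstElement *pipeline, GstElement *source,
    GstTagList *tags, gpointer data)
{
  gst_tag_list_foreach (tags, print_one_tag, NULL);
}

[..]
  /* connect once, on the top-most bin of the pipeline */
  g_signal_connect (pipeline, "found-tag", G_CALLBACK (found_tag_cb), NULL);
[..]
</programlisting>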
</sect1>
<sect1 id="section-tags-write">
<title>Tag writing</title>
<para>
WRITEME
</para>
</sect1>
</chapter>


@ -0,0 +1,117 @@
<chapter id="chapter-queryevents">
<title>Position tracking and seeking</title>
<para>
So far, we've looked at how to create a pipeline to do media processing
and how to make it run ("iterate"). Most application developers will be
interested in providing feedback to the user on media progress. Media
players, for example, will want to show a slider indicating the position
in the song, and usually also a label indicating stream length. Transcoding
applications will want to show a progress bar indicating what percentage
of the task has been completed. &GStreamer; has built-in support for doing all this using a
concept known as <emphasis>querying</emphasis>. Since seeking is very
similar, it will be discussed here as well. Seeking is done using the
concept of <emphasis>events</emphasis>.
</para>
<sect1 id="section-querying">
<title>Querying: getting the position or length of a stream</title>
<para>
Querying is defined as requesting a specific stream-property related
to progress tracking. This includes getting the length of a stream (if
available) or getting the current position. Those stream properties
can be retrieved in various formats such as time, audio samples, video
frames or bytes. The functions used are <function>gst_element_query
()</function> and <function>gst_pad_query ()</function>.
</para>
<para>
Obviously, using either of the above-mentioned functions requires the
application to know <emphasis>which</emphasis> element or pad to run
the query on. This can be tricky, but there is good news: elements
(or, rather, pads, since <function>gst_element_query ()</function>
internally calls <function>gst_pad_query ()</function>) forward
(<quote>dispatch</quote>) events and queries to peer pads (or elements)
if they don't handle them themselves. The bad news is that some elements
(or pads) will handle events, but not the specific formats that you
want, so the query may still fail.
</para>
<para>
Most queries will, fortunately, work fine. Queries are always
dispatched backwards. This means, effectively, that it's easiest to
run the query on your video or audio output element, and it will take
care of dispatching the query to the element that knows the answer
(such as the current position or the media length; usually the demuxer
or decoder).
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

gint
main (gint argc,
      gchar *argv[])
{
  GstElement *sink, *pipeline;
[..]
  /* run pipeline */
  do {
    gint64 len, pos;
    GstFormat fmt = GST_FORMAT_TIME;

    if (gst_element_query (sink, GST_QUERY_POSITION, &amp;fmt, &amp;pos) &amp;&amp;
        gst_element_query (sink, GST_QUERY_TOTAL, &amp;fmt, &amp;len)) {
      g_print ("Time: %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
          GST_TIME_ARGS (pos), GST_TIME_ARGS (len));
    }
  } while (gst_bin_iterate (GST_BIN (pipeline)));
[..]
}
</programlisting>
<para>
If you are having problems with the dispatching behaviour, your best
bet is to manually decide which element to start running the query on.
You can get a list of supported formats and query-types with
<function>gst_element_get_query_types ()</function> and
<function>gst_element_get_formats ()</function>.
</para>
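<para>
For instance, a minimal check for time-format support, assuming the
returned array is zero-terminated (as in the 0.8 headers), could look
like this:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

/* returns TRUE if the element advertises GST_FORMAT_TIME among its
 * supported formats */
static gboolean
supports_time_format (GstElement *element)
{
  const GstFormat *formats = gst_element_get_formats (element);

  if (formats == NULL)
    return FALSE;

  while (*formats != 0) {
    if (*formats == GST_FORMAT_TIME)
      return TRUE;
    formats++;
  }

  return FALSE;
}
</programlisting>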
</sect1>
<sect1 id="section-eventsseek">
<title>Events: seeking (and more)</title>
<para>
Events work in much the same way as queries. Dispatching, for
example, works exactly the same for events (and also has the same
limitations). Although there are more ways in which applications
and elements can interact using events, we will only focus on seeking
here. This is done using the seek-event. A seek-event contains a
seeking offset, a seek method (which indicates relative to what the
offset was given), a seek format (which is the unit of the offset,
e.g. time, audio samples, video frames or bytes) and, optionally, a
set of seeking-related flags (e.g. whether internal buffers should be
flushed). Seeking is wrapped in the convenience function
<function>gst_element_seek ()</function>.
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

static void
seek_to_time (GstElement *audiosink,
              gint64      time_nanoseconds)
{
  gst_element_seek (audiosink,
      GST_SEEK_METHOD_SET | GST_FORMAT_TIME | GST_SEEK_FLAG_FLUSH,
      time_nanoseconds);
}
</programlisting>
</sect1>
</chapter>


@ -1,42 +0,0 @@
<chapter id="chapter-scheduler">
<title>Understanding schedulers</title>
<para>
The scheduler is responsible for managing the plugins at runtime. Its
main responsibilities are:
<itemizedlist>
<listitem>
<para>
Preparing the plugins so they can be scheduled.
</para>
</listitem>
<listitem>
<para>
Monitoring state changes and enabling/disabling the element in the
chain.
</para>
</listitem>
<listitem>
<para>
Choosing an element as the entry point for the pipeline.
</para>
</listitem>
<listitem>
<para>
Selecting and distributing the global clock.
</para>
</listitem>
</itemizedlist>
</para>
<para>
The scheduler is a pluggable component; this means that alternative
schedulers can be written and plugged into GStreamer. The default scheduler
uses cothreads to schedule the plugins in a pipeline. Cothreads are fast
and lightweight user-space threads.
</para>
<para>
There is usually no need to interact with the scheduler directly;
however, in some cases it is possible to set a specific clock or to
force a specific plugin as the entry point in the pipeline.
</para>
</chapter>


@ -1,152 +0,0 @@
<chapter id="chapter-debugging">
<title>Debugging</title>
<para>
GStreamer has an extensive set of debugging tools for
plugin developers.
</para>
<sect1 id="section-debugging-command-line">
<title>Command line options</title>
<para>
Applications using the GStreamer libraries accept the following set
of command line arguments that help in debugging.
</para>
<para>
<itemizedlist>
<listitem>
<para>
<option>--gst-debug-help</option>
Print available debug categories and exit
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug-level=<replaceable>LEVEL</replaceable></option>
Sets the default debug level from 0 (no output) to 5 (everything)
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug=<replaceable>LIST</replaceable></option>
Comma-separated list of category_name:level pairs to set specific
levels for the individual categories.
Example: GST_AUTOPLUG:5,GST_ELEMENT_*:3
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug-no-color</option>
Disable color debugging output
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug-disable</option>
Disable debugging
</para>
</listitem>
<listitem>
<para>
<option>--gst-plugin-spew</option>
Enable printout of errors while loading GStreamer plugins.
</para>
</listitem>
</itemizedlist>
</para>
</sect1>
<sect1 id="section-debugging-adding">
<title>Adding debugging to a plugin</title>
<para>
Plugins can define their own categories for the debugging system.
Three things need to happen, as the sketch after this list illustrates:
<itemizedlist>
<listitem>
<para>
The debugging variable needs to be defined somewhere.
If you only have one source file, you can use GST_DEBUG_CATEGORY_STATIC to
define a static debug category variable.
</para>
<para>
If you have multiple source files, you should define the variable using
GST_DEBUG_CATEGORY in the source file where you're initializing the debug
category. The other source files should use GST_DEBUG_CATEGORY_EXTERN to
declare the debug category variable, possibly by including a common header
that has this statement.
</para>
</listitem>
<listitem>
<para>
The debugging category needs to be initialized. This is done through
GST_DEBUG_CATEGORY_INIT.
If you're using a global debugging category for the complete plugin,
you can call this in the
plugin's <function>plugin_init</function>.
If the debug category is only used for one of the elements, you can call it
from the element's <function>_class_init</function> function.
</para>
</listitem>
<listitem>
<para>
You should also define a default category to be used for debugging. This is
done by defining GST_CAT_DEFAULT for the source files where you're using
debug macros.
</para>
</listitem>
</itemizedlist>
</para>
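<para>
A minimal sketch combining these three steps for a plugin with a single
source file (the category name and description are made up for the
example) could look like this:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

/* 1) define the (static) debug category variable */
GST_DEBUG_CATEGORY_STATIC (my_plugin_debug);
/* 3) use it as the default category for the debug macros in this file */
#define GST_CAT_DEFAULT my_plugin_debug

static gboolean
plugin_init (GstPlugin *plugin)
{
  /* 2) initialize the category: variable, name, color flags, description */
  GST_DEBUG_CATEGORY_INIT (my_plugin_debug, "my_plugin", 0,
      "example plugin for the debugging section");

  GST_DEBUG ("plugin loaded");

  return TRUE;
}
</programlisting>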
<para>
Elements can then log debugging information using the set of macros. There
are five levels of debugging information:
<orderedlist>
<listitem>
<para>ERROR for fatal errors (for example, internal errors)</para>
</listitem>
<listitem>
<para>WARNING for warnings</para>
</listitem>
<listitem>
<para>INFO for normal information</para>
</listitem>
<listitem>
<para>DEBUG for debug information (for example, device parameters)</para>
</listitem>
<listitem>
<para>LOG for regular operation information (for example, chain handlers)</para>
</listitem>
</orderedlist>
</para>
<para>
For each of these levels, there are four macros to log debugging information.
Taking the LOG level as an example, there is
<itemizedlist>
<listitem>
<para>
GST_CAT_LOG_OBJECT logs debug information in the given GstCategory
and for the given GstObject
</para>
</listitem>
<listitem>
<para>
GST_CAT_LOG logs debug information in the given GstCategory
but without a GstObject (this is useful for libraries, for example)
</para>
</listitem>
<listitem>
<para>
GST_LOG_OBJECT logs debug information in the default GST_CAT_DEFAULT
category (as defined somewhere in the source), for the given GstObject
</para>
</listitem>
<listitem>
<para>
GST_LOG logs debug information in the default GST_CAT_DEFAULT
category, without a GstObject
</para>
</listitem>
</itemizedlist>
</para>
</sect1>
</chapter>


@ -1,95 +0,0 @@
<chapter id="chapter-gnome">
<title>GNOME integration</title>
<para>
GStreamer is fairly easy to integrate with GNOME applications.
GStreamer uses libxml 2.0, GLib 2.0 and popt, as do all other
GNOME applications.
There are, however, some basic issues you need to address in your GNOME
applications.
</para>
<sect1>
<title>Command line options</title>
<para>
GNOME applications call gnome_program_init () to parse command-line
options and initialize the necessary gnome modules.
GStreamer applications normally call gst_init (&amp;argc, &amp;argv) to
do the same for GStreamer.
</para>
<para>
Each of these two swallows the program options passed to the program,
so we need a different way to allow both GNOME and GStreamer to parse
the command-line options. This is shown in the following example.
</para>
<programlisting>
/* example-begin gnome.c */
#include &lt;gnome.h&gt;
#include &lt;gst/gst.h&gt;

int
main (int argc, char **argv)
{
  GstPoptOption options[] = {
    { NULL, '\0', POPT_ARG_INCLUDE_TABLE, NULL, 0, "GStreamer", NULL },
    POPT_TABLEEND
  };
  GnomeProgram *program;
  poptContext context;
  const gchar **argvn;
  GstElement *pipeline;
  GstElement *src, *sink;

  options[0].arg = (void *) gst_init_get_popt_table ();

  g_print ("Calling gnome_program_init with the GStreamer popt table\n");
  /* gnome_program_init will initialize GStreamer now
   * as a side effect of having the GStreamer popt table passed. */
  if (! (program = gnome_program_init ("my_package", "0.1", LIBGNOMEUI_MODULE,
                                       argc, argv,
                                       GNOME_PARAM_POPT_TABLE, options,
                                       NULL)))
    g_error ("gnome_program_init failed");

  g_print ("Getting gnome-program popt context\n");
  g_object_get (program, "popt-context", &amp;context, NULL);
  argvn = poptGetArgs (context);
  if (!argvn) {
    g_print ("Run this example with some arguments to see how it works.\n");
    return 0;
  }

  g_print ("Printing rest of arguments\n");
  while (*argvn) {
    g_print ("argument: %s\n", *argvn);
    ++argvn;
  }

  /* do some GStreamer things to show everything's initialized properly */
  g_print ("Doing some GStreamer stuff to show that everything works\n");
  pipeline = gst_pipeline_new ("pipeline");
  src = gst_element_factory_make ("fakesrc", "src");
  sink = gst_element_factory_make ("fakesink", "sink");
  gst_bin_add_many (GST_BIN (pipeline), src, sink, NULL);
  gst_element_link (src, sink);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  gst_bin_iterate (GST_BIN (pipeline));
  gst_element_set_state (pipeline, GST_STATE_NULL);

  return 0;
}
/* example-end gnome.c */
</programlisting>
<para>
If you try out this program, you will see that when called with
--help, it will print out both GStreamer and GNOME help arguments.
All of the arguments that didn't belong to either end up in the
argvn pointer array.
</para>
<para>
FIXME: flesh this out more. How do we get the GStreamer arguments
at the end ?
FIXME: add a GConf bit.
</para>
</sect1>
</chapter>


@ -1,85 +0,0 @@
<chapter id="chapter-win32">
<title>Windows support</title>
<sect1 id="section-win32-build">
<title>Building <application>GStreamer</application> under Win32</title>
<para>There are several makefiles that can be used to build GStreamer with the usual Microsoft
compiling tools.</para>
<para>The Makefile is meant to be used with the GNU make program and the free
version of the Microsoft compiler (<ulink url="http://msdn.microsoft.com/visualc/vctoolkit2003/">http://msdn.microsoft.com/visualc/vctoolkit2003/</ulink>). You
have to modify your system environment variables to use it from the command line. You will also
need a working Platform SDK for Windows, which is available for free from Microsoft.</para>
<para>The projects/makefiles will automatically generate some source files needed to compile
GStreamer. This requires that some GNU tools are installed on your system and that they are
available in your PATH.</para>
<para>The GStreamer project depends on other libraries, namely:</para>
<itemizedlist>
<listitem><para>GLib</para></listitem>
<listitem><para>popt</para></listitem>
<listitem><para>libxml2</para></listitem>
<listitem><para>libintl</para></listitem>
<listitem><para>libiconv</para></listitem>
</itemizedlist>
<para>There is now a package that has all these dependencies built with MSVC7.1. It is available either as precompiled libraries
and headers in both Release and Debug mode, or as a source package to build yourself. You can
find it at <ulink url="http://mukoli.free.fr/gstreamer/deps/">http://mukoli.free.fr/gstreamer/deps/</ulink>.</para>
<note>
<title>Notes</title>
<para>The following GNU tools are needed; you can find them at <ulink url="http://gnuwin32.sourceforge.net/">http://gnuwin32.sourceforge.net/</ulink>:</para>
<itemizedlist>
<listitem><para>GNU flex (tested with 2.5.4)</para></listitem>
<listitem><para>GNU bison (tested with 1.35)</para></listitem>
</itemizedlist>
<para>and <ulink url="http://www.mingw.org/">http://www.mingw.org/</ulink></para>
<itemizedlist>
<listitem><para>GNU make (tested with 3.80)</para></listitem>
</itemizedlist>
<para>The files generated by the -auto makefiles will soon be available separately on the net,
for the convenience of people who do not want to install the GNU tools.</para>
</note>
</sect1>
<sect1 id="section-win32-install">
<title>Installation on the system</title>
<para>By default, GStreamer needs a registry. You have to generate it using "gst-register.exe". It will create
the file c:\gstreamer\registry.xml, which will hold all the plugins you can use.</para>
<para>You should install the GStreamer core in c:\gstreamer\bin and the plugins in c:\gstreamer\plugins. Both
directories should be added to your system PATH. The library dependencies should be installed in c:\usr.</para>
<para>For example, my current setup is:</para>
<itemizedlist>
<listitem><para><filename>c:\gstreamer\registry.xml</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gst-inspect.exe</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gst-launch.exe</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gst-register.exe</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstbytestream.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstelements.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstoptimalscheduler.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstspider.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\libgstreamer-0.8.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\plugins\gst-libs.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\plugins\gstmatroska.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\iconv.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\intl.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libglib-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libgmodule-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libgobject-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libgthread-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libxml2.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\popt.dll</filename></para></listitem>
</itemizedlist>
</sect1>
</chapter>


@ -1,198 +0,0 @@
<chapter id="chapter-autoplug">
<title>Autoplugging</title>
<para>
<application>GStreamer</application> provides an API to automatically
construct complex pipelines based on source and destination capabilities.
This feature is very useful if you want to convert type X to type Y but
don't care about the plugins needed to accomplish this task. The
autoplugger will consult the plugin repository, select and link the
elements needed for the conversion.
</para>
<para>
The autoplugger API is implemented in an abstract class. Autoplugger
implementations reside in plugins and are therefore optional and can be
optimized for a specific task. Two types of autopluggers exist: renderer
ones and non-renderer ones. The renderer autopluggers will not have any
source pads, while the non-renderer ones do. The renderer autopluggers are
mainly used for media playback, while the non-renderer ones are used for
arbitrary format conversion.
</para>
<sect1>
<title>Using autoplugging</title>
<para>
You first need to create a suitable autoplugger with gst_autoplug_factory_make().
The name of the autoplugger must be one of the registered autopluggers.
</para>
<para>
A list of all available autopluggers can be obtained with gst_autoplug_factory_get_list().
</para>
<para>
If the autoplugger supports the RENDERER API, use the
gst_autoplug_to_renderers() function to create a bin that links
the source caps to the specified render elements. You can then add
the bin to a pipeline and run it.
<programlisting>
GstAutoplug *autoplug;
GstElement *element;
GstElement *sink;

/* create a static autoplugger */
autoplug = gst_autoplug_factory_make ("staticrender");

/* create an osssink */
sink = gst_element_factory_make ("osssink", "our_sink");

/* create an element that can play audio/mp3 through osssink */
element = gst_autoplug_to_renderers (autoplug,
    gst_caps_new (
      "sink_audio_caps",
      "audio/mp3",
      NULL
    ),
    sink,
    NULL);

/* add the element to a bin and link the sink pad */
...
</programlisting>
</para>
<para>
If the autoplugger supports the CAPS API, use the gst_autoplug_to_caps()
function to link the source caps to the destination caps. The created
bin will have source and sink pads compatible with the provided caps.
<programlisting>
GstAutoplug *autoplug;
GstElement *element;

/* create a static autoplugger */
autoplug = gst_autoplug_factory_make ("static");

/* create an element that converts audio/mp3 to audio/raw */
element = gst_autoplug_to_caps (autoplug,
    gst_caps_new (
      "sink_audio_caps",
      "audio/mp3",
      NULL
    ),
    gst_caps_new (
      "src_audio_caps",
      "audio/raw",
      NULL
    ),
    NULL);

/* add the element to a bin and link the src/sink pads */
...
</programlisting>
</para>
</sect1>
<sect1 id="section-autoplug-cache">
<!-- FIXME: this is outdated, there is no GstAutoplugCache in gst-0.8.X -->
<title>Using the <classname>GstAutoplugCache</classname> element</title>
<para>
The <classname>GstAutoplugCache</classname> element is used to cache the
media stream when performing typedetection. As we have seen in
<xref linkend="chapter-typedetection"/>, the typefind function consumes a
buffer to determine its media type. After we have set up the pipeline
to play the media stream, we should be able to 'replay' the previous buffer(s).
This is what the autoplugcache is used for.
</para>
<para>
The basic usage pattern for the autoplugcache in combination with the typefind
element is like this:
<orderedlist>
<listitem>
<para>
Add the autoplugcache element to a bin and link the sink pad
to the source pad of an element with unknown caps.
</para>
</listitem>
<listitem>
<para>
Link the source pad of the autoplugcache to the sink pad of
the typefind element.
</para>
</listitem>
<listitem>
<para>
Iterate the pipeline until the typefind element has found a type.
</para>
</listitem>
<listitem>
<para>
Remove the typefind element and add the plugins needed to play
back the discovered media type to the autoplugcache source pad.
</para>
</listitem>
<listitem>
<para>
Reset the cache to start playback of the cached data. Connect to the
"cache_empty" signal.
</para>
</listitem>
<listitem>
<para>
In the cache_empty signal callback function, remove the autoplugcache and
relink the pads.
</para>
</listitem>
</orderedlist>
</para>
<para>
In the next chapter we will create a new version of our helloworld example using the
autoplugger, the autoplugcache and the typefind element.
</para>
</sect1>
<sect1 id="section-autoplugging-spider">
<title>Another approach to autoplugging</title>
<para>
The autoplug API is interesting, but often impractical. It is static;
it cannot deal with dynamic pipelines. An element that will
automatically figure out and decode the type is more useful.
Enter the spider.
</para>
<sect2>
<title>The spider element</title>
<para>
The spider element is a generalized autoplugging element. At this point (April 2002), it's
the best we've got; it can be inserted anywhere within a pipeline to perform caps
conversion, if possible. Consider the following gst-launch line:
<programlisting>
$ gst-launch filesrc location=my.mp3 ! spider ! osssink
</programlisting>
The spider will detect the type of the stream, autoplug it to the osssink's caps, and play
the pipeline. It's neat.
</para>
</sect2>
<sect2>
<title>Spider features</title>
<para>
<orderedlist>
<listitem>
<para>
Automatically typefinds the incoming stream.
</para>
</listitem>
<listitem>
<para>
Has request pads on the source side. This means that it can
autoplug one source stream into many sink streams. For example,
an MPEG1 system stream can have audio as well as video; that
pipeline would be represented in gst-launch syntax as
<programlisting>
$ gst-launch filesrc location=my.mpeg1 ! spider ! { queue ! osssink } spider.src_%d!
{ queue ! xvideosink }
</programlisting>
</para>
</listitem>
</orderedlist>
</para>
</sect2>
</sect1>
</chapter>


@ -1,205 +0,0 @@
<chapter id="chapter-bins-api">
<title>Bins</title>
<sect1 id="section-bin-create">
<title>Creating a bin</title>
<para>
Bins are created in the same way that other elements are created, i.e.
using an element factory or any of the associated convenience functions:
</para>
<programlisting>
GstElement *bin, *thread, *pipeline;

/* create a new bin called 'mybin'. this bin will be only for organizational
 * purposes; a normal GstBin doesn't affect plan generation */
bin = gst_element_factory_make ("bin", "mybin");

/* create a new thread, and give it a unique name */
thread = gst_element_factory_make ("thread", NULL);

/* the core bins (GstBin, GstThread, GstPipeline) also have convenience APIs,
 * gst_&lt;bintype&gt;_new (). these are equivalent to the
 * gst_element_factory_make () syntax. */
pipeline = gst_pipeline_new ("pipeline_name");
</programlisting>
</sect1>
<sect1 id="section-bin-adding">
<title>Adding elements to a bin</title>
<para>
Elements are added to a bin with the following code sample:
</para>
<programlisting>
GstElement *element;
GstElement *bin;
bin = gst_bin_new ("mybin");
element = gst_element_factory_make ("mad", "decoder");
gst_bin_add (GST_BIN (bin), element);
...
</programlisting>
<para>
Bins and threads can be added to other bins too. This allows you to create nested bins. Pipelines shouldn't be added to any other element, though.
They are toplevel bins and they are directly linked to the scheduler.
</para>
<para>
To get an element from the bin you can use:
</para>
<programlisting>
GstElement *element;
element = gst_bin_get_by_name (GST_BIN (bin), "decoder");
...
</programlisting>
<para>
You can see that the name of the element is very handy for
retrieving it from a bin. gst_bin_get_by_name () will recursively
search nested bins.
</para>
<para>
To get a list of elements in a bin, use:
</para>
<programlisting>
GList *elements;

elements = gst_bin_get_list (GST_BIN (bin));
while (elements) {
  GstElement *element = GST_ELEMENT (elements-&gt;data);

  g_print ("element in bin: %s\n", GST_OBJECT_NAME (GST_OBJECT (element)));
  elements = g_list_next (elements);
}
...
</programlisting>
<para>
To remove an element from a bin, use:
</para>
<programlisting>
GstElement *element;
gst_bin_remove (GST_BIN (bin), element);
...
</programlisting>
<para>
To add many elements to a bin at the same time, use the gst_bin_add_many
() function. Remember to pass NULL as the last argument.
</para>
<programlisting>
GstElement *filesrc, *decoder, *audiosink;
GstBin *bin;
/* instantiate the elements and the bins... */
gst_bin_add_many (bin, filesrc, decoder, audiosink, NULL);
</programlisting>
</sect1>
<sect1 id="section-bin-custom">
<title>Custom bins</title>
<para>
The application programmer can create custom bins packed with elements
to perform a specific task. This allows you to write an MPEG audio
decoder with just the following lines of code:
</para>
<programlisting>
/* create the mp3player element */
GstElement *mp3player = gst_element_factory_make ("mp3player", "mp3player");
/* set the source mp3 audio file */
g_object_set (G_OBJECT (mp3player), "location", "helloworld.mp3", NULL);
/* start playback */
gst_element_set_state (GST_ELEMENT (mp3player), GST_STATE_PLAYING);
...
/* pause playback */
gst_element_set_state (GST_ELEMENT (mp3player), GST_STATE_PAUSED);
...
/* stop */
gst_element_set_state (GST_ELEMENT (mp3player), GST_STATE_NULL);
</programlisting>
<para>
Note that the above code assumes that the mp3player bin derives itself
from a <ulink type="http"
url="../../gstreamer/html/GstThread.html"><classname>GstThread</classname></ulink>, which begins to play as soon
as its state is set to PLAYING. Other bin types may need explicit
iteration. For more information, see <xref linkend="chapter-threads"/>.
</para>
<para>
Custom bins can be created with a plugin or an XML description. You
will find more information about creating custom bins in the Plugin
Writer's Guide (FIXME ref).
</para>
</sect1>
<sect1 id="section-bin-ghostpads">
<title>Ghost pads</title>
<para>
You can see in <xref linkend="section-bin-noghost-img"/> that a bin has no pads of its own.
This is where "ghost pads" come into play.
</para>
<figure float="1" id="section-bin-noghost-img">
<title>Visualisation of a <ulink type="http"
url="../../gstreamer/html/GstBin.html"><classname>GstBin</classname></ulink> element without ghost pads</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/bin-element-noghost.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
A ghost pad is a pad from some element in the bin that has been promoted to the bin.
This way, the bin also has a pad. The bin becomes just another element with a pad and
you can then use the bin just like any other element. This is a very important feature
for creating custom bins.
</para>
<figure float="1" id="section-bin-ghost-img">
<title>Visualisation of a <ulink type="http"
url="../../gstreamer/html/GstBin.html"><classname>GstBin</classname></ulink> element with a ghost pad</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/bin-element-ghost.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
<xref linkend="section-bin-ghost-img"/>
is a representation of a ghost pad. The sink pad of element one is now also a pad
of the bin.
</para>
<para>
Ghost pads can actually be added to all <ulink type="http"
url="../../gstreamer/html/GstElement.html"><classname>GstElement</classname></ulink>s and not just
<ulink type="http" url="../../gstreamer/html/GstBin.html"><classname>GstBin</classname></ulink>s. Use the following code example to add a ghost pad to a bin:
</para>
<programlisting>
GstElement *bin;
GstElement *element;
element = gst_element_factory_make ("mad", "decoder");
bin = gst_bin_new ("mybin");
gst_bin_add (GST_BIN (bin), element);
gst_element_add_ghost_pad (bin, gst_element_get_pad (element, "sink"), "sink");
</programlisting>
<para>
In the above example, the bin now also has a pad: the pad called 'sink'
of the given element.
</para>
<para>
We can now, for example, link the source pad of a filesrc element
to the bin with:
</para>
<programlisting>
GstElement *filesrc;
filesrc = gst_element_factory_make ("filesrc", "disk_reader");
gst_element_link_pads (filesrc, "src", bin, "sink");
...
</programlisting>
</sect1>
</chapter>


@ -1,49 +0,0 @@
<chapter id="chapter-bins">
<title>Bins</title>
<para>
A bin is a container element. You can add elements to a bin. Since a bin is
an element itself, it can also be added to another bin.
</para>
<para>
Bins allow you to combine a group of linked elements into one logical element. You do
not deal with the individual elements anymore but with just one element, the bin.
We will see that this is extremely powerful when you are going to construct
complex pipelines since it allows you to break up the pipeline in smaller chunks.
</para>
<para>
The bin will also manage the elements contained in it. It will figure out how
the data will flow in the bin and generate an optimal plan for that data flow. Plan
generation is one of the most complicated procedures in GStreamer.
</para>
<figure float="1" id="section-bin-img">
<title>Visualisation of a bin with some elements in it</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/bin-element.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
There are two specialized bins available to the GStreamer programmer
(a short creation sketch follows this list):
<itemizedlist>
<listitem>
<para>
a pipeline: a generic container that allows scheduling of the
containing elements. The toplevel bin has to be a pipeline.
Every application thus needs at least one of these.
</para>
</listitem>
<listitem>
<para>
a thread: a bin that will be run in a separate execution thread.
You will have to use this bin if you have to carefully
synchronize audio and video, or for buffering. You will learn
more about threads in <xref linkend="chapter-threads"/>.
</para>
</listitem>
</itemizedlist>
</para>
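<para>
A short sketch creating both specialized bins (the thread is created
through the element factory, just like in the bin examples elsewhere in
this manual):
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

GstElement *pipeline, *thread;

/* the toplevel bin of the application */
pipeline = gst_pipeline_new ("my_pipeline");

/* a bin whose contents will run in their own thread */
thread = gst_element_factory_make ("thread", "my_thread");

/* a thread is itself an element, so it can be added to the pipeline */
gst_bin_add (GST_BIN (pipeline), thread);
</programlisting>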
</chapter>


@ -1,6 +0,0 @@
<chapter id="chapter-buffers-api">
<title>Buffers</title>
<para>
</para>
</chapter>


@ -1,66 +0,0 @@
<chapter id="chapter-buffers">
<title>Buffers</title>
<para>
Buffers contain the data that will flow through the pipeline you have
created. A source element will typically create a new buffer and pass
it through a pad to the next element in the chain. When using the
GStreamer infrastructure to create a media pipeline you will not have
to deal with buffers yourself; the elements will do that for you.
</para>
<para>
A buffer consists of:
<itemizedlist>
<listitem>
<para>
a pointer to a piece of memory.
</para>
</listitem>
<listitem>
<para>
the size of the memory.
</para>
</listitem>
<listitem>
<para>
a timestamp for the buffer.
</para>
</listitem>
<listitem>
<para>
A refcount that indicates how many elements are using this
buffer. This refcount will be used to destroy the buffer when no
element has a reference to it.
</para>
</listitem>
</itemizedlist>
</para>
<para>
<!-- FIXME: this is outdated, there is no GstBufferPool in gst-0.8.X -->
GStreamer provides functions to create custom buffer create/destroy algorithms, called
a <classname>GstBufferPool</classname>. This makes it possible to efficiently
allocate and destroy buffer memory. It also makes it possible to exchange memory between
elements by passing the <classname>GstBufferPool</classname>. A video element can,
for example, create a custom buffer allocation algorithm that creates buffers with XSHM
as the buffer memory. An element can use this algorithm to create and fill the buffer
with data.
</para>
<para>
The simple case is that a buffer is created, memory allocated, data put
in it, and passed to the next element. That element reads the data, does
something (like creating a new buffer and decoding into it), and
unreferences the buffer. This causes the data to be freed and the buffer
to be destroyed. A typical MPEG audio decoder works like this.
</para>
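<para>
A sketch of that simple case on the producing side (the buffer size and
the cast in the push call are assumptions for a 0.8-style
<function>gst_pad_push ()</function> that takes a
<classname>GstData</classname> pointer):
</para>
<programlisting>
#include &lt;string.h&gt;
#include &lt;gst/gst.h&gt;

static void
push_some_data (GstPad *srcpad)
{
  GstBuffer *buffer;

  /* create a buffer with freshly allocated memory */
  buffer = gst_buffer_new_and_alloc (1024);

  /* fill the memory and set a timestamp */
  memset (GST_BUFFER_DATA (buffer), 0, GST_BUFFER_SIZE (buffer));
  GST_BUFFER_TIMESTAMP (buffer) = 0;

  /* pass the buffer to the next element; the receiving element unrefs it
   * when it is done, which eventually frees the memory */
  gst_pad_push (srcpad, GST_DATA (buffer));
}
</programlisting>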
<para>
A more complex case is when the filter modifies the data in place. It
does so and simply passes on the buffer to the next element. This is just
as easy to deal with. An element that works in place has to be careful when
the buffer is used in more than one element; a copy-on-write has to be made in this
situation.
</para>
</chapter>


@ -1,5 +0,0 @@
<chapter id="chapter-clocks">
<title>Clocks in GStreamer</title>
<para>
</para>
</chapter>


@ -1,37 +0,0 @@
<chapter id="chapter-components">
<title>Components</title>
<para>
FIXME: This chapter is way out of date.
</para>
<para>
<application>GStreamer</application> includes components that people can include
in their programs.
</para>
<sect1 id="section-components-gst-play">
<title>GstPlay</title>
<para>
GstPlay is a GtkWidget with a simple API to play, pause and stop a media file.
</para>
</sect1>
<sect1 id="section-components-gst-media-play">
<title>GstMediaPlay</title>
<para>
GstMediaPlay is a complete player widget.
</para>
</sect1>
<sect1 id="section-components-gst-editor">
<title>GstEditor</title>
<para>
GstEditor is a set of widgets to display a graphical representation of a
pipeline.
</para>
</sect1>
</chapter>


@ -1,130 +0,0 @@
<chapter id="chapter-cothreads">
<title>Cothreads</title>
<para>
Cothreads are user-space threads that greatly reduce the context-switching overhead introduced by
regular kernel threads. Cothreads are also used to handle the more complex elements. They differ
from other user-space threading libraries in that they are scheduled explicitly by GStreamer.
</para>
<para>
A cothread is created by a <ulink type="http"
url="../../gstreamer/html/GstBin.html"><classname>GstBin</classname></ulink>
whenever an element is found
inside the bin that has one or more of the following properties:
<itemizedlist>
<listitem>
<para>
The element is loop-based instead of chain-based
</para>
</listitem>
<listitem>
<para>
The element has multiple input pads
</para>
</listitem>
<listitem>
<para>
The element has the MULTI_IN flag set
</para>
</listitem>
</itemizedlist>
The <ulink type="http" url="../../gstreamer/html/GstBin.html"><classname>GstBin
</classname></ulink> will create a cothread context for all the elements
in the bin so that the elements will interact in cooperative
multithreading.
</para>
<para>
Before proceeding to the concept of loop-based elements, we will first
explain chain-based elements.
</para>
<sect1 id="section-chain-based">
<title>Chain-based elements</title>
<para>
Chain-based elements receive a buffer of data, are expected to process
that data, and then perform a gst_pad_push.
</para>
<para>
The basic chain function of a chain-based element looks like this:
</para>
<programlisting>
static void
chain_function (GstPad *pad, GstBuffer *buffer)
{
  GstBuffer *outbuffer;

  ....
  // process the buffer, create a new outbuffer
  ...

  gst_pad_push (srcpad, outbuffer);
}
</programlisting>
<para>
Chain-based functions are mainly used for elements that have a one-to-one
relation between their input and output behaviour. An example of such an
element is a simple video blur filter: the filter takes a buffer in, performs
the blur operation on it and sends out the resulting buffer.
</para>
<para>
Another example is a volume filter: it takes audio samples as
input, applies the volume effect and sends out the resulting buffer.
</para>
</sect1>
<sect1 id="section-loop-based">
<title>Loop-based elements</title>
<para>
As opposed to chain-based elements, loop-based elements enter an
infinite loop that looks like this:
<programlisting>
GstBuffer *buffer, *outbuffer;

while (1) {
  buffer = gst_pad_pull (sinkpad);
  ...
  // process buffer, create outbuffer
  while (!done) {
    ....
    // optionally request another buffer
    buffer = gst_pad_pull (sinkpad);
    ....
  }
  ...
  gst_pad_push (srcpad, outbuffer);
}
</programlisting>
The loop-based elements request a buffer whenever they need one.
</para>
<para>
When the request for a buffer cannot be immediately satisfied, the control
will be given to the source element of the loop-based element until it
performs a push on its source pad. At that time the control is handed
back to the loop-based element, etc... The execution trace can get
fairly complex using cothreads when there are multiple input/output
pads for the loop-based element. Cothread switches are performed within
the call to gst_pad_pull and gst_pad_push; from the perspective of
the loop-based element, it just "appears" that gst_pad_push (or _pull)
might take a long time to return.
</para>
<para>
Loop based elements are mainly used for the more complex elements
that need a specific amount of data before they can start to produce
output. An example of such an element is the MPEG video decoder. The
element will pull a buffer, perform some decoding on it and optionally
request more buffers to decode, and when a complete video frame has
been decoded, a buffer is sent out. For example, any plugin using the
bytestream library will need to be loop-based.
</para>
<para>
There is no problem in putting cothreaded elements into a <ulink
type="http" url="../../gstreamer/html/GstThread.html"><classname>GstThread
</classname></ulink> to
create even more complex pipelines with both user and kernel space threads.
</para>
</sect1>
</chapter>

View file

@ -1,152 +0,0 @@
<chapter id="chapter-debugging">
<title>Debugging</title>
<para>
GStreamer has an extensive set of debugging tools for
plugin developers.
</para>
<sect1 id="section-debugging-command-line">
<title>Command line options</title>
<para>
Applications using the GStreamer libraries accept the following set
of command line argruments that help in debugging.
</para>
<para>
<itemizedlist>
<listitem>
<para>
<option>--gst-debug-help</option>
Print available debug categories and exit
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug-level=<replaceable>LEVEL</replaceable></option>
Sets the default debug level from 0 (no output) to 5 (everything)
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug=<replaceable>LIST</replaceable></option>
Comma-separated list of category_name:level pairs to set specific
levels for the individual categories.
Example: GST_AUTOPLUG:5,GST_ELEMENT_*:3
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug-no-color</option>
Disable color debugging output
</para>
</listitem>
<listitem>
<para>
<option>--gst-debug-disable</option>
Disable debugging
</para>
</listitem>
<listitem>
<para>
<option>--gst-plugin-spew</option>
Enable printout of errors while loading GStreamer plugins.
</para>
</listitem>
</itemizedlist>
</para>
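    <para>
      For example, a (hypothetical) application could be started with full
      debug output for the autoplugger and moderate output for all elements
      like this:
    </para>
    <programlisting>
myapplication --gst-debug=GST_AUTOPLUG:5,GST_ELEMENT_*:3
    </programlisting>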
</sect1>
<sect1 id="section-debugging-adding">
<title>Adding debugging to a plugin</title>
    <para>
      Plugins can define their own categories for the debugging system.
      Three things need to happen (a short sketch combining them follows
      the list):
<itemizedlist>
<listitem>
        <para>
          The debug category variable needs to be defined somewhere.
          If you only have one source file, you can use GST_DEBUG_CATEGORY_STATIC to
          define a static debug category variable.
        </para>
<para>
If you have multiple source files, you should define the variable using
GST_DEBUG_CATEGORY in the source file where you're initializing the debug
category. The other source files should use GST_DEBUG_CATEGORY_EXTERN to
declare the debug category variable, possibly by including a common header
that has this statement.
</para>
</listitem>
<listitem>
<para>
The debugging category needs to be initialized. This is done through
GST_DEBUG_CATEGORY_INIT.
If you're using a global debugging category for the complete plugin,
you can call this in the
plugin's <function>plugin_init</function>.
If the debug category is only used for one of the elements, you can call it
from the element's <function>_class_init</function> function.
</para>
</listitem>
<listitem>
<para>
You should also define a default category to be used for debugging. This is
done by defining GST_CAT_DEFAULT for the source files where you're using
debug macros.
</para>
</listitem>
</itemizedlist>
</para>
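    <para>
      As a sketch, the three pieces could look like this for a hypothetical
      plugin; the variable name, the category name "myplugin" and its
      description are made up for the example:
    </para>
    <programlisting>
GST_DEBUG_CATEGORY_STATIC (myplugin_debug);
#define GST_CAT_DEFAULT myplugin_debug

static gboolean
plugin_init (GstPlugin *plugin)
{
  GST_DEBUG_CATEGORY_INIT (myplugin_debug, "myplugin", 0,
      "hypothetical example plugin");
  /* ... register the plugin's elements here ... */
  return TRUE;
}
    </programlisting>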
<para>
Elements can then log debugging information using the set of macros. There
are five levels of debugging information:
<orderedlist>
<listitem>
<para>ERROR for fatal errors (for example, internal errors)</para>
</listitem>
<listitem>
<para>WARNING for warnings</para>
</listitem>
<listitem>
<para>INFO for normal information</para>
</listitem>
<listitem>
<para>DEBUG for debug information (for example, device parameters)</para>
</listitem>
<listitem>
<para>LOG for regular operation information (for example, chain handlers)</para>
</listitem>
</orderedlist>
</para>
    <para>
      For each of these levels, there are four macros to log debugging
      information; a short usage sketch follows the list.
      Taking the LOG level as an example, there are:
<itemizedlist>
<listitem>
<para>
GST_CAT_LOG_OBJECT logs debug information in the given GstCategory
and for the given GstObject
</para>
</listitem>
<listitem>
<para>
GST_CAT_LOG logs debug information in the given GstCategory
but without a GstObject (this is useful for libraries, for example)
</para>
</listitem>
<listitem>
<para>
GST_LOG_OBJECT logs debug information in the default GST_CAT_DEFAULT
category (as defined somewhere in the source), for the given GstObject
</para>
</listitem>
<listitem>
<para>
GST_LOG logs debug information in the default GST_CAT_DEFAULT
category, without a GstObject
</para>
</listitem>
</itemizedlist>
</para>
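    <para>
      As an illustration, logging from within an element's processing
      function could look like the sketch below; the filter object, the
      buffer and the my_category category are placeholders:
    </para>
    <programlisting>
GST_LOG_OBJECT (filter, "got buffer of %u bytes",
    GST_BUFFER_SIZE (buffer));
GST_CAT_LOG (my_category, "helper routine entered");
    </programlisting>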
</sect1>
</chapter>

<chapter id="chapter-dparams">
<title>Dynamic Parameters</title>
<sect1 id="section-dparams-getting-started">
<title>Getting Started</title>
<para>
The Dynamic Parameters subsystem is contained within the
<filename>gstcontrol</filename> library.
You need to include the header in your application's source file:
</para>
<programlisting>
...
#include &lt;gst/gst.h&gt;
#include &lt;gst/control/control.h&gt;
...
</programlisting>
<para>
Your application should link to the shared library <filename>gstcontrol</filename>.
</para>
    <para>
      The <filename>gstcontrol</filename> library needs to be initialized
      when your application is run. This can be done after the GStreamer
      library has been initialized.
    </para>
<programlisting>
...
gst_init(&amp;argc,&amp;argv);
gst_control_init(&amp;argc,&amp;argv);
...
</programlisting>
</sect1>
<sect1 id="section-dparams-creating">
<title>Creating and Attaching Dynamic Parameters</title>
<para>
Once you have created your elements you can create and attach dparams to them.
First you need to get the element's dparams manager. If you know exactly what kind of element
you have, you may be able to get the dparams manager directly. However if this is not possible,
you can get the dparams manager by calling <filename>gst_dpman_get_manager</filename>.
</para>
<para>
Once you have the dparams manager, you must set the mode that the manager will run in.
There is currently only one mode implemented called <filename>"synchronous"</filename> - this is used for real-time
applications where the dparam value cannot be known ahead of time (such as a slider in a GUI).
The mode is called <filename>"synchronous"</filename> because the dparams are polled by the element for changes before
each buffer is processed. Another yet-to-be-implemented mode is <filename>"asynchronous"</filename>. This is used when
parameter changes are known ahead of time - such as with a timelined editor. The mode is called
<filename>"asynchronous"</filename> because parameter changes may happen in the middle of a buffer being processed.
</para>
<programlisting>
GstElement *sinesrc;
GstDParamManager *dpman;
...
sinesrc = gst_element_factory_make("sinesrc","sine-source");
...
dpman = gst_dpman_get_manager (sinesrc);
gst_dpman_set_mode(dpman, "synchronous");
</programlisting>
    <para>
      If you don't know the names of the required dparams for your element, you can call
      <filename>gst_dpman_list_dparam_specs(dpman)</filename> to get a NULL-terminated array of param specs.
      This array should be freed after use. You can find the name of each required dparam by calling
      <filename>g_param_spec_get_name</filename> on each param spec in the array, as the sketch
      below illustrates. In our example,
      <filename>"volume"</filename> will be the name of our required dparam.
    </para>
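    <para>
      A minimal sketch of listing the required dparams, based on the functions
      described above; freeing the array with a plain
      <filename>g_free</filename> is an assumption made for this example:
    </para>
    <programlisting>
GParamSpec **specs;
gint i;

specs = gst_dpman_list_dparam_specs (dpman);
for (i = 0; specs[i] != NULL; i++) {
  g_print ("required dparam: %s\n", g_param_spec_get_name (specs[i]));
}
g_free (specs);
    </programlisting>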
<para>
Each type of dparam currently has its own <filename>new</filename> function. This may eventually
be replaced by a factory method for creating new instances. A default dparam instance can be created
with the <filename>gst_dparam_new</filename> function. Once it is created it can be attached to a
required dparam in the element.
</para>
<programlisting>
GstDParam *volume;
...
volume = gst_dparam_new(G_TYPE_DOUBLE);
if (gst_dpman_attach_dparam (dpman, "volume", volume)){
/* the dparam was successfully attached */
...
}
</programlisting>
</sect1>
<sect1 id="section-dparams-changing">
<title>Changing Dynamic Parameter Values</title>
<para>
All interaction with dparams to actually set the dparam value is done through simple GObject properties.
There is a property value for each type that dparams supports - these currently being
<filename>"value_double"</filename>, <filename>"value_float"</filename>, <filename>"value_int"</filename> and <filename>"value_int64"</filename>.
To set the value of a dparam, simply set the property which matches the type of your dparam instance.
</para>
<programlisting>
#define ZERO(mem) memset(&amp;mem, 0, sizeof(mem))
...
gdouble set_to_value;
GstDParam *volume;
GValue set_val;
ZERO(set_val);
g_value_init(&amp;set_val, G_TYPE_DOUBLE);
...
g_value_set_double(&amp;set_val, set_to_value);
g_object_set_property(G_OBJECT(volume), "value_double", &amp;set_val);
</programlisting>
    <para>Or, if you allocate the GValue on the heap instead:</para>
<programlisting>
gdouble set_to_value;
GstDParam *volume;
GValue *set_val;
set_val = g_new0(GValue,1);
g_value_init(set_val, G_TYPE_DOUBLE);
...
g_value_set_double(set_val, set_to_value);
g_object_set_property(G_OBJECT(volume), "value_double", set_val);
</programlisting>
</sect1>
<sect1 id="section-dparams-types">
<title>Different Types of Dynamic Parameter</title>
    <para>
      There are currently only two dparam implementations. Both are intended for real-time use and
      should be run in the <filename>"synchronous"</filename> mode.
    </para>
<sect2>
<title>GstDParam - the base dparam type</title>
<para>
All dparam implementations will subclass from this type. It provides a basic implementation which simply
propagates any value changes as soon as it can.
A new instance can be created with the function <filename>GstDParam* gst_dparam_new (GType type)</filename>.
It has the following object properties:
</para>
<itemizedlist>
<listitem><para><filename>"value_double"</filename>
- the property to set and get if it is a double dparam
</para></listitem>
<listitem><para><filename>"value_float"</filename>
- the property to set and get if it is a float dparam
</para></listitem>
<listitem><para><filename>"value_int"</filename>
- the property to set and get if it is an integer dparam
</para></listitem>
<listitem><para><filename>"value_int64"</filename>
- the property to set and get if it is a 64 bit integer dparam
</para></listitem>
<listitem><para><filename>"is_log"</filename>
- readonly boolean which is TRUE if the param should be displayed on a log scale
</para></listitem>
<listitem><para><filename>"is_rate"</filename>
- readonly boolean which is TRUE if the value is a proportion of the sample rate.
For example with a sample rate of 44100, 0.5 would be 22050 Hz and 0.25 would be 11025 Hz.
</para></listitem>
</itemizedlist>
</sect2>
<sect2>
<title>GstDParamSmooth - smoothing real-time dparam</title>
      <para>
        Some parameter changes can create audible artifacts if they change too rapidly. The GstDParamSmooth
        implementation can greatly reduce these artifacts by limiting the rate at which the value can change.
        This is currently only supported for double and float dparams - the other types fall back to the default implementation.
        A new instance can be created with the function <filename>GstDParam* gst_dpsmooth_new (GType type)</filename>.
        It has the following object properties (a short creation sketch follows the list):
      </para>
<itemizedlist>
      <listitem><para><filename>"update_period"</filename>
        - an int64 value specifying the number of nanoseconds between updates. This will be ignored in
          <filename>"synchronous"</filename> mode since the buffer size dictates the update period.
      </para></listitem>
<listitem><para><filename>"slope_time"</filename>
- an int64 value specifying the time period to use in the maximum slope calculation
</para></listitem>
<listitem><para><filename>"slope_delta_double"</filename>
- a double specifying the amount a double value can change in the given slope_time.
</para></listitem>
<listitem><para><filename>"slope_delta_float"</filename>
- a float specifying the amount a float value can change in the given slope_time.
</para></listitem>
</itemizedlist>
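      <para>
        As a sketch, creating a smoothed volume dparam and limiting how fast it
        may change could look like this; the concrete numbers, and the
        assumption that <filename>"slope_time"</filename> is expressed in
        nanoseconds, are made up for the example:
      </para>
      <programlisting>
GstDParam *volume;

volume = gst_dpsmooth_new (G_TYPE_DOUBLE);

/* allow the value to change by at most 0.2 over a 200 millisecond
 * slope_time (example values, slope_time assumed to be in nanoseconds) */
g_object_set (G_OBJECT (volume), "slope_time", (gint64) 200000000,
    "slope_delta_double", 0.2, NULL);
      </programlisting>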
<para>
Audible artifacts may not be completely eliminated by using this dparam. The only way to eliminate
artifacts such as "zipper noise" would be for the element to implement its required dparams using the
array method. This would allow dparams to change parameters at the sample rate which should eliminate
any artifacts.
</para>
</sect2>
<sect2>
<title>Timelined dparams</title>
<para>
A yet-to-be-implemented subclass of GstDParam will add an API which allows the creation and manipulation
of points on a timeline. This subclass will also provide a dparam implementation which uses linear
interpolation between these points to find the dparam value at any given time. Further subclasses can
extend this functionality to implement more exotic interpolation algorithms such as splines.
</para>
</sect2>
</sect1>
</chapter>

<chapter id="chapter-dynamic">
<title>Dynamic pipelines</title>
<para>
In this chapter we will see how you can create a dynamic pipeline. A
dynamic pipeline is a pipeline that is updated or created while data
is flowing through it. We will create a partial pipeline first and add
more elements while the pipeline is playing. Dynamic pipelines cause
all sorts of scheduling issues and will remain a topic of research for
a long time in GStreamer.
</para>
  <para>
    We will show how to create an MPEG1 video player using dynamic pipelines.
    As you have seen in the pad section, we can attach a signal handler to an
    element so that we are notified when it creates a new pad. We will use
    this to create our MPEG1 player.
  </para>
  <para>
    The complete program looks like this:
  </para>
<programlisting>
/* example-begin dynamic.c */
#include &lt;string.h&gt;
#include &lt;stdlib.h&gt;   /* for exit () */
#include &lt;gst/gst.h&gt;
void
eof (GstElement *src)
{
g_print ("have eos, quitting\n");
exit (0);
}
gboolean
idle_func (gpointer data)
{
gst_bin_iterate (GST_BIN (data));
return TRUE;
}
void
new_pad_created (GstElement *parse, GstPad *pad, GstElement *pipeline)
{
GstElement *decode_video = NULL;
GstElement *decode_audio, *play, *color, *show;
GstElement *audio_queue, *video_queue;
GstElement *audio_thread, *video_thread;
g_print ("***** a new pad %s was created\n", gst_pad_get_name (pad));
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PAUSED);
/* link to audio pad */
if (strncmp (gst_pad_get_name (pad), "audio_", 6) == 0) {
/* construct internal pipeline elements */
decode_audio = gst_element_factory_make ("mad", "decode_audio");
g_return_if_fail (decode_audio != NULL);
play = gst_element_factory_make ("osssink", "play_audio");
g_return_if_fail (play != NULL);
/* create the thread and pack stuff into it */
audio_thread = gst_thread_new ("audio_thread");
g_return_if_fail (audio_thread != NULL);
/* construct queue and link everything in the main pipeline */
audio_queue = gst_element_factory_make ("queue", "audio_queue");
g_return_if_fail (audio_queue != NULL);
gst_bin_add_many (GST_BIN (audio_thread),
audio_queue, decode_audio, play, NULL);
/* set up pad links */
gst_element_add_ghost_pad (audio_thread,
gst_element_get_pad (audio_queue, "sink"),
"sink");
gst_element_link (audio_queue, decode_audio);
gst_element_link (decode_audio, play);
gst_bin_add (GST_BIN (pipeline), audio_thread);
gst_pad_link (pad, gst_element_get_pad (audio_thread, "sink"));
/* set up thread state and kick things off */
g_print ("setting to READY state\n");
gst_element_set_state (GST_ELEMENT (audio_thread), GST_STATE_READY);
}
else if (strncmp (gst_pad_get_name (pad), "video_", 6) == 0) {
/* construct internal pipeline elements */
decode_video = gst_element_factory_make ("mpeg2dec", "decode_video");
g_return_if_fail (decode_video != NULL);
color = gst_element_factory_make ("colorspace", "color");
g_return_if_fail (color != NULL);
show = gst_element_factory_make ("xvideosink", "show");
g_return_if_fail (show != NULL);
/* construct queue and link everything in the main pipeline */
video_queue = gst_element_factory_make ("queue", "video_queue");
g_return_if_fail (video_queue != NULL);
/* create the thread and pack stuff into it */
video_thread = gst_thread_new ("video_thread");
g_return_if_fail (video_thread != NULL);
gst_bin_add_many (GST_BIN (video_thread), video_queue,
decode_video, color, show, NULL);
/* set up pad links */
gst_element_add_ghost_pad (video_thread,
gst_element_get_pad (video_queue, "sink"),
"sink");
gst_element_link (video_queue, decode_video);
gst_element_link_many (decode_video, color, show, NULL);
gst_bin_add (GST_BIN (pipeline), video_thread);
gst_pad_link (pad, gst_element_get_pad (video_thread, "sink"));
/* set up thread state and kick things off */
g_print ("setting to READY state\n");
gst_element_set_state (GST_ELEMENT (video_thread), GST_STATE_READY);
}
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
}
int
main (int argc, char *argv[])
{
GstElement *pipeline, *src, *demux;
gst_init (&amp;argc, &amp;argv);
pipeline = gst_pipeline_new ("pipeline");
g_return_val_if_fail (pipeline != NULL, -1);
src = gst_element_factory_make ("filesrc", "src");
g_return_val_if_fail (src != NULL, -1);
if (argc &lt; 2)
g_error ("Please specify a video file to play !");
g_object_set (G_OBJECT (src), "location", argv[1], NULL);
demux = gst_element_factory_make ("mpegdemux", "demux");
g_return_val_if_fail (demux != NULL, -1);
gst_bin_add_many (GST_BIN (pipeline), src, demux, NULL);
g_signal_connect (G_OBJECT (demux), "new_pad",
G_CALLBACK (new_pad_created), pipeline);
g_signal_connect (G_OBJECT (src), "eos",
G_CALLBACK (eof), NULL);
gst_element_link (src, demux);
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
g_idle_add (idle_func, pipeline);
gst_main ();
return 0;
}
/* example-end dynamic.c */
</programlisting>
  <para>
    We create two elements: a file source and an MPEG demuxer.
    There is nothing special about this piece of code except for
    the 'new_pad' signal handler that we attached to the mpegdemux
    element using:
  </para>
<programlisting>
g_signal_connect (G_OBJECT (demux), "new_pad",
G_CALLBACK (new_pad_created), pipeline);
</programlisting>
<para>
When an elementary stream has been detected in the system stream,
mpegdemux will create a new pad that will provide the data of the
elementary stream. A function 'new_pad_created' will be called when
the pad is created.
</para>
  <para>
    In the above example, we created the new elements based on the name of
    the newly created pad and added them to a new thread. Instead of the pad
    name, you could also inspect the type of the pad, for example by looking
    at its MIME type and capabilities.
  </para>
</chapter>

<chapter id="chapter-elements-api">
<title>Elements</title>
<sect1 id="section-elements-create">
<title>Creating a GstElement</title>
<para>
The simplest way to create an element is to use
<ulink type="http"
url="&URLAPI;GstElementFactory.html#gst-element-factory-make">
<function>gst_element_factory_make</function>
</ulink>.
This function takes a factory name and an element name for the newly created
element.
The name of the
element is something you can use later on to look up the element in
a bin, for example. You can pass <symbol>NULL</symbol> as the name
argument to get a unique, default name.
</para>
<para>
When you don't need the element anymore, you need to unref it using
<ulink type="http"
url="&URLAPI;GstObject.html#gst-object-unref">
<function>gst_object_unref</function></ulink>.
This decreases the reference count for the element by 1. An element has a
refcount of 1 when it gets created. An element gets destroyed completely
when the refcount is decreased to 0.
</para>
<para>
The following example &EXAFOOT; shows how to create an element named
<emphasis>source</emphasis> from the element factory named
<emphasis>fakesrc</emphasis>. It checks if the creation succeeded.
After checking, it unrefs the element.
</para>
<programlisting>
<![CDATA[
/* example-begin elementmake.c */
#include <gst/gst.h>
int
main (int argc, char *argv[])
{
GstElement *element;
gst_init (&argc, &argv);
element = gst_element_factory_make ("fakesrc", "source");
if (!element) {
g_error ("Could not create an element from 'fakesrc' factory.\n");
}
gst_object_unref (GST_OBJECT (element));
return 0;
}
/* example-end elementmake.c */
]]>
</programlisting>
<para>
<function>gst_element_factory_make</function> is actually a shorthand
for a combination of two functions.
A
<ulink type="http"
url="&URLAPI;GstElement.html"><classname>GstElement</classname></ulink>
object is created from a factory.
To create the element, you have to get access to a
<ulink type="http" url="&URLAPI;GstElementFactory.html">
<classname>GstElementFactory</classname></ulink>
object using a unique factory name.
This is done with
<ulink type="http"
url="&URLAPI;GstElementFactory.html#gst-element-factory-find">
<function>gst_element_factory_find</function></ulink>.
</para>
<para>
The following code fragment is used to get a factory that can be used
to create the <emphasis>fakesrc</emphasis> element, a fake data source.
</para>
<programlisting>
GstElementFactory *factory;
factory = gst_element_factory_find ("fakesrc");
</programlisting>
<para>
Once you have the handle to the element factory, you can create a
real element with the following code fragment:
</para>
<programlisting>
GstElement *element;
element = gst_element_factory_create (factory, "source");
</programlisting>
<para>
<ulink type="http"
url="&URLAPI;GstElementFactory.html#gst-element-factory-create">
<function>gst_element_factory_create</function></ulink>
will use the element factory to create an element with the given name.
</para>
</sect1>
<sect1 id="section-elements-properties">
<title>GstElement properties</title>
<para>
A <ulink type="http" url="&URLAPI;GstElement.html">
<classname>GstElement</classname></ulink> can have several properties
which are implemented using standard <classname>GObject</classname>
properties. The usual <classname>GObject</classname> methods to query,
set and get property values and <classname>GParamSpecs</classname>
are therefore supported.
</para>
<para>
      Every <ulink type="http" url="&URLAPI;GstElement.html">
      <classname>GstElement</classname></ulink> inherits at least
one property of its parent <classname>GstObject</classname>:
the "name" property. This is the name you provide to the
functions <function>gst_element_factory_make</function> or
<function>gst_element_factory_create</function>. You can get and set
this property using the functions
<function>gst_object_set_name</function>
and <function>gst_object_get_name</function> or use the
<classname>GObject</classname> property mechanism as shown below.
</para>
<programlisting>
<![CDATA[
/* example-begin elementget.c */
#include <gst/gst.h>
int
main (int argc, char *argv[])
{
GstElement *element;
GValue value = { 0, }; /* initialize the GValue for g_object_get() */
gst_init (&argc, &argv);
element = gst_element_factory_make ("fakesrc", "source");
g_object_set (G_OBJECT (element), "name", "mysource", NULL);
g_value_init (&value, G_TYPE_STRING);
g_object_get_property (G_OBJECT (element), "name", &value);
g_print ("The name of the source is '%s'.\n", g_value_get_string (&value));
return 0;
}
/* example-end elementget.c */
]]>
</programlisting>
    <para>
      Most plugins provide additional properties to provide more information
      about their configuration or to configure the element.
      <command>gst-inspect</command> is a useful tool to query the properties
      of a particular element; it also uses property introspection to give
      a short explanation about the function of each property and about the
      parameter types and ranges it supports.
    </para>
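    <para>
      For example, to list the properties of the <emphasis>fakesrc</emphasis>
      element used earlier, you could run:
    </para>
    <programlisting>
gst-inspect fakesrc
    </programlisting>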
<para>
For more information about <classname>GObject</classname>
properties we recommend you read the <ulink
url="http://developer.gnome.org/doc/API/2.0/gobject/index.html"
type="http">GObject manual</ulink> and an introduction to <ulink
url="http://le-hacker.org/papers/gobject/index.html" type="http">
The Glib Object system</ulink>.
</para>
</sect1>
<sect1 id="section-elements-signals">
<title>GstElement signals</title>
    <para>
      A <ulink type="http" url="&URLAPI;GstElement.html">
      <classname>GstElement</classname></ulink> also provides various
      <classname>GObject</classname> signals that can be used as a flexible
      callback mechanism. A minimal example of connecting to such a signal
      is shown below.
    </para>
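    <para>
      Connecting a callback to an element signal uses the normal
      <classname>GObject</classname> mechanism. The "new_pad" signal and the
      callback below are only a sketch; which signals an element provides
      depends on the element and can be listed with
      <command>gst-inspect</command>.
    </para>
    <programlisting>
static void
cb_new_pad (GstElement *element, GstPad *pad, gpointer data)
{
  g_print ("new pad %s was added to %s\n",
      gst_pad_get_name (pad), gst_element_get_name (element));
}

  /* somewhere in your code, after creating the element: */
  ...
  g_signal_connect (G_OBJECT (element), "new_pad",
      G_CALLBACK (cb_new_pad), NULL);
    </programlisting>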
</sect1>
<sect1 id="section-elements-factories">
<title>More about GstElementFactory</title>
    <para>
      In this section, we will look a bit closer at the
      <classname>GstElementFactory</classname> object.
    </para>
<sect2 id="section-elements-factories-details">
<title>Getting information about an element using the factory details</title>
<para>
</para>
</sect2>
<sect2 id="section-elements-factories-padtemplates">
<title>Finding out what pads an element can contain</title>
<para>
</para>
</sect2>
<sect2 id="section-elements-factories-query">
<title>Different ways of querying the factories</title>
<para>
</para>
</sect2>
</sect1>
</chapter>

<chapter id="chapter-elements">
<title>Elements</title>
<para>
The most important object in <application>GStreamer</application> for the
application programmer is the <ulink type="http"
url="../../gstreamer/html/GstElement.html"><classname>GstElement</classname>
</ulink>object.
</para>
<sect1 id="section-elements-design">
<title>What is an element ?</title>
<para>
An element is the basic building block for the media pipeline.
All the different high-level components you are going to use are
derived from <ulink type="http" url="../../gstreamer/html/GstElement.html">
<classname>GstElement</classname></ulink>. This means that a
lot of functions you are going to use operate on objects of this class.
</para>
<para>
Elements, from the perspective of GStreamer, are viewed as "black boxes"
with a number of different aspects. One of these aspects is the presence
of "pads" (see <xref linkend="chapter-pads"/>), or link points.
This terminology arises from soldering; pads are where wires can be
attached.
</para>
</sect1>
<sect1 id="section-elements-types">
<title>Types of elements</title>
<sect2 id="section-elements-src">
<title>Source elements</title>
<para>
Source elements generate data for use by a pipeline, for example
reading from disk or from a sound card.
</para>
<para>
<xref linkend="section-element-srcimg"/> shows how we will visualise
a source element.
We always draw a source pad to the right of the element.
</para>
<figure float="1" id="section-element-srcimg">
<title>Visualisation of a source element</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/src-element.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
      <para>
        Source elements do not accept data, they only generate it. You can
        see this in the figure: the element only has a source pad, and a
        source pad can only produce data.
      </para>
</sect2>
<sect2 id="section-elements-filter">
<title>Filters and codecs</title>
<para>
Filter elements have both input and output pads. They operate on
data they receive in their sink pads and produce data on their source
pads. For example, MPEG decoders and volume filters would fall into
this category.
</para>
<para>
Elements are not constrained as to the number of pads they might have;
for example, a video mixer might have two input pads (the images of
the two different video streams) and one output pad.
</para>
<figure float="1" id="section-element-filterimg">
<title>Visualisation of a filter element</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/filter-element.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
<xref linkend="section-element-filterimg"/> shows how we will visualise
a filter element.
This element has one sink (input) pad and one source (output) pad.
Sink pads are drawn on the left of the element.
</para>
<figure float="1" id="section-element-multifilterimg">
<title>Visualisation of a filter element with
more than one output pad</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/filter-element-multi.&image;"
format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
<xref linkend="section-element-multifilterimg"/> shows the visualisation of a filter element with
more than one output pad. An example of such a filter is the AVI
demultiplexer. This element will parse the input data and
extract the audio and video data. Most of these filters dynamically
send out a signal when a new pad is created so that the application
programmer can link an arbitrary element to the newly created pad.
</para>
</sect2>
<sect2 id="section-elements-sink">
<title>Sink elements</title>
<para>
Sink elements are end points in a media pipeline. They accept
data but do not produce anything. Disk writing, soundcard playback,
and video output would all be implemented by sink elements.
<xref linkend="section-element-sinkimg"/> shows a sink element.
</para>
<figure float="1" id="section-element-sinkimg">
<title>Visualisation of a sink element</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/sink-element.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
</sect2>
</sect1>
</chapter>

<chapter id="chapter-factories">
<title>More on factories</title>
<para>
The small application we created in the previous chapter used the
concept of a factory to create the elements. In this chapter we will
show you how to use the factory concepts to create elements based
on what they do instead of what they are called.
</para>
<para>
We will first explain the concepts involved before we move on
to the reworked helloworld example using autoplugging.
</para>
<sect1 id="section-factories-helloworld-problems">
<title>The problems with the helloworld example</title>
    <para>
      If we take a look at how the elements were created in the previous
      example, we see that we used a rather crude mechanism:
    </para>
<programlisting>
...
  /* now it's time to get the decoder */
decoder = gst_element_factory_make ("mad", "decoder");
...
</programlisting>
<para>
While this mechanism is quite effective it also has some big problems:
The elements are created based on their name. Indeed, we create an
element, mad, by explicitly stating the mad element's name. Our little
program therefore always uses the mad decoder element to decode
the MP3 audio stream, even if there are three other MP3 decoders in the
system. We will see how we can use a more general way to create an
MP3 decoder element.
</para>
<para>
We have to introduce the concept of MIME types and capabilities
added to the source and sink pads.
</para>
</sect1>
<sect1 id="section-factories-mime">
<title>More on MIME Types</title>
<para>
GStreamer uses MIME types to identify the different types of data
that can be handled by the elements. They are the high level
mechanisms to make sure that everyone is talking about the right
kind of data.
</para>
    <para>
      A MIME (Multipurpose Internet Mail Extension) type is a pair of
      strings, a type and a subtype, that denotes a certain type of data.
      Examples include:
<itemizedlist>
<listitem>
<para>
audio/x-raw-int : raw audio samples
</para>
</listitem>
<listitem>
<para>
audio/mpeg : MPEG audio
</para>
</listitem>
<listitem>
<para>
video/mpeg : MPEG video
</para>
</listitem>
</itemizedlist>
</para>
    <para>
      An element must associate a MIME type with its source and sink pads
      when it is loaded into the system. GStreamer knows about the
      different elements and what type of data they expect and emit.
      This allows for very dynamic and extensible element creation as we
      will see.
    </para>
<para>
As we have seen in the previous chapter, MIME types are added
to the Capability structure of a pad.
</para>
<para>
<xref linkend="section-mime-img"/> shows the MIME types associated with
each pad from the "hello world" example.
</para>
<figure float="1" id="section-mime-img">
<title>The Hello world pipeline with MIME types</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/mime-world.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
We will see how you can create an element based on the MIME types
of its source and sink pads. This way the end-user will have the
ability to choose his/her favorite audio/mpeg decoder without
you even having to care about it.
</para>
<para>
The typing of the source and sink pads also makes it possible to
'autoplug' a pipeline. We will have the ability to say: "construct
a pipeline that does an audio/mpeg to audio/x-raw-int conversion".
</para>
<note>
<para>
The basic GStreamer library does not try to solve all of your
autoplug problems. It leaves the hard decisions to the application
programmer, where they belong.
</para>
</note>
</sect1>
<sect1 id="section-factories-gstreamer-types">
<title>GStreamer types</title>
<para>
GStreamer assigns a unique number to all registered MIME types.
GStreamer also keeps a reference to
a function that can be used to determine if a given buffer is of
the given MIME type.
</para>
<para>
There is also an association between a MIME type and a file extension,
but the use of typefind functions (similar to file(1)) is preferred.
</para>
<para>
The type information is maintained in a list of
<classname>GstType</classname>. The definition of a
<classname>GstType</classname> is like:
</para>
<para>
<programlisting>
typedef GstCaps (*GstTypeFindFunc) (GstBuffer *buf,gpointer *priv);
typedef struct _GstType GstType;
struct _GstType {
guint16 id; /* type id (assigned) */
gchar *mime; /* MIME type */
gchar *exts; /* space-delimited list of extensions */
GstTypeFindFunc typefindfunc; /* typefind function */
};
</programlisting>
</para>
<para>
All operations on <classname>GstType</classname> occur
via their <classname>guint16 id</classname> numbers, with
the <classname>GstType</classname> structure private to the GStreamer
library.
</para>
<sect2>
<title>MIME type to id conversion</title>
<para>
We can obtain the id for a given MIME type
with the following piece of code:
</para>
<programlisting>
guint16 id;
id = gst_type_find_by_mime ("audio/mpeg");
</programlisting>
<para>
This function will return 0 if the type was not known.
</para>
</sect2>
<sect2>
<title>id to <classname>GstType</classname> conversion</title>
<para>
We can obtain the <classname>GstType</classname> for a given id
with the following piece of code:
</para>
<programlisting>
GstType *type;
type = gst_type_find_by_id (id);
</programlisting>
<para>
This function will return NULL if the id was not associated with
any known <classname>GstType</classname>
</para>
</sect2>
<sect2>
<title>extension to id conversion</title>
<para>
We can obtain the id for a given file extension
with the following piece of code:
</para>
<programlisting>
guint16 id;
id = gst_type_find_by_ext (".mp3");
</programlisting>
<para>
This function will return 0 if the extension was not known.
</para>
<para>
For more information, see <xref linkend="chapter-autoplug"/>.
</para>
</sect2>
</sect1>
<sect1 id="section-factories-create">
<title>Creating elements with the factory</title>
    <para>
      In the previous section we described how you could obtain
      an element factory using MIME types. Once the factory has been
      obtained, you can create an element using:
    </para>
<programlisting>
GstElementFactory *factory;
GstElement *element;
// obtain the factory
factory = ...
element = gst_element_factory_create (factory, "name");
</programlisting>
<para>
This way, you do not have to create elements by name which
allows the end-user to select the elements he/she prefers for the
given MIME types.
</para>
</sect1>
<sect1 id="section-factories-basic-types">
<title>GStreamer basic types</title>
<para>
GStreamer only has two builtin types:
</para>
<itemizedlist>
<listitem>
<para>
audio/raw : raw audio samples
</para>
</listitem>
<listitem>
<para>
video/raw and image/raw : raw video data
</para>
</listitem>
</itemizedlist>
<para>
All other MIME types are maintained by the plugin elements.
</para>
</sect1>
</chapter>

<chapter id="chapter-gnome">
<title>GNOME integration</title>
<para>
GStreamer is fairly easy to integrate with GNOME applications.
GStreamer uses libxml 2.0, GLib 2.0 and popt, as do all other
GNOME applications.
There are however some basic issues you need to address in your GNOME
applications.
</para>
<sect1>
<title>Command line options</title>
<para>
GNOME applications call gnome_program_init () to parse command-line
options and initialize the necessary gnome modules.
GStreamer applications normally call gst_init (&amp;argc, &amp;argv) to
do the same for GStreamer.
</para>
<para>
Each of these two swallows the program options passed to the program,
so we need a different way to allow both GNOME and GStreamer to parse
the command-line options. This is shown in the following example.
</para>
<programlisting>
/* example-begin gnome.c */
#include &lt;gnome.h&gt;
#include &lt;gst/gst.h&gt;
int
main (int argc, char **argv)
{
GstPoptOption options[] = {
{ NULL, '\0', POPT_ARG_INCLUDE_TABLE, NULL, 0, "GStreamer", NULL },
POPT_TABLEEND
};
GnomeProgram *program;
poptContext context;
const gchar **argvn;
GstElement *pipeline;
GstElement *src, *sink;
options[0].arg = (void *) gst_init_get_popt_table ();
g_print ("Calling gnome_program_init with the GStreamer popt table\n");
/* gnome_program_init will initialize GStreamer now
* as a side effect of having the GStreamer popt table passed. */
if (! (program = gnome_program_init ("my_package", "0.1", LIBGNOMEUI_MODULE,
argc, argv,
GNOME_PARAM_POPT_TABLE, options,
NULL)))
g_error ("gnome_program_init failed");
g_print ("Getting gnome-program popt context\n");
g_object_get (program, "popt-context", &amp;context, NULL);
argvn = poptGetArgs (context);
if (!argvn) {
g_print ("Run this example with some arguments to see how it works.\n");
return 0;
}
g_print ("Printing rest of arguments\n");
while (*argvn) {
g_print ("argument: %s\n", *argvn);
++argvn;
}
/* do some GStreamer things to show everything's initialized properly */
g_print ("Doing some GStreamer stuff to show that everything works\n");
pipeline = gst_pipeline_new ("pipeline");
src = gst_element_factory_make ("fakesrc", "src");
sink = gst_element_factory_make ("fakesink", "sink");
gst_bin_add_many (GST_BIN (pipeline), src, sink, NULL);
gst_element_link (src, sink);
gst_element_set_state (pipeline, GST_STATE_PLAYING);
gst_bin_iterate (GST_BIN (pipeline));
gst_element_set_state (pipeline, GST_STATE_NULL);
return 0;
}
/* example-end gnome.c */
</programlisting>
<para>
If you try out this program, you will see that when called with
--help, it will print out both GStreamer and GNOME help arguments.
All of the arguments that didn't belong to either end up in the
argvn pointer array.
</para>
<para>
FIXME: flesh this out more. How do we get the GStreamer arguments
at the end ?
FIXME: add a GConf bit.
</para>
</sect1>
</chapter>

<chapter id="chapter-goals">
<title>Goals</title>
<para>
GStreamer was designed to provide a solution to the current Linux media
problems.
</para>
<sect1 id="section-goals-design">
<title>The design goals</title>
<para>
We describe what we try to achieve with GStreamer.
</para>
<sect2 id="section-goals-clean">
<title>Clean and powerful</title>
<para>
GStreamer wants to provide a clean interface to:
</para>
<itemizedlist>
<listitem>
<para>
The application programmer who wants to build a media pipeline.
The programmer can use an extensive set of powerful tools to create
media pipelines without writing a single line of code. Performing
complex media manipulations becomes very easy.
</para>
</listitem>
<listitem>
<para>
The plugin programmer. Plugin programmers are provided a clean and
simple API to create self contained plugins. An extensive debugging
and tracing mechanism has been integrated. GStreamer also comes with
an extensive set of real-life plugins that serve as examples too.
</para>
</listitem>
</itemizedlist>
</sect2>
<sect2 id="section-goals-object">
<title>Object oriented</title>
<para>
GStreamer adheres to the GLib 2.0 object model. A programmer familiar with GLib 2.0 or older versions
of GTK+ will be comfortable with GStreamer.
</para>
<para>
GStreamer uses the mechanism of signals and object properties.
</para>
<para>
All objects can be queried at runtime for their various properties and
capabilities.
</para>
<para>
GStreamer intends to be similar in programming methodology to GTK+.
This applies to the object model, ownership of objects, reference
counting, ...
</para>
</sect2>
<sect2 id="section-goals-extensible">
<title>Extensible</title>
<para>
All GStreamer Objects can be extended using the GObject inheritance methods.
</para>
<para>
All plugins are loaded dynamically and can be extended and upgraded
independently.
</para>
</sect2>
<sect2 id="section-goals-binary">
<title>Allow binary only plugins</title>
<para>
Plugins are shared libraries that are loaded at runtime. Since all the properties of the
plugin can be set using the GObject properties, there is no need (and in fact no way) to
have any header files installed for the plugins.
</para>
      <para>
        Special care has been taken to make plugins completely self-contained.
        All relevant aspects of plugins can be queried at run-time.
      </para>
</sect2>
<sect2 id="section-goals-performance">
<title>High performance</title>
<para>
High performance is obtained by:
</para>
<itemizedlist>
<listitem>
<para>
using GLib's <function>g_mem_chunk</function> and fast non-blocking allocation algorithms
where possible to minimize dynamic memory allocation.
</para>
</listitem>
<listitem>
<para>
extremely light-weight links between plugins. Data can travel
the pipeline with minimal overhead. Data passing between plugins only involves
a pointer dereference in a typical pipeline.
</para>
</listitem>
<listitem>
<para>
providing a mechanism to directly work on the target memory. A plugin can for example
directly write to the X server's shared memory space. Buffers can also point to
arbitrary memory, such as a sound card's internal hardware buffer.
</para>
</listitem>
<listitem>
<para>
refcounting and copy on write minimize usage of memcpy.
Sub-buffers efficiently split buffers into manageable pieces.
</para>
</listitem>
<listitem>
<para>
the use of cothreads to minimize the threading overhead. Cothreads are a simple and fast
user-space method for switching between subtasks. Cothreads were measured to
consume as little as 600 cpu cycles.
</para>
</listitem>
<listitem>
<para>
allowing hardware acceleration by using specialized plugins.
</para>
</listitem>
<listitem>
<para>
using a plugin registry with the specifications of the plugins so
that the plugin loading can be delayed until the plugin is actually
used.
</para>
</listitem>
<listitem>
<para>
all critical data passing is free of locks and mutexes.
</para>
</listitem>
</itemizedlist>
</sect2>
<sect2 id="section-goals-separation">
<title>Clean core/plugins separation</title>
<para>
The core of GStreamer is essentially media-agnostic. It only knows
about bytes and blocks, and only contains basic elements.
The core of GStreamer is functional enough to even implement low-level
system tools, like cp.
</para>
<para>
All of the media handling functionality is provided by plugins external
to the core. These tell the core how to handle specific types of media.
</para>
</sect2>
<sect2 id="section-goals-testbed">
<title>Provide a framework for codec experimentation</title>
<para>
GStreamer also wants to be an easy framework where codec
developers can experiment with different algorithms, speeding up
the development of open and free multimedia codecs like <ulink
url="http://www.xiph.org/ogg/index.html" type="http">tarkin and
vorbis</ulink>.
</para>
</sect2>
</sect1>
</chapter>

<chapter id="chapter-hello-world">
<title>Your first application</title>
<para>
This chapter describes the most rudimentary aspects of a
<application>GStreamer</application> application, including initializing
the libraries, creating elements, packing them into a pipeline and playing,
pausing and stopping the pipeline.
</para>
<sect1 id="section-hello-world">
<title>Hello world</title>
<para>
We will create a simple first application, a complete MP3 player, using
standard <application>GStreamer</application> components. The player
will read from a file that is given as the first argument to the program.
</para>
<programlisting>
/* example-begin helloworld.c */
#include &lt;stdlib.h&gt;   /* for exit () */
#include &lt;gst/gst.h&gt;
int
main (int argc, char *argv[])
{
GstElement *pipeline, *filesrc, *decoder, *audiosink;
gst_init(&amp;argc, &amp;argv);
if (argc != 2) {
g_print ("usage: %s &lt;mp3 filename&gt;\n", argv[0]);
exit (-1);
}
/* create a new pipeline to hold the elements */
pipeline = gst_pipeline_new ("pipeline");
/* create a disk reader */
filesrc = gst_element_factory_make ("filesrc", "disk_source");
g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
/* now it's time to get the decoder */
decoder = gst_element_factory_make ("mad", "decoder");
/* and an audio sink */
audiosink = gst_element_factory_make ("osssink", "play_audio");
/* add objects to the main pipeline */
gst_bin_add_many (GST_BIN (pipeline), filesrc, decoder, audiosink, NULL);
/* link src to sink */
gst_element_link_many (filesrc, decoder, audiosink, NULL);
/* start playing */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
while (gst_bin_iterate (GST_BIN (pipeline)));
/* stop the pipeline */
gst_element_set_state (pipeline, GST_STATE_NULL);
/* we don't need a reference to these objects anymore */
gst_object_unref (GST_OBJECT (pipeline));
/* unreffing the pipeline unrefs the contained elements as well */
exit (0);
}
/* example-end helloworld.c */
</programlisting>
<para>
Let's go through this example step by step.
</para>
<para>
The first thing you have to do is to include the standard
<application>GStreamer</application> headers and
initialize the framework.
</para>
<programlisting>
#include &lt;gst/gst.h&gt;
...
int
main (int argc, char *argv[])
{
...
gst_init(&amp;argc, &amp;argv);
...
</programlisting>
<para>
We are going to create three elements and one pipeline. Since all
elements share the same base type, <ulink type="http"
url="../../gstreamer/html/GstElement.html"><classname>GstElement</classname></ulink>,
we can define them as:
</para>
<programlisting>
...
GstElement *pipeline, *filesrc, *decoder, *audiosink;
...
</programlisting>
<para>
Next, we are going to create an empty pipeline. As you have seen in
the basic introduction, this pipeline will hold and manage all the
elements we are going to pack into it.
</para>
<programlisting>
/* create a new pipeline to hold the elements */
pipeline = gst_pipeline_new ("pipeline");
</programlisting>
<para>
We use the standard constructor for a pipeline: gst_pipeline_new ().
</para>
<para>
We then create a disk source element. The disk source element is able to
read from a file. We use the standard GObject property mechanism to set
a property of the element: the file to read from.
</para>
<programlisting>
/* create a disk reader */
filesrc = gst_element_factory_make ("filesrc", "disk_source");
g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
</programlisting>
<note>
      <para>
        You can check whether filesrc is non-NULL to verify that the
        disk source element was created successfully, as sketched below.
      </para>
</note>
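    <para>
      Such checks are left out of the listing above for brevity; a minimal
      version could look like this:
    </para>
    <programlisting>
  if (filesrc == NULL) {
    g_error ("Could not create the 'filesrc' element");
  }
    </programlisting>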
<para>
We now create the MP3 decoder element. This assumes that the 'mad' plugin
is installed on the system where this application is executed.
</para>
<programlisting>
/* now it's time to get the decoder */
decoder = gst_element_factory_make ("mad", "decoder");
</programlisting>
<para>
      gst_element_factory_make() takes two arguments: a string that
      identifies the element factory you need, and the name you want to give
      to the element. The name of the element is something you can
      choose yourself and can be used later to retrieve the element from a
      bin or pipeline.
</para>
<para>
Finally we create our audio sink element. This element will be able
to play back the audio using OSS.
</para>
<programlisting>
/* and an audio sink */
audiosink = gst_element_factory_make ("osssink", "play_audio");
</programlisting>
<para>
We then add the elements to the pipeline.
</para>
<programlisting>
/* add objects to the main pipeline */
gst_bin_add_many (GST_BIN (pipeline), filesrc, decoder, audiosink, NULL);
</programlisting>
<para>
We link the different pads of the elements together like this:
</para>
<programlisting>
/* link src to sink */
gst_element_link_many (filesrc, decoder, audiosink, NULL);
</programlisting>
<para>
We now have created a complete pipeline. We can visualise the
pipeline as follows:
</para>
<figure float="1" id="section-hello-img">
<title>The "hello world" pipeline</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/hello-world.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
Everything is now set up to start streaming. We use the following
statements to change the state of the pipeline:
</para>
<programlisting>
/* start playing */
gst_element_set_state (pipeline, GST_STATE_PLAYING);
</programlisting>
<note>
<para>
<application>GStreamer</application> will take care of the READY and PAUSED state for
you when going from NULL to PLAYING.
</para>
</note>
<para>
Since we do not use threads, nothing will happen yet. We have to
call gst_bin_iterate() to execute one iteration of the pipeline.
</para>
<programlisting>
while (gst_bin_iterate (GST_BIN (pipeline)));
</programlisting>
    <para>
      The gst_bin_iterate() function returns TRUE as long as something
      interesting happens inside the pipeline. When the end of the file has
      been reached, gst_bin_iterate() returns FALSE and we can end the loop.
    </para>
<programlisting>
/* stop the pipeline */
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (GST_OBJECT (pipeline));
exit (0);
</programlisting>
<note>
<para>
Don't forget to set the state of the pipeline to NULL. This will free
all of the resources held by the elements.
</para>
</note>
</sect1>
<sect1 id="section-hello-world-compile">
<title>Compiling helloworld.c</title>
<para>
To compile the helloworld example, use:
</para>
<programlisting>
gcc -Wall `pkg-config gstreamer-&GST_MAJORMINOR; --cflags --libs` helloworld.c \
-o helloworld
</programlisting>
<para>
We use pkg-config to get the compiler flags needed to compile
this application. Make sure to have your PKG_CONFIG_PATH environment
variable set to the correct location if you are building this
application against the uninstalled location.
</para>
    <para>
      You can run the example as follows
      (substitute helloworld.mp3 with your favorite MP3 file):
    </para>
<programlisting>
./helloworld helloworld.mp3
</programlisting>
</sect1>
<sect1 id="section-hello-world-conclusion">
<title>Conclusion</title>
<para>
This concludes our first example. As you see, setting up a pipeline
is very low-level but powerful. You will see later in this manual how
you can create a custom MP3 element with a higher-level API.
</para>
    <para>
      It should be clear from the example that we can very easily replace the
      filesrc element with the gnomevfssrc element, giving you instant
      streaming from any gnomevfs URL, as sketched below.
    </para>
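    <para>
      Only the element creation changes. This sketch assumes that the
      gnomevfssrc element exposes the same "location" property as filesrc,
      and the URL is made up for the example:
    </para>
    <programlisting>
  /* read from a gnomevfs URL instead of a local file */
  filesrc = gst_element_factory_make ("gnomevfssrc", "disk_source");
  g_object_set (G_OBJECT (filesrc), "location",
                "http://example.org/helloworld.mp3", NULL);
    </programlisting>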
<para>
We can also choose to use another type of sink instead of the audiosink.
We could use a filesink to write the raw samples to a file, for example.
It should also be clear that inserting filters, like a stereo effect,
into the pipeline is not that hard to do. The most important thing is
that you can reuse already existing elements.
</para>
</sect1>
</chapter>

<chapter id="chapter-hello2">
<title>Your second application</title>
<para>
FIXME: delete this section, talk more about the spider. In a previous chapter we created a first
version of the helloworld application. We then explained a better way of creating the elements
using factories identified by MIME types and the autoplugger.
</para>
<sect1>
<title>Autoplugging helloworld </title>
<para>
We will create a second version of the helloworld application using
autoplugging. Its source code is a bit more complicated but
it can handle many more data types. It can even play the audio track
of a video file.
</para>
<para>
Here is the full program listing. Start by looking at the main ()
function.
</para>
<programlisting>
/* example-begin helloworld2.c */
#include &lt;stdlib.h&gt;   /* for exit () */
#include &lt;gst/gst.h&gt;
static void gst_play_have_type (GstElement *typefind, GstCaps *caps, GstElement *pipeline);
static void gst_play_cache_empty (GstElement *element, GstElement *pipeline);
static void
gst_play_have_type (GstElement *typefind, GstCaps *caps, GstElement *pipeline)
{
GstElement *osssink;
GstElement *new_element;
GstAutoplug *autoplug;
GstElement *autobin;
GstElement *filesrc;
GstElement *cache;
g_print ("GstPipeline: play have type\n");
gst_element_set_state (pipeline, GST_STATE_PAUSED);
filesrc = gst_bin_get_by_name (GST_BIN (pipeline), "disk_source");
autobin = gst_bin_get_by_name (GST_BIN (pipeline), "autobin");
cache = gst_bin_get_by_name (GST_BIN (autobin), "cache");
/* unlink the typefind from the pipeline and remove it */
gst_element_unlink (cache, typefind);
gst_bin_remove (GST_BIN (autobin), typefind);
/* and an audio sink */
osssink = gst_element_factory_make ("osssink", "play_audio");
g_assert (osssink != NULL);
autoplug = gst_autoplug_factory_make ("staticrender");
g_assert (autoplug != NULL);
new_element = gst_autoplug_to_renderers (autoplug, caps, osssink, NULL);
if (!new_element) {
g_print ("could not autoplug, no suitable codecs found...\n");
exit (-1);
}
gst_element_set_name (new_element, "new_element");
gst_bin_add (GST_BIN (autobin), new_element);
g_object_set (G_OBJECT (cache), "reset", TRUE, NULL);
gst_element_link (cache, new_element);
gst_element_set_state (pipeline, GST_STATE_PLAYING);
}
static void
gst_play_cache_empty (GstElement *element, GstElement *pipeline)
{
GstElement *autobin;
GstElement *filesrc;
GstElement *cache;
GstElement *new_element;
g_print ("have cache empty\n");
gst_element_set_state (pipeline, GST_STATE_PAUSED);
filesrc = gst_bin_get_by_name (GST_BIN (pipeline), "disk_source");
autobin = gst_bin_get_by_name (GST_BIN (pipeline), "autobin");
cache = gst_bin_get_by_name (GST_BIN (autobin), "cache");
new_element = gst_bin_get_by_name (GST_BIN (autobin), "new_element");
gst_element_unlink (filesrc, cache);
gst_element_unlink (cache, new_element);
gst_bin_remove (GST_BIN (autobin), cache);
gst_element_link (filesrc, new_element);
gst_element_set_state (pipeline, GST_STATE_PLAYING);
g_print ("done with cache_empty\n");
}
int
main (int argc, char *argv[])
{
GstElement *filesrc;
GstElement *pipeline;
GstElement *autobin;
GstElement *typefind;
GstElement *cache;
gst_init (&amp;argc, &amp;argv);
if (argc != 2) {
g_print ("usage: %s &lt;filename with audio&gt;\n", argv[0]);
exit (-1);
}
/* create a new pipeline to hold the elements */
pipeline = gst_pipeline_new ("pipeline");
g_assert (pipeline != NULL);
/* create a disk reader */
filesrc = gst_element_factory_make ("filesrc", "disk_source");
g_assert (filesrc != NULL);
g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
gst_bin_add (GST_BIN (pipeline), filesrc);
autobin = gst_bin_new ("autobin");
cache = gst_element_factory_make ("autoplugcache", "cache");
g_signal_connect (G_OBJECT (cache), "cache_empty",
G_CALLBACK (gst_play_cache_empty), pipeline);
typefind = gst_element_factory_make ("typefind", "typefind");
g_signal_connect (G_OBJECT (typefind), "have_type",
G_CALLBACK (gst_play_have_type), pipeline);
gst_bin_add (GST_BIN (autobin), cache);
gst_bin_add (GST_BIN (autobin), typefind);
gst_element_link (cache, typefind);
gst_element_add_ghost_pad (autobin,
gst_element_get_pad (cache, "sink"), "sink");
gst_bin_add (GST_BIN( pipeline), autobin);
gst_element_link (filesrc, autobin);
/* start playing */
gst_element_set_state( GST_ELEMENT (pipeline), GST_STATE_PLAYING);
while (gst_bin_iterate (GST_BIN (pipeline)));
/* stop the pipeline */
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
gst_object_unref (GST_OBJECT (pipeline));
exit(0);
}
/* example-end helloworld2.c */
</programlisting>
<para>
We start by constructing a 'filesrc' element and an 'autobin' element that
holds the autoplugcache and the typefind element.
</para>
<para>
We attach the "cache_empty" signal to gst_play_cache_empty and the
"have_type" to our gst_play_have_type function.
</para>
<para>
The _have_type function first sets the pipeline to the PAUSED state
so that it can safely modify the pipeline. It then finds the elements
it is going to manipulate in the pipeline with:
</para>
<programlisting>
filesrc = gst_bin_get_by_name (GST_BIN (pipeline), "disk_source");
autobin = gst_bin_get_by_name (GST_BIN (pipeline), "autobin");
cache = gst_bin_get_by_name (GST_BIN (autobin), "cache");
</programlisting>
<para>
Now we have a handle to the elements we are going to manipulate in
the next step.
</para>
<para>
We don't need the typefind element anymore so we remove it from
the pipeline:
</para>
<programlisting>
/* unlink the typefind from the pipeline and remove it */
    gst_element_unlink (cache, typefind);
gst_bin_remove (GST_BIN (autobin), typefind);
</programlisting>
<para>
Our next step is to construct an element that can play the type we just
detected. We are going to use the autoplugger to create an element that
links the type to an osssink. We add the new element to our
autobin.
</para>
<programlisting>
/* and an audio sink */
osssink = gst_element_factory_make("osssink", "play_audio");
g_assert(osssink != NULL);
autoplug = gst_autoplug_factory_make ("staticrender");
g_assert (autoplug != NULL);
new_element = gst_autoplug_to_renderers (autoplug,
caps,
osssink,
NULL);
if (!new_element) {
g_print ("could not autoplug, no suitable codecs found...\n");
exit (-1);
}
gst_element_set_name (new_element, "new_element");
gst_bin_add (GST_BIN (autobin), new_element);
</programlisting>
<para>
Our next step is to reset the cache so that the buffers used by the
typefind element are fed into the new element we just created. We reset
the cache by setting the "reset" property of the cache element to TRUE.
</para>
<programlisting>
g_object_set (G_OBJECT (cache), "reset", TRUE, NULL);
    gst_element_link (cache, new_element);
</programlisting>
<para>
Finally we set the pipeline back to the playing state. At this point the
cache will replay the buffers. We will be notified when the cache is empty
by the gst_play_cache_empty callback function.
</para>
<para>
The cache empty function simply removes the autoplugcache element from
the pipeline and relinks the filesrc to the autoplugged element.
</para>
<para>
To compile the helloworld2 example, use:
</para>
<programlisting>
gcc -Wall `pkg-config gstreamer-&GST_MAJORMINOR; --cflags --libs` helloworld2.c \
-o helloworld2
</programlisting>
<para>
You can run the example with
(substitute helloworld.mp3 with your favorite audio file):
</para>
<programlisting>
./helloworld2 helloworld.mp3
</programlisting>
<para>
You can also try to use an AVI or MPEG file as its input.
Using autoplugging,
<application>GStreamer</application>
will automatically figure out how to
handle the stream.
Remember that only the audio part will be played because
we have only added an osssink to the pipeline.
</para>
<programlisting>
./helloworld2 mymovie.mpeg
</programlisting>
</sect1>
</chapter>

View file

@ -1,99 +0,0 @@
<chapter id="chapter-initialisation">
<title>Initializing <application>GStreamer</application></title>
<para>
When writing a <application>GStreamer</application> application, you can
simply include <filename class='headerfile'>gst/gst.h</filename> to get
access to the library functions.
</para>
<para>
Before the <application>GStreamer</application> libraries can be used,
<function>gst_init</function> has to be called from the main application.
This call will perform the necessary initialization of the library as
well as parse the GStreamer-specific command line options.
</para>
<para>
A typical program
&EXAFOOT;
would have code to initialize GStreamer that
looks like this:
</para>
<programlisting>
<![CDATA[
/* example-begin init.c */
#include <stdio.h>
#include <gst/gst.h>
int
main (int argc, char *argv[])
{
guint major, minor, micro;
gst_init (&argc, &argv);
gst_version (&major, &minor, &micro);
printf ("This program is linked against GStreamer %d.%d.%d\n",
major, minor, micro);
return 0;
}
/* example-end init.c */
]]>
</programlisting>
<para>
Use the <symbol>GST_VERSION_MAJOR</symbol>,
<symbol>GST_VERSION_MINOR</symbol> and <symbol>GST_VERSION_MICRO</symbol>
macros to get the <application>GStreamer</application> version you are
building against, or use the function <function>gst_version</function>
to get the version your application is linked against.
<!-- FIXME: include an automatically generated list of these options. -->
</para>
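<para>
As a small illustrative sketch (the version numbers checked here are only
an example), the compile-time macros can be used to refuse building against
headers that are too old:
</para>
<programlisting>
/* compile-time check against the GStreamer headers we are building with */
#if GST_VERSION_MAJOR == 0 &amp;&amp; GST_VERSION_MINOR &lt; 8
#error "This program requires GStreamer 0.8 or newer headers"
#endif
</programlisting>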
<para>
It is also possible to call the <function>gst_init</function> function
with two <symbol>NULL</symbol> arguments, in which case no command line
options will be parsed by <application>GStreamer</application>.
</para>
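<para>
For example, if your application handles all argument parsing itself,
initialization comes down to a single call:
</para>
<programlisting>
gst_init (NULL, NULL);
</programlisting>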
<sect1>
<title>The popt interface</title>
<para>
You can also use a popt table to initialize your own parameters as shown in the
next example:
</para>
<programlisting>
/* example-begin popt.c */
#include &lt;gst/gst.h&gt;
int
main(int argc, char *argv[])
{
gboolean silent = FALSE;
gchar *savefile = NULL;
struct poptOption options[] = {
{"silent", 's', POPT_ARG_NONE|POPT_ARGFLAG_STRIP, &amp;silent, 0,
"do not output status information", NULL},
{"output", 'o', POPT_ARG_STRING|POPT_ARGFLAG_STRIP, &amp;savefile, 0,
"save xml representation of pipeline to FILE and exit", "FILE"},
POPT_TABLEEND
};
gst_init_with_popt_table (&amp;argc, &amp;argv, options);
printf ("Run me with --help to see the Application options appended.\n");
return 0;
}
/* example-end popt.c */
</programlisting>
<para>
As shown in this fragment, you can use a <ulink
url="http://developer.gnome.org/doc/guides/popt/"
type="http">popt</ulink> table to define your application-specific
command line options, and pass this table to the
function <function>gst_init_with_popt_table</function>. Your
application options will be parsed in addition to the standard
<application>GStreamer</application> options.
</para>
</sect1>
</chapter>

View file

@ -1,59 +0,0 @@
<chapter id="chapter-intro">
<title>Introduction</title>
<para>
This chapter gives you an overview of the technologies described in this
book.
</para>
<sect1 id="section-intro-what">
<title>What is GStreamer?</title>
<para>
GStreamer is a framework for creating streaming media applications.
The fundamental design comes from the video pipeline at Oregon Graduate
Institute, as well as some ideas from DirectShow.
</para>
<para>
GStreamer's development framework makes it possible to write any type of
streaming multimedia application. The GStreamer framework is designed
to make it easy to write applications that handle audio or video or both.
It isn't restricted to audio and video, and can process any kind of
data flow.
The pipeline design is made to have little overhead above what the
applied filters induce. This makes GStreamer a good framework for designing
even high-end audio applications which put high demands on latency.
</para>
<para>
One of the most obvious uses of GStreamer is using it to build
a media player. GStreamer already includes components for building a
media player that can support a very wide variety of formats, including
MP3, Ogg Vorbis, MPEG1, MPEG2, AVI, Quicktime, mod, and more. GStreamer,
however, is much more than just another media player. Its main advantages
are that the pluggable components can be mixed and matched into arbitrary
pipelines so that it's possible to write a full-fledged video or audio
editing application.
</para>
<para>
The framework is based on plugins that provide the various codecs
and other functionality. The plugins can be linked and arranged in
a pipeline. This pipeline defines the flow of the data. Pipelines can
also be edited with a GUI editor and saved as XML so that pipeline
libraries can be made with a minimum of effort.
</para>
<para>
The core function of GStreamer is to provide a framework for plugins, data flow
and media type handling/negotiation.
It also provides an API to write applications using the various plugins.
</para>
<para>
This book is about GStreamer from a developer's point of view; it describes
how to write a GStreamer application using the GStreamer libraries and tools.
For an explanation about writing plugins, we suggest the Plugin Writers Guide.
</para>
</sect1>
</chapter>

View file

@ -1,83 +0,0 @@
<chapter id="chapter-links-api">
<title>Linking elements</title>
<sect1 id="section-link-basic">
<title>Making simple links</title>
<para>
You can link two pads with:
</para>
<programlisting>
GstPad *srcpad, *sinkpad;
srcpad = gst_element_get_pad (element1, "src");
sinkpad = gst_element_get_pad (element2, "sink");
// link them
gst_pad_link (srcpad, sinkpad);
....
// and unlink them
gst_pad_unlink (srcpad, sinkpad);
</programlisting>
<para>
A convenient shortcut for the above code is the gst_element_link_pads ()
function:
</para>
<programlisting>
// link them
gst_element_link_pads (element1, "src", element2, "sink");
....
// and unlink them
gst_element_unlink_pads (element1, "src", element2, "sink");
</programlisting>
<para>
An even more convenient shortcut, which only works for single-source, single-sink elements, is the
gst_element_link () function:
</para>
<programlisting>
// link them
gst_element_link (element1, element2);
....
// and unlink them
gst_element_unlink (element1, element2);
</programlisting>
<para>
If you have more than one element to link, the gst_element_link_many () function takes
a NULL-terminated list of elements. Again this only works for single-source single-sink
elements:
</para>
<programlisting>
// link them
gst_element_link_many (element1, element2, element3, element4, NULL);
....
// and unlink them
gst_element_unlink_many (element1, element2, element3, element4, NULL);
</programlisting>
<para>
You can query if a pad is linked with
<function>GST_PAD_IS_LINKED (pad)</function>.
</para>
<para>
To query for the <ulink type="http"
url="../../gstreamer/html/GstPad.html"><classname>GstPad</classname></ulink>
a pad is linked to, use <function>gst_pad_get_peer (pad)</function>.
</para>
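<para>
A short sketch combining both calls (reusing the srcpad variable from the
first example above):
</para>
<programlisting>
GstPad *peer;

if (GST_PAD_IS_LINKED (srcpad)) {
  peer = gst_pad_get_peer (srcpad);
  g_print ("srcpad is linked to %s\n", gst_pad_get_name (peer));
}
</programlisting>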
</sect1>
<sect1 id="section-link-filtered">
<title>Making filtered links</title>
<para>
You can also force a specific media type on the link by using
<function>gst_pad_link_filtered ()</function>
and <function>gst_element_link_filtered ()</function> with capabilities.
See <xref linkend="section-caps"/> for
an explanation of capabilities.
</para>
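<para>
As a rough sketch (the caps used here are only an example, created with the
gst_caps_new () constructor described in the chapter on pads), a filtered
link could look like this:
</para>
<programlisting>
GstCaps *caps;

// only allow WAV audio on this link
caps = gst_caps_new ("filter_caps", "audio/x-wav", NULL);
gst_element_link_filtered (element1, element2, caps);
</programlisting>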
</sect1>
</chapter>

View file

@ -1,29 +0,0 @@
<chapter id="chapter-links">
<title>Linking elements</title>
<para>
You can link the different pads of elements together so that the elements
form a chain.
</para>
<figure float="1" id="section-link">
<title>Visualisation of three linked elements</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/linked-elements.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
By linking these three elements, we have created a very simple
chain. The effect of this will be that the output of the source element
(element1) will be used as input for the filter element (element2). The
filter element will do something with the data and send the result to
the final sink element (element3).
</para>
<para>
Imagine the above graph as a simple MPEG audio decoder. The source
element is a disk source, the filter element is the MPEG decoder and
the sink element is your audiocard. We will use this simple graph to
construct an MPEG player later in this manual.
</para>
</chapter>

View file

@ -10,62 +10,49 @@
<!ENTITY EXAFOOT "
<footnote>
<para>
The code for this example is automatically extracted from
the documentation and built under <filename>examples/manual</filename>
in the GStreamer tarball.
</para>
<para>
The code for this example is automatically extracted from
the documentation and built under <filename>examples/manual</filename>
in the GStreamer tarball.
</para>
</footnote>
">
<!-- Part 1: Overview -->
<!ENTITY INTRO SYSTEM "intro.xml">
<!ENTITY MOTIVATION SYSTEM "motivation.xml">
<!ENTITY GOALS SYSTEM "goals.xml">
<!ENTITY INTRO SYSTEM "intro-preface.xml">
<!ENTITY MOTIVATION SYSTEM "intro-motivation.xml">
<!-- Part 2: Basic Concepts -->
<!ENTITY ELEMENTS SYSTEM "elements.xml">
<!ENTITY PADS SYSTEM "pads.xml">
<!ENTITY LINKS SYSTEM "links.xml">
<!ENTITY BINS SYSTEM "bins.xml">
<!ENTITY BUFFERS SYSTEM "buffers.xml">
<!ENTITY STATES SYSTEM "states.xml">
<!ENTITY PLUGINS SYSTEM "plugins.xml">
<!ENTITY INIT SYSTEM "basics-init.xml">
<!ENTITY ELEMENTS SYSTEM "basics-elements.xml">
<!ENTITY BINS SYSTEM "basics-bins.xml">
<!ENTITY PADS SYSTEM "basics-pads.xml">
<!ENTITY DATA SYSTEM "basics-data.xml">
<!ENTITY HELLOWORLD SYSTEM "basics-helloworld.xml">
<!-- Part 3: Basic API -->
<!ENTITY INIT-API SYSTEM "init-api.xml">
<!ENTITY ELEMENTS-API SYSTEM "elements-api.xml">
<!ENTITY PADS-API SYSTEM "pads-api.xml">
<!ENTITY LINKS-API SYSTEM "links-api.xml">
<!ENTITY BINS-API SYSTEM "bins-api.xml">
<!ENTITY BUFFERS-API SYSTEM "buffers-api.xml">
<!ENTITY STATES-API SYSTEM "states-api.xml">
<!ENTITY PLUGINS-API SYSTEM "plugins-api.xml">
<!-- Part 3: Advanced Concepts -->
<!ENTITY QUERYEVENTS SYSTEM "advanced-position.xml">
<!ENTITY METADATA SYSTEM "advanced-metadata.xml">
<!ENTITY INTERFACES SYSTEM "advanced-interfaces.xml">
<!ENTITY CLOCKS SYSTEM "advanced-clocks.xml">
<!ENTITY DPARAMS SYSTEM "advanced-dparams.xml">
<!ENTITY THREADS SYSTEM "advanced-threads.xml">
<!ENTITY SCHEDULERS SYSTEM "advanced-schedulers.xml">
<!ENTITY AUTOPLUGGING SYSTEM "advanced-autoplugging.xml">
<!ENTITY DATAACCESS SYSTEM "advanced-dataaccess.xml">
<!-- Part 4: Building An Application -->
<!ENTITY HELLOWORLD SYSTEM "helloworld.xml">
<!ENTITY FACTORIES SYSTEM "factories.xml">
<!ENTITY AUTOPLUGGING SYSTEM "autoplugging.xml">
<!ENTITY HELLOWORLD2 SYSTEM "helloworld2.xml">
<!-- Part 4: Higher-level interfaces -->
<!ENTITY XML SYSTEM "highlevel-xml.xml">
<!ENTITY COMPONENTS SYSTEM "highlevel-components.xml">
<!-- Part 5: Advanced Concepts -->
<!ENTITY THREADS SYSTEM "threads.xml">
<!ENTITY QUEUES SYSTEM "queues.xml">
<!ENTITY COTHREADS SYSTEM "cothreads.xml">
<!ENTITY SCHEDULERS SYSTEM "schedulers.xml">
<!ENTITY CLOCKS SYSTEM "clocks.xml">
<!ENTITY DYNAMIC SYSTEM "dynamic.xml">
<!ENTITY TYPEDETECTION SYSTEM "typedetection.xml">
<!ENTITY UTILITY SYSTEM "utility.xml">
<!ENTITY DPARAMS SYSTEM "dparams-app.xml">
<!-- Appendices -->
<!ENTITY DEBUGGING SYSTEM "appendix-debugging.xml">
<!ENTITY PROGRAMS SYSTEM "appendix-programs.xml">
<!ENTITY GNOME SYSTEM "appendix-gnome.xml">
<!ENTITY WIN32 SYSTEM "appendix-win32.xml">
<!ENTITY QUOTES SYSTEM "appendix-quotes.xml">
<!ENTITY XML SYSTEM "xml.xml">
<!ENTITY DEBUGGING SYSTEM "debugging.xml">
<!ENTITY PROGRAMS SYSTEM "programs.xml">
<!ENTITY COMPONENTS SYSTEM "components.xml">
<!ENTITY GNOME SYSTEM "gnome.xml">
<!ENTITY WIN32 SYSTEM "win32.xml">
<!ENTITY QUOTES SYSTEM "quotes.xml">
<!ENTITY GStreamer "<application>GStreamer</application>">
]>
<book id="index">
@ -99,6 +86,16 @@ in the GStreamer tarball.
</para>
</authorblurb>
</author>
<author>
<firstname>Ronald</firstname>
<othername>S.</othername>
<surname>Bultje</surname>
<authorblurb>
<para>
<email>rbultje@ronald.bitfreak.net</email>
</para>
</authorblurb>
</author>
</authorgroup>
<legalnotice id="misc-legalnotice">
@ -107,218 +104,218 @@ in the GStreamer tarball.
conditions set forth in the Open Publication License, v1.0 or later (the
latest version is presently available at <ulink url="
http://www.opencontent.org/opl.shtml"
type="http">http://www.opencontent.org/opl.shtml</ulink> )
type="http">http://www.opencontent.org/opl.shtml</ulink>).
</para>
</legalnotice>
<title><application>GStreamer</application> Application Development Manual (&GST_VERSION;)</title>
<title>&GStreamer; Application Development Manual (&GST_VERSION;)</title>
</bookinfo>
<!-- ############# Overview - part ############### -->
<!-- ############# Introduction & Overview - part ############### -->
<part id="part-overview"><title>Overview</title>
<part id="part-overview">
<title>Overview</title>
<partintro>
<para>
<xref linkend="part-overview"/> gives you an overview of
<application>GStreamer</application> design goals.
<xref linkend="part-basic-concepts"/> rapidly covers the basics of
<application>GStreamer</application> programming.
In <xref linkend="part-build-app"/> we will move on to the
examples. Since <application>GStreamer</application> uses <ulink
url="http://developer.gnome.org/arch/gtk/glib.html" type="http">GLib
2.0</ulink>, the reader is assumed to understand the basics of the
<ulink url="http://developer.gnome.org/doc/API/2.0/gobject/index.html"
type="http">GObject object model</ulink>.
For a gentle introduction to this system, you may wish to read the
<emphasis><ulink url="http://www.gtk.org/tutorial/" type="http">GTK+
Tutorial</ulink></emphasis>, Eric Harlow's book <emphasis>Developing
Linux Applications with GTK+ and GDK</emphasis> and the <emphasis>
<ulink type="http"
url="http://www.le-hacker.org/papers/gobject/index.html">Glib Object
system</ulink></emphasis>.
&GStreamer; is an extremely powerful and versatile framework for
creating streaming media applications. Many of the virtues of the
&GStreamer; framework come from its modularity: &GStreamer; can
seamlessly incorporate new plugin modules. But because modularity
and power often come at a cost of greater complexity (consider,
for example, <ulink
type="http" url="http://www.omg.org/">CORBA</ulink>), writing new
applications is not always easy.
</para>
<para>
This guide is intended to help you understand the &GStreamer;
framework (version &GST_VERSION;) so you can develop applications
based on it. The first chapters will focus on development of a
simple audio player, with much effort going into helping you
understand &GStreamer; concepts. Later chapters will go into
more advanced topics related to media playback, but also into
other forms of media processing (capture, editing, etc.).
</para>
</partintro>
<!-- ############ Introduction - chapter ############# -->
&INTRO;
&INTRO;
&MOTIVATION;
&MOTIVATION;
&GOALS;
</part>
<!-- ############ Basic concepts - part ############# -->
<part id="part-basic-concepts">
<part id="part-basics">
<title>Basic Concepts</title>
<partintro>
<para>
We will first describe the basics of
<application>GStreamer</application> programming by introducing the
different objects needed to create a media pipeline.
<para>
In these chapters, we will discuss the basic concepts of &GStreamer;
and the most-used objects, such as elements, pads and buffers. We
will use a visual representation of these objects so that we can
visualize the more complex pipelines you will learn to build later
on. You will get a first glance at the &GStreamer; API, which should
be enough for building elementary applications. Later on in this
part, you will also learn to build a basic command-line application.
</para>
<para>
We will use a visual representation of these objects so that we can
visualize the more complex pipelines you will learn to build later on.
<para>
Note that this part gives a look into the low-level API and
concepts of &GStreamer;. Once you start building applications,
you might want to use higher-level APIs. Those will be discussed
later on in this manual.
</para>
</partintro>
&ELEMENTS;
&PADS;
&PLUGINS;
&LINKS;
&BINS;
&BUFFERS;
&STATES;
</part>
<!-- ############ Basic API - part ############# -->
<part id="part-basic-api">
<title>Basic API</title>
<partintro>
<para>
This chapter will describe the basics of programming with GStreamer.
Most of the concepts from the previous chapter will be illustrated with code
fragments.
</para>
<para>
Most of the code examples in this manual are automatically extracted as part
of the build process of the GStreamer tarball. After building GStreamer from
source, you will find the examples in <filename>examples/manual</filename>.
Each example has a comment on the first line giving the name of the file
it will be extracted as.
</para>
</partintro>
&INIT-API;
&ELEMENTS-API;
&PADS-API;
&PLUGINS-API;
&LINKS-API;
&BINS-API;
&BUFFERS-API;
&STATES-API;
</part>
<!-- ############ Building Apps - part ############# -->
<part id="part-build-app"><title>Building an application</title>
<partintro>
<para>
With the basic concepts out of the way, you're ready to start building a
full-scale <application>GStreamer</application> application.
</para>
<para>
We assume the reader is familiar with GTK+/GNOME programming.
</para>
</partintro>
&HELLOWORLD;
&FACTORIES;
&INIT;
&ELEMENTS;
&BINS;
&PADS;
&DATA;
&HELLOWORLD;
</part>
<!-- ############ Advanced GStreamer - part ############# -->
<part id="part-advanced"><title>Advanced <application>GStreamer</application> concepts</title>
<part id="part-advanced">
<title>Advanced &GStreamer; concepts</title>
<partintro>
<para>
In this part we will cover the more advanced features of <application>GStreamer</application>.
With the basics you learned in the previous part you should be
able to create a 'simple' pipeline. If you want more control over
the media types and the pipeline you should use the more
low-level features of <application>GStreamer</application>.
In this part we will cover the more advanced features of &GStreamer;.
With the basics you learned in the previous part you should be
able to create a <emphasis>simple</emphasis> application. However,
&GStreamer; provides much more candy than just the basics of playing
back audio files. In this chapter, you will learn more of the
low-level features and internals of &GStreamer;, such as threads,
scheduling, synchronization, metadata, interfaces and dynamic
parameters.
</para>
</partintro>
&THREADS;
<!--
Idea:
* Querying and events
- seeking
- getting stream length
- prerolls and why
* Stream info
- pads info (see 2.4)
- tags
- inserting tags
* Interfaces
- each
* Clocks & Synchronization
- stress that it's automated
* Dynamic parameters
- ..
* Threading
- gstthread & queues
- buffering (network, live network/video/audio source)
- when 1-to-N, N-to-1 and N-to-N elements need threads
* Scheduling
- loop/chain/get etc. (internals)
- opt
* Autoplugging principles
- type detection
- dynamic element lookup (registry, factories and categories)
- reference part4!
- Hello world 2
* Pipeline manipulation
- data manipulation (identity, fakesrc, fakesink)
- probes (both events and data)
- length manipulation (managers, PWG)
- explain how to embed elements in applications
- explain why separate elements are to be preferred
-->
&QUEUES;
&QUERYEVENTS;
&METADATA;
&INTERFACES;
&CLOCKS;
&DPARAMS;
&THREADS;
&SCHEDULERS;
&AUTOPLUGGING;
&DATAACCESS;
&COTHREADS;
&SCHEDULERS;
&CLOCKS;
&DYNAMIC;
&TYPEDETECTION;
&AUTOPLUGGING;
&HELLOWORLD2;
&DPARAMS;
</part>
<!-- ############ XML in GStreamer - part ############# -->
<part id="part-xml-gstreamer"><title>XML in <application>GStreamer</application></title>
<!-- ############ Higher-level APIs in GStreamer - part ############# -->
<part id="part-highlevel">
<title>Higher-level interfaces for &GStreamer; applications</title>
<partintro>
<para>
<application>GStreamer</application> has the possibility to serialize the pipelines you
create using an XML format. You can load a previously created pipeline by loading the XML
file.
In the previous two parts, you have learned many of the internals
and their corresponding low-level interfaces into &GStreamer;
application programming. Many people will, however, not need so
much control (and as much code), but will prefer to use a standard
playback interface that does most of the difficult internals for
them. In this chapter, we will introduce you to the concept of
autopluggers, playback managing elements, XML-based pipelines and
other such things. Those higher-level interfaces are intended to
simplify &GStreamer;-based application programming. They do, however,
also reduce the flexibility. It is up to the application developer
to choose which interface to use.
</para>
</partintro>
&XML;
</part>
&COMPONENTS;
&XML;
</part>
<!-- ############ Appendices - part ############# -->
<part id="part-appendices">
<title>Appendices</title>
<partintro>
<para>
<application>GStreamer</application> comes prepackaged with a few
programs, and some useful debugging options.
By now, you've learned all about the internals of &GStreamer; and
application programming using the &GStreamer; framework. This part
will go into some random bits that are useful to know if you're
going to use &GStreamer; for serious application programming. It
will touch upon things related to integration with popular desktop
environments that we run on (GNOME, KDE, OS X, Windows), briefly
explain how the applications included with &GStreamer; can help make
your life easier, and give some information on debugging.
</para>
</partintro>
&DEBUGGING;
<!--
Idea:
* Debugging and error handling
- 'error' signal in pipelines
- checking return values and how to handle them
- using signals for pipeline states
- gst-debug
- programs
* Desktop integration
- Linux/UNIX
. {x,xv}imagesink
. {oss,alsa}sink
. {v4l,v4l2,oss,alsa}src
- GNOME
. GConf ({video,audio}{src,sink})
. gnomevfssrc, gnomevfssink
. popt
. app examples (RB, Totem, gnome-media, ...)
- KDE
. kiosrc
. app examples (JuK, AmaroK)
. ask Scott/Mark
- Mac OS X
. native video/audio sink
- Windows
. build etc.
* Quotes from devs
- table please...
-->
&PROGRAMS;
&COMPONENTS;
&GNOME;
&WIN32;
&QUOTES;
&DEBUGGING;
&PROGRAMS;
&GNOME;
&WIN32;
&QUOTES;
</part>
</book>

View file

@ -1,111 +0,0 @@
<chapter id="chapter-motivation">
<title>Motivation</title>
<para>
Linux has historically lagged behind other operating systems in the multimedia
arena. Microsoft's <trademark>Windows</trademark> and Apple's <trademark>MacOS</trademark> both have strong support
for multimedia devices, multimedia content creation,
playback, and realtime processing. Linux, on the other hand, has a poorly integrated
collection of multimedia utilities and applications available, which can hardly compete
with the professional level of software available for MS Windows and MacOS.
</para>
<sect1 id="section-motivation-problems">
<title>Current problems</title>
<para>
We describe the typical problems in today's media handling on Linux.
</para>
<sect2 id="section-motivation-duplicate">
<title>Multitude of duplicate code</title>
<para>
The Linux user who wishes to hear a sound file must hunt through their collection of
sound file players in order to play the tens of sound file formats in wide use today.
Most of these players basically reimplement the same code over and over again.
</para>
<para>
The Linux developer who wishes to embed a video clip in their application must use
crude hacks to run an external video player. There is no library available that a
developer can use to create a custom media player.
</para>
</sect2>
<sect2 id="section-motivation-goal">
<title>'One goal' media players/libraries</title>
<para>
Your typical MPEG player was designed to play MPEG video and audio. Most of
these players have implemented a complete infrastructure focused on
achieving their only goal: playback. No provisions were made to add
filters or special effects to the video or audio data.
</para>
<para>
If you want to convert an MPEG2 video stream into an AVI file, your best
option would be to take all of the MPEG2 decoding algorithms out
of the player and duplicate them into your own AVI encoder. These
algorithms cannot easily be shared across applications.
</para>
<para>
Attempts have been made to create libraries for handling various media types.
Because they focus on a very specific media type (avifile, libmpeg2, ...),
significant work is needed to integrate them due to a lack of a common API.
GStreamer allows you to wrap these libraries with a common API, which
significantly simplifies integration and reuse.
</para>
</sect2>
<sect2 id="section-motivation-plugin">
<title>Non unified plugin mechanisms</title>
<para>
Your typical media player might have a plugin for different media
types. Two media players will typically implement their own plugin
mechanism so that the codecs cannot be easily exchanged. The plugin system
of the typical media player is also very tailored to the specific needs
of the application.
</para>
<para>
The lack of a unified plugin mechanism also seriously hinders the
creation of binary only codecs. No company is willing to port their
code to all the different plugin mechanisms.
</para>
<para>
While GStreamer also uses its own plugin system, it offers a very rich
framework for the plugin developer and ensures that the plugin can be used
in a wide range of applications, transparently interacting with other
plugins. The framework that GStreamer provides for the plugins is
flexible enough to host even the most demanding plugins.
</para>
</sect2>
<sect2 id="section-motivation-network">
<title>Provision for network transparency</title>
<para>
No infrastructure is present to allow network transparent media
handling. A distributed MPEG encoder will typically duplicate the
same encoder algorithms found in a non-distributed encoder.
</para>
<para>
No provisions have been made for technologies such as
the <ulink url="http://developer.gnome.org/arch/component/bonobo.html"
type="http">GNOME object embedding using Bonobo</ulink>.
</para>
<para>
The GStreamer core does not use network transparent technologies at the
lowest level as it only adds overhead for the local case.
That said, it shouldn't be hard to create a wrapper around the
core components. There are tcp plugins now that implement a GStreamer
Data Protocol that allows pipelines to be split over TCP. These are
located in the gst-plugins module directory gst/tcp.
</para>
</sect2>
<sect2 id="section-motivation-catchup">
<title>Catch up with the <trademark>Windows</trademark> world</title>
<para>
We need solid media handling if we want to see Linux succeed on
the desktop.
</para>
<para>
We must clear the road for commercially backed codecs and multimedia
applications so that Linux can become an option for doing multimedia.
</para>
</sect2>
</sect1>
</chapter>

View file

@ -1,302 +0,0 @@
<chapter id="chapter-pads-api">
<title>Pads</title>
<para>
As we have seen in <xref linkend="chapter-elements"/>, the pads are the element's
interface to the outside world.
</para>
<para>
The specific type of media that the element can handle will be exposed by the pads.
The description of this media type is done with capabilities (see
<xref linkend="section-caps"/>).
</para>
<para>
Pads are either source or sink pads. The terminology is defined from the
view of the element itself: elements accept data on their sink pads, and
send data out on their source pads. Sink pads are drawn on the left,
while source pads are drawn on the right of an element. In general,
data flows from left to right in the graph.<footnote>
<para>
In reality, there is no objection to data flowing from a
source pad to the sink pad of an element upstream. Data will, however,
always flow from a source pad of one element to the sink pad of
another.
</para></footnote>
</para>
<sect1 id="section-pads-api-type">
<title>Types of pad</title>
<sect2 id="section-pads-api-dynamic">
<title>Dynamic pads</title>
<para>
You can connect a signal handler to an element to be notified when the element has created
a new pad from one of its pad templates. The following piece of code is an example
of how to do this:
</para>
<programlisting>
static void
pad_link_func (GstElement *parser, GstPad *pad, GstElement *pipeline)
{
g_print("***** a new pad %s was created\n", gst_pad_get_name(pad));
gst_element_set_state (pipeline, GST_STATE_PAUSED);
if (strncmp (gst_pad_get_name (pad), "private_stream_1.0", 18) == 0) {
// set up an AC3 decoder pipeline
...
// link pad to the AC3 decoder pipeline
...
}
gst_element_set_state (GST_ELEMENT (audio_thread), GST_STATE_READY);
}
int
main(int argc, char *argv[])
{
GstElement *pipeline;
GstElement *mpeg2parser;
// create pipeline and do something useful
...
mpeg2parser = gst_element_factory_make ("mpegdemux", "mpegdemux");
g_signal_connect (G_OBJECT (mpeg2parser), "new_pad", G_CALLBACK (pad_link_func), pipeline);
...
// start the pipeline
gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);
...
}
</programlisting>
<note>
<para>
A pipeline cannot be changed in the PLAYING state.
</para>
</note>
</sect2>
<sect2 id="section-pads-api-request">
<title>Request pads</title>
<para>
The following piece of code can be used to get a pad from the tee element. After
the pad has been requested, it can be used to link another element to it.
</para>
<programlisting>
...
GstPad *pad;
...
element = gst_element_factory_make ("tee", "element");
pad = gst_element_get_request_pad (element, "src%d");
g_print ("new pad %s\n", gst_pad_get_name (pad));
...
</programlisting>
<para>
The gst_element_get_request_pad method can be used to get a pad
from the element based on the name_template of the padtemplate.
</para>
<para>
It is also possible to request a pad that is compatible with another
pad template. This is very useful if you want to link an element
to a multiplexer element and you need to request a pad that is
compatible. The gst_element_get_compatible_pad is used to request
a compatible pad, as is shown in the next example.
</para>
<programlisting>
...
GstPadTemplate *templ;
GstPad *pad;
...
element = gst_element_factory_make ("tee", "element");
mad = gst_element_factory_make ("mad", "mad");
templ = gst_element_get_pad_template_by_name (mad, "sink");
pad = gst_element_get_compatible_pad (element, templ);
g_print ("new pad %s\n", gst_pad_get_name (pad));
...
</programlisting>
</sect2>
</sect1>
<sect1 id="section-api-caps">
<title>Capabilities of a pad</title>
<para>
Since the pads play a very important role in how the element is viewed by the
outside world, a mechanism is implemented to describe the data that can
flow through the pad by using capabilities.
</para>
<para>
We will briefly describe what capabilities are, enough for you to get a basic understanding
of the concepts. You will find more information on how to create capabilities in the
Plugin Writer's Guide.
</para>
<sect2 id="section-pads-api-caps">
<title>Capabilities</title>
<para>
Capabilities are attached to a pad in order to describe
what type of media the pad can handle.
</para>
<para>
Its structure is:
</para>
<programlisting>
struct _GstCaps {
gchar *name; /* the name of this caps */
guint16 id; /* type id (major type) */
guint refcount; /* caps are refcounted */
GstProps *properties; /* properties for this capability */
GstCaps *next; /* caps can be chained together */
};
</programlisting>
</sect2>
<sect2 id="section-pads-api-caps-get">
<title>Getting the capabilities of a pad</title>
<para>
A pad can have a chain of capabilities attached to it. You can get the capabilities chain
with:
</para>
<programlisting>
GstCaps *caps;
...
caps = gst_pad_get_caps (pad);
g_print ("pad name %s\n", gst_pad_get_name (pad));
while (caps) {
g_print (" Capability name %s, MIME type %s\n",
gst_caps_get_name (caps),
gst_caps_get_mime (caps));
caps = caps-&gt;next;
}
...
</programlisting>
</sect2>
<sect2 id="section-pads-api-caps-create">
<title>Creating capability structures</title>
<para>
While capabilities are mainly used inside a plugin to describe the
media type of the pads, the application programmer also has to have
basic understanding of capabilities in order to interface with the
plugins, especially when using the autopluggers.
</para>
<para>
As we said, a capability has a name, a mime-type and some
properties. The signature of the function to create a new
<ulink type="http" url="../../gstreamer/html/gstreamer-GstCaps.html">
<classname>GstCaps</classname></ulink> structure is:
<!-- FIXME: GstProbs are deprecated, in gst-0.8.X -->
<programlisting>
GstCaps* gst_caps_new (const gchar *name, const gchar *mime, GstProps *props);
</programlisting>
</para>
<para>
You can therefore create a new capability with no properties like this:
<programlisting>
GstCaps *newcaps;
newcaps = gst_caps_new ("my_caps", "audio/x-wav", NULL);
</programlisting>
</para>
<para>
<classname>GstProps</classname> basically consist of a set of key-value pairs
and are created with a function with this signature:
<programlisting>
GstProps* gst_props_new (const gchar *firstname, ...);
</programlisting>
</para>
<para>
The keys are given as strings and the values are given with a set of macros:
<itemizedlist>
<listitem>
<para>
GST_PROPS_INT(a): An integer value
</para>
</listitem>
<listitem>
<para>
GST_PROPS_FLOAT(a): A floating point value
</para>
</listitem>
<listitem>
<para>
GST_PROPS_FOURCC(a): A fourcc value
</para>
</listitem>
<listitem>
<para>
GST_PROPS_BOOLEAN(a): A boolean value
</para>
</listitem>
<listitem>
<para>
GST_PROPS_STRING(a): A string value
</para>
</listitem>
</itemizedlist>
The values can also be specified as ranges with:
<itemizedlist>
<listitem>
<para>
GST_PROPS_INT_RANGE(a,b): An integer range from a to b
</para>
</listitem>
<listitem>
<para>
GST_PROPS_FLOAT_RANGE(a,b): A float range from a to b
</para>
</listitem>
</itemizedlist>
All of the above values can be given with a list too, using:
<itemizedlist>
<listitem>
<para>
GST_PROPS_LIST(a,...): A list of property values.
</para>
</listitem>
</itemizedlist>
</para>
<para>
A more complex capability with properties is created like this:
<programlisting>
GstCaps *newcaps;
newcaps = gst_caps_new ("my_caps",
"audio/x-wav",
gst_props_new (
"bitrate", GST_PROPS_INT_RANGE (11025,22050),
"depth", GST_PROPS_INT (16),
"signed", GST_PROPS_LIST (
GST_PROPS_BOOLEAN (TRUE),
GST_PROPS_BOOLEAN (FALSE)
),
NULL
);
</programlisting>
Optionally, the convenient shortcut macro can be used. The above complex
capability can be created with:
<programlisting>
GstCaps *newcaps;
newcaps = GST_CAPS_NEW ("my_caps",
"audio/x-wav",
"bitrate", GST_PROPS_INT_RANGE (11025,22050),
"depth", GST_PROPS_INT (16),
"signed", GST_PROPS_LIST (
GST_PROPS_BOOLEAN (TRUE),
GST_PROPS_BOOLEAN (FALSE)
)
);
</programlisting>
</para>
</sect2>
</sect1>
</chapter>

View file

@ -1,244 +0,0 @@
<chapter id="chapter-pads">
<title>Pads</title>
<para>
As we have seen in <xref linkend="chapter-elements"/>, the pads are the element's
interface to the outside world.
</para>
<para>
The specific type of media that the element can handle will be exposed by the pads.
The description of this media type is done with capabilities (see
<xref linkend="section-caps"/>).
</para>
<para>
Pads are either source or sink pads. The terminology is defined from the
view of the element itself: elements accept data on their sink pads, and
send data out on their source pads. Sink pads are drawn on the left,
while source pads are drawn on the right of an element. In general,
data flows from left to right in the graph.<footnote>
<para>
In reality, there is no objection to data flowing from a
source pad to the sink pad of an element upstream. Data will, however,
always flow from a source pad of one element to the sink pad of
another.
</para></footnote>
</para>
<sect1 id="section-pads-type">
<title>Types of pad</title>
<sect2 id="section-pads-dynamic">
<title>Dynamic pads</title>
<para>
Some elements might not have all of their pads when the element is
created. This
can happen, for example, with an MPEG system demultiplexer. The
demultiplexer will create its pads at runtime when it detects the
different elementary streams in the MPEG system stream.
</para>
<para>
Running <application>gst-inspect mpegdemux</application> will show that
the element has only one pad: a sink pad called 'sink'. The other pads are
"dormant". You can see this in the pad template because there is
an 'Exists: Sometimes'
property. Depending on the type of MPEG file you play, the pads will
be created. We
will see that this is very important when you are going to create dynamic
pipelines later on in this manual.
</para>
</sect2>
<sect2 id="section-pads-request">
<title>Request pads</title>
<para>
An element can also have request pads. These pads are not created
automatically but are only created on demand. This is very useful
for multiplexers, aggregators and tee elements.
</para>
<para>
The tee element, for example, has one input pad and a request padtemplate for the
output pads. Whenever an element wants to get an output pad from the tee element, it
has to request the pad.
</para>
</sect2>
</sect1>
<sect1 id="section-caps">
<title>Capabilities of a pad</title>
<para>
Since the pads play a very important role in how the element is viewed by the
outside world, a mechanism is implemented to describe the data that can
flow through the pad by using capabilities.
</para>
<para>
We will briefly describe what capabilities are, enough for you to get a basic understanding
of the concepts. You will find more information on how to create capabilities in the
Plugin Writer's Guide.
</para>
<sect2 id="section-pads-caps">
<title>Capabilities</title>
<para>
Capabilities are attached to a pad in order to describe
what type of media the pad can handle.
</para>
<para>
Capabilities is shorthand for "capability chain". A capability chain
is a chain of one capability or more.
</para>
<para>
The basic entity is a capability, and is defined by a name, a MIME
type and a set of properties. A capability can be chained to
another capability, which is why we commonly refer to a chain of
capability entities as "capabilities".
<footnote>
<para>
It is important to understand that the term "capabilities" refers
to a chain of one capability or more. This will be clearer when
you see the structure definition of a <ulink type="http"
url="../../gstreamer/html/gstreamer-GstCaps.html"><classname>GstCaps
</classname></ulink>element.
</para>
</footnote>
</para>
<para>
Below is a dump of the capabilities of the element mad, as shown by
<command>gst-inspect</command>.
You can see two pads: sink and src. Both pads have capability information attached to them.
</para>
<para>
The sink pad (input pad) is called 'sink' and takes data of MIME type 'audio/mp3'. It also has
three properties: layer, bitrate and framed.
</para>
<para>
The source pad (output pad) is called 'src' and outputs data of
MIME type 'audio/raw'. It also has four properties: format, depth,
rate and channels.
</para>
<programlisting>
Pads:
SINK template: 'sink'
Availability: Always
Capabilities:
'mad_sink':
MIME type: 'audio/mp3':
SRC template: 'src'
Availability: Always
Capabilities:
'mad_src':
MIME type: 'audio/raw':
format: String: int
endianness: Integer: 1234
width: Integer: 16
depth: Integer: 16
channels: Integer range: 1 - 2
law: Integer: 0
signed: Boolean: TRUE
rate: Integer range: 11025 - 48000
</programlisting>
</sect2>
<sect2 id="section-pads-props">
<title>What are properties ?</title>
<para>
Properties are used to describe extra information for
capabilities. A property consists of a key (a string) and
a value. There are different possible value types that can be used:
</para>
<itemizedlist>
<listitem>
<para>
basic types:
</para>
<itemizedlist>
<listitem>
<para>
an integer value: the property has this exact value.
</para>
</listitem>
<listitem>
<para>
a boolean value: the property is either TRUE or FALSE.
</para>
</listitem>
<listitem>
<para>
a fourcc value: this is a value that is commonly used to
describe an encoding for video,
as used for example by the AVI specification.
<footnote><para>
fourcc values consist of four bytes.
<ulink url="http://www.fourcc.org" type="http">The FOURCC
Definition List</ulink> is the most complete resource
on the allowed fourcc values.
</para></footnote>
</para>
</listitem>
<listitem>
<para>
a float value: the property has this exact floating point value.
</para>
</listitem>
<listitem>
<para>
a string value.
</para>
</listitem>
</itemizedlist>
</listitem>
<listitem>
<para>
range types:
</para>
<itemizedlist>
<listitem>
<para>
an integer range value: the property denotes a range of
possible integers. For example, the wavparse element has
a source pad where the "rate" property can go from 8000 to
48000.
</para>
</listitem>
<listitem>
<para>
a float range value: the property denotes a range of possible
floating point values.
</para>
</listitem>
</itemizedlist>
</listitem>
<listitem>
<para>
a list value: the property can take any value from a list of
basic value types or range types.
</para>
</listitem>
</itemizedlist>
</sect2>
<sect2 id="section-pads-caps-use">
<title>What capabilities are used for</title>
<para>
Capabilities describe in great detail the type of media that is handled by the pads.
They are mostly used for:
</para>
<itemizedlist>
<listitem>
<para>
Autoplugging: automatically finding plugins for a set of capabilities
</para>
</listitem>
<listitem>
<para>
Compatibility detection: when two pads are linked, <application>GStreamer</application>
can verify if the two pads are talking about the same media types.
The process of linking two pads and checking if they are compatible
is called "caps negotiation".
</para>
</listitem>
</itemizedlist>
</sect2>
</sect1>
</chapter>

View file

@ -1,56 +0,0 @@
<chapter id="chapter-plugins-api">
<title>Plugins</title>
<!-- FIXME: introduce type definitions before this chapter -->
<para>
All plugins should implement one function, <function>plugin_init</function>,
that creates all the element factories and registers all the type
definitions contained in the plugin.
Without this function, a plugin cannot be registered.
</para>
<para>
The plugins are maintained in the plugin system. Optionally, the
type definitions and the element factories can be saved into an XML
representation so that the plugin system does not have to load all
available plugins in order to know their definition.
</para>
<para>
The basic plugin structure has the following fields:
</para>
<programlisting>
typedef struct _GstPlugin GstPlugin;
struct _GstPlugin {
gchar *name; /* name of the plugin */
gchar *longname; /* long name of plugin */
gchar *filename; /* filename it came from */
GList *types; /* list of types provided */
gint numtypes;
GList *elements; /* list of elements provided */
gint numelements;
GList *autopluggers; /* list of autopluggers provided */
gint numautopluggers;
gboolean loaded; /* if the plugin is in memory */
};
</programlisting>
<para>
You can query a <classname>GList</classname> of available plugins with the
function <function>gst_plugin_get_list</function> as this example shows:
</para>
<programlisting>
GList *plugins;
plugins = gst_plugin_get_list ();
while (plugins) {
GstPlugin *plugin = (GstPlugin *)plugins-&gt;data;
g_print ("plugin: %s\n", gst_plugin_get_name (plugin));
plugins = g_list_next (plugins);
}
</programlisting>
</chapter>

View file

@ -1,31 +0,0 @@
<chapter id="chapter-plugins">
<title>Plugins</title>
<!-- FIXME: introduce type definitions before this chapter -->
<para>
A plugin is a shared library that contains at least one of the following
items:
</para>
<itemizedlist>
<listitem>
<para>
one or more element factories
</para>
</listitem>
<listitem>
<para>
one or more type definitions
</para>
</listitem>
<listitem>
<para>
one or more auto-pluggers
</para>
</listitem>
<listitem>
<para>
exported symbols for use in other plugins
</para>
</listitem>
</itemizedlist>
</chapter>

View file

@ -1,333 +0,0 @@
<chapter id="chapter-programs">
<title>Programs</title>
<para>
</para>
<sect1 id="section-programs-gst-register">
<title><command>gst-register</command></title>
<para>
<command>gst-register</command> is used to rebuild the database of plugins.
It is used after a new plugin has been added to the system. The plugin database
can be found, by default, in <filename>/etc/gstreamer/reg.xml</filename>.
</para>
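<para>
After installing a new plugin, you would typically just run (possibly as
root, depending on where the registry file is stored):
</para>
<screen>
gst-register
</screen>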
</sect1>
<sect1 id="section-programs-gst-launch">
<title><command>gst-launch</command></title>
<para>
This is a tool that will construct pipelines based on a command-line
syntax.
</para>
<para>
A simple commandline to play a mp3 audio file looks like:
<screen>
gst-launch filesrc location=hello.mp3 ! mad ! osssink
</screen>
A more complex pipeline looks like:
<screen>
gst-launch filesrc location=redpill.vob ! mpegdemux name=demux \
demux.audio_00! { ac3parse ! a52dec ! osssink } \
demux.video_00! { mpeg2dec ! xvideosink }
</screen>
<xref linkend="section-programs-gst-launch-more-examples"/> lists more gst-launch commandlines.
</para>
<para>
You can also use the parser in your own
code. <application>GStreamer</application> provides a function
gst_parse_launch () that you can use to construct a pipeline.
The following program lets you create an MP3 pipeline using the
gst_parse_launch () function:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;
int
main (int argc, char *argv[])
{
GstElement *pipeline;
GstElement *filesrc;
GError *error = NULL;
gst_init (&amp;argc, &amp;argv);
if (argc != 2) {
g_print ("usage: %s &lt;filename&gt;\n", argv[0]);
return -1;
}
pipeline = gst_parse_launch ("filesrc name=my_filesrc ! mad ! osssink", &amp;error);
if (!pipeline) {
g_print ("Parse error: %s\n", error->message);
exit (1);
}
filesrc = gst_bin_get_by_name (GST_BIN (pipeline), "my_filesrc");
g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
gst_element_set_state (pipeline, GST_STATE_PLAYING);
while (gst_bin_iterate (GST_BIN (pipeline)));
gst_element_set_state (pipeline, GST_STATE_NULL);
return 0;
}
</programlisting>
<para>
Note how we can retrieve the filesrc element from the constructed bin using the
element name.
</para>
<sect2>
<title>Grammar Reference</title>
<para>
The <command>gst-launch</command> syntax is processed by a flex/bison parser. This section
is intended to provide a full specification of the grammar; any deviation from this
specification is considered a bug.
</para>
<sect3>
<title>Elements</title>
<screen>
... mad ...
</screen>
<para>
A bare identifier (a string beginning with a letter and containing
only letters, numbers, dashes, underscores, percent signs, or colons)
will create an element from a given element factory. In this example,
an instance of the "mad" MP3 decoding plugin will be created.
</para>
</sect3>
<sect3>
<title>Links</title>
<screen>
... !sink ...
</screen>
<para>
An exclamation point, optionally having a qualified pad name (the name of the pad,
optionally preceded by the name of the element) on both sides, will link two pads. If
the source pad is not specified, a source pad from the immediately preceding element
will be automatically chosen. If the sink pad is not specified, a sink pad from the next
element to be constructed will be chosen. An attempt will be made to find compatible
pads. Pad names may be preceded by an element name, as in
<computeroutput>my_element_name.sink_pad</computeroutput>.
</para>
</sect3>
<sect3>
<title>Properties</title>
<screen>
... location="http://gstreamer.net" ...
</screen>
<para>
The name of a property, optionally qualified with an element name, and a value,
separated by an equals sign, will set a property on an element. If the element is not
specified, the previous element is assumed. Strings can optionally be enclosed in
quotation marks. Characters in strings may be escaped with the backslash
(<literal>\</literal>). If the right-hand side is all digits, it is considered to be an
integer. If it is all digits and a decimal point, it is a double. If it is "true",
"false", "TRUE", or "FALSE" it is considered to be boolean. Otherwise, it is parsed as a
string. The type of the property is determined later on in the parsing, and the value is
converted to the target type. This conversion is not guaranteed to work; it relies on
the g_value_convert routines. No error message will be displayed on an invalid
conversion, due to limitations in the value convert API.
</para>
<para>
The list of properties an element supports can be found out using
<userinput>gst-inspect element-name</userinput>.
</para>
</sect3>
<sect3>
<title>Bins, Threads, and Pipelines</title>
<screen>
( ... )
</screen>
<para>
A pipeline description between parentheses is placed into a bin. The open paren may be
preceded by a type name, as in <computeroutput>jackbin.( ... )</computeroutput> to make
a bin of a specified type. Square brackets '[ ]' make pipelines, and curly braces '{ }' make
threads. The default toplevel bin type is a pipeline, although putting the whole
description within parentheses or braces can override this default.
</para>
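<para>
For example (music.mp3 is just a placeholder file name), the following
description decodes the file in the main thread and runs the audio output
in its own thread, fed through a queue:
</para>
<screen>
gst-launch filesrc location=music.mp3 ! mad ! { queue ! osssink }
</screen>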
</sect3>
</sect2>
<sect2 id="section-programs-gst-launch-more-examples">
<title>More Examples</title>
<para>
This chapter collects some more complex pipelines. The examples are split into several lines,
so make sure to include the trailing backslashes.
When modifying the pipelines and searching for the right element to insert, grepping the gst-inspect
output often gives a starting point:
<screen>
gst-inspect | grep "avi"
</screen>
Another way is to do:
<screen>
gst-launch filesrc location=video.avi ! decodebin name=d ! xvimagesink d. ! { queue ! alsasink } -v
</screen>
and look at the output to see which plugins it chooses.
</para>
<para>
Play a remote mp3 audio file:
<screen>
gst-launch gnomevfssrc location=http://www.server.org/hello.mp3 ! mad ! alsasink
</screen>
</para>
<para>
Play a local mp3 audio file with visualisation:
<screen>
gst-launch filesrc location=Hello.mp3 ! mad ! tee name=t ! \
{ queue ! osssink } \
{ t. ! queue ! synaesthesia ! ffmpegcolorspace ! xvimagesink }
</screen>
</para>
<para>
Play a local ogg audio file:
<screen>
gst-launch filesrc location=file.ogg ! oggdemux ! vorbisdec ! audioconvert ! audioscale ! alsasink
</screen>
</para>
<para>
Play a local ogg video file:
<screen>
gst-launch filesrc location=file.ogg ! oggdemux name=demux \
{ demux. ! queue ! theoradec ! ffmpegcolorspace ! videoscale ! xvimagesink } \
{ demux. ! queue ! vorbisdec ! audioconvert ! audioscale ! alsasink }
</screen>
</para>
<para>
Play a local avi video file:
<screen>
gst-launch filesrc location=video.avi ! mpegdemux name=demux \
demux.audio_00! { queue ! ac3parse ! a52dec ! osssink } \
demux.video_00! { queue ! mpeg2dec ! xvideosink }
</screen>
</para>
<para>
Transcoding an audio file from one format into another:
<screen>
gst-launch filesrc location=file.ogg ! oggdemux ! vorbisdec ! audioconvert ! flacenc ! filesink location=file.flac
</screen>
<screen>
gst-launch filesrc location=file.mp3 ! id3demux ! mad ! audioconvert ! rawvorbisenc ! oggmux ! filesink location=file.ogg
</screen>
</para>
<para>
Transcoding a DVD video into an Ogg video:
<screen>
gst-launch-0.8 oggmux name=mux ! filesink location=/tmp/file.ogg \
{ dvdreadsrc location=/dev/cdrom ! dvddemux name=demux.audio_00 ! \
{ queue ! a52dec ! audioconvert ! rawvorbisenc ! queue ! mux. } \
{ demux.video_00 ! queue ! mpeg2dec ! ffcolorspace ! videoscale ! video/x-raw-yuv,width=384,height=288 ! tee name=t ! \
{ queue ! theoraenc ! queue ! mux. } \
} \
} \
{ t. ! queue ! ffcolorspace ! ximagesink }
</screen>
</para>
</sect2>
</sect1>
<sect1 id="section-programs-gst-inspect">
<title><command>gst-inspect</command></title>
<para>
This is a tool to query a plugin or an element about its properties.
</para>
<para>
To query the information about the element mad, you would specify:
</para>
<screen>
gst-inspect mad
</screen>
<para>
Below is the output of a query for the osssink element:
</para>
<screen>
Factory Details:
Long name: Audio Sink (OSS)
Class: Sink/Audio
Description: Output to a sound card via OSS
Version: 0.3.3.1
Author(s): Erik Walthinsen &lt;omega@cse.ogi.edu&gt;, Wim Taymans &lt;wim.taymans@chello.be&gt;
Copyright: (C) 1999
GObject
+----GstObject
+----GstElement
+----GstOssSink
Pad Templates:
SINK template: 'sink'
Availability: Always
Capabilities:
'osssink_sink':
MIME type: 'audio/raw':
format: String: int
endianness: Integer: 1234
width: List:
Integer: 8
Integer: 16
depth: List:
Integer: 8
Integer: 16
channels: Integer range: 1 - 2
law: Integer: 0
signed: List:
Boolean: FALSE
Boolean: TRUE
rate: Integer range: 1000 - 48000
Element Flags:
GST_ELEMENT_THREADSUGGESTED
Element Implementation:
No loopfunc(), must be chain-based or not configured yet
Has change_state() function: gst_osssink_change_state
Has custom save_thyself() function: gst_element_save_thyself
Has custom restore_thyself() function: gst_element_restore_thyself
Clocking Interaction:
element requires a clock
element provides a clock: GstOssClock
Pads:
SINK: 'sink'
Implementation:
Has chainfunc(): 0x40056fc0
Pad Template: 'sink'
Element Arguments:
name : String (Default "element")
device : String (Default "/dev/dsp")
mute : Boolean (Default false)
format : Integer (Default 16)
channels : Enum "GstAudiosinkChannels" (default 1)
(0): Silence
(1): Mono
(2): Stereo
frequency : Integer (Default 11025)
fragment : Integer (Default 6)
buffer-size : Integer (Default 4096)
Element Signals:
"handoff" : void user_function (GstOssSink* object,
gpointer user_data);
</screen>
<para>
To query the information about a plugin, you would do:
</para>
<screen>
gst-inspect gstelements
</screen>
</sect1>
</chapter>

View file

@ -1,129 +0,0 @@
<chapter id="chapter-queues">
<title>Queues</title>
<para>
A queue is a filter element.
Queues can be used to link two elements in such way that the data can
be buffered.
</para>
<para>
A buffer that arrives at a queue's sink pad will not automatically be pushed to the
next linked element but will be buffered. It will be pushed to the next
element as soon as gst_pad_pull () is called on the queue's source pad.
</para>
<para>
Queues are mostly used in conjunction with a thread bin to
provide an external link for the thread's elements. You could have one
thread feeding buffers into a queue and another
thread repeatedly pulling on the queue to feed its
internal elements.
</para>
<para>
Below is a figure of a two-threaded decoder. We have one thread (the main execution
thread) reading the data from a file, and another thread decoding the data.
</para>
<figure float="1" id="section-queues-img">
<title>a two-threaded decoder with a queue</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/queue.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
The standard <application>GStreamer</application> queue implementation has some
properties that can be changed using the g_object_set () method. To set the
maximum number of buffers that can be queued to 30, do:
</para>
<programlisting>
g_object_set (G_OBJECT (queue), "max_level", 30, NULL);
</programlisting>
<para>
The following MP3 player shows you how to create the above pipeline
using a thread and a queue.
</para>
<programlisting>
/* example-begin queue.c */
#include &lt;stdlib.h&gt;
#include &lt;gst/gst.h&gt;
gboolean playing;
/* eos will be called when the src element has an end of stream */
void
eos (GstElement *element, gpointer data)
{
g_print ("have eos, quitting\n");
playing = FALSE;
}
int
main (int argc, char *argv[])
{
GstElement *filesrc, *audiosink, *queue, *decode;
GstElement *bin;
GstElement *thread;
gst_init (&amp;argc,&amp;argv);
if (argc != 2) {
g_print ("usage: %s &lt;mp3 filename&gt;\n", argv[0]);
exit (-1);
}
/* create a new thread to hold the elements */
thread = gst_thread_new ("thread");
g_assert (thread != NULL);
/* create a new bin to hold the elements */
bin = gst_bin_new ("bin");
g_assert (bin != NULL);
/* create a disk reader */
filesrc = gst_element_factory_make ("filesrc", "disk_source");
g_assert (filesrc != NULL);
g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
g_signal_connect (G_OBJECT (filesrc), "eos",
G_CALLBACK (eos), thread);
queue = gst_element_factory_make ("queue", "queue");
g_assert (queue != NULL);
/* and an audio sink */
audiosink = gst_element_factory_make ("osssink", "play_audio");
g_assert (audiosink != NULL);
decode = gst_element_factory_make ("mad", "decode");
/* add objects to the main bin */
gst_bin_add_many (GST_BIN (thread), decode, audiosink, NULL);
gst_bin_add_many (GST_BIN (bin), filesrc, queue, thread, NULL);
gst_element_link (filesrc, queue);
gst_element_link_many (queue, decode, audiosink, NULL);
/* start playing */
gst_element_set_state (GST_ELEMENT (bin), GST_STATE_PLAYING);
playing = TRUE;
while (playing) {
gst_bin_iterate (GST_BIN (bin));
}
gst_element_set_state (GST_ELEMENT (bin), GST_STATE_NULL);
return 0;
}
/* example-end queue.c */
</programlisting>
</chapter>

View file

@ -1,253 +0,0 @@
<chapter id="chapter-quotes">
<title>Quotes from the Developers</title>
<para>
As well as being a cool piece of software,
<application>GStreamer</application> is a lively project, with
developers from around the globe very actively contributing.
We often hang out on the #gstreamer IRC channel on
irc.freenode.net: the following are a selection of amusing<footnote>
<para>No guarantee of sense of humour compatibility is given.</para>
</footnote> quotes from our conversations.
</para>
<variablelist>
<varlistentry>
<term>2 Nov 2004</term>
<listitem>
<para>
<emphasis>zaheerm</emphasis>:
wtay: unfair u fixed the bug i was using as a feature!
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>14 Oct 2004</term>
<listitem>
<para>
<emphasis>* zaheerm</emphasis>
wonders how he can break gstreamer today :)
</para>
<para>
<emphasis>ensonic</emphasis>:
zaheerm, spider is always a good starting point
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>14 Jun 2004</term>
<listitem>
<para>
<emphasis>teuf</emphasis>: ok, things work much better when I don't write incredibly stupid and buggy code
</para>
<para>
<emphasis>thaytan</emphasis>: I find that too
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>23 Nov 2003</term>
<listitem>
<para>
<emphasis>Uraeus</emphasis>: ah yes, the sleeping part, my mind
is not multitasking so I was still thinking about exercise
</para>
<para>
<emphasis>dolphy</emphasis>: Uraeus: your mind is multitasking
</para>
<para>
<emphasis>dolphy</emphasis>: Uraeus: you just miss low latency patches
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>14 Sep 2002</term>
<listitem>
<para>
--- <emphasis>wingo-party</emphasis> is now known as
<emphasis>wingo</emphasis>
</para>
<para>
* <emphasis>wingo</emphasis> holds head
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>16 Feb 2001</term>
<listitem>
<para>
<emphasis>wtay:</emphasis>
I shipped a few commerical products to &gt;40000 people now but
GStreamer is way more exciting...
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>16 Feb 2001</term>
<listitem>
<para>
*
<emphasis>tool-man</emphasis>
is a gstreamer groupie
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>14 Jan 2001</term>
<listitem>
<para>
<emphasis>Omega:</emphasis>
did you run ldconfig? maybe it talks to init?
</para>
<para>
<emphasis>wtay:</emphasis>
not sure, don't think so...
I did run gstreamer-register though :-)
</para>
<para>
<emphasis>Omega:</emphasis>
ah, that did it then ;-)
</para>
<para>
<emphasis>wtay:</emphasis>
right
</para>
<para>
<emphasis>Omega:</emphasis>
probably not, but in case GStreamer starts turning into an OS, someone please let me know?
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>9 Jan 2001</term>
<listitem>
<para>
<emphasis>wtay:</emphasis>
me tar, you rpm?
</para>
<para>
<emphasis>wtay:</emphasis>
hehe, forgot &quot;zan&quot;
</para>
<para>
<emphasis>Omega:</emphasis>
?
</para>
<para>
<emphasis>wtay:</emphasis>
me tar&quot;zan&quot;, you ...
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>7 Jan 2001</term>
<listitem>
<para>
<emphasis>Omega:</emphasis>
that means probably building an agreggating, cache-massaging
queue to shove N buffers across all at once, forcing cache
transfer.
</para>
<para>
<emphasis>wtay:</emphasis>
never done that before...
</para>
<para>
<emphasis>Omega:</emphasis>
nope, but it's easy to do in gstreamer &lt;g&gt;
</para>
<para>
<emphasis>wtay:</emphasis>
sure, I need to rewrite cp with gstreamer too, someday :-)
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>7 Jan 2001</term>
<listitem>
<para>
<emphasis>wtay:</emphasis>
GStreamer; always at least one developer is awake...
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>5/6 Jan 2001</term>
<listitem>
<para>
<emphasis>wtay:</emphasis>
we need to cut down the time to create an mp3 player down to
seconds...
</para>
<para>
<emphasis>richardb:</emphasis>
:)
</para>
<para>
<emphasis>Omega:</emphasis>
I'm wanting to something more interesting soon, I did the "draw an mp3
player in 15sec" back in October '99.
</para>
<para>
<emphasis>wtay:</emphasis>
by the time Omega gets his hands on the editor, you'll see a
complete audio mixer in the editor :-)
</para>
<para>
<emphasis>richardb:</emphasis>
Well, it clearly has the potential...
</para>
<para>
<emphasis>Omega:</emphasis>
Working on it... ;-)
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>28 Dec 2000</term>
<listitem>
<para>
<emphasis>MPAA:</emphasis>
We will sue you now, you have violated our IP rights!
</para>
<para>
<emphasis>wtay:</emphasis>
hehehe
</para>
<para>
<emphasis>MPAA:</emphasis>
How dare you laugh at us? We have lawyers! We have Congressmen! We have <emphasis>LARS</emphasis>!
</para>
<para>
<emphasis>wtay:</emphasis>
I'm so sorry your honor
</para>
<para>
<emphasis>MPAA:</emphasis>
Hrumph.
</para>
<para>
*
<emphasis>wtay</emphasis>
bows before thy
</para>
</listitem>
</varlistentry>
<varlistentry>
<term>4 Jun 2001</term>
<listitem>
<para><emphasis>taaz:</emphasis> you witchdoctors and your voodoo mpeg2 black magic... </para>
<para><emphasis>omega_:</emphasis> um. I count three, no four different cults there &lt;g&gt; </para>
<para><emphasis>ajmitch:</emphasis> hehe </para>
<para><emphasis>omega_:</emphasis> witchdoctors, voodoo, black magic, </para>
<para><emphasis>omega_:</emphasis> and mpeg </para>
</listitem>
</varlistentry>
</variablelist>
</chapter>

View file

@ -1,42 +0,0 @@
<chapter id="chapter-scheduler">
<title>Understanding schedulers</title>
<para>
The scheduler is responsible for managing the plugins at runtime. Its
main responsibilities are:
<itemizedlist>
<listitem>
<para>
Preparing the plugins so they can be scheduled.
</para>
</listitem>
<listitem>
<para>
Monitoring state changes and enabling/disabling the element in the
chain.
</para>
</listitem>
<listitem>
<para>
Choosing an element as the entry point for the pipeline.
</para>
</listitem>
<listitem>
<para>
Selecting and distributing the global clock.
</para>
</listitem>
</itemizedlist>
</para>
<para>
The scheduler is a pluggable component; this means that alternative
schedulers can be written and plugged into GStreamer. The default scheduler
uses cothreads to schedule the plugins in a pipeline. Cothreads are fast
and lightweight user-space threads.
</para>
<para>
There is usually no need to interact with the scheduler directly. However,
in some cases it is desirable to set a specific clock or to force a specific
plugin to be the entry point of the pipeline.
</para>
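  <para>
    As a sketch of such an interaction, the following fragment forces the
    containing bin to distribute the clock provided by an audio sink. The
    calls used here (gst_element_get_clock () and gst_bin_use_clock ()) are
    assumptions based on the 0.8 API rather than a definitive recipe:
  </para>
  <programlisting>
GstClock *clock;

/* ask the audio sink for the clock it provides
 * (gst_element_get_clock () / gst_bin_use_clock () assumed from the 0.8 API) */
clock = gst_element_get_clock (audiosink);

/* and force the bin to distribute this clock to all of its elements */
if (clock != NULL)
  gst_bin_use_clock (GST_BIN (bin), clock);
  </programlisting>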
</chapter>

View file

@ -1,48 +0,0 @@
<chapter id="chapter-states-api">
<title>Element states</title>
<sect1 id="section-states-api">
<title>Changing element state</title>
<para>
The state of an element can be changed with the following code:
</para>
<programlisting>
GstElement *bin;
/* create a bin, put elements in it and link them */
...
gst_element_set_state (bin, GST_STATE_PLAYING);
...
</programlisting>
<para>
You can set the following states on an element:
</para>
<informaltable pgwide="1" frame="none" role="enum">
<tgroup cols="2">
<tbody>
<row>
<entry><literal>GST_STATE_NULL</literal></entry>
<entry>Resets the state of an element.
</entry>
</row>
<row>
<entry><literal>GST_STATE_READY</literal></entry>
<entry>Makes the element ready to start processing data.
</entry>
</row>
<row>
<entry><literal>GST_STATE_PAUSED</literal></entry>
<entry>Temporarily stops the data flow.
</entry>
</row>
<row>
<entry><literal>GST_STATE_PLAYING</literal></entry>
<entry>Means that data is flowing through the graph.
</entry>
</row>
</tbody>
</tgroup>
</informaltable>
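    <para>
      gst_element_set_state () also returns a value that tells you whether the
      requested state change succeeded. A minimal sketch, assuming the
      GST_STATE_SUCCESS return value of the 0.8 API:
    </para>
    <programlisting>
/* GST_STATE_SUCCESS is assumed from the 0.8 GstElementStateReturn values */
if (gst_element_set_state (bin, GST_STATE_PLAYING) != GST_STATE_SUCCESS)
  g_warning ("could not set the pipeline to the PLAYING state");
    </programlisting>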
</sect1>
</chapter>

View file

@ -1,141 +0,0 @@
<chapter id="chapter-states">
<title>Element states</title>
<para>
Once you have created a pipeline packed with elements, nothing will happen
right away. This is where the different states come into play.
</para>
<sect1 id="section-states">
<title>The different element states</title>
<para>
An element can be in one of the following four states:
<itemizedlist>
<listitem>
<para>
NULL: this is the default state all elements are in when they are created
and are doing nothing.
</para>
</listitem>
<listitem>
<para>
READY: An element is ready to start doing something.
</para>
</listitem>
<listitem>
<para>
PAUSED: The element is paused for a period of time.
</para>
</listitem>
<listitem>
<para>
PLAYING: The element is doing something.
</para>
</listitem>
</itemizedlist>
</para>
<para>
All elements start in the NULL state. The elements will go through
the following state changes: NULL -&gt; READY -&gt; PAUSED -&gt;
PLAYING. When going from NULL to PLAYING, GStreamer will
internally go through the intermediate states.
</para>
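    <para>
      In other words, stepping through the states yourself and letting the
      framework do it for you are equivalent. Assuming an element or bin
      called bin, both of the following sketches end up in the PLAYING state:
    </para>
    <programlisting>
/* stepping through the states one by one... */
gst_element_set_state (bin, GST_STATE_READY);
gst_element_set_state (bin, GST_STATE_PAUSED);
gst_element_set_state (bin, GST_STATE_PLAYING);

/* ...is equivalent to letting the framework walk the intermediate states */
gst_element_set_state (bin, GST_STATE_PLAYING);
    </programlisting>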
<para>
You can set the following states on an element:
</para>
<informaltable pgwide="1" frame="none" role="enum">
<tgroup cols="2">
<tbody>
<row>
<entry><literal>GST_STATE_NULL</literal></entry>
<entry>Resets the state of an element.
</entry>
</row>
<row>
<entry><literal>GST_STATE_READY</literal></entry>
<entry>Makes the element ready to start processing data.
</entry>
</row>
<row>
<entry><literal>GST_STATE_PAUSED</literal></entry>
<entry>Temporarily stops the data flow.
</entry>
</row>
<row>
<entry><literal>GST_STATE_PLAYING</literal></entry>
<entry>Means that data is flowing through the graph.
</entry>
</row>
</tbody>
</tgroup>
</informaltable>
</sect1>
<sect1 id="section-states-null">
<title>The NULL state</title>
<para>
When you create the pipeline, all of its elements will be in the NULL state. There is
nothing special about the NULL state.
</para>
<note>
<para>
Don't forget to reset the pipeline to the NULL state when you are not going to use it
anymore. This will allow the elements to free the resources they might use.
</para>
</note>
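    <para>
      In practice, tearing down a pipeline therefore usually looks like the
      following sketch (assuming a previously created pipeline or bin):
    </para>
    <programlisting>
/* stop the data flow and let the elements free their resources */
gst_element_set_state (bin, GST_STATE_NULL);

/* drop our reference to the bin and everything inside it */
gst_object_unref (GST_OBJECT (bin));
    </programlisting>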
</sect1>
<sect1 id="section-states-ready">
<title>The READY state</title>
<para>
You will start the pipeline by first setting it to the READY state. This will allow the
pipeline and all the elements contained in it to prepare themselves for the actions
they are about to perform.
</para>
<para>
Typical actions that an element will perform in the READY state are opening a file or
an audio device. Some more complex elements might have a non-trivial action to perform in
the READY state, such as connecting to a media server over a CORBA connection.
</para>
<note>
<para>
You can also go from the NULL to PLAYING state directly without
going through the READY state. This is a shortcut; the framework
will internally go through the READY and the PAUSED state for you.
</para>
</note>
</sect1>
<sect1 id="section-states-paused">
<title>The PAUSED state</title>
<para>
A pipeline that is playing can be set to the PAUSED state. This will temporarily stop all
data flowing through the pipeline.
</para>
<para>
You can resume the data flow by setting the pipeline back to the PLAYING state.
</para>
<note>
<para>
The PAUSED state is available for temporarily freezing the pipeline.
Elements will typically not free their resources in the PAUSED state.
Use the NULL state if you want to stop the data flow permanently.
</para>
</note>
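    <para>
      Pausing and resuming therefore come down to two state changes, as in the
      following sketch:
    </para>
    <programlisting>
/* freeze the data flow */
gst_element_set_state (bin, GST_STATE_PAUSED);
...
/* and continue where we left off */
gst_element_set_state (bin, GST_STATE_PLAYING);
    </programlisting>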
<para>
The pipeline has to be in the PAUSED or NULL state if you want to insert or modify an element
in the pipeline. We will cover dynamic pipeline behaviour in <xref linkend="chapter-dynamic"/>.
</para>
</sect1>
<sect1 id="section-states-playing">
<title>The PLAYING state</title>
<para>
A pipeline can be started by setting it to the PLAYING state. At
that time data will start to flow all the way through the pipeline.
</para>
</sect1>
</chapter>

View file

@ -1,168 +0,0 @@
<chapter id="chapter-threads">
<title>Threads</title>
<para>
GStreamer has support for multithreading through the use of
the <ulink type="http" url="../../gstreamer/html/GstThread.html"><classname>
GstThread</classname></ulink> object. This object is in fact
a special <ulink type="http" url="../../gstreamer/html/GstBin.html"><classname>
GstBin</classname></ulink> that will become a thread when started.
</para>
<para>
To construct a new thread, you would do something like the following:
</para>
<para>
<programlisting>
GstElement *my_thread;
/* create the thread object */
my_thread = gst_thread_new ("my_thread");
/* you could have used gst_element_factory_make ("thread", "my_thread"); */
g_return_if_fail (my_thread != NULL);
/* add some plugins */
gst_bin_add (GST_BIN (my_thread), GST_ELEMENT (funky_src));
gst_bin_add (GST_BIN (my_thread), GST_ELEMENT (cool_effect));
/* link the elements here... */
...
/* start playing */
gst_element_set_state (GST_ELEMENT (my_thread), GST_STATE_PLAYING);
</programlisting>
</para>
<para>
The above program will create a thread with two elements in it. As soon
as it is set to the PLAYING state, the thread will start to iterate
itself. You never need to explicitly iterate a thread.
</para>
<sect1 id="section-threads-constraints">
<title>Constraints placed on the pipeline by the GstThread</title>
<para>
Within the pipeline, everything is the same as in any other bin. The
difference lies at the thread boundary, at the link between the
thread and the outside world (containing bin). Since GStreamer is
fundamentally buffer-oriented rather than byte-oriented, the natural
solution to this problem is an element that can "buffer" the buffers
between the threads, in a thread-safe fashion. This element is the
queue, described more fully in <xref linkend="chapter-queues"/>. It doesn't
matter if the queue is placed in the containing bin or in the thread
itself, but it needs to be present on one side or the other to enable
inter-thread communication.
</para>
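    <para>
      A minimal sketch of such a boundary, modelled on the queue example in
      <xref linkend="chapter-queues"/> (element creation and error checking
      omitted), could look like this:
    </para>
    <programlisting>
/* the decoder and the sink live inside the thread, the source and the
 * queue live in the containing bin; the queue forms the thread boundary */
gst_bin_add_many (GST_BIN (thread), decoder, audiosink, NULL);
gst_bin_add_many (GST_BIN (bin), filesrc, queue, thread, NULL);

gst_element_link (filesrc, queue);
gst_element_link_many (queue, decoder, audiosink, NULL);
    </programlisting>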
</sect1>
<sect1 id="section-threads-when">
<title>When would you want to use a thread?</title>
<para>
If you are writing a GUI application, making the top-level bin a thread will make your GUI
more responsive. If it were a pipeline instead, it would have to be iterated by your
application's event loop, which increases the latency between events (say, keyboard presses)
and responses from the GUI. In addition, any slight hang in the GUI would delay iteration of
the pipeline, which (for example) could cause pops in the output of the sound card, if it is
an audio pipeline.
</para>
<para>
<xref linkend="section-threads-img"/> shows how a thread can be visualised.
</para>
<figure float="1" id="section-threads-img">
<title>A thread</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/thread.&image;" format="&IMAGE;" />
</imageobject>
</mediaobject>
</figure>
<para>
As an example we show the helloworld program using a thread.
</para>
<para>
<programlisting>
/* example-begin threads.c */
#include &lt;stdlib.h&gt;
#include &lt;gst/gst.h&gt;
/* we set this to TRUE right before gst_main (), but there could still
be a race condition between setting it and entering the function */
gboolean can_quit = FALSE;
/* eos will be called when the src element has an end of stream */
void
eos (GstElement *src, gpointer data)
{
GstThread *thread = GST_THREAD (data);
g_print ("have eos, quitting\n");
/* stop the bin */
gst_element_set_state (GST_ELEMENT (thread), GST_STATE_NULL);
while (!can_quit) /* waste cycles */ ;
gst_main_quit ();
}
int
main (int argc, char *argv[])
{
GstElement *filesrc, *demuxer, *decoder, *converter, *audiosink;
GstElement *thread;
if (argc &lt; 2) {
g_print ("usage: %s &lt;Ogg/Vorbis filename&gt;\n", argv[0]);
exit (-1);
}
gst_init (&amp;argc, &amp;argv);
/* create a new thread to hold the elements */
thread = gst_thread_new ("thread");
g_assert (thread != NULL);
/* create a disk reader */
filesrc = gst_element_factory_make ("filesrc", "disk_source");
g_assert (filesrc != NULL);
g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
g_signal_connect (G_OBJECT (filesrc), "eos",
G_CALLBACK (eos), thread);
/* create an ogg demuxer */
demuxer = gst_element_factory_make ("oggdemux", "demuxer");
g_assert (demuxer != NULL);
/* create a vorbis decoder */
decoder = gst_element_factory_make ("vorbisdec", "decoder");
g_assert (decoder != NULL);
/* create an audio converter */
converter = gst_element_factory_make ("audioconvert", "converter");
g_assert (converter != NULL);
/* and an audio sink */
audiosink = gst_element_factory_make ("osssink", "play_audio");
g_assert (audiosink != NULL);
/* add objects to the thread */
gst_bin_add_many (GST_BIN (thread), filesrc, demuxer, decoder, converter, audiosink, NULL);
/* link them in the logical order */
gst_element_link_many (filesrc, demuxer, decoder, converter, audiosink, NULL);
/* start playing */
gst_element_set_state (thread, GST_STATE_PLAYING);
/* do whatever you want here, the thread will be playing */
g_print ("thread is playing\n");
can_quit = TRUE;
gst_main ();
gst_object_unref (GST_OBJECT (thread));
exit (0);
}
/* example-end threads.c */
</programlisting>
</para>
</sect1>
</chapter>

View file

@ -1,145 +0,0 @@
<chapter id="chapter-typedetection">
<title>Type Detection</title>
<para>
Sometimes the capabilities of a pad are not specified. The filesrc
element, for example, does not know what type of file it is reading. Before
you can attach an element to the pad of the filesrc, you need to determine
the media type in order to be able to choose a compatible element.
</para>
<para>
To solve this problem, a plugin can provide the <application>GStreamer</application>
core library with a type definition. The type definition
will contain the following information:
<itemizedlist>
<listitem>
<para>
The MIME type we are going to define.
</para>
</listitem>
<listitem>
<para>
An optional string with a list of possible file extensions this
type is usually associated with. The list entries are separated by a
space, e.g. ".mp3 .mpa .mpg".
</para>
</listitem>
<listitem>
<para>
An optional typefind function.
</para>
</listitem>
</itemizedlist>
</para>
<para>
The typefind functions give a meaning to the MIME types that are used
in GStreamer. The typefind function is a function with the following definition:
</para>
<programlisting>
typedef GstCaps *(*GstTypeFindFunc) (GstBuffer *buf, gpointer priv);
</programlisting>
<para>
This typefind function will inspect a GstBuffer with data and will output
a GstCaps structure describing the type. If the typefind function does not
understand the buffer contents, it will return NULL.
</para>
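  <para>
    As an illustration, a hypothetical typefind function that recognizes MP3
    files by looking for an ID3 tag at the start of the buffer could look like
    the sketch below (memcmp () needs &lt;string.h&gt;):
  </para>
  <programlisting>
/* a hypothetical typefind function matching the typedef above */
static GstCaps *
mp3_type_find (GstBuffer *buf, gpointer priv)
{
  /* an ID3v2 tag starts with the three bytes "ID3" */
  if (GST_BUFFER_SIZE (buf) &gt;= 3 &amp;&amp;
      memcmp (GST_BUFFER_DATA (buf), "ID3", 3) == 0)
    return gst_caps_new_simple ("audio/mpeg", NULL);

  /* not something we recognize, let another typefind function try */
  return NULL;
}
  </programlisting>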
<para>
<application>GStreamer</application> has a typefind element in the set
of core elements
that can be used to determine the type of a given pad.
</para>
<para>
The next example will show how a typefind element can be inserted into a pipeline
to detect the media type of a file. It will output the capabilities of the pad into
an XML representation.
</para>
<programlisting>
#include &lt;stdlib.h&gt;
#include &lt;gst/gst.h&gt;
void type_found (GstElement *typefind, GstCaps* caps);
int
main(int argc, char *argv[])
{
GstElement *bin, *filesrc, *typefind;
gst_init (&amp;argc, &amp;argv);
if (argc != 2) {
g_print ("usage: %s &lt;filename&gt;\n", argv[0]);
exit (-1);
}
/* create a new bin to hold the elements */
bin = gst_bin_new ("bin");
g_assert (bin != NULL);
/* create a disk reader */
filesrc = gst_element_factory_make ("filesrc", "disk_source");
g_assert (filesrc != NULL);
g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
/* create the typefind element */
typefind = gst_element_factory_make ("typefind", "typefind");
g_assert (typefind != NULL);
/* add objects to the main pipeline */
gst_bin_add_many (GST_BIN (bin), filesrc, typefind, NULL);
g_signal_connect (G_OBJECT (typefind), "have_type",
G_CALLBACK (type_found), NULL);
gst_element_link (filesrc, typefind);
/* start playing */
gst_element_set_state (GST_ELEMENT (bin), GST_STATE_PLAYING);
gst_bin_iterate (GST_BIN (bin));
gst_element_set_state (GST_ELEMENT (bin), GST_STATE_NULL);
exit (0);
}
</programlisting>
<para>
We create a very simple pipeline with only a filesrc and a typefind
element in it. The sink pad of the typefind element is linked
to the source pad of the filesrc.
</para>
<para>
We attach a handler to the 'have_type' signal of the typefind element, which will be called
when the type of the media stream has been detected.
</para>
<para>
The typefind element will loop over all the registered types and will
execute each of their typefind functions. As soon as one of them returns
a GstCaps pointer, the type_found callback will be called:
</para>
<programlisting>
void
type_found (GstElement *typefind, GstCaps* caps)
{
xmlDocPtr doc;
xmlNodePtr parent;
doc = xmlNewDoc ("1.0");
doc-&gt;root = xmlNewDocNode (doc, NULL, "Capabilities", NULL);
parent = xmlNewChild (doc-&gt;root, NULL, "Caps1", NULL);
gst_caps_save_thyself (caps, parent);
xmlDocDump (stdout, doc);
}
</programlisting>
<para>
In the type_found function we can print or inspect the type that has been
detected using the GstCaps APIs. In this example, we just print out the
XML representation of the caps structure to stdout.
</para>
<para>
A more useful option would be to use the registry to look up an element
that can handle this particular caps structure, or to use the
autoplugger to link this caps structure to, for example, a videosink.
</para>
</chapter>

View file

@ -1,85 +0,0 @@
<chapter id="chapter-win32">
<title>Windows support</title>
<sect1 id="section-win32-build">
<title>Building <application>GStreamer</application> under Win32</title>
<para>There are different makefiles that can be used to build GStreamer with the usual Microsoft
compiling tools.</para>
<para>The Makefile is meant to be used with the GNU make program and the free
version of the Microsoft compiler (<ulink url="http://msdn.microsoft.com/visualc/vctoolkit2003/">http://msdn.microsoft.com/visualc/vctoolkit2003/</ulink>). You also
have to modify your system environment variables to use it from the command-line. You will also
need a working Platform SDK for Windows that is available for free from Microsoft.</para>
<para>The projects/makefiles will automatically generate some source files needed to compile
GStreamer. This requires that you have some GNU tools installed on your system and that they are
available in your system PATH.</para>
<para>The GStreamer project depends on other libraries, namely:</para>
<itemizedlist>
<listitem><para>GLib</para></listitem>
<listitem><para>popt</para></listitem>
<listitem><para>libxml2</para></listitem>
<listitem><para>libintl</para></listitem>
<listitem><para>libiconv</para></listitem>
</itemizedlist>
<para>There is now a package that has all these dependencies built with MSVC7.1. It exists either as precompiled libraries
and headers in both Release and Debug mode, or as a source package to build it yourself. You can
find it on <ulink url="http://mukoli.free.fr/gstreamer/deps/">http://mukoli.free.fr/gstreamer/deps/</ulink>.</para>
<note>
<title>Notes</title>
<para>The following GNU tools are needed; you can find them on <ulink url="http://gnuwin32.sourceforge.net/">http://gnuwin32.sourceforge.net/</ulink>:</para>
<itemizedlist>
<listitem><para>GNU flex (tested with 2.5.4)</para></listitem>
<listitem><para>GNU bison (tested with 1.35)</para></listitem>
</itemizedlist>
<para>and on <ulink url="http://www.mingw.org/">http://www.mingw.org/</ulink>:</para>
<itemizedlist>
<listitem><para>GNU make (tested with 3.80)</para></listitem>
</itemizedlist>
<para>The generated files from the -auto makefiles will soon be available separately on the net,
for the convenience of people who do not want to install the GNU tools.</para>
</note>
</sect1>
<sect1 id="section-win32-install">
<title>Installation on the system</title>
<para>By default, GStreamer needs a registry. You have to generate it using "gst-register.exe". It will create
the file c:\gstreamer\registry.xml, which will hold all the plugins you can use.</para>
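<para>For example, assuming the installation layout described below, you would run the following from a command prompt:</para>
<screen>
c:\gstreamer\bin\gst-register.exe
</screen>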
<para>You should install the GStreamer core in c:\gstreamer\bin and the plugins in c:\gstreamer\plugins. Both
directories should be added to your system PATH. The library dependencies should be installed in c:\usr.</para>
<para>For example, my current setup is:</para>
<itemizedlist>
<listitem><para><filename>c:\gstreamer\registry.xml</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gst-inspect.exe</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gst-launch.exe</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gst-register.exe</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstbytestream.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstelements.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstoptimalscheduler.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\gstspider.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\bin\libgstreamer-0.8.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\plugins\gst-libs.dll</filename></para></listitem>
<listitem><para><filename>c:\gstreamer\plugins\gstmatroska.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\iconv.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\intl.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libglib-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libgmodule-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libgobject-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libgthread-2.0-0.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\libxml2.dll</filename></para></listitem>
<listitem><para><filename>c:\usr\bin\popt.dll</filename></para></listitem>
</itemizedlist>
</sect1>
</chapter>

View file

@ -1,283 +0,0 @@
<chapter id="chapter-xml">
<title>XML in <application>GStreamer</application></title>
<para>
<application>GStreamer</application> uses XML to store and load
its pipeline definitions. XML is also used internally to manage the
plugin registry. The plugin registry is a file that contains the definitions
of all the plugins <application>GStreamer</application> knows about, so that it
has quick access to the specifics of each plugin.
</para>
<para>
We will show you how you can save a pipeline to XML and how you can reload that
XML file again for later use.
</para>
<sect1 id="section-xml-write">
<title>Turning GstElements into XML</title>
<para>
We create a simple pipeline and write it to stdout with
gst_xml_write_file (). The following code constructs an MP3 player
pipeline with two threads and then writes out the XML both to stdout
and to a file. Use this program with one argument: the MP3 file on disk.
</para>
<programlisting>
/* example-begin xml-mp3.c */
#include &lt;stdio.h&gt;
#include &lt;stdlib.h&gt;
#include &lt;gst/gst.h&gt;
gboolean playing;
int
main (int argc, char *argv[])
{
GstElement *filesrc, *osssink, *queue, *queue2, *decode;
GstElement *bin;
GstElement *thread, *thread2;
gst_init (&amp;argc,&amp;argv);
if (argc != 2) {
g_print ("usage: %s &lt;mp3 filename&gt;\n", argv[0]);
exit (-1);
}
/* create a new thread to hold the elements */
thread = gst_element_factory_make ("thread", "thread");
g_assert (thread != NULL);
thread2 = gst_element_factory_make ("thread", "thread2");
g_assert (thread2 != NULL);
/* create a new bin to hold the elements */
bin = gst_bin_new ("bin");
g_assert (bin != NULL);
/* create a disk reader */
filesrc = gst_element_factory_make ("filesrc", "disk_source");
g_assert (filesrc != NULL);
g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
queue = gst_element_factory_make ("queue", "queue");
queue2 = gst_element_factory_make ("queue", "queue2");
/* and an audio sink */
osssink = gst_element_factory_make ("osssink", "play_audio");
g_assert (osssink != NULL);
decode = gst_element_factory_make ("mad", "decode");
g_assert (decode != NULL);
/* add objects to the main bin */
gst_bin_add_many (GST_BIN (bin), filesrc, queue, NULL);
gst_bin_add_many (GST_BIN (thread), decode, queue2, NULL);
gst_bin_add (GST_BIN (thread2), osssink);
gst_element_link_many (filesrc, queue, decode, queue2, osssink, NULL);
gst_bin_add_many (GST_BIN (bin), thread, thread2, NULL);
/* write the bin to stdout */
gst_xml_write_file (GST_ELEMENT (bin), stdout);
/* write the bin to a file */
gst_xml_write_file (GST_ELEMENT (bin), fopen ("xmlTest.gst", "w"));
exit (0);
}
/* example-end xml-mp3.c */
</programlisting>
<para>
The most important line is:
</para>
<programlisting>
gst_xml_write_file (GST_ELEMENT (bin), stdout);
</programlisting>
<para>
gst_xml_write_file () will turn the given element into an xmlDocPtr that
is then formatted and saved to a file. To save to disk, pass the result
of a fopen(2) as the second argument.
</para>
<para>
The complete element hierarchy will be saved along with the inter-element
pad links and the element parameters. Future <application>GStreamer</application>
versions will also allow you to store the signals in the XML file.
</para>
</sect1>
<sect1 id="section-xml-load">
<title>Loading a GstElement from an XML file</title>
<para>
Before an XML file can be loaded, you must create a GstXML object.
A saved XML file can then be loaded with the
gst_xml_parse_file (xml, filename, rootelement) method.
The root element can optionally be left NULL. The following code example loads
the previously created XML file and runs it.
</para>
<programlisting>
#include &lt;stdlib.h&gt;
#include &lt;gst/gst.h&gt;
int
main(int argc, char *argv[])
{
GstXML *xml;
GstElement *bin;
gboolean ret;
gst_init (&amp;argc, &amp;argv);
xml = gst_xml_new ();
ret = gst_xml_parse_file(xml, "xmlTest.gst", NULL);
g_assert (ret == TRUE);
bin = gst_xml_get_element (xml, "bin");
g_assert (bin != NULL);
gst_element_set_state (bin, GST_STATE_PLAYING);
while (gst_bin_iterate(GST_BIN(bin)));
gst_element_set_state (bin, GST_STATE_NULL);
exit (0);
}
</programlisting>
<para>
gst_xml_get_element (xml, "name") can be used to get a specific element
from the XML file.
</para>
<para>
gst_xml_get_topelements (xml) can be used to get a list of all toplevel elements
in the XML file.
</para>
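    <para>
      The returned list can be walked like any other GList. A small sketch,
      assuming the list holds GstElement pointers as in the 0.8 API:
    </para>
    <programlisting>
GList *walk;

/* the list is assumed to contain GstElement pointers */
for (walk = gst_xml_get_topelements (xml); walk != NULL; walk = walk-&gt;next) {
  GstElement *element = GST_ELEMENT (walk-&gt;data);

  g_print ("found toplevel element '%s'\n",
           gst_object_get_name (GST_OBJECT (element)));
}
    </programlisting>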
<para>
In addition to loading a file, you can also load from an xmlDocPtr or
an in-memory buffer using gst_xml_parse_doc and gst_xml_parse_memory,
respectively. Both of these methods return a gboolean indicating
success or failure of the requested action.
</para>
</sect1>
<sect1 id="section-xml-custom">
<title>Adding custom XML tags into the core XML data</title>
<para>
It is possible to add custom XML tags to the core XML created with
gst_xml_write. This feature can be used by an application to add more
information to the saved pipeline. An editor could, for example, insert
the position of the elements on the screen using custom XML tags.
</para>
<para>
It is strongly suggested that you save and load the custom XML tags using
a namespace. This avoids the problem of having your XML tags
interfere with the core XML tags.
</para>
<para>
To insert a hook into the element saving procedure, you can connect
a signal to the GstElement using the following piece of code:
</para>
<programlisting>
xmlNsPtr ns;
...
ns = xmlNewNs (NULL, "http://gstreamer.net/gst-test/1.0/", "test");
...
thread = gst_element_factory_make ("thread", "thread");
g_signal_connect (G_OBJECT (thread), "object_saved",
G_CALLBACK (object_saved), g_strdup ("decoder thread"));
...
</programlisting>
<para>
When the thread is saved, the object_saved signal handler will be called. Our example
will insert a comment tag:
</para>
<programlisting>
static void
object_saved (GstObject *object, xmlNodePtr parent, gpointer data)
{
xmlNodePtr child;
child = xmlNewChild (parent, ns, "comment", NULL);
xmlNewChild (child, ns, "text", (gchar *)data);
}
</programlisting>
<para>
Adding the custom tag code to the above example you will get an XML file
with the custom tags in it. Here's an excerpt:
</para>
<programlisting>
...
&lt;gst:element&gt;
&lt;gst:name&gt;thread&lt;/gst:name&gt;
&lt;gst:type&gt;thread&lt;/gst:type&gt;
&lt;gst:version&gt;0.1.0&lt;/gst:version&gt;
...
&lt;/gst:children&gt;
&lt;test:comment&gt;
&lt;test:text&gt;decoder thread&lt;/test:text&gt;
&lt;/test:comment&gt;
&lt;/gst:element&gt;
...
</programlisting>
<para>
To retrieve the custom XML again, you need to attach a signal handler to
the GstXML object used to load the XML data. You can then parse your
custom XML from the XML tree whenever an object is loaded.
</para>
<para>
We can extend our previous example with the following piece of
code.
</para>
<programlisting>
xml = gst_xml_new ();
g_signal_connect (G_OBJECT (xml), "object_loaded",
G_CALLBACK (xml_loaded), xml);
ret = gst_xml_parse_file (xml, "xmlTest.gst", NULL);
g_assert (ret == TRUE);
</programlisting>
<para>
Whenever a new object has been loaded, the xml_loaded function will
be called. This function looks like:
</para>
<programlisting>
static void
xml_loaded (GstXML *xml, GstObject *object, xmlNodePtr self, gpointer data)
{
xmlNodePtr children = self-&gt;xmlChildrenNode;
while (children) {
if (!strcmp (children-&gt;name, "comment")) {
xmlNodePtr nodes = children-&gt;xmlChildrenNode;
while (nodes) {
if (!strcmp (nodes-&gt;name, "text")) {
gchar *name = g_strdup (xmlNodeGetContent (nodes));
g_print ("object %s loaded with comment '%s'\n",
gst_object_get_name (object), name);
}
nodes = nodes-&gt;next;
}
}
children = children-&gt;next;
}
}
</programlisting>
<para>
As you can see, you'll get a handle to the GstXML object, the
newly loaded GstObject and the xmlNodePtr that was used to create
this object. In the above example we look for our special tag inside
the XML tree that was used to load the object and we print our
comment to the console.
</para>
</sect1>
</chapter>

View file

@ -611,13 +611,6 @@ gst_my_filter_probe_interface_init (GstPropertyProbeInterface *iface)
</para>
</sect1>
<sect1 id="section-iface-profile" xreflabel="Profile Interface">
<title>Profile Interface</title>
<para>
WRITEME
</para>
</sect1>
<sect1 id="section-iface-xoverlay" xreflabel="X Overlay Interface">
<title>X Overlay Interface</title>
<para>