typos and style fixes

Original commit message from CVS:
typos and style fixes
Thomas Vander Stichele 2003-10-09 12:42:49 +00:00
parent deb5b5e3c7
commit 3b7659725f
45 changed files with 533 additions and 379 deletions

View file

@ -82,8 +82,8 @@
</para>
<para>
In our helloworld example the elements we constructed would have the
following MIME types associated with their source and sink pads:
<xref linkend="sec-mime-img"/> shows the MIME types associated with
each pad from the "hello world" example.
</para>
<figure float="1" id="sec-mime-img">
<title>The Hello world pipeline with MIME types</title>
@ -103,7 +103,7 @@
<para>
The typing of the source and sink pads also makes it possible to
'autoplug' a pipeline. We will have the ability to say: "construct
me a pipeline that does an audio/mpeg to audio/raw conversion".
a pipeline that does an audio/mpeg to audio/raw conversion".
</para>
<note>
<para>

View file

@ -28,7 +28,7 @@
</itemizedlist>
</para>
<para>
The scheduler is a plugable component; this means that alternative
The scheduler is a pluggable component; this means that alternative
schedulers can be written and plugged into GStreamer. The default scheduler
uses cothreads to schedule the plugins in a pipeline. Cothreads are fast
and lightweight user-space threads.

View file

@ -63,7 +63,7 @@
an audio pipeline.
</para>
<para>
A thread can be visualised as below
<xref linkend="sec-threads-img"/> shows how a thread can be visualised.
</para>
<figure float="1" id="sec-threads-img">
<title>A thread</title>

View file

@ -1,24 +1,24 @@
<chapter id="cha-gnome">
<title>Gnome integration</title>
<title>GNOME integration</title>
<para>
GStreamer is fairly easy to integrate with Gnome applications.
GStreamer is fairly easy to integrate with GNOME applications.
GStreamer uses libxml 2.0, GLib 2.0 and popt, as do all other
Gnome applications.
There are however some basic issues you need to address in your Gnome
GNOME applications.
There are however some basic issues you need to address in your GNOME
applications.
</para>
<sect1>
<title>Command line options</title>
<para>
Gnome applications call gnome_program_init () to parse command-line
GNOME applications call gnome_program_init () to parse command-line
options and initialize the necessary gnome modules.
GStreamer applications normally call gst_init (&amp;argc, &amp;argv) to
do the same for GStreamer.
</para>
<para>
Each of these two swallows the program options passed to the program,
so we need a different way to allow both Gnome and GStreamer to parse
so we need a different way to allow both GNOME and GStreamer to parse
the command-line options. This is shown in the following example.
</para>
@ -34,7 +34,7 @@ main (int argc, char **argv)
{ NULL, '\0', POPT_ARG_INCLUDE_TABLE, NULL, 0, "GStreamer", NULL },
POPT_TABLEEND
};
GnomeProgram *program;
GNOMEProgram *program;
poptContext context;
const gchar **argvn;
@ -63,7 +63,7 @@ main (int argc, char **argv)
</programlisting>
<para>
If you try out this program, you will see that when called with
--help, it will print out both GStreamer and Gnome help arguments.
--help, it will print out both GStreamer and GNOME help arguments.
All of the arguments that didn't belong to either end up in the
argvn pointer array.
</para>
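To make the combined option handling above concrete, here is a minimal sketch of how the two libraries can share one command line. It assumes the 0.x API of this era, where gst_init_get_popt_table () exposes GStreamer's popt option table and gnome_program_init () accepts an extra table through GNOME_PARAM_POPT_TABLE; the application name and version are placeholders, and the sketch only shows how the option tables are combined.

#include <gnome.h>
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  GnomeProgram *program;
  struct poptOption options[] = {
    { NULL, '\0', POPT_ARG_INCLUDE_TABLE, NULL, 0, "GStreamer", NULL },
    POPT_TABLEEND
  };

  /* hand GStreamer's popt option table to GNOME so that one parser
   * handles both sets of command-line options */
  options[0].arg = (void *) gst_init_get_popt_table ();

  program = gnome_program_init ("my_app", "0.1", LIBGNOMEUI_MODULE,
                                argc, argv,
                                GNOME_PARAM_POPT_TABLE, options,
                                NULL);
  g_assert (program != NULL);

  /* --help now lists both the GNOME and the GStreamer arguments */
  return 0;
}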

View file

@ -1,24 +1,24 @@
<chapter id="cha-gnome">
<title>Gnome integration</title>
<title>GNOME integration</title>
<para>
GStreamer is fairly easy to integrate with Gnome applications.
GStreamer is fairly easy to integrate with GNOME applications.
GStreamer uses libxml 2.0, GLib 2.0 and popt, as do all other
Gnome applications.
There are however some basic issues you need to address in your Gnome
GNOME applications.
There are however some basic issues you need to address in your GNOME
applications.
</para>
<sect1>
<title>Command line options</title>
<para>
Gnome applications call gnome_program_init () to parse command-line
GNOME applications call gnome_program_init () to parse command-line
options and initialize the necessary gnome modules.
GStreamer applications normally call gst_init (&amp;argc, &amp;argv) to
do the same for GStreamer.
</para>
<para>
Each of these two swallows the program options passed to the program,
so we need a different way to allow both Gnome and GStreamer to parse
so we need a different way to allow both GNOME and GStreamer to parse
the command-line options. This is shown in the following example.
</para>
@ -34,7 +34,7 @@ main (int argc, char **argv)
{ NULL, '\0', POPT_ARG_INCLUDE_TABLE, NULL, 0, "GStreamer", NULL },
POPT_TABLEEND
};
GnomeProgram *program;
GNOMEProgram *program;
poptContext context;
const gchar **argvn;
@ -63,7 +63,7 @@ main (int argc, char **argv)
</programlisting>
<para>
If you try out this program, you will see that when called with
--help, it will print out both GStreamer and Gnome help arguments.
--help, it will print out both GStreamer and GNOME help arguments.
All of the arguments that didn't belong to either end up in the
argvn pointer array.
</para>

View file

@ -247,11 +247,5 @@ Element Signals:
gst-inspect gstelements
</screen>
</sect1>
<sect1>
<title><command>gst-play</command></title>
<para>
A sample media player.
</para>
</sect1>
</chapter>

View file

@ -5,7 +5,7 @@
<application>GStreamer</application> is a lively project, with
developers from around the globe very actively contributing.
We often hang out on the #gstreamer IRC channel on
irc.openprojects.net: the following are a selection of amusing<footnote>
irc.freenode.org: the following are a selection of amusing<footnote>
<para>No guarantee of sense of humour compatibility is given.</para>
</footnote> quotes from our conversations.
</para>

View file

@ -95,11 +95,11 @@
<title>Using the <classname>GstAutoplugCache</classname> element</title>
<para>
The <classname>GstAutoplugCache</classname> element is used to cache the
media stream when performing typedetection. As we have have seen in the
previous chapter (typedetection), the type typefind function consumes a
buffer to determine the media type of it. After we have set up the pipeline
media stream when performing typedetection. As we have seen in
<xref linkend="cha-typedetection"/>, the typefind function consumes a
buffer to determine its media type. After we have set up the pipeline
to play the media stream we should be able to 'replay' the previous buffer(s).
This is where the autoplugcache is used for.
This is what the autoplugcache is used for.
</para>
<para>
The basic usage pattern for the autoplugcache in combination with the typefind
@ -151,9 +151,9 @@
<title>Another approach to autoplugging</title>
<para>
The autoplug API is interesting, but often impractical. It is static;
it cannot deal with dynamic pipelines (insert ref here). What you
often want is just an element to stick into a pipeline that will DWIM
(Do What I Mean)(ref). Enter the spider.
it cannot deal with dynamic pipelines. An element that will
automatically figure out and decode the type is more useful.
Enter the spider.
</para>
<sect2>
<title>The spider element</title>
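To give a concrete feel for the idea, the sketch below builds a pipeline around the spider: the application only names a source and a sink, and the spider is expected to work out the decoders in between. This is a hedged example against the 0.x API; it assumes the filesrc, spider, and osssink plugins are installed, and the element names are arbitrary.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *filesrc, *spider, *audiosink;

  gst_init (&argc, &argv);

  if (argc < 2) {
    g_print ("usage: %s <media file>\n", argv[0]);
    return 1;
  }

  pipeline  = gst_pipeline_new ("auto_player");
  filesrc   = gst_element_factory_make ("filesrc", "my_source");
  spider    = gst_element_factory_make ("spider", "my_autoplugger");
  audiosink = gst_element_factory_make ("osssink", "my_sink");

  g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);

  /* the spider is linked like any other filter element */
  gst_bin_add_many (GST_BIN (pipeline), filesrc, spider, audiosink, NULL);
  gst_element_link_many (filesrc, spider, audiosink, NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  while (gst_bin_iterate (GST_BIN (pipeline)));
  gst_element_set_state (pipeline, GST_STATE_NULL);

  return 0;
}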

View file

@ -1,11 +1,11 @@
<chapter id="cha-bins">
<title>Bins</title>
<para>
A Bin is a container element. You can add elements to a bin. Since a bin is
an <classname>GstElement</classname> itself, it can also be added to another bin.
A bin is a container element. You can add elements to a bin. Since a bin is
an element itself, it can also be added to another bin.
</para>
<para>
Bins allow you to combine linked elements into one logical element. You do
Bins allow you to combine a group of linked elements into one logical element. You do
not deal with the individual elements anymore but with just one element, the bin.
We will see that this is extremely powerful when you are going to construct
complex pipelines, since it allows you to break up the pipeline into smaller chunks.
@ -17,7 +17,7 @@
</para>
<figure float="1" id="sec-bin-img">
<title>Visualisation of a <classname>GstBin</classname> element with some elements in it</title>
<title>Visualisation of a bin with some elements in it</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/bin-element.&image;" format="&IMAGE;" />
@ -26,20 +26,21 @@
</figure>
<para>
There are two standard bins available to the GStreamer programmer:
There are two specialized bins available to the GStreamer programmer:
<itemizedlist>
<listitem>
<para>
A pipeline (<classname>GstPipeline</classname>). Which is a generic container you will
use most of the time. The toplevel bin has to be a pipeline.
a pipeline: a generic container that allows scheduling of the
contained elements. The toplevel bin has to be a pipeline.
Every application thus needs at least one of these.
</para>
</listitem>
<listitem>
<para>
A thread (<classname>GstThread</classname>). The plan for the
<classname>GstThread</classname> will be run in a separate thread. You will have to use
this bin if you have to carefully synchronize audio and video, for example. You will learn
a thread: a bin that will be run in a separate execution thread.
You will have to use this bin if you have to carefully
synchronize audio and video, or for buffering. You will learn
more about threads in <xref linkend="cha-threads"/>.
</para>
</listitem>
@ -84,9 +85,8 @@
...
</programlisting>
<para>
Bins and threads can be added to other bins too. This allows you to create nested bins. Note
that it doesn't make very much sense to add a <classname>GstPipeline</classname> to anything,
as it's a toplevel bin that needs to be explicitly iterated.
Bins and threads can be added to other bins too. This allows you to create nested bins. Pipelines shouldn't be added to any other element, though.
They are toplevel bins and they are directly linked to the scheduler.
</para>
<para>
To get an element from the bin you can use:
@ -180,7 +180,7 @@
<sect1 id="sec-bin-ghostpads">
<title>Ghost pads</title>
<para>
You can see from figure <xref linkend="sec-bin-noghost-img"/> how a bin has no pads of its own.
You can see from <xref linkend="sec-bin-noghost-img"/> how a bin has no pads of its own.
This is where "ghost pads" come into play.
</para>
<figure float="1" id="sec-bin-noghost-img">
@ -207,7 +207,8 @@
</mediaobject>
</figure>
<para>
Above is a representation of a ghost pad. The sink pad of element one is now also a pad
<xref linkend="sec-bin-ghost-img"/>
is a representation of a ghost pad. The sink pad of element one is now also a pad
of the bin.
</para>
<para>
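A minimal sketch of the nesting described above, assuming the 0.x API ("thread" and "fakesrc" are existing element factories of that era; the instance names are arbitrary):

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *thread, *src;

  gst_init (&argc, &argv);

  /* the toplevel bin: a pipeline */
  pipeline = gst_pipeline_new ("my_pipeline");

  /* a thread is itself a bin, so it can be nested inside the pipeline */
  thread = gst_element_factory_make ("thread", "my_thread");
  src    = gst_element_factory_make ("fakesrc", "my_source");

  gst_bin_add (GST_BIN (pipeline), thread);
  gst_bin_add (GST_BIN (thread), src);

  /* look the element up again by name */
  src = gst_bin_get_by_name (GST_BIN (thread), "my_source");
  g_print ("found element %s\n", gst_element_get_name (src));

  return 0;
}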

View file

@ -8,24 +8,29 @@
to deal with buffers yourself; the elements will do that for you.
</para>
<para>
The most important information in the buffer is:
A buffer consists of:
<itemizedlist>
<listitem>
<para>
A pointer to a piece of memory.
a pointer to a piece of memory.
</para>
</listitem>
<listitem>
<para>
The size of the memory.
the size of the memory.
</para>
</listitem>
<listitem>
<para>
a timestamp for the buffer.
</para>
</listitem>
<listitem>
<para>
A refcount that indicates how many elements are using this
buffer. This refcount will be used to destroy the buffer when no
element is having a reference to it.
element has a reference to it.
</para>
</listitem>
</itemizedlist>
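As a small illustration of these fields, the sketch below creates a buffer by hand and fills in its data pointer, size and timestamp. This is only a sketch against the 0.x API, where the GST_BUFFER_* macros give direct access to the buffer fields; a typical application never needs to do this itself.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstBuffer *buf;

  gst_init (&argc, &argv);

  buf = gst_buffer_new ();

  /* fill in the fields listed above: data pointer, size and timestamp */
  GST_BUFFER_DATA (buf) = g_malloc (1024);
  GST_BUFFER_SIZE (buf) = 1024;
  GST_BUFFER_TIMESTAMP (buf) = 0;

  g_print ("buffer holds %u bytes\n", GST_BUFFER_SIZE (buf));

  /* drop our reference; the refcount mechanism takes care of cleanup */
  gst_buffer_unref (buf);

  return 0;
}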

View file

@ -1,24 +1,28 @@
<chapter id="cha-elements">
<title>GstElement</title>
<title>Elements</title>
<para>
The most important object in <application>GStreamer</application> for the
application programmer is the <classname>GstElement</classname> object.
</para>
<sect1 id="sec-elements-design">
<title>What is a GstElement</title>
<title>What is an element?</title>
<para>
<classname>GstElement</classname> is the basic building block for the
media pipeline. All the different components you are going to use are
An element is the basic building block for the media pipeline.
All the different high-level components you are going to use are
derived from <classname>GstElement</classname>. This means that a
lot of functions you are going to use operate on objects of this class.
</para>
<para>
Elements, from the perspective of GStreamer, are viewed as "black boxes"
with a number of different aspects. One of these aspects is the presence
of "pads", or link points. This terminology arises from soldering;
of "pads" (see <xref linkend="cha-pads"/>), or link points. This terminology arises from soldering;
pads are where wires can be attached.
</para>
</sect1>
<sect1 id="sec-elements-types">
<title>Types of elements</title>
<sect2 id="sec-elements-src">
<title>Source elements</title>
@ -27,7 +31,8 @@
reading from disk or from a sound card.
</para>
<para>
Below you see how we will visualize the element.
<xref linkend="sec-element-srcimg"/> shows how we will visualise
a source element.
We always draw a source pad to the right of the element.
</para>
<figure float="1" id="sec-element-srcimg">
@ -48,7 +53,7 @@
<sect2 id="sec-elements-filter">
<title>Filters and codecs</title>
<para>
Filter elements both have input and output pads. They operate on
Filter elements have both input and output pads. They operate on
data they receive in their sink pads and produce data on their source
pads. For example, MPEG decoders and volume filters would fall into
this category.
@ -67,7 +72,8 @@
</mediaobject>
</figure>
<para>
The above figure shows the visualisation of a filter element.
<xref linkend="sec-element-filterimg"/> shows how we will visualise
a filter element.
This element has one sink (input) pad and one source (output) pad.
Sink pads are drawn on the left of the element.
</para>
@ -82,9 +88,9 @@
</mediaobject>
</figure>
<para>
The above figure shows the visualisation of a filter element with
<xref linkend="sec-element-filterimg"/> shows the visualisation of a filter element with
more than one output pad. An example of such a filter is the AVI
splitter (demultiplexer). This element will parse the input data and
demultiplexer. This element will parse the input data and
extract the audio and video data. Most of these filters dynamically
send out a signal when a new pad is created so that the application
programmer can link an arbitrary element to the newly created pad.
@ -94,9 +100,10 @@
<sect2 id="sec-elements-sink">
<title>Sink elements</title>
<para>
Sink elements are terminal points in a media pipeline. They accept
Sink elements are end points in a media pipeline. They accept
data but do not produce anything. Disk writing, soundcard playback,
and video output would all be implemented by sink elements.
<xref linkend="sec-element-sinkimg"/> shows a sink element.
</para>
<figure float="1" id="sec-element-sinkimg">
<title>Visualisation of a sink element</title>
@ -207,7 +214,9 @@
For more information about <classname>GObject</classname>
properties we recommend you read the <ulink
url="http://developer.gnome.org/doc/API/2.0/gobject/index.html"
type="http">GObject manual</ulink>.
type="http">GObject manual</ulink> and an introduction to <ulink
url="http://le-hacker.org/papers/gobject/index.html" type="http">
The GLib Object system</ulink>.
</para>
</sect1>
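As a quick illustration of element properties, a sketch using the standard GObject calls; the filesrc element and its "location" property serve as the example, and the path is a placeholder.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *src;
  gchar *location;

  gst_init (&argc, &argv);

  src = gst_element_factory_make ("filesrc", "my_source");

  /* set a property ... */
  g_object_set (G_OBJECT (src), "location", "/tmp/music.mp3", NULL);

  /* ... and read it back (string properties are returned as a copy) */
  g_object_get (G_OBJECT (src), "location", &location, NULL);
  g_print ("location: %s\n", location);
  g_free (location);

  return 0;
}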

View file

@ -178,7 +178,7 @@ main (int argc, char *argv[])
pipeline as follows:
</para>
<figure float="1" id="sec-hello-img">
<title>The Hello world pipeline</title>
<title>The "hello world" pipeline</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/hello-world.&image;" format="&IMAGE;" />
@ -188,7 +188,7 @@ main (int argc, char *argv[])
</figure>
<para>
Everything is now set up to start the streaming. We use the following
Everything is now set up to start streaming. We use the following
statements to change the state of the pipeline:
</para>
<programlisting>
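For reference, a minimal self-contained sketch of this state handling, assuming the 0.x API; filesrc and fakesink merely stand in for the actual "hello world" elements.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *src, *sink;

  gst_init (&argc, &argv);

  if (argc < 2) {
    g_print ("usage: %s <file>\n", argv[0]);
    return 1;
  }

  pipeline = gst_pipeline_new ("my_pipeline");
  src  = gst_element_factory_make ("filesrc", "my_source");
  sink = gst_element_factory_make ("fakesink", "my_sink");
  g_object_set (G_OBJECT (src), "location", argv[1], NULL);

  gst_bin_add_many (GST_BIN (pipeline), src, sink, NULL);
  gst_element_link (src, sink);

  /* switch the whole pipeline to PLAYING ... */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* ... iterate until the end of the stream is reached ... */
  while (gst_bin_iterate (GST_BIN (pipeline)));

  /* ... and back to NULL to release all resources */
  gst_element_set_state (pipeline, GST_STATE_NULL);

  return 0;
}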

View file

@ -1,12 +1,27 @@
<chapter id="cha-pads">
<title>GstPad</title>
<title>Pads</title>
<para>
As we have seen in the previous chapter (GstElement), the pads are the element's
links with the outside world.
As we have seen in <xref linkend="cha-elements"/>, the pads are the element's
interface to the outside world.
</para>
<para>
The specific type of media that the element can handle will be exposed by the pads.
The description of this media type is done with capabilities (<classname>GstCaps</classname>)
The description of this media type is done with capabilities (see
<xref linkend="sec-caps"/>).
</para>
<para>
Pads are either source or sink pads. The terminology is defined from the
view of the element itself: elements accept data on their sink pads, and
send data out on their source pads. Sink pads are drawn on the left,
while source pads are drawn on the right of an element. In general,
data flows from left to right in the graph.<footnote>
<para>
In reality, there is no objection to data flowing from a
source pad to the sink pad of an element upstream. Data will, however,
always flow from a source pad of one element to the sink pad of
another.
</para></footnote>
</para>
<sect1 id="sec-pads-get">
@ -58,19 +73,27 @@
GstElement.
</para>
</sect2>
</sect1>
<sect1 id="sec-pads-type">
<title>Types of pads</title>
<sect2 id="sec-pads-dynamic">
<title>Dynamic pads</title>
<para>
Some elements might not have their pads when they are created. This
can happen, for example, with an MPEG2 system demultiplexer. The
Some elements might not have all of their pads when the element is
created. This
can happen, for example, with an MPEG system demultiplexer. The
demultiplexer will create its pads at runtime when it detects the
different elementary streams in the MPEG2 system stream.
different elementary streams in the MPEG system stream.
</para>
<para>
Running <application>gst-inspect mpegdemux</application> will show that
the element has only one pad: a sink pad called 'sink'. The other pads are
"dormant" as you can see in the padtemplates from the 'Exists: Sometimes'
property. Depending on the type of MPEG2 file you play, the pads are created. We
"dormant". You can see this in the pad template because there is
an 'Exists: Sometimes'
property. Depending on the type of MPEG file you play, the pads will
be created. We
will see that this is very important when you are going to create dynamic
pipelines later on in this manual.
</para>
@ -116,7 +139,7 @@ main(int argc, char *argv[])
</programlisting>
<note>
<para>
You need to set the pipeline to READY or NULL if you want to change it.
A pipeline cannot be changed in the PLAYING state.
</para>
</note>
</sect2>
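A sketch of how an application typically reacts to such dynamically created pads, assuming the 0.x API in which elements emit a "new_pad" signal when a pad appears; mpegdemux and the callback name are only illustrative.

#include <gst/gst.h>

/* called each time the demuxer creates a new pad */
static void
cb_new_pad (GstElement *element, GstPad *pad, gpointer data)
{
  g_print ("a new pad %s was created\n", gst_pad_get_name (pad));
  /* this is where the new pad would be linked into the rest of the pipeline */
}

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *filesrc, *demux;

  gst_init (&argc, &argv);

  if (argc < 2) {
    g_print ("usage: %s <mpeg file>\n", argv[0]);
    return 1;
  }

  pipeline = gst_pipeline_new ("my_pipeline");
  filesrc  = gst_element_factory_make ("filesrc", "my_source");
  demux    = gst_element_factory_make ("mpegdemux", "my_demux");
  g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);

  gst_bin_add_many (GST_BIN (pipeline), filesrc, demux, NULL);
  gst_element_link (filesrc, demux);

  g_signal_connect (G_OBJECT (demux), "new_pad",
                    G_CALLBACK (cb_new_pad), NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  while (gst_bin_iterate (GST_BIN (pipeline)));
  gst_element_set_state (pipeline, GST_STATE_NULL);

  return 0;
}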
@ -172,12 +195,15 @@ main(int argc, char *argv[])
...
</programlisting>
</sect2>
</sect1>
<sect1 id="sec-pads-description">
<title>Capabilities of a GstPad</title>
<sect1 id="sec-caps">
<title>Capabilities of a pad</title>
<para>
Since the pads play a very important role in how the element is viewed by the
outside world, a mechanism is implemented to describe the pad by using capabilities.
outside world, a mechanism is implemented to describe the data that can
flow through the pad by using capabilities.
</para>
<para>
We will briefly describe what capabilities are, enough for you to get a basic understanding
@ -186,14 +212,29 @@ main(int argc, char *argv[])
</para>
<sect2 id="sec-pads-caps">
<title>What is a capability</title>
<title>What are capabilities?</title>
<para>
A capability is attached to a pad in order to describe what type of media the pad
can handle.
Capabilities are attached to a pad in order to describe
what type of media the pad can handle.
</para>
<para>
A capability is named and consists of a MIME type and a set of properties. Its data
structure is:
Capabilities is shorthand for "capability chain". A capability chain
is a chain of one or more capabilities.
</para>
<para>
The basic entity is a capability, and is defined by a name, a MIME
type and a set of properties. A capability can be chained to
another capability, which is why we commonly refer to a chain of
capability entities as "capabilities".<footnote>
<para>
It is important to understand that the term "capabilities" refers
to a chain of one or more capabilities. This will be clearer when
you see the structure definition of a <classname>GstCaps</classname>
element.
</para></footnote>
</para>
<para>
Its structure is:
</para>
<programlisting>
struct _GstCaps {
@ -245,55 +286,80 @@ Pads:
</programlisting>
</sect2>
<sect2 id="sec-pads-props">
<title>What are properties</title>
<title>What are properties?</title>
<para>
Properties are used to describe extra information for
capabilities. The properties basically consist of a key (a string) and
a value. There are different possibile value types that can be used:
capabilities. A property consists of a key (a string) and
a value. There are different possible value types that can be used:
</para>
<itemizedlist>
<listitem>
<para>
An integer value: the property has this exact value.
basic types:
</para>
<itemizedlist>
<listitem>
<para>
an integer value: the property has this exact value.
</para>
</listitem>
<listitem>
<para>
a boolean value: the property is either TRUE or FALSE.
</para>
</listitem>
<listitem>
<para>
a fourcc value: this is a value that is commonly used to
describe an encoding for video,
as used for example by the AVI specification.
<footnote><para>
fourcc values consist of four bytes.
<ulink url="http://www.fourcc.org" type="http">The FOURCC
Definition List</ulink> is the most complete resource
on the allowed fourcc values.
</para></footnote>
</para>
</listitem>
<listitem>
<para>
a float value: the property has this exact floating point value.
</para>
</listitem>
<listitem>
<para>
a string value.
</para>
</listitem>
</itemizedlist>
</listitem>
<listitem>
<para>
range types:
</para>
<itemizedlist>
<listitem>
<para>
an integer range value: the property denotes a range of
possible integers. For example, the wavparse element has
a source pad where the "rate" property can go from 8000 to
48000.
</para>
</listitem>
<listitem>
<para>
a float range value: the property denotes a range of possible
floating point values.
</para>
</listitem>
</itemizedlist>
</listitem>
<listitem>
<para>
An integer range value. The property denotes a range of possible
values. In the case of the mad element, the source pad has a
property rate that can go from 11025 to 48000.
</para>
</listitem>
<listitem>
<para>
A boolean value.
</para>
</listitem>
<listitem>
<para>
a fourcc value: this is a value that is commonly used to describe an encoding for video,
as used by the AVI specification.
</para>
</listitem>
<listitem>
<para>
A list value: the property can take any value from a list.
</para>
</listitem>
<listitem>
<para>
A float value: the property has this exact floating point value.
</para>
</listitem>
<listitem>
<para>
A float range value: denotes a range of possible floating point values.
</para>
</listitem>
<listitem>
<para>
A string value.
a list value: the property can take any value from a list of
basic value types or range types.
</para>
</listitem>
</itemizedlist>
@ -315,6 +381,8 @@ Pads:
<para>
Compatibility detection: when two pads are linked, <application>GStreamer</application>
can verify if the two pads are talking about the same media types.
The process of linking two pads and checking if they are compatible
is called "caps negotiation".
</para>
</listitem>
</itemizedlist>
@ -407,7 +475,7 @@ GstProps* gst_props_new (const gchar *firstname, ...);
<itemizedlist>
<listitem>
<para>
GST_PROPS_INT_RANGE(a,b): An integer ragne from a to b
GST_PROPS_INT_RANGE(a,b): An integer range from a to b
</para>
</listitem>
<listitem>
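Putting the pieces above together, a hedged sketch of how a capability with a property set is built with gst_props_new () and the GST_PROPS_* macros. The capability name, MIME type and values are made up, and gst_caps_new () is assumed to take a name, a MIME type and a GstProps, as in the 0.x API.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstCaps *caps;

  gst_init (&argc, &argv);

  /* one capability: a name, a MIME type and a set of properties */
  caps = gst_caps_new (
      "my_raw_audio",                                /* name      */
      "audio/raw",                                   /* MIME type */
      gst_props_new (
          "rate",     GST_PROPS_INT_RANGE (8000, 48000),
          "channels", GST_PROPS_INT (2),
          NULL));

  /* caps can now be used for filtered links or pad templates */
  return 0;
}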

View file

@ -1,11 +1,11 @@
<chapter id="cha-bins">
<title>Bins</title>
<para>
A Bin is a container element. You can add elements to a bin. Since a bin is
an <classname>GstElement</classname> itself, it can also be added to another bin.
A bin is a container element. You can add elements to a bin. Since a bin is
an element itself, it can also be added to another bin.
</para>
<para>
Bins allow you to combine linked elements into one logical element. You do
Bins allow you to combine a group of linked elements into one logical element. You do
not deal with the individual elements anymore but with just one element, the bin.
We will see that this is extremely powerful when you are going to construct
complex pipelines, since it allows you to break up the pipeline into smaller chunks.
@ -17,7 +17,7 @@
</para>
<figure float="1" id="sec-bin-img">
<title>Visualisation of a <classname>GstBin</classname> element with some elements in it</title>
<title>Visualisation of a bin with some elements in it</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/bin-element.&image;" format="&IMAGE;" />
@ -26,20 +26,21 @@
</figure>
<para>
There are two standard bins available to the GStreamer programmer:
There are two specialized bins available to the GStreamer programmer:
<itemizedlist>
<listitem>
<para>
A pipeline (<classname>GstPipeline</classname>). Which is a generic container you will
use most of the time. The toplevel bin has to be a pipeline.
a pipeline: a generic container that allows scheduling of the
contained elements. The toplevel bin has to be a pipeline.
Every application thus needs at least one of these.
</para>
</listitem>
<listitem>
<para>
A thread (<classname>GstThread</classname>). The plan for the
<classname>GstThread</classname> will be run in a separate thread. You will have to use
this bin if you have to carefully synchronize audio and video, for example. You will learn
a thread: a bin that will be run in a separate execution thread.
You will have to use this bin if you have to carefully
synchronize audio and video, or for buffering. You will learn
more about threads in <xref linkend="cha-threads"/>.
</para>
</listitem>
@ -84,9 +85,8 @@
...
</programlisting>
<para>
Bins and threads can be added to other bins too. This allows you to create nested bins. Note
that it doesn't make very much sense to add a <classname>GstPipeline</classname> to anything,
as it's a toplevel bin that needs to be explicitly iterated.
Bins and threads can be added to other bins too. This allows you to create nested bins. Pipelines shouldn't be added to any other element, though.
They are toplevel bins and they are directly linked to the scheduler.
</para>
<para>
To get an element from the bin you can use:
@ -180,7 +180,7 @@
<sect1 id="sec-bin-ghostpads">
<title>Ghost pads</title>
<para>
You can see from figure <xref linkend="sec-bin-noghost-img"/> how a bin has no pads of its own.
You can see from <xref linkend="sec-bin-noghost-img"/> how a bin has no pads of its own.
This is where "ghost pads" come into play.
</para>
<figure float="1" id="sec-bin-noghost-img">
@ -207,7 +207,8 @@
</mediaobject>
</figure>
<para>
Above is a representation of a ghost pad. The sink pad of element one is now also a pad
<xref linkend="sec-bin-ghost-img"/>
is a representation of a ghost pad. The sink pad of element one is now also a pad
of the bin.
</para>
<para>

View file

@ -8,24 +8,29 @@
to deal with buffers yourself; the elements will do that for you.
</para>
<para>
The most important information in the buffer is:
A buffer consists of:
<itemizedlist>
<listitem>
<para>
A pointer to a piece of memory.
a pointer to a piece of memory.
</para>
</listitem>
<listitem>
<para>
The size of the memory.
the size of the memory.
</para>
</listitem>
<listitem>
<para>
a timestamp for the buffer.
</para>
</listitem>
<listitem>
<para>
A refcount that indicates how many elements are using this
buffer. This refcount will be used to destroy the buffer when no
element is having a reference to it.
element has a reference to it.
</para>
</listitem>
</itemizedlist>

View file

@ -97,7 +97,7 @@ chain_function (GstPad *pad, GstBuffer *buffer)
</para>
<para>
When the request for a buffer cannot immediatly satisfied, the control
When the request for a buffer cannot be immediately satisfied, the control
will be given to the source element of the loop-based element until it
performs a push on its source pad. At that time the control is handed
back to the loop-based element, etc... The execution trace can get

View file

@ -188,8 +188,4 @@ main (int argc, char *argv[])
There are other possibilities to check the type of the pad, for
example by using the MIME type and the properties of the pad.
</para>
<para>
Note that the pipeline has to be in the PAUSED state before changes
can be made to its structure.
</para>
</chapter>

View file

@ -1,24 +1,28 @@
<chapter id="cha-elements">
<title>GstElement</title>
<title>Elements</title>
<para>
The most important object in <application>GStreamer</application> for the
application programmer is the <classname>GstElement</classname> object.
</para>
<sect1 id="sec-elements-design">
<title>What is a GstElement</title>
<title>What is an element?</title>
<para>
<classname>GstElement</classname> is the basic building block for the
media pipeline. All the different components you are going to use are
An element is the basic building block for the media pipeline.
All the different high-level components you are going to use are
derived from <classname>GstElement</classname>. This means that a
lot of functions you are going to use operate on objects of this class.
</para>
<para>
Elements, from the perspective of GStreamer, are viewed as "black boxes"
with a number of different aspects. One of these aspects is the presence
of "pads", or link points. This terminology arises from soldering;
of "pads" (see <xref linkend="cha-pads"/>), or link points. This terminology arises from soldering;
pads are where wires can be attached.
</para>
</sect1>
<sect1 id="sec-elements-types">
<title>Types of elements</title>
<sect2 id="sec-elements-src">
<title>Source elements</title>
@ -27,7 +31,8 @@
reading from disk or from a sound card.
</para>
<para>
Below you see how we will visualize the element.
<xref linkend="sec-element-srcimg"/> shows how we will visualise
a source element.
We always draw a source pad to the right of the element.
</para>
<figure float="1" id="sec-element-srcimg">
@ -48,7 +53,7 @@
<sect2 id="sec-elements-filter">
<title>Filters and codecs</title>
<para>
Filter elements both have input and output pads. They operate on
Filter elements have both input and output pads. They operate on
data they receive in their sink pads and produce data on their source
pads. For example, MPEG decoders and volume filters would fall into
this category.
@ -67,7 +72,8 @@
</mediaobject>
</figure>
<para>
The above figure shows the visualisation of a filter element.
<xref linkend="sec-element-filterimg"/> shows how we will visualise
a filter element.
This element has one sink (input) pad and one source (output) pad.
Sink pads are drawn on the left of the element.
</para>
@ -82,9 +88,9 @@
</mediaobject>
</figure>
<para>
The above figure shows the visualisation of a filter element with
<xref linkend="sec-element-filterimg"/> shows the visualisation of a filter element with
more than one output pad. An example of such a filter is the AVI
splitter (demultiplexer). This element will parse the input data and
demultiplexer. This element will parse the input data and
extract the audio and video data. Most of these filters dynamically
send out a signal when a new pad is created so that the application
programmer can link an arbitrary element to the newly created pad.
@ -94,9 +100,10 @@
<sect2 id="sec-elements-sink">
<title>Sink elements</title>
<para>
Sink elements are terminal points in a media pipeline. They accept
Sink elements are end points in a media pipeline. They accept
data but do not produce anything. Disk writing, soundcard playback,
and video output would all be implemented by sink elements.
<xref linkend="sec-element-sinkimg"/> shows a sink element.
</para>
<figure float="1" id="sec-element-sinkimg">
<title>Visualisation of a sink element</title>
@ -207,7 +214,9 @@
For more information about <classname>GObject</classname>
properties we recommend you read the <ulink
url="http://developer.gnome.org/doc/API/2.0/gobject/index.html"
type="http">GObject manual</ulink>.
type="http">GObject manual</ulink> and an introduction to <ulink
url="http://le-hacker.org/papers/gobject/index.html" type="http">
The GLib Object system</ulink>.
</para>
</sect1>

View file

@ -82,8 +82,8 @@
</para>
<para>
In our helloworld example the elements we constructed would have the
following MIME types associated with their source and sink pads:
<xref linkend="sec-mime-img"/> shows the MIME types associated with
each pad from the "hello world" example.
</para>
<figure float="1" id="sec-mime-img">
<title>The Hello world pipeline with MIME types</title>
@ -103,7 +103,7 @@
<para>
The typing of the source and sink pads also makes it possible to
'autoplug' a pipeline. We will have the ability to say: "construct
me a pipeline that does an audio/mpeg to audio/raw conversion".
a pipeline that does an audio/mpeg to audio/raw conversion".
</para>
<note>
<para>

View file

@ -15,7 +15,7 @@ Single
6825 3225 7575 3225 7575 3750 6825 3750 6825 3225
2 2 0 1 0 6 50 0 20 0.000 0 0 -1 0 0 5
6825 3825 7575 3825 7575 4350 6825 4350 6825 3825
4 0 0 50 0 16 12 0.0000 4 165 1200 5775 3150 element_name\001
4 0 0 50 0 16 12 0.0000 4 165 1200 5775 3150 demuxer\001
4 0 0 50 0 16 12 0.0000 4 135 330 5850 3975 sink\001
4 0 0 50 0 16 12 0.0000 4 135 465 6975 3600 video\001
4 0 0 50 0 16 12 0.0000 4 135 465 6975 4200 audio\001

View file

@ -14,5 +14,5 @@ Single
2 2 0 1 0 6 50 0 20 0.000 0 0 -1 0 0 5
5625 3600 6375 3600 6375 4125 5625 4125 5625 3600
4 0 0 50 0 16 12 0.0000 4 105 255 7050 3975 src\001
4 0 0 50 0 16 12 0.0000 4 165 1200 5775 3150 element_name\001
4 0 0 50 0 16 12 0.0000 4 165 1200 5775 3150 filter\001
4 0 0 50 0 16 12 0.0000 4 135 330 5850 3975 sink\001

View file

@ -1,24 +1,24 @@
<chapter id="cha-gnome">
<title>Gnome integration</title>
<title>GNOME integration</title>
<para>
GStreamer is fairly easy to integrate with Gnome applications.
GStreamer is fairly easy to integrate with GNOME applications.
GStreamer uses libxml 2.0, GLib 2.0 and popt, as do all other
Gnome applications.
There are however some basic issues you need to address in your Gnome
GNOME applications.
There are however some basic issues you need to address in your GNOME
applications.
</para>
<sect1>
<title>Command line options</title>
<para>
Gnome applications call gnome_program_init () to parse command-line
GNOME applications call gnome_program_init () to parse command-line
options and initialize the necessary gnome modules.
GStreamer applications normally call gst_init (&amp;argc, &amp;argv) to
do the same for GStreamer.
</para>
<para>
Each of these two swallows the program options passed to the program,
so we need a different way to allow both Gnome and GStreamer to parse
so we need a different way to allow both GNOME and GStreamer to parse
the command-line options. This is shown in the following example.
</para>
@ -34,7 +34,7 @@ main (int argc, char **argv)
{ NULL, '\0', POPT_ARG_INCLUDE_TABLE, NULL, 0, "GStreamer", NULL },
POPT_TABLEEND
};
GnomeProgram *program;
GNOMEProgram *program;
poptContext context;
const gchar **argvn;
@ -63,7 +63,7 @@ main (int argc, char **argv)
</programlisting>
<para>
If you try out this program, you will see that when called with
--help, it will print out both GStreamer and Gnome help arguments.
--help, it will print out both GStreamer and GNOME help arguments.
All of the arguments that didn't belong to either end up in the
argvn pointer array.
</para>

View file

@ -38,8 +38,8 @@
<sect2 id="sec-goals-object">
<title>Object oriented</title>
<para>
Adhere to the GLib 2.0 object model. A programmer familiar with GLib 2.0 or older versions
of Gtk+ will be comfortable with GStreamer.
GStreamer adheres to the GLib 2.0 object model. A programmer familiar with GLib 2.0 or older versions
of GTK+ will be comfortable with GStreamer.
</para>
<para>
GStreamer uses the mechanism of signals and object properties.
@ -48,6 +48,11 @@
All objects can be queried at runtime for their various properties and
capabilities.
</para>
<para>
GStreamer intends to be similar in programming methodology to GTK+.
This applies to the object model, ownership of objects, reference
counting, and so on.
</para>
</sect2>
<sect2 id="sec-goals-extensible">
@ -69,7 +74,7 @@
have any header files installed for the plugins.
</para>
<para>
Special care has been taken to make plugins completely self contained.
Special care has been taken to make plugins completely self-contained.
All relevant aspects of plugins can be queried at run-time.
</para>
</sect2>
@ -82,57 +87,71 @@
<itemizedlist>
<listitem>
<para>
Using GLib g_mem_chunk and fast non-blocking allocation algorithms
using GLib's <function>g_mem_chunk</function> and fast non-blocking allocation algorithms
where possible to minimize dynamic memory allocation.
</para>
</listitem>
<listitem>
<para>
Extremely light-weight links between plugins. Data can travel
extremely light-weight links between plugins. Data can travel
the pipeline with minimal overhead. Data passing between plugins only involves
a pointer dereference in a typical pipeline.
</para>
</listitem>
<listitem>
<para>
Providing a mechanism to directly work on the target memory. A plugin can for example
providing a mechanism to directly work on the target memory. A plugin can for example
directly write to the X server's shared memory space. Buffers can also point to
arbitrary memory, such as a sound card's internal hardware buffer.
</para>
</listitem>
<listitem>
<para>
Refcounting and copy on write minimize usage of memcpy(3).
refcounting and copy on write minimize usage of memcpy.
Sub-buffers efficiently split buffers into manageable pieces.
</para>
</listitem>
<listitem>
<para>
The use of cothreads to minimize the threading overhead. Cothreads are a simple and fast
the use of cothreads to minimize the threading overhead. Cothreads are a simple and fast
user-space method for switching between subtasks. Cothreads were measured to
consume as little as 600 cpu cycles.
</para>
</listitem>
<listitem>
<para>
Allowing hardware acceleration by the use of specialized plugins.
allowing hardware acceleration by using specialized plugins.
</para>
</listitem>
<listitem>
<para>
Using a plugin registry with the specifications of the plugins so
using a plugin registry with the specifications of the plugins so
that the plugin loading can be delayed until the plugin is actually
used.
</para>
</listitem>
<listitem>
<para>
All critical data passing is free of locks and mutexes.
all critical data passing is free of locks and mutexes.
</para>
</listitem>
</itemizedlist>
</sect2>
<sect2 id="sec-goals-separation">
<title>Clean core/plugins separation</title>
<para>
The core of GStreamer is essentially media-agnostic. It only knows
about bytes and blocks, and only contains basic elements.
The core of GStreamer is functional enough to even implement low-level
system tools, like cp.
</para>
<para>
All of the media handling functionality is provided by plugins external
to the core. These tell the core how to handle specific types of media.
</para>
</sect2>
<sect2 id="sec-goals-testbed">
<title>Provide a framework for codec experimentation</title>
<para>
@ -142,6 +161,13 @@
url="http://www.xiph.org/ogg/index.html" type="http">tarkin and
vorbis</ulink>.
</para>
<para>
GStreamer also wants to be an easy framework where codec
developers can experiment with different algorithms, speeding up
the development of open and free multimedia codecs like <ulink
url="http://www.xiph.org/ogg/index.html" type="http">tarkin and
vorbis</ulink>.
</para>
</sect2>
</sect1>

View file

@ -178,7 +178,7 @@ main (int argc, char *argv[])
pipeline as follows:
</para>
<figure float="1" id="sec-hello-img">
<title>The Hello world pipeline</title>
<title>The "hello world" pipeline</title>
<mediaobject>
<imageobject>
<imagedata fileref="images/hello-world.&image;" format="&IMAGE;" />
@ -188,7 +188,7 @@ main (int argc, char *argv[])
</figure>
<para>
Everything is now set up to start the streaming. We use the following
Everything is now set up to start streaming. We use the following
statements to change the state of the pipeline:
</para>
<programlisting>

View file

@ -70,27 +70,15 @@ main (int argc, char *argv[])
g_assert (decode != NULL);
/* add objects to the main bin */
gst_bin_add (GST_BIN (bin), filesrc);
gst_bin_add (GST_BIN (bin), queue);
gst_bin_add_many (GST_BIN (bin), filesrc, queue, NULL);
gst_bin_add (GST_BIN (thread), decode);
gst_bin_add (GST_BIN (thread), queue2);
gst_bin_add_many (GST_BIN (thread), decode, queue2, NULL);
gst_bin_add (GST_BIN (thread2), osssink);
gst_pad_link (gst_element_get_pad (filesrc,"src"),
gst_element_get_pad (queue,"sink"));
gst_element_link_many (filesrc, queue, decode, queue2, osssink, NULL);
gst_pad_link (gst_element_get_pad (queue,"src"),
gst_element_get_pad (decode,"sink"));
gst_pad_link (gst_element_get_pad (decode,"src"),
gst_element_get_pad (queue2,"sink"));
gst_pad_link (gst_element_get_pad (queue2,"src"),
gst_element_get_pad (osssink,"sink"));
gst_bin_add (GST_BIN (bin), thread);
gst_bin_add (GST_BIN (bin), thread2);
gst_bin_add_many (GST_BIN (bin), thread, thread2, NULL);
/* write the bin to stdout */
gst_xml_write_file (GST_ELEMENT (bin), stdout);
@ -204,7 +192,7 @@ xmlNsPtr ns;
...
</programlisting>
<para>
When the thread is saved, the object_save method will be caled. Our example
When the thread is saved, the object_save method will be called. Our example
will insert a comment tag:
</para>
<programlisting>

View file

@ -6,7 +6,7 @@
access to the library functions.
</para>
<para>
Before the <application>GStreamer</application> libraries can be used
Before the <application>GStreamer</application> libraries can be used,
<function>gst_init</function> has to be called from the main application.
This call will perform the necessary initialization of the library as
well as parse the GStreamer-specific command line options.
@ -41,11 +41,6 @@ main (int argc, char *argv[])
with two <symbol>NULL</symbol> arguments, in which case no command line
options will be parsed by <application>GStreamer</application>.
</para>
<para>
Use the GST_VERSION_MAJOR, GST_VERSION_MINOR and GST_VERSION_MICRO macros to
get the <application>GStreamer</application> version you are building against or
use gst_version() to get the version you are linked against.
</para>
<sect1>
<title>The popt interface</title>
<para>

View file

@ -2,7 +2,7 @@
<title>Motivation</title>
<para>
Linux has historically lagged behind other operating systems in the multimedia
arena. Microsoft's Windows[tm] and Apple's MacOS[tm] both have strong support
arena. Microsoft's <trademark>Windows</trademark> and Apple's <trademark>MacOS</trademark> both have strong support
for multimedia devices, multimedia content creation,
playback, and realtime processing. Linux, on the other hand, has a poorly integrated
collection of multimedia utilities and applications available, which can hardly compete
@ -12,7 +12,7 @@
<sect1 id="sec-motivation-problems">
<title>Current problems</title>
<para>
We descibe the typical problems in todays media handling on Linux.
We describe the typical problems in today's media handling on Linux.
</para>
<sect2 id="sec-motivation-duplicate">
<title>Multitude of duplicate code</title>
@ -37,9 +37,9 @@
filters or special effects to the video or audio data.
</para>
<para>
If I wanted to convert an MPEG2 video stream into an AVI file, my best
If you want to convert an MPEG2 video stream into an AVI file, your best
option would be to take all of the MPEG2 decoding algorithms out
of the player and duplicate them into my own AVI encoder. These
of the player and duplicate them into your own AVI encoder. These
algorithms cannot easily be shared across applications.
</para>
<para>
@ -69,7 +69,7 @@
While GStreamer also uses its own plugin system, it offers a very rich
framework for the plugin developer and ensures the plugin can be used
in a wide range of applications, transparently interacting with other
plugins. The Framework that GStreamer provides for the plugins is
plugins. The framework that GStreamer provides for the plugins is
flexible enough to host even the most demanding plugins.
</para>
</sect2>
@ -87,7 +87,7 @@
type="http">GNOME object embedding using Bonobo</ulink>.
</para>
<para>
The GStreamer cores does not use network transparent technologies at the
The GStreamer core does not use network transparent technologies at the
lowest level as it only adds overhead for the local case.
That said, it shouldn't be hard to create a wrapper around the
core components.
@ -95,7 +95,7 @@
</sect2>
<sect2 id="sec-motivation-catchup">
<title>Catch up with the Windows(tm) world</title>
<title>Catch up with the <trademark>Windows</trademark> world</title>
<para>
We need solid media handling if we want to see Linux succeed on
the desktop.

View file

@ -15,17 +15,20 @@
<para>
GStreamer's development framework makes it possible to write any type of
streaming multimedia application. The GStreamer framework is designed to make it easy to
write applications that handle either audio or video or both. The pipeline design is made to have
no extra overhead above what the applied filters induce. This makes GStreamer a good framework for designing
even high-end audio applications which puts high demands on latency.
streaming multimedia application. The GStreamer framework is designed
to make it easy to write applications that handle audio or video or both.
It isn't restricted to audio and video, and can process any kind of
data flow.
The pipeline design is made to have little overhead above what the
applied filters induce. This makes GStreamer a good framework for designing
even high-end audio applications which put high demands on latency.
</para>
<para>
One of the most obvious uses of GStreamer is to build
a media player. GStreamer already includes components for building a
media player that can support a very wide variety of formats, including
MP3, Ogg Vorbis, MPEG1, MPEG2, AVI, Quicktime, mod and so on. GStreamer,
MP3, Ogg Vorbis, MPEG1, MPEG2, AVI, Quicktime, mod, and more. GStreamer,
however, is much more than just another media player. Its main advantages
are that the pluggable components can be mixed and matched into arbitrary
pipelines so that it's possible to write a full-fledged video or audio

View file

@ -15,17 +15,20 @@
<para>
GStreamer's development framework makes it possible to write any type of
streaming multimedia application. The GStreamer framework is designed to make it easy to
write applications that handle either audio or video or both. The pipeline design is made to have
no extra overhead above what the applied filters induce. This makes GStreamer a good framework for designing
even high-end audio applications which puts high demands on latency.
streaming multimedia application. The GStreamer framework is designed
to make it easy to write applications that handle audio or video or both.
It isn't restricted to audio and video, and can process any kind of
data flow.
The pipeline design is made to have little overhead above what the
applied filters induce. This makes GStreamer a good framework for designing
even high-end audio applications which put high demands on latency.
</para>
<para>
One of the most obvious uses of GStreamer is to build
a media player. GStreamer already includes components for building a
media player that can support a very wide variety of formats, including
MP3, Ogg Vorbis, MPEG1, MPEG2, AVI, Quicktime, mod and so on. GStreamer,
MP3, Ogg Vorbis, MPEG1, MPEG2, AVI, Quicktime, mod, and more. GStreamer,
however, is much more than just another media player. Its main advantages
are that the pluggable components can be mixed and matched into arbitrary
pipelines so that it's possible to write a full-fledged video or audio

View file

@ -31,6 +31,6 @@ Single
4 0 0 50 0 16 12 0.0000 4 135 330 5550 3975 sink\001
4 0 0 50 0 16 12 0.0000 4 135 330 8175 3975 sink\001
4 0 0 50 0 16 12 0.0000 4 105 255 6825 3975 src\001
4 0 0 50 0 16 12 0.0000 4 135 750 5625 3075 element2\001
4 0 0 50 0 16 12 0.0000 4 135 750 8250 3075 element3\001
4 0 0 50 0 16 12 0.0000 4 135 750 3000 3075 element1\001
4 0 0 50 0 16 12 0.0000 4 135 750 5625 3075 filter\001
4 0 0 50 0 16 12 0.0000 4 135 750 8250 3075 sink_element\001
4 0 0 50 0 16 12 0.0000 4 135 750 3000 3075 source_element\001

View file

@ -97,7 +97,9 @@
<title>Making filtered links</title>
<para>
You can also force a specific media type on the link by using gst_pad_link_filtered ()
and gst_element_link_filtered (). FIXME link to caps documentation.
and gst_element_link_filtered () with capabilities.
See <xref linkend="sec-caps"/> for
an explanation of capabilities.
</para>
</sect1>
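A sketch of what such a filtered link can look like, assuming the 0.x API; sinesrc and osssink are just example elements, and the caps force a 44100 Hz sample rate on the link between them.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *src, *sink;
  GstCaps *caps;

  gst_init (&argc, &argv);

  pipeline = gst_pipeline_new ("my_pipeline");
  src  = gst_element_factory_make ("sinesrc", "my_source");
  sink = gst_element_factory_make ("osssink", "my_sink");
  gst_bin_add_many (GST_BIN (pipeline), src, sink, NULL);

  /* force a specific sample rate on the link between src and sink */
  caps = gst_caps_new ("my_filter", "audio/raw",
                       gst_props_new ("rate", GST_PROPS_INT (44100), NULL));
  gst_element_link_filtered (src, sink, caps);

  /* the pipeline can now be set to PLAYING as usual */
  return 0;
}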

View file

@ -143,10 +143,10 @@
&ELEMENTS;
&PLUGINS;
&PADS;
&PLUGINS;
&LINKS;
&BINS;

View file

@ -2,7 +2,7 @@
<title>Motivation</title>
<para>
Linux has historically lagged behind other operating systems in the multimedia
arena. Microsoft's Windows[tm] and Apple's MacOS[tm] both have strong support
arena. Microsoft's <trademark>Windows</trademark> and Apple's <trademark>MacOS</trademark> both have strong support
for multimedia devices, multimedia content creation,
playback, and realtime processing. Linux, on the other hand, has a poorly integrated
collection of multimedia utilities and applications available, which can hardly compete
@ -12,7 +12,7 @@
<sect1 id="sec-motivation-problems">
<title>Current problems</title>
<para>
We descibe the typical problems in todays media handling on Linux.
We describe the typical problems in today's media handling on Linux.
</para>
<sect2 id="sec-motivation-duplicate">
<title>Multitude of duplicate code</title>
@ -37,9 +37,9 @@
filters or special effects to the video or audio data.
</para>
<para>
If I wanted to convert an MPEG2 video stream into an AVI file, my best
If you want to convert an MPEG2 video stream into an AVI file, your best
option would be to take all of the MPEG2 decoding algorithms out
of the player and duplicate them into my own AVI encoder. These
of the player and duplicate them into your own AVI encoder. These
algorithms cannot easily be shared across applications.
</para>
<para>
@ -69,7 +69,7 @@
While GStreamer also uses its own plugin system, it offers a very rich
framework for the plugin developer and ensures the plugin can be used
in a wide range of applications, transparently interacting with other
plugins. The Framework that GStreamer provides for the plugins is
plugins. The framework that GStreamer provides for the plugins is
flexible enough to host even the most demanding plugins.
</para>
</sect2>
@ -87,7 +87,7 @@
type="http">GNOME object embedding using Bonobo</ulink>.
</para>
<para>
The GStreamer cores does not use network transparent technologies at the
The GStreamer core does not use network transparent technologies at the
lowest level as it only adds overhead for the local case.
That said, it shouldn't be hard to create a wrapper around the
core components.
@ -95,7 +95,7 @@
</sect2>
<sect2 id="sec-motivation-catchup">
<title>Catch up with the Windows(tm) world</title>
<title>Catch up with the <trademark>Windows</trademark> world</title>
<para>
We need solid media handling if we want to see Linux succeed on
the desktop.

View file

@ -1,12 +1,27 @@
<chapter id="cha-pads">
<title>GstPad</title>
<title>Pads</title>
<para>
As we have seen in the previous chapter (GstElement), the pads are the element's
links with the outside world.
As we have seen in <xref linkend="cha-elements"/>, the pads are the element's
interface to the outside world.
</para>
<para>
The specific type of media that the element can handle will be exposed by the pads.
The description of this media type is done with capabilities (<classname>GstCaps</classname>)
The description of this media type is done with capabilities (see
<xref linkend="sec-caps"/>).
</para>
<para>
Pads are either source or sink pads. The terminology is defined from the
view of the element itself: elements accept data on their sink pads, and
send data out on their source pads. Sink pads are drawn on the left,
while source pads are drawn on the right of an element. In general,
data flows from left to right in the graph.<footnote>
<para>
In reality, there is no objection to data flowing from a
source pad to the sink pad of an element upstream. Data will, however,
always flow from a source pad of one element to the sink pad of
another.
</para></footnote>
</para>
<sect1 id="sec-pads-get">
@ -58,19 +73,27 @@
GstElement.
</para>
</sect2>
</sect1>
<sect1 id="sec-pads-type">
<title>Types of pads</title>
<sect2 id="sec-pads-dynamic">
<title>Dynamic pads</title>
<para>
Some elements might not have their pads when they are created. This
can happen, for example, with an MPEG2 system demultiplexer. The
Some elements might not have all of their pads when the element is
created. This
can happen, for example, with an MPEG system demultiplexer. The
demultiplexer will create its pads at runtime when it detects the
different elementary streams in the MPEG2 system stream.
different elementary streams in the MPEG system stream.
</para>
<para>
Running <application>gst-inspect mpegdemux</application> will show that
the element has only one pad: a sink pad called 'sink'. The other pads are
"dormant" as you can see in the padtemplates from the 'Exists: Sometimes'
property. Depending on the type of MPEG2 file you play, the pads are created. We
"dormant". You can see this in the pad template because there is
an 'Exists: Sometimes'
property. Depending on the type of MPEG file you play, the pads will
be created. We
will see that this is very important when you are going to create dynamic
pipelines later on in this manual.
</para>
@ -116,7 +139,7 @@ main(int argc, char *argv[])
</programlisting>
<note>
<para>
You need to set the pipeline to READY or NULL if you want to change it.
A pipeline cannot be changed in the PLAYING state.
</para>
</note>
</sect2>
@ -172,12 +195,15 @@ main(int argc, char *argv[])
...
</programlisting>
</sect2>
</sect1>
<sect1 id="sec-pads-description">
<title>Capabilities of a GstPad</title>
<sect1 id="sec-caps">
<title>Capabilities of a pad</title>
<para>
Since the pads play a very important role in how the element is viewed by the
outside world, a mechanism is implemented to describe the pad by using capabilities.
outside world, a mechanism is implemented to describe the data that can
flow through the pad by using capabilities.
</para>
<para>
We will briefly describe what capabilities are, enough for you to get a basic understanding
@ -186,14 +212,29 @@ main(int argc, char *argv[])
</para>
<sect2 id="sec-pads-caps">
<title>What is a capability</title>
<title>What are capabilities?</title>
<para>
A capability is attached to a pad in order to describe what type of media the pad
can handle.
Capabilities are attached to a pad in order to describe
what type of media the pad can handle.
</para>
<para>
A capability is named and consists of a MIME type and a set of properties. Its data
structure is:
Capabilities is shorthand for "capability chain". A capability chain
is a chain of one or more capabilities.
</para>
<para>
The basic entity is a capability, and is defined by a name, a MIME
type and a set of properties. A capability can be chained to
another capability, which is why we commonly refer to a chain of
capability entities as "capabilities".<footnote>
<para>
It is important to understand that the term "capabilities" refers
to a chain of one or more capabilities. This will be clearer when
you see the structure definition of a <classname>GstCaps</classname>
element.
</para></footnote>
</para>
<para>
Its structure is:
</para>
<programlisting>
struct _GstCaps {
@ -245,55 +286,80 @@ Pads:
</programlisting>
</sect2>
<sect2 id="sec-pads-props">
<title>What are properties</title>
<title>What are properties?</title>
<para>
Properties are used to describe extra information for
capabilities. The properties basically consist of a key (a string) and
a value. There are different possibile value types that can be used:
capabilities. A property consists of a key (a string) and
a value. There are different possible value types that can be used:
</para>
<itemizedlist>
<listitem>
<para>
An integer value: the property has this exact value.
basic types:
</para>
<itemizedlist>
<listitem>
<para>
an integer value: the property has this exact value.
</para>
</listitem>
<listitem>
<para>
a boolean value: the property is either TRUE or FALSE.
</para>
</listitem>
<listitem>
<para>
a fourcc value: this is a value that is commonly used to
describe an encoding for video,
as used for example by the AVI specification.
<footnote><para>
fourcc values consist of four bytes.
<ulink url="http://www.fourcc.org" type="http">The FOURCC
Definition List</ulink> is the most complete resource
on the allowed fourcc values.
</para></footnote>
</para>
</listitem>
<listitem>
<para>
a float value: the property has this exact floating point value.
</para>
</listitem>
<listitem>
<para>
a string value.
</para>
</listitem>
</itemizedlist>
</listitem>
<listitem>
<para>
range types:
</para>
<itemizedlist>
<listitem>
<para>
an integer range value: the property denotes a range of
possible integers. For example, the wavparse element has
a source pad where the "rate" property can go from 8000 to
48000.
</para>
</listitem>
<listitem>
<para>
a float range value: the property denotes a range of possible
floating point values.
</para>
</listitem>
</itemizedlist>
</listitem>
<listitem>
<para>
An integer range value. The property denotes a range of possible
values. In the case of the mad element, the source pad has a
property rate that can go from 11025 to 48000.
</para>
</listitem>
<listitem>
<para>
A boolean value.
</para>
</listitem>
<listitem>
<para>
a fourcc value: this is a value that is commonly used to describe an encoding for video,
as used by the AVI specification.
</para>
</listitem>
<listitem>
<para>
A list value: the property can take any value from a list.
</para>
</listitem>
<listitem>
<para>
A float value: the property has this exact floating point value.
</para>
</listitem>
<listitem>
<para>
A float range value: denotes a range of possible floating point values.
</para>
</listitem>
<listitem>
<para>
A string value.
a list value: the property can take any value from a list of
basic value types or range types.
</para>
</listitem>
</itemizedlist>
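        <para>
          As a sketch only, properties using several of these value types
          could be constructed with gst_props_new () roughly as shown below;
          the property names and the GST_PROPS_INT, GST_PROPS_BOOLEAN and
          GST_PROPS_STRING macros are assumptions made for illustration,
          while gst_props_new () and GST_PROPS_INT_RANGE are shown elsewhere
          in this chapter.
        </para>
        <programlisting>
  GstProps *props;

  /* sketch: a NULL-terminated key/value list mixing basic and range
   * value types */
  props = gst_props_new ("format",   GST_PROPS_STRING ("int"),
                         "rate",     GST_PROPS_INT_RANGE (8000, 48000),
                         "channels", GST_PROPS_INT (2),
                         "signed",   GST_PROPS_BOOLEAN (TRUE),
                         NULL);
        </programlisting>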
@ -315,6 +381,8 @@ Pads:
<para>
Compatibility detection: when two pads are linked, <application>GStreamer</application>
can verify if the two pads are talking about the same media types.
The process of linking two pads and checking if they are compatible
is called "caps negotiation".
</para>
</listitem>
</itemizedlist>
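      <para>
        As a rough sketch (the decoder and audiosink pointers are
        hypothetical, and it is assumed that gst_element_link () reports a
        failed negotiation by returning FALSE), an application could detect
        incompatible pads like this:
      </para>
      <programlisting>
  /* linking triggers caps negotiation; if the pads cannot agree on a
   * media type the link is refused */
  if (!gst_element_link (decoder, audiosink)) {
    g_print ("could not link decoder to audiosink: incompatible caps?\n");
    return -1;
  }
      </programlisting>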
@ -407,7 +475,7 @@ GstProps* gst_props_new (const gchar *firstname, ...);
<itemizedlist>
<listitem>
<para>
GST_PROPS_INT_RANGE(a,b): An integer ragne from a to b
GST_PROPS_INT_RANGE(a,b): An integer range from a to b
</para>
</listitem>
<listitem>

View file

@ -247,11 +247,5 @@ Element Signals:
gst-inspect gstelements
</screen>
</sect1>
<sect1>
<title><command>gst-play</command></title>
<para>
A sample media player.
</para>
</sect1>
</chapter>

View file

@ -1,7 +1,7 @@
<chapter id="cha-queues">
<title>Queues</title>
<para>
A <classname>GstQueue</classname> is a filter element.
A queue is a filter element.
    Queues can be used to link two elements in such a way that the data can
be buffered.
</para>
@ -11,10 +11,10 @@
    element as soon as gst_pad_pull () is called on the queue's source pad.
</para>
<para>
Queues are mostly used in conjunction with a <classname>GstThread</classname> to
provide an external link for the thread elements. You could have one
thread feeding buffers into a <classname>GstQueue</classname> and another
thread repeadedly calling gst_pad_pull () on the queue to feed its
Queues are mostly used in conjunction with a thread bin to
provide an external link for the thread's elements. You could have one
thread feeding buffers into a queue and another
thread repeatedly pulling on the queue to feed its
internal elements.
</para>
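  <para>
    The following is only a sketch of this pattern; the producer and
    consumer elements are hypothetical, and gst_thread_new () is assumed
    to be available for creating the thread bin.
  </para>
  <programlisting>
  GstElement *pipeline, *thread, *queue;

  pipeline = gst_pipeline_new ("pipeline");
  thread   = gst_thread_new ("thread");
  queue    = gst_element_factory_make ("queue", "queue");

  /* the producer lives in the main pipeline and pushes buffers into
   * the queue; the consumer lives in the thread and pulls them out
   * again on its own schedule */
  gst_bin_add_many (GST_BIN (pipeline), producer, queue, thread, NULL);
  gst_bin_add (GST_BIN (thread), consumer);
  gst_element_link_many (producer, queue, consumer, NULL);
  </programlisting>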

View file

@ -5,7 +5,7 @@
<application>GStreamer</application> is a lively project, with
developers from around the globe very actively contributing.
We often hang out on the #gstreamer IRC channel on
irc.openprojects.net: the following are a selection of amusing<footnote>
irc.freenode.org: the following are a selection of amusing<footnote>
<para>No guarantee of sense of humour compatibility is given.</para>
</footnote> quotes from our conversations.
</para>

View file

@ -28,7 +28,7 @@
</itemizedlist>
</para>
<para>
The scheduler is a plugable component; this means that alternative
The scheduler is a pluggable component; this means that alternative
schedulers can be written and plugged into GStreamer. The default scheduler
uses cothreads to schedule the plugins in a pipeline. Cothreads are fast
and lightweight user-space threads.

View file

@ -11,5 +11,5 @@ Single
5625 2775 7575 2775 7575 4425 5625 4425 5625 2775
2 2 0 1 0 6 50 0 20 0.000 0 0 -1 0 0 5
5625 3600 6375 3600 6375 4125 5625 4125 5625 3600
4 0 0 50 0 16 12 0.0000 4 165 1200 5775 3150 element_name\001
4 0 0 50 0 16 12 0.0000 4 165 1200 5775 3150 sink_element\001
4 0 0 50 0 16 12 0.0000 4 135 330 5850 3975 sink\001

View file

@ -12,4 +12,4 @@ Single
2 2 0 1 0 7 50 0 -1 0.000 0 0 -1 0 0 5
5625 2775 7575 2775 7575 4425 5625 4425 5625 2775
4 0 0 50 0 16 12 0.0000 4 105 255 7050 3975 src\001
4 0 0 50 0 16 12 0.0000 4 165 1200 5775 3150 element_name\001
4 0 0 50 0 16 12 0.0000 4 165 1200 5775 3150 source_element\001

View file

@ -8,7 +8,7 @@
<sect1 id="sec-states">
<title>The different element states</title>
<para>
All elements can be in one of the following four states:
An element can be in one of the following four states:
<itemizedlist>
<listitem>
<para>
@ -37,7 +37,7 @@
<para>
      All elements start in the NULL state. The elements will go through
      the following state changes: NULL -&gt; READY -&gt; PAUSED -&gt;
      PLAYING. Remember when going from PLAYING to READY, GStreamer will
      PLAYING. When going from NULL to PLAYING, GStreamer will
      internally go through the intermediate states.
</para>
<para>
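    <para>
      A short sketch, assuming a <classname>pipeline</classname> element
      already exists, of how these transitions are triggered from
      application code:
    </para>
    <programlisting>
  /* asking for PLAYING from NULL makes GStreamer walk through the
   * intermediate READY and PAUSED states internally */
  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);

  /* the reverse walk happens when shutting the pipeline down */
  gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_NULL);
    </programlisting>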
@ -53,7 +53,7 @@
</programlisting>
<para>
You can set the following states to an element:
You can set the following states on an element:
</para>
<informaltable pgwide="1" frame="none" role="enum">
<tgroup cols="2">
@ -88,7 +88,7 @@
<title>The NULL state</title>
<para>
      When you create the pipeline, all of the elements will be in the NULL state. There is
nothing spectacular about the NULL state.
nothing special about the NULL state.
</para>
<note>
<para>
@ -119,14 +119,6 @@
</note>
</sect1>
<sect1 id="sec-states-playing">
<title>The PLAYING state</title>
<para>
A Pipeline that is in the READY state can be started by setting it to the PLAYING state. At
that time data will start to flow all the way through the pipeline.
</para>
</sect1>
<sect1 id="sec-states-paused">
<title>The PAUSED state</title>
<para>
@ -148,5 +140,13 @@
in the pipeline. We will cover dynamic pipeline behaviour in <xref linkend="cha-dynamic"/>.
</para>
</sect1>
<sect1 id="sec-states-playing">
<title>The PLAYING state</title>
<para>
      A pipeline that is in the READY state can be started by setting it to the PLAYING state. At
that time data will start to flow all the way through the pipeline.
</para>
</sect1>
</chapter>

View file

@ -8,44 +8,44 @@ Single
-2
1200 2
2 1 0 1 0 7 50 0 -1 0.000 0 0 -1 1 0 2
1 1 1.00 90.00 120.00
4050 3750 4575 3750
1 1 1.00 77.53 103.38
3759 3501 4212 3501
2 2 0 1 0 6 50 0 20 0.000 0 0 -1 0 0 5
4575 3600 5325 3600 5325 4125 4575 4125 4575 3600
4212 3371 4858 3371 4858 3824 4212 3824 4212 3371
2 2 0 1 0 6 50 0 20 0.000 0 0 -1 0 0 5
5775 3600 6525 3600 6525 4125 5775 4125 5775 3600
5245 3371 5892 3371 5892 3824 5245 3824 5245 3371
2 1 0 1 0 7 50 0 -1 0.000 0 0 -1 1 0 2
1 1 1.00 90.00 120.00
6525 3750 7125 3750
1 1 1.00 77.53 103.38
5892 3501 6408 3501
2 2 0 1 0 6 50 0 20 0.000 0 0 -1 0 0 5
7125 3600 7875 3600 7875 4125 7125 4125 7125 3600
6408 3371 7055 3371 7055 3824 6408 3824 6408 3371
2 2 0 1 0 6 50 0 20 0.000 0 0 -1 0 0 5
8325 3600 9075 3600 9075 4125 8325 4125 8325 3600
7442 3371 8088 3371 8088 3824 7442 3824 7442 3371
2 2 0 1 0 6 50 0 20 0.000 0 0 -1 0 0 5
9600 3600 10350 3600 10350 4125 9600 4125 9600 3600
8541 3371 9187 3371 9187 3824 8541 3824 8541 3371
2 1 0 1 0 7 50 0 -1 0.000 0 0 -1 1 0 2
1 1 1.00 90.00 120.00
9075 3750 9600 3750
1 1 1.00 77.53 103.38
8088 3501 8541 3501
2 2 0 1 0 6 49 0 20 0.000 0 0 -1 0 0 5
3300 3600 4050 3600 4050 4125 3300 4125 3300 3600
3113 3371 3759 3371 3759 3824 3113 3824 3113 3371
2 2 0 1 0 7 50 0 20 0.000 0 0 -1 0 0 5
2100 2775 4050 2775 4050 4425 2100 4425 2100 2775
2079 2661 3759 2661 3759 4082 2079 4082 2079 2661
2 2 0 1 0 7 51 0 20 0.000 0 0 -1 0 0 5
4575 2775 6525 2775 6525 4425 4575 4425 4575 2775
4212 2661 5892 2661 5892 4082 4212 4082 4212 2661
2 2 0 1 0 7 51 0 20 0.000 0 0 -1 0 0 5
6408 2661 8088 2661 8088 4082 6408 4082 6408 2661
2 2 0 1 0 7 51 0 20 0.000 0 0 -1 0 0 5
8541 2661 10221 2661 10221 4082 8541 4082 8541 2661
2 2 0 1 0 7 100 0 19 0.000 0 0 -1 0 0 5
1950 1950 11700 1950 11700 4800 1950 4800 1950 1950
2 2 0 1 0 7 51 0 20 0.000 0 0 -1 0 0 5
7125 2775 9075 2775 9075 4425 7125 4425 7125 2775
2 2 0 1 0 7 51 0 20 0.000 0 0 -1 0 0 5
9600 2775 11550 2775 11550 4425 9600 4425 9600 2775
4 0 0 50 0 16 12 0.0000 4 135 330 4725 3975 sink\001
4 0 0 50 0 16 12 0.0000 4 105 255 6075 3975 src\001
4 0 0 50 0 16 12 0.0000 4 135 330 7350 3975 sink\001
4 0 0 50 0 16 12 0.0000 4 105 255 8625 3975 src\001
4 0 0 50 0 16 12 0.0000 4 135 330 9750 3975 sink\001
4 0 0 50 0 16 12 0.0000 4 165 1005 2250 3075 disk_source\001
4 0 0 50 0 16 12 0.0000 4 150 465 4725 3075 parse\001
4 0 0 50 0 16 12 0.0000 4 135 690 7275 3075 decoder\001
4 0 0 50 0 16 12 0.0000 4 180 930 9750 3075 play_audio\001
4 0 0 48 0 16 12 0.0000 4 105 255 3525 3975 src\001
4 0 0 50 0 16 12 0.0000 4 135 525 2175 2250 thread\001
1950 1950 10350 1950 10350 4405 1950 4405 1950 1950
4 0 0 50 0 16 10 0.0000 4 116 284 4341 3694 sink\001
4 0 0 50 0 16 10 0.0000 4 90 220 5504 3694 src\001
4 0 0 50 0 16 10 0.0000 4 116 284 6602 3694 sink\001
4 0 0 50 0 16 10 0.0000 4 90 220 7701 3694 src\001
4 0 0 50 0 16 10 0.0000 4 116 284 8670 3694 sink\001
4 0 0 50 0 16 10 0.0000 4 142 866 2208 2919 disk_source\001
4 0 0 50 0 16 10 0.0000 4 129 401 4341 2919 parse\001
4 0 0 50 0 16 10 0.0000 4 116 594 6538 2919 decoder\001
4 0 0 50 0 16 10 0.0000 4 155 801 8670 2919 play_audio\001
4 0 0 48 0 16 10 0.0000 4 90 220 3307 3694 src\001
4 0 0 50 0 16 10 0.0000 4 116 452 2144 2208 thread\001

View file

@ -63,7 +63,7 @@
an audio pipeline.
</para>
<para>
A thread can be visualised as below
<xref linkend="sec-threads-img"/> shows how a thread can be visualised.
</para>
<figure float="1" id="sec-threads-img">
<title>A thread</title>

View file

@ -1,5 +1,5 @@
<chapter id="cha-typedetection">
<title>Typedetection</title>
<title>Type Detection</title>
<para>
    Sometimes the capabilities of a pad are not specified. The filesrc
element, for example, does not know what type of file it is reading. Before
@ -8,7 +8,7 @@
</para>
<para>
To solve this problem, a plugin can provide the <application>GStreamer</application>
core library with a typedefinition library with a typedefinition. The typedefinition
core library with a type definition. The type definition
will contain the following information:
<itemizedlist>
<listitem>
@ -43,7 +43,8 @@ typedef GstCaps *(*GstTypeFindFunc) (GstBuffer *buf, gpointer priv);
understand the buffer contents, it will return NULL.
</para>
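  <para>
    The following is only a sketch of what such a type find function could
    look like; the MP3 sync check and the exact form of the gst_caps_new ()
    call are assumptions made for illustration.
  </para>
  <programlisting>
static GstCaps *
mp3_type_find (GstBuffer *buf, gpointer priv)
{
  guint8 *data = GST_BUFFER_DATA (buf);

  if (GST_BUFFER_SIZE (buf) &lt; 2)
    return NULL;

  /* an MPEG audio frame starts with eleven set sync bits; if we see
   * them, return a caps describing the type, otherwise return NULL */
  if (data[0] == 0xff &amp;&amp; (data[1] &amp; 0xe0) == 0xe0)
    return gst_caps_new ("mp3_type_find", "audio/mpeg", NULL);

  return NULL;
}
  </programlisting>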
<para>
<application>GStreamer</application> has a typefind element in its core elements
<application>GStreamer</application> has a typefind element in the set
of core elements
that can be used to determine the type of a given pad.
</para>
<para>
@ -82,14 +83,12 @@ main(int argc, char *argv[])
g_assert (typefind != NULL);
/* add objects to the main pipeline */
gst_bin_add (GST_BIN (bin), filesrc);
gst_bin_add (GST_BIN (bin), typefind);
gst_bin_add_many (GST_BIN (bin), filesrc, typefind, NULL);
g_signal_connect (G_OBJECT (typefind), "have_type",
G_CALLBACK (type_found), NULL);
gst_pad_link (gst_element_get_pad (filesrc, "src"),
gst_element_get_pad (typefind, "sink"));
gst_element_link (filesrc, typefind);
/* start playing */
gst_element_set_state (GST_ELEMENT (bin), GST_STATE_PLAYING);
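/* A possible implementation of the type_found callback connected above;
 * this is a sketch only: the exact signal signature and the availability
 * of gst_caps_get_mime () are assumptions. */
static void
type_found (GstElement *typefind, GstCaps *caps, gpointer data)
{
  /* print the MIME type of the detected media */
  g_print ("media type found: %s\n", gst_caps_get_mime (caps));
}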

View file

@ -70,27 +70,15 @@ main (int argc, char *argv[])
g_assert (decode != NULL);
/* add objects to the main bin */
gst_bin_add (GST_BIN (bin), filesrc);
gst_bin_add (GST_BIN (bin), queue);
gst_bin_add_many (GST_BIN (bin), filesrc, queue, NULL);
gst_bin_add (GST_BIN (thread), decode);
gst_bin_add (GST_BIN (thread), queue2);
gst_bin_add_many (GST_BIN (thread), decode, queue2, NULL);
gst_bin_add (GST_BIN (thread2), osssink);
gst_pad_link (gst_element_get_pad (filesrc,"src"),
gst_element_get_pad (queue,"sink"));
gst_element_link_many (filesrc, queue, decode, queue2, osssink, NULL);
gst_pad_link (gst_element_get_pad (queue,"src"),
gst_element_get_pad (decode,"sink"));
gst_pad_link (gst_element_get_pad (decode,"src"),
gst_element_get_pad (queue2,"sink"));
gst_pad_link (gst_element_get_pad (queue2,"src"),
gst_element_get_pad (osssink,"sink"));
gst_bin_add (GST_BIN (bin), thread);
gst_bin_add (GST_BIN (bin), thread2);
gst_bin_add_many (GST_BIN (bin), thread, thread2, NULL);
/* write the bin to stdout */
gst_xml_write_file (GST_ELEMENT (bin), stdout);
@ -204,7 +192,7 @@ xmlNsPtr ns;
...
</programlisting>
<para>
When the thread is saved, the object_save method will be caled. Our example
When the thread is saved, the object_save method will be called. Our example
will insert a comment tag:
</para>
<programlisting>