<chapter id="chapter-dataaccess">
<title>Pipeline manipulation</title>
<para>
This chapter discusses several ways in which you can manipulate your
pipeline from your application. Parts of this chapter are downright
hackish, so be aware that you'll need some programming knowledge
before you start reading.
</para>
<para>
Topics that will be discussed here include how you can insert data into
a pipeline from your application, how to read data from a pipeline,
how to manipulate the pipeline's speed, length and starting point, and
how to listen to a pipeline's data processing.
</para>
<sect1 id="section-data-probe">
<title>Data probing</title>
<para>
Probing is best envisioned as a pad listener. Technically, a probe is
nothing more than a callback that can be attached to a pad.
You can attach a probe using <function>gst_pad_add_probe ()</function>
and remove the callback again with
<function>gst_pad_remove_probe ()</function>.
</para>
<para>
Probes run in the pipeline's streaming thread context, so callbacks
should avoid blocking and generally avoid doing anything unusual, since
this could have a negative impact on pipeline performance or, in case
of bugs, cause deadlocks or crashes. More precisely, one should usually
not call any GUI-related functions from within a probe callback, nor
try to change the state of the pipeline. An application may post custom
messages on the pipeline's bus, though, to communicate with the main
application thread and have it do things like stop the pipeline.
</para>
<para>
In any case, most common buffer operations that elements can perform
in their <function>_chain ()</function> functions can be done in probe
callbacks as well. The example below gives a short impression of how
to use them (even if this usage is not entirely correct, but more on
that below):
</para>
<programlisting><!-- example-begin probe.c -->
#include &lt;gst/gst.h&gt;

static GstPadProbeReturn
cb_have_data (GstPad          *pad,
              GstPadProbeInfo *info,
              gpointer         user_data)
{
  gint x, y;
  GstMapInfo map;
  guint16 *ptr, t;
  GstBuffer *buffer;

  buffer = GST_PAD_PROBE_INFO_BUFFER (info);

  buffer = gst_buffer_make_writable (buffer);

  gst_buffer_map (buffer, &amp;map, GST_MAP_WRITE);

  ptr = (guint16 *) map.data;
  /* invert data */
  for (y = 0; y &lt; 288; y++) {
    for (x = 0; x &lt; 384 / 2; x++) {
      t = ptr[384 - 1 - x];
      ptr[384 - 1 - x] = ptr[x];
      ptr[x] = t;
    }
    ptr += 384;
  }
  gst_buffer_unmap (buffer, &amp;map);

  GST_PAD_PROBE_INFO_DATA (info) = buffer;

  return GST_PAD_PROBE_OK;
}

gint
main (gint   argc,
      gchar *argv[])
{
  GMainLoop *loop;
  GstElement *pipeline, *src, *sink, *filter, *csp;
  GstCaps *filtercaps;
  GstPad *pad;

  /* init GStreamer */
  gst_init (&amp;argc, &amp;argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* build */
  pipeline = gst_pipeline_new ("my-pipeline");
  src = gst_element_factory_make ("videotestsrc", "src");
  if (src == NULL)
    g_error ("Could not create 'videotestsrc' element");
  filter = gst_element_factory_make ("capsfilter", "filter");
  g_assert (filter != NULL); /* should always exist */
  csp = gst_element_factory_make ("videoconvert", "csp");
  if (csp == NULL)
    g_error ("Could not create 'videoconvert' element");
  sink = gst_element_factory_make ("xvimagesink", "sink");
  if (sink == NULL) {
    sink = gst_element_factory_make ("ximagesink", "sink");
    if (sink == NULL)
      g_error ("Could not create 'xvimagesink' or 'ximagesink' element");
  }

  gst_bin_add_many (GST_BIN (pipeline), src, filter, csp, sink, NULL);
  gst_element_link_many (src, filter, csp, sink, NULL);
  filtercaps = gst_caps_new_simple ("video/x-raw",
      "format", G_TYPE_STRING, "RGB16",
      "width", G_TYPE_INT, 384,
      "height", G_TYPE_INT, 288,
      "framerate", GST_TYPE_FRACTION, 25, 1,
      NULL);
  g_object_set (G_OBJECT (filter), "caps", filtercaps, NULL);
  gst_caps_unref (filtercaps);

  pad = gst_element_get_static_pad (src, "src");
  gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER,
      (GstPadProbeCallback) cb_have_data, NULL, NULL);
  gst_object_unref (pad);

  /* run */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* wait until it's up and running or failed */
  if (gst_element_get_state (pipeline, NULL, NULL, -1) == GST_STATE_CHANGE_FAILURE) {
    g_error ("Failed to go into PLAYING state");
  }

  g_print ("Running ...\n");
  g_main_loop_run (loop);

  /* exit */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);

  return 0;
}
<!-- example-end probe.c --></programlisting>
<para>
Compare that output with the output of <quote>gst-launch-1.0
videotestsrc ! xvimagesink</quote>, just so you know what you're
looking for.
</para>
<para>
Strictly speaking, a pad probe callback is only allowed to modify the
buffer content if the buffer is writable. Whether this is the case or
not depends a lot on the pipeline and the elements involved. Often
enough, this is the case, but sometimes it is not, and if it is not
then unexpected modification of the data or metadata can introduce
bugs that are very hard to debug and track down. You can check if a
buffer is writable with <function>gst_buffer_is_writable ()</function>.
Since you can pass back a different buffer than the one passed in,
it is a good idea to make the buffer writable in the callback function
with <function>gst_buffer_make_writable ()</function>.
</para>
<para>
Pad probes are best suited for looking at data as it passes through
the pipeline. If you need to modify data, you are better off writing
your own GStreamer element. Base classes like GstAudioFilter,
GstVideoFilter or GstBaseTransform make this fairly easy.
</para>
<para>
If you just want to inspect buffers as they pass through the pipeline,
you don't even need to set up pad probes. You could also just insert
an identity element into the pipeline and connect to its "handoff"
signal. The identity element also provides a few useful debugging
tools, such as the "dump" property and the "last-message" property
(the latter is enabled by passing the '-v' switch to gst-launch-1.0
and by setting the "silent" property of the identity element to FALSE).
</para>
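<para>
For example, hooking up the <quote>handoff</quote> signal of an
identity element could look roughly like this (a sketch; the
surrounding pipeline setup is assumed to exist in your application):
</para>
<programlisting>
static void
cb_handoff (GstElement *identity,
            GstBuffer  *buffer,
            gpointer    user_data)
{
  /* read-only inspection of the buffer that just passed through */
  g_print ("buffer of %" G_GSIZE_FORMAT " bytes\n",
      gst_buffer_get_size (buffer));
}

  ...
  identity = gst_element_factory_make ("identity", "identity");
  g_signal_connect (identity, "handoff", G_CALLBACK (cb_handoff), NULL);
  ...
</programlisting>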
</sect1>
<sect1 id="section-data-spoof">
<title>Manually adding or removing data from/to a pipeline</title>
<para>
Many people have expressed the wish to use their own sources to inject
data into a pipeline. Some people have also expressed the wish to grab
the output of a pipeline and take care of the actual output inside
their application. While both of these methods are strongly
discouraged, &GStreamer; offers support for them.
<emphasis>Beware! You need to know what you are doing.</emphasis> Since
you don't have any support from a base class, you need to thoroughly
understand state changes and synchronization. If it doesn't work,
there are a million ways to shoot yourself in the foot. It's always
better to simply write a plugin and have the base class manage it.
See the Plugin Writer's Guide for more information on this topic. Also
see the next section, which will explain how to embed plugins statically
in your application.
</para>
<para>
There are two elements that you can use for the above-mentioned
purposes: <quote>appsrc</quote> (an imaginary source) and
<quote>appsink</quote> (an imaginary sink). The same method applies
to each of these elements. Here, we will discuss how to use them to
insert (using appsrc) or grab (using appsink) data from a pipeline,
and how to set up negotiation.
</para>
<para>
Both appsrc and appsink provide two sets of API. One API uses standard
GObject (action) signals and properties. The same API is also
available as a regular C API. The C API is more performant but
requires you to link against the app library in order to use the
elements.
</para>
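<para>
As a rough illustration of the difference, pushing a buffer into
appsrc can be done in either of the following two ways (note that the
action signal does not take ownership of the buffer, while the C
function does):
</para>
<programlisting>
#include &lt;gst/app/gstappsrc.h&gt;

GstFlowReturn ret;

/* either: signal API, works on a plain GstElement pointer; the
 * buffer must still be unreffed by the caller afterwards */
g_signal_emit_by_name (appsrc, "push-buffer", buffer, &amp;ret);
gst_buffer_unref (buffer);

/* or: C API, requires linking against the app library and takes
 * ownership of the buffer */
ret = gst_app_src_push_buffer (GST_APP_SRC (appsrc), buffer);
</programlisting>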
<sect2 id="section-spoof-appsrc">
<title>Inserting data with appsrc</title>
<para>
First we look at some examples for appsrc, which lets you insert data
into the pipeline from the application. Appsrc has some configuration
options that define how it will operate. You should decide on the
following configuration options:
</para>
<itemizedlist>
<listitem>
<para>
Whether the appsrc will operate in push or pull mode. The stream-type
property can be used to control this: a stream-type of
<quote>random-access</quote> activates pull mode scheduling,
while the other stream-types activate push mode.
</para>
</listitem>
<listitem>
<para>
The caps of the buffers that appsrc will push out. This needs to
be configured with the caps property. The caps must be fixed
and will be used to negotiate a format downstream.
</para>
</listitem>
<listitem>
<para>
Whether the appsrc operates in live mode or not. This can be configured
with the is-live property. When operating in live mode it is
important to configure the min-latency and max-latency properties of
appsrc. The min-latency should be set to the amount of time it takes
between capturing a buffer and pushing it into appsrc.
In live mode, you should timestamp the buffers with the pipeline
running-time at which the first byte of the buffer was captured before
feeding them to appsrc. You can let appsrc do the timestamping with
the do-timestamp property (but then the min-latency must be set
to 0 because it timestamps based on the running-time when the buffer
enters appsrc).
</para>
</listitem>
<listitem>
<para>
The format of the SEGMENT event that appsrc will push. The format
has implications for how the running-time of the buffers will
be calculated, so you must be sure you understand this. For
live sources you probably want to set the format property to
GST_FORMAT_TIME. For non-live sources it depends on the media type
that you are handling. If you plan to timestamp the buffers, you
should probably use GST_FORMAT_TIME; otherwise
GST_FORMAT_BYTES might be appropriate.
</para>
</listitem>
<listitem>
<para>
If appsrc operates in random-access mode, it is important to configure
the size property of appsrc with the number of bytes in the stream.
This will allow downstream elements to know the size of the media and
allows them to seek to the end of the stream when needed.
</para>
</listitem>
</itemizedlist>
<para>
The main way of feeding data to appsrc is by using the function
<function>gst_app_src_push_buffer ()</function> or by emitting the
push-buffer action signal. This will put the buffer onto a queue from
which appsrc will read in its streaming thread. It is important
to note that data transport will not happen from the thread that
performed the push-buffer call.
</para>
<para>
The <quote>max-bytes</quote> property controls how much data can be
queued in appsrc before appsrc considers the queue full. A filled
internal queue will always emit the <quote>enough-data</quote>
signal, which notifies the application that it should stop pushing
data into appsrc. The <quote>block</quote> property will cause appsrc to
block the push-buffer method until free space becomes available again.
</para>
<para>
When the internal queue is running out of data, the
<quote>need-data</quote> signal is emitted, which signals the application
that it should start pushing more data into appsrc.
</para>
<para>
In addition to the <quote>need-data</quote> and <quote>enough-data</quote>
signals, appsrc can emit the <quote>seek-data</quote> signal when the
<quote>stream-type</quote> property is set to <quote>seekable</quote>
or <quote>random-access</quote>. The signal argument will contain the
new desired position in the stream, expressed in the unit set with the
<quote>format</quote> property. After receiving the seek-data signal,
the application should push buffers from the new position.
</para>
<para>
When the last byte has been pushed into appsrc, you must call
<function>gst_app_src_end_of_stream ()</function> to make it send
an EOS event downstream.
</para>
<para>
These signals allow the application to operate appsrc in push and
pull mode as will be explained next.
</para>
<sect3 id="section-spoof-appsrc-push">
<title>Using appsrc in push mode</title>
<para>
When appsrc is configured in push mode (stream-type is stream or
seekable), the application repeatedly calls the push-buffer method
with a new buffer. Optionally, the queue size in the appsrc can be
controlled with the enough-data and need-data signals by respectively
stopping/starting the push-buffer calls. The value of the
min-percent property defines how empty the internal appsrc queue
needs to be before the need-data signal will be fired. You can set
this to a value greater than 0 to avoid completely draining the queue.
</para>
<para>
When the stream-type is set to seekable, don't forget to implement
a seek-data callback.
</para>
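<para>
A seek-data callback could look roughly like this (a sketch; how you
track the current read offset is up to your application, the
<quote>current_offset</quote> variable here is hypothetical):
</para>
<programlisting>
static gboolean
cb_seek_data (GstElement *appsrc,
              guint64     position,
              gpointer    user_data)
{
  /* remember the new position; subsequent push-buffer calls must
   * provide data starting from here */
  current_offset = position;
  return TRUE;
}

  ...
  g_signal_connect (appsrc, "seek-data", G_CALLBACK (cb_seek_data), NULL);
  ...
</programlisting>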
<para>
Use this model when implementing various network protocols or
hardware devices.
</para>
</sect3>
<sect3 id="section-spoof-appsrc-pull">
<title>Using appsrc in pull mode</title>
<para>
In the pull model, data is fed to appsrc from the need-data signal
handler. You should push exactly the number of bytes requested in the
need-data signal. You are only allowed to push fewer bytes when you
are at the end of the stream.
</para>
<para>
Use this model for file access or other randomly accessible sources.
</para>
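<para>
A need-data handler for the pull model could be sketched as follows
(<function>fill_from_file ()</function> is a hypothetical helper that
reads up to the requested number of bytes from the backing store):
</para>
<programlisting>
static void
cb_need_data (GstElement *appsrc,
              guint       length,
              gpointer    user_data)
{
  GstBuffer *buffer;
  GstFlowReturn ret;
  GstMapInfo map;
  gsize read;

  buffer = gst_buffer_new_allocate (NULL, length, NULL);
  gst_buffer_map (buffer, &amp;map, GST_MAP_WRITE);
  read = fill_from_file (map.data, length);
  gst_buffer_unmap (buffer, &amp;map);

  if (read == 0) {
    /* nothing left to read: signal end-of-stream instead */
    gst_buffer_unref (buffer);
    g_signal_emit_by_name (appsrc, "end-of-stream", &amp;ret);
    return;
  }

  /* pushing fewer than 'length' bytes is only allowed at the end
   * of the stream */
  gst_buffer_set_size (buffer, read);
  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &amp;ret);
  gst_buffer_unref (buffer);
}
</programlisting>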
</sect3>
<sect3 id="section-spoof-appsrc-ex">
<title>Appsrc example</title>
<para>
This example application will generate black/white (it switches
color every frame) video to an Xv-window output by using appsrc as a
source with caps to force a format. We use a colorspace
conversion element to make sure that we feed the right format to
the X server. We configure a video stream with a variable framerate
(0/1) and we set the timestamps on the outgoing buffers in such
a way that we play 2 frames per second.
</para>
<para>
Note how we use the pull-mode method of feeding new buffers into
appsrc (from the need-data signal handler) even though appsrc is
operating in push mode.
</para>
<programlisting><!-- example-begin appsrc.c -->
#include &lt;gst/gst.h&gt;

static GMainLoop *loop;

static void
cb_need_data (GstElement *appsrc,
              guint       unused_size,
              gpointer    user_data)
{
  static gboolean white = FALSE;
  static GstClockTime timestamp = 0;
  GstBuffer *buffer;
  guint size;
  GstFlowReturn ret;

  size = 384 * 288 * 2;
  buffer = gst_buffer_new_allocate (NULL, size, NULL);

  /* this makes the image black/white */
  gst_buffer_memset (buffer, 0, white ? 0xff : 0x0, size);
  white = !white;

  GST_BUFFER_PTS (buffer) = timestamp;
  GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 2);

  timestamp += GST_BUFFER_DURATION (buffer);

  g_signal_emit_by_name (appsrc, "push-buffer", buffer, &amp;ret);
  gst_buffer_unref (buffer);

  if (ret != GST_FLOW_OK) {
    /* something wrong, stop pushing */
    g_main_loop_quit (loop);
  }
}

gint
main (gint   argc,
      gchar *argv[])
{
  GstElement *pipeline, *appsrc, *conv, *videosink;

  /* init GStreamer */
  gst_init (&amp;argc, &amp;argv);
  loop = g_main_loop_new (NULL, FALSE);

  /* setup pipeline */
  pipeline = gst_pipeline_new ("pipeline");
  appsrc = gst_element_factory_make ("appsrc", "source");
  conv = gst_element_factory_make ("videoconvert", "conv");
  videosink = gst_element_factory_make ("xvimagesink", "videosink");

  /* setup */
  g_object_set (G_OBJECT (appsrc), "caps",
      gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, "RGB16",
          "width", G_TYPE_INT, 384,
          "height", G_TYPE_INT, 288,
          "framerate", GST_TYPE_FRACTION, 0, 1,
          NULL), NULL);
  gst_bin_add_many (GST_BIN (pipeline), appsrc, conv, videosink, NULL);
  gst_element_link_many (appsrc, conv, videosink, NULL);

  /* setup appsrc */
  g_object_set (G_OBJECT (appsrc),
      "stream-type", 0,
      "format", GST_FORMAT_TIME, NULL);
  g_signal_connect (appsrc, "need-data", G_CALLBACK (cb_need_data), NULL);

  /* play */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (loop);

  /* clean up */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (pipeline));
  g_main_loop_unref (loop);

  return 0;
}
<!-- example-end appsrc.c --></programlisting>
</sect3>
</sect2>
<sect2 id="section-spoof-appsink">
<title>Grabbing data with appsink</title>
<para>
Appsink is a little easier to use than appsrc. It also supports
both a pull-based and a push-based model of getting data from the
pipeline.
</para>
<para>
The normal way of retrieving samples from appsink is by using the
<function>gst_app_sink_pull_sample()</function> and
<function>gst_app_sink_pull_preroll()</function> methods or by using
the <quote>pull-sample</quote> and <quote>pull-preroll</quote>
signals. These methods block until a sample becomes available in the
sink, or until the sink is shut down or reaches EOS.
</para>
<para>
Appsink will internally use a queue to collect buffers from the
streaming thread. If the application is not pulling samples fast
enough, this queue will consume a lot of memory over time. The
<quote>max-buffers</quote> property can be used to limit the queue
size. The <quote>drop</quote> property controls whether the
streaming thread blocks or if older buffers are dropped when the
maximum queue size is reached. Note that blocking the streaming thread
can negatively affect real-time performance and should be avoided.
</para>
<para>
If a blocking behaviour is not desirable, setting the
<quote>emit-signals</quote> property to TRUE will make appsink emit
the <quote>new-sample</quote> and <quote>new-preroll</quote> signals
when a sample can be pulled without blocking.
</para>
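<para>
Connecting to the <quote>new-sample</quote> signal could look roughly
like this (a sketch; the actual processing of the sample is left out):
</para>
<programlisting>
static GstFlowReturn
cb_new_sample (GstElement *appsink,
               gpointer    user_data)
{
  GstSample *sample;

  /* a sample is already available, so this pull will not block */
  g_signal_emit_by_name (appsink, "pull-sample", &amp;sample);
  if (sample) {
    /* ... inspect or copy the sample here ... */
    gst_sample_unref (sample);
  }
  return GST_FLOW_OK;
}

  ...
  g_object_set (appsink, "emit-signals", TRUE, NULL);
  g_signal_connect (appsink, "new-sample", G_CALLBACK (cb_new_sample), NULL);
  ...
</programlisting>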
<para>
The <quote>caps</quote> property on appsink can be used to control
the formats that appsink can receive. This property can contain
non-fixed caps; the format of the pulled samples can be obtained by
getting the sample caps.
</para>
<para>
If one of the pull-preroll or pull-sample methods returns NULL, the
appsink is stopped or in the EOS state. You can check for the EOS state
with the <quote>eos</quote> property or with the
<function>gst_app_sink_is_eos()</function> method.
</para>
<para>
The eos signal can also be used to be informed when the EOS state is
reached to avoid polling.
</para>
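<para>
A simple blocking pull loop using the signal API could be sketched as
follows (<function>process_sample ()</function> is a hypothetical
application function):
</para>
<programlisting>
GstSample *sample;

while (TRUE) {
  /* blocks until a sample is available, the sink shuts down or EOS */
  g_signal_emit_by_name (appsink, "pull-sample", &amp;sample);
  if (sample == NULL)
    break;   /* appsink was stopped or reached EOS */

  process_sample (sample);
  gst_sample_unref (sample);
}
</programlisting>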
<para>
Consider configuring the following properties in the appsink:
</para>
<itemizedlist>
<listitem>
<para>
The <quote>sync</quote> property if you want to have the sink
base class synchronize the buffer against the pipeline clock
before handing you the sample.
</para>
</listitem>
<listitem>
<para>
Enable Quality-of-Service with the <quote>qos</quote> property.
If you are dealing with raw video frames and let the base class
synchronize on the clock, it might be a good idea to also let
the base class send QOS events upstream.
</para>
</listitem>
<listitem>
<para>
The caps property that contains the accepted caps. Upstream elements
will try to convert the format so that it matches the configured
caps on appsink. You must still check the
<classname>GstSample</classname> to get the actual caps of the
buffer.
</para>
</listitem>
</itemizedlist>
<sect3 id="section-spoof-appsink-ex">
<title>Appsink example</title>
<para>
What follows is an example of how to capture a snapshot of a video
stream using appsink.
</para>
<programlisting>
<![CDATA[
<!-- example-begin appsink.c -->
#include <gst/gst.h>
#ifdef HAVE_GTK
#include <gtk/gtk.h>
#endif

#include <stdlib.h>

#define CAPS "video/x-raw,format=RGB,width=160,pixel-aspect-ratio=1/1"

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *sink;
  gint width, height;
  GstSample *sample;
  gchar *descr;
  GError *error = NULL;
  gint64 duration, position;
  GstStateChangeReturn ret;
  gboolean res;
  GstMapInfo map;
#ifdef HAVE_GTK
  GdkPixbuf *pixbuf;
#endif

  gst_init (&argc, &argv);

  if (argc != 2) {
    g_print ("usage: %s <uri>\n Writes snapshot.png in the current directory\n",
        argv[0]);
    exit (-1);
  }

  /* create a new pipeline */
  descr =
      g_strdup_printf ("uridecodebin uri=%s ! videoconvert ! videoscale ! "
      " appsink name=sink caps=\"" CAPS "\"", argv[1]);
  pipeline = gst_parse_launch (descr, &error);

  if (error != NULL) {
    g_print ("could not construct pipeline: %s\n", error->message);
    g_error_free (error);
    exit (-1);
  }

  /* get sink */
  sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");

  /* set to PAUSED to make the first frame arrive in the sink */
  ret = gst_element_set_state (pipeline, GST_STATE_PAUSED);
  switch (ret) {
    case GST_STATE_CHANGE_FAILURE:
      g_print ("failed to play the file\n");
      exit (-1);
    case GST_STATE_CHANGE_NO_PREROLL:
      /* for live sources, we need to set the pipeline to PLAYING before we can
       * receive a buffer. We don't do that yet */
      g_print ("live sources not supported yet\n");
      exit (-1);
    default:
      break;
  }
  /* This can block for up to 5 seconds. If your machine is really overloaded,
   * it might time out before the pipeline prerolled and we generate an error. A
   * better way is to run a mainloop and catch errors there. */
  ret = gst_element_get_state (pipeline, NULL, NULL, 5 * GST_SECOND);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    g_print ("failed to play the file\n");
    exit (-1);
  }

  /* get the duration */
  gst_element_query_duration (pipeline, GST_FORMAT_TIME, &duration);

  if (duration != -1)
    /* we have a duration, seek to 5% */
    position = duration * 5 / 100;
  else
    /* no duration, seek to 1 second, this could EOS */
    position = 1 * GST_SECOND;

  /* seek to a position in the file. Most files have a black first frame so
   * by seeking to somewhere else we have a bigger chance of getting something
   * more interesting. An optimisation would be to detect black images and then
   * seek a little more */
  gst_element_seek_simple (pipeline, GST_FORMAT_TIME,
      GST_SEEK_FLAG_KEY_UNIT | GST_SEEK_FLAG_FLUSH, position);

  /* get the preroll buffer from appsink, this blocks until appsink really
   * prerolls */
  g_signal_emit_by_name (sink, "pull-preroll", &sample, NULL);

  /* if we have a buffer now, convert it to a pixbuf. It's possible that we
   * don't have a buffer because we went EOS right away or had an error. */
  if (sample) {
    GstBuffer *buffer;
    GstCaps *caps;
    GstStructure *s;

    /* get the snapshot buffer format now. We set the caps on the appsink so
     * that it can only be an rgb buffer. The only thing we have not specified
     * on the caps is the height, which is dependent on the pixel-aspect-ratio
     * of the source material */
    caps = gst_sample_get_caps (sample);
    if (!caps) {
      g_print ("could not get snapshot format\n");
      exit (-1);
    }
    s = gst_caps_get_structure (caps, 0);

    /* we need to get the final caps on the buffer to get the size */
    res = gst_structure_get_int (s, "width", &width);
    res |= gst_structure_get_int (s, "height", &height);
    if (!res) {
      g_print ("could not get snapshot dimension\n");
      exit (-1);
    }

    /* create pixmap from buffer and save, gstreamer video buffers have a stride
     * that is rounded up to the nearest multiple of 4 */
    buffer = gst_sample_get_buffer (sample);
    gst_buffer_map (buffer, &map, GST_MAP_READ);
#ifdef HAVE_GTK
    pixbuf = gdk_pixbuf_new_from_data (map.data,
        GDK_COLORSPACE_RGB, FALSE, 8, width, height,
        GST_ROUND_UP_4 (width * 3), NULL, NULL);

    /* save the pixbuf */
    gdk_pixbuf_save (pixbuf, "snapshot.png", "png", &error, NULL);
#endif
    gst_buffer_unmap (buffer, &map);
  } else {
    g_print ("could not make snapshot\n");
  }

  /* cleanup and exit */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  exit (0);
}
]]>
<!-- example-end appsink.c --></programlisting>
</sect3>
</sect2>
</sect1>
<sect1 id="section-spoof-format">
<title>Forcing a format</title>
<para>
Sometimes you'll want to set a specific format, for example a video
size and format or an audio bitsize and number of channels. You can
do this by forcing a specific <classname>GstCaps</classname> on
the pipeline, which is possible by using
<emphasis>filtered caps</emphasis>. You can set filtered caps on
a link by using the <quote>capsfilter</quote> element between the
two elements and specifying a <classname>GstCaps</classname> as the
<quote>caps</quote> property on this element. It will then
only allow types matching that specified capability set for
negotiation. See also <xref linkend="section-caps-filter"/>.
</para>
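<para>
Alternatively, <function>gst_element_link_filtered ()</function> links
two elements and inserts the capsfilter for you. A short sketch (the
<quote>src</quote> and <quote>sink</quote> element variables are
assumed to exist in your application):
</para>
<programlisting>
GstCaps *caps;

caps = gst_caps_new_simple ("audio/x-raw",
    "format", G_TYPE_STRING, "S16LE",
    "channels", G_TYPE_INT, 2,
    "rate", G_TYPE_INT, 44100,
    NULL);

/* links 'src' to 'sink' with a capsfilter carrying these caps */
if (!gst_element_link_filtered (src, sink, caps))
  g_warning ("Failed to link elements with filtered caps");
gst_caps_unref (caps);
</programlisting>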
</sect1>
<sect1 id="section-spoof-probes">
<title>Dynamically changing the pipeline</title>
<para>
WRITEME
</para>
</sect1>
<sect1 id="section-data-manager">
<title>Embedding static elements in your application</title>
<para>
The <ulink type="http"
url="http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/index.html">Plugin
Writer's Guide</ulink> describes in great detail how to write elements
for the &GStreamer; framework. In this section, we will solely discuss
how to embed such elements statically in your application. This can be
useful for application-specific elements that have no use elsewhere in
&GStreamer;.
</para>
<para>
Dynamically loaded plugins contain a structure that's defined using
<function>GST_PLUGIN_DEFINE ()</function>. This structure is loaded
when the plugin is loaded by the &GStreamer; core. The structure
contains an initialization function (usually called
<function>plugin_init</function>) that will be called right after that.
Its purpose is to register the elements provided by the plugin with
the &GStreamer; framework.
If you want to embed elements directly in
your application, the only thing you need to do is to replace
<function>GST_PLUGIN_DEFINE ()</function> with a call to
<function>gst_plugin_register_static ()</function>. As soon as you
call <function>gst_plugin_register_static ()</function>, the elements
will from then on be available like any other element, without them
having to be dynamically loadable libraries. In the example below, you
would be able to call <function>gst_element_factory_make
("my-element-name", "some-name")</function> to create an instance of the
element.
</para>
<programlisting>
/*
 * Here, you would write the actual plugin code.
 */

[..]

static gboolean
register_elements (GstPlugin *plugin)
{
  return gst_element_register (plugin, "my-element-name",
      GST_RANK_NONE, MY_PLUGIN_TYPE);
}

static void
my_code_init (void)
{
  ...

  gst_plugin_register_static (
      GST_VERSION_MAJOR,
      GST_VERSION_MINOR,
      "my-private-plugins",
      "Private elements of my application",
      register_elements,
      VERSION,
      "LGPL",
      "my-application-source",
      "my-application",
      "http://www.my-application.net/");

  ...
}
</programlisting>
</sect1>
</chapter>