Threads
GStreamer has support for multithreading through the use of
the GstThread object. This object is in fact
a special GstBin that will become a thread when started.
To construct a new thread, you would do something like the following:
GstElement *my_thread;
/* create the thread object */
my_thread = gst_thread_new ("my_thread");
/* you could have used gst_element_factory_make ("thread", "my_thread"); */
g_return_if_fail (my_thread != NULL);
/* add some plugins */
gst_bin_add (GST_BIN (my_thread), GST_ELEMENT (funky_src));
gst_bin_add (GST_BIN (my_thread), GST_ELEMENT (cool_effect));
/* connect the elements here... */
...
/* start playing */
gst_element_set_state (GST_ELEMENT (my_thread), GST_STATE_PLAYING);
The code above creates a thread with two elements in it. As soon as the
thread is set to the PLAYING state, it will start iterating itself;
you never need to explicitly iterate a thread.
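For comparison, a normal bin or pipeline has to be driven by the application itself. The short sketch below assumes a pipeline named my_pipeline that has already been built and set to PLAYING; with a GstThread this loop is simply not needed:
/* a plain pipeline must be iterated by the application;
   gst_bin_iterate () returns FALSE when there is nothing left to do */
while (gst_bin_iterate (GST_BIN (my_pipeline)))
  ;
/* a thread, by contrast, iterates itself in the background as soon
   as it reaches the PLAYING state */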
Constraints placed on the pipeline by the GstThread
Within the pipeline, everything is the same as in any other bin. The
difference lies at the thread boundary, at the connection between the
thread and the outside world (containing bin). Since GStreamer is
fundamentally buffer-oriented rather than byte-oriented, the natural
solution to this problem is an element that can "buffer" the buffers
between the threads, in a thread-safe fashion. This element is the
queue, described more fully elsewhere in this manual. It doesn't
matter if the queue is placed in the containing bin or in the thread
itself, but it needs to be present on one side or the other to enable
inter-thread communication.
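The following sketch shows one way this could look in code: the queue is placed in the containing pipeline, while the sink lives inside the thread. The fakesrc and fakesink elements are only stand-ins for real source and sink elements:
GstElement *pipeline, *thread, *source, *queue, *sink;

pipeline = gst_pipeline_new ("pipeline");
thread = gst_thread_new ("thread");

source = gst_element_factory_make ("fakesrc", "source");
queue = gst_element_factory_make ("queue", "queue");
sink = gst_element_factory_make ("fakesink", "sink");

/* the queue lives in the containing bin here; placing it inside
   the thread instead would work just as well */
gst_bin_add_many (GST_BIN (pipeline), source, queue, NULL);
gst_bin_add (GST_BIN (thread), sink);
gst_bin_add (GST_BIN (pipeline), thread);

/* the queue-to-sink link crosses the thread boundary */
gst_element_link (source, queue);
gst_element_link (queue, sink);

gst_element_set_state (GST_ELEMENT (pipeline), GST_STATE_PLAYING);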
When would you want to use a thread?
If you are writing a GUI application, making the top-level bin a thread will make your GUI
more responsive. If it were a pipeline instead, it would have to be iterated by your
application's event loop, which increases the latency between events (say, keyboard presses)
and responses from the GUI. In addition, any slight hang in the GUI would delay iteration of
the pipeline, which (for example) could cause pops in the output of the sound card, if it is
an audio pipeline.
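To make the contrast concrete, here is a rough sketch of the idle-handler approach a GUI application would otherwise have to use; the pipeline variable and the use of the GLib main loop are assumptions of this sketch, and none of it is needed when the top-level bin is a GstThread:
/* sketch: iterate the pipeline from the GUI's idle handler;
   the handler is removed automatically once iteration returns FALSE */
static gboolean
idle_iterate (gpointer data)
{
  return gst_bin_iterate (GST_BIN (data));
}
...
g_idle_add (idle_iterate, pipeline);
/* with a GstThread as top-level bin this idle handler is unnecessary,
   and the GUI's event loop never spends time on pipeline work */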
As an example we show the helloworld program using a thread.
/* example-begin threads.c */
#include <stdlib.h>
#include <gst/gst.h>

/* we set this to TRUE right before gst_main (), but there could still
   be a race condition between setting it and entering the function */
gboolean can_quit = FALSE;

/* eos will be called when the src element has an end of stream */
void
eos (GstElement *src, gpointer data)
{
  GstThread *thread = GST_THREAD (data);
  g_print ("have eos, quitting\n");

  /* stop the bin */
  gst_element_set_state (GST_ELEMENT (thread), GST_STATE_NULL);

  while (!can_quit) /* waste cycles */ ;
  gst_main_quit ();
}

int
main (int argc, char *argv[])
{
  GstElement *filesrc, *decoder, *audiosink;
  GstElement *thread;

  if (argc < 2) {
    g_print ("usage: %s <Ogg/Vorbis filename>\n", argv[0]);
    exit (-1);
  }

  gst_init (&argc, &argv);

  /* create a new thread to hold the elements */
  thread = gst_thread_new ("thread");
  g_assert (thread != NULL);

  /* create a disk reader */
  filesrc = gst_element_factory_make ("filesrc", "disk_source");
  g_assert (filesrc != NULL);
  g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
  g_signal_connect (G_OBJECT (filesrc), "eos",
                    G_CALLBACK (eos), thread);

  /* create an ogg decoder */
  decoder = gst_element_factory_make ("vorbisfile", "decoder");
  g_assert (decoder != NULL);

  /* and an audio sink */
  audiosink = gst_element_factory_make ("osssink", "play_audio");
  g_assert (audiosink != NULL);

  /* add objects to the thread */
  gst_bin_add_many (GST_BIN (thread), filesrc, decoder, audiosink, NULL);
  /* connect them in the logical order */
  gst_element_link_many (filesrc, decoder, audiosink, NULL);

  /* start playing */
  gst_element_set_state (GST_ELEMENT (thread), GST_STATE_PLAYING);

  /* do whatever you want here, the thread will be playing */
  g_print ("thread is playing\n");

  can_quit = TRUE;
  gst_main ();
  /* the thread was stopped in the eos () handler; free it */
  gst_object_unref (GST_OBJECT (thread));

  exit (0);
}
/* example-end threads.c */