Threads

GStreamer has support for multithreading through the use of the GstThread object. This object is in fact a special GstBin that will become a thread when started. To construct a new thread you would do something like:

  GstElement *my_thread;

  /* create the thread object */
  my_thread = gst_thread_new ("my_thread");
  /* you could have used gst_element_factory_make ("thread", "my_thread"); */
  g_return_if_fail (my_thread != NULL);

  /* add some plugins */
  gst_bin_add (GST_BIN (my_thread), GST_ELEMENT (funky_src));
  gst_bin_add (GST_BIN (my_thread), GST_ELEMENT (cool_effect));

  /* link the elements here... */
  ...

  /* start playing */
  gst_element_set_state (GST_ELEMENT (my_thread), GST_STATE_PLAYING);

The above code will create a thread with two elements in it. As soon as it is set to the PLAYING state, the thread will start to iterate itself; you never need to explicitly iterate a thread.

Constraints placed on the pipeline by the GstThread

Within the thread, everything is the same as in any other bin. The difference lies at the thread boundary, at the link between the thread and the outside world (the containing bin). Since GStreamer is fundamentally buffer-oriented rather than byte-oriented, the natural solution to this problem is an element that can "buffer" the buffers between the threads, in a thread-safe fashion. This element is the queue, described more fully in the Queue section below. It does not matter whether the queue is placed in the containing bin or in the thread itself, but it needs to be present on one side or the other to enable inter-thread communication. A minimal sketch of such a boundary link is given after the figure below.

When would you want to use a thread? If you are writing a GUI application, making the top-level bin a thread will make your GUI more responsive. If it were a pipeline instead, it would have to be iterated by your application's event loop, which increases the latency between events (say, keyboard presses) and responses from the GUI. In addition, any slight hang in the GUI would delay iteration of the pipeline, which (for example) could cause pops in the output of the sound card if it is an audio pipeline.

The figure below shows how a thread can be visualised.
Figure: A thread
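For instance, here is a minimal sketch of that boundary link (the element names are made up, fakesrc and fakesink merely stand in for real source and sink elements, and gst_init () is assumed to have been called already). The queue sits in the containing bin and carries the link that crosses into the thread:

  GstElement *bin, *thread, *src, *queue, *sink;

  /* a containing bin and the thread it will talk to */
  bin    = gst_bin_new ("containing_bin");
  thread = gst_thread_new ("output_thread");

  /* placeholder elements; any source/sink pair would do */
  src   = gst_element_factory_make ("fakesrc",  "my_src");
  queue = gst_element_factory_make ("queue",    "boundary_queue");
  sink  = gst_element_factory_make ("fakesink", "my_sink");

  /* the sink lives inside the thread; the source and the queue stay outside */
  gst_bin_add (GST_BIN (thread), sink);
  gst_bin_add_many (GST_BIN (bin), src, queue, thread, NULL);

  /* the link that crosses the thread boundary goes through the queue */
  gst_element_link_many (src, queue, sink, NULL);

Placing the queue inside the thread instead works just as well; what matters is that the link crossing the boundary passes through it.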
As an example we show the helloworld program using a thread.

  #include <stdlib.h>
  #include <gst/gst.h>

  /* we set this to TRUE right before gst_main (), but there could still
   * be a race condition between setting it and entering the function */
  gboolean can_quit = FALSE;

  /* eos will be called when the src element has an end of stream */
  void
  eos (GstElement *src, gpointer data)
  {
    GstThread *thread = GST_THREAD (data);
    g_print ("have eos, quitting\n");

    /* stop the bin */
    gst_element_set_state (GST_ELEMENT (thread), GST_STATE_NULL);

    while (!can_quit) /* waste cycles */ ;
    gst_main_quit ();
  }

  int
  main (int argc, char *argv[])
  {
    GstElement *filesrc, *demuxer, *decoder, *converter, *audiosink;
    GstElement *thread;

    if (argc < 2) {
      g_print ("usage: %s <Ogg/Vorbis filename>\n", argv[0]);
      exit (-1);
    }

    gst_init (&argc, &argv);

    /* create a new thread to hold the elements */
    thread = gst_thread_new ("thread");
    g_assert (thread != NULL);

    /* create a disk reader */
    filesrc = gst_element_factory_make ("filesrc", "disk_source");
    g_assert (filesrc != NULL);
    g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
    g_signal_connect (G_OBJECT (filesrc), "eos",
                      G_CALLBACK (eos), thread);

    /* create an ogg demuxer */
    demuxer = gst_element_factory_make ("oggdemux", "demuxer");
    g_assert (demuxer != NULL);

    /* create a vorbis decoder */
    decoder = gst_element_factory_make ("vorbisdec", "decoder");
    g_assert (decoder != NULL);

    /* create an audio converter */
    converter = gst_element_factory_make ("audioconvert", "converter");
    g_assert (converter != NULL);

    /* and an audio sink */
    audiosink = gst_element_factory_make ("osssink", "play_audio");
    g_assert (audiosink != NULL);

    /* add objects to the thread */
    gst_bin_add_many (GST_BIN (thread), filesrc, demuxer, decoder,
                      converter, audiosink, NULL);
    /* link them in the logical order */
    gst_element_link_many (filesrc, demuxer, decoder, converter,
                           audiosink, NULL);

    /* start playing */
    gst_element_set_state (thread, GST_STATE_PLAYING);

    /* do whatever you want here, the thread will be playing */
    g_print ("thread is playing\n");

    can_quit = TRUE;
    gst_main ();

    gst_object_unref (GST_OBJECT (thread));

    exit (0);
  }
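Note the role of the can_quit flag: the eos handler runs in the context of the playing thread, so it can fire before main () has reached gst_main (). Busy-waiting on can_quit, which main () sets just before calling gst_main (), is the example's guard against calling gst_main_quit () too early; as the comment at the top of the listing admits, a small race between setting the flag and actually entering gst_main () still remains.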
Queue

A queue is a filter element. Queues can be used to link two elements in such a way that data can be buffered. A buffer that arrives on a queue's sink pad is not automatically pushed to the next linked element; it is buffered, and is pushed to the next element as soon as gst_pad_pull () is called on the queue's source pad.

Queues are mostly used in conjunction with a thread bin to provide an external link for the thread's elements. You could have one thread feeding buffers into a queue and another thread repeatedly pulling on the queue to feed its internal elements.

Below is a figure of a two-threaded decoder: one thread (the main execution thread) reads the data from a file, and another thread decodes the data.
Figure: a two-threaded decoder with a queue
The standard GStreamer queue implementation has some properties that can be changed using the g_object_set () method. For example, to set the maximum number of buffers that can be queued to 30, do:

  g_object_set (G_OBJECT (queue), "max_level", 30, NULL);

The following MP3 player shows you how to create the above pipeline using a thread and a queue.

  #include <stdlib.h>
  #include <gst/gst.h>

  gboolean playing;

  /* eos will be called when the src element has an end of stream */
  void
  eos (GstElement *element, gpointer data)
  {
    g_print ("have eos, quitting\n");
    playing = FALSE;
  }

  int
  main (int argc, char *argv[])
  {
    GstElement *filesrc, *audiosink, *queue, *decode;
    GstElement *bin;
    GstElement *thread;

    gst_init (&argc, &argv);

    if (argc != 2) {
      g_print ("usage: %s <mp3 filename>\n", argv[0]);
      exit (-1);
    }

    /* create a new thread to hold the decoding elements */
    thread = gst_thread_new ("thread");
    g_assert (thread != NULL);

    /* create a new bin to hold the other elements */
    bin = gst_bin_new ("bin");
    g_assert (bin != NULL);

    /* create a disk reader */
    filesrc = gst_element_factory_make ("filesrc", "disk_source");
    g_assert (filesrc != NULL);
    g_object_set (G_OBJECT (filesrc), "location", argv[1], NULL);
    g_signal_connect (G_OBJECT (filesrc), "eos",
                      G_CALLBACK (eos), thread);

    /* create the queue that will bridge the two threads */
    queue = gst_element_factory_make ("queue", "queue");
    g_assert (queue != NULL);

    /* and an audio sink */
    audiosink = gst_element_factory_make ("osssink", "play_audio");
    g_assert (audiosink != NULL);

    /* an MP3 decoder */
    decode = gst_element_factory_make ("mad", "decode");
    g_assert (decode != NULL);

    /* the decoder and the sink go into the thread;
     * the source, the queue and the thread go into the main bin */
    gst_bin_add_many (GST_BIN (thread), decode, audiosink, NULL);
    gst_bin_add_many (GST_BIN (bin), filesrc, queue, thread, NULL);

    gst_element_link (filesrc, queue);
    gst_element_link_many (queue, decode, audiosink, NULL);

    /* start playing */
    gst_element_set_state (GST_ELEMENT (bin), GST_STATE_PLAYING);
    playing = TRUE;

    while (playing) {
      gst_bin_iterate (GST_BIN (bin));
    }

    gst_element_set_state (GST_ELEMENT (bin), GST_STATE_NULL);

    return 0;
  }
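In this program the thread boundary runs through the queue: the main thread iterates bin, so it is the one running filesrc and pushing buffers into the queue, while the GstThread pulls those buffers out again and drives the mad decoder and the audio sink. The while (playing) loop simply keeps the main thread iterating the outer bin until the eos handler clears the flag.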