Threads

GStreamer supports multithreading through the GstThread object. This object is in fact a special GstBin that starts a new thread (using GLib's GThread system) when activated. To create a new thread, you can simply use gst_thread_new (). From then on, you can use it just as you would a GstBin: you can add elements to it, change its state and so on. The main difference between a thread and other bins is that the thread does not require iteration. Once set to the GST_STATE_PLAYING state, it will iterate its contained elements automatically. The figure below shows how a thread can be visualised.
A thread
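As a minimal sketch, creating and using such a thread might look like the following. This targets the 0.8-era API described in this chapter; the fakesrc and fakesink elements are only placeholders for a real pipeline.

#include <gst/gst.h>

gint
main (gint argc, gchar *argv[])
{
  GstElement *thread, *source, *sink;

  /* init GStreamer */
  gst_init (&argc, &argv);

  /* a thread is used just like any other bin */
  thread = gst_thread_new ("my-thread");
  source = gst_element_factory_make ("fakesrc", "source");
  sink   = gst_element_factory_make ("fakesink", "sink");
  gst_bin_add_many (GST_BIN (thread), source, sink, NULL);
  gst_element_link (source, sink);

  /* once PLAYING, the thread iterates its children by itself;
   * we only need to keep the application alive */
  gst_element_set_state (thread, GST_STATE_PLAYING);
  g_usleep (5 * G_USEC_PER_SEC);

  /* clean up */
  gst_element_set_state (thread, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (thread));

  return 0;
}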
When would you want to use a thread? There are several reasons to use threads, but there are also reasons to limit their use as much as possible. We will go into the drawbacks of threading in GStreamer in the next section. Let's first list some situations where threads can be useful:

Data buffering, for example when dealing with network streams or when recording data from a live source such as a video or audio card. Short hiccups elsewhere in the pipeline will not cause data loss. See the figure below for a visualisation of this idea.

Synchronizing output devices, e.g. when playing a stream containing both video and audio data. By using threads for both outputs, they will run independently and their synchronization will be better.

Data pre-rolling. You can use threads and queues (thread boundaries) to cache a few seconds of data before playing. With this approach, the whole pipeline will already be set up and the data will already be decoded; when activating the rest of the pipeline, the switch from PAUSED to PLAYING will be instant.
A two-threaded decoder with a queue
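As a rough, untested sketch, the pipeline in the figure above could be assembled as follows with the 0.8-era API used throughout this chapter: the source lives in the containing pipeline, while the queue, decoder and sink live in their own thread. The decoder element ("mad", for MPEG audio) is only illustrative and assumes that plugin is installed.

#include <gst/gst.h>

gint
main (gint argc, gchar *argv[])
{
  GstElement *pipeline, *source, *thread, *queue, *decoder, *sink;

  gst_init (&argc, &argv);

  if (argc != 2) {
    g_print ("usage: %s <MP3 filename>\n", argv[0]);
    return -1;
  }

  /* the containing pipeline holds the source... */
  pipeline = gst_pipeline_new ("pipeline");
  source = gst_element_factory_make ("filesrc", "source");
  g_object_set (G_OBJECT (source), "location", argv[1], NULL);

  /* ...while the queue, decoder and sink live in a thread */
  thread  = gst_thread_new ("output-thread");
  queue   = gst_element_factory_make ("queue", "queue");
  decoder = gst_element_factory_make ("mad", "decoder");   /* illustrative */
  sink    = gst_element_factory_make ("alsasink", "sink");
  gst_bin_add_many (GST_BIN (thread), queue, decoder, sink, NULL);
  gst_element_link_many (queue, decoder, sink, NULL);

  gst_bin_add_many (GST_BIN (pipeline), source, thread, NULL);

  /* this link crosses the thread boundary, hence the queue */
  gst_element_link (source, queue);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  while (gst_bin_iterate (GST_BIN (pipeline)))
    ;  /* iterate the non-threaded part of the pipeline */

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (pipeline));

  return 0;
}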
We have mentioned the queue element several times now. A queue is a thread boundary element: it hands data from one thread to another using the classic producer/consumer model taught in threading classes at universities all around the world. By doing this, it both makes data throughput between threads thread-safe and acts as a buffer. Queues have several GObject properties that can be configured for specific uses. For example, you can set lower and upper thresholds for the element. If there is less data than the lower threshold (default: disabled), it will block output. If there is more data than the upper threshold, it will block input or (if configured to do so) drop data.
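Configuring such thresholds could look roughly like this. The property names below are assumptions for illustration only; run gst-inspect queue to see the exact names and units your GStreamer version exposes.

GstElement *queue;

queue = gst_element_factory_make ("queue", "queue");

/* property names are illustrative; check `gst-inspect queue` for
 * the ones available in your version */
g_object_set (G_OBJECT (queue),
              "max-size-buffers", 200,      /* upper threshold: block or drop above this */
              "min-threshold-buffers", 20,  /* lower threshold: block output below this */
              "leaky", 1,                   /* drop data instead of blocking input */
              NULL);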
Constraints placed on the pipeline by the GstThread

Within the pipeline, everything is the same as in any other bin. The difference lies at the thread boundary, at the link between the thread and the outside world (the containing bin). Since GStreamer is fundamentally buffer-oriented rather than byte-oriented, the natural solution to this problem is an element that can "buffer" the buffers between the threads, in a thread-safe fashion. This element is the queue element. A queue should be placed between any two elements whose pads are linked together while the elements live in different threads. It doesn't matter whether the queue is placed in the containing bin or in the thread itself, but it needs to be present on one side or the other to enable inter-thread communication.

If you are writing a GUI application, making the top-level bin a thread will make your GUI more responsive. If it were a pipeline instead, it would have to be iterated by your application's event loop, which increases the latency between events (say, keyboard presses) and responses from the GUI. In addition, any slight hang in the GUI would delay iteration of the pipeline, which (for example) could cause pops in the output of the sound card if it is an audio pipeline.

A problem with using threads, however, is thread contexts. If you connect to a signal that is emitted inside a thread, the signal handler will be executed in that same thread! This is very important to remember, because many graphical toolkits cannot run multi-threaded. Gtk+, for example, only allows threaded access to UI objects if you explicitly use mutexes; not doing so will result in random crashes and X errors. A common solution is to place an idle handler in the signal handler and have the code that actually responds to the signal run in the idle handler, which will be executed from the main loop. Generally, if you use threads, you will encounter some problems. Don't hesitate to ask us for help in case of problems.

A threaded example application

As an example, we show the helloworld program that we coded earlier, now using a thread. Note that the whole application lives in a thread (as opposed to half of the application living in a thread and the other half being another thread or a pipeline). Therefore, it does not need a queue element in this specific case.

#include <gst/gst.h>

GstElement *thread, *source, *decodebin, *audiosink;

static gboolean
idle_eos (gpointer data)
{
  g_print ("Have idle-func in thread %p\n", g_thread_self ());
  gst_main_quit ();

  /* do this function only once */
  return FALSE;
}

/*
 * EOS will be called when the src element has an end of stream.
 * Note that this function will be called in the thread context.
 * We will place an idle handler to the function that really
 * quits the application.
 */
static void
cb_eos (GstElement *thread,
        gpointer    data)
{
  g_print ("Have eos in thread %p\n", g_thread_self ());
  g_idle_add ((GSourceFunc) idle_eos, NULL);
}

/*
 * On error, too, you'll want to forward signals to the main
 * thread, especially when using GUI applications.
 */
static void
cb_error (GstElement *thread,
          GstElement *source,
          GError     *error,
          gchar      *debug,
          gpointer    data)
{
  g_print ("Error in thread %p: %s\n", g_thread_self (), error->message);
  g_idle_add ((GSourceFunc) idle_eos, NULL);
}

/*
 * Link new pad from decodebin to audiosink.
 * Contains no further error checking.
 */
static void
cb_newpad (GstElement *decodebin,
           GstPad     *pad,
           gboolean    last,
           gpointer    data)
{
  gst_pad_link (pad, gst_element_get_pad (audiosink, "sink"));
  gst_bin_add (GST_BIN (thread), audiosink);
  gst_bin_sync_children_state (GST_BIN (thread));
}

gint
main (gint argc, gchar *argv[])
{
  /* init GStreamer */
  gst_init (&argc, &argv);

  /* make sure we have a filename argument */
  if (argc != 2) {
    g_print ("usage: %s <Ogg/Vorbis filename>\n", argv[0]);
    return -1;
  }

  /* create a new thread to hold the elements */
  thread = gst_thread_new ("thread");
  g_signal_connect (thread, "eos", G_CALLBACK (cb_eos), NULL);
  g_signal_connect (thread, "error", G_CALLBACK (cb_error), NULL);

  /* create elements */
  source = gst_element_factory_make ("filesrc", "source");
  g_object_set (G_OBJECT (source), "location", argv[1], NULL);
  decodebin = gst_element_factory_make ("decodebin", "decoder");
  g_signal_connect (decodebin, "new-decoded-pad",
                    G_CALLBACK (cb_newpad), NULL);
  audiosink = gst_element_factory_make ("alsasink", "audiosink");

  /* setup */
  gst_bin_add_many (GST_BIN (thread), source, decodebin, NULL);
  gst_element_link (source, decodebin);
  gst_element_set_state (audiosink, GST_STATE_PAUSED);
  gst_element_set_state (thread, GST_STATE_PLAYING);

  /* no need to iterate. We can now use a mainloop */
  gst_main ();

  /* unset */
  gst_element_set_state (thread, GST_STATE_NULL);
  gst_object_unref (GST_OBJECT (thread));

  return 0;
}