Basic Concepts

This chapter of the guide introduces the basic concepts of &GStreamer;. Understanding these concepts will help you grok the issues involved in extending &GStreamer;. Many of these concepts are explained in greater detail in the &GstAppDevMan;; the basic concepts presented here serve mainly to refresh your memory.

Elements and Plugins

Elements are at the core of &GStreamer;. In the context of plugin development, an element is an object derived from the GstElement class. Elements provide some sort of functionality when linked with other elements: for example, a source element provides data to a stream, and a filter element acts on the data in a stream. Without elements, &GStreamer; is just a bunch of conceptual pipe fittings with nothing to link. A large number of elements ship with &GStreamer;, but extra elements can also be written.

Just writing a new element is not enough, however: you will need to encapsulate your element in a plugin to enable &GStreamer; to use it. A plugin is essentially a loadable block of code, usually called a shared object file or a dynamically linked library. A single plugin may contain the implementation of several elements, or just a single one. For simplicity, this guide concentrates primarily on plugins containing one element.

A filter is an important type of element that processes a stream of data. Producers and consumers of data are called source and sink elements, respectively. Bin elements contain other elements. One type of bin is responsible for scheduling the elements that it contains so that data flows smoothly. Another type of bin, called an autoplugger, automatically adds other elements to the bin and links them together so that they act as a filter between two arbitrary stream types.

The plugin mechanism is used everywhere in &GStreamer;, even if only the standard packages are being used. A few very basic functions reside in the core library, and all others are implemented in plugins.
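To make the element model concrete, here is a minimal sketch in plain C. This is not the real GstElement API; all names are invented for illustration. It models the essential idea: a source produces data, a filter transforms it in place, and a sink consumes it, once the elements are chained together.

```c
/* Conceptual sketch only -- NOT the real GstElement API.
 * Models a source -> filter -> sink chain operating on one buffer. */
#include <stddef.h>

typedef struct {
    const char *name;
    void (*process)(int *data, size_t len);  /* element's work function */
} ToyElement;

/* A source element provides data to the stream. */
static void source_fill(int *data, size_t len) {
    for (size_t i = 0; i < len; i++) data[i] = (int)i;
}

/* A filter element acts on the data in the stream. */
static void filter_double(int *data, size_t len) {
    for (size_t i = 0; i < len; i++) data[i] *= 2;
}

/* A sink element consumes the data; here it just totals the samples. */
static int sink_total;
static void sink_sum(int *data, size_t len) {
    for (size_t i = 0; i < len; i++) sink_total += data[i];
}

/* "Linking" is reduced to running each element over the same buffer in
 * order, standing in for data flowing from source pads to sink pads. */
static int run_chain(ToyElement *chain, size_t n, int *buf, size_t len) {
    for (size_t i = 0; i < n; i++) chain[i].process(buf, len);
    return sink_total;
}
```

In real &GStreamer;, of course, the chain is built by linking pads and the framework schedules the data flow; this sketch only shows the division of roles between source, filter, and sink.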
A plugin registry is used to store the details of the plugins in an XML file. This way, a program using &GStreamer; does not have to load all plugins to determine which are needed; plugins are loaded only when the elements they provide are requested. See the &GstLibRef; for the current implementation details of GstElement and GstPlugin.

Pads

Pads are used to negotiate links and data flow between elements in &GStreamer;. A pad can be viewed as a place or port on an element where links may be made with other elements, and through which data can flow to or from those elements. Pads have specific data handling capabilities: a pad can restrict the type of data that flows through it. Links are allowed between two pads only when the data types of the two pads are compatible.

An analogy may be helpful here. A pad is similar to a plug or jack on a physical device. Consider, for example, a home theater system consisting of an amplifier, a DVD player, and a (silent) video projector. Linking the DVD player to the amplifier is allowed because both devices have audio jacks, and linking the projector to the DVD player is allowed because both devices have compatible video jacks. Links between the projector and the amplifier may not be made because the two devices have different types of jacks. Pads in &GStreamer; serve the same purpose as the jacks in the home theater system.

For the most part, all data in &GStreamer; flows one way through a link between elements. Data flows out of one element through one or more source pads, and elements accept incoming data through one or more sink pads. Source and sink elements have only source and sink pads, respectively. See the &GstLibRef; for the current implementation details of a GstPad.

Buffers

All streams of data in &GStreamer; are chopped up into chunks that are passed from a source pad on one element to a sink pad on another element. Buffers are structures used to hold these chunks of data.
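The pad-compatibility rule from the Pads section above (and the home theater analogy) can be reduced to a simple type check. This is a conceptual sketch with invented names, not the real GstPad negotiation, which involves full capability structures rather than bare type strings.

```c
/* Conceptual sketch only -- NOT the real GstPad/GstCaps API.
 * A pad advertises the media type it handles; a link is allowed only
 * when the source pad's type matches the sink pad's type. */
#include <string.h>

typedef struct {
    const char *element;  /* owning element, e.g. "dvdplayer"    */
    const char *type;     /* media type handled, e.g. "video/raw" */
} ToyPad;

/* Returns 1 if a link between the two pads would be allowed. */
static int toy_pads_compatible(const ToyPad *src, const ToyPad *sink) {
    return strcmp(src->type, sink->type) == 0;
}
```

With this model, the DVD player's audio pad links to the amplifier's audio pad (both "audio/raw"), but not to the projector's video pad: the type check fails, just as mismatched jacks cannot be plugged together.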
Buffers can theoretically be of any size, and they may contain any sort of data that the two linked pads know how to handle. Normally, a buffer contains a chunk of some sort of audio or video data that flows from one element to another.

Buffers also contain metadata describing the buffer's contents. Some of the important types of metadata are:

- A pointer to the buffer's data.
- An integer indicating the size of the buffer's data.
- A GstData object describing the type of the buffer's data.
- A reference count indicating the number of elements currently holding a reference to the buffer. When the reference count falls to zero, the buffer is unreferenced and its memory is freed in some sense (see below for more details).

See the &GstLibRef; for the current implementation details of a GstBuffer.

Buffer Allocation and Buffer Pools

Buffers can be allocated using various schemes, and they may either be passed on by an element or unreferenced, thus freeing the memory used by the buffer. Buffer allocation and unreferencing are important concepts when dealing with real-time media processing, since memory allocation is relatively slow on most systems. To improve latency in a media pipeline, many &GStreamer; elements use a buffer pool to handle buffer allocation and unreferencing.

A buffer pool is a relatively large chunk of memory that the &GStreamer; process requests early on from the operating system. Later, when elements request memory for a new buffer, the pool can serve the request quickly by handing out a piece of the already-allocated memory. This saves a call to the operating system and lowers latency. (If it seems at this point like &GStreamer; is acting like an operating system, doing memory management and so on, don't worry: &GStreamer;OS isn't due out for quite a few years!)

Normally in a media pipeline, most filter elements in &GStreamer; deal with a buffer in place, meaning that they do not create or destroy buffers.
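The reference-counting and buffer-pool ideas above can be sketched as follows. This is a simplified model with invented names, not the actual GstBuffer implementation: the pool's memory is reserved up front, so acquiring a buffer later is just a free-list pop rather than a trip to the system allocator, and a buffer whose reference count reaches zero returns to the pool instead of being freed.

```c
/* Conceptual sketch only -- NOT the real GstBuffer/buffer-pool code. */
#define POOL_SIZE 8
#define BUF_BYTES 4096

typedef struct ToyBuffer {
    unsigned char data[BUF_BYTES];
    int refcount;
    struct ToyBuffer *next;     /* free-list link while idle */
} ToyBuffer;

static ToyBuffer pool_storage[POOL_SIZE];  /* reserved once, up front */
static ToyBuffer *free_list;

static void toy_pool_init(void) {
    free_list = 0;
    for (int i = 0; i < POOL_SIZE; i++) {
        pool_storage[i].next = free_list;
        free_list = &pool_storage[i];
    }
}

/* Fast path: no call into the operating system's allocator. */
static ToyBuffer *toy_buffer_acquire(void) {
    if (!free_list) return 0;          /* pool exhausted */
    ToyBuffer *buf = free_list;
    free_list = buf->next;
    buf->refcount = 1;
    return buf;
}

static void toy_buffer_ref(ToyBuffer *buf) { buf->refcount++; }

/* Dropping the last reference returns the buffer to the pool. */
static void toy_buffer_unref(ToyBuffer *buf) {
    if (--buf->refcount == 0) {
        buf->next = free_list;
        free_list = buf;
    }
}
```

Each element holding the buffer takes a reference; when the last holder unreferences it, the memory is immediately reusable, which is what keeps pipeline latency low.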
Sometimes, however, elements might need to alter the reference count of a buffer, either by copying or destroying the buffer, or by creating a new buffer. These topics are generally reserved for non-filter elements, so they are addressed at that point in the guide.

Types and Properties

&GStreamer; uses a type system to ensure that the data passed between elements is in a recognized format. The type system is also important for ensuring that the parameters required to fully specify a format match up correctly when linking pads between elements. Each link that is made between elements has a specified type.

The Basic Types

&GStreamer; already supports many basic media types. Following is a table of the basic types used for buffers in &GStreamer;. The table contains the name ("mime type") and a description of the type, the properties associated with the type, and the meaning of each property.

Table of Basic Types

audio/raw: Unstructured and uncompressed raw audio data.
- rate (integer, greater than 0): The sample rate of the data, in samples per second.
- channels (integer, greater than 0): The number of channels of audio data.
- format (string, "int" or "float"): The format in which the audio data is passed.
- law (integer, 0, 1, or 2; valid only for integer format): The law used to describe the data. The value 0 indicates linear, 1 indicates mu law, and 2 indicates A law.
- endianness (boolean, 0 or 1; valid only for integer format): The order of bytes in a sample. The value 0 means little-endian (least significant byte first); 1 means big-endian (most significant byte first).
- signed (boolean, 0 or 1; valid only for integer format): Whether the samples are signed or not.
- width (integer, greater than 0; valid only for integer format): The number of bits per sample.
- depth (integer, greater than 0; valid only for integer format): The number of bits actually used per sample. This must be less than or equal to the width: if the depth is less than the width, the low bits are assumed to be the ones used. For example, a width of 32 and a depth of 24 means that each sample is stored in a 32-bit word, but only the low 24 bits are actually used.
- layout (string, "gfloat"; valid only for float format): A string representing the way in which the floating point data is represented.
- intercept (float, any value, normally 0.0; valid only for float format): A floating point value representing the value that the signal centers on.
- slope (float, any value, normally 1.0; valid only for float format): A floating point value representing how far the signal deviates from the intercept. A slope of 1.0 and an intercept of 0.0 would mean an audio signal with minimum and maximum values of -1.0 and 1.0. A slope of 0.5 and an intercept of 0.5 would represent values in the range 0.0 to 1.0.

audio/mp3: Audio data compressed using the mp3 encoding scheme.
- framed (boolean, 0 or 1): A true value indicates that each buffer contains exactly one frame. A false value indicates that frames and buffers do not necessarily match up.
- layer (integer, 1, 2, or 3): The compression scheme layer used to compress the data.
- bitrate (integer, greater than 0): The bitrate, in kilobits per second. For VBR (variable bitrate) mp3 data, this is the average bitrate.
- channels (integer, greater than 0): The number of channels of audio data present.
- joint-stereo (boolean, 0 or 1): If true, stereo data is stored as a combined signal plus the difference between the signals, rather than as two entirely separate signals. If true, the channels attribute must not be zero.

audio/x-ogg: Audio data compressed using the Ogg Vorbis encoding scheme. FIXME: There are currently no parameters defined for this type.

video/raw: Raw video data.
- fourcc (FOURCC code): A FOURCC code identifying the format in which this data is stored. FOURCC (Four Character Code) is a simple system to allow unambiguous identification of a video datastream format. See http://www.webartz.com/fourcc/
- width (integer, greater than 0): The width of each video frame, in pixels.
- height (integer, greater than 0): The height of each video frame, in pixels.

video/mpeg: Video data compressed using an MPEG encoding scheme. FIXME: There are currently no parameters defined for this type.

video/avi: Video data compressed using the AVI encoding scheme. FIXME: There are currently no parameters defined for this type.
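The audio/raw width, depth, slope, and intercept properties above are worth a worked example. The helper names below are invented for illustration; the arithmetic follows the property definitions: with width 32 and depth 24, each sample occupies a 32-bit word but only the low 24 bits carry data, and a float stream's values satisfy value = intercept + (normalized sample) * slope.

```c
/* Worked example of the audio/raw properties; helper names are
 * invented, not part of any &GStreamer; API. */
#include <stdint.h>

/* Extract a signed sample stored low-justified in a 32-bit word,
 * where 'depth' bits are used (depth <= width == 32). */
static int32_t sample_from_word(uint32_t word, int depth) {
    uint32_t mask = (depth == 32) ? 0xFFFFFFFFu : ((1u << depth) - 1);
    uint32_t v = word & mask;
    if (v & (1u << (depth - 1)))   /* sign bit set: sign-extend */
        v |= ~mask;
    return (int32_t)v;
}

/* Map a signed integer sample onto a float stream described by slope
 * and intercept. With slope 1.0 and intercept 0.0 the result lies in
 * roughly [-1.0, 1.0]; slope 0.5 / intercept 0.5 gives [0.0, 1.0]. */
static double sample_to_float(int32_t s, int depth,
                              double slope, double intercept) {
    double max = (double)(1u << (depth - 1));  /* 8388608 for depth 24 */
    return intercept + (s / max) * slope;
}
```

For instance, the 24-bit pattern 0xFFFFFF stored in a 32-bit word decodes to the sample value -1, and the most negative 24-bit sample maps to -1.0 under the default slope and intercept.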
Events

Sometimes elements in a media processing pipeline need to know that something has happened. An event is a special type of data in &GStreamer; designed to serve this purpose. Events describe some sort of activity that has happened somewhere in an element's pipeline: for example, the end of the media stream or a clock discontinuity. Just like any other data type, an event comes to an element on a sink pad and is contained in a normal buffer. Unlike normal stream buffers, though, an event buffer contains only an event, not any media stream data. See the &GstLibRef; for the current implementation details of a GstEvent.
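The key point above, that events travel in-band through the same sink pads as media data, can be sketched as a tagged chunk. This is a conceptual model with invented names, not the GstEvent implementation: a sink-pad handler must check what kind of chunk it received before touching the payload.

```c
/* Conceptual sketch only -- NOT the real GstEvent API. A chunk
 * arriving on a sink pad is either media data or an event. */
typedef enum { TOY_DATA, TOY_EVENT_EOS, TOY_EVENT_DISCONT } ToyKind;

typedef struct {
    ToyKind kind;
    const unsigned char *data;  /* media payload; NULL for events */
    unsigned long size;
} ToyChunk;

/* Returns 1 when the stream has ended, 0 otherwise. */
static int toy_handle_chunk(const ToyChunk *chunk) {
    switch (chunk->kind) {
    case TOY_EVENT_EOS:     return 1;  /* end of stream reached      */
    case TOY_EVENT_DISCONT: return 0;  /* reset timing, keep running */
    default:                return 0;  /* ordinary media data        */
    }
}
```

Because events arrive through the normal data path, an element sees them in order relative to the stream, which is exactly why end-of-stream or discontinuity notifications are delivered this way.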