From 91a20a90eb11166cbbcbe848f7c0000ac3ac63c8 Mon Sep 17 00:00:00 2001
From: Wim Taymans
Date: Fri, 28 Sep 2012 16:03:15 +0200
Subject: [PATCH] pwg: update for 1.0

Rewrite clock part. Start on interfaces.
---
 docs/pwg/advanced-clock.xml      | 205 +++++++++++++++++--------------
 docs/pwg/advanced-dparams.xml    |   3 +
 docs/pwg/advanced-interfaces.xml |  19 +--
 3 files changed, 118 insertions(+), 109 deletions(-)

diff --git a/docs/pwg/advanced-clock.xml b/docs/pwg/advanced-clock.xml
index 4f8ae39b44..beb6bd7790 100644
--- a/docs/pwg/advanced-clock.xml
+++ b/docs/pwg/advanced-clock.xml
@@ -7,83 +7,77 @@
     synchronization mechanism.
 
-  <sect1>
-    <title>Types of time</title>
-    <para>
-      There are two kinds of time in GStreamer. Clock time is an absolute
-      time. By contrast, element time is the relative time, usually to the
-      start of the current media stream. The element time represents the
-      time that should have a media sample that is being processed by the
-      element at this time. The element time is calculated by adding an
-      offset to the clock time.
-    </para>
-  </sect1>
 
   <sect1>
     <title>Clocks</title>
-    <para>
-      GStreamer can use different clocks. Though the system time can be used
-      as a clock, soundcards and other devices provides a better time source.
-      For this reason some elements provide a clock. The method get_clock is
-      implemented in elements that provide one.
-    </para>
+    <para>
+      Time in &GStreamer; is defined as the value returned from a particular
+      GstClock object with the method gst_clock_get_time ().
+    </para>
+    <para>
+      In a typical computer, there are many sources that can be used as a
+      time source, e.g. the system time, soundcards, CPU performance
+      counters, and so on. For this reason, there are many GstClock
+      implementations available in &GStreamer;. The clock time doesn't
+      always start from 0 or from some known value: some clocks start
+      counting from a known start date, others from the last reboot, etc.
+    </para>
     <para>
       As clocks return an absolute measure of time, they are not usually used
-      directly. Instead, a reference to a clock is stored in any element that
-      needs it, and it is used internally by GStreamer to calculate the
-      element time.
+      directly. Instead, differences between two clock times are used to
+      measure elapsed time according to a clock.
     </para>
   </sect1>
 
+  <sect1>
+    <title>Clock running-time</title>
+    <para>
+      A clock returns the absolute-time according to that clock with
+      gst_clock_get_time (). From the absolute-time, a running-time is
+      calculated as the difference between the absolute-time and a previous
+      snapshot of the absolute-time called the base-time. So:
+    </para>
+    <programlisting>
+running-time = absolute-time - base-time
+    </programlisting>
+    <para>
+      A &GStreamer; GstPipeline object maintains a GstClock object and a
+      base-time when it goes to the PLAYING state. The pipeline gives a
+      handle to the selected GstClock to each element in the pipeline along
+      with the selected base-time. The pipeline will select a base-time in
+      such a way that the running-time reflects the total time spent in the
+      PLAYING state. As a result, when the pipeline is PAUSED, the
+      running-time stands still.
+    </para>
+    <para>
+      Because all objects in the pipeline have the same clock and base-time,
+      they can all calculate the running-time according to the pipeline
+      clock.
+    </para>
+  </sect1>
+
+  <sect1>
+    <title>Buffer running-time</title>
+    <para>
+      To calculate a buffer running-time, we need a buffer timestamp and the
+      SEGMENT event that preceded the buffer. First we can convert the
+      SEGMENT event into a GstSegment object and then we can use the
+      gst_segment_to_running_time () function to perform the calculation of
+      the buffer running-time.
+    </para>
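+    <para>
+      As a rough sketch of how this could look in an element (the function
+      names below are made up and all error handling is left out), the
+      element can keep a copy of the last SEGMENT event received on its sink
+      pad and use it to convert each buffer timestamp into a running-time:
+    </para>
+    <programlisting><![CDATA[
+#include <gst/gst.h>
+
+/* In a real element this would live in the instance structure; the SEGMENT
+ * event is assumed to arrive before the first buffer. */
+static GstSegment segment;
+
+static gboolean
+my_sink_event (GstPad * pad, GstObject * parent, GstEvent * event)
+{
+  if (GST_EVENT_TYPE (event) == GST_EVENT_SEGMENT)
+    gst_event_copy_segment (event, &segment);
+
+  return gst_pad_event_default (pad, parent, event);
+}
+
+static GstFlowReturn
+my_sink_chain (GstPad * pad, GstObject * parent, GstBuffer * buffer)
+{
+  GstClockTime running_time = GST_CLOCK_TIME_NONE;
+
+  if (segment.format == GST_FORMAT_TIME && GST_BUFFER_PTS_IS_VALID (buffer))
+    running_time = gst_segment_to_running_time (&segment, GST_FORMAT_TIME,
+        GST_BUFFER_PTS (buffer));
+
+  /* ... use running_time, e.g. to synchronize against the clock ... */
+
+  gst_buffer_unref (buffer);
+  return GST_FLOW_OK;
+}
+]]></programlisting>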
+    <para>
+      Synchronization is now a matter of making sure that a buffer with a
+      certain running-time is played when the clock reaches the same
+      running-time. Usually this task is done by sink elements.
+    </para>
+  </sect1>
 
-  <sect1>
-    <title>Flow of data between elements and time</title>
-    <para>
-      Now we will see how time information travels the pipeline in different
-      states.
-    </para>
-    <para>
-      The pipeline starts playing. The source element typically knows the
-      time of each sample.
-    </para>
-    <para>
-      Sometimes it is a parser element the one that knows the time, for
-      instance if a pipeline contains a filesrc element connected to a MPEG
-      decoder element, the former is the one that knows the time of each
-      sample, because the knowledge of when to play each sample is embedded
-      in the MPEG format. In this case this element will be regarded as the
-      source element for this discussion.
-    </para>
-    <para>
-      First, the source element sends a newsegment event. This event carries
-      information about the current relative time of the next sample. This
-      relative time is arbitrary, but it must be consistent with the
-      timestamp that will be placed in buffers. It is expected to be the
-      relative time to the start of the media stream, or whatever makes
-      sense in the case of each media. When receiving it, the other elements
-      adjust their offset of the element time so that this time matches the
-      time written in the event.
-    </para>
-    <para>
-      Then the source element sends media samples in buffers. This element
-      places a timestamp in each buffer saying when the sample should be
-      played. When the buffer reaches the sink pad of the last element, this
-      element compares the current element time with the timestamp of the
-      buffer. If the timestamp is higher or equal it plays the buffer,
-      otherwise it waits until the time to place the buffer arrives with
-      gst_element_wait().
-    </para>
-    <para>
-      If the stream is seeked, the next samples sent will have a timestamp
-      that is not adjusted with the element time. Therefore, the source
-      element must send a newsegment event.
-    </para>
-  </sect1>
@@ -96,23 +90,54 @@
     </para>
 
     <sect2>
-      <title>Source elements</title>
+      <title>Non-live source elements</title>
 
-      <para>
-        Source elements (or parsers of formats that provide notion of time,
-        such as MPEG, as explained above) must place a timestamp in each
-        buffer that they deliver. The origin of the time used is arbitrary,
-        but it must match the time delivered in the newsegment event (see
-        below). However, it is expected that the origin is the origin of the
-        media stream.
-      </para>
+      <para>
+        Non-live source elements must place a timestamp in each buffer that
+        they deliver when this is possible. They must choose the timestamps
+        and the values of the SEGMENT event in such a way that the
+        running-time of the buffer starts from 0.
+      </para>
 
-      <para>
-        In order to initialize the element time of the rest of the pipeline,
-        a source element must send a newsegment event before starting to
-        play. In addition, after seeking, a newsegment event must be sent,
-        because the timestamp of the next element does not match the element
-        time of the rest of the pipeline.
-      </para>
+      <para>
+        Some sources, such as filesrc, are not able to generate timestamps
+        on all buffers. They can and must, however, create a timestamp on
+        the first buffer (with a running-time of 0).
+      </para>
+      <para>
+        The source then pushes out the SEGMENT event followed by the
+        timestamped buffers.
+      </para>
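+      <para>
+        As an illustration only (the helper name below is made up, and the
+        stream-start and caps events that must precede the SEGMENT are
+        assumed to have been sent already), a very simple push-based source
+        could do something like this for its first buffer:
+      </para>
+      <programlisting><![CDATA[
+#include <gst/gst.h>
+
+/* Hypothetical helper: send a time SEGMENT starting at 0 and give the first
+ * buffer a timestamp of 0, so that its running-time starts from 0. */
+static GstFlowReturn
+push_first_buffer (GstPad * srcpad, GstBuffer * first_buffer)
+{
+  GstSegment segment;
+
+  gst_segment_init (&segment, GST_FORMAT_TIME);
+  gst_pad_push_event (srcpad, gst_event_new_segment (&segment));
+
+  GST_BUFFER_PTS (first_buffer) = 0;    /* running-time 0 */
+
+  return gst_pad_push (srcpad, first_buffer);
+}
+]]></programlisting>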
+    </sect2>
+
+    <sect2>
+      <title>Live source elements</title>
+      <para>
+        Live source elements must place a timestamp in each buffer that they
+        deliver. They must choose the timestamps and the values of the
+        SEGMENT event in such a way that the running-time of the buffer
+        matches exactly the running-time of the pipeline clock when the
+        first byte in the buffer was captured.
+      </para>
+    </sect2>
+
+    <sect2>
+      <title>Parser elements</title>
+      <para>
+        Parser elements must use the incoming timestamps and transfer those
+        to the resulting output buffers. They are allowed to interpolate or
+        reconstruct timestamps on missing input buffers when they can.
+      </para>
+    </sect2>
+
+    <sect2>
+      <title>Demuxer elements</title>
+      <para>
+        Demuxer elements can usually set the timestamps stored inside the
+        media file onto the outgoing buffers. They need to make sure that
+        outgoing buffers that are to be played at the same time have the
+        same running-time. Demuxers also need to take into account the
+        incoming timestamps on buffers and use them to calculate an offset
+        on the outgoing buffer timestamps.
+      </para>
+    </sect2>
 
     <sect2>
       <title>Sink elements</title>
@@ -121,20 +146,12 @@
       playing), the element should require a clock, and thus implement the
       method set_clock.
     </para>
-    <para>
-      In addition, before playing each sample, if the current element time
-      is less than the timestamp in the sample, it wait until the current
-      time arrives should call gst_element_wait()
-    </para>
-    <para>
-      With some schedulers, gst_element_wait() blocks the pipeline. For
-      instance, if there is one audio sink element and one video sink
-      element, while the audio element is waiting for a sample the video
-      element cannot play other sample. This behaviour is under discussion,
-      and might change in a future release.
-    </para>
+    <para>
+      The sink should then make sure that the sample with a certain
+      running-time is played exactly when the pipeline clock reaches that
+      running-time. Some elements might use the clock API such as
+      gst_clock_id_wait() to perform this action. Other sinks might need to
+      use other means of scheduling timely playback of the data.
+    </para>

diff --git a/docs/pwg/advanced-dparams.xml b/docs/pwg/advanced-dparams.xml
index 70b4694bbc..8a196c2382 100644
--- a/docs/pwg/advanced-dparams.xml
+++ b/docs/pwg/advanced-dparams.xml
@@ -3,6 +3,9 @@
   <title>Supporting Dynamic Parameters</title>
 
+  <para>
+    Warning: this part describes 0.10 and is outdated.
+  </para>
+
   <para>
     Sometimes object properties are not powerful enough to control the
     parameters that affect the behaviour of your element.

diff --git a/docs/pwg/advanced-interfaces.xml b/docs/pwg/advanced-interfaces.xml
index 0fb959e652..9b598516c3 100644
--- a/docs/pwg/advanced-interfaces.xml
+++ b/docs/pwg/advanced-interfaces.xml
@@ -34,20 +34,14 @@
     to achieve this. The basis of this all is the glib GTypeInterface type.
     For each case where we think it's useful, we've created interfaces
     which can be implemented by elements
-    at their own will. We've also created a small extension to
-    GTypeInterface (which is static itself, too) which allows us to query
-    for interface availability based on runtime properties. This extension
-    is called GstImplementsInterface.
+    at their own will.
   </para>
   <para>
     One important note: interfaces do not replace properties. Rather,
     interfaces should be built next to properties. There are two important
     reasons for this. First of all,
-    properties can be saved in XML files. Second, properties can be
-    specified on the commandline (gst-launch).
+    properties can be more easily introspected. Second, properties can be
+    specified on the commandline (gst-launch).
   </para>
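+  <para>
+    As a rough illustration of how an element can implement such an
+    interface (the element type below is made up, GstURIHandler is used
+    purely as an example, and its virtual methods are left unimplemented),
+    interface support is typically registered together with the element
+    type itself:
+  </para>
+  <programlisting><![CDATA[
+#include <gst/gst.h>
+
+typedef struct _MyFileLikeSrc { GstElement element; } MyFileLikeSrc;
+typedef struct _MyFileLikeSrcClass { GstElementClass parent_class; } MyFileLikeSrcClass;
+
+static void
+my_file_like_src_uri_handler_init (gpointer g_iface, gpointer iface_data)
+{
+  GstURIHandlerInterface *iface = (GstURIHandlerInterface *) g_iface;
+
+  /* fill in iface->get_type, iface->get_protocols, iface->get_uri and
+   * iface->set_uri here */
+  (void) iface;
+  (void) iface_data;
+}
+
+/* Interface support is registered while registering the type itself. */
+G_DEFINE_TYPE_WITH_CODE (MyFileLikeSrc, my_file_like_src, GST_TYPE_ELEMENT,
+    G_IMPLEMENT_INTERFACE (GST_TYPE_URI_HANDLER,
+        my_file_like_src_uri_handler_init));
+
+static void
+my_file_like_src_class_init (MyFileLikeSrcClass * klass)
+{
+}
+
+static void
+my_file_like_src_init (MyFileLikeSrc * src)
+{
+}
+]]></programlisting>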
@@ -58,12 +52,7 @@
     registered the type itself. Some interfaces have dependencies on other
     interfaces or can only be registered by certain types of elements.
     You will be notified of doing that wrongly when using the element: it
     will
-    quit with failed assertions, which will explain what went wrong. In the
-    case of GStreamer, the only dependency that some interfaces have is
-    GstImplementsInterface. Per interface, we will indicate clearly when it
-    depends on this extension.
+    quit with failed assertions, which will explain what went wrong.
     If it does, you need to register support for that interface before
     registering support for the interface that you're wanting to support.
     The example below explains how to add support for a