pwg: update for 1.0

Rewrite clock part.
start on interfaces
Wim Taymans 2012-09-28 16:03:15 +02:00
parent 3cc8f1412d
commit 91a20a90eb
3 changed files with 118 additions and 109 deletions

View file

@ -7,83 +7,77 @@
synchronization mechanism.
</para>
<sect1 id="section-clock-time-types" xreflabel="Types of time">
<title> Types of time </title>
<para>
There are two kinds of time in GStreamer. <emphasis
role="strong">Clock time</emphasis> is an absolute time. By contrast,
<emphasis role="strong">element time</emphasis> is the relative time,
usually to the start of the current media stream. The element time
represents the time that should have a media sample that is being
processed by the element at this time. The element time is calculated by
adding an offset to the clock time.
</para>
</sect1>
<sect1 id="section-clocks" xreflabel="Clocks">
<title>Clocks</title>
<para>
GStreamer can use different clocks. Though the system time can be used
as a clock, soundcards and other devices provide a better time source. For
this reason some elements provide a clock. The method
<function>get_clock</function> is implemented in elements that provide
one.
Time in &GStreamer; is defined as the value returned by the method
<function>gst_clock_get_time ()</function> of a particular
<classname>GstClock</classname> object.
</para>
<para>
In a typical computer, there are many sources that can be used as a
time source, e.g., the system time, soundcards, CPU performance
counters, ... For this reason, there are many
<classname>GstClock</classname> implementations available in &GStreamer;.
The clock time doesn't always start from 0 or from some known value;
some clocks start counting from a known start date, others start
counting from the last reboot, etc.
</para>
<para>
As clocks return an absolute measure of time, they are not usually used
directly. Instead, differences between two clock times are used to
measure elapsed time according to a clock.
</para>
</sect1>
<sect1 id="section-time-data-flow" xreflabel="Flow of data between elements
and time">
<title>
Flow of data between elements and time
</title>
<sect1 id="section-clock-time-types" xreflabel="Clock running-time">
<title> Clock running-time </title>
<para>
Now we will see how time information travels through the pipeline in different states.
A clock returns the <emphasis role="strong">absolute-time</emphasis>
according to that clock with <function>gst_clock_get_time ()</function>.
From the absolute-time, a <emphasis role="strong">running-time</emphasis>
is calculated, which is simply the difference between the absolute-time
and a previous snapshot of the absolute-time called the
<emphasis role="strong">base-time</emphasis>. So:
</para>
<para>
running-time = absolute-time - base-time
</para>
<para>
The pipeline starts playing.
The source element typically knows the time of each sample.
<footnote>
<para>
Sometimes it is a parser element that knows the time: for instance, if a
pipeline contains a filesrc element connected to an MPEG decoder element,
the latter is the one that knows the time of each sample, because the
knowledge of when to play each sample is embedded in the MPEG format. In
this case this element will be regarded as the source element for this
discussion.
</para>
</footnote>
First, the source element sends a newsegment event. This event carries
information about the current relative time of the next sample. This
relative time is arbitrary, but it must be consistent with the timestamps
that will be placed in buffers. It is expected to be the relative time to
the start of the media stream, or whatever makes sense for the medium.
When receiving it, the other elements adjust the offset of their element
time so that this time matches the time written in the event.
</para>
<para>
Then the source element sends media samples in buffers. This element places a
timestamp in each buffer saying when the sample should be played. When the
buffer reaches the sink pad of the last element, this element compares the
current element time with the timestamp of the buffer. If the current element
time is equal to or greater than the timestamp, it plays the buffer; otherwise
it waits with <function>gst_element_wait()</function> until the time to play
the buffer arrives.
A &GStreamer; <classname>GstPipeline</classname> object maintains a
<classname>GstClock</classname> object and a base-time when it goes
to the PLAYING state. The pipeline gives a handle to the selected
<classname>GstClock</classname> to each element in the pipeline along
with the selected base-time. The pipeline will select a base-time in such
a way that the running-time reflects the total time spent in the
PLAYING state. As a result, when the pipeline is PAUSED, the
running-time stands still.
</para>
<para>
If the stream is seeked, the next samples sent will have a timestamp that
is not adjusted with the element time. Therefore, the source element must
send a newsegment event.
Because all objects in the pipeline have the same clock and base-time,
they can all calculate the running-time according to the pipeline
clock.
</para>
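<para>
As an illustration of this relation, here is a minimal sketch that derives
the running-time by hand. It assumes a pipeline that is already in the
PLAYING state and has selected a clock; the helper name is hypothetical:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

/* Hypothetical helper: compute the current running-time of a pipeline
 * from its clock and base-time, as described above. */
static GstClockTime
get_running_time (GstElement *pipeline)
{
  GstClock *clock;
  GstClockTime absolute, base;

  clock = gst_element_get_clock (pipeline);
  if (clock == NULL)
    return GST_CLOCK_TIME_NONE;

  absolute = gst_clock_get_time (clock);        /* absolute-time */
  base = gst_element_get_base_time (pipeline);  /* base-time */

  gst_object_unref (clock);
  return absolute - base;                       /* running-time */
}
</programlisting>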
</sect1>
<sect1 id="section-buffer-time-types" xreflabel="Buffer running-time">
<title> Buffer running-time </title>
<para>
To calculate a buffer running-time, we need a buffer timestamp and
the SEGMENT event that preceded the buffer. First we can convert
the SEGMENT event into a <classname>GstSegment</classname> object
and then we can use the
<function>gst_segment_to_running_time ()</function> function to
perform the calculation of the buffer running-time.
</para>
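<para>
A minimal sketch of that calculation follows; it assumes that the event
passed in is the most recent SEGMENT event and that the buffer followed it:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

/* Hypothetical helper: map a buffer timestamp to a running-time using
 * the segment carried by the preceding SEGMENT event. */
static GstClockTime
buffer_running_time (GstEvent *segment_event, GstBuffer *buffer)
{
  const GstSegment *segment;

  /* Convert the SEGMENT event into a GstSegment object. */
  gst_event_parse_segment (segment_event, &amp;segment);

  /* Calculate the running-time for the buffer timestamp. */
  return gst_segment_to_running_time (segment, GST_FORMAT_TIME,
      GST_BUFFER_PTS (buffer));
}
</programlisting>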
<para>
Synchronization is now a matter of making sure that a buffer with a
certain running-time is played when the clock reaches the same
running-time. Usually this task is done by sink elements.
</para>
</sect1>
<sect1 id="section-clock-obligations-of-each-element" xreflabel="Obligations
of each element">
<title>
@ -96,23 +90,54 @@
</para>
<sect2>
<title>Source elements </title>
<title>Non-live source elements </title>
<para>
Source elements (or parsers of formats that provide a notion of time, such
as MPEG, as explained above) must place a timestamp in each buffer that
they deliver. The origin of the time used is arbitrary, but it must
match the time delivered in the newsegment event (see below).
However, it is expected that the origin is the origin of the media
stream.
Non-live source elements must place a timestamp in each buffer that
they deliver when this is possible. They must choose the timestamps
and the values of the SEGMENT event in such a way that the
running-time of the buffer starts from 0.
</para>
<para>
In order to initialize the element time of the rest of the pipeline, a
source element must send a newsegment event before starting to play.
In addition, after seeking, a newsegment event must be sent, because
the timestamps of the next samples do not match the element time of the
rest of the pipeline.
Some sources, such as filesrc, are not able to generate timestamps
on all buffers. They can and must, however, create a timestamp on the
first buffer (with a running-time of 0).
</para>
<para>
The source then pushes out the SEGMENT event followed by the
timestamped buffers.
</para>
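<para>
A minimal sketch of what such a source might do before pushing its first
buffer; the pad and buffer names are hypothetical:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

/* Hypothetical helper: send a default TIME segment, then push the
 * first buffer with a timestamp that yields a running-time of 0. */
static GstFlowReturn
push_first_buffer (GstPad *srcpad, GstBuffer *buf)
{
  GstSegment segment;

  /* A default TIME segment maps buffer time 0 to running-time 0. */
  gst_segment_init (&amp;segment, GST_FORMAT_TIME);
  gst_pad_push_event (srcpad, gst_event_new_segment (&amp;segment));

  GST_BUFFER_PTS (buf) = 0;   /* first buffer: running-time 0 */
  return gst_pad_push (srcpad, buf);
}
</programlisting>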
</sect2>
<sect2>
<title>Live source elements </title>
<para>
Live source elements must place a timestamp in each buffer that
they deliver. They must choose the timestamps and the values of the
SEGMENT event in such a way that the running-time of the buffer
matches exactly the running-time of the pipeline clock when the first
byte in the buffer was captured.
</para>
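<para>
A minimal sketch of live timestamping, assuming the element's clock and
base-time have already been set by the pipeline; the helper name is
hypothetical:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

/* Hypothetical helper: timestamp a captured buffer so that its
 * running-time matches the pipeline running-time at capture. */
static void
timestamp_captured_buffer (GstElement *src, GstBuffer *buf)
{
  GstClock *clock = gst_element_get_clock (src);
  GstClockTime now = gst_clock_get_time (clock);

  /* running-time = absolute-time - base-time */
  GST_BUFFER_PTS (buf) = now - gst_element_get_base_time (src);

  gst_object_unref (clock);
}
</programlisting>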
</sect2>
<sect2>
<title>Parser elements </title>
<para>
Parser elements must use the incoming timestamps and transfer those
to the resulting output buffers. They are allowed to interpolate or
reconstruct timestamps for missing input buffers when they can.
</para>
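<para>
A minimal sketch of transferring timing metadata; the actual parsing is
omitted and the helper name is hypothetical:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

/* Hypothetical helper: copy timing metadata from an input buffer to
 * the parsed output buffer. */
static void
transfer_timestamp (GstBuffer *outbuf, GstBuffer *inbuf)
{
  GST_BUFFER_PTS (outbuf) = GST_BUFFER_PTS (inbuf);
  GST_BUFFER_DURATION (outbuf) = GST_BUFFER_DURATION (inbuf);
}
</programlisting>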
</sect2>
<sect2>
<title>Demuxer elements </title>
<para>
Demuxer elements can usually set the timestamps stored inside the media
file onto the outgoing buffers. They need to make sure that outgoing
buffers that are to be played at the same time have the same
running-time. Demuxers also need to take into account the incoming
timestamps on buffers and use them to calculate an offset on the outgoing
buffer timestamps.
</para>
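<para>
A minimal sketch of applying such an offset; how the offset is derived
from the incoming timestamps is element-specific and omitted here, and the
helper name is hypothetical:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

/* Hypothetical helper: place a media timestamp on an outgoing buffer,
 * shifted by a per-stream offset so that samples meant to play
 * together end up with the same running-time. */
static void
set_output_timestamp (GstBuffer *outbuf, GstClockTime media_time,
    GstClockTime offset)
{
  GST_BUFFER_PTS (outbuf) = media_time + offset;
}
</programlisting>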
</sect2>
<sect2> <title> Sink elements </title>
@ -121,20 +146,12 @@
playing), the element should require a clock, and thus implement the
method <function>set_clock</function>.
</para>
<para>
In addition, before playing each sample, if the current element time is
less than the timestamp in the sample, the element should call
<function>gst_element_wait()</function> to wait until that time arrives
<footnote>
<para>
With some schedulers, <function>gst_element_wait()</function>
blocks the pipeline. For instance, if there is one audio sink element
and one video sink element, while the audio element is waiting for a
sample the video element cannot play another sample. This behaviour is
under discussion, and might change in a future release.
</para>
</footnote>
The sink should then make sure that a sample with a certain running-time is
played exactly when the pipeline clock reaches that running-time. Some elements
might use the clock API such as <function>gst_clock_id_wait()</function>
to perform this action. Other sinks might need to use other means of
scheduling timely playback of the data.
</para>
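<para>
A minimal sketch of waiting on the clock for a given running-time,
assuming the sink's clock and base-time have been set by the pipeline;
the helper name is hypothetical:
</para>
<programlisting>
#include &lt;gst/gst.h&gt;

/* Hypothetical helper: block until the pipeline clock reaches the
 * given running-time. */
static GstClockReturn
wait_for_running_time (GstElement *sink, GstClockTime running_time)
{
  GstClock *clock = gst_element_get_clock (sink);
  GstClockID id;
  GstClockReturn ret;

  /* Convert the running-time back to an absolute clock time. */
  id = gst_clock_new_single_shot_id (clock,
      gst_element_get_base_time (sink) + running_time);
  ret = gst_clock_id_wait (id, NULL);

  gst_clock_id_unref (id);
  gst_object_unref (clock);
  return ret;
}
</programlisting>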
</sect2>
</sect1>

View file

@ -3,6 +3,9 @@
<chapter id="chapter-dparams">
<title>Supporting Dynamic Parameters</title>
<para>
Warning: this part describes 0.10 and is outdated.
</para>
<para>
Sometimes object properties are not powerful enough to control the
parameters that affect the behaviour of your element.

View file

@ -34,20 +34,14 @@
to achieve this. The basis of all this is the GLib
<classname>GTypeInterface</classname> type. For each case where we think
it's useful, we've created interfaces which can be implemented by elements
at their own will.
</para>
<para>
One important note: interfaces do <emphasis>not</emphasis> replace
properties. Rather, interfaces should be built <emphasis>next to</emphasis>
properties. There are two important reasons for this. First of all,
properties can be more easily introspected. Second, properties can be
specified on the commandline (<filename>gst-launch</filename>).
</para>
<sect1 id="section-iface-general" xreflabel="How to Implement Interfaces">
@ -58,12 +52,7 @@
registered the type itself. Some interfaces have dependencies on other
interfaces or can only be registered by certain types of elements. You
will be notified of doing that wrongly when using the element: it will
quit with failed assertions, which will explain what went wrong.
If it does, you need to register support for <emphasis>that</emphasis>
interface before registering support for the interface that you want
to support. The example below explains how to add support for a