Live sources
------------

A live source, such as an element capturing audio or video, needs to be
handled in a special way. It does not make sense to start the dataflow in the
PAUSED state for those devices, as the user might wait a long time between
going from PAUSED to PLAYING, making the previously captured buffers
irrelevant.

A live source therefore only produces buffers in the PLAYING state. This has
implications for sinks waiting for a buffer to complete preroll, since such a
buffer might never arrive.

Live sources return NO_PREROLL when going to the PAUSED state to inform the
bin/pipeline that this element will not be able to produce data in the
PAUSED state.
When performing a get_state() on a bin with a non-zero timeout value, the bin
must be sure that there are no live sources in the pipeline, because otherwise
the get_state() function would block on the sinks.

A GstBin therefore always performs a zero-timeout get_state() on its elements
to discover the NO_PREROLL (and ERROR) elements before performing a blocking
wait.
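
From the application side the same non-blocking check can be made with the
public API. The sketch below (a hypothetical helper) passes a timeout of 0 so
that gst_element_get_state() returns immediately and inspects the return value
for NO_PREROLL.

  #include <gst/gst.h>

  /* Sketch: check an element's state without blocking. */
  static void
  check_live (GstElement * element)
  {
    GstState state, pending;
    GstStateChangeReturn ret;

    /* a timeout of 0 makes get_state() return immediately */
    ret = gst_element_get_state (element, &state, &pending, 0);

    switch (ret) {
      case GST_STATE_CHANGE_NO_PREROLL:
        g_print ("%s is live and will not preroll in PAUSED\n",
            GST_ELEMENT_NAME (element));
        break;
      case GST_STATE_CHANGE_ASYNC:
        g_print ("%s is still prerolling\n", GST_ELEMENT_NAME (element));
        break;
      case GST_STATE_CHANGE_SUCCESS:
        g_print ("%s reached %s\n", GST_ELEMENT_NAME (element),
            gst_element_state_get_name (state));
        break;
      default:
        g_print ("%s failed to change state\n", GST_ELEMENT_NAME (element));
        break;
    }
  }
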
Scheduling
----------

Live sources cannot produce data in the PAUSED state. They block in the
getrange() function or in the loop function until they go to PLAYING.
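
As an illustration (assuming a GstBaseSrc-based source), the base class
already keeps a live source blocked while it is PAUSED; a create() function
that additionally waits on an external resource can use
gst_base_src_wait_playing() to block until PLAYING is reached or the source is
shut down. The my_live_src_create name is hypothetical.

  /* Sketch: create() vmethod of a hypothetical live source. */
  static GstFlowReturn
  my_live_src_create (GstBaseSrc * bsrc, guint64 offset, guint size,
      GstBuffer ** buf)
  {
    GstFlowReturn ret;

    /* block until the element is PLAYING; returns GST_FLOW_FLUSHING
     * when the source is shut down or flushed */
    ret = gst_base_src_wait_playing (bsrc);
    if (ret != GST_FLOW_OK)
      return ret;

    /* ... capture one buffer worth of data from the device ... */
    *buf = gst_buffer_new_allocate (NULL, size, NULL);

    return GST_FLOW_OK;
  }
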
Latency
-------

The live source timestamps its data with the time of the clock at the time the
data was captured. Normally some time passes between capturing the first
sample of a buffer and the last one. This means that when the buffer arrives
at the sink, it will already be late and will be dropped.
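
As a sketch of such timestamping (using current API names, and assuming the
usual convention that the capture time is expressed as running time, i.e.
clock time minus the element's base time):

  /* Sketch: timestamp a captured buffer with the capture time,
   * expressed as running time (clock time - base time). */
  static void
  stamp_capture_time (GstElement * element, GstBuffer * buf)
  {
    GstClock *clock;
    GstClockTime now, base_time;

    clock = gst_element_get_clock (element);
    if (clock == NULL)
      return;                   /* no clock selected yet */

    now = gst_clock_get_time (clock);
    base_time = gst_element_get_base_time (element);
    gst_object_unref (clock);

    GST_BUFFER_PTS (buf) = now - base_time;
    /* the duration would be set to the time it took to capture the buffer */
  }
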
The latency is the time it takes to construct one buffer of data. This latency
could be exposed by latency queries.

These latency queries need to be performed by the managing pipeline for all
sinks. They can only be done after the measurements have been taken (when all
sinks are prerolled). Thus, in pipeline:state_changed:PAUSED_TO_PLAYING we
need to get the max-latency and set this as a sync-offset in all sinks.
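
As a rough sketch of how a managing pipeline can obtain and distribute this
value with today's public API (GstBin does this internally when handling
latency; the LATENCY event plays the role of the sync-offset mentioned above):

  /* Sketch: query the pipeline latency once all sinks are prerolled and
   * distribute it to the sinks. */
  static void
  configure_latency (GstElement * pipeline)
  {
    GstQuery *query;
    gboolean live;
    GstClockTime min_latency, max_latency;

    query = gst_query_new_latency ();
    if (gst_element_query (pipeline, query)) {
      gst_query_parse_latency (query, &live, &min_latency, &max_latency);

      if (live) {
        /* min_latency is the largest latency of all live paths; tell the
         * sinks to delay rendering by this amount */
        gst_element_send_event (pipeline,
            gst_event_new_latency (min_latency));
      }
    }
    gst_query_unref (query);
  }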