# Segments

A segment in GStreamer denotes a set of media samples that must be
processed. A segment has a start time, a stop time and a processing
rate.

A media stream has a start and a stop time. The start time is always 0
and the stop time is the total duration (or -1 if unknown, for example a
live stream). We call this the complete media stream.

The segment of the complete media stream can be played by issuing a seek
on the stream. The seek has a start time, a stop time and a processing
rate.

```
                 complete stream
+------------------------------------------------+
0                                        duration

                segment
      |--------------------------|
    start                      stop
```
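
As a minimal illustrative sketch (not part of the original design notes),
such a segment could be selected from application code with
`gst_element_seek()`; here `pipeline` is assumed to be a prerolled
`GstElement`:

```c
/* Sketch: request playback of the segment from 1s to 5s at normal rate. */
gst_element_seek (pipeline, 1.0, GST_FORMAT_TIME,
    GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE,
    GST_SEEK_TYPE_SET, 1 * GST_SECOND,
    GST_SEEK_TYPE_SET, 5 * GST_SECOND);
```
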
The playback of a segment starts with a source or demuxer element
pushing a `SEGMENT` event containing the start time, stop time and rate of
the segment. The purpose of this event is to inform downstream
elements of the requested segment positions. Some elements might produce
buffers that fall outside of the segment and that might therefore be
discarded or clipped.
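
How an element discards or clips such buffers is element specific; as a
hedged sketch, an element that keeps the values of the most recently
received `SEGMENT` event in a `GstSegment` could use `gst_segment_clip()`
for this (the helper and variable names below are illustrative):

```c
/* Sketch (illustrative helper, not from any real element): clip a decoded
 * buffer against the configured TIME segment and drop it when it falls
 * completely outside of the segment. */
static GstFlowReturn
push_clipped (GstPad * srcpad, const GstSegment * segment, GstBuffer * buf)
{
  guint64 ts = GST_BUFFER_PTS (buf);
  guint64 dur = GST_BUFFER_DURATION (buf);
  guint64 clip_start, clip_stop;

  if (!gst_segment_clip (segment, GST_FORMAT_TIME, ts, ts + dur,
          &clip_start, &clip_stop)) {
    /* Completely outside of the segment: discard the buffer. */
    gst_buffer_unref (buf);
    return GST_FLOW_OK;
  }

  /* A real element would trim the buffer to [clip_start, clip_stop] here. */
  return gst_pad_push (srcpad, buf);
}
```
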
## Use cases

### FLUSHING seek

```
filesrc ! avidemux ! videodecoder ! videosink
```

When doing a seek in this pipeline for the segment from second 1 to second 5,
avidemux will perform the seek.

Avidemux starts by sending a `FLUSH_START` event downstream and upstream. This
will cause its streaming task to pause because `_pad_pull_range()` and
`_pad_push()` will return `FLUSHING`. It then waits for the `STREAM_LOCK`,
which will be unlocked when the streaming task pauses. At this point no
streaming is happening anymore in the pipeline and a `FLUSH_STOP` is sent
upstream and downstream.
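
A much simplified sketch of this flushing sequence for a generic
demuxer-like element (the `demux` structure and its pads are illustrative,
this is not avidemux code):

```c
/* Sketch of the flushing phase of a seek in a demuxer-like element. */
gst_pad_push_event (demux->sinkpad, gst_event_new_flush_start ()); /* upstream   */
gst_pad_push_event (demux->srcpad, gst_event_new_flush_start ());  /* downstream */

/* Taking the stream lock only succeeds once the streaming task has paused. */
GST_PAD_STREAM_LOCK (demux->sinkpad);

gst_pad_push_event (demux->sinkpad, gst_event_new_flush_stop (TRUE));
gst_pad_push_event (demux->srcpad, gst_event_new_flush_stop (TRUE));

/* ... configure the new segment and restart the streaming task ... */
GST_PAD_STREAM_UNLOCK (demux->sinkpad);
```
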
When avidemux starts playback of the segment from second 1 to 5, it pushes
out a segment with 1 and 5 as start and stop times. The `stream_time` in
the segment is also 1 as this is the position we seek to.
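
A hedged sketch of how such a segment could be constructed and pushed on the
demuxer's source pad (`srcpad` is an illustrative name):

```c
/* Sketch: announce the 1s-5s segment downstream after the flush. */
GstSegment segment;

gst_segment_init (&segment, GST_FORMAT_TIME);   /* rate defaults to 1.0 */
segment.start = 1 * GST_SECOND;
segment.stop = 5 * GST_SECOND;
segment.time = 1 * GST_SECOND;                  /* stream_time of the segment start */

gst_pad_push_event (srcpad, gst_event_new_segment (&segment));
```
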
The video decoder stores these values internally and forwards them to the
next downstream element (videosink, which also stores the values).

Since second 1 does not contain a keyframe, the avi demuxer starts sending
data from the previous keyframe, which is at timestamp 0.

The video decoder decodes the keyframe but knows it should not push the
video frame yet as it falls outside of the configured segment.

When the video decoder receives the frame with timestamp 1, it is able to
decode this frame as it received and decoded the data up to the previous
keyframe. It then continues to decode and push frames with timestamps >= 1.
When it reaches timestamp 5, it does not decode and push frames anymore.

The video sink receives a frame with timestamp 1. It takes the start value of
the previously received segment and applies the following (simplified) formula:
```
render_time = BUFFER_TIMESTAMP - segment_start + element->base_time
```
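
As a worked example with the numbers used above, the frame with timestamp 1
and a segment start of 1 gives:

```
render_time = 1 - 1 + element->base_time = element->base_time
```

so the first frame of the segment is scheduled at the element's base time,
i.e. it is rendered as soon as possible after the seek.
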
It then syncs against the clock with this `render_time`. Note that
`BUFFER_TIMESTAMP` is always >= `segment_start` or else it would fall outside
of the configured segment.

Videosink reports its current position as (simplified):
```
current_position = clock_time - element->base_time + segment_time
```
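
Continuing the worked example, immediately after the seek `clock_time` is
(close to) `element->base_time` and `segment_time` is 1 second, so:

```
current_position = clock_time - element->base_time + segment_time ~= 1 second
```

which is the position that was seeked to.
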
See [synchronisation](additional/design/synchronisation.md) for a more detailed and
accurate explanation of synchronisation and position reporting.

Since after a flushing seek the `running_time` is reset to 0, the new buffer
will be rendered immediately after the seek and the `current_position` will be
the `stream_time` of the seek that was performed.

The stop time is important when the video format contains B frames. The
video decoder receives a P frame first, which it can decode but not push yet.
When it receives a B frame, it can decode the B frame and push it, followed by
the previously decoded P frame. If the P frame is outside of the segment, the
decoder knows it should not send the P frame.

Avidemux stops sending data after pushing a frame with timestamp 5 and
returns `GST_FLOW_EOS` from the chain function to make the upstream
elements perform the EOS logic.
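
A deliberately simplified sketch of that last step (the `MyDemux` type, its
`current_ts` field and the parsing helper are hypothetical):

```c
static GstFlowReturn
my_demux_chain (GstPad * pad, GstObject * parent, GstBuffer * buf)
{
  MyDemux *demux = MY_DEMUX (parent);
  GstFlowReturn ret;

  /* Parse the incoming data and push frames downstream (hypothetical helper
   * that also updates demux->current_ts from the parsed headers). */
  ret = my_demux_parse_and_push (demux, buf);

  /* Once a frame with the segment stop time has been pushed, returning
   * GST_FLOW_EOS makes the upstream elements perform the EOS logic. */
  if (ret == GST_FLOW_OK && demux->current_ts >= demux->segment.stop)
    ret = GST_FLOW_EOS;

  return ret;
}
```
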
### Live stream

### Segment looping

Consider the case of a wav file with raw audio.
```
filesrc ! wavparse ! alsasink
```
FIXME!