This is used by the queue to schedule putting data into the queue once
it has space again.
Also implement a blocking wait in the queue on the sinkpad if there is no
IOContext upstream, and generally clean up various things.
See: https://github.com/FFmpeg/FFmpeg/blob/master/libavdevice/libndi_newtek_common.h#L27
From NDI SDK Documentation:
This is the timecode of this frame in 100ns intervals. This is generally not used internally by the SDK, but is passed through to applications, which may interpret it as they wish. When sending data, a value of NDIlib_send_timecode_synthesize can be specified (and should be the default); the operation of this value is documented in the sending section of this documentation. NDIlib_send_timecode_synthesize will yield UTC time in 100ns intervals since the Unix time epoch, 1/1/1970 00:00. When interpreting this timecode, a receiving application may choose to localise the time of day based on a time zone offset, which can optionally be communicated by the sender in connection metadata. Since timecode is stored in UTC within NDI, communicating timecode time of day for non-UTC time zones requires a translation.
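As a worked example of that unit (a sketch only; the function and constant names are hypothetical, not SDK API), a timecode is just 100 ns ticks since the Unix epoch, so converting it to a wall-clock time or to nanoseconds is a single multiplication:

    use std::time::{Duration, SystemTime, UNIX_EPOCH};

    // 10_000_000 ticks per second follows directly from the 100 ns unit.
    const NDI_TICKS_PER_SECOND: i64 = 10_000_000;

    /// Interpret an NDI timecode as 100 ns ticks since 1970-01-01 00:00 UTC.
    fn ndi_timecode_to_system_time(timecode: i64) -> SystemTime {
        UNIX_EPOCH + Duration::from_nanos(timecode as u64 * 100)
    }

    fn main() {
        // 2021-01-01 00:00:00 UTC expressed as a timecode.
        let timecode = 1_609_459_200 * NDI_TICKS_PER_SECOND;
        println!("{:?}", ndi_timecode_to_system_time(timecode));
        println!("{} ns since the Unix epoch", timecode * 100);
    }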
We could also go via the glib::Type, but that requires more steps unless we
also add a getter from the registered type to the audio/video source
modules.
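Purely for illustration, a sketch of what the glib::Type route could look like; the "GstNdiAudioSrc"/"GstNdiVideoSrc" type names and the helper are hypothetical:

    use glib::prelude::*;

    // Dispatch on the registered GType name instead of calling a dedicated
    // getter on the audio/video source modules.
    fn is_ndi_source(obj: &glib::Object) -> bool {
        let name = obj.type_().name();
        name == "GstNdiAudioSrc" || name == "GstNdiVideoSrc"
    }

    fn main() {
        // The lookup only succeeds once the element types have been registered,
        // i.e. after the plugin has been loaded.
        println!("{:?}", glib::Type::from_name("GstNdiAudioSrc"));
        let _ = is_ndi_source; // keep the demo warning-free
    }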
A buffer can also have one or both of a start and an end offset. These are media-type specific. For video buffers, the start offset will generally be the frame number. For audio buffers, it will be the number of samples produced so far. For compressed data, it could be the byte offset in a source or destination file. Likewise, the end offset will be the offset of the end of the buffer. These can only be meaningfully interpreted if you know the media type of the buffer (the preceding CAPS event). Either or both can be set to GST_BUFFER_OFFSET_NONE.
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstBuffer.html
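A small sketch of how those offsets could be filled with the gstreamer Rust bindings (the frame and sample counters are made up for the example):

    use gstreamer as gst;

    fn main() {
        gst::init().unwrap();

        // Video: offset = frame number of this buffer, offset_end = next frame.
        let frame_number: u64 = 42;
        let mut video_buf = gst::Buffer::with_size(1024).unwrap();
        {
            let buf = video_buf.get_mut().unwrap();
            buf.set_offset(frame_number);
            buf.set_offset_end(frame_number + 1);
        }

        // Audio: offset = samples produced before this buffer,
        // offset_end = samples produced after it.
        let samples_so_far: u64 = 48_000;
        let samples_in_buffer: u64 = 1_024;
        let mut audio_buf = gst::Buffer::with_size(1_024 * 4).unwrap();
        {
            let buf = audio_buf.get_mut().unwrap();
            buf.set_offset(samples_so_far);
            buf.set_offset_end(samples_so_far + samples_in_buffer);
        }

        // Offsets that are not meaningful are left at gst::BUFFER_OFFSET_NONE,
        // which is what a freshly created buffer carries by default.
    }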