When subtitle pads were added dynamically, the suffix used was
"synthesis" instead of the expected "subtitle".
This fixes it by refactoring the code to define a new enum with a name()
method for the types of custom output channels.
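The refactor described above might look roughly like this minimal Rust sketch; the enum, variant, and helper names here are illustrative guesses, not the actual gst-plugins-rs code:

```rust
// Hypothetical enum for the types of custom output channels.
// A name() method provides the pad-name suffix, so "subtitle" pads
// can no longer accidentally reuse the "synthesis" suffix.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum CustomOutput {
    Synthesis,
    Subtitle,
}

impl CustomOutput {
    fn name(self) -> &'static str {
        match self {
            CustomOutput::Synthesis => "synthesis",
            CustomOutput::Subtitle => "subtitle",
        }
    }
}

// Illustrative helper building a pad name from the channel type and index.
fn pad_name(output: CustomOutput, index: u32) -> String {
    format!("{}_{}", output.name(), index)
}
```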
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/2318>
These threadshare-based elements provide a means to connect an upstream pipeline
to multiple downstream pipelines while taking advantage of a reduced number of
threads and context switches.
Differences with the `ts-proxy` elements:
* Link one pipeline to many instead of one to one.
* No back pressure: items which can't be handled by a downstream pipeline are
  lost, whereas `ts-proxysink` keeps them in a pending queue and blocks the
  stream.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/2293>
Previously, if a reference time wraparound happened and the arrival time of
newer packets was much lower, the map would grow infinitely. Specifically
handle the case where the timestamp difference is very large, which occurs
when the input data contains a wraparound. A unit/regression test for this
behaviour is included.
This will be combined with a fix for TWCC parsing in rtptwcc.c in
gstreamer core.
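As an illustration of the kind of wraparound-aware comparison described above, here is a minimal Rust sketch. The 24-bit width matches the TWCC reference time field; the function name and structure are assumptions for illustration, not the actual patch:

```rust
// The TWCC reference time is a 24-bit wrapping counter.
const BITS: u32 = 24;
const MODULUS: u64 = 1 << BITS;

/// Signed distance from `old` to `new`, treating the counter as wrapping.
/// A very large apparent forward jump is interpreted as a wraparound and
/// reported as a (small) negative delta instead, so stale entries can be
/// evicted rather than accumulating forever.
fn wrapped_delta(new: u64, old: u64) -> i64 {
    let diff = new.wrapping_sub(old) % MODULUS;
    if diff >= MODULUS / 2 {
        // More than half the range "forward" really means "backward".
        diff as i64 - MODULUS as i64
    } else {
        diff as i64
    }
}
```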
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/2255>
Previously, when webrtcsink took an already-encoded stream as input, it
errored out if no encoder element was available for the codec in
question.
This change makes it possible to stream an output from a camera that
already produces e.g. video/x-h264 without bloating the GStreamer
installation with a redundant H.264 encoder element.
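With this change, feeding pre-encoded video into webrtcsink might look like the following pipeline sketch; the device path and caps are illustrative assumptions, and signaller configuration is omitted:

```shell
# Stream an already-encoded H.264 feed from a camera without re-encoding.
# /dev/video0 and the caps below are illustrative assumptions.
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! video/x-h264,width=1280,height=720 \
    ! h264parse \
    ! webrtcsink
```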
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/2273>
Two examples are added: a server that sends N video streams, and a
client that composites them together, then sends messages over the
control data channel to enable some of them and disable others.
This demonstrates how custom upstream events can be sent from a client
to a server, and how once a connection is established one can start and
stop the flow of data for a specific media without affecting the overall
connection.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/2276>
If a GOP starts after the current chunk/fragment end, draining should still
happen for the other streams. So handle this situation gracefully and
reset the late_gop state when done.
This also fixes GAP buffer handling for video streams that require
DTS.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/2236>
If the first GOP of a stream starts after the fragment/chunk end, the
stream is marked as filled, because otherwise the streams that actually
are filled would never be drained.
However, this breaks the assertion in drain_buffers_one_stream() that
there are at least two GOPs, where the pre-last GOP has
final_earliest_pts.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/2236>
TAI Clock Information (`taic`) will be defined in ISO/IEC 23001-17 Amendment 1.
The TAI Clock Information box appears in the sample entry for the applicable track.
The clock information is metadata about the type and quality of a TAI clock, and
it can be set via a stream tag. For example:
gst-launch-1.0 -v videotestsrc num-buffers=50 ! video/x-raw,width=1280,height=720,format=I420 ! taginject tags=precision-clock-type=can-sync-to-tai,precision-clock-time-uncertainty-nanoseconds=30000 ! isomp4mux ! filesink location=taic.mp4
This version writes compliant taic boxes if either or both of the tags are set.
The two remaining values in taic (clock resolution and clock drift rate) are set
to reasonable default values.
taic is intended to be used (in video) with a TAITimestampPacket instance that is
per-sample auxiliary information (i.e. via saio and saiz entries). That will be
provided as a follow-up change, based on a meta.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/2246>