As described in issue #200, we hold the srcpad's stream lock in some
situations where we notify the `active-pad` property.
If there's a handler installed, it will most likely attempt to read the
property, which requires taking the `state` lock. Another thread could
already be holding that lock while attempting to obtain the srcpad's
stream lock, resulting in a deadlock.
To avoid this, move the `active_sinkpad` field into its own Mutex, which
we never hold for long.
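As a rough sketch of the idea (the struct and field names below are illustrative, not the element's actual layout), reading the `active-pad` property then only needs the dedicated mutex and never the `state` lock:

```rust
use std::sync::Mutex;
use gstreamer as gst;

struct State {
    // Element state guarded by the main `state` lock, which may be held
    // while the srcpad's stream lock is also taken.
}

struct Switch {
    state: Mutex<State>,
    // `active_sinkpad` lives in its own mutex so reading the `active-pad`
    // property never needs the `state` lock.
    active_sinkpad: Mutex<Option<gst::Pad>>,
}

impl Switch {
    fn active_pad(&self) -> Option<gst::Pad> {
        // The lock is held only long enough to clone the pad reference.
        self.active_sinkpad.lock().unwrap().clone()
    }
}
```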
Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/200
The internal (C) jitterbuffer needs to know about the configured
latency when calculating a PTS, as it otherwise may consider that
the packet is too late, trigger a resync and cause the element to
discard the packet altogether.
I could not identify when this was broken, but the net effect was
that in the current state, ts-jitterbuffer was discarding up to
half of all the incoming packets.
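Purely as an illustration of the failure mode (this is not the element's actual code), the lateness check has to account for the configured latency; without it, packets that could still be output in time are flagged as late:

```rust
// Illustrative only: a packet is "too late" if it can no longer be output
// before its deadline. Ignoring `latency_ns` here makes packets that are
// only slightly delayed look late, triggering a resync and a discard.
fn packet_is_too_late(pts_ns: u64, latency_ns: u64, now_ns: u64) -> bool {
    pts_ns + latency_ns < now_ns
}
```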
In roll-up mode, when no more timed text comes in, the closed
captions may remain displayed on screen indefinitely (unless the
decoder implements a timeout, but that is not mandatory).
Instead, expose a property to erase the display memory after a
configurable amount of time has elapsed.
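A minimal sketch of how such a property could be declared with the glib `ParamSpec` builder; the property name `roll-up-timeout`, the nanosecond unit and the default value are assumptions made for illustration:

```rust
use gstreamer::glib;

// Hypothetical property spec: name, unit and default are illustrative.
fn roll_up_timeout_pspec() -> glib::ParamSpec {
    glib::ParamSpecUInt64::builder("roll-up-timeout")
        .nick("Roll-Up Timeout")
        .blurb("Duration (in ns) after which to erase the display memory")
        .default_value(u64::MAX) // e.g. u64::MAX meaning "never erase"
        .build()
}
```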
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/754>
`PadTemplate::caps()` now returns a reference to the caps instead of a
new strong reference, so the template has to be kept in scope for as long
as the caps reference is used.
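For example (a sketch assuming the post-change API), a caller now keeps the template alive while using the caps, and clones them only if they need to outlive it:

```rust
use gstreamer as gst;
use gst::prelude::*;

fn src_template_caps(element: &gst::Element) -> Option<gst::Caps> {
    // The template must stay alive while the borrowed caps are in use.
    let templ = element.pad_template("src")?;
    let caps = templ.caps();
    // Take a new strong reference if the caps must outlive `templ`.
    Some(caps.clone())
}
```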
Writing a proper "depfile" to follow depending files, based on depfiles
generated by rustc.
This is based on work done while working on gobject-examples-rs
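As a rough illustration of the technique (not the actual build script), the per-crate `.d` files that `rustc --emit=dep-info` produces can be merged into one Makefile-style depfile for the outer build system:

```rust
use std::fs;
use std::io::Write;
use std::path::Path;

// Illustrative helper: merge rustc-generated depfiles into a single
// `target: dep1 dep2 ...` line that e.g. ninja can consume.
fn write_combined_depfile(
    output: &Path,
    rustc_depfiles: &[&Path],
    target: &str,
) -> std::io::Result<()> {
    let mut deps = Vec::new();
    for depfile in rustc_depfiles {
        for line in fs::read_to_string(depfile)?.lines() {
            // Each depfile line has the form `object: source1 source2 ...`.
            if let Some((_, sources)) = line.split_once(':') {
                deps.extend(sources.split_whitespace().map(str::to_owned));
            }
        }
    }
    deps.sort();
    deps.dedup();
    let mut out = fs::File::create(output)?;
    writeln!(out, "{}: {}", target, deps.join(" "))
}
```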
The element has a small race condition where it might output two buffers
with the same running time during e.g. a manual switch. In practice this
is not a problem, so the test takes this race into account.
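For illustration only (not the actual test code), "taking the race into account" amounts to accepting equal, but never decreasing, running times for consecutive buffers:

```rust
// Illustrative check: consecutive running times may be equal around a
// manual switch, but must never go backwards.
fn assert_running_times_monotonic(running_times_ns: &[u64]) {
    for pair in running_times_ns.windows(2) {
        assert!(pair[0] <= pair[1], "running time went backwards");
    }
}
```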
Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues/195
This new video filter detects the dominant color in a video frame.
When the color has changed from the previous frame, the filter posts an
Element message on the bus; the associated structure is named
`colordetect` and has two fields (see the sketch after the list):
* a string field named `dominant-color`
* a list field containing the whole color palette, stored as uint values, sorted
by dominance, with more dominant colors first
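A hedged sketch of consuming this message from an application's bus watch; only the `dominant-color` field is read here, since the exact name of the palette field is not given above:

```rust
use gstreamer as gst;

fn on_bus_message(msg: &gst::Message) {
    if let gst::MessageView::Element(elem_msg) = msg.view() {
        // The structure is named `colordetect`, as described above.
        if let Some(s) = elem_msg.structure() {
            if s.has_name("colordetect") {
                if let Ok(color) = s.get::<&str>("dominant-color") {
                    println!("dominant color is now {}", color);
                }
                // The palette list field could be read similarly.
            }
        }
    }
}
```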
Also clip raw audio/video buffers to the segment if the caps contain the
relevant information, and also drop raw audio samples if they're outside
the segment.
In addition, set durations on raw audio/video buffers based on the caps
if no duration is set.
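The general technique, sketched here for the raw-audio case with gstreamer-audio's clipping helper and an `AudioInfo` parsed from the negotiated caps; this is an illustration rather than the element's actual code:

```rust
use gstreamer as gst;
use gstreamer_audio as gst_audio;

// Clip a raw audio buffer to the segment; None means the buffer lies
// completely outside the segment and should be dropped.
fn clip_audio_buffer(
    buffer: gst::Buffer,
    segment: &gst::Segment,
    info: &gst_audio::AudioInfo,
) -> Option<gst::Buffer> {
    gst_audio::audio_buffer_clip(buffer, segment, info.rate(), info.bpf())
}

// If a raw audio buffer carries no duration, derive one from the caps:
// duration = samples / rate (simplified, ignoring overflow handling).
fn ensure_audio_duration(buffer: &mut gst::BufferRef, info: &gst_audio::AudioInfo) {
    if buffer.duration().is_none() {
        let samples = buffer.size() as u64 / info.bpf() as u64;
        let duration_ns = samples * 1_000_000_000 / info.rate() as u64;
        buffer.set_duration(gst::ClockTime::from_nseconds(duration_ns));
    }
}
```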