In gst_ogg_demux_do_seek(), account for a non-zero start-time when
calculating the keyframe time.
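A sketch of the idea only, not the actual code; granule_to_time(),
keyframe_granule and start_time are stand-ins here:

    /* Sketch (names hypothetical): the keyframe time derived from the
     * granule is relative to the chain, so compensate for a stream that
     * does not start at time 0 before using it as the seek target. */
    GstClockTime keyframe_time = granule_to_time (keyframe_granule);

    if (keyframe_time != GST_CLOCK_TIME_NONE)
      keyframe_time += start_time;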
Handle a discontinuous first packet in
gst_ogg_demux_setup_first_granule() because that's pretty
normal after a seek. Also differentiate between a genuinely
truncated first packet and just bailing out early, by not using
granule = -1 as an error code.
Make the debug output clearer about which timestamps are stream times
(PTS) and which are Ogg timestamps.
When running in reverse mode, don't guess that a timestamp is the start
of the segment; more likely we're discontinuous somewhere in the middle
of the segment, and we'll fix up timestamps once the frames are decoded
and reversed.
When a PTS is not set, we still want to store the rest of the
buffer information, or else we lose important things like the
duration or buffer flags when parsing.
This is a followup commit to b95725c37e
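A minimal sketch of the intent, assuming plain GstBuffer accessors; the
destination struct and its fields are hypothetical:

    /* Keep the remaining buffer metadata even when the PTS is unknown,
     * so duration and flags are not lost while parsing. */
    info->pts = GST_BUFFER_PTS (buf);           /* may be GST_CLOCK_TIME_NONE */
    info->duration = GST_BUFFER_DURATION (buf); /* stored regardless of PTS */
    info->flags = GST_BUFFER_FLAGS (buf);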
* Resetting the decoder should only happen when we get a new
initialization header (0x01), not on the other headers.
* The initialized variable only gets set to TRUE once all headers have
been parsed. Also check that the vorbis_info struct has been properly
reset; failing to do so would cause vorbisdec to error out if it got
two initialization headers in a row (the first would configure the
underlying library and the second would error out because it is already
initialized). A sketch of the header handling follows.
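A hedged sketch of that header dispatch; the decoder fields (vd->vi,
vd->vc, vd->initialized) are assumptions based on the description above,
only the libvorbis calls are real API:

    switch (packet->packet[0]) {
      case 0x01:
        /* identification header: a new stream starts, so drop any
         * previous configuration before handing it to libvorbis */
        vorbis_info_clear (&vd->vi);
        vorbis_info_init (&vd->vi);
        vorbis_comment_clear (&vd->vc);
        vorbis_comment_init (&vd->vc);
        vd->initialized = FALSE;
        break;
      case 0x03:
      case 0x05:
        /* comment/setup headers: must not reset anything */
        break;
    }
    if (vorbis_synthesis_headerin (&vd->vi, &vd->vc, packet) < 0)
      return GST_FLOW_ERROR;    /* malformed or out-of-order header */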
https://bugzilla.gnome.org/show_bug.cgi?id=779515
This adds a property to select the maximum number of threads to use for
conversion and scaling. During processing, each plane is split into
groups of consecutive lines of roughly equal size, and each group is
processed by its own thread.
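A minimal sketch of that split, assuming GLib types; convert_lines()
stands in for whatever per-thread worker does the actual work:

    /* Divide 'height' lines into n_threads consecutive bands of roughly
     * equal size; the last band may be shorter (or empty). */
    guint lines_per_thread = (height + n_threads - 1) / n_threads;
    guint t;

    for (t = 0; t < n_threads; t++) {
      guint first = t * lines_per_thread;

      if (first >= height)
        break;
      convert_lines (t, first, MIN (lines_per_thread, height - first));
    }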
During tests, this gave up to 1.8x speedup with 2 threads and up to 3.2x
speedup with 4 threads when converting e.g. 1080p to 4k in v210.
https://bugzilla.gnome.org/show_bug.cgi?id=778974
In gst_video_time_code_is_valid, also check for invalid
ranges when using drop-frame TC. Refactor some code which
broke after the check was added.
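The kind of range check being added, as a sketch (helper name
hypothetical, GLib types assumed): in drop-frame timecodes the first two
frame numbers of every minute not divisible by ten do not exist at
29.97 fps, the first four at 59.94 fps:

    static gboolean
    df_frames_valid (guint minutes, guint frames, gint fps_n)
    {
      guint dropped = (fps_n == 60000) ? 4 : 2; /* 59.94 drops 4, 29.97 drops 2 */

      if (minutes % 10 != 0 && frames < dropped)
        return FALSE;                           /* e.g. 00:01:00;01 is invalid */
      return TRUE;
    }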
https://bugzilla.gnome.org/show_bug.cgi?id=779010
See https://bugzilla.gnome.org/show_bug.cgi?id=773666
This would ideally be solved in baseparse but that requires further
thought at this point, and in the meantime it would be good to have
rawbaseparse not assert on this but handle it gracefully instead.
It was taking the initial input y-offset from the output value, which
only works for y=0 (in which case both are the same). If y > 0, we would
always stay behind the requested input offset and never ever read
anything from the input.
A parser might do some conversion on a stream, but it is still the same
stream, and we need to make sure GstDiscoverer detects that this is the
case.
https://bugzilla.gnome.org/show_bug.cgi?id=778298
The probe on a MultiQueue source pad might receive EOS twice: first a
fake EOS and then the actual EOS. The slot can be freed on either of
them if it has no input, and since slot freeing is asynchronous, a
double free is possible. So decodebin3 also needs to remove the probe
when freeing the slot.
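A hedged sketch of that change; the slot structure and its fields are
placeholders, gst_pad_remove_probe() is real API:

    static void
    free_multiqueue_slot (MultiQueueSlot * slot)
    {
      /* Drop the EOS probe first so a second (real) EOS can no longer
       * run the freeing path on an already-freed slot. */
      if (slot->probe_id) {
        gst_pad_remove_probe (slot->src_pad, slot->probe_id);
        slot->probe_id = 0;
      }
      /* ... rest of the (asynchronous) slot teardown ... */
    }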
https://bugzilla.gnome.org/show_bug.cgi?id=777530
"requested_selection" list might be generated by select-streams event.
And memory of stream-id(s) in select-streams is independent from that of stream-collection.
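A minimal sketch of what that implies, keeping copies rather than
borrowed pointers (variable names hypothetical):

    /* Store our own copies of the stream-ids from the select-streams
     * event so requested_selection does not point into the event's
     * memory. */
    GList *requested = NULL, *l;

    for (l = streams; l != NULL; l = l->next)
      requested = g_list_append (requested, g_strdup ((const gchar *) l->data));
    /* ... later released with g_list_free_full (requested, g_free); */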
https://bugzilla.gnome.org/show_bug.cgi?id=775553
The latency query originally had a fallthrough to the default
label at the end as fallback, but that got messed up when the
DURATION and POSITION queries were added, so it then fell through
to the duration query handler instead. Restore original behaviour.
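Illustrative shape of the restored behaviour (handler bodies elided,
names generic); the point is that an unhandled LATENCY query ends up in
gst_pad_query_default() instead of the DURATION handler:

    switch (GST_QUERY_TYPE (query)) {
      case GST_QUERY_LATENCY:
        /* ... latency handling ... */
        res = gst_pad_query_default (pad, parent, query); /* explicit fallback */
        break;
      case GST_QUERY_DURATION:
        /* ... duration handling ... */
        break;
      case GST_QUERY_POSITION:
        /* ... position handling ... */
        break;
      default:
        res = gst_pad_query_default (pad, parent, query);
        break;
    }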
https://bugzilla.gnome.org/show_bug.cgi?id=699077