If we are going to return a (potentially) 64-bit integer, don't use
a 32-bit one for the calculation; otherwise we could end up exceeding
the maximum value of a 32-bit int.
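For illustration only (the names here are made up, not the actual oggstream code), the pattern is to widen the operands before the arithmetic rather than only the result:

    #include <glib.h>

    /* Hypothetical example: a sample count returned as a 64-bit value.
     * The multiplication must already happen in 64 bits, otherwise it is
     * truncated to 32 bits before being widened. */
    static gint64
    packet_sample_count (gint packets, gint samples_per_packet)
    {
      /* wrong: gint * gint yields a 32-bit product
       *   return packets * samples_per_packet;
       * right: cast one operand so the whole calculation is 64-bit */
      return (gint64) packets * samples_per_packet;
    }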
For stream mappers that don't set a specific granuleshift, it will
have the default value of -1.
Guard against that case and return the granule value as-is.
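A minimal sketch of the guard, with illustrative names rather than the real oggstream helper:

    #include <glib.h>

    /* Sketch only: a mapper that never set a granuleshift keeps the default
     * of -1, and shifting by a negative count is undefined behaviour, so the
     * granule is returned unchanged in that case. */
    static gint64
    granule_to_granulepos_sketch (gint64 granule, gint64 keyframe_granule,
        gint granuleshift)
    {
      if (granuleshift < 0)
        return granule;
      return (keyframe_granule << granuleshift) | (granule - keyframe_granule);
    }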
Since the default value of a GstOggPad's map.map was 0, we would
end up using the wrong functions from mappers() if the stream wasn't
initialized yet.
Instead, use a blank/empty first entry as the default.
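A self-contained illustration of the pattern (this is not the real GstOggMap layout):

    /* Illustrative only: with a blank entry at index 0, a stream whose map
     * index was never initialized (still 0) resolves to harmless defaults
     * instead of the first real format's callbacks. */
    typedef struct
    {
      const char *id;           /* BOS packet magic, empty for the blank entry */
      int supported;            /* 0 for the placeholder */
    } DemoOggMap;

    static const DemoOggMap demo_mappers[] = {
      {"", 0},                  /* blank/empty first entry catches map == 0 */
      {"\200theora", 1},
      {"\001vorbis", 1},
    };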
The granulepos does not have the pre-skip subtracted while timestamps do,
and the last granulepos will be smaller by the number of samples that should
be dropped because of padding at the end.
As such, extrapolating the granule of the beginning of the first frame will
lead to a negative value, which is not a problem but is intentional.
https://bugzilla.gnome.org/show_bug.cgi?id=757153
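As a worked illustration with made-up but typical numbers (a pre-skip of 312 samples and a first packet of 960 samples, both at 48 kHz):

    granulepos at end of first packet : 960              (pre-skip not subtracted)
    granule at start of first packet  : 960 - 960 = 0
    after removing the pre-skip       : 0 - 312  = -312  (negative, but intentional)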
This lets oggdemux determine that they are not delta units, and removes
spurious per-packet warnings about being unable to determine a
packet's keyframeness.
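A rough sketch of what acting on that looks like, where the keyframe decision is assumed to come from the mapper (this is not an actual oggdemux function):

    #include <gst/gst.h>

    /* Sketch only: mark a buffer as a delta unit unless the mapper told us
     * the packet is a keyframe. */
    static void
    mark_delta_unit (GstBuffer * buf, gboolean is_keyframe)
    {
      if (is_keyframe)
        GST_BUFFER_FLAG_UNSET (buf, GST_BUFFER_FLAG_DELTA_UNIT);
      else
        GST_BUFFER_FLAG_SET (buf, GST_BUFFER_FLAG_DELTA_UNIT);
    }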
This should not cause any actual bug, since Theora and Daala have
a maximum shift of 31 and a packet duration of 2^31 seems very
implausible. But it fixes:
Coverity 1139804, 1139803, 1139802
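A simplified before/after of the pattern behind such reports (not the exact flagged expression):

    #include <glib.h>

    /* Simplified illustration: shifting a 32-bit value by up to 31 bits can
     * overflow before the result is stored in a 64-bit variable; widening the
     * operand first keeps the whole expression in 64 bits. */
    static gint64
    shift_duration (guint32 duration, gint granuleshift)
    {
      /* problematic form (evaluated in 32 bits):
       *   return duration << granuleshift;
       * fixed form, widened before the shift: */
      return ((gint64) duration) << granuleshift;
    }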
Add an extra function to the oggstream map to inform it about
the incoming buffers. This way oggmux can keep count of the
vp8 invisible frames and calculate the granulepos correctly.
https://bugzilla.gnome.org/show_bug.cgi?id=722682
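A hypothetical shape for such a hook; the names below are made up and do not match the actual gstoggstream.c table:

    #include <gst/gst.h>

    /* Hypothetical sketch: a per-format callback that oggmux calls for every
     * incoming buffer, so the vp8 mapper can count invisible (non-displayed)
     * frames before the granulepos is computed. */
    typedef struct _DemoOggStream DemoOggStream;

    struct _DemoOggStream
    {
      guint64 invisible_count;
      void (*update_stats) (DemoOggStream * stream, GstBuffer * buffer);
    };

    static void
    demo_vp8_update_stats (DemoOggStream * stream, GstBuffer * buffer)
    {
      /* for illustration we assume invisible frames were flagged upstream as
       * decode-only; real code would inspect the vp8 frame header instead */
      if (GST_BUFFER_FLAG_IS_SET (buffer, GST_BUFFER_FLAG_DECODE_ONLY))
        stream->invisible_count++;
    }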
vp8 stream headers shouldn't always be assumed to be provided in the caps,
as that would mean repeating the same code in all demuxers/encoders. Instead,
make oggmux generate them if they are not supplied.
https://bugzilla.gnome.org/show_bug.cgi?id=722682
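A sketch of the fallback, assuming a hypothetical builder function for the generated headers:

    #include <gst/gst.h>

    /* Sketch only: look for a streamheader field in the caps and fall back to
     * building the headers ourselves when it is missing.
     * build_vp8_headers() is a hypothetical stand-in, not a real function. */
    static GList *
    get_vp8_headers (GstCaps * caps, GList * (*build_vp8_headers) (void))
    {
      GstStructure *s = gst_caps_get_structure (caps, 0);
      const GValue *streamheader = gst_structure_get_value (s, "streamheader");

      if (streamheader == NULL || !GST_VALUE_HOLDS_ARRAY (streamheader))
        return build_vp8_headers ();    /* generate BOS/comment headers */

      /* otherwise the headers carried in the caps would be unpacked here */
      return NULL;
    }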
This never really took off; it's hardly used anywhere
and is deprecated in favour of Kate. Exposing pads just
leads to confusing 'you are missing a plug-in' messages
when people come across such streams. We could still post
the data on the bus for applications to parse.
When many packets fit on a page, we may not see a granpos for
a while, and granpos interpolation can wrap the 'frames since last
keyframe' part of the granpos, generating a granpos which is smaller
than it should be.
This is fixed by detecting keyframe packets (at least for Theora)
and updating the last keyframe granpos from this (see the sketch below).
This may still generate potentially wrong granpos values for streams
which have a Theora-like granpos (keyframes, a max keyframe distance
and a count of frames since the last keyframe) and which allow implicit
granules on packets. For these streams, a custom keyframe detection
routine should be plugged into their GstOggStream mapper.
https://bugzilla.gnome.org/show_bug.cgi?id=669164
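A minimal sketch of what Theora keyframe detection from the packet bytes can look like (based on the Theora bitstream layout; the function name is illustrative):

    #include <glib.h>
    #include <ogg/ogg.h>

    /* Sketch: Theora header packets have the top bit of the first byte set;
     * for data packets, bit 0x40 cleared means an intra (key) frame.
     * Zero-length packets repeat the previous frame and are never keyframes. */
    static gboolean
    theora_packet_is_keyframe (ogg_packet * packet)
    {
      if (packet->bytes == 0)
        return FALSE;
      if (packet->packet[0] & 0x80)
        return FALSE;               /* header packet, not a video frame */
      return (packet->packet[0] & 0x40) == 0;
    }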
Opus streams outside of Ogg may not have headers, and oggstream
may be used by oggmux to mux an Opus stream which does not come
from Ogg, and thus has no headers.
Determining headerness by packet count would strip the first two
packets from such an Opus stream, clipping a very small amount
of audio at the beginning of the stream.
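A sketch of deciding headerness from the packet contents instead, using the Opus magic signatures:

    #include <string.h>
    #include <glib.h>
    #include <ogg/ogg.h>

    /* Sketch: Opus header packets announce themselves with a magic signature,
     * so headerness can be decided per packet rather than by assuming the
     * first two packets are headers. */
    static gboolean
    opus_packet_is_header (ogg_packet * packet)
    {
      return packet->bytes >= 8 &&
          (memcmp (packet->packet, "OpusHead", 8) == 0 ||
           memcmp (packet->packet, "OpusTags", 8) == 0);
    }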