The speed-level property, which allows callers to trade off encoding
quality for speed in the libtheora API, has version-dependent
maximum and default values. Instead of hardcoding the acceptable
range for the theoraenc element's presentation of this setting,
we query the library directly at class initialization time and
set the maximum and default values from that. If the query fails,
we fall back to the previous default setting.
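A minimal sketch of what that query could look like
(TH_ENCCTL_GET_SPLEVEL_MAX and TH_ENCCTL_GET_SPLEVEL are the real
libtheora 1.1 requests; the dummy encoder setup and the
THEORA_DEF_SPEED_LEVEL fallback name are illustrative):

    /* Sketch: probe libtheora for the speed-level range at class init. */
    th_info info;
    th_enc_ctx *ctx;
    int max_level = THEORA_DEF_SPEED_LEVEL;     /* hypothetical fallback */
    int default_level = THEORA_DEF_SPEED_LEVEL;

    th_info_init (&info);
    /* dummy parameters, just enough to get an encoder context */
    info.frame_width = 320;
    info.frame_height = 240;
    info.pic_width = 320;
    info.pic_height = 240;
    info.fps_numerator = 25;
    info.fps_denominator = 1;
    ctx = th_encode_alloc (&info);
    if (ctx) {
      if (th_encode_ctl (ctx, TH_ENCCTL_GET_SPLEVEL_MAX,
              &max_level, sizeof (max_level)) != 0 ||
          th_encode_ctl (ctx, TH_ENCCTL_GET_SPLEVEL,
              &default_level, sizeof (default_level)) != 0) {
        /* query failed: keep the previous hardcoded defaults */
        max_level = default_level = THEORA_DEF_SPEED_LEVEL;
      }
      th_encode_free (ctx);
    }
    th_info_clear (&info);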
To keep the values reported by gst-inspect (which I'm told uses
the spec values from the class) in sync with those available on an
instantiated element, we remove the setting of enc->speed_level
from the initializer and instead pass G_PARAM_CONSTRUCT in
the property spec flags, asking GObject to set this property
when theoraenc objects are constructed.
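Installing the property would then look something like this
(PROP_SPEED_LEVEL and the blurb are illustrative; the flags and the
queried range are the point):

    g_object_class_install_property (gobject_class, PROP_SPEED_LEVEL,
        g_param_spec_int ("speed-level", "Speed level",
            "Controls the encoding speed/quality tradeoff",
            0, max_level, default_level,
            G_PARAM_READWRITE | G_PARAM_CONSTRUCT));

With G_PARAM_CONSTRUCT set, GObject applies the spec's default while
each instance is constructed, so no assignment in the instance init
function is needed.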
NB in theory the maximum speed-level could depend on the actual
video caps. If later versions of libtheoraenc do this, a second
call will need to be made from theora_enc_reset to update the
property, since this setting is mostly useful for realtime
adjustment of performance while the pipeline is running.
libtheora has two encoding modes: CBR, where it tries to hit a target
bitrate, and VBR, where it tries to achieve a target quality.
Internally, if the target bitrate is set to anything other than 0, the
encoding mode is CBR.
This means that the gstreamer element can leave the video_quality
setting alone as long as the user is tweaking the bitrate. This has the
nice side-effect that if the user explicitly sets the bitrate to 0
(which is actually the default), the quality value doesn't get reset,
avoiding the situation where one ends up encoding VBR at quality
level 0.
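A sketch of the resulting configuration logic (enc->video_bitrate and
enc->video_quality are illustrative element fields; target_bitrate and
quality are real th_info members):

    /* libtheora picks the mode from target_bitrate alone, so the stored
     * quality value can be left untouched while the bitrate is tweaked:
     *   bitrate != 0  ->  CBR at target_bitrate (quality ignored)
     *   bitrate == 0  ->  VBR at the preserved quality level      */
    info.target_bitrate = enc->video_bitrate;
    info.quality = enc->video_quality;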
Add a guard in case the ogg mapper doesn't handle all the accepted
input formats (although it really should); it saves us error handling
for that case later on. Also log the caps properly.
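For the logging, GStreamer's GST_PTR_FORMAT can serialize the caps
straight into the debug output, along these lines:

    GST_DEBUG_OBJECT (oggmux, "sink caps: %" GST_PTR_FORMAT, caps);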
https://bugzilla.gnome.org/show_bug.cgi?id=629196
Using the IN_CAPS flag for this is brittle, and will fail if either
vorbisparse or vorbistag (which is itself based on vorbisparse) is
inserted between oggdemux and oggmux. Possibly other elements too
(e.g. theoraparse).
Using oggstream ensures we Get It Right More Often Than Not.
https://bugzilla.gnome.org/show_bug.cgi?id=629196
Discontinuities are automatically signalled by oggdemux at the start
of a new stream. When oggmux is yet to output actual data pages,
do not signal these discontinuities in the ogg stream.
This patch may miss some actual discontinuities at the very start of
a stream, but avoids the spurious missing pages when encoding happens
normally.
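One plausible shape for that check (data_pages_out is a hypothetical
per-pad counter; pageno is the real ogg_stream_state sequence number,
whose gaps readers report as lost pages):

    /* Only translate an upstream DISCONT into an ogg-level gap once
     * this pad has actually emitted data pages. */
    if (GST_BUFFER_FLAG_IS_SET (buf, GST_BUFFER_FLAG_DISCONT) &&
        pad->data_pages_out > 0) {
      pad->stream.pageno++;   /* deliberately skip one page number */
    }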
A better fix might involve finding a way to distinguish between actual
data discontinuities and discontinuities merely marking the start of
a new stream.
Fixes an issue with ogg page numbering (would skip a number for no
reason, which then looks like a packet was lost somewhere) when
re-muxing an ogg stream, e.g. when re-tagging in rhythmbox.
https://bugzilla.gnome.org/show_bug.cgi?id=629196
Remove "This property requires libtheora version >= 1.1" qualifiers
from property descriptions. They aren't needed any longer now that
we require libtheora >= 1.1.
This was causing keyframe_granule to be set to 0 for all streams
when seeking to the beginning of the stream, i.e., at the
beginning of playback. Fixes #619778.
Instead, use either 0 or 1, depending on the bitstream version, which
gives the correct result for streams which aren't cut off at the start.
This allows that function to avoid returning a negative granpos.
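A sketch of picking that initial value (TH_VERSION_CHECK is the real
macro from theora/codec.h; bitstream 3.2.1 is the revision that started
counting granules from 1, and the surrounding naming is illustrative):

    gint64 keyframe_granule =
        TH_VERSION_CHECK (&parse->info, 3, 2, 1) ? 1 : 0;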
https://bugzilla.gnome.org/show_bug.cgi?id=638276
The offset part of the granpos is not a sign of the newer encoding.
Use the version number instead.
This fixes the criticals thrown by theoraparse, and (at last) the
remaining part of #553244.
Allocate buffers using gst_buffer_new_and_alloc() instead of
gst_pad_alloc_buffer_and_set_caps(), as the latter can cause the
pad to block, and we don't want that, since it would prevent
subsequent pads from being fed if a block occurs at start, when
all pads must be fed for playback to start.
This fixes autoplugging of the tiger element and other things.
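In 0.10 API terms the change is roughly this (the size and caps
variables are illustrative):

    /* before: may block waiting on a downstream pad at startup
     *   ret = gst_pad_alloc_buffer_and_set_caps (pad,
     *       GST_BUFFER_OFFSET_NONE, size, caps, &buf);         */

    /* after: a plain allocation never blocks, so one pad that is not
     * yet prerolled cannot starve the others */
    buf = gst_buffer_new_and_alloc (size);
    gst_buffer_set_caps (buf, caps);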
https://bugzilla.gnome.org/show_bug.cgi?id=637822
Oggdemux will currently try to pad-alloc a buffer from the peer when
it is reading the headers. This is a relic from the time when we had
an internal parser and needs to be removed at some point.
The problem is that when there is no peer pad yet (which is normal
while collecting headers) we should still continue to parse all the
packets of a page instead of erroring out on NOT_LINKED.
Fixes #632167
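A sketch of the tolerant packet loop (handle_packet and the
collecting_headers flag are hypothetical; ogg_stream_packetout is the
real libogg call):

    while (ogg_stream_packetout (&stream, &packet) == 1) {
      ret = handle_packet (ogg, &packet);   /* hypothetical helper */
      /* no peer yet is normal while headers are being collected, so
       * don't abort the page's packet loop on NOT_LINKED */
      if (ret == GST_FLOW_NOT_LINKED && collecting_headers)
        ret = GST_FLOW_OK;
      if (ret != GST_FLOW_OK)
        break;
    }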