In some cases, the user might want the stream output by encodebin to stay
in the exact same format for the whole duration of the stream. We should
let the user specify when this is the case. This commit adds API to
GstEncodingProfile to determine whether the format can be renegotiated
after encoding has started.
API:
gst_encoding_profile_set_allow_dynamic_output
gst_encoding_profile_get_allow_dynamic_output
https://bugzilla.gnome.org/show_bug.cgi?id=740214
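A minimal sketch of the new API (the theora caps and the profile setup are
only illustrative):

    #include <gst/pbutils/pbutils.h>

    GstCaps *caps = gst_caps_from_string ("video/x-theora");
    GstEncodingProfile *profile =
        GST_ENCODING_PROFILE (gst_encoding_video_profile_new (caps, NULL, NULL, 0));

    /* keep the negotiated format for the whole stream: no renegotiation
     * once encoding has started */
    gst_encoding_profile_set_allow_dynamic_output (profile, FALSE);

    if (!gst_encoding_profile_get_allow_dynamic_output (profile))
      g_print ("output format is fixed once encoding has started\n");

    gst_caps_unref (caps);
    g_object_unref (profile);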
Before, we were setting them to PAUSED and only (much) later connecting to
their source pad's caps notify signal.
There was a race where the demuxer pushed caps and then a buffer on its
source pad while we were not yet connected to that caps notify signal, so
decodebin missed the information, did not keep on building the pipeline on
the CAPS event, and the demuxer ended up posting an ERROR (not linked)
message on the bus. This needs to be done for 'simple demuxers' because
those have a single ALWAYS source pad, unlike usual demuxers which have
several dynamic source pads.
A "simple demuxer" is a demuxer that has one and only one ALWAYS source
pad.
https://bugzilla.gnome.org/show_bug.cgi?id=740693
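An illustrative sketch of the ordering change (caps_notify_cb, chain and
the "src" pad name are placeholders, not the actual decodebin code):

    GstPad *srcpad = gst_element_get_static_pad (demuxer, "src");

    /* connect to the caps notify first ... */
    g_signal_connect (srcpad, "notify::caps", G_CALLBACK (caps_notify_cb), chain);
    /* ... and only then let the simple demuxer start streaming */
    gst_element_set_state (demuxer, GST_STATE_PAUSED);

    gst_object_unref (srcpad);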
There was a race where:
1) we would set the element to PAUSED
2) it would get data sent to it from upstream
3) it would thus send caps
4) caps_notify_cb would continue autoplugging
5) caps would flow downstream, the last pad would get exposed
6) we were still not done sending the sticky events
Taking the stream lock on the new element's sinkpad and only
releasing it when sticky events have all been sent prevents
the caps from reaching the source pad of the element before
we're all set.
https://bugzilla.gnome.org/show_bug.cgi?id=740694
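An illustrative sketch of the approach (send_sticky_events() and the
variable names are hypothetical, not the actual decodebin code):

    GstPad *sinkpad = gst_element_get_static_pad (element, "sink");

    /* hold the stream lock so no data, and therefore no caps, can travel
     * through the element yet */
    GST_PAD_STREAM_LOCK (sinkpad);
    gst_element_set_state (element, GST_STATE_PAUSED);
    send_sticky_events (upstream_pad, sinkpad);  /* STREAM_START, CAPS, ... */
    GST_PAD_STREAM_UNLOCK (sinkpad);  /* caps may now reach the srcpad */

    gst_object_unref (sinkpad);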
Otherwise the following can happen:
1. set mute=true
2. play media1 (Ok)
3. play media without audio (audiochain removed)
4. play media2 (audiochain created, mute=*false*)
https://bugzilla.gnome.org/show_bug.cgi?id=740675
There's no reason why we would have to wait for the next buffer to decide
whether to output the current one or not. We just have to check if the
current one is earlier than our expected next time, which is the previous
frame timestamp plus the expected frame duration.
https://bugzilla.gnome.org/show_bug.cgi?id=740018
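A sketch of that check, assuming the element tracks prev_ts and
frame_duration (names are illustrative; what is done with an early buffer
depends on the element's mode):

    GstClockTime next_ts = prev_ts + frame_duration;  /* expected next frame time */

    /* decide on the current buffer right away, without waiting for the next one */
    if (GST_BUFFER_PTS (buf) < next_ts)
      gst_buffer_unref (buf);           /* too early, e.g. drop in drop-only mode */
    else
      ret = gst_pad_push (srcpad, buf);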
If there are two parser elements available for the same media format,
decodebin autoplugs an extra capsfilter and parser irrespective of caps
and rank. So restrict decodebin from autoplugging multiple parser elements
back to back, in adjacent positions within a single DecodeChain, for the
same media format.
https://bugzilla.gnome.org/show_bug.cgi?id=738416
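An illustrative sketch of such a restriction (previous_factory and
candidate_factory are hypothetical names, not the actual decodebin code):

    /* if the element feeding this pad is already a parser, do not autoplug
     * another parser that handles the same caps right after it */
    if (previous_factory != NULL &&
        gst_element_factory_list_is_type (previous_factory,
            GST_ELEMENT_FACTORY_TYPE_PARSER) &&
        gst_element_factory_list_is_type (candidate_factory,
            GST_ELEMENT_FACTORY_TYPE_PARSER) &&
        gst_element_factory_can_sink_any_caps (candidate_factory, caps)) {
      /* skip candidate_factory */
    }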
timestamp_offset is declared as an int64 variable, whose minimum value
G_MININT64 is -9223372036854775808.
Change the minimum and maximum limits for the offset variable accordingly.
https://bugzilla.gnome.org/show_bug.cgi?id=738568
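Assuming the offset is exposed as a GObject property, the registration
would look roughly like this (the property name and blurb are
illustrative):

    g_object_class_install_property (gobject_class, PROP_TIMESTAMP_OFFSET,
        g_param_spec_int64 ("timestamp-offset", "Timestamp offset",
            "Offset added to outgoing timestamps",
            G_MININT64, G_MAXINT64, 0,
            G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));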
The "iradio-mode" property used to have a default FALSE value in HTTP
source elements but now it should default to TRUE or just do not exist
as a property so it is not really needed to set it any more in
uridecodebin.
Apart from that this code could've never worked as uridecodebin looks for a
string-typed iradio-mode property, but it's a boolean in all sources.
Fixes https://bugzilla.gnome.org/show_bug.cgi?id=725383
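Illustrative of why the old lookup never matched (not the actual
uridecodebin code): check the real type of the property before setting it.

    GParamSpec *pspec =
        g_object_class_find_property (G_OBJECT_GET_CLASS (source), "iradio-mode");

    if (pspec != NULL && G_PARAM_SPEC_VALUE_TYPE (pspec) == G_TYPE_BOOLEAN)
      g_object_set (source, "iradio-mode", TRUE, NULL);
    /* the removed code required G_TYPE_STRING here, which never matched */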
It is better to use storel to splat the variable into the destination.
ORC doesn't know when a variable is last written to so it can't yet optimize
away the copy operation.
Support the Lanczos scaling method for the NV12 and NV21 formats: scale
the Y plane and the interleaved UV ('NV') plane separately.
Implemented for the int16, int32, float and double submethods.
https://bugzilla.gnome.org/show_bug.cgi?id=737400
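For illustration only, Lanczos scalers for the two planes of an NV12 frame
could be set up with the video library's GstVideoScaler helper (sizes are
arbitrary; this is not necessarily the code touched by this change):

    #include <gst/video/video.h>

    GstVideoScaler *y_scale, *uv_scale;

    y_scale = gst_video_scaler_new (GST_VIDEO_RESAMPLER_METHOD_LANCZOS,
        GST_VIDEO_SCALER_FLAG_NONE, 0 /* default taps */, 1920, 1280, NULL);
    /* the UV plane of NV12/NV21 is subsampled by 2 in each direction */
    uv_scale = gst_video_scaler_new (GST_VIDEO_RESAMPLER_METHOD_LANCZOS,
        GST_VIDEO_SCALER_FLAG_NONE, 0, 1920 / 2, 1280 / 2, NULL);

    /* ... run the scalers over each plane ... */

    gst_video_scaler_free (y_scale);
    gst_video_scaler_free (uv_scale);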
Move the conversion code used in videoconvert to the video library
and expose a simple but generic API to do arbitrary conversion. It can
currently do colorspace conversion but the plan is to add videoscale to
it as well.
See https://bugzilla.gnome.org/show_bug.cgi?id=732415
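A minimal sketch of using that API (frame mapping omitted; the formats and
sizes are only examples):

    #include <gst/video/video.h>

    GstVideoInfo in_info, out_info;
    GstVideoFrame in_frame, out_frame;
    GstVideoConverter *convert;

    gst_video_info_set_format (&in_info, GST_VIDEO_FORMAT_I420, 640, 480);
    gst_video_info_set_format (&out_info, GST_VIDEO_FORMAT_BGRA, 640, 480);

    convert = gst_video_converter_new (&in_info, &out_info, NULL);

    /* in_frame and out_frame would be mapped from real buffers with
     * gst_video_frame_map () before converting */
    gst_video_converter_frame (convert, &in_frame, &out_frame);

    gst_video_converter_free (convert);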