Apologies for the big commit, but it wasn't really possible to split it
into anything smaller.
* Switch to uridecodebin3 instead of managing urisourcebin and decodebin3
ourselves. No major architectural change with this.
* Reconfigure sinks/outputs when needed. This is possible thanks to the
various streams-related APIs. Instead of blocking new pads and waiting
for a (fake) no-more-pads to decide what to connect, we reconfigure
playsink and the combiners to whatever stream types are currently selected.
All of this is done in reconfigure_output().
New pads are immediately connected to (combiners and) sinks, allowing
immediate negotiation and usage.
* Since elements are always connected, the "cached-duration" feature is gone
and queries can reach the target elements.
* The auto-plugging related code is currently disabled entirely until
we get the new proper API.
* Store collections at the GstSourceGroup level and not globally
* And more comments throughout
NOTE: gapless is still not functional, but this opens the way to be able
to handle it in a streams-aware fashion (where several uridecodebin3 can
be active at the same time).
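For context, here is a minimal sketch of the application-facing side of the
streams-aware model this relies on, using the existing GstStreamCollection
API. The URI and the printing are purely illustrative; this is not code from
the commit itself.

  #include <gst/gst.h>

  /* Sketch: print the streams announced by playbin3/uridecodebin3 via the
   * GST_MESSAGE_STREAM_COLLECTION bus message. */
  static gboolean
  bus_cb (GstBus * bus, GstMessage * msg, gpointer user_data)
  {
    if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_STREAM_COLLECTION) {
      GstStreamCollection *collection = NULL;
      guint i;

      gst_message_parse_stream_collection (msg, &collection);
      for (i = 0; i < gst_stream_collection_get_size (collection); i++) {
        GstStream *stream = gst_stream_collection_get_stream (collection, i);

        g_print ("stream %s (%s)\n", gst_stream_get_stream_id (stream),
            gst_stream_type_get_name (gst_stream_get_stream_type (stream)));
      }
      gst_object_unref (collection);
    }
    return TRUE;
  }

  int
  main (int argc, char **argv)
  {
    GstElement *playbin;
    GstBus *bus;

    gst_init (&argc, &argv);

    playbin = gst_element_factory_make ("playbin3", NULL);
    g_object_set (playbin, "uri", "file:///path/to/media", NULL);

    bus = gst_element_get_bus (playbin);
    gst_bus_add_watch (bus, bus_cb, NULL);
    gst_object_unref (bus);

    gst_element_set_state (playbin, GST_STATE_PLAYING);
    g_main_loop_run (g_main_loop_new (NULL, FALSE));

    return 0;
  }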
With push-based sources, urisourcebin will emit this signal when
the stream has been fully consumed.
This signal can be used to know when the source is done providing
data.
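A possible way for an application to watch for this is sketched below. Note
that the signal name used here ("about-to-finish") is an assumption for
illustration only, since the message above does not spell it out.

  #include <gst/gst.h>

  /* Sketch only: the "about-to-finish" signal name is assumed, not taken
   * from the commit message above. */
  static void
  on_source_done (GstElement * urisourcebin, gpointer user_data)
  {
    /* The push-based source has been fully consumed; e.g. this is where a
     * gapless-style app could queue the next URI. */
    g_print ("urisourcebin is done providing data\n");
  }

  static void
  watch_source (GstElement * urisourcebin)
  {
    g_signal_connect (urisourcebin, "about-to-finish",
        G_CALLBACK (on_source_done), NULL);
  }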
Even if the input is monoscopic, the app might want to display it in a
different layout, e.g. side-by-side for VR. So if the app changes the
output-multiview-mode, always use that.
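Sketch of the application side, assuming a sink that exposes an
"output-multiview-mode" property (glimagesink is used here only as an
example of such an element, not necessarily the one this change targets):

  #include <gst/gst.h>
  #include <gst/video/video.h>

  /* Sketch: ask for a side-by-side output layout even if the input is
   * monoscopic. Assumes the sink exposes "output-multiview-mode". */
  static void
  force_side_by_side (GstElement * sink)
  {
    g_object_set (sink, "output-multiview-mode",
        GST_VIDEO_MULTIVIEW_MODE_SIDE_BY_SIDE, NULL);
  }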
We can receive an element that is either floating or not, and need to
accommodate that in the signal return values. Do so by removing the
floating flag.
https://bugzilla.gnome.org/show_bug.cgi?id=792597
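The general GObject pattern this boils down to, as a hedged sketch (not the
literal patch):

  #include <gst/gst.h>

  /* Sketch: a signal handler can return either a floating or a full
   * reference to an element. Sink the floating flag so that both cases
   * look the same to the code using the element. */
  static void
  accept_signal_element (GstElement * element)
  {
    if (element == NULL)
      return;

    if (g_object_is_floating (element))
      gst_object_ref_sink (element);

    /* ... use the element, release with gst_object_unref () when done ... */
  }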
WARNING: Trying to compare values of different types (str, int).
The result of this is undefined and will become a hard error
in a future Meson release.
When the GstRTSPConnection class sends an RTSP over HTTP tunnelling
request, the HTTP Content-Type header is missing from the HTTP POST
request.
This isn't a problem with most servers, but some servers reject the
request if the Content-Type header is missing.
RFC 1945:
Any HTTP/1.0 message containing an entity body should include a
Content-Type header field defining the media type of that body.
Apple Dispatch 28:
QuickTime Streaming uses the "application/x-rtsp-tunnelled" MIME
type in both the Content-Type and Accept headers. This reflects
the data type that is expected and delivered by the client and server.
https://bugzilla.gnome.org/show_bug.cgi?id=793110
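Conceptually the fix amounts to something like the sketch below, using the
GstRTSPMessage API (illustrative only, not the literal patch):

  #include <gst/rtsp/gstrtspmessage.h>

  /* Sketch: make sure the HTTP POST used for RTSP-over-HTTP tunnelling
   * carries the MIME type that QuickTime-style servers expect. */
  static void
  add_tunnel_content_type (GstRTSPMessage * post_msg)
  {
    gst_rtsp_message_add_header (post_msg, GST_RTSP_HDR_CONTENT_TYPE,
        "application/x-rtsp-tunnelled");
  }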
It does not time out anymore, even though it's a very slow test. For
context, this test runs routines for a fixed amount of time and prints
the throughput, which means the test takes more time every time a pixel
format is added. If that becomes a problem again, we should disable the
benchmarks by default.
The source offset (soff) was not incremented for each component, and
each group of 3 components was inverted. This was causing a staircase
effect combined with some noise.
https://bugzilla.gnome.org/show_bug.cgi?id=789876
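Illustrative sketch of the corrected indexing; the names and exact bit
positions are assumptions, not copied from the real pack function:

  #include <gst/gst.h>

  /* Sketch: each 32-bit little-endian word carries three 10-bit
   * components. The source offset has to advance once per component and
   * the components must be written in source order, not reversed. */
  static void
  pack_10le32_components (guint32 * dest, const guint16 * src, guint n_words)
  {
    guint i, soff = 0;

    for (i = 0; i < n_words; i++) {
      guint32 word;

      word = (guint32) (src[soff] & 0x3ff);            /* bits  0-9  */
      word |= (guint32) (src[soff + 1] & 0x3ff) << 10; /* bits 10-19 */
      word |= (guint32) (src[soff + 2] & 0x3ff) << 20; /* bits 20-29 */
      soff += 3;

      GST_WRITE_UINT32_LE (dest + i, word);
    }
  }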
This adds a 10-bit variant of NV16 packed into 32-bit little-endian
words. The 2 MSBs are padding. This format is used on Xilinx SoCs and
is identified by the FOURCC XV20.
https://bugzilla.gnome.org/show_bug.cgi?id=789876
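For illustration, a sketch of how one such 32-bit word decomposes; the
component order within the word is an assumption, only the 2 padding MSBs
are stated above:

  #include <glib.h>

  /* Sketch: pull the three 10-bit components out of one 32-bit
   * little-endian word; bits 30-31 are padding. */
  static void
  unpack_10le32_word (guint32 word, guint16 * c0, guint16 * c1, guint16 * c2)
  {
    *c0 = word & 0x3ff;
    *c1 = (word >> 10) & 0x3ff;
    *c2 = (word >> 20) & 0x3ff;
  }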
This adds a 10-bit variant of grayscale packed into 32-bit little-endian
words. The 2 MSBs are padding and should be ignored. This format is
used on Xilinx SoCs and is identified by the FOURCC XV10.
https://bugzilla.gnome.org/show_bug.cgi?id=789876
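The same packing idea applies to the grayscale variant, sketched here with
an assumed sample ordering and ignoring any extra row alignment:

  #include <glib.h>

  /* Sketch: three 10-bit grey samples per 32-bit little-endian word,
   * bits 30-31 are padding, so a row of `width' pixels needs roughly
   * (width + 2) / 3 words. */
  static guint32
  pack_gray10_le32 (guint16 g0, guint16 g1, guint16 g2)
  {
    return (guint32) (g0 & 0x3ff)
        | ((guint32) (g1 & 0x3ff) << 10)
        | ((guint32) (g2 & 0x3ff) << 20);
  }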