Intel drivers expose, for some color balance channels, maximum values that
are much larger than their minimum values relative to their middle (default)
values. In practice this means that the real middle point between the
maximum and minimum values implies a major change in the color balance,
which is not what the GStreamer color balance logic expects.
This patch makes the reported maximum value symmetrical to the minimum
value around the middle (default) one.
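A minimal sketch of the idea (illustrative names, not the actual element code):

```
#include <glib.h>

/* Sketch: clamp the driver-reported maximum so that it is symmetrical
 * to the minimum around the default (middle) value. */
static gint
symmetric_maximum (gint min, gint max, gint def)
{
  gint sym_max = def + (def - min);

  return MIN (max, sym_max);
}
```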
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1580>
It was getting pulled in automatically via cairo, but the version
there is too old for us. We need the latest to fix Windows ARM64 NEON
support:
> ERROR: No specified compiler can handle file subprojects\libpng-1.6.37\arm/filter_neon.S
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1570>
As of now, we warn if the decoder has no supported src pixel format, but that
warning is emitted before the type (and hence the debug category) is initialized.
Fix this by moving the debug category init out of the type initialization and
into the register functions.
This fixes an assertion that occurs in the register function and allows the
relevant log to be seen by users.
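For illustration, the debug category can be initialised in the register
function instead of the type initialization, roughly like this (element and
category names are placeholders):

```
#include <gst/gst.h>

GST_DEBUG_CATEGORY_STATIC (my_decoder_debug);

GType my_decoder_get_type (void);

/* Hypothetical register function: the category is initialised here, so any
 * warning emitted while probing supported src pixel formats is visible. */
static gboolean
my_decoder_register (GstPlugin * plugin)
{
  GST_DEBUG_CATEGORY_INIT (my_decoder_debug, "mydecoder", 0, "my decoder");

  /* ... probe supported src pixel formats, possibly warn ... */

  return gst_element_register (plugin, "mydecoder", GST_RANK_NONE,
      my_decoder_get_type ());
}
```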
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1588>
Those will cause us to renegotiate at the next aggregate cycle,
and while at that point we may decide to reconfigure upstream
branches (in practice we don't as this is inherently racy,
and that's the reason why mixer subclasses perform conversion
internally), we certainly don't want to just forward the event
willy-nilly to all our sinkpads.
An actual issue this is fixing: when caps downstream of a compositor are
changed at every samples-selected signal emission (for the purpose of
interpolating the output geometry) and the compositor has a non-zero latency,
the reconfigure events were forwarded to basesrc, which triggered an
allocation query, which in turn caused aggregator to have to drain (thus not
being able to queue <latency> frames), leading to disastrous effects:
choppy output as the compositor couldn't consume frames fast enough, and the
higher the latency, the choppier the output.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1464>
The merge request workflow can be confusing to people unfamiliar with
it, so add screenshots.
Also add a new section on how to revise merge requests, since a lot of
people tend to open new merge requests to make any changes.
Eliminate the separate "How to Prepare a Merge Request for Submission"
section -- merge it with the main section.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1367>
This is usually necessary to allow gst-indent to treat it as
a statement, but we do not run gst-indent on headers, and we do not
have extra semicolons in other places where this macro is used in
headers. Fixes warnings when using the header:
```
In file included from gstreamer/subprojects/gst-plugins-base/gst-libs/gst/video/video.h:185,
from XYZ:9001:
gstreamer/subprojects/gst-plugins-base/gst-libs/gst/video/gstvideoaggregator.h:206:78: warning: ISO C does not allow extra ‘;’ outside of a function [-Wpedantic]
206 | G_DEFINE_AUTOPTR_CLEANUP_FUNC(GstVideoAggregatorConvertPad, gst_object_unref);
| ^
gstreamer/subprojects/gst-plugins-base/gst-libs/gst/video/gstvideoaggregator.h:214:181: warning: ISO C does not allow extra ‘;’ outside of a function [-Wpedantic]
214 | G_DECLARE_DERIVABLE_TYPE (GstVideoAggregatorParallelConvertPad, gst_video_aggregator_parallel_convert_pad, GST, VIDEO_AGGREGATOR_PARALLEL_CONVERT_PAD, GstVideoAggregatorConvertPad);
| ^
```
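After the change, the macro in the header stands on its own without a
trailing semicolon, e.g.:

```
G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVideoAggregatorConvertPad, gst_object_unref)
```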
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1572>
When fixating color, there might be "other caps" with color spaces not
supported by the caps features exposed in vapostproc's source pad caps
template (perhaps it's a bug somewhere else in GStreamer).
This solution checks whether the proposed format is supported by the filter
for the caps feature associated with that format.
The check is done with the filter's new function
gst_va_filter_has_video_format().
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1559>
In the need-data appsrc callback, a buffer is pulled from the
appsink. This buffer is then copied so that metadata is writable.
The copy is pushed to the appsrc but it doesn't take ownership
of the buffer so we need to manually unref it. The original buffer
is finally unreffed when the sample is freed.
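The ownership pattern looks roughly like this (a sketch, assuming the buffer
is pushed through the "push-buffer" action signal, which does not take
ownership):

```
#include <gst/gst.h>

/* Sketch of the need-data callback's ownership handling. */
static void
on_need_data (GstElement * appsrc, guint length, gpointer user_data)
{
  GstElement *appsink = user_data;
  GstSample *sample = NULL;
  GstBuffer *buffer, *copy;
  GstFlowReturn ret;

  g_signal_emit_by_name (appsink, "pull-sample", &sample);
  if (sample == NULL)
    return;

  buffer = gst_sample_get_buffer (sample);
  copy = gst_buffer_copy (buffer);      /* copy so metadata is writable */

  /* The "push-buffer" action signal does not take ownership of the copy,
   * so unref it manually. */
  g_signal_emit_by_name (appsrc, "push-buffer", copy, &ret);
  gst_buffer_unref (copy);

  /* The original buffer is owned by the sample and freed along with it. */
  gst_sample_unref (sample);
}
```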
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1548>
This is due to an unsafe usage of the pad task: we didn't ensure proper
ownership of the task. The race involved the task being released too early,
and was luckily detected by the GLib mutex implementation, which
reported the mutex being disposed while still locked.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1478>
On Android (especially) and for static builds in general it is safer to link
against libsoup and have the dynamic custom loading disabled. For those cases we
can safely assume the application will use either libsoup2 or libsoup3 and not
both.
Fixes #939
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1536>
The earlier size of 2 MB was set back in 2009; it doesn't
seem unreasonable to raise it to 8 MB these days. The use
case at hand is matroskademux handling a file that contains both
a video stream with a very low amount of compression but no decoding
latency, and an H265 stream.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1538>
It is an extremely common mistake on Windows to have incorrect PATH
values when loading a plugin, and the error from g_module_error()
(which just calls FormatMessageW()) is very confusing in this case:
> The specified module could not be found.
https://docs.microsoft.com/en-us/windows/win32/debug/system-error-codes--0-499-#ERROR_MOD_NOT_FOUND
It implies the plugin itself could not be found. The actual issue is
that a DLL dependency could not be found. We need to detect this case
and print a more useful error message.
We should still print the error fetched from FormatMessage() so that
people are able to google for it.
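One way such a check could look (an illustrative sketch, not necessarily the
code that landed): if the plugin file itself exists but g_module_open()
failed, the missing module is almost certainly a DLL dependency, so add a
hint on top of the FormatMessage() text.

```
#include <glib.h>
#include <gmodule.h>

/* Sketch of the detection described above. */
static GModule *
load_plugin_with_hint (const gchar * filename)
{
  GModule *module = g_module_open (filename, G_MODULE_BIND_LOCAL);

  if (module == NULL) {
    if (g_file_test (filename, G_FILE_TEST_EXISTS)) {
      /* The file is there, so "module could not be found" most likely
       * refers to a missing DLL dependency, e.g. one not present in PATH. */
      g_warning ("Failed to load plugin '%s': %s. The plugin file exists, "
          "so a DLL dependency is probably missing from PATH.",
          filename, g_module_error ());
    } else {
      g_warning ("Failed to load plugin '%s': %s", filename,
          g_module_error ());
    }
  }

  return module;
}
```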
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1540>
GstAudioRingBufferSpec::segsize has been configured using the
device period, but GstWasapi2RingBuffer was referencing the
buffer size returned by IAudioClient::GetBufferSize(),
which is most likely larger than the device period.
Fix it so the two are in sync.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1533>
If the `area_surface` got unmapped when changing to the `READY` or
`NULL` state, we currently don't remap it when playback resumes and
`wp_viewporter` is supported. Without `wp_viewporter` we do remap
it, but rather unintentionally and also when not wanted.
On Weston this has not been a big problem, as it so far wrongly maps
subsurfaces of unmapped surfaces anyway - i.e. only the black
background was missing on resume. On other compositors and future
Weston versions this prevents the `video_surface` from getting remapped.
Shuffle things around to ensure `area_surface` is mapped in the
right situations and do some minor cleanup.
See also https://gitlab.freedesktop.org/wayland/weston/-/issues/426
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1483>
Latest ffmpeg has removed avcodec_get_context_defaults(), and its
documentation says a new AVCodecContext should be allocated for this
purpose. The pointer returned by avcodec_find_decoder() is now
const-qualified so we also need to adjust for it. And, AVCOL_RANGE_MPEG
is now rejected with strict_std_compliance > FF_COMPLIANCE_UNOFFICIAL.
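In terms of the FFmpeg API, the adjustment looks roughly like this (a sketch,
not the exact gst-libav code):

```
#include <libavcodec/avcodec.h>

/* Sketch: allocate a fresh context instead of re-defaulting an old one,
 * and keep the decoder pointer const as newer FFmpeg requires. */
static AVCodecContext *
make_decoder_context (enum AVCodecID id)
{
  const AVCodec *codec = avcodec_find_decoder (id);

  if (codec == NULL)
    return NULL;

  return avcodec_alloc_context3 (codec);
}
```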
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1531>
The latter, doing damage in surface coordinates instead of buffer
coordinates, has been deprecated. The reason for that is that it
is more prone to bugs, both on the client and the compositor side,
especially when paired with buffer scale, `wp_viewporter` or
buffer transforms.
Unfortunately, on Weston this risks running into
https://gitlab.freedesktop.org/wayland/weston/-/issues/446
(which causes trouble for several other projects as well). However,
that bug only affects cases where we run in sync mode, i.e. only
during resizes. In practice I haven't been able to observe the
issue.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1446>
Each time we call `wl_surface_damage()` we want to do full surface
damage. Like Mesa, just use `G_MAXINT32` to ensure we always do
full damage, reducing the need to track the right dimensions.
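A sketch of what that amounts to (not the exact gstwayland code):

```
#include <glib.h>
#include <wayland-client-protocol.h>

/* Sketch: always request full surface damage, whatever the actual size. */
static void
damage_whole_surface (struct wl_surface *surface)
{
  wl_surface_damage (surface, 0, 0, G_MAXINT32, G_MAXINT32);
}
```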
`window->video_rectangle` is now unused, but we keep it around for
now as we may need it again in the future.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1446>
From the spec:
> This request is used to describe the regions where the pending
> buffer is different from the current surface contents
We currently also call `wl_surface_damage()` on surfaces without
new buffers, or with buffers the compositor still holds, e.g. when resizing
the window. In that case we call it on `area_surface_wrapper`, even though it
gets resized via `wp_viewport_set_destination()`, in which case
the compositor is in charge of repainting the area on screen.
Doing so is currently not forbidden by the spec, however it might
be in the future, see
https://gitlab.freedesktop.org/wayland/wayland/-/issues/267
Thus let's stay close to the spec and only call `wl_surface_damage()`
when we have just attached a buffer.
Right now this prevents runtime assertions in Mutter.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1446>
`gst_wl_window_set_opaque` does not get called on window resizes,
potentially leaving opaque regions too small.
According to the spec opaque regions can be bigger than the surface
size - parts that fall outside of the surface will get ignored.
Thus we can simply use `G_MAXINT32` and be sure that the whole
surface will always be covered.
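A sketch of what this amounts to with the core Wayland API (not the exact
gstwayland code):

```
#include <glib.h>
#include <wayland-client-protocol.h>

/* Sketch: mark the whole surface opaque; regions larger than the surface
 * are allowed by the spec, the excess is simply ignored. */
static void
set_fully_opaque (struct wl_compositor *compositor,
    struct wl_surface *surface)
{
  struct wl_region *region = wl_compositor_create_region (compositor);

  wl_region_add (region, 0, 0, G_MAXINT32, G_MAXINT32);
  wl_surface_set_opaque_region (surface, region);
  wl_region_destroy (region);
}
```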
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1446>
If the VANC track does contain packets, but we skip over all packets, just
treat it the same as if there hadn't been any packets at all and send a
GAP event instead of erroring out with "Failed to handle essence element".
We would error out because, when we reach the end of the loop without having
found a closed caption packet, the flow return variable is still FLOW_ERROR,
which is what it was initialised to.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1518>
Although profiles[0] is initialised to GST_H265_PROFILE_INVALID in
gst_h265_profile_tier_level_get_profile(), the profile detection may
change its contents later. So the returned profiles[0] may not be an
invalid profile even when len is 0.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1517>
The previous code was mistakenly trying to compute a cc_type out
of the first byte in the byte triplet, whereas it is to be interpreted
as:
> Bit b7 of the LINE value is the field number (0 for field 2; 1 for field 1).
> Bits b6 and b5 are 0. Bits b4-b0 form a 5-bit unsigned integer which
> represents the offset
The same mistake was made when creating padding packets.
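In other words, the byte is to be decoded along these lines (a sketch derived
from the quoted spec text):

```
#include <glib.h>

/* Sketch: decode a LINE value byte as described above. */
static void
parse_line_value (guint8 line_value, guint * field, guint * offset)
{
  *field = (line_value >> 7) & 0x01;    /* b7: 0 = field 2, 1 = field 1 */
  /* b6 and b5 are 0 */
  *offset = line_value & 0x1f;          /* b4-b0: 5-bit unsigned offset */
}
```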
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1496>
Sometimes we can't output anything because we don't have enough
incoming frames. In that case, the resampler was trying to call
do_quantize() and do_resample() in a loop forever because there would
never be samples to output (so chain->samples would always be NULL).
Fix this by not calling chain->make_func() in a loop -- seems
completely unnecessary since calling it over and over won't change
anything if the make_func() can't output samples.
Also add some checks for the input and/or output being NULL when
doing conversion or quantization. This will happen when we have
nothing to output.
We can't bail early, because we need resampler->samples_avail to be
updated in gst_audio_resampler_resample(), so we must call that and
no-op everything along the way.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1461>
When the image is opaque but the output ProRes format has an alpha
component (4 components, 32 bits per pixel), Apple requires that we
signal that it should be ignored by setting the depth to 24 bits per
pixel. Not doing so causes the encoded files to fail validation.
So we set that in the caps and qtmux sets the depth value in the
container, which will be read by demuxers so that decoders can skip
those bytes entirely. qtdemux does this, but vtdec does not use this
information at present.
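On the caps side, that roughly amounts to something like the following
(illustrative sketch; the exact field handling lives in vtenc/qtmux):

```
#include <gst/gst.h>

/* Sketch: when the image is opaque but the ProRes format carries alpha,
 * advertise a depth of 24 bpp so the muxer can signal "ignore alpha". */
static void
set_opaque_depth (GstCaps * caps, gboolean image_is_opaque)
{
  if (image_is_opaque)
    gst_caps_set_simple (caps, "depth", G_TYPE_INT, 24, NULL);
}
```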
The sister change was made in qtmux and qtdemux in:
https://gitlab.freedesktop.org/gstreamer/gst-plugins-good/-/merge_requests/1061
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1489>
The old "application/*" mime types are now, as per RFC 8081, deprecated in
favor of new "font/*" mime types. Some new encoders are already using the
updated mime types. We need to also add them to the supported list
in order for assrender to correctly identify them as fonts.
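For reference, the RFC 8081 types in question are along these lines (list
assumed from the RFC, not copied from the patch):

```
#include <glib.h>

/* RFC 8081 font mime types that should also be recognised as fonts
 * (assumed list, for illustration). */
static const gchar *font_mimetypes[] = {
  "font/ttf",
  "font/otf",
  "font/sfnt",
  "font/collection",
  "font/woff",
  "font/woff2",
};
```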
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1481>
If there are no jitterbuffer stats, we should not attempt to store them in the
global stats structure.
Also add a g_return_if_fail in _gst_structure_take_structure() about this,
because it is a programmer error to pass an invalid pointer address there.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1479>
Add gst_dep to gst_rtsp_server_deps. In the context of buildroot, this
will avoid the following build failure, because the correct girdir
location will be retrieved from gstreamer-1.0.pc:
```
/home/giuliobenetti/autobuild/run/instance-3/output-1/host/riscv32-buildroot-linux-gnu/sysroot/usr/bin/g-ir-compiler gst/rtsp-server/GstRtspServer-1.0.gir --output gst/rtsp-server/GstRtspServer-1.0.typelib --includedir=/usr/share/gir-1.0
Could not find GIR file 'Gst-1.0.gir'; check XDG_DATA_DIRS or use --includedir
error parsing file gst/rtsp-server/GstRtspServer-1.0.gir: Failed to parse included gir Gst-1.0
If the above error message is about missing .so libraries, then setting up GIR_EXTRA_LIBS_PATH in the .mk file should help.
Typically like this: PKG_MAKE_ENV += GIR_EXTRA_LIBS_PATH="$(@D)/.libs"
```
Fixes:
- http://autobuild.buildroot.org/results/04af6b22cfa0cffb6a3109a3b32b27137ad2e0b0
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1460>
Prior to this patch, we considered that a stream was blocking
whenever a pad probe was triggered for either the RTP pad or
the RTCP pad.
This led to situations where we subsequently unblocked and expected
to find a segment on the RTP pad, which was racy.
Instead, we now only consider that the stream is blocking when
the pad probe for the RTP pad has triggered with a blockable object
(buffer, buffer list, gap event).
The RTCP pad is simply blocked without affecting the state of the
stream otherwise.
Fixes#929
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1452>
Instead of a sequence of if statements, declare a table mapping profile
idc to profiles and traverse it.
Also, first add the profile from the parsed profile idc and only later add,
into the profile array, the profile from the compatibility flags.
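The mapping table could look roughly like this (profile names and idc values
per the HEVC spec; the actual table in the patch may differ):

```
#include <gst/codecparsers/gsth265parser.h>

/* Sketch of a profile_idc -> GstH265Profile mapping table. */
static const struct
{
  guint8 idc;
  GstH265Profile profile;
} profile_idc_map[] = {
  {1, GST_H265_PROFILE_MAIN},
  {2, GST_H265_PROFILE_MAIN_10},
  {3, GST_H265_PROFILE_MAIN_STILL_PICTURE},
};

static GstH265Profile
profile_from_idc (guint8 profile_idc)
{
  guint i;

  for (i = 0; i < G_N_ELEMENTS (profile_idc_map); i++) {
    if (profile_idc_map[i].idc == profile_idc)
      return profile_idc_map[i].profile;
  }

  return GST_H265_PROFILE_INVALID;
}
```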
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1440>
It's possible for an HEVC stream to have multiple profiles, given the
compatibility bits. Instead of returning a single profile, the internal
gst_h265_profile_tier_level_get_profiles() returns an array with all
its possible profiles.
Profiles are appended to the array only if the generated profile
is not invalid.
gst_h265_profile_tier_level_get_profile() is rewritten in terms of
gst_h265_profile_tier_level_get_profiles(), returning the first
profile found in the array.
And gst_h265_get_profile_from_sps() is also rewritten in terms of
gst_h265_profile_tier_level_get_profiles(), but traversing the array and
verifying whether the proposed profile is actually valid according to
Annex A.3.x of the specification.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1440>
BT.2020 color primaries are designed to cover a much wider range of
the CIE chromaticity diagram than BT.709, and they are used for both SDR
and HDR content. So the incorrect assumption (i.e., treating BT.709 as
BT.2020) is risky, and the resulting image colors tend to be visually very
wrong. Unless there's an obvious clue, don't consider the color space of a
high resolution video stream to be BT.2020.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1445>
* Add fec / red encoders as direct children of webrtcbin, instead
of providing them to rtpbin through the request-fec-encoder signal.
That is because they need to be placed before the rtpfunnel, which
is placed upstream of rtpbin.
* Update configuration of red decoders to set a list of RED payloads
on them, instead of setting the pt property.
That is because there may be one RED pt per media in the same session.
* Connect to request-fec-decoder-full instead of request-fec-decoder,
in order to instantiate FEC decoders according to the payload type
of the stream.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1429>
When multiple streams are bundled together, there may be more
than one red payload type to handle.
In addition, as the red decoder works by filling in gaps in
the seqnums, there needs to be one rtp_history queue per sequence
domain.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1429>
In redenc, when input buffers have a header for the TWCC extension,
we now add one to our wrapper buffers.
In ulpfecenc we add one in that case to our protection buffers.
This makes TWCC functional when UlpRed is used in webrtcbin.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1414>
In order to allow "level-asymmetry-allowed" we now handle a new
"profile" field, which has the same semantics as the "profile" field in
an H.264 stream, so that we can force the payloaded stream to have the right
format when using `gst_sdp_media_get_caps_from_media` to set a caps
filter after the payloader. This allows a simple negotiation in standard
SDP-based RTP negotiation (like WebRTC) for that particular case,
closely respecting the specs.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1410>
The ["level-asymmetry-allowed"] field states that the peer wants the
profile specified in the "profile-level-id" field but doesn't care
about the level. To express this in GStreamer caps terms, we add a
"profile" field to the caps, which reuses the usual "profile" semantics
for H.264 streams, and remove the "profile-level-id" and
"level-asymmetry-allowed" fields.
["level-asymmetry-allowed"]: https://www.iana.org/assignments/media-types/video/H264
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1410>
We are querying the supported swapchain colorspaces via
CheckColorSpaceSupport(), but it doesn't seem to be reliable.
Use only tested full-range RGB formats, which are:
- sRGB
- BT709 primaries with linear RGB
- BT2020 primaries with PQ gamma
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1433>