During video session memory allocation, the reported property flags
can differ from the expected ones, so do not require all the expected
property flags; test against G_MAXUINT32 instead.
This was failing with driver 525.47.26 on NVIDIA GeForce RTX 3050 and
2060 hardware.
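As an illustration, a minimal sketch with a hypothetical helper (not
the actual GstVulkan code): matching the reported property flags
against G_MAXUINT32 accepts whatever flags the driver exposes, instead
of requiring an exact combination.
```c
#include <glib.h>
#include <vulkan/vulkan.h>

/* Pick a memory type allowed by type_bits whose property flags
 * intersect the wanted mask. Passing G_MAXUINT32 as "wanted" matches
 * whatever property flags the driver reports. */
static gboolean
find_memory_type_index (VkPhysicalDevice gpu, guint32 type_bits,
    VkMemoryPropertyFlags wanted, guint32 * index)
{
  VkPhysicalDeviceMemoryProperties props;
  guint32 i;

  vkGetPhysicalDeviceMemoryProperties (gpu, &props);

  for (i = 0; i < props.memoryTypeCount; i++) {
    if ((type_bits & (1u << i)) &&
        (props.memoryTypes[i].propertyFlags & wanted) != 0) {
      *index = i;
      return TRUE;
    }
  }

  return FALSE;
}

/* e.g. find_memory_type_index (gpu, reqs.memoryTypeBits, G_MAXUINT32, &idx) */
```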
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4850>
Take into account the case where user filters have been set before
the source gets updated.
Note that the further linking of the filters, if present, happens below
in the `gst_camera_bin_check_and_replace_filter()` calls.
The audio filter is still affected by the same issue but left out for
now.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5527>
Even if IDXGIOutput6 reports that the current display colorspace is
HDR, the texture captured via IDXGIOutputDuplication::AcquireNextFrame()
is a frame already converted by the OS, unless we use
IDXGIOutput5::DuplicateOutput1() with the
DXGI_FORMAT_R16G16B16A16_FLOAT format so that the captured frame is in
the scRGB color space. In that case the application should perform a
tonemapping operation based on the reported display white level, color
primaries, etc.
Since we don't have any tonemapping implementation, ignore the
colorimetry reported by IDXGIOutput6.
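For reference, a minimal sketch of the FP16 duplication path (function
and variable names are illustrative, not the element's actual code):
```c
#define COBJMACROS
#include <d3d11.h>
#include <dxgi1_5.h>

/* Requesting DXGI_FORMAT_R16G16B16A16_FLOAT via DuplicateOutput1()
 * makes AcquireNextFrame() hand out scRGB frames instead of frames
 * already converted by the OS; the caller would then be responsible
 * for tonemapping. */
static HRESULT
duplicate_output_fp16 (IDXGIOutput5 * output, ID3D11Device * device,
    IDXGIOutputDuplication ** dupl)
{
  static const DXGI_FORMAT formats[] = { DXGI_FORMAT_R16G16B16A16_FLOAT };

  return IDXGIOutput5_DuplicateOutput1 (output, (IUnknown *) device,
      0, 1, formats, dupl);
}
```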
Fixes: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3128
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5671>
This is how it was documented and how it worked before the port to GstPlay.
Without this, applications expecting signals to be emitted directly
without anything running the main context will simply not receive any
signals.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5672>
The d2d runtime seems to execute the pending GPU command list when a
DXGI-backed ID2D1RenderTarget is released, and that will invoke d3d11
immediate context APIs. Protect all rendering operations and DXGI
resources with the device lock.
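A minimal sketch of the locking idea (not the actual patch); the D2D
object is passed as IUnknown just to keep the example header-light:
```c
#define COBJMACROS
#include <unknwn.h>
#include <gst/d3d11/gstd3d11.h>

/* Releasing the ID2D1RenderTarget may flush a pending GPU command list
 * through the d3d11 immediate context, which is not thread safe, so
 * serialize it with the rest of the rendering code via the device lock. */
static void
release_under_device_lock (GstD3D11Device * device, IUnknown * d2d_target)
{
  gst_d3d11_device_lock (device);
  IUnknown_Release (d2d_target);
  gst_d3d11_device_unlock (device);
}
```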
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5659>
The code seems to validate that the media-level fingerprint matches
the fingerprint of the previous media or of the whole session. There
is no such requirement in any RFC I found. The session-level one is
just meant to act as a fallback when there is no media-level
fingerprint.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1118>
This commit corrects the mapping between the RGB and BGR formats in
GStreamer and DRM. The previous mapping was incorrect, causing
potential color mismatches in the output. DRM/wl_shm formats are
defined by their little-endian packed order, so DRM_FORMAT_RGB888
stores the bytes B, G, R in memory, which corresponds to
GST_VIDEO_FORMAT_BGR (and vice versa).
The changes are as follows:
{WL_SHM_FORMAT_RGB888, DRM_FORMAT_RGB888, GST_VIDEO_FORMAT_BGR},
{WL_SHM_FORMAT_BGR888, DRM_FORMAT_BGR888, GST_VIDEO_FORMAT_RGB},
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5620>
After talking with Vivia on IRC: she does not remember why the default
was FALSE, and in my opinion it is preferable to default to whatever
representation best represents time for a given framerate.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5628>
For encoders and filters, when importing a DMABuf, use
GstVideoInfoDmaDrm to get the DRM fourcc and modifier.
In both cases, instead of keeping the original GstVideoInfoDmaDrm from
the caps, the GstVideoInfo part of the structure is converted to the
canonical one, given the format derived from the fourcc. It is kept
that way to handle V4L2 linear DMABufs and to avoid too many changes
in the current code.
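A hedged sketch of the canonicalization step described above (helper
name and exact flow are illustrative):
```c
#include <gst/video/video.h>

/* Parse the DMA DRM caps, then rewrite the embedded GstVideoInfo with
 * the canonical format that corresponds to the DRM fourcc, keeping the
 * original dimensions. */
static gboolean
import_dma_drm_info (const GstCaps * caps, GstVideoInfoDmaDrm * drm_info)
{
  GstVideoFormat format;

  if (!gst_video_info_dma_drm_from_caps (drm_info, caps))
    return FALSE;

  format = gst_video_dma_drm_fourcc_to_format (drm_info->drm_fourcc);
  if (format == GST_VIDEO_FORMAT_UNKNOWN)
    return FALSE;

  return gst_video_info_set_format (&drm_info->vinfo, format,
      GST_VIDEO_INFO_WIDTH (&drm_info->vinfo),
      GST_VIDEO_INFO_HEIGHT (&drm_info->vinfo));
}
```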
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5264>
Instead of guessing the DRM format and modifier, pass a DRM video info
to gst_va_dmabuf_memories_setup().
It still checks the DRM parameters in the DRM info; if they are not
available, as in the case of V4L2 buffers, the GstVideoInfo part of
the video info is used instead.
This is an API break, but since the library is still in an unstable
stage, it is still allowed.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5264>
To import DMABufs we used VASurfaceAttribExternalBuffers, which works,
but it is not specific to DRM PRIME 2 and lacks a lot of metadata.
This patch replaces VASurfaceAttribExternalBuffers with
VADRMPRIMESurfaceDescriptor in va_create_surfaces().
Still, this patch assumes the linear modifier only.
The hack for RGB surfaces in the i965 driver was pushed down into
va_create_surfaces() to avoid handling both structures.
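For illustration, a hedged sketch of a single-object, single-plane
linear import with VADRMPRIMESurfaceDescriptor (names and the
one-plane layout are assumptions, not the actual va_create_surfaces()
code):
```c
#include <glib.h>
#include <va/va.h>
#include <va/va_drmcommon.h>
#include <drm_fourcc.h>

static VAStatus
import_linear_dmabuf (VADisplay dpy, int fd, guint32 drm_fourcc,
    guint32 rt_format, guint width, guint height, guint size,
    guint offset, guint pitch, VASurfaceID * surface)
{
  VADRMPRIMESurfaceDescriptor desc = { 0, };
  VASurfaceAttrib attrs[2];

  desc.fourcc = drm_fourcc;
  desc.width = width;
  desc.height = height;
  desc.num_objects = 1;
  desc.objects[0].fd = fd;
  desc.objects[0].size = size;
  desc.objects[0].drm_format_modifier = DRM_FORMAT_MOD_LINEAR;
  desc.num_layers = 1;
  desc.layers[0].drm_format = drm_fourcc;
  desc.layers[0].num_planes = 1;
  desc.layers[0].object_index[0] = 0;
  desc.layers[0].offset[0] = offset;
  desc.layers[0].pitch[0] = pitch;

  attrs[0].type = VASurfaceAttribMemoryType;
  attrs[0].flags = VA_SURFACE_ATTRIB_SETTABLE;
  attrs[0].value.type = VAGenericValueTypeInteger;
  attrs[0].value.value.i = VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME_2;

  attrs[1].type = VASurfaceAttribExternalBufferDescriptor;
  attrs[1].flags = VA_SURFACE_ATTRIB_SETTABLE;
  attrs[1].value.type = VAGenericValueTypePointer;
  attrs[1].value.value.p = &desc;

  /* e.g. rt_format = VA_RT_FORMAT_YUV420 for NV12/I420 surfaces */
  return vaCreateSurfaces (dpy, rt_format, width, height, surface, 1,
      attrs, 2);
}
```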
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5264>
Should fix auto-generated follow-up sections like "Hierarchy" or
"Factory details" so they are listed under the element name in the
document's table of contents, instead of under a stand-alone
"Duplex-Mode" section.
Also clean up some spurious colon suffixes after section names.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5625>
Now that nvidia-vaapi-driver has appeared and isn't yet supported by
GstVA, we have to add an allow-list of supported drivers.
This patch implements it, adding an environment variable to disable
this driver check: GST_VA_ALL_DRIVERS
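For example, to run with a driver that is not in the allow-list
(assuming, as with other switches of this kind, that setting the
variable to 1 enables the bypass):
```
GST_VA_ALL_DRIVERS=1 gst-inspect-1.0 va
```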
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5616>
Suppose you have a project where GStreamer and wayland-protocols are
pulled in as dependencies via .wrap files. In that case, Meson's setup
step will fail for gst-plugins-bad with the message "Sandbox violation:
Tried to grab file viewporter.xml outside current (sub)project." To
avoid this exception, one should use Meson's `files` and `join_paths`
functions. The suggested solution is identical to how GTK 4 processes
Wayland files.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5593>
A payload of 0x80 0x80 means that it's padding. It's not a good idea to
throw this away though, because of the cc_valid field.
According to CEA 10-B section 25.2.1, if cc_valid is zero, the run-in
clock and start bit should not be generated. In practice, this means
that any closed captions will be erased and the end-user TV will show
that captions are not available for this stream. This might have
undesired consequences, e.g. a long line of captions that was just
shown gets disabled before the user has had time to read it, or closed
captions cannot be enabled during silence/music intervals.
We cannot reliably detect whether there's a currently-silent closed
caption stream or just nothing, but we have this information coming from
upstream, so we can at least not discard it.
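A hedged sketch of the resulting behaviour (illustrative only; the bit
layout follows the CEA-708 cc_data() syntax):
```c
#include <glib.h>

/* Decide whether to forward a cc_data construct. Even when the 608
 * byte pair is the 0x80 0x80 padding value, the cc_valid bit (bit 2 of
 * the first byte) still tells downstream whether a caption service is
 * present, so the padding is kept rather than discarded. */
static gboolean
should_forward_cc_triplet (guint8 byte0, guint8 cc_data_1, guint8 cc_data_2)
{
  gboolean cc_valid = (byte0 & 0x04) != 0;

  if (cc_data_1 == 0x80 && cc_data_2 == 0x80)
    return cc_valid;

  return TRUE;
}
```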
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5508>
When decoding stream using hardware V4L2 decoder element, in any of the
currently supported formats, the decoding will fail once frame number
1000000 is reached. The reported error clearly indicates that a
wrap-around occurred: instead of receiving decoded frame 1000000,
frame 0 is received from the hardware V4L2 decoder driver.
The problem is actually not in the driver itself, but rather in GStreamer,
which uses `struct v4l2_buffer` member `.timestamp` in a special way. The
timestamp of buffers with encoded data added to the SINK (input) queue of
the driver is copied by the driver into matching buffers with decoded data
added to the SOURCE (output) queue of the driver. In fact, the timestamp
is not a timestamp at all, but rather in this special case, only part of
it is used as an incrementing frame counter.
The `.timestamp` is of type `struct timeval`, which is defined in
`sys/time.h` [1]. Only the `tv_usec` member of this structure is used
for the incrementing frame counter. However, `suseconds_t tv_usec` [2]
may be limited to the range [-1, 1000000]:
"
[XSI] The type suseconds_t shall be a signed integer type capable of
storing values at least in the range [-1, 1000000].
"
Therefore, once frame 1000000 is reached, a rollover occurs and decoding
fails.
Fix this by using both `struct timeval` members, `.tv_sec` and
`.tv_usec`, with matching modular arithmetic; this way the failure
would only occur again just short of 2^84 frames, which should be
plenty.
[1] https://pubs.opengroup.org/onlinepubs/9699919799/basedefs/sys_time.h.html
[2] https://pubs.opengroup.org/onlinepubs/9699919799/basedefs/sys_types.h.html
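A sketch of the frame-counter encoding this implies (illustrative, not
the exact patch):
```c
#include <glib.h>
#include <sys/time.h>

/* Encode a 64-bit frame counter into both timeval members so that the
 * counter no longer wraps at 1000000; tv_usec stays within [0, 999999]. */
static void
frame_to_timeval (guint64 frame, struct timeval *tv)
{
  tv->tv_sec = frame / 1000000;
  tv->tv_usec = frame % 1000000;
}

static guint64
timeval_to_frame (const struct timeval *tv)
{
  return (guint64) tv->tv_sec * 1000000 + tv->tv_usec;
}
```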
Test case using the stateless hardware H.264 decoder; the WARN/ERROR
output in the GStreamer log indicates that a failure occurred. With
this change, the error no longer occurs and the WARN/ERROR messages
are not present:
```
pc$ gst-launch-1.0 videotestsrc num-buffers=1001001 pattern=6 ! \
video/x-raw,width=16,height=16,format=I420 ! \
x264enc ! filesink location=/tmp/test.h264
dut$ GST_DEBUG="*:3" gst-launch-1.0 filesrc location=/tmp/test.h264 ! \
h264parse ! v4l2slh264dec ! fakesink
...
0:03:51.393677606 12111 0x370df400 WARN \
v4l2codecs-decoder gstv4l2decoder.c:1157:gst_v4l2_request_set_done:<v4l2decoder2> \
Requested frame 1000000, but driver returned frame 0.
0:03:51.394140597 12111 0x370df400 WARN \
v4l2codecs-decoder gstv4l2decoder.c:1157:gst_v4l2_request_set_done:<v4l2decoder2> \
Requested frame 1000001, but driver returned frame 1.
0:03:51.394425216 12111 0x370df400 WARN \
v4l2codecs-decoder gstv4l2decoder.c:1157:gst_v4l2_request_set_done:<v4l2decoder2> \
Requested frame 1000002, but driver returned frame 2.
0:03:51.394665211 12111 0x370df400 WARN \
v4l2codecs-decoder gstv4l2decoder.c:1157:gst_v4l2_request_set_done:<v4l2decoder2> \
Requested frame 1000003, but driver returned frame 3.
0:03:51.394785833 12111 0x370df400 WARN \
v4l2codecs-h264dec gstv4l2codech264dec.c:1059:gst_v4l2_codec_h264_dec_output_picture:<v4l2slh264dec0> \
error: Failed to decode frame 1000000
ERROR: from element /GstPipeline:pipeline0/v4l2slh264dec:v4l2slh264dec0: Failed to decode frame 1000000
```
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5598>
This pair of elements, inspired by shmsink/shmsrc, sends unix file
descriptors (e.g. memfd, dmabuf) from one sink to multiple source
elements in other processes.
unixfdsink proposes a memfd/shm allocator, which allows, for example,
videotestsrc to write directly into memories that can be transferred
to other processes without copying.
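For illustration, a possible pair of launch lines (the socket-path
property name is an assumption here, by analogy with shmsink/shmsrc):
```
# producer process
gst-launch-1.0 videotestsrc ! unixfdsink socket-path=/tmp/unixfd-demo

# consumer process(es), possibly several in parallel
gst-launch-1.0 unixfdsrc socket-path=/tmp/unixfd-demo ! queue ! autovideosink
```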
Sponsored-by: Netflix Inc.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5328>
The src caps of the libde265 decoder are currently fixed to I420, so
if the stream is in another format, such as 4:4:4 or a 10-bit format,
the pipeline will crash because the downstream element accesses the
video buffer as I420.
We now restrict the input caps to the "main" profile, which only
contains 4:2:0 8-bit streams.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5573>
This can happen with the dummy "noopenh264" library that the
freedesktop flatpak runtime ships, and that Fedora is planning to ship
as well. In both cases the dummy implementation gets replaced with the
actual openh264 library downloaded directly from Cisco, but just to be
on the safe side, this patch carefully checks the return values to
avoid crashing if the underlying library hasn't been swapped out yet.
The patch is taken from freedesktop-sdk and was originally written by
Valentin David <valentin.david@codethink.co.uk>.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5581>
If users update geometry-related properties very frequently in order
to animate a stream, redrawing on every update can make rendering
choppy or become a performance bottleneck.
To address the issue, add a property to control whether the scene is
redrawn when geometry-related properties are updated.
Also, do not resize the swapchain on such property updates, since
re-allocating the backbuffer and the multi-sampled render target is
unnecessary in that case.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5575>
Other Windows applications allow window switching even when an
application window is in fullscreen mode. Also fix a regression
introduced in 15248d8b84, which made the restored window always end up
topmost, since we no longer call SetWindowPos() when restoring.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5574>