`StdVideoDecodeH265PictureInfo.flags.IsReference` refers to section 3.132 of
the ITU-T H.265 specification:
reference picture: A picture that is a short-term reference picture or a
long-term reference picture.
`GstH265Picture.ref` doesn't reflect this definition, so instead we need to
query the NAL type of the processed slice.
This patch fixes the validation layer error
`VUID-vkCmdBeginVideoCodingKHR-slotIndex-07239` while using the NVIDIA driver.
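A hedged sketch of the kind of check this implies (not necessarily the exact
patch), assuming the GST_H265_NAL_SLICE_* values from gsth265parser.h:

```
#include <gst/codecparsers/gsth265parser.h>

/* Sketch: derive the reference flag from the slice NAL type.
 * Sub-layer non-reference slices (TRAIL_N, TSA_N, STSA_N, RADL_N,
 * RASL_N, ...) use even nal_unit_type values below BLA_W_LP; odd
 * values and IRAP slices (BLA/IDR/CRA) mark reference pictures. */
static gboolean
slice_nal_is_reference (guint8 nal_type)
{
  if (nal_type < GST_H265_NAL_SLICE_BLA_W_LP)
    return (nal_type & 1) != 0;
  return nal_type <= GST_H265_NAL_SLICE_CRA_NUT;
}
```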
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6901>
Conceptually identical to the present signal of d3d11videosink.
This signal will be emitted with the current render target
(i.e., the swapchain backbuffer) and the command queue. The
signal handler can record GPU commands to draw an overlay image
or to blend an image onto the render target.
In addition to d3d12 resources, the videosink will send
d3d11 and d2d resources, depending on the "overlay-mode"
property, so that the signal handler can render using its
preferred/required DirectX API.
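A hedged sketch of how an application might hook this up; the signal name
("present") and the parameter types are assumptions here, mirroring
d3d11videosink:

```
#include <gst/gst.h>

/* Handler invoked before each present; command_queue and
 * render_target are the D3D12 objects described above (passed as
 * gpointer in this sketch). */
static void
on_present (GstElement * sink, gpointer command_queue,
    gpointer render_target, gpointer user_data)
{
  /* Record GPU commands here to draw an overlay or blend an image
   * onto the swapchain backbuffer. */
}

static void
attach_overlay (GstElement * videosink)
{
  g_signal_connect (videosink, "present", G_CALLBACK (on_present), NULL);
}
```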
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6838>
The idea of using a separate command queue per videosink was that a
swapchain is bound to a command queue, and we need to flush that
command queue when the window size changes. But the separate queue
does not seem to improve performance much.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6838>
On various 32 bit systems, time_t is actually 64 bits while long is
still only 32 bits. The macro would wrongly trigger its assertion in
this case if a value with more than 68 years' worth of seconds is
converted.
Examples are various newer 32 bit platforms and old ones that are
compiled with -D_TIME_BITS=64.
Also statically assert that time_t is either 32 or 64 bits. Other values
might need adjustments in the macro.
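A minimal sketch of such a compile-time check, assuming GLib's
G_STATIC_ASSERT:

```
#include <glib.h>
#include <time.h>

/* Fail the build on platforms where time_t is neither 32 nor 64
 * bits; the conversion macro would need adjustments there. */
G_STATIC_ASSERT (sizeof (time_t) == 4 || sizeof (time_t) == 8);
```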
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6869>
The prime_fds for multiple planes may be the same. For example, on
Intel platforms, an NV12 surface may have the same FD for plane0 and
plane1. DRM_IOCTL_GEM_CLOSE will then close the same handle twice and
fail with an "Invalid argument (22)" error the second time.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6914>
From the spec (chapter 34, v1.3.283):
```
UNORM: the components are unsigned normalized values in the range [0, 1]
SRGB: the R, G and B components are unsigned normalized values that represent
values using sRGB nonlinear encoding, while the A component (if one
exists) is a regular unsigned normalized value
```
The difference is the storage encoding: the first is aimed at image
transfers, while the second is for shaders, mostly at the swapchain stage of
the pipeline, where the conversion is done automatically if needed [1].
As far as I have checked, other frameworks (FFmpeg, GTK+), when importing or
exporting images from/to Vulkan, use UNORM formats exclusively, while SRGB
formats are ignored.
My conclusion is that Vulkan formats describe how bits are stored in memory
rather than their transfer functions (colorimetry).
This patch does two interrelated changes:
1. It reorders certain color format maps in both
gst_vulkan_format_from_video_info() and gst_vulkan_format_from_video_info_2()
so that UNORM formats are tried first, when comparing their usage, and SRGB
formats are checked later (see the sketch below).
2. It removes the code that checks for colorimetry in
gst_vulkan_format_from_video_info_2(), since colorimetry is not storage
related.
[1] https://community.khronos.org/t/noob-difference-between-unorm-and-srgb/106132/7
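As a hypothetical illustration of change 1, a format map entry would now
list the UNORM variant ahead of the SRGB one (the array here is
illustrative, not the actual map):

```
#include <vulkan/vulkan.h>

/* Candidates for an RGBA video format, tried in order: UNORM first
 * (storage encoding), SRGB checked later. */
static const VkFormat rgba_candidates[] = {
  VK_FORMAT_R8G8B8A8_UNORM,
  VK_FORMAT_R8G8B8A8_SRGB,
};
```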
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6797>
This mirrors the behaviour in vp8enc / vp9enc and is generally more
useful than using any framerate from the caps, as it provides some
degree of accuracy if the stream's timestamps don't follow the
framerate perfectly.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6891>
Turns out AudioConvertHostTimeToNanos and AudioGetCurrentHostTime are macOS-only APIs, which prevents apps using
GStreamer on iOS from being accepted into the App Store.
This commit replaces those functions with a manual version of what they do: mach_absolute_time() for the current time,
and data from mach_timebase_info(), queried at the beginning, to convert host timestamps to nanoseconds.
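A minimal sketch of that replacement (function names here are illustrative):

```
#include <mach/mach_time.h>
#include <gst/gst.h>

static mach_timebase_info_data_t timebase;

/* Query the numer/denom ratio once at startup. */
static void
init_host_clock (void)
{
  mach_timebase_info (&timebase);
}

/* Convert mach_absolute_time() ticks to nanoseconds. */
static guint64
host_time_to_ns (guint64 host_time)
{
  return gst_util_uint64_scale (host_time, timebase.numer, timebase.denom);
}

/* Current host time in nanoseconds:
 *   guint64 now = host_time_to_ns (mach_absolute_time ()); */
```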
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6789>
The `videosink` reference in main() is a floating one, so it should not
be unref'ed (playbin effectively takes ownership of it).
This prevents a "gst_object_unref: assertion '((GObject *)
object)->ref_count > 0' failed" at runtime.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6883>
Using g_error() crashes the application, producing a coredump, e.g. when
the user closes the video window. This can be confusing (especially in
the very first tutorial).
Let's change this to log the error message without crashing, using
g_printerr(), like subsequent tutorials.
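The change amounts to this pattern in main()'s error path (message text
illustrative):

```
/* Before: g_error() aborts and dumps core. */
/*   g_error ("Unable to set the pipeline to the playing state.\n"); */

/* After: log the error without crashing, then bail out cleanly. */
g_printerr ("Unable to set the pipeline to the playing state.\n");
gst_object_unref (pipeline);
return -1;
```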
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6883>
To simplify the description, I'm assuming we only have two streams: video and audio.
For the video stream, we have the following events:
- STREAM_START => stream->wait set to true
- NEW_SEGMENT(1) => blocked waiting in gst_stream_synchronizer_wait
- FLUSH_START => unblocked
- FLUSH_STOP => stream->wait reset to false
- NEW_SEGMENT(2) => not waiting, since stream->wait is false
Then for the audio stream, we have the following events:
- STREAM_START => stream->wait set to true
- NEW_SEGMENT(2) => blocked waiting in gst_stream_synchronizer_wait for ever.
Note: the first NEW_SEGMENT event and the FLUSH_START and FLUSH_STOP events of the audio stream
are dropped before reaching the streamsynchronizer element, because the decodebin audio src pad
is not yet linked to the playsink audio sink pad.
To fix this deadlock, don't reset stream->wait to false on the FLUSH_STOP event when the stream is
not waiting for the EOS of the other streams (see the sketch below).
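A hedged sketch of that guard; the `waiting` flag standing for "currently
blocked in gst_stream_synchronizer_wait()" is an assumption for illustration:

```
case GST_EVENT_FLUSH_STOP:
  /* Only clear the wait flag for a stream that is actually blocked;
   * otherwise a later NEW_SEGMENT on another stream can end up
   * waiting forever. */
  if (stream->waiting)
    stream->wait = FALSE;
  break;
```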
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6763>
If we have constant-duration buffers, set the duration on
outgoing buffers, like rtpmp4adepay does. This fixes
problems with muxers like mp4mux, for example, not writing
the duration of the final sample into the index.
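A hedged sketch of the pattern, mirroring rtpmp4adepay (the
`constant_duration` variable is illustrative):

```
/* constant_duration was computed from the stream config earlier. */
if (GST_CLOCK_TIME_IS_VALID (constant_duration))
  GST_BUFFER_DURATION (outbuf) = constant_duration;
```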
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6878>