Even if the resolution and/or bit depth are not updated, the required
DPB size can still change on an SPS update, and it can even be larger
than the previously configured DPB size. If so, we need to reconfigure
the DPB d3d11 texture pool again.
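For illustration, a minimal sketch of the check this implies
(hypothetical helper name, not the actual decoder code):

```c
#include <gst/gst.h>

/* Sketch only: returns TRUE when the texture pool must be rebuilt.
 * The SPS may keep the same resolution/bitdepth while still requiring
 * a larger DPB. */
static gboolean
dpb_pool_needs_reconfigure (guint required_dpb_size,
    guint configured_dpb_size, gboolean resolution_changed)
{
  if (resolution_changed)
    return TRUE;

  if (required_dpb_size > configured_dpb_size) {
    GST_DEBUG ("DPB size grew %u -> %u, texture pool must be reallocated",
        configured_dpb_size, required_dpb_size);
    return TRUE;
  }

  return FALSE;
}
```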
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1839>
We don't need to preserve the input color range for the transformed
target color space. Also, some GPUs don't seem to be happy with the
16-235 color range for RGB color spaces.
Moreover, since our default display target color space is
DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709, choosing the full color range
makes more sense.
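A minimal sketch of the idea, assuming a GstVideoInfo describing the
negotiated output (illustrative helper, not the converter code):

```c
#include <gst/video/video.h>

/* Sketch: when the transformed target color space is RGB, drop the
 * input's 16-235 range and force the full (0-255) range, matching the
 * DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709 display target. */
static void
fixup_output_color_range (GstVideoInfo * out_info)
{
  if (GST_VIDEO_INFO_IS_RGB (out_info))
    out_info->colorimetry.range = GST_VIDEO_COLOR_RANGE_0_255;
}
```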
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1814>
Managing the reference picture type with two variables (ref and
long_term) is redundant; it can be represented by a single enum value.
This also syncs this implementation with gstreamer-vaapi, which makes
comparing the two easier, and minimizes the changes a subclass needs
in order to support interlaced content.
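One possible shape of such an enum, with illustrative names rather
than the ones actually used:

```c
#include <glib.h>

/* Sketch: a single enum instead of the two booleans `ref` and
 * `long_term` (names here are illustrative). */
typedef enum
{
  PICTURE_REF_NONE,             /* not used for reference */
  PICTURE_REF_SHORT_TERM,       /* used for short-term reference */
  PICTURE_REF_LONG_TERM,        /* used for long-term reference */
} PictureReference;

/* The old pair of flags maps onto the enum like this */
static PictureReference
reference_from_flags (gboolean ref, gboolean long_term)
{
  if (!ref)
    return PICTURE_REF_NONE;

  return long_term ? PICTURE_REF_LONG_TERM : PICTURE_REF_SHORT_TERM;
}
```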
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1534>
As per spec 7.4.3 Slice header semantics, the flag value is derived as
MbaffFrameFlag = (mb_adaptive_frame_field_flag && !field_pic_flag),
and DXVA uses that value.
Regarding FrameNumList, in the case of a long-term reference,
FrameNumList[i] should be long_term_frame_idx, not long_term_pic_num.
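An illustrative sketch of both fixes, using simplified stand-in
structures instead of the real DXVA_PicParams_H264 / GstH264 types:

```c
#include <glib.h>

/* Stand-in types for illustration only */
typedef struct
{
  guint16 MbaffFrameFlag;
  guint16 FrameNumList[16];
} PicParams;

typedef struct
{
  gboolean long_term;
  gint frame_num;
  gint long_term_frame_idx;
} RefPicture;

static void
fill_pic_params (PicParams * params, gboolean mb_adaptive_frame_field_flag,
    gboolean field_pic_flag, RefPicture * refs, guint num_refs)
{
  guint i;

  /* 7.4.3: MbaffFrameFlag = mb_adaptive_frame_field_flag && !field_pic_flag */
  params->MbaffFrameFlag = (mb_adaptive_frame_field_flag && !field_pic_flag);

  for (i = 0; i < num_refs && i < 16; i++) {
    /* For long-term references DXVA expects long_term_frame_idx here,
     * not long_term_pic_num */
    if (refs[i].long_term)
      params->FrameNumList[i] = refs[i].long_term_frame_idx;
    else
      params->FrameNumList[i] = refs[i].frame_num;
  }
}
```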
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1780>
The output texture of the d3d11 decoder cannot have the
D3D11_BIND_SHADER_RESOURCE bind flag (meaning that it cannot be used
as a shader input resource), so d3d11convert (and its subclasses) was
copying the texture into another internal texture in order to use the
d3d11 shader.
That is obvious overhead, and we can avoid the texture copy for
colorspace conversion or resizing by using ID3D11VideoProcessor, since
it supports decoder output textures.
This commit is a visible optimization for the d3d11 decoder with
d3d11compositor use case, because we can avoid a texture copy per frame.
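A sketch of the selection logic (hypothetical helper; assumes C-style
COM calls via COBJMACROS):

```c
#define COBJMACROS
#include <windows.h>
#include <d3d11.h>
#include <glib.h>

/* Sketch: decoder output textures lack D3D11_BIND_SHADER_RESOURCE, so
 * they cannot feed the shader path directly. Without the flag, prefer
 * the ID3D11VideoProcessor path instead of copying the texture. */
static gboolean
can_use_shader_path (ID3D11Texture2D * texture)
{
  D3D11_TEXTURE2D_DESC desc;

  ID3D11Texture2D_GetDesc (texture, &desc);

  return (desc.BindFlags & D3D11_BIND_SHADER_RESOURCE) != 0;
}
```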
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1718>
The GstMemory object could be disposed if the GstBuffer is not
allocated by GstD3D11BufferPool, e.g. via gst_buffer_copy() and/or
gst_buffer_make_writable(), so attaching qdata to the GstMemory object
would cause unnecessary view alloc/free.
By using the view pool implemented in GstD3D11Allocator, we can avoid
redundant view alloc/free.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1716>
In order to know the chroma format, besides the profile, subsampling_x
and subsampling_y are needed (spec 7.2.2 Color config semantics). These
values are in GstVp9Parser but not in GstVp9FrameHdr.
Also, bit_depth is available in the parser but not in the frame header.
Moreover, those values are copied into the picture structure later.
In the case of VA-API, the chroma format and depth are required to
configure the pipeline.
It is possible to know chroma and depth through the caps coming from
the vp9 parser, but that requires string parsing. It is less error
prone to get these values through the parser structure at the
new_sequence() virtual method.
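A sketch of how the chroma format follows from
subsampling_x/subsampling_y (illustrative helper, not an actual API):

```c
#include <glib.h>

/* Sketch of spec 7.2.2: the subsampling pair selects the chroma format */
static const gchar *
vp9_chroma_format (guint8 subsampling_x, guint8 subsampling_y)
{
  if (subsampling_x == 1 && subsampling_y == 1)
    return "4:2:0";
  else if (subsampling_x == 1 && subsampling_y == 0)
    return "4:2:2";
  else if (subsampling_x == 0 && subsampling_y == 0)
    return "4:4:4";

  return "unknown";
}
```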
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1700>
Add a new video composition element which is equivalent to the
compositor and glvideomixer elements. When d3d11 decoder elements are
used, d3d11compositor can handle graphics memory efficiently
(zero copy, or at least copying within GPU memory space).
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1323>
The new d3d11colorconvert and d3d11scale elements perform only
colorspace conversion and rescaling, respectively. They are useful
when only colorspace conversion or rescaling is required and the other
part should be done by other elements.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1323>
A staging texture is used for memory transfer between system and GPU
memory. Apart from the d3d11{upload,download} elements, however, that
should happen very rarely.
Before this commit, d3d11bufferpool was allocating at least one staging
texture in order to calculate the CPU-accessible memory size, and the
texture was kept around unconditionally for later use, which increases
system memory usage. Although the GstD3D11Memory object supports CPU
access, most memory transfers will happen in the d3d11{upload,download}
elements.
With this commit, the initial staging texture is freed immediately once
the CPU-accessible memory size has been calculated.
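A sketch of the new behaviour (hypothetical helper; single-plane case,
error handling omitted):

```c
#define COBJMACROS
#include <windows.h>
#include <d3d11.h>

/* Sketch: map a temporary staging texture once to learn the
 * CPU-accessible size (the row pitch is only known after Map), then
 * release it immediately instead of keeping it around. */
static UINT
calculate_cpu_accessible_size (ID3D11Device * device,
    ID3D11DeviceContext * context, const D3D11_TEXTURE2D_DESC * desc)
{
  D3D11_TEXTURE2D_DESC staging_desc = *desc;
  D3D11_MAPPED_SUBRESOURCE map;
  ID3D11Texture2D *staging = NULL;
  UINT size;

  staging_desc.Usage = D3D11_USAGE_STAGING;
  staging_desc.BindFlags = 0;
  staging_desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ | D3D11_CPU_ACCESS_WRITE;
  staging_desc.MiscFlags = 0;

  ID3D11Device_CreateTexture2D (device, &staging_desc, NULL, &staging);
  ID3D11DeviceContext_Map (context, (ID3D11Resource *) staging, 0,
      D3D11_MAP_READ_WRITE, 0, &map);
  /* simplified: assumes a single-plane format */
  size = map.RowPitch * staging_desc.Height;
  ID3D11DeviceContext_Unmap (context, (ID3D11Resource *) staging, 0);

  /* Free the staging texture right away; d3d11upload/download create
   * their own when a real CPU transfer is actually needed */
  ID3D11Texture2D_Release (staging);

  return size;
}
```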
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1627>
Note that the newly added formats (YUY2, UYVY, and VYUY) are not
supported as render target view formats, so they can only be used as
input to d3d11convert or d3d11videosink. Another note is that YUY2 is
a very common format for hardware encoders/decoders on Windows.
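For reference, a sketch of how render target support for such a format
could be probed on a given device (hypothetical helper):

```c
#define COBJMACROS
#include <windows.h>
#include <d3d11.h>
#include <glib.h>

/* Sketch: check whether a format (e.g. DXGI_FORMAT_YUY2) can be used
 * as a render target, or only as input */
static gboolean
format_supports_render_target (ID3D11Device * device, DXGI_FORMAT format)
{
  UINT support = 0;

  if (FAILED (ID3D11Device_CheckFormatSupport (device, format, &support)))
    return FALSE;

  return (support & D3D11_FORMAT_SUPPORT_RENDER_TARGET) != 0;
}
```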
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1581>
If the run_async() method is expected to be called from the streaming
thread and not from the application thread, use INFINITE as the
timeout value so that d3d11window can wait for the UI dispatcher
thread in any case. There is no way to get a robust timeout value from
the library side, so a fixed timeout value might not be optimal and we
should avoid it as much as possible.
The rule for whether a timeout value can be INFINITE or not is:
* If the wait can be cancelled by GstBaseSink::unlock(), use INFINITE.
  GstD3D11Window::on_resize() is one such case.
* Otherwise, use a timeout value.
In more detail, GstBaseSink::start() and GstBaseSink::stop() are called
on the NULL to READY or READY to NULL state change, so there is no
chance for GstBaseSink::unlock() and GstBaseSink::unlock_stop() to be
called around them; there is no other way than a timeout.
GstD3D11Window::constructed() and GstD3D11Window::unprepare() are such
cases.
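A sketch of the two waiting modes (illustrative helper, not the
d3d11window code):

```c
#include <glib.h>

/* Sketch: waits that GstBaseSink::unlock() can cancel may block
 * forever; waits around constructed()/unprepare() cannot be unblocked
 * that way and must use a bounded timeout. */
static gboolean
wait_for_dispatcher (GMutex * lock, GCond * cond, gboolean * done,
    gboolean cancellable_by_unlock, guint timeout_ms)
{
  gboolean ret = TRUE;

  g_mutex_lock (lock);
  if (cancellable_by_unlock) {
    /* INFINITE-style wait: unlock() signals the cond to wake us up */
    while (!*done)
      g_cond_wait (cond, lock);
  } else {
    gint64 end_time =
        g_get_monotonic_time () + timeout_ms * G_TIME_SPAN_MILLISECOND;

    while (!*done) {
      if (!g_cond_wait_until (cond, lock, end_time)) {
        ret = FALSE;            /* timed out */
        break;
      }
    }
  }
  g_mutex_unlock (lock);

  return ret;
}
```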
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1461>
All subclasses are retrieving the list to get the target output frame,
which can be done by the baseclass instead. Also pass ownership of the
GstH264Picture to the subclass, so that the subclass can clear
implementation-dependent resources before finishing the frame.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1449>
We don't need to duplicate a method for HRESULT error code to string
conversion. This patch is intended to
* Remove duplicated code
* Ensure FormatMessageW (the Unicode version) is used and avoid
  FormatMessageA (the ANSI version), as the ANSI format is not portable
  at all.
Note that if "UNICODE" is not defined, FormatMessage is aliased to
FormatMessageA by default.
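A sketch of such a shared helper (illustrative name; assumes GLib for
the UTF-8 conversion, caller frees the result with g_free()):

```c
#include <windows.h>
#include <glib.h>

/* Sketch: always use the wide-char FormatMessageW and convert the
 * result to UTF-8, instead of relying on the ANSI variant */
static gchar *
hresult_to_string (HRESULT hr)
{
  WCHAR *msg = NULL;
  gchar *ret = NULL;

  FormatMessageW (FORMAT_MESSAGE_ALLOCATE_BUFFER | FORMAT_MESSAGE_FROM_SYSTEM |
      FORMAT_MESSAGE_IGNORE_INSERTS, NULL, (DWORD) hr,
      MAKELANGID (LANG_NEUTRAL, SUBLANG_DEFAULT), (LPWSTR) &msg, 0, NULL);

  if (msg) {
    ret = g_utf16_to_utf8 ((gunichar2 *) msg, -1, NULL, NULL, NULL);
    LocalFree (msg);
  }

  return ret;
}
```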
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1442>
The current shader code is not compatible with the HLSL profile
"ps_4_0_level_9_3" or lower, so d3dcompiler cannot compile our shader
code in that case.
Note that VirtualBox is one known driver which doesn't support the
currently implemented shader code.
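A sketch of selecting a feature-level dependent profile for D3DCompile
(hypothetical helper; the "main" entry point is an assumption):

```c
#include <windows.h>
#include <string.h>
#include <d3d11.h>
#include <d3dcompiler.h>

/* Sketch: pick the pixel shader target based on the device feature
 * level, so 9.x-level drivers (e.g. VirtualBox) get a
 * "ps_4_0_level_9_*" profile the shader source must also support */
static const char *
pixel_shader_target (D3D_FEATURE_LEVEL level)
{
  if (level >= D3D_FEATURE_LEVEL_10_0)
    return "ps_4_0";
  else if (level >= D3D_FEATURE_LEVEL_9_3)
    return "ps_4_0_level_9_3";

  return "ps_4_0_level_9_1";
}

static HRESULT
compile_pixel_shader (const char *source, D3D_FEATURE_LEVEL level,
    ID3DBlob ** code, ID3DBlob ** errors)
{
  return D3DCompile (source, strlen (source), NULL, NULL, NULL,
      "main", pixel_shader_target (level), 0, 0, code, errors);
}
```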
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1343>
d3d11videosink has advantages over d3dvideosink, such as:
* Zero-copy playback with d3d11 decoders
* HDR rendering with 10-bit format/swapchain support
* UWP support
* Any system memory alignment/padding can be supported
* The user can select the target GPU device
And the old d3dvideosink's functionality (e.g., navigation events,
overlay composition) is covered by d3d11videosink as well.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1311>
If the parent and child windows are running on different threads,
there is always a chance of deadlock, as a DefWindowProc() call from
the child window thread might block until the message is handled by
the parent's window procedure.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1299>
The GObject::dispose method can be called multiple times. Since the
win32 d3d11window has an internal thread and GObject::dispose could be
called from that thread, it might cause problems such as trying to
join the thread from within itself.
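A sketch of the kind of guard this implies (illustrative helper, not
the actual fix):

```c
#include <glib.h>

/* Sketch: dispose may run more than once and may run on the window's
 * own internal thread, so only join when it is safe to do so */
static void
stop_internal_thread (GThread ** thread)
{
  GThread *t = *thread;

  if (!t)
    return;                     /* already stopped by a previous dispose */

  *thread = NULL;
  if (t == g_thread_self ())
    g_thread_unref (t);         /* cannot join ourselves */
  else
    g_thread_join (t);
}
```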
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1299>
The GstD3D11ColorConverter implementation is able to rescale video as
well. By doing colorspace conversion and rescaling at once, we can
save one pass of the shader pipeline, which saves GPU resources.
Since this element supports both colorspace conversion and rescaling,
it is renamed to d3d11convert.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1275>
The D3D11_CREATE_DEVICE_DEBUG flag is used when creating the d3d11
device in order to activate the debug layer. However, if the system
doesn't support the debug layer for some reason, we should try to
create the d3d11 device without the flag. The debug layer should be
optional for device creation.
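A sketch of the fallback (hypothetical helper; assumes an explicitly
chosen IDXGIAdapter, with the feature level list and error handling
simplified):

```c
#include <windows.h>
#include <d3d11.h>
#include <glib.h>

/* Sketch: try with the debug layer first, and if that fails (e.g. the
 * SDK layers are not installed), retry without D3D11_CREATE_DEVICE_DEBUG */
static HRESULT
create_device (IDXGIAdapter * adapter, ID3D11Device ** device,
    ID3D11DeviceContext ** context)
{
  static const D3D_FEATURE_LEVEL levels[] = {
    D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
    D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3,
  };
  D3D_FEATURE_LEVEL selected;
  HRESULT hr;

  hr = D3D11CreateDevice (adapter, D3D_DRIVER_TYPE_UNKNOWN, NULL,
      D3D11_CREATE_DEVICE_DEBUG, levels, G_N_ELEMENTS (levels),
      D3D11_SDK_VERSION, device, &selected, context);

  if (FAILED (hr)) {
    /* The debug layer is optional: fall back to a plain device */
    hr = D3D11CreateDevice (adapter, D3D_DRIVER_TYPE_UNKNOWN, NULL,
        0, levels, G_N_ELEMENTS (levels),
        D3D11_SDK_VERSION, device, &selected, context);
  }

  return hr;
}
```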
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1217>
We should directly check the values of the `debug` and `optimization`
options instead.
`get_option('buildtype')` will return `'custom'` for most combinations
of `-Doptimization` and `-Ddebug`, but those two will always be set
correctly if only `-Dbuildtype` is set. So we should look at those
options directly.
For the two-way mapping between `buildtype` and `optimization`
+ `debug`, see this table:
https://mesonbuild.com/Builtin-options.html#build-type-options