A420 is a four-plane format similar to I420 but with an extra plane
for alpha values.
A common use of the GL stack is GPU format conversion using shaders:
the glupload, glcolorconvert and gldownload elements can be used to
upload a buffer to the GPU context, perform the conversion on the GPU
itself and then download the data back to the CPU context.
A420 was not supported. This patch adds that support, mainly by adding
the corresponding conversion shader and updating the supported caps.
Both A420->RGBA and RGBA->A420 conversions are supported.
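As an illustration (not part of this patch), a minimal sketch in C of
such a CPU -> GPU -> CPU conversion path, converting A420 input from
videotestsrc to RGBA; the pipeline string and element choices are only
examples:

#include <gst/gst.h>

int
main (int argc, char **argv)
{
  GstElement *pipeline;
  GstMessage *msg;
  GstBus *bus;
  GError *error = NULL;

  gst_init (&argc, &argv);

  /* Upload A420 system-memory frames to the GL context, convert them to
   * RGBA with the glcolorconvert shader, then download the result back
   * to system memory. */
  pipeline = gst_parse_launch (
      "videotestsrc num-buffers=100 ! video/x-raw,format=A420 "
      "! glupload ! glcolorconvert "
      "! video/x-raw(memory:GLMemory),format=RGBA "
      "! gldownload ! fakesink", &error);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
  if (msg != NULL)
    gst_message_unref (msg);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (pipeline);

  return 0;
}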
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1153>
The issue can be reproduced on a computer with a Radeon graphics card
when trying to force GStreamer Editing Services to use GL for video
mixing in GESSmartMixer, instead of the GstCompositor that smart mixer
would normally use. This change causes the resulting video stream to
have "video/x-raw(memory:GLMemory) ... texture-target: 2D" caps (instead
of "video/x-raw ..." caps). At the PlaySink stage of the pipeline, a
GstGLImageSinkBin is plugged, with a GstGLColorBalance inside it. For some
reason that is still to be debugged (and out of the scope of this
patch), gst_gl_filter_set_caps() is never called on that color balance
element, leaving filter->in_texture_target set to its default
GST_GL_TEXTURE_TARGET_NONE value. The incomplete _create_shader() logic
does the rest and silently generates shader code that doesn't build.
This is the command I use to reproduce the issue (I'm not sure if I
would be able to isolate the issue in a simple pipeline, though):
GST_PLUGIN_FEATURE_RANK=vaapih265enc:NONE,vaapih264enc:NONE,vaapisink:NONE,vaapidecodebin:NONE,vaapipostproc:NONE,vaapih265dec:NONE,vaapivc1dec:NONE,vaapih264dec:NONE,vaapimpeg2dec:NONE,vaapijpegdec:NONE,glvideomixer:260
ges-launch-1.0 +clip /tmp/video.mp4
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1159>
While this has worked so far, we are about to introduce a format that
breaks this assumption. We have a format which consists of NV12 with
alpha, and this format does not have a direct mapping between the
component indexes and their plane indexes.
Fix this by using gst_video_format_info_component(), introduced in 1.18
for this purpose.
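For illustration only (the helper below is not part of the patch), a
sketch of how the component-to-plane mapping can be queried with this
API instead of being assumed:

#include <gst/gst.h>
#include <gst/video/video.h>

/* Print which components are stored in each plane of a format, instead
 * of assuming that component N always lives in plane N. */
static void
print_plane_components (GstVideoFormat format)
{
  const GstVideoFormatInfo *finfo = gst_video_format_get_info (format);
  guint plane;

  for (plane = 0; plane < GST_VIDEO_FORMAT_INFO_N_PLANES (finfo); plane++) {
    gint components[GST_VIDEO_MAX_COMPONENTS];
    guint i;

    /* Fills the array with the component indexes packed into this
     * plane, terminated by -1. */
    gst_video_format_info_component (finfo, plane, components);

    for (i = 0; i < GST_VIDEO_MAX_COMPONENTS && components[i] >= 0; i++)
      g_print ("plane %u carries component %d\n", plane, components[i]);
  }
}

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);
  /* NV12 packs U and V together in the second plane, so the mapping is
   * not 1:1. */
  print_plane_components (GST_VIDEO_FORMAT_NV12);
  return 0;
}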
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1151>
e.g. when running a library built with both wgl and egl support, egl
will always succeed in creating the GstGLContext because almost
anything can support egl, as long as eglGetDisplay() works.
wgl, however, has a check for the correct display type, so it should
move earlier in the list of platforms that are tried.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1154>
Improves the throughput of the overall convert-and-blend process and
allows for higher performance across slightly more threads.
Also make use of the video aggregator's task pool for blending, in
order to reduce the number of threads.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1129>
New API:
- gst_gl_context_get_config()
- gst_gl_context_request_config()
A GL context configuration is a GstStructure with some well-known names
for common values; it can also be extended in platform-specific ways if
necessary.
Wrapped OpenGL contexts may be able to retrieve the GL context
configuration depending on the platform. If that information is
available, GstGLContext will attempt to create a context that matches
the shared OpenGL context config unless gst_gl_context_request_config()
has been called.
A new environment variable 'GST_GL_CONFIG' will be read to influence the
configuration chosen. The environment variable will only be used as a
fallback if gst_gl_context_request_config() has not been called.
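A rough sketch of how the new calls could be used; the structure name
and field names below are assumptions made for illustration, not
necessarily the final well-known names:

#include <gst/gst.h>
#include <gst/gl/gl.h>

int
main (int argc, char **argv)
{
  GstGLDisplay *display;
  GstGLContext *context;
  GstStructure *config;
  GError *error = NULL;

  gst_init (&argc, &argv);

  display = gst_gl_display_new ();
  context = gst_gl_context_new (display);

  /* Ask for an 8-bit RGBA configuration before the context is created.
   * The structure and field names are illustrative; the call takes
   * ownership of the structure. */
  config = gst_structure_new ("gst-gl-context-config",
      "red-size", G_TYPE_INT, 8,
      "green-size", G_TYPE_INT, 8,
      "blue-size", G_TYPE_INT, 8,
      "alpha-size", G_TYPE_INT, 8,
      NULL);
  if (!gst_gl_context_request_config (context, config))
    g_warning ("could not request the GL context configuration");

  if (!gst_gl_context_create (context, NULL, &error)) {
    g_printerr ("failed to create GL context: %s\n", error->message);
    g_clear_error (&error);
  } else {
    /* Retrieve what was actually chosen, if the platform can report it. */
    GstStructure *chosen = gst_gl_context_get_config (context);

    if (chosen != NULL) {
      gchar *s = gst_structure_to_string (chosen);
      g_print ("GL context config: %s\n", s);
      g_free (s);
      gst_structure_free (chosen);
    }
  }

  gst_object_unref (context);
  gst_object_unref (display);
  return 0;
}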
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/809>
This meta holds one buffer encoded with the same codec as the parent
buffer. The luma of this extra frame will be used as the alpha values
for the final combined frame. This is notably used to support VP8/VP9
alpha as defined in the WebM and Matroska specifications.
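For illustration, a sketch of producer and consumer sides, under the
assumption that the meta is exposed through
gst_buffer_add_video_codec_alpha_meta() and
gst_buffer_get_video_codec_alpha_meta(); the helper names are made up:

#include <gst/gst.h>
#include <gst/video/video.h>

/* Attach the encoded alpha frame to the encoded video frame; the meta
 * keeps its own reference to the alpha buffer. */
static void
attach_alpha_frame (GstBuffer * video_frame, GstBuffer * alpha_frame)
{
  gst_buffer_add_video_codec_alpha_meta (video_frame, alpha_frame);
}

/* On the consuming side (e.g. an alpha-aware decoder), read it back. */
static GstBuffer *
get_alpha_frame (GstBuffer * video_frame)
{
  GstVideoCodecAlphaMeta *meta =
      gst_buffer_get_video_codec_alpha_meta (video_frame);

  return meta != NULL ? meta->buffer : NULL;
}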
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1128>
Otherwise real-time clock changes can wrongly trigger timeouts, or not
cause timeouts to happen in time.
Unfortunately, real-time clock times still have to be tracked inside
the elements for the statistics. Switching those over to the monotonic
clock would cause behaviour changes from the application's point of
view. The statistics are extended with monotonic-time fields, though.
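The general idea, sketched with plain GLib rather than the element code
itself: waiting against the monotonic clock means that changing the
system clock cannot fire the timeout early or late.

#include <glib.h>

/* Wait for a condition with a timeout that is immune to real-time
 * (wall-clock) changes: g_cond_wait_until() takes a monotonic end time. */
static gboolean
wait_with_timeout (GMutex * lock, GCond * cond, guint timeout_seconds)
{
  gint64 end_time =
      g_get_monotonic_time () + timeout_seconds * G_TIME_SPAN_SECOND;
  gboolean signalled;

  g_mutex_lock (lock);
  signalled = g_cond_wait_until (cond, lock, end_time);
  g_mutex_unlock (lock);

  return signalled;
}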
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1137>
These work the same way as the corresponding properties on queue and
allow controlling the internal buffer size of the appsrc in a more
flexible way.
Unlike in queue, the max-buffers and max-time properties are 0 (i.e.
disabled) by default for backwards-compatibility reasons.
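For example (the element creation, the concrete limits and the assumed
64-bit property types below are only illustrative):

#include <gst/gst.h>

int
main (int argc, char **argv)
{
  GstElement *appsrc;

  gst_init (&argc, &argv);

  appsrc = gst_element_factory_make ("appsrc", NULL);
  if (appsrc == NULL)
    return 1;

  /* Limit the internal queue by byte count, buffer count and duration;
   * leaving a property at 0 keeps that limit disabled. */
  g_object_set (appsrc,
      "max-bytes", (guint64) (4 * 1024 * 1024),
      "max-buffers", (guint64) 30,
      "max-time", (guint64) (2 * GST_SECOND),
      NULL);

  gst_object_unref (appsrc);
  return 0;
}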
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1133>
Cropping was not handled properly when frames had to be copied to
xvimage's buffer pool, first because the crop meta was dropped, and
second because the frame size allocated in xvimage's buffer pool was
smaller than the incoming frame.
This patch updates xvimagesink's video info when propose_allocation()
is called, and copies the GstVideoCropMeta from the source frame to the
destination one.
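Roughly what the crop meta copy amounts to (a sketch, not the exact
patch code):

#include <gst/gst.h>
#include <gst/video/video.h>

/* Carry the crop rectangle over when a frame is copied into a buffer
 * from the sink's own pool. */
static void
copy_crop_meta (GstBuffer * src, GstBuffer * dest)
{
  GstVideoCropMeta *in_crop = gst_buffer_get_video_crop_meta (src);

  if (in_crop != NULL) {
    GstVideoCropMeta *out_crop = gst_buffer_add_video_crop_meta (dest);

    out_crop->x = in_crop->x;
    out_crop->y = in_crop->y;
    out_crop->width = in_crop->width;
    out_crop->height = in_crop->height;
  }
}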
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1088>
The buffer pool is created every time setcaps() is called, but it is
only required when upstream doesn't use it, i.e. only when frames need
to be copied into XV buffers.
This patch delays the creation of the buffer pool until such a frame
copy is required.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/-/merge_requests/1088>