If VPP is available, we always try to implicitly convert the source
buffer to the "native" surface format of the underlying accelerator.
This means that no optimization is performed yet to propagate raw YUV
buffers to the downstream element as is when VPP is available, i.e. a
color conversion is always performed.
Even though we only support deinterlacing for now, use flags to specify
which filters are to be applied to each frame we receive in transform().
This is preparatory work for integrating new filters.
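As a rough sketch of the intent, where the flag names are purely
hypothetical and not the actual ones used by the plugin:

    /* Hypothetical per-frame operation flags; only deinterlacing is
     * handled today, but further filters can be OR'ed in later. */
    typedef enum {
      OP_FLAG_DEINTERLACE = 1 << 0,
      /* OP_FLAG_DENOISE = 1 << 1, ...future filters... */
    } OpFlags;

    /* In transform(): decide per frame which filters to apply. */
    guint flags = 0;
    if (frame_is_interlaced)
      flags |= OP_FLAG_DEINTERLACE;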
Add support for "mixed" interlace-mode, whereby the video frame buffer
shall be deinterlaced only if its flags mention that's actually an
interlaced frame buffer.
Add a gst_caps_set_interlaced() helper function that resets the
interlace-mode field to "progressive" for GStreamer >= 1.0, or the
interlaced field to "false" for GStreamer 0.10.
Add basic deinterlacing support, i.e. bob-deinterlacing, whereby only
the selected field from the input surface is kept for the target surface.
Passing GST_VAAPI_DEINTERLACE_METHOD_NONE as the method argument to
gst_vaapi_filter_set_deinterlacing() disables deinterlacing (see the
sketch below).
Also move GstVaapiDeinterlaceMethod definition from vaapipostproc plug-in
to libgstvaapi core library.
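For illustration, usage from vaapipostproc could look roughly as
follows; the BOB enumerator name and the flags argument are assumed
here, only GST_VAAPI_DEINTERLACE_METHOD_NONE is taken from the text
above:

    /* Enable bob deinterlacing for the current frame... */
    gst_vaapi_filter_set_deinterlacing (filter,
        GST_VAAPI_DEINTERLACE_METHOD_BOB, deint_flags);

    /* ...or disable deinterlacing altogether. */
    gst_vaapi_filter_set_deinterlacing (filter,
        GST_VAAPI_DEINTERLACE_METHOD_NONE, 0);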
Signed-off-by: Gwenole Beauchesne <gwenole.beauchesne@intel.com>
Fix a reference counting bug in passthrough mode, whereby the input buffer
was propagated as is downstream through gst_pad_push() without first
increasing its reference count. This was a problem when gst_pad_push()
returned an error and we further decreased the reference count of the
input buffer.
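The essence of the fix, as a sketch with illustrative names:

    /* gst_pad_push() takes ownership of the buffer it is given, so take
     * an extra reference first; otherwise an error return from
     * downstream would make us drop a reference we no longer own. */
    ret = gst_pad_push (srcpad, gst_buffer_ref (inbuf));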
Add support for interlaced streams with GStreamer 1.0 too. Basically,
this enables vaapipostproc, though it is not auto-plugged yet. We also
make sure to reply to CAPS queries, and happily handle CAPS events.
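A rough sketch of the 1.0-style hooks involved; the function names and
caps handling details are placeholders (the query's caps filter is
ignored for brevity), not the actual plugin code:

    static gboolean
    plugin_src_query (GstPad * pad, GstObject * parent, GstQuery * query)
    {
      if (GST_QUERY_TYPE (query) == GST_QUERY_CAPS) {
        GstCaps *caps = gst_pad_get_pad_template_caps (pad);

        gst_query_set_caps_result (query, caps);
        gst_caps_unref (caps);
        return TRUE;
      }
      return gst_pad_query_default (pad, parent, query);
    }

    static gboolean
    plugin_sink_event (GstPad * pad, GstObject * parent, GstEvent * event)
    {
      if (GST_EVENT_TYPE (event) == GST_EVENT_CAPS) {
        GstCaps *caps;

        gst_event_parse_caps (event, &caps);
        /* ...update the element configuration from the new caps... */
        gst_event_unref (event);
        return TRUE;
      }
      return gst_pad_event_default (pad, parent, event);
    }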
Port vaapidecode and vaapisink plugins to GStreamer API >= 1.0. This
is rather minimalistic, just enough to test the basic functionality.
Disable vaapiupload, vaapidownload and vaapipostproc plugins. The latter
needs polishing with respect to GStreamer 1.x functionality, and the
former two are totally phased out in favor of the GstVaapiVideoMemory
map/unmap facilities, which are yet to be implemented.
Signed-off-by: Gwenole Beauchesne <gwenole.beauchesne@intel.com>
Move GstVaapiVideoMeta from core libgstvaapi decoding library to the
actual plugin elements. That's only useful there. Also inline reference
counting code from GstVaapiMiniObject.
Use gst_element_class_set_static_metadata() from GStreamer 1.0, which
is basically the same as gst_element_class_set_details_simple() in a
GStreamer 0.10 context.
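For example, with placeholder description strings:

    #if GST_CHECK_VERSION(1,0,0)
      gst_element_class_set_static_metadata (element_class,
          "VA-API video postprocessing", "Filter/Converter/Video",
          "A VA-API based video postprocessing element",
          "Gwenole Beauchesne <gwenole.beauchesne@intel.com>");
    #else
      gst_element_class_set_details_simple (element_class,
          "VA-API video postprocessing", "Filter/Converter/Video",
          "A VA-API based video postprocessing element",
          "Gwenole Beauchesne <gwenole.beauchesne@intel.com>");
    #endif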
Move the GstImplementsInterface and GstVideoContext support functions up
so as to keep a clear separation between the plugin element and its
interface hooks.
Update plugin elements with the new GstVaapiVideoMeta API.
This also fixes support for subpictures/overlay, because GstVideoDecoder
generates a sub-buffer from the GstVaapiVideoBuffer, and that sub-buffer
is marked as read-only. When an element such as textoverlay then comes
in, it checks whether the input buffer is writable; since that buffer is
read-only, a new GstBuffer is created. And since gst_buffer_copy() does
not preserve the parent field, the buffer generated in textoverlay is
unusable because all VA specific information is lost.
Now, with the GstVaapiVideoMeta information attached to a standard
GstBuffer, all information is preserved through gst_buffer_copy(), since
the latter does copy metadata (qdata in this case).
Determine whether the buffer represents the top-field only by checking for
the GST_VIDEO_BUFFER_TFF flag instead of relying on the GstVaapiSurfaceProxy
flag. Also trust "interlaced" caps to determine whether the input frame
is interlaced or not.
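In 0.10 terms, the check boils down to something like this, with
illustrative variable names:

    /* Field selection now follows the buffer flag rather than the
     * surface proxy flag. */
    gboolean is_top_field =
        GST_BUFFER_FLAG_IS_SET (inbuf, GST_VIDEO_BUFFER_TFF);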
Intermediate elements may produce a sub-buffer from a valid GstVaapiVideoBuffer
for non raw YUV cases. Make sure vaapipostproc now understands those buffers.
If vaapisink is in the GStreamer pipeline, then we shall allocate a
unique GstVaapiDisplay and propagate it upstream, i.e. subsequent
queries from vaapidecode shall get a valid answer from vaapisink.
This flag is obsolete. It was meant to explicitly enable/disable VA/GLX API
support, or to fall back to TFP+FBO if that API is not found. Now, we check
for the VA/GLX API by default if --enable-glx is set. If this API is not
found, we now default to TFP+FBO.
Note: TFP+FBO, i.e. using vaPutSurface(), is now also a deprecated usage
and will be removed in the future. If GLX rendering is requested, then the
VA/GLX API shall be used as it covers most usages, e.g. the AMD driver
can't render to an X pixmap yet.
Bump GStreamer required version to 0.10.14, needed for
gst_element_class_set_details_simple().
Signed-off-by: Gwenole Beauchesne <gwenole.beauchesne@intel.com>
Add new "interlaced" attribute to GstVaapiSurfaceProxy. Use this in
vaapipostproc so that to handles cases where bitstream is interlaced
but almost only frame pictures are generated. In this case, we should
not be alternating between top/bottom fields.
Add vaapipostproc element for video postprocessing. So far, only basic
bob deinterlacing is implemented. Interlaced mode is automatically
detected based on sink caps ("interlaced" field).
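The detection amounts to reading that field from the negotiated sink
caps, roughly as follows:

    /* GStreamer 0.10 caps: "interlaced" boolean on the first structure */
    GstStructure *structure = gst_caps_get_structure (caps, 0);
    gboolean interlaced = FALSE;

    if (!gst_structure_get_boolean (structure, "interlaced", &interlaced))
      interlaced = FALSE;  /* field absent: assume progressive content */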