Mirror of https://gitlab.freedesktop.org/gstreamer/gstreamer.git
GStreamer multimedia framework

Commit ee21fd9053:
We prefer to use the same format for the image and the surface in the GStreamer VA-API allocator. The old code could pick different formats for the two; for example, an RGBA image could be paired with an NV12 surface, so a format conversion was needed whenever an image was put to or taken from the surface. Some drivers, such as iHD, do not support such conversions and always raise a data-flow error, and the conversion also carries a performance cost on every image put/get. We therefore prefer to use the same format for image and surface in the allocator; if the surface cannot support that format, we fall back to picking the best supported one as the surface format.

Co-authored-by: Víctor Jáquez <vjaquez@igalia.com>
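A hedged illustration of the scenario described above (this pipeline is not part of the commit; the file path and the choice of caps are placeholders): pinning the negotiated raw format to NV12, which VA drivers commonly expose for both images and surfaces, keeps the image put/get path free of format conversion.

  $ gst-launch-1.0 filesrc location=/path/to/video.mp4 ! qtdemux ! \
      h264parse ! vaapih264dec ! vaapipostproc ! \
      "video/x-raw,format=NV12" ! fakesink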
Top-level files and directories at this commit:

  common@3fa2c9e372
  docs
  gst
  gst-libs
  m4
  tests
  .gitlab-ci.yml
  .gitmodules
  AUTHORS
  autogen.sh
  ChangeLog
  configure.ac
  COPYING.LIB
  git.mk
  gstreamer-vaapi.doap
  Makefile.am
  meson.build
  meson_options.txt
  NEWS
  README
  RELEASE
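The tree carries both Meson and Autotools build files; a minimal out-of-tree build sketch, assuming a default Meson setup (the build directory name and the install step are only suggestions, not prescribed by the project):

  $ meson setup builddir
  $ ninja -C builddir
  $ sudo ninja -C builddir install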
gstreamer-vaapi
VA-API support for GStreamer

Copyright (C) 2010-2011 Splitted-Desktop Systems
Copyright (C) 2011-2014 Intel Corporation
Copyright (C) 2011 Collabora Ltd.

License
-------

gstreamer-vaapi helper libraries and plugin elements are available
under the terms of the GNU Lesser General Public License v2.1+.

Overview
--------

gstreamer-vaapi consists of a collection of VA-API based plugins for
GStreamer and helper libraries.

  * `vaapi<CODEC>dec' is used to decode JPEG, MPEG-2, MPEG-4:2,
    H.264 AVC, H.264 MVC, VP8, VC-1, WMV3 and HEVC videos to VA
    surfaces, depending on the actual value of <CODEC> and the
    underlying hardware capabilities. This plugin can also implicitly
    download the decoded surface to raw YUV buffers.

  * `vaapi<CODEC>enc' is used to encode into MPEG-2, H.264 AVC,
    H.264 MVC, JPEG, VP8 and HEVC videos, depending on the actual
    value of <CODEC> (mpeg2, h264, etc.) and the hardware
    capabilities. By default, raw format bitstreams are generated,
    so the result may be piped to a muxer, e.g. qtmux for MP4
    containers.

  * `vaapipostproc' is used to filter VA surfaces, e.g. for scaling,
    deinterlacing (bob, motion-adaptive, motion-compensated), noise
    reduction or sharpening. This plugin is also used to upload raw
    YUV pixels into VA surfaces.

  * `vaapisink' is used to render VA surfaces to an X11 or Wayland
    display. This plugin also features a "headless" mode (DRM),
    better suited to remote transcode scenarios, with faster
    throughput.

Features
--------

  * VA-API support from 0.29 to 0.38
  * JPEG, MPEG-2, MPEG-4, H.264 AVC, H.264 MVC, VP8, VC-1, HEVC and
    VP9 ad-hoc decoders
  * MPEG-2, H.264 AVC, H.264 MVC, JPEG, VP8 and HEVC ad-hoc encoders
  * OpenGL rendering through VA/GLX or GLX texture-from-pixmap + FBO
  * Support for the EGL backend
  * Support for the Wayland display server
  * Support for headless decode pipelines with VA/DRM
  * Support for major HW video decoding solutions on Linux (AMD,
    Intel, NVIDIA)
  * Support for HW video encoding on Intel HD Graphics hardware
  * Support for VA Video Processing APIs (VA/VPP)
    - Scaling and color conversion
    - Image enhancement filters: Sharpening, Noise Reduction,
      Color Balance, Skin-Tone Enhancement
    - Advanced deinterlacing: Motion-Adaptive, Motion-Compensated

Requirements
------------

Software requirements

  * GStreamer 1.9.x:
      libgstreamer1.0-dev (>= 1.9.x)
      libgstreamer-plugins-base1.0-dev (>= 1.9.x)
      libgstreamer-plugins-bad1.0-dev (>= 1.9.0)

  * Renderers:
      DRM:     libva-dev (>= 1.1.0), libdrm-dev, libudev-dev
      X11:     libva-dev (>= 1.0.1)
      GLX:     libva-dev (>= 1.0.3)
      Wayland: libva-dev (>= 1.1.0), libwayland-dev (>= 1.0.2)

Hardware requirements

  * Intel Eaglelake (G45)
  * Intel Ironlake, Sandybridge, Ivybridge, Haswell, Broadwell,
    Skylake, etc. (HD Graphics)
  * Intel BayTrail, Braswell
  * Intel Poulsbo (US15W)
  * Intel Medfield or Cedar Trail
  * Hardware supported by the Mesa VA gallium state-tracker
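Before constructing pipelines, it can help to verify that the plugin and a VA driver are actually picked up; a quick sanity check, assuming vainfo from libva-utils is installed:

  $ gst-inspect-1.0 vaapi
  $ vainfo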
Usage
-----

VA elements are automatically plugged into GStreamer pipelines, so
using playbin should work as is. However, here are a few alternate
pipelines that can be constructed manually.

  * Play an H.264 video in an MP4 container in fullscreen mode

  $ gst-launch-1.0 -v filesrc location=/path/to/video.mp4 ! \
      qtdemux ! vaapidecodebin ! vaapisink fullscreen=true

  * Play a raw MPEG-2 interlaced stream

  $ gst-launch-1.0 -v filesrc location=/path/to/mpeg2.bits ! \
      mpegvideoparse ! vaapimpeg2dec ! vaapipostproc ! vaapisink

  * Convert from one pixel format to another, while also downscaling

  $ gst-launch-1.0 -v filesrc location=/path/to/raw_video.yuv ! \
      videoparse format=yuy2 width=1280 height=720 ! \
      vaapipostproc format=nv12 height=480 ! vaapisink

  * Encode a 1080p stream in raw I420 format into H.264

  $ gst-launch-1.0 -v filesrc location=/path/to/raw_video.yuv ! \
      videoparse format=i420 width=1920 height=1080 framerate=30/1 ! \
      vaapih264enc rate-control=cbr tune=high-compression ! \
      qtmux ! filesink location=/path/to/encoded_video.mp4

Sources
-------

gstreamer-vaapi is Open Source software, so updates to this framework
are easy to get.

Stable source code releases can be found at:
<https://gstreamer.freedesktop.org/src/gstreamer-vaapi/>

The Git repository for work-in-progress changes is available at:
<https://cgit.freedesktop.org/gstreamer/gstreamer-vaapi>

Reporting Bugs
--------------

Bugs can be reported in the GNOME Bugzilla system at:
<https://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer&component=gstreamer-vaapi>