This commit adds common elements for Ancillary Data and Closed
Caption support in GStreamer:
* A VBI (Vertical Blanking Interval) parser that supports detection
  and extraction of Ancillary Data according to the SMPTE S291M
  specification. It currently supports the v210 and UYVY video
  formats.
* A new GstMeta for Closed Captions: GstVideoCaptionMeta. It
  supports the two types of CC, CEA-608 and CEA-708, along with
  the 4 different ways they can be transported (other systems
  are supersets of those).
https://bugzilla.gnome.org/show_bug.cgi?id=794901
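A minimal sketch of how these pieces can fit together (names taken
from the video-anc.h API; treat the exact signatures as assumptions
and check the header):

#include <gst/video/video.h>
#include <gst/video/video-anc.h>

/* Sketch: parse the ancillary data in one VBI line of a v210 stream
 * and, when a CEA-708 CDP packet (DID 0x61 / SDID 0x01) is found,
 * attach its payload to the buffer as a GstVideoCaptionMeta. */
static void
extract_cc_from_vbi (GstBuffer * buffer, const guint8 * vbi_line,
    guint width)
{
  GstVideoVBIParser *parser;
  GstVideoAncillary anc;

  parser = gst_video_vbi_parser_new (GST_VIDEO_FORMAT_v210, width);
  gst_video_vbi_parser_add_line (parser, vbi_line);

  while (gst_video_vbi_parser_get_ancillary (parser, &anc) ==
      GST_VIDEO_VBI_PARSER_RESULT_OK) {
    if (anc.DID == 0x61 && anc.SDID_block_number == 0x01)
      gst_buffer_add_video_caption_meta (buffer,
          GST_VIDEO_CAPTION_TYPE_CEA708_CDP, anc.data, anc.data_count);
  }

  gst_video_vbi_parser_free (parser);
}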
We need different export decorators for the different libs.
For now there is no actual change though, just the rename before
the release, plus prelude headers that define the new decorators
to GST_EXPORT.
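For illustration, such a prelude header boils down to something like
the following (file and macro names here are just an example for the
video library):

/* gst-libs/gst/video/video-prelude.h (illustrative) */
#ifndef __GST_VIDEO_PRELUDE_H__
#define __GST_VIDEO_PRELUDE_H__

#include <gst/gst.h>

/* For now the per-library decorator is only an alias for GST_EXPORT;
 * having a dedicated macro per lib means it can later be switched to
 * proper visibility/dllexport handling without touching every header. */
#ifndef GST_VIDEO_API
#define GST_VIDEO_API GST_EXPORT
#endif

#endif /* __GST_VIDEO_PRELUDE_H__ */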
Add flags and enums to support multiview signalling in
GstVideoInfo and GstVideoFrame, as well as caps serialisation
and deserialisation.
videoencoder: Copy multiview settings from reference input state
Add gst_video_multiview_* support API and GstVideoMultiviewMeta meta
https://bugzilla.gnome.org/show_bug.cgi?id=611157
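A rough usage sketch, assuming the GstVideoInfo accessor macros and
enum values from video-multiview.h:

#include <gst/video/video.h>

/* Sketch: describe a frame-packed side-by-side stereo stream in a
 * GstVideoInfo and serialise it into caps. */
static GstCaps *
make_side_by_side_caps (void)
{
  GstVideoInfo info;

  gst_video_info_init (&info);
  gst_video_info_set_format (&info, GST_VIDEO_FORMAT_I420, 1920, 1080);

  GST_VIDEO_INFO_MULTIVIEW_MODE (&info) =
      GST_VIDEO_MULTIVIEW_MODE_SIDE_BY_SIDE;
  GST_VIDEO_INFO_MULTIVIEW_FLAGS (&info) = GST_VIDEO_MULTIVIEW_FLAGS_NONE;

  /* multiview-mode and multiview-flags end up in the caps together
   * with the usual video/x-raw fields */
  return gst_video_info_to_caps (&info);
}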
Refactor the encoder's caps query proxying function to a common place
and use it in the videodecoder to proxy downstream restrictions.
The new function is private to the gstvideo lib.
https://bugzilla.gnome.org/show_bug.cgi?id=741263
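The helper itself is not exported, but the general idea of proxying
downstream restrictions looks roughly like this (purely illustrative,
the function name is hypothetical):

/* Sketch: copy size/framerate restrictions that downstream will
 * accept onto the (encoded) sink template caps, so upstream
 * negotiation already respects them. */
static GstCaps *
proxy_downstream_restrictions (GstCaps * templ, GstCaps * allowed)
{
  GstCaps *res = gst_caps_copy (templ);
  guint i, j;

  if (allowed == NULL || gst_caps_is_any (allowed))
    return res;

  for (i = 0; i < gst_caps_get_size (res); i++) {
    GstStructure *s = gst_caps_get_structure (res, i);

    for (j = 0; j < gst_caps_get_size (allowed); j++) {
      const GstStructure *a = gst_caps_get_structure (allowed, j);
      const GValue *v;

      if ((v = gst_structure_get_value (a, "width")))
        gst_structure_set_value (s, "width", v);
      if ((v = gst_structure_get_value (a, "height")))
        gst_structure_set_value (s, "height", v);
      if ((v = gst_structure_get_value (a, "framerate")))
        gst_structure_set_value (s, "framerate", v);
    }
  }
  return res;
}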
Add a video scaler object built on top of the resampler. It can
handle interlaced video and provides horizontal and vertical
scaling functions.
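For example, scaling a single line with the new object might look
like this (a sketch assuming the GstVideoScaler API from
video-scaler.h):

#include <gst/video/video.h>
#include <gst/video/video-scaler.h>

/* Sketch: scale one packed ARGB line from in_width to out_width
 * pixels using a 4-tap cubic resampler. */
static void
scale_argb_line (const guint8 * src, guint8 * dest, guint in_width,
    guint out_width)
{
  GstVideoScaler *scaler;

  scaler = gst_video_scaler_new (GST_VIDEO_RESAMPLER_METHOD_CUBIC,
      GST_VIDEO_SCALER_FLAG_NONE, 4, in_width, out_width, NULL);

  gst_video_scaler_horizontal (scaler, GST_VIDEO_FORMAT_ARGB,
      (gpointer) src, dest, 0, out_width);

  gst_video_scaler_free (scaler);
}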
Move the conversion code used in videoconvert to the video library
and expose a simple but generic API to do arbitrary conversion. It can
currently do colorspace conversion but the plan is to add videoscale to
it as well.
See https://bugzilla.gnome.org/show_bug.cgi?id=732415
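A sketch of what the generic API looks like from the caller's side
(assuming the GstVideoConverter functions from video-converter.h):

#include <gst/video/video.h>
#include <gst/video/video-converter.h>

/* Sketch: convert one mapped input frame into an already-allocated,
 * mapped output frame; formats and sizes come from the two infos. */
static void
convert_one_frame (GstVideoInfo * in_info, GstVideoInfo * out_info,
    GstVideoFrame * in_frame, GstVideoFrame * out_frame)
{
  GstVideoConverter *convert;

  /* passing NULL uses the default conversion options */
  convert = gst_video_converter_new (in_info, out_info, NULL);
  gst_video_converter_frame (convert, in_frame, out_frame);
  gst_video_converter_free (convert);
}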
The _1_0 suffixed environment variables override the
non-suffixed ones, so if we're in an environment that
sets the _1_0 suffixed ones, such as jhbuild, we need
to set those to make sure ours actually always get
used.
This reverts commit e39fbe6b7e.
Looks like we need to pass the full .la file after all in a setup
with libtool, or it might not find the library, e.g.:
ERROR: can't resolve libraries to shared libraries: gstfft-1.0
Conflicts:
gst-libs/gst/audio/Makefile.am
gst-libs/gst/pbutils/Makefile.am
Also see https://bugzilla.gnome.org/show_bug.cgi?id=603710
Add support for handling chroma subsampling correctly in the pack
function.
Fill in the pack and unpack functions for most formats.
Add some missing pack/unpack functions to the orc file.
Video base classes and the theora plugin still need to be ported again
Conflicts:
docs/libs/gst-plugins-base-libs-docs.sgml
docs/libs/gst-plugins-base-libs-sections.txt
docs/libs/gst-plugins-base-libs.types
ext/theora/gsttheoradec.c
ext/theora/gsttheoradec.h
ext/theora/gsttheoraenc.c
ext/theora/gsttheoraenc.h
gst-libs/gst/video/Makefile.am
gst-libs/gst/video/video.c
gst-libs/gst/video/video.h
gst/playback/gsturidecodebin.c
tests/check/libs/video.c
tests/check/pipelines/theoraenc.c
win32/common/libgstvideo.def
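Going back to the pack/unpack functions above, a sketch of how they
are reached through GstVideoFormatInfo (signatures as in
video-format.h; treat the details as assumptions):

#include <gst/video/video.h>

/* Sketch: unpack one line of a mapped frame into the format's
 * intermediate "unpack format" (where chroma subsampling has been
 * undone) and pack it straight back, re-subsampling the chroma. */
static void
roundtrip_line (GstVideoFrame * frame, gint line, gpointer tmp)
{
  const GstVideoFormatInfo *finfo = frame->info.finfo;
  const GstVideoFormatInfo *uinfo =
      gst_video_format_get_info (finfo->unpack_format);
  gint width = GST_VIDEO_FRAME_WIDTH (frame);
  /* line stride of tmp in the (always packed) unpack format */
  gint tmp_stride = GST_VIDEO_FORMAT_INFO_PSTRIDE (uinfo, 0) * width;

  finfo->unpack_func (finfo, GST_VIDEO_PACK_FLAG_NONE, tmp,
      frame->data, frame->info.stride, 0, line, width);

  finfo->pack_func (finfo, GST_VIDEO_PACK_FLAG_NONE, tmp, tmp_stride,
      frame->data, frame->info.stride, frame->info.chroma_site,
      line, width);
}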
Fix building of the libgstvideo module on Android by adding the
missing $(DEFAULT_INCLUDES) to the CFLAGS for the androgenizer
call in gst-libs/gst/video/Makefile.am.
Before this change, building failed because gst-plugins-base/
and gst-plugins-base/gst-libs/gst/video were left out of the
include path.
Basic API to attach overlay rectangles to buffers,
or blend them directly onto raw video buffers.
To be used primarily for things like subtitles or
logo overlays, not meant to replace videomixer.
Allows us to associate subtitle overlays with
non-raw video surface buffers, so that subtitles
are not lost and can instead be rendered later
when those surfaces are displayed or converted,
whilst re-using all the existing overlay plugins
and not having to teach them about our special
video surfaces. Could also have been made part
of the surface buffer abstraction of course, but
a secondary goal was to consolidate the blending
code for raw video into libgstvideo, and this
kind of API allows us to do both in a way that's
minimally invasive to existing elements, and at
the same time is fairly intuitive.
More features and extensions like the ability to
pass the source data or text/markup directly will
be added later.
https://bugzilla.gnome.org/show_bug.cgi?id=665080
API: gst_video_buffer_get_overlay_composition()
API: gst_video_buffer_set_overlay_composition()
API: gst_video_overlay_composition_new()
API: gst_video_overlay_composition_add_rectangle()
API: gst_video_overlay_composition_n_rectangles()
API: gst_video_overlay_composition_get_rectangle()
API: gst_video_overlay_composition_make_writable()
API: gst_video_overlay_composition_copy()
API: gst_video_overlay_composition_ref()
API: gst_video_overlay_composition_unref()
API: gst_video_overlay_composition_blend()
API: gst_video_overlay_rectangle_new_argb()
API: gst_video_overlay_rectangle_get_pixels_argb()
API: gst_video_overlay_rectangle_get_pixels_unscaled_argb()
API: gst_video_overlay_rectangle_get_render_rectangle()
API: gst_video_overlay_rectangle_set_render_rectangle()
API: gst_video_overlay_rectangle_copy()
API: gst_video_overlay_rectangle_ref()
API: gst_video_overlay_rectangle_unref()
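A minimal sketch of the intended usage, based only on the API names
listed above; the exact signatures are assumptions from the 0.10-era
headers and may differ in later versions:

#include <gst/video/video-overlay-composition.h>

/* Sketch: wrap a pre-rendered ARGB subtitle image in an overlay
 * rectangle, attach it to a video buffer, and (for raw video) blend
 * it directly into the pixels. Both paths are shown together here;
 * a real element would pick one. */
static void
attach_subtitle (GstBuffer * video_buf, GstBuffer * argb_pixels,
    guint sub_w, guint sub_h, guint sub_stride, gint x, gint y)
{
  GstVideoOverlayRectangle *rect;
  GstVideoOverlayComposition *comp;

  rect = gst_video_overlay_rectangle_new_argb (argb_pixels,
      sub_w, sub_h, sub_stride, x, y, sub_w, sub_h,
      GST_VIDEO_OVERLAY_FORMAT_FLAG_NONE);
  comp = gst_video_overlay_composition_new (rect);
  gst_video_overlay_rectangle_unref (rect);

  /* non-raw surface buffers: just attach the composition and let the
   * sink or converter render it later */
  gst_video_buffer_set_overlay_composition (video_buf, comp);

  /* raw video: alternatively blend the composition in place */
  gst_video_overlay_composition_blend (comp, video_buf);

  gst_video_overlay_composition_unref (comp);
}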