Rework the GL context code. Both avfvideosrc and vtdec can now create an
internal GL context for pushing textures. Both elements will still try to
use/switch to a local context where available (including after RECONFIGURE
events).
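For illustration only (this is not the actual avfvideosrc/vtdec code), a
context-negotiation helper along the following lines captures the idea:
prefer a GL context advertised by the application, and only create an
internal one when nothing is available. It assumes the GstGL helper API
from gst-plugins-bad of that era; ensure_gl_context is a name made up for
this sketch.

    #include <gst/gl/gl.h>

    /* Sketch: obtain a GstGLDisplay/GstGLContext for the element, creating
     * an internal context only if none was provided from outside. */
    static gboolean
    ensure_gl_context (GstElement * element, GstGLDisplay ** display,
        GstGLContext ** context)
    {
      GstGLContext *app_context = NULL;
      GError *error = NULL;

      /* Ask the application/pipeline for an existing display and context. */
      if (!gst_gl_ensure_element_data (element, display, &app_context))
        return FALSE;

      if (*context == NULL) {
        /* Nothing local available: create our own context, sharing with the
         * application's context if one was advertised. */
        *context = gst_gl_context_new (*display);
        if (!gst_gl_context_create (*context, app_context, &error)) {
          GST_ERROR_OBJECT (element, "could not create GL context: %s",
              error->message);
          g_clear_error (&error);
          g_clear_object (context);
          g_clear_object (&app_context);
          return FALSE;
        }
      }

      g_clear_object (&app_context);

      /* After a RECONFIGURE event the element would run this negotiation
       * again, so it can switch to a context that became available later. */
      return TRUE;
    }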
Implement a new memory type wrapping CVPixelBuffer.
There are two immediate advantages:
a) Make the GstMemory itself retain the CVPixelBuffer. Previously,
the containing GstBuffer was solely responsible for the lifetime of
the backing CVPixelBuffer.
With this change, we remove the GST_MEMORY_FLAG_NO_SHARE flag so that
GstMemory objects can be referenced by multiple GstBuffers (doing away
with the need to copy).
b) Delay locking the CVPixelBuffer into CPU memory until it is actually
mapped -- possibly never.
The CVPixelBuffer object is shared among memory references, shared copies
and (in planar formats) planes, so a wrapper GstAppleCoreVideoPixelBuffer
structure was introduced to manage locking.
https://bugzilla.gnome.org/show_bug.cgi?id=747216
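As a rough sketch of idea (b) above (this is not the real
GstAppleCoreVideoMemory code; the names below are invented), the map/unmap
callbacks of such a memory type can defer CVPixelBufferLockBaseAddress()
until the memory is actually mapped:

    #include <gst/gst.h>
    #include <CoreVideo/CoreVideo.h>

    typedef struct
    {
      GstMemory mem;           /* initialised without GST_MEMORY_FLAG_NO_SHARE */
      CVPixelBufferRef pixbuf; /* retained for the lifetime of the GstMemory */
      gint lock_count;         /* CPU lock held only while mapped */
    } AppleCoreVideoMemorySketch;

    static gpointer
    core_video_mem_map (GstMemory * gmem, gsize maxsize, GstMapFlags flags)
    {
      AppleCoreVideoMemorySketch *mem = (AppleCoreVideoMemorySketch *) gmem;

      /* The pixel buffer is locked into CPU memory only here, so buffers
       * that are never mapped (e.g. handed straight to the GPU) never pay
       * the cost.  The real wrapper guards this count with a mutex shared
       * by every GstMemory referring to the same pixel buffer. */
      if (mem->lock_count++ == 0) {
        if (CVPixelBufferLockBaseAddress (mem->pixbuf, 0) != kCVReturnSuccess) {
          mem->lock_count--;
          return NULL;
        }
      }
      return CVPixelBufferGetBaseAddress (mem->pixbuf);
    }

    static void
    core_video_mem_unmap (GstMemory * gmem)
    {
      AppleCoreVideoMemorySketch *mem = (AppleCoreVideoMemorySketch *) gmem;

      if (--mem->lock_count == 0)
        CVPixelBufferUnlockBaseAddress (mem->pixbuf, 0);
    }

A real implementation would also pass kCVPixelBufferLock_ReadOnly for
read-only maps and use CVPixelBufferGetBaseAddressOfPlane() for planar
formats, which is what the shared GstAppleCoreVideoPixelBuffer wrapper
manages.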
Otherwise qtkitvideosrc fails to build on OS X 10.10.4,
because QTKit has been deprecated since OS X 10.9.
Also put -mmacosx-version-min=10.8 at the front so that
the user or cerbero can override the version.
https://bugzilla.gnome.org/show_bug.cgi?id=745564
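For illustration, the point of the ordering is simply that a later
-mmacosx-version-min on the compile line normally takes precedence, so the
hard default has to come first. A hypothetical Makefile.am fragment (the
variable names are invented for this example):

    # Default first, so an -mmacosx-version-min supplied later by the user
    # or by cerbero overrides it.
    libgstapplemedia_la_OBJCFLAGS = -mmacosx-version-min=10.8 $(GST_OBJCFLAGS)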
Switch to using IOSurface instead of CVOpenGLTextureCache on OS X. The latter
can no longer be used for YUV => RGB conversion with OpenGL 3 on El Capitan,
as GL_YCBCR_422_APPLE has been removed from the OpenGL 3 driver. Also switch
from UYVY to NV12; UYVY was the only YUV format CVOpenGLTextureCache supported.
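A rough sketch of the IOSurface path (not the exact plugin code; it assumes
a CGL context and rectangle textures): the two NV12 planes of the decoded
CVPixelBuffer's IOSurface are bound to a single-channel and a two-channel
texture, and the YUV => RGB conversion is then done in a fragment shader.

    #include <CoreVideo/CoreVideo.h>
    #include <IOSurface/IOSurface.h>
    #include <OpenGL/OpenGL.h>
    #include <OpenGL/gl3.h>
    #include <OpenGL/CGLIOSurface.h>

    /* Sketch: bind the NV12 planes of a CVPixelBuffer's IOSurface to two
     * rectangle textures.  Error handling and texture setup are omitted. */
    static void
    upload_nv12_iosurface (CGLContextObj cgl_ctx, CVPixelBufferRef pixbuf,
        GLuint tex_y, GLuint tex_uv)
    {
      IOSurfaceRef surface = CVPixelBufferGetIOSurface (pixbuf);

      /* Plane 0: full-resolution 8-bit luma as a single-channel texture. */
      glBindTexture (GL_TEXTURE_RECTANGLE, tex_y);
      CGLTexImageIOSurface2D (cgl_ctx, GL_TEXTURE_RECTANGLE, GL_R8,
          (GLsizei) IOSurfaceGetWidthOfPlane (surface, 0),
          (GLsizei) IOSurfaceGetHeightOfPlane (surface, 0),
          GL_RED, GL_UNSIGNED_BYTE, surface, 0);

      /* Plane 1: half-resolution interleaved CbCr as a two-channel texture. */
      glBindTexture (GL_TEXTURE_RECTANGLE, tex_uv);
      CGLTexImageIOSurface2D (cgl_ctx, GL_TEXTURE_RECTANGLE, GL_RG8,
          (GLsizei) IOSurfaceGetWidthOfPlane (surface, 1),
          (GLsizei) IOSurfaceGetHeightOfPlane (surface, 1),
          GL_RG, GL_UNSIGNED_BYTE, surface, 1);
    }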
First of a few commits to stop using CVOpenGLTextureCache on OS X and use
IOSurfaces directly instead. CVOpenGLTextureCache hasn't been updated for
OpenGL 3, which is why texture sharing is currently disabled on OS X.
Public frameworks don't need the API to be built dynamically; we use the
framework directly instead.
The exception is VideoToolbox, which went public in the 10.8 SDK but is
still private in older versions of the SDK and on iOS. This allows building
the plugin against SDKs where it's not a public framework.
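A hedged sketch of that arrangement (not the plugin's actual loader code;
HAVE_PUBLIC_VIDEOTOOLBOX and load_videotoolbox are invented for the
example): call the framework directly when the SDK exposes it publicly,
otherwise resolve the symbols at runtime with dlopen()/dlsym().

    #include <glib.h>
    #include <dlfcn.h>

    #ifdef HAVE_PUBLIC_VIDEOTOOLBOX
    /* Public framework (10.8+ SDK): link and call it directly. */
    #include <VideoToolbox/VideoToolbox.h>
    #define VT_SESSION_INVALIDATE VTDecompressionSessionInvalidate
    #else
    /* Private framework: resolve the symbol at runtime instead of linking. */
    typedef void (*VTSessionInvalidateFunc) (void *session);
    static VTSessionInvalidateFunc VT_SESSION_INVALIDATE;

    static gboolean
    load_videotoolbox (void)
    {
      void *handle = dlopen ("/System/Library/Frameworks/"
          "VideoToolbox.framework/VideoToolbox", RTLD_LAZY);

      if (handle == NULL)
        handle = dlopen ("/System/Library/PrivateFrameworks/"
            "VideoToolbox.framework/VideoToolbox", RTLD_LAZY);
      if (handle == NULL)
        return FALSE;

      VT_SESSION_INVALIDATE = (VTSessionInvalidateFunc)
          dlsym (handle, "VTDecompressionSessionInvalidate");
      return VT_SESSION_INVALIDATE != NULL;
    }
    #endif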
This element makes use of the documented AVFoundation framework, available
starting with iOS 4.0, which means we can finally capture video using a
public API.
Provides the following elements:
qtkitvideosrc: OS X video source relying on the QTKit API. Comes with
hard-coded caps as the API does not provide any way of querying for
formats supported by the hardware. Hasn't been tested a lot, but seems
to work.
miovideosrc: OS X video source which uses the undocumented/private
CoreMediaIOServices API, which is also the one used by iChat.
Present on the latest version of Leopard and all versions of Snow Leopard.
Has been tested extensively with built-in cameras and TANDBERG's
PrecisionHD USB camera.
vtenc, vtdec: Generic codec wrappers which make use of the undocumented/
private VideoToolbox API on OS X and iOS. The list of codecs is currently
hard-coded to H.264 for vtenc, and H.264 + JPEG for vtdec. It can easily be
expanded by adding new entries to the lists, but there hasn't been time to
do that yet. Probing should probably also be implemented, as the available
codecs depend on the OS and its version, and there doesn't seem to be any
way to enumerate them.
vth264decbin, vth264encbin: Wrapper bins to make it easier to use
vtdec_h264/vtenc_h264 in live scenarios.
iphonecamerasrc: iPhone camera source relying on the undocumented/private
Celestial API. Tested on iOS 3.1 running on an iPhone 3GS. Stops working
after a few minutes, presumably because of a resource leak. Needs some
love.
Note that the iOS parts haven't yet been ported to iOS 4.x.