On some hardware the first few frames are bogus and not very useful.
Their timestamps are off, they have no timecodes, or there are spurious
black frames / no-signal frames. After a few frames this stabilizes
though.
https://bugzilla.gnome.org/show_bug.cgi?id=774850
Based on this we calculate the actual capture time, which should get
rid of any capture jitter by averaging it out.
Also add an output-stream-time property which forces the elements to
output the stream time directly instead of converting it to the
pipeline clock. Use with care.
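A rough sketch of how such a boolean property could be installed
(PROP_OUTPUT_STREAM_TIME and the default value are assumptions, only the
property name comes from the text above):

  g_object_class_install_property (gobject_class, PROP_OUTPUT_STREAM_TIME,
      g_param_spec_boolean ("output-stream-time", "Output Stream Time",
          "Output stream time directly instead of converting it to the "
          "pipeline clock", FALSE,
          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));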
https://bugzilla.gnome.org/show_bug.cgi?id=774850
The hardware timestamps bear no relation to when the frames were
produced, only to when they arrived somewhere in the hardware. In
particular, there is no guarantee that audio and video that belong
together will carry the same hardware timestamps, and, even more
importantly, the rate at which the hardware timestamps increase is
completely unrelated to the rate at which the frames are captured!
As such, we might as well use the pipeline clock directly and stop doing
complicated calculations. As a side effect, this now also allows running
without any pipeline clock, by directly making use of the stream times
as reported by the driver.
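A sketch of the standard pattern this enables (illustrative, not the
actual element code):

  /* Sample the element clock and derive the capture running time
   * from the element's base time. */
  GstClock *clock = gst_element_get_clock (GST_ELEMENT (self));
  GstClockTime capture_time = GST_CLOCK_TIME_NONE;

  if (clock) {
    capture_time = gst_clock_get_time (clock) -
        gst_element_get_base_time (GST_ELEMENT (self));
    gst_object_unref (clock);
  }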
https://bugzilla.gnome.org/show_bug.cgi?id=774850
libkms should not be used, because it imposes limitations on the DRM
API, especially regarding bpp and stride. Instead the DRM IOCTL should
be used directly.
Switch from libkms to the IOCTL interface. Set bpp and height for
framebuffer allocation to properly handle planar video formats.
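A hedged sketch of the ioctl path, using libdrm's drmIoctl() with the
DRM_IOCTL_MODE_CREATE_DUMB request (fd and vinfo are illustrative names,
and the exact bpp/height math for planar formats is omitted):

  struct drm_mode_create_dumb arg = { 0, };

  /* For planar formats, bpp and height must be chosen so that all
   * planes fit into this single allocation. */
  arg.bpp = 32;
  arg.width = GST_VIDEO_INFO_WIDTH (&vinfo);
  arg.height = GST_VIDEO_INFO_HEIGHT (&vinfo);

  if (drmIoctl (fd, DRM_IOCTL_MODE_CREATE_DUMB, &arg))
    return FALSE;
  /* arg.handle, arg.pitch and arg.size now describe the framebuffer. */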
https://bugzilla.gnome.org/show_bug.cgi?id=773473
Signed-off-by: Víctor Jáquez <vjaquez@igalia.com>
When a frame is found to not have an associated input source (cable
unplugged, wrong mode selected), an element warning will be issued. When
the next frame in the stream is found to have an input source selected
(e.g. cable replugged), an element info will be issued.
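Roughly, with the standard element message macros (the flag check and
field names are assumptions):

  if (frame_flags & bmdFrameHasNoInputSource) {
    if (!self->no_signal) {
      self->no_signal = TRUE;
      GST_ELEMENT_WARNING (self, RESOURCE, READ,
          ("No input source detected"),
          ("Check the cable and the selected mode"));
    }
  } else if (self->no_signal) {
    self->no_signal = FALSE;
    GST_ELEMENT_INFO (self, RESOURCE, READ,
        ("Input source detected again"), (NULL));
  }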
https://bugzilla.gnome.org/show_bug.cgi?id=774629
Fixes:
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[NSString stringWithUTF8String:]: NULL cString
in the state change test.
The default get_times() function of the base sink is just fine.
Remove the custom get_times() function, because the default function
already reads the timestamps from the buffers.
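For reference, the default behaviour amounts to roughly this
(simplified, not the literal base sink code):

  /* Derive the render interval from the buffer's own timestamp
   * and duration. */
  if (GST_BUFFER_PTS_IS_VALID (buffer)) {
    *start = GST_BUFFER_PTS (buffer);
    if (GST_BUFFER_DURATION_IS_VALID (buffer))
      *end = *start + GST_BUFFER_DURATION (buffer);
  }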
Signed-off-by: Michael Tretter <m.tretter@pengutronix.de>
https://bugzilla.gnome.org/show_bug.cgi?id=773473
Unfortunately this does not go through the normal state change
machinery, so we don't get notified about this in change_state().
However we need to stop scheduled playback, so that once PLAYING is
reached again we can start scheduled playback with the correct time.
Without this, flushing seeks in PLAYING will not work correctly:
decklinkvideosink will wait, before showing the new frames, for however
long the pipeline had already been in PLAYING.
Drawing is done via the GDI drawing functions. The cursor is
converted to a monochrome version before drawing. This is because
the GDI drawing functions seem to have undefined behavior with
cursor images including an alpha channel.
I could not find any other reliable way to draw these alpha
channel cursors without producing unwanted artifacts. These types
of cursors were introduced with Windows Vista when running with its
Aero theme.
Also adjust the cursor coordinates when capturing non-primary
screens via the "monitor" option.
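A hedged Win32 sketch of the drawing step (the monochrome conversion is
left out, and mem_dc/monitor_rect are illustrative names):

  CURSORINFO ci = { 0, };
  ci.cbSize = sizeof (CURSORINFO);

  if (GetCursorInfo (&ci) && (ci.flags & CURSOR_SHOWING)) {
    ICONINFO ii = { 0, };

    if (GetIconInfo (ci.hCursor, &ii)) {
      /* Subtract the hotspot and the captured monitor's origin so the
       * cursor lands at the right spot on non-primary screens. */
      DrawIconEx (mem_dc,
          ci.ptScreenPos.x - ii.xHotspot - monitor_rect.left,
          ci.ptScreenPos.y - ii.yHotspot - monitor_rect.top,
          ci.hCursor, 0, 0, 0, NULL, DI_NORMAL | DI_DEFAULTSIZE);
      if (ii.hbmMask)
        DeleteObject (ii.hbmMask);
      if (ii.hbmColor)
        DeleteObject (ii.hbmColor);
    }
  }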
https://bugzilla.gnome.org/show_bug.cgi?id=760172
* Rephrase tune error to be delsys-neutral
* Refer to the actual check in the 'missing sanity check' warnings
* Use "Delivery system" instead of 'delsys'. The
latter is OK as a shorthand in the code but not
even a real word
Currently dx9screencapsrc prints a verbose warning in case the screen
index is out of range for the current number of detected monitors. This
value is then dropped.
However there is no initial indication (besides the console print)
whether it worked or not. This may result in capturing an unwanted
screen, as the element would capture the last index that was not
rejected.
This patch sets the index regardless. Instead, the element posts an
error when it tries to run or to get caps with an invalid index.
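A hedged sketch of that behaviour (field and variable names are
assumptions):

  /* In the property setter: accept the value as-is. */
  src->monitor_index = g_value_get_int (value);

  /* Later, when starting or answering a caps query: fail loudly if the
   * index does not match a detected monitor. */
  if (src->monitor_index >= adapter_count) {
    GST_ELEMENT_ERROR (src, RESOURCE, NOT_FOUND,
        ("Monitor index %d out of range", src->monitor_index), (NULL));
    return FALSE;
  }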
https://bugzilla.gnome.org/show_bug.cgi?id=771817
In most display sinks, the logic is to use as much of the given window
as possible. In this case, the window is the screen, hence it is
logical to scale up.
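For instance, with the libgstvideo helper (variable names are
illustrative):

  GstVideoRectangle src = { 0, 0, GST_VIDEO_INFO_WIDTH (&vinfo),
      GST_VIDEO_INFO_HEIGHT (&vinfo) };
  GstVideoRectangle dst = { 0, 0, screen_width, screen_height };
  GstVideoRectangle result;

  /* The last argument enables scaling, so the video fills as much of
   * the screen as possible while keeping its aspect ratio. */
  gst_video_sink_center_rect (src, dst, &result, TRUE);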
https://bugzilla.gnome.org/show_bug.cgi?id=767422
The source region was scaled for display before being passed
to drmModeSetPlane, which resulted in a portion of the video
being cropped. Additionally, when crop meta was present, the
rectangle was not centered, since we were using the unscaled
width/height.
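The call should roughly look like this, with the scaled rectangle only
describing the destination (result and crop are illustrative names):

  /* Destination coordinates are in pixels, source coordinates in
   * 16.16 fixed point. */
  ret = drmModeSetPlane (fd, plane_id, crtc_id, fb_id, 0,
      result.x, result.y, result.w, result.h,
      crop.x << 16, crop.y << 16, crop.w << 16, crop.h << 16);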
https://bugzilla.gnome.org/show_bug.cgi?id=767422
Some KMS drivers demand specific pitches different from the ones
calculated by GstVideoInfo. For example, the Intel driver demands
strides rounded up to 64.
This patch queries the driver for the preferred pitch and overwrites it
in the pool's GstVideoInfo structure.
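A hedged sketch of the overwrite, assuming the pitch comes back from a
dumb-buffer allocation (create_dumb.pitch is an illustrative name and
the per-plane handling is simplified):

  /* Replace what gst_video_info_set_format() computed with what the
   * driver actually wants; chroma planes would need their own values. */
  GST_VIDEO_INFO_PLANE_STRIDE (&vinfo, 0) = create_dumb.pitch;
  GST_VIDEO_INFO_SIZE (&vinfo) = create_dumb.size;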
https://bugzilla.gnome.org/show_bug.cgi?id=768446
While gint64 and int64_t always have the same size, clang does not agree that they are the same type.
/Applications/Xcode.app/Contents/Developer/usr/bin/make -C decklink
CXX libgstdecklink_la-gstdecklinkaudiosink.lo
gstdecklinkaudiosink.cpp:675:79: error: cannot initialize a parameter of type 'int64_t *' (aka 'long long *') with an rvalue of type 'gint64 *' (aka 'long *')
ret = buf->output->attributes->GetInt (BMDDeckLinkMaximumAudioChannels, &max_channels);
^~~~~~~~~~~~~
./linux/DeckLinkAPI.h:692:87: note: passing argument to parameter 'value' here
virtual HRESULT GetInt (/* in */ BMDDeckLinkAttributeID cfgID, /* out */ int64_t *value) = 0;
^
Scale down to milliseconds, otherwise at least some hardware has problems
scheduling the frames (or schedules them too slowly) and we run out of
available frames.
https://bugzilla.gnome.org/show_bug.cgi?id=770282
This commit introduces IOSGLMemory, a GLMemory that falls back to
GstAppleCoreVideoMemory for CPU access. This is a temporary solution until
IOSurface gets exposed as a public framework on iOS, at which point we can
use IOSurfaceMemory on both macOS and iOS.
https://bugzilla.gnome.org/show_bug.cgi?id=769210
Add systemstream=false to the caps, otherwise the decoder
may be picked for MPEG-PS files. Also add parsed=true,
as VideoToolbox expects an entire frame in
VTDecompressionSessionDecodeFrame.
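A hedged sketch of the resulting caps restriction (the actual template
in vtdec may carry more fields):

  static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE ("sink",
      GST_PAD_SINK, GST_PAD_ALWAYS,
      GST_STATIC_CAPS ("video/mpeg, mpegversion=(int)2, "
          "systemstream=(boolean)false, parsed=(boolean)true"));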
https://bugzilla.gnome.org/show_bug.cgi?id=770049
https://github.com/mesonbuild/meson
With contributions from:
Tim-Philipp Müller <tim@centricular.com>
Matej Knopp <matej.knopp@gmail.com>
Jussi Pakkanen <jpakkane@gmail.com> (original port)
Highlights of the features provided are:
* Faster builds on Linux (~40-50% faster)
* The ability to build with MSVC on Windows
* Generate Visual Studio project files
* Generate XCode project files
* Much faster builds on Windows (on-par with Linux)
* Seriously fast configure and building on embedded
... and many more. For more details see:
http://blog.nirbheek.in/2016/05/gstreamer-and-meson-new-hope.html
http://blog.nirbheek.in/2016/07/building-and-developing-gstreamer-using.html
Building with Meson should work on both Linux and Windows, but may
need a few more tweaks on other operating systems.
_stdint.h is generated by Autotools and we don't really need it. All
supported platforms now ship with stdint.h. The only holdout was MSVC,
and since Visual Studio 2015 it ships stdint.h too.
Uncompressed RGB frames in DirectShow can be (and usually are) laid out
bottom-up, and the code to flip them wasn't properly ported from 0.10.
Fix it.
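The flip itself is a plain row-by-row copy in reverse order (sketch with
illustrative names):

  const guint8 *src_row = src_data + (height - 1) * stride;
  guint8 *dst_row = dst_data;
  gint i;

  /* Copy rows bottom-to-top so the frame ends up top-down. */
  for (i = 0; i < height; i++) {
    memcpy (dst_row, src_row, stride);
    dst_row += stride;
    src_row -= stride;
  }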
Fix post-processing of RGB buffers. We need a writable
buffer, but the requests pool is holding an extra ref.
This could be improved further by using a buffer pool.
On the ODroid C1+, the H265 and H264 decoders have the same name but are
listed as two different codecs. We have to handle them as a single one
that supports both, as otherwise we would register the same GType name
twice, which fails, and we would then only have H265 support and not
H264 support.
ahssrc is a new plugin that enables GStreamer to read from the
android.hardware.Sensor Android sensors. Sensor readings are output as
buffers and can be passed through and manipulated by the pipeline.
https://bugzilla.gnome.org/show_bug.cgi?id=768110
The calculation of the offset table was based on a plane size
estimation, which does not always work. Instead, use the memory offsets
the same way as implemented in the GstVideoMeta map functions.
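A hedged sketch, assuming one GstMemory per plane (buf and vinfo are
illustrative names):

  /* Derive each plane offset from the sizes of the preceding memories,
   * as the GstVideoMeta map functions do, instead of estimating. */
  gsize offset = 0;
  guint i;

  for (i = 0; i < gst_buffer_n_memory (buf); i++) {
    GST_VIDEO_INFO_PLANE_OFFSET (&vinfo, i) = offset;
    offset += gst_buffer_get_sizes_range (buf, i, 1, NULL, NULL);
  }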
Without setting the DRM_CLIENT_CAP_UNIVERSAL_PLANES capability bit, only
overlay planes are made available for compatibility with legacy clients.
But if a CRTC doesn't have an overlay plane associated, then kmssink is
not able to find a plane for the CRTC and the pipeline will fail, i.e:
ERROR kmssink gstkmssink.c:482:gst_kms_sink_start:<kmssink0> Could not find a plane for crtc
If no overlay planes were found for a given CRTC, fall back to universal
planes so DRM will also return primary planes that can be used instead.
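Roughly (find_plane_for_crtc is an assumed helper, res/pres the DRM
resource handles):

  if (!plane) {
    /* Ask DRM to expose primary/cursor planes too and retry. */
    if (drmSetClientCap (self->fd, DRM_CLIENT_CAP_UNIVERSAL_PLANES, 1))
      goto set_cap_failed;
    plane = find_plane_for_crtc (self->fd, res, pres, crtc_id);
  }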
https://bugzilla.gnome.org/show_bug.cgi?id=768183
Signed-off-by: Javier Martinez Canillas <javier@osg.samsung.com>
Without setting the DRM_CLIENT_CAP_UNIVERSAL_PLANES capability bit, only
overlay planes are made available for compatibility with legacy clients.
But if a CRTC doesn't have an overlay plane associated, then kmssink is
not able to find a plane for the CRTC and the pipeline will fail, i.e:
ERROR kmssink gstkmssink.c:482:gst_kms_sink_start:<kmssink0> Could not find a plane for crtc
This patch adds a plane-id property to the kmssink element so a specific
plane can be used in case a CRTC has only a primary plane associated.
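A sketch of the property installation (PROP_PLANE_ID, the range and the
default are assumptions, only the property name comes from the commit):

  g_object_class_install_property (gobject_class, PROP_PLANE_ID,
      g_param_spec_int ("plane-id", "Plane ID",
          "DRM plane id to use (-1 picks one automatically)",
          -1, G_MAXINT32, -1,
          (GParamFlags) (G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));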
https://bugzilla.gnome.org/show_bug.cgi?id=768183