Remove the custom get_times() function; the base sink's default
implementation already reads the timestamps from the buffers and is
just fine for this element.
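For reference, a minimal sketch of what such a buffer-timestamp-based
get_times() boils down to (illustrative names, not the actual base sink
code):

    #include <gst/base/gstbasesink.h>

    /* Sketch only: derive start/end from the buffer's own timestamps,
     * which is what the base sink's default get_times() already does. */
    static void
    example_get_times (GstBaseSink * sink, GstBuffer * buffer,
        GstClockTime * start, GstClockTime * end)
    {
      *start = GST_CLOCK_TIME_NONE;
      *end = GST_CLOCK_TIME_NONE;

      if (GST_BUFFER_PTS_IS_VALID (buffer)) {
        *start = GST_BUFFER_PTS (buffer);
        if (GST_BUFFER_DURATION_IS_VALID (buffer))
          *end = *start + GST_BUFFER_DURATION (buffer);
      }
    }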
Signed-off-by: Michael Tretter <m.tretter@pengutronix.de>
https://bugzilla.gnome.org/show_bug.cgi?id=773473
Unfortunately a flushing seek does not go through the normal state
change machinery, so we don't get notified about it in change_state().
However we need to stop scheduled playback, so that once PLAYING is
reached again we can start scheduled playback with the correct time.
Without this, flushing seeks in PLAYING will not work correctly:
decklinkvideosink will wait for as long as the pipeline had previously
been in PLAYING before showing the new frames.
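A rough sketch of the idea, assuming the flush is caught in the sink's
event vfunc (the helper below is a hypothetical stand-in for the
element's real stop function):

    #include <gst/base/gstbasesink.h>

    /* Hypothetical stand-in for whatever the element uses to stop the
     * DeckLink scheduled playback so it can later be restarted with a
     * fresh start time. */
    static void
    example_stop_scheduled_playback (GstBaseSink * sink)
    {
      /* ... talk to the hardware here ... */
    }

    static gboolean
    example_sink_event (GstBaseSink * sink, GstEvent * event)
    {
      /* A flushing seek in PLAYING never calls change_state(), so
       * react to the flush here instead. */
      if (GST_EVENT_TYPE (event) == GST_EVENT_FLUSH_STOP)
        example_stop_scheduled_playback (sink);

      /* a real element would chain up to the parent class here */
      return TRUE;
    }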
Drawing is done via the GDI drawing functions. The cursor is
converted to a monochrome version before drawing, because the GDI
drawing functions seem to have undefined behavior with cursor images
that include an alpha channel.
I could not find any other reliable way to draw these alpha-channel
cursors without producing unwanted artifacts. This type of cursor was
introduced with Windows Vista when running with its Aero theme.
Also adjust the cursor coordinates when capturing non-primary
screens via the "monitor" option.
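A sketch of the approach, with illustrative names; monitor_left and
monitor_top stand in for the captured monitor's top-left corner:

    #include <windows.h>

    /* Sketch, not the element's exact code: draw the current cursor
     * into the capture DC with GDI, using a monochrome copy because
     * the GDI functions misbehave with alpha-channel (Aero) cursors. */
    static void
    example_draw_cursor (HDC capture_dc, int monitor_left,
        int monitor_top)
    {
      CURSORINFO ci;
      ICONINFO ii;
      HCURSOR mono;

      ci.cbSize = sizeof (ci);
      if (!GetCursorInfo (&ci) || !(ci.flags & CURSOR_SHOWING))
        return;

      /* monochrome copy to avoid artifacts with alpha-channel cursors */
      mono = (HCURSOR) CopyImage (ci.hCursor, IMAGE_CURSOR, 0, 0,
          LR_MONOCHROME);
      if (mono == NULL)
        return;

      if (GetIconInfo (mono, &ii)) {
        /* translate to monitor-local coordinates, honouring the hotspot */
        int x = ci.ptScreenPos.x - (int) ii.xHotspot - monitor_left;
        int y = ci.ptScreenPos.y - (int) ii.yHotspot - monitor_top;

        DrawIconEx (capture_dc, x, y, mono, 0, 0, 0, NULL, DI_NORMAL);

        if (ii.hbmMask)
          DeleteObject (ii.hbmMask);
        if (ii.hbmColor)
          DeleteObject (ii.hbmColor);
      }

      DestroyCursor (mono);
    }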
https://bugzilla.gnome.org/show_bug.cgi?id=760172
* Rephrase tune error to be delsys-neutral
* Refer to the actual check in the 'missing sanity check' warnings
* Use "Delivery system" instead of 'delsys'. The
latter is OK as a shorthand in the code but not
even a real word
Currently dx9screencapsrc prints a verbose warning in case the screen
index is out of range for the current number of detected monitors, and
the requested value is then dropped.
However, besides the console output there is no indication to the
application whether setting the index worked or not. This may result in
capturing an unwanted screen, as the element keeps capturing the last
index that was not rejected.
This patch sets the index regardless. Instead, the element posts an
error when it tries to start or to get caps for an invalid index.
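A sketch of the deferred check, with illustrative names (the real
element gets its IDirect3D9 handle and the property value elsewhere):

    #include <d3d9.h>
    #include <gst/gst.h>

    /* Sketch: the "monitor" property is stored unconditionally; the
     * range check only happens when the element starts or negotiates. */
    static gboolean
    example_validate_monitor (GstElement * src, IDirect3D9 * d3d,
        guint monitor_index)
    {
      UINT count = d3d->GetAdapterCount ();

      if (monitor_index >= count) {
        GST_ELEMENT_ERROR (src, RESOURCE, NOT_FOUND,
            ("Monitor index %u out of range, only %u monitors detected",
                monitor_index, count), (NULL));
        return FALSE;
      }

      return TRUE;
    }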
https://bugzilla.gnome.org/show_bug.cgi?id=771817
In most display sinks, the logic is to use as much of the given
window as possible. In this case, the window is the screen, hence
it is logical to scale up.
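A sketch of the scaling logic, assuming the aspect-ratio-preserving
helper from libgstvideo is used (names are illustrative):

    #include <gst/video/gstvideosink.h>

    /* Sketch: scale the video up (or down) to use as much of the
     * screen as possible, keeping the aspect ratio and centering it. */
    static GstVideoRectangle
    example_fit_to_screen (gint video_w, gint video_h,
        gint screen_w, gint screen_h)
    {
      GstVideoRectangle src = { 0, 0, video_w, video_h };
      GstVideoRectangle dst = { 0, 0, screen_w, screen_h };
      GstVideoRectangle result;

      /* scaling=TRUE allows growing the source beyond its own size */
      gst_video_sink_center_rect (src, dst, &result, TRUE);

      return result;
    }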
https://bugzilla.gnome.org/show_bug.cgi?id=767422
The source region was scaled for display before being passed
to drmModeSetPlane, which resulted in a portion of the video
being cropped. Additionally, when crop meta was present, the rectangle
was not centered, since we were using the unscaled width/height.
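A sketch of the corrected call, assuming the destination rectangle has
already been scaled and centered while the source rectangle stays in
unscaled video coordinates (16.16 fixed point):

    #include <stdint.h>
    #include <xf86drmMode.h>

    /* Sketch with illustrative names: only the CRTC (destination)
     * rectangle is scaled for display, the source stays unscaled. */
    static int
    example_show_frame (int drm_fd, uint32_t plane_id, uint32_t crtc_id,
        uint32_t fb_id,
        int video_w, int video_h,                   /* unscaled source */
        int dst_x, int dst_y, int dst_w, int dst_h) /* scaled, centered */
    {
      return drmModeSetPlane (drm_fd, plane_id, crtc_id, fb_id, 0,
          dst_x, dst_y, dst_w, dst_h,
          /* src_{x,y,w,h} are 16.16 fixed point and must not be scaled */
          0 << 16, 0 << 16, video_w << 16, video_h << 16);
    }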
https://bugzilla.gnome.org/show_bug.cgi?id=767422
Some KMS drivers demand specific pitches instead of the ones calculated
by GstVideoInfo. For example, the Intel driver demands strides rounded
up to 64.
This patch queries the driver for the preferred pitch and overwrites it
in the pool's GstVideoInfo structure.
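A sketch of how such a query can look when dumb buffers are used (the
32 bpp value and the single-plane stride update are assumptions for
illustration):

    #include <xf86drm.h>
    #include <gst/video/video.h>

    /* Sketch: create a dumb buffer so the driver reports the pitch it
     * prefers, then overwrite the stride GstVideoInfo calculated. */
    static gboolean
    example_update_stride (int drm_fd, GstVideoInfo * vinfo)
    {
      struct drm_mode_create_dumb arg = { 0, };
      struct drm_mode_destroy_dumb darg = { 0, };

      arg.width = GST_VIDEO_INFO_WIDTH (vinfo);
      arg.height = GST_VIDEO_INFO_HEIGHT (vinfo);
      arg.bpp = 32;             /* assumption: 32-bit packed format */

      if (drmIoctl (drm_fd, DRM_IOCTL_MODE_CREATE_DUMB, &arg) < 0)
        return FALSE;

      /* e.g. the Intel driver reports a pitch rounded up to 64 */
      GST_VIDEO_INFO_PLANE_STRIDE (vinfo, 0) = arg.pitch;

      /* the probe buffer is not needed anymore */
      darg.handle = arg.handle;
      drmIoctl (drm_fd, DRM_IOCTL_MODE_DESTROY_DUMB, &darg);

      return TRUE;
    }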
https://bugzilla.gnome.org/show_bug.cgi?id=768446
While gint64 and int64_t always have the same size, clang does not
consider them to be the same type.
/Applications/Xcode.app/Contents/Developer/usr/bin/make -C decklink
CXX libgstdecklink_la-gstdecklinkaudiosink.lo
gstdecklinkaudiosink.cpp:675:79: error: cannot initialize a parameter of type 'int64_t *' (aka 'long long *') with an rvalue of type 'gint64 *' (aka 'long *')
ret = buf->output->attributes->GetInt (BMDDeckLinkMaximumAudioChannels, &max_channels);
^~~~~~~~~~~~~
./linux/DeckLinkAPI.h:692:87: note: passing argument to parameter 'value' here
virtual HRESULT GetInt (/* in */ BMDDeckLinkAttributeID cfgID, /* out */ int64_t *value) = 0;
^
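A sketch of the kind of fix this needs, using a local of the API's
exact type instead of a pointer cast (the callback parameter is a
hypothetical stand-in for IDeckLinkAttributes::GetInt()):

    #include <stdint.h>
    #include <glib.h>

    /* Sketch: let the API write into its own int64_t and assign to the
     * gint64 afterwards, so no mismatching pointer type is passed. */
    static void
    example_fetch_int64 (void (*get_int) (int64_t * value),
        gint64 * result)
    {
      int64_t tmp = 0;

      get_int (&tmp);   /* matches the int64_t * parameter exactly */
      *result = tmp;    /* plain integer assignment, no cast needed */
    }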
Scale down to milliseconds, otherwise at least some hardware has problems
scheduling the frames (or schedules them too slowly) and we run out of
available frames.
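A sketch of the conversion, assuming a timescale of 1000 (milliseconds)
and illustrative parameter names:

    #include <gst/gst.h>

    /* Sketch: convert nanosecond timestamps/durations to a millisecond
     * timescale before handing them to the hardware scheduler. */
    static void
    example_to_ms_timescale (GstClockTime timestamp, GstClockTime duration,
        guint64 * start_time, guint64 * frame_duration, guint64 * timescale)
    {
      *timescale = 1000;        /* milliseconds instead of nanoseconds */
      *start_time = gst_util_uint64_scale (timestamp, *timescale,
          GST_SECOND);
      *frame_duration = gst_util_uint64_scale (duration, *timescale,
          GST_SECOND);
    }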
https://bugzilla.gnome.org/show_bug.cgi?id=770282
This commit introduces IOSGLMemory, which is a GLMemory that falls back
to GstAppleCoreVideoMemory for CPU access. This is a temporary solution
until IOSurface gets exposed as a public framework on iOS, at which
point we can use IOSurfaceMemory on both macOS and iOS.
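A very rough sketch of the idea (the struct and the map function below
are simplified stand-ins, not the real IOSGLMemory implementation):

    #include <gst/gst.h>
    #include <gst/gl/gl.h>

    /* Simplified stand-in: GL access behaves like normal GstGLMemory,
     * CPU access is redirected to the wrapped CoreVideo-backed memory. */
    typedef struct
    {
      GstGLMemory gl_mem;       /* GPU side: the GL texture */
      GstMemory *cpu_mem;       /* CPU side: GstAppleCoreVideoMemory */
      GstMapInfo cpu_map;       /* kept so the fallback can be unmapped */
    } ExampleIOSGLMemory;

    static gpointer
    example_map (ExampleIOSGLMemory * mem, GstMapInfo * info)
    {
      if (info->flags & GST_MAP_GL) {
        /* GL path: hand out the texture id as GstGLMemory does */
        return &mem->gl_mem.tex_id;
      }

      /* CPU path: fall back to mapping the CoreVideo memory */
      if (!gst_memory_map (mem->cpu_mem, &mem->cpu_map, info->flags))
        return NULL;
      return mem->cpu_map.data;
    }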
https://bugzilla.gnome.org/show_bug.cgi?id=769210