Implement the videooverlay interface in kmssink, split into two cases:
if the driver supports scaling, we simply refresh in show_frame(); if
not, we send a reconfigure event upstream and re-negotiate using the
new size.
https://bugzilla.gnome.org/show_bug.cgi?id=784599
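A minimal sketch of the second case, assuming hypothetical field names
(can_scale, render_rect); only the upstream reconfigure event is actual
GStreamer API:

  #include <gst/video/gstvideosink.h>

  /* Hypothetical sink type, not the real kmssink structure. */
  typedef struct {
    GstVideoSink parent;
    gboolean can_scale;               /* does the DRM plane support scaling? */
    GstVideoRectangle render_rect;    /* rectangle set via GstVideoOverlay */
  } ExampleKmsSink;

  static void
  example_set_render_rectangle (ExampleKmsSink * self, GstVideoRectangle rect)
  {
    self->render_rect = rect;

    if (self->can_scale)
      return;                         /* show_frame() re-applies the rectangle */

    /* No hardware scaling: ask upstream to renegotiate with the new size.
     * Upstream events are pushed on the sink pad. */
    gst_pad_push_event (GST_VIDEO_SINK_PAD (self),
        gst_event_new_reconfigure ());
  }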
We used to handle the driver pitch only for single-plane video formats.
Add support for multi-plane formats by re-using the extrapolate function
from the v4l2 element.
Also use this pitch to calculate the proper offsets.
This prevents DRM drivers from picking a slow path when the pitches/offsets
don't match the ones they reported.
https://bugzilla.gnome.org/show_bug.cgi?id=785029
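A rough sketch of the idea, assuming the driver reports the pitch of the
first plane; the helper below is illustrative, not the actual extrapolate
code, and the component/plane mapping is simplified:

  #include <gst/video/video.h>

  static void
  extrapolate_planes (const GstVideoInfo * vinfo, guint32 base_pitch,
      guint32 pitches[GST_VIDEO_MAX_PLANES],
      guint32 offsets[GST_VIDEO_MAX_PLANES])
  {
    gsize offset = 0;
    guint i;

    for (i = 0; i < GST_VIDEO_INFO_N_PLANES (vinfo); i++) {
      /* Scale the reported pitch by the ratio between the first plane's
       * stride and this plane's stride (e.g. 2 for I420 chroma planes). */
      gint ratio = GST_VIDEO_INFO_PLANE_STRIDE (vinfo, 0) /
          GST_VIDEO_INFO_PLANE_STRIDE (vinfo, i);

      pitches[i] = base_pitch / ratio;
      offsets[i] = (guint32) offset;
      offset += (gsize) pitches[i] * GST_VIDEO_INFO_COMP_HEIGHT (vinfo, i);
    }
  }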
No semantic change, just rename the 'tmp' variable to a more meaningful
name and use the same structure as in gst_kms_allocator_bo_alloc().
Needed as I'm going to move the gst_memory_init() call after the
allocation of the DUMB buffer.
https://bugzilla.gnome.org/show_bug.cgi?id=785029
HRESULT is unsigned long on Windows, but the Decklink headers define
it as 'int' on Linux. Confusingly, the defines that describe the
possible return values for it use long constants. The easy fix would
be to change the linux/LinuxCOM.h header, but that's copied from the
Decklink SDK.
Change the logging to always upcast to unsigned long when printing
HRESULT, for consistency across platforms.
gstdecklinkvideosrc.cpp:425:7: warning: format '%x' expects argument of type 'unsigned int', but argument 8 has type 'HRESULT {aka long int}' [-Wformat]
[and so on]
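The resulting logging pattern looks roughly like this ('self' and 'ret'
are placeholder variables):

  /* Print HRESULT through an explicit cast so the same format string
   * works on Windows (unsigned long) and Linux (int). */
  GST_WARNING_OBJECT (self, "Operation failed: 0x%08lx", (unsigned long) ret);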
gstdecklinkaudiosink.cpp:155:19: error: conflicting type attributes specified for 'virtual HRESULT GStreamerAudioOutputCallback::QueryInterface(const IID&, void**)'
In file included from /var/lib/jenkins/workspace/cerbero-cross-mingw32/workdir/mingw/w32/bin/../lib/gcc/i686-w64-mingw32/4.7.3/../../../../i686-w64-mingw32/include/objbase.h:153:0,
from /var/lib/jenkins/workspace/cerbero-cross-mingw32/workdir/mingw/w32/bin/../lib/gcc/i686-w64-mingw32/4.7.3/../../../../i686-w64-mingw32/include/ole2.h:16,
from /var/lib/jenkins/workspace/cerbero-cross-mingw32/workdir/mingw/w32/bin/../lib/gcc/i686-w64-mingw32/4.7.3/../../../../i686-w64-mingw32/include/windows.h:94,
from /var/lib/jenkins/workspace/cerbero-cross-mingw32/workdir/mingw/w32/bin/../lib/gcc/i686-w64-mingw32/4.7.3/../../../../i686-w64-mingw32/include/rpc.h:16,
from win/DeckLinkAPI.h:27,
from gstdecklink.h:35,
from gstdecklinkaudiosink.h:27,
from gstdecklinkaudiosink.cpp:25:
/var/lib/jenkins/workspace/cerbero-cross-mingw32/workdir/mingw/w32/bin/../lib/gcc/i686-w64-mingw32/4.7.3/../../../../i686-w64-mingw32/include/unknwn.h:67:25: error: overriding 'virtual HRESULT IUnknown::QueryInterface(const IID&, void**)'
(and many more)
https://ci.gstreamer.net/job/cerbero-cross-mingw32/6407/console
The default memory allocator of the Decklink library allocates
a fixed pool of buffers, and the number of buffers is unknown.
This makes it impossible to do useful queueing downstream. The new
memory allocator can create an unlimited number of buffers,
giving all the queueing features one would expect from a live source.
https://bugzilla.gnome.org/show_bug.cgi?id=782556
In this patch we keep track of the cached kmsmem in a way
that lets us clear the cache during the drain process. This
releases the framebuffer before waiting for the next vblank,
hence adds support for DRM drivers (like the Intel one) that release
the associated DMABuf reference asynchronously.
https://bugzilla.gnome.org/show_bug.cgi?id=782774
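A hedged sketch of that caching mechanism using GStreamer's qdata API;
the quark name and helper names are hypothetical:

  static GQuark
  cached_kmsmem_quark (void)
  {
    static GQuark quark = 0;
    if (!quark)
      quark = g_quark_from_static_string ("kmssink-cached-kmsmem");
    return quark;
  }

  static void
  set_cached_kmsmem (GstMemory * mem, GstMemory * kmsmem)
  {
    /* The cached KMS memory is dropped together with the upstream memory,
     * or explicitly via clear_cached_kmsmem() below. */
    gst_mini_object_set_qdata (GST_MINI_OBJECT (mem), cached_kmsmem_quark (),
        kmsmem, (GDestroyNotify) gst_memory_unref);
  }

  static void
  clear_cached_kmsmem (GstMemory * mem)
  {
    /* Called while draining: releases the framebuffer-backed memory before
     * waiting for the next vblank. */
    gst_mini_object_set_qdata (GST_MINI_OBJECT (mem), cached_kmsmem_quark (),
        NULL, NULL);
  }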
kmssink keeps a reference on the last rendered buffer. If this buffer
refers to an upstream buffer, it should be released on DRAIN
and ALLOCATION queries so that all upstream buffers can be returned to the
pool if needed. As the buffer may be used for scanout, we copy it
into a dumb buffer before letting it go.
Based on patch from Guillaume Desmottes <guillaume.desmottes@collabora.com>
https://bugzilla.gnome.org/show_bug.cgi?id=782774
This otherwise breaks DMABuf reclaiming. This is not visible from
userspace, but inside the kernel the DRM driver will hold a ref to the
DMABuf object. With a V4L2 driver allocating those DMABufs, this then
prevents changing the resolution and re-allocating new buffers.
https://bugzilla.gnome.org/show_bug.cgi?id=782774
Milliseconds was wrong and made the use of this timeout quite
confusing. The code uses the value as microseconds, so
any meaningful number was off by orders of magnitude.
Set the pts and dts on the frame that we receive from the msdk.
Also fix the inverted logic when setting sync points: previously we
were marking all frames as sync points except IDRs.
https://bugzilla.gnome.org/show_bug.cgi?id=782801
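As an illustration, the corrected logic looks roughly like this ('out_buf',
'frame' (a GstVideoCodecFrame) and 'is_idr' are placeholders):

  /* Propagate timestamps from the codec frame and only mark IDR frames
   * as sync points (previously the logic was inverted). */
  GST_BUFFER_PTS (out_buf) = frame->pts;
  GST_BUFFER_DTS (out_buf) = frame->dts;

  if (is_idr)
    GST_VIDEO_CODEC_FRAME_SET_SYNC_POINT (frame);
  else
    GST_VIDEO_CODEC_FRAME_UNSET_SYNC_POINT (frame);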
When extracting an aux buffer from an MJPG carrier, at
*least* put the original timestamp on it, even if we
fail to apply any other timestamp (which we always do
at the moment, because the timestamp calculating code
was never finished). Apply a DTS using the camera-supplied
delay value as well, assuming that there's
no re-ordering going on (there isn't in the C920,
which is really the only extant camera doing this
stuff), and warn if that turns out not to be true.
This is basically a frame counter provided by the driver, advancing
at the speed of the HDMI/SDI input. Having it available on each buffer
makes it possible to know which constant-framerate-based timestamp each
frame corresponds to, and can be used e.g. to write out files
accordingly without using the local pipeline clock timestamps.
https://bugzilla.gnome.org/show_bug.cgi?id=779213
The main advantage is that our sleeps can be interrupted in case of
a src_reset(). Earlier, we would have to wait for a read to complete
before we could do a reset, which could take a long time.
https://bugzilla.gnome.org/show_bug.cgi?id=781249
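Conceptually, such an interruptible wait can be sketched with a GCond
(field names and data_available() are illustrative, not the actual
directsoundsrc code):

  /* Reader loop: wait in small, interruptible slices instead of blocking
   * in one long read. */
  g_mutex_lock (&self->lock);
  while (!self->flushing && !data_available (self)) {
    gint64 deadline = g_get_monotonic_time () + 10 * G_TIME_SPAN_MILLISECOND;
    if (!g_cond_wait_until (&self->cond, &self->lock, deadline))
      break;                      /* timed out: poll the device again */
  }
  g_mutex_unlock (&self->lock);

  /* src_reset(): wake up the reader immediately. */
  g_mutex_lock (&self->lock);
  self->flushing = TRUE;
  g_cond_signal (&self->cond);
  g_mutex_unlock (&self->lock);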
The audio packet times can be completely unrelated to the video stream
time, depending on the card. While this looks like a bug in the driver,
just always using the video stream time (which is correct) works as a
workaround for now.
Earlier, the plugin was ignoring those settings and blindly setting
buffer-time to 2 seconds and latency-time to 200ms, which forced all
pipelines to have a minimum latency of 200ms + sink latency.
The values of segsize and segtotal were also not derived correctly.
Now we obey these values, and you can get close to the previous
behaviour by setting buffer-time and latency-time manually. Note that
they are set in microseconds.
As a consequence, when we haven't received enough data from the
device, we now sleep for a time proportional to the data remaining.
However, DirectSound is a deprecated API, so it maintains its own
software ringbuffer which updates at arbitrary intervals. Hence we
might have to wait a full segsize to get the last 10% of data. To
avoid tight loops, we clamp our sleep floor at 10ms.
In my testing, this keeps the wakeups not-too-high (proportional to
the latency-time set on the source). Further improvements should be
made by fixing the WASAPI audio source plugin instead of this.
DirectSound is deprecated and, as the comments explain, it is
impossible to get low latency, decent quality, or good performance
from it.
Based on a patch by Sebastian Dröge <sebastian@centricular.com>
https://bugzilla.gnome.org/show_bug.cgi?id=781249
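The "sleep proportional to the remaining data, clamped to 10ms" behaviour
described above amounts to roughly this (variable names are illustrative):

  /* Sleep long enough for the missing part of the segment to arrive, but
   * never spin faster than once per 10ms: DirectSound's software ring
   * buffer only updates at coarse intervals anyway. */
  guint bytes_missing = segsize - bytes_available;
  GstClockTime sleep_time = gst_util_uint64_scale (bytes_missing, GST_SECOND,
      bytes_per_second);

  sleep_time = MAX (sleep_time, 10 * GST_MSECOND);
  g_usleep (sleep_time / GST_USECOND);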
This reverts commit 845832263b.
The commit broke cross-mingw CI:
https://ci.gstreamer.net/job/GStreamer-master/8659/console
It seems that cross-mingw on Autotools and native-mingw on Meson
disagree about the size of HRESULT. Revert for now till I can
investigate the Meson side of things some more.
MinGW does not provide comsupp.lib, so there's no implementation of
_com_util::ConvertBSTRToString. Use a fallback implementation that
uses wcstombs() instead.
On MinGW we also truncate the name to 100 chars which should be fine.
The QTKit framework has been deprecated for a long time in favour of the
AVFoundation framework, and we already have avfvideosrc that provides the
same functionality.
https://bugzilla.gnome.org/show_bug.cgi?id=782078
MediaCodec gives us a presentation timestamp of 0 if it does not know
anything, but GStreamer gives us GST_CLOCK_TIME_NONE. Don't mix up these
two.
https://bugzilla.gnome.org/show_bug.cgi?id=780190
This is basically a frame counter provided by the driver, advancing
at the speed of the HDMI/SDI input. Having it available on each buffer
makes it possible to know which constant-framerate-based timestamp each
frame corresponds to, and can be used e.g. to write out files
accordingly without using the local pipeline clock timestamps.
https://bugzilla.gnome.org/show_bug.cgi?id=779213
This reverts commit 6d256d9908.
It was configuring the period/buffer size in a way that often causes
drop-outs or complete underruns. Needs further investigation.
"meson encountered an error in file
sys/decklink/meson.build, line 33, column 2:
Invalid use of addition: must be str, not list"
Also remove nonsensical linker flags on Windows.
https://bugzilla.gnome.org/show_bug.cgi?id=781156
segsize should be based on latency-time, and must be a multiple of the
frame size. segtotal should be based on buffer-time and segsize.
This prevents errors caused by outputting buffers that are not a
multiple of the frame size, and actually makes the buffer-time and
latency-time properties do what they're supposed to do.
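A sketch of that derivation, assuming a GstAudioRingBufferSpec 'spec' and
a negotiated GstAudioInfo 'info' (not the literal plugin code):

  guint bpf = GST_AUDIO_INFO_BPF (&info);    /* bytes per frame */
  guint rate = GST_AUDIO_INFO_RATE (&info);

  /* latency-time and buffer-time are in microseconds. */
  guint segsize = gst_util_uint64_scale (spec->latency_time, rate * bpf,
      G_USEC_PER_SEC);
  segsize -= segsize % bpf;                  /* whole number of frames */

  guint segtotal = spec->buffer_time / spec->latency_time;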
gstkmssink.c: In function ‘gst_kms_sink_get_input_buffer’:
gstkmssink.c:1102:29: error: ‘mems[0]’ may be used uninitialized in this function [-Werror=maybe-uninitialized]
kmsmem = (GstKMSMemory *) get_cached_kmsmem (mems[0]);
^~~~~~~~~~~~~~~~~~~~~~~~~~~
cc1: all warnings being treated as errors
avfvideosrc represents an iPhone camera or, on Mac, a screen capture session.
The old API only allowed selecting an input device by device index. The new
API adds the ability to select the position (front- or back-facing) and
device-type (wide angle, telephoto, etc.). Furthermore, you can now specify
the orientation (portrait, landscape, etc.) of the video stream.
https://bugzilla.gnome.org/show_bug.cgi?id=778333
All code interacting with Objective-C objects should now use Automatic
Reference Counting rather than manual memory management or Garbage
Collection. Because ARC prohibits C structs from containing
references to Objective-C objects, all such fields are now typed
'gpointer'. Setting and getting Objective-C fields on such a
struct now uses explicit __bridge_* casts to tell ARC about
object lifetimes.
https://bugzilla.gnome.org/show_bug.cgi?id=777847
It was previously possible for videotexturecache to be finalized before all of
its textures. Finalizing outstanding textures in this circumstance leads
to a crash. This patch ensures resources are freed in the proper order.
https://bugzilla.gnome.org/show_bug.cgi?id=779247
This seems to happen sometimes on some hardware, and is not really
critical as long as the scheduling of the normal frames works fine.
Only post a warning message for this case.
Overriding the pad query function completely overrides all the default
query handling implemented in basesrc, including caps etc. The correct
thing to do is just override the basesrc query vfunc and then chain up
for the queries we don't handle.
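A minimal sketch of that pattern (parent_class as set up by G_DEFINE_TYPE;
the latency handling is just a placeholder):

  static gboolean
  example_src_query (GstBaseSrc * bsrc, GstQuery * query)
  {
    switch (GST_QUERY_TYPE (query)) {
      case GST_QUERY_LATENCY:
        /* ... fill in this element's own latency ... */
        return TRUE;
      default:
        break;
    }
    /* CAPS, DURATION, SEEKING, ... are handled by basesrc. */
    return GST_BASE_SRC_CLASS (parent_class)->query (bsrc, query);
  }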
The cached texture was treated as user_data passed to GstGLBaseMemory
and freed with a GDestroyNotify function. However, this data must
be treated specially: it must be destroyed in the GL thread.
https://bugzilla.gnome.org/show_bug.cgi?id=778434
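A sketch of how such a texture can instead be released on the GL thread;
release_texture() and free_cached_texture() are hypothetical helpers,
gst_gl_context_thread_add() is real API:

  static void
  release_texture (GstGLContext * context, gpointer data)
  {
    /* Runs with the GL context current, so the GL resources backing the
     * memory can be destroyed safely. */
    gst_memory_unref (GST_MEMORY_CAST (data));
  }

  static void
  free_cached_texture (GstGLContext * context, GstGLMemory * texture)
  {
    gst_gl_context_thread_add (context, release_texture, texture);
  }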
Enforce exactly the same raw video format on both sides, include a
videoconvert and queue before the video sink and make the shm area a
little bit bigger so that things don't get stuck.
and error out here already otherwise. We currently don't support
reconfiguration here, and it can't really happen either unless the auto
mode is selected.
15:18:47 gstdecklinkaudiosrc.cpp:745:45: error: cannot initialize a parameter of type 'int64_t *' (aka 'long long *') with an rvalue of type 'gint64 *' (aka 'long *')
15:18:47 (BMDDeckLinkMaximumAudioChannels, &self->channels_found);
15:18:47 ^~~~~~~~~~~~~~~~~~~~~
15:18:47 ./linux/DeckLinkAPI.h:970:87: note: passing argument to parameter 'value' here
15:18:47 virtual HRESULT GetInt (/* in */ BMDDeckLinkAttributeID cfgID, /* out */ int64_t *value) = 0;
15:18:47 ^
gstdecklink.cpp:821:11: warning: variable 'dtc' is used uninitialized whenever 'if' condition is false [-Wsometimes-uninitialized]
if (m_input->videosrc) {
^~~~~~~~~~~~~~~~~
gstdecklink.cpp:837:41: note: uninitialized use occurs here
stream_time, stream_duration, dtc, no_signal);
^~~
gstdecklink.cpp:821:7: note: remove the 'if' if its condition is always true
if (m_input->videosrc) {
^~~~~~~~~~~~~~~~~~~~~~~
gstdecklink.cpp:810:29: note: initialize the variable 'dtc' to silence this warning
IDeckLinkTimecode *dtc;
^
= NULL
In some places a GST_FLOW_FLUSHING result was returned as a FALSE
gboolean and then returned from a parent function as
GST_FLOW_ERROR. This prevented seeking from working.
https://bugzilla.gnome.org/show_bug.cgi?id=776360
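Illustrative example of the fix (not the actual code):

  static GstFlowReturn
  example_push (GstPad * srcpad, GstBuffer * buffer)
  {
    GstFlowReturn ret = gst_pad_push (srcpad, buffer);

    /* Previously: return ret == GST_FLOW_OK;  -- a GST_FLOW_FLUSHING result
     * (normal during a flushing seek) was collapsed to FALSE and the caller
     * reported GST_FLOW_ERROR. Propagating the flow return keeps FLUSHING
     * non-fatal. */
    return ret;
  }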
gstamcvideodec.c: In function 'gst_amc_video_dec_src_query':
gstamcvideodec.c:2412:55: error: 'self' undeclared (first use in this function)
if (gst_gl_handle_context_query ((GstElement *) self, query,
This logic did not belong in the channel configuration
parser (only used by dvbbasebin) but in dvbsrc, which
is the element directly using this value and honoring
the "adapter" property.
Allows previously non-working cases like this to work:
GST_DVB_ADAPTER=1 gst-launch-1.0 dvbsrc delsys=11 modulation=7 frequency=689000000 ! fakesink
If they were not ported after 4+ years it seems unlikely that anybody is
ever going to need them again. They're still in the GIT history if
needed.
https://bugzilla.gnome.org/show_bug.cgi?id=774530
Configure the display mode when setting the negotiated caps instead of
during showing the first frame.
A framebuffer is required to set the mode. Allocate a buffer object
according to the negotiated caps and use it to set the mode. This buffer
object cannot be freed until another page flip has happened on the crtc
(i.e., until the first frame is rendered).
https://bugzilla.gnome.org/show_bug.cgi?id=773473
Signed-off-by: Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
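A condensed sketch of setting the mode from a freshly allocated buffer
object as described above; error handling is omitted, the bo/handle
plumbing and the fixed pixel format are placeholders, while
drmModeAddFB2()/drmModeSetCrtc() are real libdrm calls:

  #include <xf86drm.h>
  #include <xf86drmMode.h>
  #include <drm_fourcc.h>

  static int
  set_mode_with_bo (int fd, uint32_t crtc_id, uint32_t conn_id,
      drmModeModeInfo * mode, uint32_t bo_handle, uint32_t pitch)
  {
    uint32_t fb_id;
    uint32_t handles[4] = { bo_handle, 0, 0, 0 };
    uint32_t pitches[4] = { pitch, 0, 0, 0 };
    uint32_t offsets[4] = { 0, 0, 0, 0 };

    /* A framebuffer is required before the mode can be set. */
    drmModeAddFB2 (fd, mode->hdisplay, mode->vdisplay, DRM_FORMAT_XRGB8888,
        handles, pitches, offsets, &fb_id, 0);

    /* The buffer object backing fb_id must stay alive until the next page
     * flip on this CRTC, i.e. until the first real frame is shown. */
    return drmModeSetCrtc (fd, crtc_id, fb_id, 0, 0, &conn_id, 1, mode);
  }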
The force-modesetting parameter forces kmssink to ignore already
configured display modes, configure the display mode itself, and use
the base plane for output.
https://bugzilla.gnome.org/show_bug.cgi?id=773473
If the input buffers have a different size than the display, the frames
would have to be scaled or positioned on the display. The kmssink cannot
decide which behaviour would be appropriate for which use case.
In order to avoid scaling or positioning of the input stream, allow only
the supported connector resolutions in the sink caps.
https://bugzilla.gnome.org/show_bug.cgi?id=773473
Signed-off-by: Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
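A sketch of building the sink caps from the modes the connector reports,
so only resolutions the display can actually show are negotiated; the
single fixed format is just for illustration:

  #include <gst/gst.h>
  #include <xf86drmMode.h>

  static GstCaps *
  caps_from_connector (drmModeConnector * conn)
  {
    GstCaps *caps = gst_caps_new_empty ();
    int i;

    for (i = 0; i < conn->count_modes; i++) {
      drmModeModeInfo *mode = &conn->modes[i];

      gst_caps_append_structure (caps,
          gst_structure_new ("video/x-raw",
              "format", G_TYPE_STRING, "BGRx",
              "width", G_TYPE_INT, mode->hdisplay,
              "height", G_TYPE_INT, mode->vdisplay,
              "framerate", GST_TYPE_FRACTION, mode->vrefresh, 1, NULL));
    }
    return gst_caps_simplify (caps);
  }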
Displays usually support multiple modes. Therefore, the kmssink should
not only support the preferred mode, but any mode that is supported by
the display.
https://bugzilla.gnome.org/show_bug.cgi?id=773473
The kmssink assumed that the mode was already set by another application
and used an overlay plane for displaying the frames.
Use the preferred mode of the monitor and render to the base plane if
the crtc does not have a valid mode.
https://bugzilla.gnome.org/show_bug.cgi?id=773473
Signed-off-by: Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
gstdecklink.cpp: In member function ‘virtual HRESULT GStreamerDecklinkInputCallback::VideoInputFrameArrived(IDeckLinkVideoInputFrame*, IDeckLinkAudioInputPacket*)’:
gstdecklink.cpp:766:34: error: ‘base_time’ may be used uninitialized in this function [-Werror=maybe-uninitialized]
capture_time -= base_time;
^
First of all, all the HD and UHD modes should be top-field-first, as
also returned by the Decklink mode iterator API.
Then we should include the caps field "field-order" in the caps of the
source (not the sink due to negotiation problems with optional fields).
And finally we should set the TFF flag on interlaced buffers that are
top-field-first.
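The two pieces look roughly like this ('caps' and 'buffer' are placeholders
for the negotiated source caps and an outgoing interlaced frame):

  /* Advertise the field order in the source caps ... */
  gst_caps_set_simple (caps,
      "interlace-mode", G_TYPE_STRING, "interleaved",
      "field-order", G_TYPE_STRING, "top-field-first", NULL);

  /* ... and flag each interlaced buffer as top-field-first. */
  GST_BUFFER_FLAG_SET (buffer, GST_VIDEO_BUFFER_FLAG_INTERLACED);
  GST_BUFFER_FLAG_SET (buffer, GST_VIDEO_BUFFER_FLAG_TFF);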
On some hardware the first few frames are bogus and not very useful.
Their timestamps are off, they have no timecodes, or there are spurious
black frames / no-signal frames. After a few frames this stabilizes
though.
https://bugzilla.gnome.org/show_bug.cgi?id=774850
Based on this we calculate the actual capture time, which should get
rid of any capture jitter by averaging it out.
Also add an output-stream-time property which forces the elements to
output the stream time directly instead of doing any conversion to the
pipeline clock. Use with care.
https://bugzilla.gnome.org/show_bug.cgi?id=774850
The hardware timestamps have no relation to when frames were produced,
only when frames arrived somewhere in the hardware. Especially there is
no guarantee that audio and video will have the same hardware timestamps
although they belong together, and even more important: the rate with
which the hardware timestamps increase is completely unrelated to the
rate with which the frames are captured!
As such we might as well use the pipeline clock directly and stop doing
complicated calculations. As a side effect, this now allows running
without any pipeline clock, by directly making use of the stream times
as reported by the driver.
https://bugzilla.gnome.org/show_bug.cgi?id=774850
libkms should not be used, because it imposes limitations on the DRM
API, especially regarding bpp and stride. Instead the DRM IOCTL should
be used directly.
Switch from libkms to the IOCTL interface. Set bpp and height for
framebuffer allocation to properly handle planar video formats.
https://bugzilla.gnome.org/show_bug.cgi?id=773473
Signed-off-by: Víctor Jáquez <vjaquez@igalia.com>
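A sketch of allocating a dumb buffer directly through the DRM ioctl
interface instead of libkms, with bpp/height chosen to cover all planes of
a planar format; the wrapper is illustrative, the ioctl and struct are real:

  #include <string.h>
  #include <xf86drm.h>

  static int
  alloc_dumb_bo (int fd, uint32_t width, uint32_t height, uint32_t bpp,
      uint32_t * handle, uint32_t * pitch, uint64_t * size)
  {
    struct drm_mode_create_dumb arg;

    memset (&arg, 0, sizeof (arg));
    arg.width = width;
    arg.height = height;          /* e.g. height * 3 / 2 for I420/NV12 */
    arg.bpp = bpp;                /* e.g. 8 for planar YUV, 32 for BGRx */

    if (drmIoctl (fd, DRM_IOCTL_MODE_CREATE_DUMB, &arg))
      return -1;

    *handle = arg.handle;
    *pitch = arg.pitch;
    *size = arg.size;
    return 0;
  }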
When a frame is found to not have an associated input source (cable
unplugged, wrong mode selected), an element warning will be issued. When
the next frame in the stream is found to have an input source selected
(e.g. cable replugged), an element info will be issued.
https://bugzilla.gnome.org/show_bug.cgi?id=774629
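A sketch of the signal-lost / signal-recovered messages using the standard
element message macros; 'no_input_source' and the 'signal_lost_posted'
field are hypothetical:

  if (no_input_source && !self->signal_lost_posted) {
    GST_ELEMENT_WARNING (self, RESOURCE, READ, ("Signal lost"),
        ("No input source detected - video frames are invalid"));
    self->signal_lost_posted = TRUE;
  } else if (!no_input_source && self->signal_lost_posted) {
    GST_ELEMENT_INFO (self, RESOURCE, READ, ("Signal recovered"),
        ("Input source detected again"));
    self->signal_lost_posted = FALSE;
  }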
Fixes:
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** +[NSString stringWithUTF8String:]: NULL cString
in the state change test.
The default get_times() function of the base sink is just fine.
Remove the custom get_times() function, because the default function
already reads the timestamps from the buffers.
Signed-off-by: Michael Tretter <m.tretter@pengutronix.de>
https://bugzilla.gnome.org/show_bug.cgi?id=773473
Unfortunately this does not go through the normal state change
machinery, so we don't get notified about this in change_state().
However we need to stop scheduled playback, so that once PLAYING is
reached again we can start scheduled playback with the correct time.
Without this, flushing seeks in PLAYING will not work correctly:
decklinkvideosink would wait, before showing the new frames, for the
amount of time the pipeline had already spent in PLAYING.
Drawing is done via the GDI drawing functions. The cursor is
converted to a monochrome version before drawing, because
the GDI drawing functions seem to have undefined behaviour with
cursor images that include an alpha channel.
I could not find any other reliable way to draw these alpha-channel
cursors without producing unwanted artifacts. This type of
cursor was introduced with Windows Vista when running its
Aero theme.
Also adjust the cursor coordinates when capturing non-primary
screens via the "monitor" option.
https://bugzilla.gnome.org/show_bug.cgi?id=760172
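A rough Win32 sketch of drawing the current cursor into the capture DC;
'hdc', 'offset_x' and 'offset_y' are placeholders, and the monochrome
conversion mentioned above is omitted for brevity:

  CURSORINFO ci = { sizeof (CURSORINFO) };
  ICONINFO ii;

  if (GetCursorInfo (&ci) && (ci.flags & CURSOR_SHOWING) &&
      GetIconInfo (ci.hCursor, &ii)) {
    /* Adjust for the monitor origin when capturing a non-primary screen. */
    DrawIconEx (hdc, ci.ptScreenPos.x - ii.xHotspot - offset_x,
        ci.ptScreenPos.y - ii.yHotspot - offset_y, ci.hCursor,
        0, 0, 0, NULL, DI_NORMAL);
    DeleteObject (ii.hbmMask);
    if (ii.hbmColor)
      DeleteObject (ii.hbmColor);
  }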
* Rephrase tune error to be delsys-neutral
* Refer to the actual check in the 'missing sanity check' warnings
* Use "Delivery system" instead of 'delsys'. The
latter is OK as a shorthand in the code but not
even a real word