* A copy-paste error was getting the information from the wrong
query
* The 'allocation_meta' GstStructure was being leaked
* No check was done on whether the query existed before trying to set
  the resulting allocation meta on it
CID: 1439872
CID: 1439873
CID: 1439874
CID: 1439875
CID: 1439876
CID: 1439877
Fixes regression introduced by "clean-up" done as part of commit 98ebcb4.
dummy must live as long as we use the return value of localtime_r(),
since that is just a pointer to it. By putting dummy inside the block we
made it go out of scope right after localtime_r() returned, which messed
up the time values: by the time we poked at the struct, its contents
might already have been overwritten.
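A minimal sketch of the scoping problem and the fix; the surrounding
code and variable names here are illustrative, not the actual code from
the commit:

    #include <stdio.h>
    #include <time.h>

    int
    main (void)
    {
      time_t t = time (NULL);
      struct tm *tm;

      /* Broken: dummy goes out of scope as soon as the block ends, so tm
       * points at storage whose contents may already be overwritten by
       * the time we read e.g. tm->tm_hour */
      {
        struct tm dummy;
        tm = localtime_r (&t, &dummy);
      }

      /* Fixed: keep dummy alive for as long as tm is dereferenced */
      struct tm dummy;
      tm = localtime_r (&t, &dummy);
      printf ("%02d:%02d\n", tm->tm_hour, tm->tm_min);

      return 0;
    }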
Fixes #722
This was added in 1.0.1 more than 16 years ago, so I think we can
safely assume this is always present now. Also in tremor.
While at it, bump vorbis requirement to 1.3.1 from 2010.
When DMABuf was tried, we would renegotiate back and forth between
DMABuf and system memory if the export failed. This would happen for
every single frame.
This patch introduces a try_dmabuf_exports boolean, which is unset when
an export fails. This boolean is set back to TRUE when upstream pushes
new caps or downstream pushes a reconfigure event.
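A rough sketch of the resulting logic; the struct, field and function
names below are illustrative, not the actual glupload code:

    typedef struct
    {
      gboolean try_dmabuf_exports;
      /* ... */
    } Upload;                       /* illustrative, not the real struct */

    static void
    handle_buffer (Upload * upload, GstBuffer * buffer)
    {
      if (upload->try_dmabuf_exports) {
        if (!try_export_as_dmabuf (upload, buffer)) {
          /* don't renegotiate on every frame; stick to system memory */
          upload->try_dmabuf_exports = FALSE;
        }
      }
      /* ... normal system-memory path ... */
    }

    static void
    handle_renegotiation (Upload * upload)
    {
      /* new caps from upstream or a reconfigure event from downstream:
       * give DMABuf export another chance */
      upload->try_dmabuf_exports = TRUE;
    }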
This introduces an enum in order to clean up how we select the
transfer mode. It also fixes the case where we fall back to PBO but
didn't actually execute the PBO transfer. That was not causing any real
issue, it just shifted the processing latency to the next element,
which can be confusing.
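For illustration, the enum might look roughly like this (the names are
assumptions, not the exact definition from the commit):

    typedef enum
    {
      TRANSFER_NONE,     /* nothing to do */
      TRANSFER_PBO,      /* asynchronous download through a PBO */
      TRANSFER_DMABUF,   /* export as DMABuf */
    } TransferMode;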
This is possibly not strictly needed when pixels are being downloaded
to CPU memory, but would cause issues when exporting DMABuf, as the
data may not yet be ready when the DMABuf reaches the consumer.
Fixes the macOS -Werror build:
../ext/gl/caopengllayersink.m:336:23: error: '__bridge_retained' casts have no effect when not using ARC [-Werror,-Warc-bridge-casts-disallowed-in-nonarc]
ca_sink->layer = (__bridge_retained gpointer)layer;
^~~~~~~~~~~~~~~~~~
By passing NULL to `g_signal_new` instead of a marshaller, GLib will
actually internally optimize the signal (if the marshaller is available
in GLib itself) by also setting the valist marshaller. This makes the
signal emission a bit more performant than the regular marshalling,
which still needs to box into `GValue` and call libffi in case of a
generic marshaller.
Note that for custom marshallers, one would use
`g_signal_set_va_marshaller()` with the valist marshaller instead.
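For instance, with a hypothetical signal (the point being the NULL
passed where a c_marshaller would normally go):

    guint example_signal =
        g_signal_new ("example", G_TYPE_FROM_CLASS (klass),
            G_SIGNAL_RUN_LAST, 0, NULL, NULL,
            NULL /* let GLib install its optimized (va) marshaller */,
            G_TYPE_NONE, 1, G_TYPE_INT);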
The gltestsrc element was refactored to inherit from this base class which
handles the GL context. The sub-class only needs to implement the gl_start,
gl_stop and fill_gl_memory vfuncs, along with properly advertising the GL APIs
it supports through the supported_gl_api GstGLBaseSrc class attribute.
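A rough sketch of such a sub-class, assuming the vfunc signatures
below (the MyGLSrc names are made up for illustration):

    static gboolean
    my_gl_src_gl_start (GstGLBaseSrc * src)
    {
      /* allocate GL resources (shaders, FBOs, ...) on the GL thread */
      return TRUE;
    }

    static void
    my_gl_src_gl_stop (GstGLBaseSrc * src)
    {
      /* free GL resources on the GL thread */
    }

    static gboolean
    my_gl_src_fill_gl_memory (GstGLBaseSrc * src, GstGLMemory * mem)
    {
      /* render the frame into the provided GL memory */
      return TRUE;
    }

    static void
    my_gl_src_class_init (MyGLSrcClass * klass)
    {
      GstGLBaseSrcClass *gl_class = (GstGLBaseSrcClass *) klass;

      gl_class->supported_gl_api = GST_GL_API_OPENGL3 | GST_GL_API_GLES2;
      gl_class->gl_start = my_gl_src_gl_start;
      gl_class->gl_stop = my_gl_src_gl_stop;
      gl_class->fill_gl_memory = my_gl_src_fill_gl_memory;
    }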
In this mode, buffer timestamps are displayed as an absolute date
since a user-specifiable epoch. The format is also specifiable as a
string property that will be passed to g_date_time_format().
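For example, a date string for a buffer timestamp could be built with
GLib like this (the epoch and format string here are arbitrary
examples):

    GDateTime *epoch = g_date_time_new_utc (1970, 1, 1, 0, 0, 0);
    GDateTime *dt = g_date_time_add_seconds (epoch,
        (gdouble) GST_BUFFER_PTS (buffer) / GST_SECOND);
    gchar *str = g_date_time_format (dt, "%F %T");  /* e.g. "2019-06-12 14:03:07" */

    /* ... overlay str on the frame ... */

    g_free (str);
    g_date_time_unref (dt);
    g_date_time_unref (epoch);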
If we do, then multiple disjoint OpenGL contexts will not perform the
download and reupload of data that is necessary to cross between
OpenGL context sharegroups.
gst_gl_shader_string_get_highest_precision needs to make an OpenGL
call, so executing it outside the OpenGL thread and context results in
undefined behaviour.
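A sketch of doing it from the right place instead, by marshalling the
call onto the GL thread with gst_gl_context_thread_add(); the wrapper
function and the version/profile arguments picked here are
illustrative:

    static void
    _query_precision (GstGLContext * context, gpointer data)
    {
      const gchar **precision = data;

      /* now running on the context's GL thread, the GL call is safe */
      *precision = gst_gl_shader_string_get_highest_precision (context,
          GST_GLSL_VERSION_NONE, GST_GLSL_PROFILE_ES);
    }

    /* ... elsewhere, from any thread ... */
    const gchar *precision = NULL;
    gst_gl_context_thread_add (context, _query_precision, &precision);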
Removing gstalsadeviceprobe.[ch] as it was a relic from the 0.10 era.
This doesn't implement device monitoring, only probing; monitoring
should be implemented in its own commit.
Doing so involves retrieving the current viewport from OpenGL which,
as with any glGet operation, is expensive.
This means that the various sinks need to reset the viewport on draw.
In the process, fix resizing on cocoa.
If we only ever make it to READY, transform_caps can create an
internal convert object that will never be freed by basetransform's
stop vmethod (PAUSED->READY).