Adds a list of formats that should be used by elements that need to pass
video through. It contains the full list of video formats plus the DMA_DRM
format and will be extended in the future as needed. This patch includes 3
new symbols:
- GST_VIDEO_FORMATS_ANY_STR
- GST_VIDEO_FORMATS_ANY
- gst_video_formats_any()
The last one can be used by bindings or for code that prefers having
GstVideoFormat values instead of strings.
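For illustration, a hypothetical usage sketch (not code from this patch): the
pad template usage and the gst_video_formats_any() signature are assumed to
mirror the existing GST_VIDEO_FORMATS_ALL / gst_video_formats_raw() pattern.

```c
#include <gst/video/video.h>

/* Assumed usage: the string macro in a pad template, and the function
 * variant for code that wants GstVideoFormat enum values. */
static GstStaticPadTemplate sink_template =
    GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS,
    GST_STATIC_CAPS (GST_VIDEO_CAPS_MAKE (GST_VIDEO_FORMATS_ANY_STR)));

static void
print_all_formats (void)
{
  guint len, i;
  const GstVideoFormat *formats = gst_video_formats_any (&len);

  for (i = 0; i < len; i++)
    g_print ("%s\n", gst_video_format_to_string (formats[i]));
}
```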
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5386>
Fixes a potential GPU stall when a just-freed texture/buffer is immediately
reused by the CPU, e.g. when uploading.
The problematic scenario is this:
1. an element does GPU processing reading from the texture
2. it frees the buffer back to the pool
3. pool acquire returns the just-released buffer
4. GPU processing then has to wait for the previous GPU operation to
   complete, causing a stall
If there were a reliable way to know, across all GPU drivers, whether a
buffer has been finished with, we would use it. As that does not exist, this
workaround keeps a released buffer unusable until the next buffer is
released.
This is the same approach as is used in the qml (Qt5) elements.
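As a rough illustration of the hold-back idea (a sketch only: MyPool and its
held_buffer field are made up for the example, this is not the actual pool
code):

```c
#include <gst/gst.h>

/* Keep the most recently released buffer aside and only hand the previously
 * held one back to the pool, so a freshly released buffer is never the next
 * one acquired. */
static void
my_pool_release_buffer (GstBufferPool * pool, GstBuffer * buffer)
{
  MyPool *self = MY_POOL (pool);
  GstBuffer *to_release;

  GST_OBJECT_LOCK (self);
  to_release = self->held_buffer;   /* buffer released one cycle ago */
  self->held_buffer = buffer;       /* hold the fresh one back */
  GST_OBJECT_UNLOCK (self);

  if (to_release)
    GST_BUFFER_POOL_CLASS (parent_class)->release_buffer (pool, to_release);
}
```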
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5144>
If a depayloader aggregates multiple RTP buffers into one output buffer, only
the last RTP buffer was checked for header extensions. Now the depayloader
remembers all RTP packets pushed before an output buffer is pushed and checks
all of them for header extensions.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4979>
These 10-bit formats are identical to NV12_16L32S, except that each 64 bytes
of data is prefixed with 16 bytes holding the lower 2 bits of four pixels per
byte. For MT2110T, the lower two bits are packed so that each byte contains a
column of 4 pixels, also described as tiled lower 2 bits. MT2110T was chosen
as the name to match the vendor's chosen name; this format is unlikely to
exist for other vendors. For MT2110R, the lower 2 bits are in raster order.
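For concreteness, a sketch of how one sample could be reconstructed from such
an 80-byte group for the raster variant (illustrative only; the bit order
within a prefix byte is an assumption, not taken from this patch):

```c
#include <glib.h>

/* First 16 bytes: lower 2 bits of four pixels per byte (raster order),
 * following 64 bytes: upper 8 bits of the same 64 pixels. */
static guint16
mt2110r_sample (const guint8 * group, guint idx /* 0..63 within the group */)
{
  guint8 low2 = (group[idx / 4] >> ((idx % 4) * 2)) & 0x3;  /* assumed LSB-first */
  guint8 high8 = group[16 + idx];

  return ((guint16) high8 << 2) | low2;
}
```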
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3444>
The current limit is `x10`, which allows just `+20 dB` of gain.
While that may seem sufficient, it came up as a problem in a real-world,
non-specially-engineered situation: strawberry's EBU R 128 loudness
normalization (https://github.com/strawberrymusicplayer/strawberry/pull/1216).
There is an audio track (not intentionally engineered that way) with an
integrated loudness of `-38 LUFS`. If we want to normalize its loudness to
e.g. `-16 LUFS`, which is a very reasonable thing to do, we need to apply a
gain of `+22 dB`, which is larger than `+20 dB`, and we fail...
I think it should allow at least `+96 dB` of gain,
and therefore should be at `10^(96/20) ~= 63096`.
But I don't see why we need to put any specific restriction on that
parameter in the first place, other than the fact that the fixed-point
multiplication scheme does not support volumes larger than roughly 15x.
So let's just implement a floating-point fallback path that does not involve
fixed-point multiplication and lift the restriction altogether?
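The arithmetic behind the example above, as a standalone sketch (not code
from the patch):

```c
#include <math.h>
#include <stdio.h>

int
main (void)
{
  double measured_lufs = -38.0, target_lufs = -16.0;
  double gain_db = target_lufs - measured_lufs;   /* +22 dB */
  double linear = pow (10.0, gain_db / 20.0);     /* ~12.6x, above the old 10x cap */
  double cap_96db = pow (10.0, 96.0 / 20.0);      /* ~63096, a +96 dB ceiling */

  printf ("gain = %+.1f dB -> volume = %.2f (96 dB cap would be %.0f)\n",
      gain_db, linear, cap_96db);
  return 0;
}
```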
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5063>
Don't assume that compositor will output only a single buffer for a single
input buffer. If a buffer's running time is not completely aligned with the
output buffer's running time or duration, compositor can generate multiple
buffers. When that happened, two threads (the aggregator output thread and
the main thread) were both trying to modify the buffer in this test. Clear
the buffer after shutting down the pipeline to avoid the race.
Fixes: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2836
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5081>
The current implementation copies metas without checking whether the buffer
is writable.
The operation that actually needs to be done is to replace the input buffer
and copy the metas; copying the metas is only one part of that. We create a
new function that does both.
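A minimal sketch of the safe pattern (this is not the new helper added here,
just the general shape of it using core GstBuffer API):

```c
#include <gst/gst.h>

/* Make the destination writable before copying metas onto it. */
static GstBuffer *
replace_with_metas (GstBuffer * inbuf, GstBuffer * outbuf)
{
  outbuf = gst_buffer_make_writable (outbuf);
  /* copy only the metas from the original input buffer */
  gst_buffer_copy_into (outbuf, inbuf, GST_BUFFER_COPY_META, 0, -1);
  return outbuf;
}
```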
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4912>
Fixes test: validate.uridecodebin.expose_raw_pad_caps
testsrcbin (currently part of debugutilsbad) is a useful element for
validate tests.
validate.uridecodebin.expose_raw_pad_caps makes use of it.
Unfortunately, because validate tests with GStreamer only run with
whitelisted plugins and `debugutilsbad` wasn't in the whitelist, the
test was failing and being auto-skipped.
This patch adds debugutilsbad to the whitelists used by validate tests
in subprojects with a validate/meson.build.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4931>
Allow a project to use gstreamer-full as a static library and link it to
create a binary without external dependencies.
Introduce the option 'gst-full-target-type' to select the build type,
dynamic (default) or static.
In the gstreamer-full static build configuration, gstreamer (gst.c) needs
the symbol gst_init_static_plugins, which is defined in gstreamer-full.
All the tests and examples link against gstreamer, but the symbol
gst_init_static_plugins is only defined in the gstreamer-full library, and
gstreamer-full cannot be built first as it needs to know which plugins will
be built.
One option would be to build all the examples and tests after
gstreamer-full, as is done for the tools.
Also disable the tools build in subprojects, as the tools will be built at
the end of the build process.
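A minimal consumer sketch of what such an application looks like, assuming
the library was built with -Dgst-full-target-type=static (the pipeline is
just an example):

```c
#include <gst/gst.h>

/* gst_init() pulls in the statically registered plugins via
 * gst_init_static_plugins() when linking against static gstreamer-full. */
int
main (int argc, char **argv)
{
  GstElement *pipeline;

  gst_init (&argc, &argv);
  pipeline = gst_parse_launch ("videotestsrc num-buffers=1 ! fakesink", NULL);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  gst_element_get_state (pipeline, NULL, NULL, GST_CLOCK_TIME_NONE);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```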
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4128>
If gst_buffer_pool_set_config() fails, the pool keeps using its old config.
That config may have a different width or height when
pic_width/pic_height != frame_width/frame_height. As a result, the
assertions in theora_handle_image() will fail.
So check the result of gst_buffer_pool_set_config() and only use the pool
if it succeeds. Otherwise let the parent decide_allocation() create a new
pool.
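A sketch of the checked pattern (illustrative, not the exact hunk from this
patch):

```c
#include <gst/gst.h>

/* Only keep the pool if its new configuration was accepted; otherwise drop
 * it so the parent decide_allocation() creates a fresh one. */
static GstBufferPool *
configure_pool_or_drop (GstBufferPool * pool, GstCaps * caps, guint size,
    guint min, guint max)
{
  GstStructure *config = gst_buffer_pool_get_config (pool);

  gst_buffer_pool_config_set_params (config, caps, size, min, max);
  if (!gst_buffer_pool_set_config (pool, config)) {
    /* the old config stays active and may have the wrong width/height,
     * so don't use this pool at all */
    gst_object_unref (pool);
    return NULL;
  }
  return pool;
}
```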
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4600>
The first serialized events that can be sent on a src pad are a CAPS event
and then a SEGMENT event.
When handling events from the user in appsrc, we used to send a SEGMENT
automatically if none had been sent yet.
This breaks if the CAPS event has not been sent either, as we would then
send a SEGMENT before the CAPS.
Fix this by delaying such events until the CAPS has been configured.
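For reference, the required ordering on a src pad, sketched with the plain
pad API (a fragment for illustration, not appsrc internals; srcpad, caps,
segment and buffer are assumed to exist):

```c
/* stream-start, then caps, then segment, then buffers */
gst_pad_push_event (srcpad, gst_event_new_stream_start ("stream-id"));
gst_pad_push_event (srcpad, gst_event_new_caps (caps));
gst_pad_push_event (srcpad, gst_event_new_segment (&segment));
gst_pad_push (srcpad, buffer);
```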
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4297>
propose_allocation is added to meet applications' requirement to request
buffers. An application sometimes needs to create a buffer pool and request
buffers so it can do buffer management itself, and the GStreamer plugin then
imports the application's buffers. So, add propose_allocation to appsink,
like waylandsink and kmssink do.
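For context, a generic sketch of what a propose_allocation implementation
looks like in a GstBaseSink subclass (illustrative only, not the actual
appsink code):

```c
#include <gst/base/gstbasesink.h>

/* Answer the upstream ALLOCATION query so the producer can allocate from an
 * application-managed pool. */
static gboolean
my_sink_propose_allocation (GstBaseSink * sink, GstQuery * query)
{
  GstCaps *caps;
  gboolean need_pool;

  gst_query_parse_allocation (query, &caps, &need_pool);
  if (caps == NULL || !need_pool)
    return FALSE;

  /* a real implementation would advertise the application's pool and its
   * buffer size / count here instead of NULL and zeros */
  gst_query_add_allocation_pool (query, NULL, 0, 0, 0);
  return TRUE;
}
```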
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4185>
Read and flush the console buffer from the console thread immediately,
instead of from the main thread. Otherwise (if the main thread is busy) the
console thread will keep adding idle sources and the main thread will become
unresponsive.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4067>
This allows correct handling of wrapping around backwards during the
first wraparound period and avoids the infamous "Cannot unwrap, any
wrapping took place yet" error message.
It also makes sure that for actual timestamp jumps a valid value is
returned instead of 0, which then allows the caller to handle it properly.
Without this, the caller can see the same timestamp (0) for a very long
time, which for example can cause rtpjitterbuffer to output the same
timestamp for a very long time.
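A generic, standalone illustration of the unwrapping idea (this is not the
patched function, just the technique):

```c
#include <glib.h>

/* Extend a wrapping 32-bit timestamp into a monotonic 64-bit one.  Starting
 * the extended counter one full period in means a backwards wrap in the very
 * first period still maps to a valid (non-zero) value the caller can act on. */
static guint64
ext_timestamp (guint64 * last_ext, guint32 ts)
{
  const guint64 period = G_GUINT64_CONSTANT (1) << 32;
  guint64 base, best = 0, best_diff = G_MAXUINT64;
  guint64 candidates[3];
  gint i;

  if (*last_ext == 0)
    *last_ext = period + ts;            /* bias away from 0 */

  base = (*last_ext / period) * period; /* start of the current period */
  candidates[0] = base + ts;            /* same period */
  candidates[1] = base + period + ts;   /* wrapped forward */
  candidates[2] = base >= period ? base - period + ts : base + ts;  /* wrapped backward */

  /* pick the candidate closest to the previous extended timestamp */
  for (i = 0; i < 3; i++) {
    guint64 diff = candidates[i] > *last_ext ?
        candidates[i] - *last_ext : *last_ext - candidates[i];
    if (diff < best_diff) {
      best_diff = diff;
      best = candidates[i];
    }
  }

  *last_ext = best;
  return best;
}
```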
Fixes https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1500
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3202>
The scenario is what we exercise in the tests:
- we have a segment with .stop set
- some frame(s) flow
- we get a CAPS event
- we get an EOS (before getting any buffer after the CAPS event)
In that case, without this patch, the segment is not properly closed, which
is not correct. With this patch we keep track of the previous caps until a
new buffer arrives; this way, in that situation, we set the previous caps
again and close the segment with the previous buffer.
Fixes: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1352
in this specific case
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3059>
This allows users to let videorate fully fill segments when receiving EOS
or a new segment, removing the arbitrary limit of 25 duplicates, which might
not be what the user wants (for example on low-FPS streams in GES this
sometimes led to broken behavior).
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3000>
We are supposed to guarantee that exposed pads have their caps set, but for
sources that have pads with "all raw caps" templates, we end up exposing
pads that don't have caps set yet, which can break code (in GES for
example).
To avoid that, we let uridecodebin plug a `decodebin` after such pads and
let decodebin handle that for us. In the end the only thing decodebin does
in those cases is wait for the pads to be ready and expose them; after that
`uridecodebin` exposes those pads.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3009>
SMPTE 170M and 240M use the same RGB and white point coordinates and
therefore both primaries can be considered functionally equivalent.
Also, some transfer functions have different names but equal gamma
functions. Add another colorimetry compare function to deal with those
cases at once.
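A sketch of the primaries half of that equivalence (the helper name is made
up for the example; it is not the API added here):

```c
#include <gst/video/video.h>

/* Treat SMPTE 170M and 240M primaries as matching, since they share the same
 * RGB and white point coordinates. */
static gboolean
primaries_functionally_equal (GstVideoColorPrimaries a,
    GstVideoColorPrimaries b)
{
  if (a == b)
    return TRUE;

  return (a == GST_VIDEO_COLOR_PRIMARIES_SMPTE170M &&
      b == GST_VIDEO_COLOR_PRIMARIES_SMPTE240M) ||
      (a == GST_VIDEO_COLOR_PRIMARIES_SMPTE240M &&
      b == GST_VIDEO_COLOR_PRIMARIES_SMPTE170M);
}
```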
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/2765>