The order of the devices iterator from the SDK is undefined and can
change at random.
Keep the device-number property for backwards compatibility and
simplicity, but prefer the persistent-id property and also use it for
the device provider implementation.
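For illustration, selecting a device from an application might look like the
sketch below (the element name "exampledevicesrc" and the id value/type are
placeholders; only the "device-number" and "persistent-id" property names come
from this change):
```
/* Sketch only: "exampledevicesrc" and the id value are placeholders. */
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  GstElement *src;

  gst_init (&argc, &argv);

  src = gst_element_factory_make ("exampledevicesrc", NULL);
  if (src == NULL)
    return 1;

  /* Prefer the stable identifier over "device-number": the enumeration
   * order behind the latter is not guaranteed to stay the same. */
  g_object_set (src, "persistent-id", (gint64) 0x12345678, NULL);

  gst_object_unref (src);
  return 0;
}
```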
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3078>
GstDXGIGetDebugInterface() is unused when targeting UWP. We directly
call DXGIGetDebugInterface1() in that case.
Fixes build failure:
```
../gst-libs/gst/d3d11/gstd3d11device.cpp(271): error C2440: '=': cannot convert from 'HRESULT (__cdecl *)(UINT,const IID &,void **)' to 'DXGIGetDebugInterface_t'
../gst-libs/gst/d3d11/gstd3d11device.cpp(271): note: This conversion requires a reinterpret_cast, a C-style cast or function-style cast
```
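The shape of the fix, as a hedged C-style sketch (the actual code is C++, and
the guard macro and placeholder names below are assumptions; only the direct
DXGIGetDebugInterface1() call is what this commit describes):
```
/* Sketch only: under UWP, skip the DXGIGetDebugInterface_t function pointer
 * entirely and call DXGIGetDebugInterface1() directly, since their
 * signatures are not compatible. */
#include <dxgi1_3.h>
#include <dxgidebug.h>

static HRESULT
example_get_dxgi_debug (IDXGIDebug ** debug)
{
#if defined (EXAMPLE_TARGETING_UWP)     /* placeholder for the UWP build guard */
  return DXGIGetDebugInterface1 (0, &IID_IDXGIDebug, (void **) debug);
#else
  /* Non-UWP builds keep resolving the debug interface dynamically through
   * the DXGIGetDebugInterface_t function pointer, as before. */
  return example_resolve_debug_interface_as_before (debug);    /* placeholder */
#endif
}
```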
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3118>
According to the W3C
specification (https://w3c.github.io/webrtc-pc/#datachannel-send), we
should return an InvalidStateError exception when trying to send while
the channel is not open. In the world of C/glib/gstreamer we don't have
exceptions and have to rely on gboolean/GError instead. Introducing
this calls for a change in the signature of the action signals used to
send data on the datachannel. Changing the signature of the existing
"send-string" and "send-data" signals would be an immediate breaking
change, so instead we deprecate them. Furthermore, there is no way to
express GError** as an argument to an action signal in a way that fits
language bindings (pointer-to-pointer simply does not work), so we have
to use regular functions instead.
Therefore we introduce gst_webrtc_data_channel_send_data_full() and
gst_webrtc_data_channel_send_string_full() while deprecating the old
functions and corresponding signals.
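A minimal usage sketch of the new API, assuming the signatures described above
(gboolean return, GError** out parameter):
```
#include <gst/webrtc/webrtc.h>

static void
example_send (GstWebRTCDataChannel * channel)
{
  GError *error = NULL;

  /* Unlike the deprecated action signals, a failure (e.g. the channel is
   * not in the "open" state) is now reported via the return value and GError. */
  if (!gst_webrtc_data_channel_send_string_full (channel, "hello", &error)) {
    g_warning ("could not send: %s", error->message);
    g_clear_error (&error);
  }
}
```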
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1958>
Always hold a reference to the soft volume element
provided by the playsinkaudioconvert bin helper, the
same as when the volume is provided by a sink element;
otherwise the soft volume element gets unreffed too soon.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3108>
This is a regression that was introduced in
cca2f555d1 (yes, 9 years ago).
The only place where a demuxer streaming thread should be stopped is when the
sinkpad is deactivated from pull mode (i.e. PAUSED->READY).
Attempting to stop the task in this function would cause this to happen when a
FLUSH_STOP or STREAM_START event is received... which can cause deadlocks.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3109>
It would constantly want to renegotiate (and spam the debug log) even
though the channel layout hasn't actually changed. We use the same
fallback in gst_ffmpegauddec_negotiate() already.
This happens with WMA files for example.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3103>
We need to call this to register the MusicBrainz tags before we use
them in an XMP schema.
Fixes this critical when attempting to run jpegparse on a JPEG
containing MusicBrainz XMP tags:
```
GStreamer-CRITICAL **: 20:41:07.885: gst_tag_get_type: assertion 'info != NULL' failed
```
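A sketch of the shape of the fix, assuming the existing
gst_tag_register_musicbrainz_tags() helper from the tag library is the
registration call referred to above:
```
#include <gst/tag/tag.h>

static void
example_setup_xmp_schema (void)
{
  /* Register the MusicBrainz tags before any XMP schema entry looks them up
   * via gst_tag_get_type(), which otherwise hits the 'info != NULL' assertion. */
  gst_tag_register_musicbrainz_tags ();

  /* ... register the XMP schema entries that reference those tags ... */
}
```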
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3092>
Currently, if the user is not able to access the devices under /dev/media*,
either because no media devices are present on the system or simply because of
missing permission to access the device, v4l2codecs initialises with no features
and no debug messages.
Since calling `GST_DEBUG="v4l2*:7" gst-inspect-1.0 v4l2codecs` is a typical way
to diagnose why element(s) failed to enumerate, we should be more verbose here
when the user is not able to access any /dev/media* device. So print a simple
debug message in this case to aid debugging.
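A minimal sketch of what that looks like (names and structure are illustrative,
not the exact patch):
```
#include <errno.h>
#include <fcntl.h>
#include <glib.h>
#include <gst/gst.h>

/* Leave a DEBUG message behind when a media device cannot be opened, so
 * that GST_DEBUG="v4l2*:7" gst-inspect-1.0 v4l2codecs explains why no
 * features were registered. */
static int
example_open_media_device (GstDebugCategory * cat, const char *path)
{
  int fd = open (path, O_RDWR);

  if (fd < 0) {
    GST_CAT_DEBUG (cat, "Failed to open media device '%s': %s", path,
        g_strerror (errno));
  }

  return fd;
}
```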
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3088>
The purpose of a deep buffer copy is to be able to release the source
buffer and all its dependencies. Attaching the parent buffer meta to
the newly created deep copy needlessly keeps holding a reference to the
parent buffer.
The issue this solves is that you need to allocate more
buffers, because free buffers are being held for no reason. In the good
case it just uses more memory; in the bad case it stalls your
pipeline (since codecs often need a minimum number of buffers to
actually work).
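A small sketch of the rationale (the actual change simply stops attaching the
parent-buffer meta when the copy is a deep one):
```
#include <gst/gst.h>

/* With the fix, the deep copy owns its own memory and carries no
 * GstParentBufferMeta, so dropping the source here actually releases it
 * and lets its pool reuse the memory. */
static GstBuffer *
example_detach_from_source (GstBuffer * buffer)
{
  GstBuffer *copy = gst_buffer_copy_deep (buffer);

  gst_buffer_unref (buffer);

  return copy;
}
```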
Fixes #283
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/2928>
Since commit a79a756b79 we can switch to ignore-pcr automatically at 500ms
into a live stream when no PCR has been seen by then. However, the stream
counting in program change detection wrongly considered ignore-pcr programs to
have a separate PCR PID, even though we are actually ignoring the PCR PID
completely, resulting in an erroneous program switch getting triggered by the
different stream count. This in turn would send an EOS and switch out the pads
for what is actually still the same program, while we only intended to apply a
workaround for broken encoders.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3060>
If the SETUP request returns an IPv6 server address in the Transport
field, we would generate an incorrect URI, and multiudpsink would fail
to initialize:
```
rtspsrc gstrtspsrc.c:9780:dump_key_value:<source> key: 'Transport', value: 'RTP/AVP;unicast;source=fe80::dc27:25ff:fe5e:bd13:8080;client_port=62696-62697;server_port=4000-4001'
...
rtspsrc gstrtspsrc.c:4595:gst_rtspsrc_stream_configure_udp_sinks:<source> configure RTP UDP sink for fe80::dc27:25ff:fe5e:bd13:8080:4000
...
multiudpsink gstmultiudpsink.c:1229:gst_multiudpsink_configure_client:<udpsink0> error: Invalid address family (got 23)
```
We can't look at stream->is_ipv6 because we can't rely on the server
returning the right value there. In the issue reported about this, the
server reported itself as `KuP RTSP Server/0.1`, and the SDP was:
```
c=IN IP4
m=video 54608 RTP/AVP 96
a=rtpmap:96 H264/90000
```
So we need to parse the string value and figure out the family
ourselves.
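One way to do that with GLib (a sketch, not the exact patch):
```
#include <gio/gio.h>

/* Parse the destination taken from the Transport header and ask GIO for its
 * family, instead of trusting stream->is_ipv6. */
static GSocketFamily
example_guess_family (const gchar * destination)
{
  GInetAddress *addr = g_inet_address_new_from_string (destination);
  GSocketFamily family = G_SOCKET_FAMILY_IPV4;

  if (addr != NULL) {
    family = g_inet_address_get_family (addr);
    g_object_unref (addr);
  }

  return family;
}
```
With the family known, the sink destination can be formatted correctly (an
IPv6 literal has to be bracketed when it is embedded in a URI).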
Fixes https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1058
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1819>
Fixes warning with meson 0.62:
```
gst-plugins-bad| subprojects/gst-plugins-bad/meson.build:546: WARNING:
Project targets '>= 0.62' but uses feature deprecated since '0.62.0':
pkgconfig.generate variable for builtin directories. They will be
automatically included when referenced
```
and more.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3086>
Update unit tests for some MPD cases that were reporting
timestamps including the period start time, while
dashdemux2 expects to add the period start time itself.
Fix the tests to not expect the period start time
to be included.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3025>
These values are interpreted as timestamps relative to period start,
so we need to subtract the period start time from them.
Fixes a problem with determining the start position when playing Live content
with SegmentTimeline, presentationTimeOffset and a non-0 period start time.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3025>
Starting with Meson 0.62, meson automatically populates the variables
list in the pkgconfig file if you reference builtin directories in the
pkgconfig file (whether via a custom pkgconfig variable or elsewhere).
We need this, because ${prefix}/libexec is a hard-coded value which is
incorrect on, for example, Debian.
Bump requirement to 0.62, and remove version compares that retained
support for older Meson versions.
Fixes https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/1245
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3061>
Change the way streams are woken up to download more data.
Instead of checking the level on tracks that are being
output as data is dequeued, calculate a 'wakeup time'
at which it should download more data, and wake up
the stream when the global output position crosses
that threshold.
For efficiency, compute the earliest wakeup time
for all streams and store it on the period, so the
output loop can quickly check only a single value
to decide if something needs waking up.
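A sketch of that bookkeeping with assumed, simplified types (not the real
adaptivedemux2 structures):
```
#include <gst/gst.h>

/* Assumed, simplified per-stream state: how much data is buffered ahead of
 * the output position, and how close to empty the stream may run before it
 * should refill. */
typedef struct
{
  GstClockTime buffered_ahead;
  GstClockTime low_watermark;
  GstClockTime next_wakeup_time;
} ExampleStream;

static GstClockTime
example_compute_period_wakeup (ExampleStream * streams, guint n_streams,
    GstClockTime output_position)
{
  GstClockTime earliest = GST_CLOCK_TIME_NONE;
  guint i;

  for (i = 0; i < n_streams; i++) {
    if (streams[i].buffered_ahead > streams[i].low_watermark)
      streams[i].next_wakeup_time = output_position +
          streams[i].buffered_ahead - streams[i].low_watermark;
    else
      streams[i].next_wakeup_time = output_position;  /* needs data right away */

    earliest = MIN (earliest, streams[i].next_wakeup_time);
  }

  /* Stored on the period: the output loop only compares the current output
   * position against this single value to know if anything needs waking. */
  return earliest;
}
```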
This does the same buffering as the previous method,
but ensures that as we approach the end of
one period, the next period continues incrementally
downloading data so that it is fully buffered when
that period starts.
Fixes issues with multi-period VOD content where
download of the second period resumes only after
the first period is completely drained.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3055>
When pushing several buffers while the pipeline is in the NULL state, meaning
that the actions are executed "interleaved", the previous code would deadlock.
The new implementation makes the override always active and expects every
buffer that goes through to be associated with a function, which is a safe
assumption.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3052>
Some servers can return playlists with "old" media playlists and a different
Discontinuity Sequence.
In those cases, the segment stream times would be negative when creating a new
time mapping. In order to properly handle such scenarios, shift the stored
values accordingly to end up with a non-negative reference stream time.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3054>
This allows users to let videorate fully fill the segments on EOS or on a new
segment, removing the arbitrary limit of 25 duplicates, which might not be
what the user wants (for example, on low-FPS streams in GES this sometimes
led to broken behavior).
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3000>
AV1 supports multiple spatial layers within one TU, with different
resolutions, and only the highest spatial layer needs to be output.
For example, with two spatial layers where the base layer is 800x600
and the higher layer is 1920x1080, we need to decode both because the
higher layer uses the base layer as a reference, but we only need to
output the 1920x1080 frames.
The current code always renegotiates the caps once it detects that the
current picture resolution has changed, so we renegotiate again and
again between the different layers. That is wasteful and hurts
performance. We now only do the renegotiation for the highest
output layer. For the other, non-output layers, we just keep an internal
buffer pool which is big enough to handle the surface allocation.
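A sketch of the resulting decision, with placeholder names rather than the
actual decoder code:
```
#include <gst/gst.h>

/* Placeholder decoder state standing in for the real base-class fields. */
typedef struct
{
  guint highest_spatial_id;     /* highest spatial layer within the current TU */
  guint output_width, output_height;
} ExampleAV1Decoder;

/* Placeholder helpers standing in for the real negotiation / pool code. */
static gboolean example_negotiate_output (ExampleAV1Decoder * dec, guint w, guint h);
static gboolean example_ensure_internal_pool (ExampleAV1Decoder * dec, guint w, guint h);

static gboolean
example_handle_resolution (ExampleAV1Decoder * dec, guint spatial_id,
    guint width, guint height)
{
  if (spatial_id == dec->highest_spatial_id) {
    /* Only the layer that is actually output drives caps renegotiation. */
    if (width != dec->output_width || height != dec->output_height)
      return example_negotiate_output (dec, width, height);
    return TRUE;
  }

  /* Lower spatial layers never reach downstream: an internal pool large
   * enough for their surfaces is all that is needed. */
  return example_ensure_internal_pool (dec, width, height);
}
```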
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/2382>
As the spec says, when multiple spatial layers exist, we should only output
one frame, the one with the highest spatial id, from each TU. We now store the
highest spatial layer information in the base class in order to let
the subclass handle the different layers easily.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/2382>
doesn't align on 20 millisecond frame size.
The AMR-WB codec imposes a fixed 20 millisecond frame size. In its current
form, the `voamrwbenc` plugin deals with this limitation by discarding any
audio at the end of the stream that falls short of 20 milliseconds. This patch
keeps the audio data, and appends silence to the end to preserve frame size
alignment.
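A sketch of the padding step (the 640-byte figure follows from AMR-WB's fixed
16 kHz mono 16-bit input: 20 ms = 320 samples = 640 bytes):
```
#include <string.h>
#include <glib.h>

#define EXAMPLE_AMRWB_FRAME_BYTES 640   /* 20 ms of 16 kHz mono S16 */

/* Instead of discarding a trailing chunk shorter than one frame, zero-fill
 * (i.e. silence) the remainder so a full 20 ms frame can still be encoded. */
static void
example_pad_last_frame (guint8 * frame, gsize filled_bytes)
{
  if (filled_bytes < EXAMPLE_AMRWB_FRAME_BYTES)
    memset (frame + filled_bytes, 0, EXAMPLE_AMRWB_FRAME_BYTES - filled_bytes);
}
```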
The patch also adds tests to check for the updated behavior. I noticed that
tests weren't being built, so I changed the build to allow for building the
tests when the `tests` and `voamrwbenc` options are set.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3027>
when expose-all=False
When trying to find a decoder in that case, we loop over the different
decoder factories and check that each outputs a format that matches the
requested one (through the :caps property), but if we find a decoder
that does match and later another that doesn't, we end up failing
autoplugging. This patch ensures that we still plug the decoder that can
work.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3011>
- Update the Docker image we use, switching to the standard one and adding
  `gtk4-doc` as required by the Rust plugins
- Update the plugins_doc_caches as required; some more plugins are built
  with the new image
- Install ninja from pip as the version from F31 is too old
- Avoid building all GStreamer plugins when building the doc as it takes
  time and resources for no good reason
- Stop linking to `GInstanceInitFunc` as it is not present in the latest GLib
  documentation, leading to warnings in hotdoc.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/2954>
We are supposed to guarantee that exposed pads have caps
set, but for sources that have pads with "all raw caps" templates, we end
up exposing pads that don't have caps set yet, which can break code (in
GES for example).
To avoid that, we let uridecodebin plug a `decodebin` after such pads and
let decodebin handle that for us. In the end, the only thing that
decodebin does in those cases is wait for the pads to be ready and expose
them; after that, `uridecodebin` exposes those pads.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3009>
We always build our printf implementation, even when
GST_DEBUG is disabled, since we expose API (gst_print*)
that depends on our printf behavior.
We don't need to keep __gst_info_fallback_vasprintf around anymore.
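For context, a hedged example of the gst_print* behavior that relies on the
internal printf, namely the GStreamer-specific %GST_PTR_FORMAT extension,
which has to keep working even when the debug system is disabled:
```
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  GstCaps *caps;

  gst_init (&argc, &argv);

  caps = gst_caps_new_empty_simple ("video/x-raw");

  /* %GST_PTR_FORMAT is handled by GStreamer's own printf implementation,
   * which is why that implementation must be built even when GST_DEBUG is
   * disabled. */
  gst_println ("caps: %" GST_PTR_FORMAT, caps);

  gst_caps_unref (caps);
  return 0;
}
```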
Close #640
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/739>
Handle the d3d11 device context in the set_context() method with
an additional device compatibility check, so that only an NVIDIA GPU
associated d3d11 device can be configured in the element.
Also clear the old d3d11 device in set_info() so that the d3d11 device
gets updated as well.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3018>
... and fix a d3d11-specific enum type name
GST_CUDA_HAS_D3D is a build-time define which indicates whether the
GstD3D11 library is available or not, but the DirectX SDK headers
must already be available on the build system.
Expose the Direct3D related symbols if the build target is Windows
(i.e., if G_OS_WIN32 is defined)
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3018>
GLib made the unfortunate decision to prevent libgobject from ever being
unloaded, which means that now any library which registers a static type
can't ever be unloaded either (and any library that depends on those,
ad nauseam).
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/778>
GstVA is not currently built by the CI, because the libva version is lower
than expected. So the gstva library is not built, thus some symbols
aren't documented, breaking the documentation CI.
To move things forward, let's just temporarily remove the va plugins
from the cache while we decide how to update the libva package in
the CI.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1025>
When advancing fragments in live streams, it's normal to return
GST_FLOW_EOS when playing at the live edge of the available
fragments. In that case, we still want to adjust the bitrate
dynamically.
Fixes an issue with dashdemux2 where the current bitrate of
each adaptation set was changed to the lowest one when
updating the MPD for a live stream.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3020>
Instead of trying to hardcode site-packages paths for different platforms,
just use python.get_install_dir() from meson and let it deal with the rest.
Also stop trying to import pygobject, which is otherwise not
required at build time.
python.get_install_dir() was initially broken on Windows, but
that was fixed in 0.60 via https://github.com/mesonbuild/meson/pull/9156,
and since ges now requires >0.60 this can be ignored.
This change was motivated by the install path being wrong under MSYS2, where
the unix install layout is used and the detection code did not take that into
account.
This MR is a continuation of https://gitlab.freedesktop.org/gstreamer/gst-editing-services/-/merge_requests/230
see the discussion there for extra context.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/3012>
Just like for the seconds field, there are no limitations on the hours and
minutes fields. The specification for xml schema duration fields doesn't forbid
specifying durations with only (huge) minutes or hours values.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/2951>
When picking an available payload type, we need to pick one that is
available across all media.
The previous code, when multiple media were present, looked at the first one,
noticed it had pt 96 as the media pt, then simply looked at the next media,
noticed it didn't, and decided 96 was available.
Instead, check if the pt is used by any of the media; if it is, decide
it is not available and move on to the next pt. I'm fairly sure that was the
original intent.
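A sketch of the corrected check (the container types here are placeholders,
not the actual data structures):
```
#include <glib.h>

/* A payload type is only "available" if no media in the SDP uses it. */
static gboolean
example_pt_is_available (GPtrArray * media_pts, guint pt)
{
  guint i, j;

  for (i = 0; i < media_pts->len; i++) {
    GArray *pts = g_ptr_array_index (media_pts, i);

    for (j = 0; j < pts->len; j++) {
      if (g_array_index (pts, guint, j) == pt)
        return FALSE;           /* already used by at least one media */
    }
  }

  return TRUE;
}
```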
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/2984>