Removes the usage of [NSApp terminate] to avoid killing the process and thus never actually returning a value.
The new way is just to use [NSApp stop] and send an event, since stop only happens after an event is processed.
Unlike terminate, stop will only halt the event loop, not the whole process.
This uses an NSApplicationDelegate to listen for NSApp finishing the launch process, and then signals the 'main' thread
to proceed. This makes sure that [NSApp stop] is never called before NSApp is
actually running, which could happen if the provided 'main' function finishes
quickly enough.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6103>
Upon creating a window, glimagesink and osxvideosink now set the policy to
NSApplicationActivationPolicyRegular, which lets us show an icon in the Dock
for convenience and appear in the top menu bar like other apps.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6103>
Setting the policy to NSApplicationActivationPolicyAccessory by default makes
sure that we can activate windows programmatically or by clicking on them.
Without that, windows would disappear if you clicked outside them and there
would be no way to bring them to front again. This change also allows osxvideosink
to receive navigation events correctly.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6103>
WebKit commit b12e7ed2ad3a ("[WPE] Upstream the new WPE platform API
https://bugs.webkit.org/show_bug.cgi?id=265286")[1] added a `WPEView` typedef
which clashes with our `WPEView` class.
Rename the `WPEView` class to `GstWPEThreadedView` to avoid the collision.
Also prefix the `WPEContextThread` class with `Gst`, rename the
source files to reflect the new class names, and use lowercase file names
while at it for consistency.
[1] b12e7ed2ad
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6096>
* Bump the rank of the musepack v7/v8 FFmpeg demuxers to SECONDARY
* Bump the rank of the musepack v7/v8 FFmpeg audio decoders to SECONDARY
* Demote the rank of the musepackdec element to MARGINAL
This is for two reasons:
* The musepack library is no longer maintained, whereas the FFmpeg
implementation can/will receive fixes
* The `musepackdec` implementation was an all-in-one "parsing and decoding" blob
which doesn't play nicely with decodebin3 and others
Fixes https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3033
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6085>
The `imp` module was removed in Python 3.12; the `importlib` module should be
used instead.
This is also a good excuse to switch to the new finder API from PEP 451:
https://www.python.org/dev/peps/pep-0451/
This only requires implementing the `find_spec()` method in our custom loaders.
Co-authored-by: Stefan <107316-stefan6419846@users.noreply.gitlab.freedesktop.org>
Co-authored-by: Jordan Petrids <jordan@centricular.com>
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6050>
Converting from RGB to YUV: when comparing info.colorimetry to
GST_VIDEO_COLORIMETRY_BT709, it does not make sense to look at the input
signal, because that is RGB. The plugin needs to look at the output YUV type
and compare GST_VIDEO_COLORIMETRY_BT709 to that, because that is the YUV type
the plugin needs to convert the RGB input into.
Converting from YUV to RGB: comparing to the input is correct, because here
the BT601/BT709 color encoding information is on the input side of the plugin.
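A minimal sketch of the corrected check for the RGB-to-YUV direction, assuming
GstVideoInfo-based caps handling; the helper and variable names are
illustrative, not the plugin's actual code:
```c
#include <gst/video/video.h>

/* Pick the RGB->YUV matrix based on the *output* video info, as described
 * above; the RGB input carries no BT.601/BT.709 matrix information. */
static gboolean
use_bt709_matrix (const GstVideoInfo * out_info)
{
  return gst_video_colorimetry_matches (&out_info->colorimetry,
      GST_VIDEO_COLORIMETRY_BT709);
}
```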
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6046>
Remove the optional sprop-stereo and sprop-maxcapturerate fields from the
remote Opus offer caps before intersecting with the local codec preferences.
According to https://datatracker.ietf.org/doc/html/rfc7587#section-7.1
those fields are sender-only informative, and don't affect
interoperability.
Fixes cases where the webrtc media will end up receive-only if the
local side wants to send stereo but the remote is sending mono, or
vice versa.
There may be other fields in other codecs, so the implementation
anticipates needing to add further fields and codecs in the future.
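A rough sketch of the Opus case, assuming a writable copy of the remote offer
caps; the structure handling here is illustrative, not webrtcbin's actual code:
```c
#include <gst/gst.h>

/* Strip the sender-only informational Opus fields before intersecting
 * the remote caps with the local codec preferences. */
static GstCaps *
strip_opus_sender_fields (GstCaps * remote_caps)
{
  guint i;

  remote_caps = gst_caps_make_writable (remote_caps);

  for (i = 0; i < gst_caps_get_size (remote_caps); i++) {
    GstStructure *s = gst_caps_get_structure (remote_caps, i);
    const gchar *encoding = gst_structure_get_string (s, "encoding-name");

    if (!g_strcmp0 (encoding, "OPUS"))
      gst_structure_remove_fields (s, "sprop-stereo", "sprop-maxcapturerate",
          NULL);
  }

  return remote_caps;
}
```
The result can then be intersected with the local preferences as before.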
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5999>
Test included.
The problem appears when the aggregator drops the query while
it is being processed by the klass->sink_query handler.
This can happen on a FLUSH_START event. If the query is still
in the queue, it can be safely dropped, but if it is already
in the klass->sink_query() handler, then the sink pad has no
choice and has to wait for the processing to complete.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5958>
This simplifies the way it picks the caps closest to the preference and takes
the framerate into consideration to avoid picking a high resolution at 5 fps
or so.
Simply calculate a "distance" of caps A and B from the preference and put the
closest first, sorting by framerate first.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5956>
Use gst_data_queue_push_force() for most events so they
are immediately enqueued. Only gap events and actual buffer
data will now block when the queue is full.
This fixes a problem with non-flushing seek handling
where events following a segment-done event would block
if they precede the SEGMENT event, since only SEGMENT
events would clear the 'eos' state of the multiqueue
queue.
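A sketch of the push logic described above, assuming a GstDataQueue-backed
single queue; the `queue`, `object` and `item` names are illustrative, not the
actual multiqueue code:
```c
#include <gst/gst.h>
#include <gst/base/gstdataqueue.h>

/* Force-push events other than GAP so they can never block on a full
 * queue; buffers and GAP events still honour the queue limits. */
static gboolean
push_item (GstDataQueue * queue, GstMiniObject * object,
    GstDataQueueItem * item)
{
  if (GST_IS_EVENT (object) &&
      GST_EVENT_TYPE (GST_EVENT_CAST (object)) != GST_EVENT_GAP)
    return gst_data_queue_push_force (queue, item);

  return gst_data_queue_push (queue, item);
}
```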
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5954>
Don't call wait_event() at all for gap events, as basesink will
end up waiting for the time that the gap event would be rendered
out at the audio device. There's no need to render it at all,
just treat it as a handy point to resync the audio if needed,
let the ringbuffer render silence, and place the next buffer
into the ringbuffer where it belongs.
The only thing we really need to do is make sure the ringbuffer
and clock are running, and wait for preroll.
Fixes https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2749
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5953>
If GST_PAD_SINK is passed in this means that we're supposed to convert
from sink caps to src caps, not the other way around. In other words, if
GST_PAD_SINK is passed we're supposed to produce the possible output
caps.
Previously this was inverted. This had the effect that glcolorconvert
pretended to be able to convert *to* I420 without glDrawBuffers, which is
not possible, and pretended not to be able to convert *from* I420
without glDrawBuffers, which it always supports.
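A usage sketch of the convention being fixed, assuming the public
gst_gl_color_convert_transform_caps() helper; this is not the patched code
itself:
```c
#include <gst/gl/gl.h>

/* With GST_PAD_SINK, 'input_caps' are treated as the sink (input) caps and
 * the returned caps are the possible outputs the converter can produce.
 * With GST_PAD_SRC it is the other way around. */
static GstCaps *
possible_outputs (GstGLContext * context, GstCaps * input_caps)
{
  return gst_gl_color_convert_transform_caps (context, GST_PAD_SINK,
      input_caps, NULL);
}
```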
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5947>
The GstMpeg2Picture system_frame_number is guint32, the constant 1000 is
guint32, and the GstV4l2CodecMpeg2Dec *_ref_ts multiplication result is u64.
```
u64 result = (u32)((u32)system_frame_number * (u32)1000);
```
behaves the same as
```
u64 result = (u32)(((u32)system_frame_number * (u32)1000) & 0xffffffff);
```
so in case `system_frame_number > 4294967295 / 1000`, the `result` will
wrap around. Since the `result` is used as a cookie to look up V4L2
buffers related to the currently decoded frame, this wraparound leads to
visible corruption during MPEG2 decoding. At 30 FPS this occurs after
ca. 40 hours of playback.
Fix this by changing the 1000 from u32 to u64, i.e.:
```
u64 result = (u64)((u32)system_frame_number * (u64)1000ULL);
```
this way, the wraparound is prevented and the correct cookie is used.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5850>
The GstVp9Picture system_frame_number is guint32, the constant 1000 is
guint32, and the GstV4l2CodecVp9Dec v4l2_vp9_frame.*_frame_ts multiplication
result is u64.
```
u64 result = (u32)((u32)system_frame_number * (u32)1000);
```
behaves the same as
```
u64 result = (u32)(((u32)system_frame_number * (u32)1000) & 0xffffffff);
```
so in case `system_frame_number > 4294967295 / 1000`, the `result` will
wrap around. Since the `result` is used as a cookie to look up V4L2
buffers related to the currently decoded frame, this wraparound leads to
visible corruption during VP9 decoding. At 30 FPS this occurs after
ca. 40 hours of playback.
Fix this by changing the 1000 from u32 to u64, i.e.:
```
u64 result = (u64)((u32)system_frame_number * (u64)1000ULL);
```
this way, the wraparound is prevented and the correct cookie is used.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5850>
The GstVp8Picture system_frame_number is guint32, the constant 1000 is
guint32, and the GstV4l2CodecVp8Dec v4l2_vp8_frame.*_frame_ts multiplication
result is u64.
```
u64 result = (u32)((u32)system_frame_number * (u32)1000);
```
behaves the same as
```
u64 result = (u32)(((u32)system_frame_number * (u32)1000) & 0xffffffff);
```
so in case `system_frame_number > 4294967295 / 1000`, the `result` will
wrap around. Since the `result` is used as a cookie to look up V4L2
buffers related to the currently decoded frame, this wraparound leads to
visible corruption during VP8 decoding. At 30 FPS this occurs after
ca. 40 hours of playback.
Fix this by changing the 1000 from u32 to u64, i.e.:
```
u64 result = (u64)((u32)system_frame_number * (u64)1000ULL);
```
this way, the wraparound is prevented and the correct cookie is used.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5850>
In gst_va_allocator_try, the first attempt is to use derive_image; if it
succeeds, we should use the info from the derived image to create the buffer
pool. If deriving fails, then try create_image and give the created image
info to the pool.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5778>
When exporting a DMABuf from a VASurface, the user might say that the surface
was allocated with a certain fourcc, but the returned
VADRMPRIMESurfaceDescriptor might report a different fourcc, as in the case of
the radeonsi driver with duplicated fourccs such as YUY2 and YUYV.
Originally this was treated as a failed export. This patch relaxes the
validation by allowing different fourccs.
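A rough sketch of the relaxed check, under the assumption that the mismatch is
only logged rather than treated as a failure; the function and variable names
are illustrative:
```c
#include <gst/gst.h>

/* Accept a fourcc mismatch between what the user requested and what the
 * driver's descriptor reports, only logging it. */
static gboolean
check_exported_fourcc (guint32 requested_fourcc, guint32 desc_fourcc)
{
  if (desc_fourcc != requested_fourcc)
    GST_LOG ("driver returned fourcc %" GST_FOURCC_FORMAT
        " instead of requested %" GST_FOURCC_FORMAT ", continuing",
        GST_FOURCC_ARGS (desc_fourcc), GST_FOURCC_ARGS (requested_fourcc));

  return TRUE;
}
```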
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5778>
When the overlay coordinates are updated after the initial coordinates
are set, the shader indices are applied to the wrong buffer, resulting
in the background image appearing where the overlay should be.
Bind the array buffer before applying subsequent coordinate
updates.
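A minimal GL sketch of the fix, assuming the overlay vertex data lives in a
plain VBO; the identifiers are illustrative, not the element's actual code:
```c
#include <gst/gl/gl.h>
#include <gst/gl/gstglfuncs.h>

/* Re-bind the overlay's own array buffer before uploading new coordinates,
 * so the update cannot land in whichever VBO happens to be bound
 * (e.g. the background's). */
static void
update_overlay_coords (const GstGLFuncs * gl, GLuint overlay_vbo,
    const GLfloat * vertices, GLsizeiptr size)
{
  gl->BindBuffer (GL_ARRAY_BUFFER, overlay_vbo);
  gl->BufferSubData (GL_ARRAY_BUFFER, 0, size, vertices);
  gl->BindBuffer (GL_ARRAY_BUFFER, 0);
}
```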
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5909>
When the level value is greater than 127, it was being clamped, but this
clamped value was not the one actually being used. For level values greater
than 127 this resulted in an incorrect value being used. As an example, a
level value of 187, after being AND'ed with 0x7F, would result in 0x3B being
reported as the level value.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5894>
This reverts a part of de92a6c7f2. Unlike `image_filter` and
`video_filter`, `viewfinder_filter` does not get linked to `src` but
`viewfinderbin_queue`. Thus the fix in the mentioned commit does not
apply to it and should be reverted.
This was not spotted earlier as only the other filters are used in
the project that uncovered the issue.
Fixes: de92a6c7f2 ("camerabin: Fix source updates with user filters")
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5842>
When reopening a v4l2 device, the v4l2object->poll will include some old fds
that were assigned to this device before. If the pipeline opens multiple v4l2
devices, the old fd may have been assigned to another v4l2 device when
reopening devices.
This can confuse the timing of the pipeline when polling devices, leading to
functional abnormalities.
Therefore, when closing the v4l2object, remove the old fds from the poll to
ensure that the pipeline timing stays correct.
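A sketch of the cleanup on close, assuming the object exposes its GstPoll and
GstPollFD; the field and function names are illustrative:
```c
#include <gst/gst.h>

/* When closing the device, drop its fd from the GstPoll and reset the
 * GstPollFD so a later reopen cannot poll a stale descriptor. */
static void
remove_old_poll_fd (GstPoll * poll, GstPollFD * pollfd)
{
  gst_poll_remove_fd (poll, pollfd);
  gst_poll_fd_init (pollfd);
}
```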
Signed-off-by: Chao Guo <chao.guo@nxp.com>
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5840>
Since the AV1 specification does not explicitly mention the array size
bounds, array sizes in the scalability structure should be defined with the
maximum possible sizes they can have.
Also, this commit removes the GST_AV1_MAX_SPATIAL_LAYERS define from the
public header, which is an API break, but the define is misleading and this
patch already introduces an ABI break.
ZDI-CAN-22300
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5824>
This commit corrects the mapping relationship between RGB and BGR in GST and DRM.
The previous mapping was incorrect, causing potential color mismatches in the output.
The changes are as follows:
{WL_SHM_FORMAT_RGB888, DRM_FORMAT_RGB888, GST_VIDEO_FORMAT_BGR},
{WL_SHM_FORMAT_BGR888, DRM_FORMAT_BGR888, GST_VIDEO_FORMAT_RGB},
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5792>
If the ladspa plugin is enabled explicitly or via auto-features, the
liblrdf dependency cannot be disabled.
As the RDF parsing currently provides hardly any features, the possibility
to disable it is fairly useful.
Fixes: #3168
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5802>
When doing a segment seek, the base offset in the new segment
would be increased by segment.position which is basically the
timestamp of the last packet. This does not include the duration
of the last packet though, so might be slightly shorter than the
actual duration of the clip or the requested segment.
Increase the base offset by the segment duration instead when
accumulating segments, which is more correct as it doesn't cut
off the last frame and makes the effective loop segment duration
consistent with the actual duration returned from a duration
query.
In case a segment stop was specified, it is also possible that some data
necessary for decoding was sent beyond the stop, so the base offset increment
should then be based on that and not on the timestamp of the last buffer
pushed out.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5787>
This avoids a build failure when compiling against OpenSSL 3.2.0. The
problem occurs when windows.h is included before WinSock2.h, because
windows.h includes winsock.h[1]. Defining _WINSOCKAPI_ stops windows.h from
including winsock.h.
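A sketch of the include-order workaround; where exactly the define goes in the
dtls sources is not shown here:
```c
/* Define _WINSOCKAPI_ before <windows.h> so it does not pull in the old
 * <winsock.h>; the WinSock2 headers dragged in by OpenSSL can then be
 * included without 'sockaddr' redefinition errors. */
#ifdef _WIN32
#ifndef _WINSOCKAPI_
#define _WINSOCKAPI_
#endif
#include <windows.h>
#endif
```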
Error:
```
[748/1041] Compiling C object ext/dtls/gstdtls.dll.p/gstdtlscertificate.c.obj
FAILED: ext/dtls/gstdtls.dll.p/gstdtlscertificate.c.obj
[...]
Windows Kits\10\include\10.0.17763.0\shared\ws2def.h(235): error C2011: 'sockaddr': 'struct' type redefinition
Windows Kits\10\include\10.0.17763.0\um\winsock.h(482): note: see declaration of 'sockaddr'
```
[1] https://stackoverflow.com/a/1372836
Closes: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3167
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5783>
Clip tile rows and cols to 64 as described in the AV1 specification
to avoid writing outside the array range, but preserve the sb_cols
and sb_rows values, which are used for further computation.
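An illustrative sketch of the clamp, using the spec maximum of 64 directly
rather than the parser's own defines; this is not the parser's exact code:
```c
#include <glib.h>

#define MAX_TILE_COLS 64   /* per the AV1 specification */
#define MAX_TILE_ROWS 64

/* Clip the tile counts used to index the fixed-size arrays; the original
 * values remain available to the caller for further computation. */
static void
clip_tile_counts (guint * tile_cols, guint * tile_rows)
{
  *tile_cols = MIN (*tile_cols, MAX_TILE_COLS);
  *tile_rows = MIN (*tile_rows, MAX_TILE_ROWS);
}
```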
Fixes ZDI-CAN-22226 / CVE-2023-44429
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5734>
When the subclass attempts to finish without an explicit `out_buffer`,
we take a buffer from our adapter. We need to make this buffer writable
before copying the metadata.
This led to data races such as in the following pipeline, which randomly
messed up the buffer PTS:
gst-launch-1.0 -e audiotestsrc timestamp-offset=5555 num-buffers=100 \
! opusenc ! tee name=t ! queue ! opusparse ! fakesink silent=0 \
t. ! queue ! opusparse ! fakesink silent=0 -v | grep '0000, dur'
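A minimal sketch of the fix, assuming the output buffer is taken from a
GstAdapter; the identifiers are illustrative, not the actual baseparse code:
```c
#include <gst/gst.h>
#include <gst/base/gstadapter.h>

/* Take the queued bytes and make the resulting buffer writable *before*
 * copying metadata (timestamps etc.) onto it; otherwise we may scribble
 * on memory still shared with buffers already pushed downstream. */
static GstBuffer *
take_output_buffer (GstAdapter * adapter, gsize size, GstBuffer * meta_src)
{
  GstBuffer *buf = gst_adapter_take_buffer (adapter, size);

  buf = gst_buffer_make_writable (buf);
  gst_buffer_copy_into (buf, meta_src, GST_BUFFER_COPY_METADATA, 0, -1);

  return buf;
}
```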
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5720>
A caps change does not imply that a new segment will start.
Don't reset last_ts if only the caps have changed. This fixes issues with
streams where only the first frame has TS=0 and a resolution change happens.
This was a regression introduced by !3059, whose issue was described in #1352.
The issue reported there is still fixed after this change.
Fixes #1034
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5712>
During a flushing seek, the GStreamer v4l2 buffer pool flush is not atomic,
which can lead to a double buffer enqueue (qbuf) issue, and the v4l2 buffer
pool qbuf is also not atomic, which can lead to no free buffer being found in
the pool.
1. Add a lock around calculating the enqueue number in the streamon function.
2. Add a lock around the v4l2 capture end streamoff in the pool flush function.
3. Lock the whole v4l2 buffer pool qbuf function, so that the buffer pool
index and the qbuf operation are atomic.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5695>
When copying a buffer, for example with gst_buffer_make_writable(), the
new buffer might reference the same GstMemory as the src buffer,
making those memories not writable. If the src buffer gets disposed
first it should return to its buffer pool, but since some of its
memories are not writable it gets discarded and a new buffer/memory gets
allocated.
Solve this by making the new buffer keep a reference to the src buffer;
this ensures that by the time the src buffer gets disposed, no other
buffer is referencing its memories and it can thus return safely to its
pool.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5696>
gst_buffer_add_parent_buffer_meta() is used when a GstBuffer uses
GstMemory from another buffer that was allocated from a pool. In that
case we want to make sure the buffer returns to the pool when the memory
is writable again, otherwise a copy of the memory is created. That means
the child buffer must drop its ref to the memory first, then drop the
ref to parent buffer so it can return to the pool when it is the only
owner of the memory.
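A usage sketch of the ordering this relies on, with hypothetical buffers; this
is not the code being changed:
```c
#include <gst/gst.h>

/* 'pooled' comes from a buffer pool; 'child' reuses one of its memories.
 * The parent-buffer meta holds a ref on 'pooled', so it only returns to
 * its pool once 'child' (the last user of the memory) is freed. */
static GstBuffer *
wrap_pooled_memory (GstBuffer * pooled)
{
  GstBuffer *child = gst_buffer_new ();

  gst_buffer_append_memory (child, gst_buffer_get_memory (pooled, 0));
  gst_buffer_add_parent_buffer_meta (child, pooled);
  gst_buffer_unref (pooled);

  return child;
}
```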
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5696>
Take the case into account when user filters have been set before the
source gets updated.
Note that the further linking of the filters, if present, happens below
in the `gst_camera_bin_check_and_replace_filter()` calls.
The audio filter is still affected by the same issue but left out for
now.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5682>
Even if IDXGIOutput6 says the current display colorspace is HDR, the texture
captured via IDXGIOutputDuplication::AcquireNextFrame() is a frame already
converted by the OS, unless we use IDXGIOutput5::DuplicateOutput1() with the
DXGI_FORMAT_R16G16B16A16_FLOAT format so that the captured frame is in scRGB
color space. The application should then perform a tonemapping operation based
on the reported display white level, color primaries, etc.
Since we don't have any tonemapping implementation, ignore the colorimetry
reported by IDXGIOutput6.
Fixes: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3128
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5679>
This commit makes sure that pads are still valid for linking
after the pads have been temporarily unlocked in the linking process.
Not doing this opens up a race condition where pads can potentially be
linked twice.
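An illustrative re-check of the kind described, run after the object locks
are re-taken; this is not the actual core code:
```c
#include <gst/gst.h>

/* After re-taking the locks that were temporarily released, verify that
 * neither pad picked up a peer in the meantime. */
static gboolean
pads_still_unlinked (GstPad * srcpad, GstPad * sinkpad)
{
  gboolean unlinked;

  GST_OBJECT_LOCK (srcpad);
  GST_OBJECT_LOCK (sinkpad);
  unlinked = (GST_PAD_PEER (srcpad) == NULL && GST_PAD_PEER (sinkpad) == NULL);
  GST_OBJECT_UNLOCK (sinkpad);
  GST_OBJECT_UNLOCK (srcpad);

  return unlinked;
}
```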
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5678>
Because we treat raw audio chunks/samples as keyframes, they were interfering
with seek time adjustment.
Became apparent when the accompanying video stream was I-frame only,
for example ProRes.
Since raw audio streams can be seeked freely, it's fine to just ignore them here,
giving priority to the real keyframes in the video stream.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5674>
This is how it was documented and how it worked before the port to GstPlay.
Without this, applications expecting signals to be emitted directly
without anything running the main context will simply not receive any
signals.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5673>
The code seems to validate that the media-level fingerprint matches
the fingerprint of the previous media or of the whole session. There
is no such requirement in any RFC I found. The session-level one
is just meant to act as a fallback when there is no media-level
fingerprint.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5663>