If the input height and the parsed one are identical, do not consider it interlaced
This fixes the following pipeline:
gst-launch-1.0 videotestsrc ! video/x-raw,format=I420,width=640,height=10 \
! jpegenc ! jpegparse ! jpegdec ! videoconvert ! autovideosink
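A minimal sketch of the intended check, with hypothetical variable names (not the actual jpegparse code):

    #include <glib.h>

    /* Only treat the picture as interlaced when the height parsed from
     * the JPEG headers differs from the height on the negotiated caps. */
    static gboolean
    is_interlaced (gint caps_height, gint parsed_height)
    {
      if (parsed_height == caps_height)
        return FALSE;             /* identical heights: progressive */

      /* Assumption for this sketch: a parsed height of half the caps
       * height means each JPEG carries a single field. */
      return parsed_height * 2 == caps_height;
    }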
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6229>
The unprepare method posts the WM_GST_D3D11_DESTROY_INTERNAL_WINDOW
command to the window queue and, from that moment on, considers
internal_hwnd released, so it sets it to null.
The problem is that right at that moment the window thread may
already be processing some other command, or another command may
already be in the queue.
In practice we hit a crash when WM_PAINT was processed in between
(unprepare had already finished, but WM_GST_D3D11_DESTROY_INTERNAL_WINDOW
had not been handled yet).
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6230>
After a flushing seek, rtspsrc doesn't reset the last_ret value for its
streams, so it might immediately shut down again when it resumes pushing
buffers to pads, due to a cached `GST_FLOW_FLUSHING` result.
Prevent a stored flushing value from immediately stopping
playback again by resetting the pad flow returns before (re)starting
playback.
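A hedged sketch of the reset, with a hypothetical per-stream structure (the real rtspsrc stream bookkeeping differs):

    #include <gst/gst.h>

    /* Hypothetical per-stream state; only the cached flow return matters here. */
    typedef struct
    {
      GstFlowReturn last_ret;
    } StreamSketch;

    /* Reset cached flow returns before (re)starting playback so a stored
     * GST_FLOW_FLUSHING from the flushing seek can't stop playback again. */
    static void
    reset_stream_flow_returns (GList * streams)
    {
      GList *walk;

      for (walk = streams; walk != NULL; walk = walk->next) {
        StreamSketch *stream = walk->data;

        stream->last_ret = GST_FLOW_OK;
      }
    }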
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6216>
The original idea was to select the type of mapping (either using derive images
or downloading the image) at runtime, under the assumption that both methods
shared the same memory layout (offsets and strides), because a single
GstVideoMeta is assigned by the buffer pool at allocation time. Nonetheless, on
recent hardware this assumption is invalid, leading to memory access errors.
This patch completely removes the mapping type selection at runtime, using the
method selected when the allocator is configured, in sync with the bufferpool
allocation.
Originally this problem was fixed only for the iHD driver, but now it makes
sense to remove the runtime selection entirely.
Original-patch-by: Mengkejiergeli Ba <mengkejiergeli.ba@intel.com>
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6204>
This reverts the questionable commit 009bc15f33, which looks completely wrong.
The GstWasapi2RingBuffer:buffer_size variable is used to
calculate the available buffer size we can write
(i.e., available size = buffer_size - padding_size),
but that commit made the size exactly the same as the buffer period.
That can make this element believe the endpoint buffer is full in the
I/O event callback (if the padding size equals the buffer period)
even though it is not.
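For illustration, the arithmetic described above as a sketch (not a verbatim excerpt from the element):

    #include <glib.h>

    /* Writable space reported on each I/O event. If buffer_size were
     * clamped to the device period, this would become 0 whenever
     * padding_size == period, and the element would wrongly believe the
     * endpoint buffer is full. */
    static guint32
    get_writable_frames (guint32 buffer_size, guint32 padding_size)
    {
      return buffer_size - padding_size;
    }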
Fixes: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2870
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6140>
Removes the usage of [NSApp terminate] to avoid killing the process, which
meant never actually returning a value. The new approach is to use [NSApp stop]
and send an event, since the stop only takes effect after an event is
processed. Unlike terminate, stop will only halt the event loop, not the whole
process.
This uses an NSApplicationDelegate to listen for NSApp finishing the launch
process, which then signals the 'main' thread to proceed. That makes sure
[NSApp stop] is never called before NSApp is actually running, which could
happen if the provided 'main' function finished quickly enough.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6103>
Upon creating a window, glimagesink and osxvideosink now set the policy to
NSApplicationActivationPolicyRegular, which lets us show an icon in the Dock
for convenience and appear in the top menu bar like other apps.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6103>
Setting the policy to NSApplicationActivationPolicyAccessory by default makes
sure that we can activate windows programmatically or by clicking on them.
Without that, windows would disappear if you clicked outside them, and there
would be no way to bring them back to the front. This change also allows
osxvideosink to receive navigation events correctly.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6103>
WebKit commit b12e7ed2ad3a ("[WPE] Upstream the new WPE platform API
https://bugs.webkit.org/show_bug.cgi?id=265286")[1] added a `WPEView` typedef
which clashes with our `WPEView` class.
Rename the `WPEView` class to `GstWPEThreadedView` to avoid the collision.
Also prefix the `WPEContextThread` class with `Gst`, and rename the
source files to reflect the new class names, switching them to lowercase
while at it for consistency.
[1] b12e7ed2ad
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6096>
* Bump the rank of the musepack v7/v8 FFmpeg demuxers to SECONDARY
* Bump the rank of the musepack v7/v8 FFmpeg audio decoders to SECONDARY
* Demote the rank of the musepackdec element to MARGINAL
This is for two reasons:
* The musepack library is no longer maintained, whereas the FFmpeg
implementation can/will receive fixes
* The `musepackdec` implementation was an all-in-one "parsing and decoding" blob
which doesn't play nicely with decodebin3 and others
Fixes https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3033
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6085>
The `imp` module was removed in Python 3.12 and the `importlib` module should be
used instead.
This is also a good excuse to switch to the new finder API from PEP 451:
https://www.python.org/dev/peps/pep-0451/
This only requires implementing the `find_spec()` method in our custom loaders.
Co-authored-by: Stefan <107316-stefan6419846@users.noreply.gitlab.freedesktop.org>
Co-authored-by: Jordan Petridis <jordan@centricular.com>
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6050>
Converting from RGB to YUV: when comparing info.colorimetry to
GST_VIDEO_COLORIMETRY_BT709, it does not make sense to look at the input
signal, because that is RGB. The plugin needs to look at the output YUV
type and compare GST_VIDEO_COLORIMETRY_BT709 to that, because that is the
YUV type the plugin needs to convert the input RGB into.
Converting from YUV to RGB: comparing against the input is correct here,
because the color encoding information (BT601/BT709) is on the input side
of the plugin.
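A minimal sketch of the corrected comparison, assuming GstVideoInfo is available for both sides (the function and variable names are illustrative):

    #include <gst/video/video.h>

    /* Pick the YUV side of the conversion when checking for BT.709,
     * since the RGB side carries no matrix information. */
    static gboolean
    conversion_is_bt709 (const GstVideoInfo * in_info,
        const GstVideoInfo * out_info)
    {
      const GstVideoInfo *yuv_info;

      if (GST_VIDEO_INFO_IS_RGB (in_info))
        yuv_info = out_info;      /* RGB -> YUV: look at the output */
      else
        yuv_info = in_info;       /* YUV -> RGB: look at the input */

      return gst_video_colorimetry_matches (&yuv_info->colorimetry,
          GST_VIDEO_COLORIMETRY_BT709);
    }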
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6046>
Remove the optional sprop-stereo and sprop-maxcapturerate fields from Opus
remote offer caps before intersecting with local codec preferences.
According to https://datatracker.ietf.org/doc/html/rfc7587#section-7.1
those fields are sender-only informative, and don't affect
interoperability.
Fixes cases where the webrtc media will end up receive-only if the
local side wants to send stereo but the remote is sending mono, or
vice versa.
There may be other fields in other codecs, so the implementation
anticipates needing to add further fields and codecs in the future.
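A hedged sketch of the clean-up, standing apart from the actual webrtcbin code (the helper itself is hypothetical; the structure fields follow standard RTP caps, and remote_caps is assumed to be writable):

    #include <gst/gst.h>

    /* Drop sender-only informative Opus fields from the remote offer caps
     * so they can't make the intersection with local preferences fail. */
    static void
    strip_sender_only_opus_fields (GstCaps * remote_caps)
    {
      guint i, n = gst_caps_get_size (remote_caps);

      for (i = 0; i < n; i++) {
        GstStructure *s = gst_caps_get_structure (remote_caps, i);

        if (g_strcmp0 (gst_structure_get_string (s, "encoding-name"),
                "OPUS") == 0) {
          gst_structure_remove_fields (s, "sprop-stereo",
              "sprop-maxcapturerate", NULL);
        }
      }
    }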
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5999>
Test included.
The problem appears when the aggregator drops the query while
it's being processed by the klass->sink_query handler.
This can happen on a FLUSH_START event. If the query is still
in the queue, it can be safely dropped, but if it's already
in the klass->sink_query() handler, then the sink pad has no
choice but to wait for the processing to complete.
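A hedged sketch of the required synchronization, with hypothetical pad fields (the real GstAggregator code is structured differently):

    #include <gst/gst.h>

    /* Hypothetical per-pad state; not the actual GstAggregatorPad fields. */
    typedef struct
    {
      GMutex lock;
      GCond cond;
      gboolean query_in_flight;   /* TRUE while klass->sink_query () runs */
    } PadSketch;

    /* Called on FLUSH_START when a pending serialized query must be dropped. */
    static void
    drop_or_wait_for_query (PadSketch * pad, GQueue * queued_items,
        GstQuery * query)
    {
      g_mutex_lock (&pad->lock);
      if (!g_queue_remove (queued_items, query)) {
        /* Already handed to klass->sink_query (): wait for it to finish
         * before the query can be considered dropped. */
        while (pad->query_in_flight)
          g_cond_wait (&pad->cond, &pad->lock);
      }
      /* else: the query was still queued and could be dropped safely. */
      g_mutex_unlock (&pad->lock);
    }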
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5958>
This simplifies the way the closest caps to the preference are picked, and
takes the framerate into consideration to avoid picking a high resolution
at 5 fps or so.
Simply calculate a "distance" of caps A and B from the preference and put
the closest first, sorting by framerate first.
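A rough sketch of such a distance metric, assuming the comparison covers resolution and framerate (the weights and names below are illustrative, not the element's actual scoring):

    #include <glib.h>

    /* Smaller is closer to the preference; the framerate term is weighted
     * heavily so a huge resolution at ~5 fps doesn't win over a closer
     * framerate match. */
    static gdouble
    caps_distance (gint width, gint height, gdouble fps,
        gint pref_width, gint pref_height, gdouble pref_fps)
    {
      gdouble d_res = ABS (width * height - pref_width * pref_height);
      gdouble d_fps = ABS (fps - pref_fps);

      return d_res + d_fps * 10000.0;
    }

Sorting the candidate caps by this value in ascending order puts the closest match first.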
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5956>
Use gst_data_queue_push_force() for most events so they
are immediately enqueued. Only gap events and actual buffer
data will now block when the queue is full.
This fixes a problem with non-flushing seek handling
where events following a segment-done event would block
if they precede the SEGMENT event, since only SEGMENT
events would clear the 'eos' state of the multiqueue
queue.
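A hedged sketch of the enqueue decision (the item handling is simplified compared to the actual multiqueue code):

    #include <gst/gst.h>
    #include <gst/base/gstdataqueue.h>

    /* Serialized events other than GAP bypass the full-queue blocking so
     * they can't get stuck behind a segment-done. */
    static gboolean
    enqueue_item (GstDataQueue * queue, GstDataQueueItem * item,
        GstMiniObject * object)
    {
      if (GST_IS_EVENT (object) &&
          GST_EVENT_TYPE (GST_EVENT_CAST (object)) != GST_EVENT_GAP) {
        /* Events are enqueued immediately, even if the queue is full. */
        return gst_data_queue_push_force (queue, item);
      }

      /* Buffers and gap events may block until space is available. */
      return gst_data_queue_push (queue, item);
    }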
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5954>
Don't call wait_event() at all for gap events, as basesink will
end up waiting for the time that the gap event would be rendered
out at the audio device. There's no need to render it at all,
just treat it as a handy point to resync the audio if needed,
let the ringbuffer render silence, and place the next buffer
into the ringbuffer where it belongs.
The only thing we really need to do is make sure the ringbuffer
and clock are running, and wait for preroll.
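A minimal sketch of skipping the clock wait for gap events in a basesink subclass's wait_event vfunc (simplified: the actual audiobasesink change also ensures the ringbuffer and clock are started and preroll is waited for, and parent_class is assumed to be set up as usual):

    #include <gst/base/gstbasesink.h>

    static GstFlowReturn
    sink_wait_event (GstBaseSink * bsink, GstEvent * event)
    {
      if (GST_EVENT_TYPE (event) == GST_EVENT_GAP) {
        /* No clock wait: let the ringbuffer keep rendering silence and
         * place the next buffer where it belongs. */
        return GST_FLOW_OK;
      }

      /* Chain up for every other event type. */
      return GST_BASE_SINK_CLASS (parent_class)->wait_event (bsink, event);
    }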
Fixes https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/2749
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5953>
If GST_PAD_SINK is passed in, this means that we're supposed to convert
from sink caps to src caps, not the other way around. In other words, if
GST_PAD_SINK is passed, we're supposed to produce the possible output
caps.
Previously this was inverted. This had the effect that glcolorconvert
pretended to be able to convert *to* I420 without glDrawBuffers, which is
not possible, and pretended not to be able to convert *from* I420
without glDrawBuffers, which it always supports.
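For illustration, a hypothetical predicate capturing the direction semantics the fix restores (the real glcolorconvert transform_caps logic is more involved):

    #include <gst/gst.h>
    #include <gst/video/video.h>

    /* direction == GST_PAD_SINK means we're computing the possible *output*
     * formats, so that's where the glDrawBuffers (MRT) requirement for
     * multi-plane targets like I420 applies. */
    static gboolean
    format_allowed (GstPadDirection direction, GstVideoFormat format,
        gboolean have_draw_buffers)
    {
      gboolean multi_plane = (format == GST_VIDEO_FORMAT_I420);

      if (direction == GST_PAD_SINK) {
        /* Producing output caps: multi-plane targets need glDrawBuffers. */
        return !multi_plane || have_draw_buffers;
      }

      /* GST_PAD_SRC: computing acceptable input caps; converting *from*
       * I420 is always supported. */
      return TRUE;
    }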
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5947>