Using terminate() kills the whole process instead of just stopping the event loop, so we're back to the 'old' way.
However, if the provided function finishes too early, that can also fail
(stop() would be called on a not-yet-running NSApp).
Creating a delegate and waiting for its callback makes sure NSApp is
running before the actual main() is called.
Also, for whatever reason only the tutorials were changed to use
terminate(); now both tutorials and examples use identical code.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/merge_requests/1378>
Enables subclassing gst_rtsp_server::RTSPAuth and overriding its
authenticate/check/generate_authenticate_header methods.
Also adds new methods on RTSPContext to retrieve the RTSP
request/response and to get/replace tokens.
Additionally, RTSPMessage gained methods to add an authentication
header to a request and to retrieve authentication parameters from a
response.
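For illustration, subclassing could then look roughly like this, assuming
the usual gstreamer-rs subclassing pattern (an `RTSPAuthImpl` trait in the
subclass prelude plus a `parent_authenticate()` helper); trait and method
signatures here are a sketch, not verbatim from the bindings:

    // Assumes the crate names used throughout the examples
    // (gst_rtsp_server = gstreamer-rtsp-server) and a glib dependency for
    // the subclassing macros.
    mod imp {
        use gst_rtsp_server::subclass::prelude::*;

        #[derive(Default)]
        pub struct MyAuth;

        #[glib::object_subclass]
        impl ObjectSubclass for MyAuth {
            const NAME: &'static str = "MyRTSPAuth";
            type Type = super::MyAuth;
            type ParentType = gst_rtsp_server::RTSPAuth;
        }

        impl ObjectImpl for MyAuth {}

        impl RTSPAuthImpl for MyAuth {
            fn authenticate(&self, ctx: &gst_rtsp_server::RTSPContext) -> bool {
                // Look at the RTSP request/response or the token through the
                // new RTSPContext accessors here, then defer to the default
                // behaviour.
                self.parent_authenticate(ctx)
            }
        }
    }

    glib::wrapper! {
        pub struct MyAuth(ObjectSubclass<imp::MyAuth>)
            @extends gst_rtsp_server::RTSPAuth;
    }

The check()/generate_authenticate_header() defaults can be overridden the
same way.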
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/merge_requests/1359>
Glutin completely detached from `winit` in the `0.30` upgrade, concerning
itself exclusively with OpenGL and WSI APIs around it and leaving any
windowing system interop to the `raw-window-handle` crate, which was
specifically designed for this purpose.
This disentanglement massively cleans up and simplifies the `glutin`
codebase, and expands on surfaceless rendering as well as drawing to
simple views (textures) on the screen, as is common on Android, without
having control over the entire "window" and event loop.
Some winit boilerplate is, however, still provided as part of the
`glutin-winit` crate. Most of the `glutin`+`winit` flow in this
`glupload` example is adapted from `glutin`'s own example, following
platform-specific initialization sequences that heavily clutter the code
(only creating a window upfront on Windows, only forcing transparency on
macOS, and trying various fallback attributes to create a context).
At the same time `winit`'s `Event::Resumed` and `Event::Suspended`
event strategy is adopted: these events were previously emitted on
Android and iOS exclusively - where window handles come and go at the
mercy of the OS, rather than existing for the lifetime of the
application - but are now emitted on all platforms for consistency. A
`Surface` (via `RawWindowHandle`) is only available and usable after
`Event::Resumed`, where we can create a GL surface and "current" the
context on that surface for rendering. This is where the `GstPipeline`
will be set to `Playing` so that data starts flowing. The inverse should
happen in `Event::Suspended`, where the `Surface` has to be given up
again after un-currenting, before giving control back to the OS to free
the rest of the resources. This will, however, only be implemented when
Android is brought online for these examples.
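Condensed, the `Event::Resumed` arm then boils down to something like the
following sketch, assuming the `glutin 0.31` API and the crate renames used
by the examples; the helper name and parameters are made up here, the
display/config/not-yet-current context come from the setup phase, and error
handling is reduced to `unwrap()`:

    use std::num::NonZeroU32;

    use glutin::config::Config;
    use glutin::context::{NotCurrentContext, PossiblyCurrentContext};
    use glutin::display::Display;
    use glutin::prelude::*;
    use glutin::surface::{Surface, SurfaceAttributesBuilder, WindowSurface};
    use gst::prelude::*;
    use raw_window_handle::RawWindowHandle;

    fn on_resumed(
        gl_display: &Display,
        gl_config: &Config,
        not_current: NotCurrentContext,
        window_handle: RawWindowHandle,
        (width, height): (u32, u32),
        pipeline: &gst::Pipeline,
    ) -> (Surface<WindowSurface>, PossiblyCurrentContext) {
        // Only now, after Resumed, is there a native window to back a GL surface.
        let attrs = SurfaceAttributesBuilder::<WindowSurface>::new().build(
            window_handle,
            NonZeroU32::new(width).unwrap(),
            NonZeroU32::new(height).unwrap(),
        );
        let surface = unsafe {
            gl_display
                .create_window_surface(gl_config, &attrs)
                .unwrap()
        };

        // "Current" the not-yet-current context on the new surface.
        let context = not_current.make_current(&surface).unwrap();

        // With a usable surface and current context, let data start flowing.
        pipeline.set_state(gst::State::Playing).unwrap();

        (surface, context)
    }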
Finally, now that the `gst-gl-egl` and `gst-gl-x11` features turn on
the relevant features in `glutin` and `winit`, it is possible to easily
test `x11` on Wayland (over XWayland) without even unsetting
`WAYLAND_DISPLAY`, simply by compiling the whole stack without EGL/
Wayland support (in the previous example `winit` would always default to
a Wayland handle, while `glupload` could only create `GstGLDisplayX11`).
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/merge_requests/1336>
This is what you get with an untyped API. `glGetError()` further down
the line was returning `GL_INVALID_OPERATION` and failing other calls
after `load()` in the `glutin 0.31` upgrade. This turns out to be
[returned by `glGetProgramiv`] when the `program` that is passed in
does not refer to a program object, which was the case here: the
fragment shader identifier was passed in instead.
Just in case, we insert a few extra asserts that check the result of
`glGetError()` to catch such issues earlier in the chain, instead of
postponing them and falsely accusing code that runs later.
[returned by `glGetProgramiv`]: https://registry.khronos.org/OpenGL-Refpages/es2.0/xhtml/glGetProgramiv.xml#errors
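Such an assert could look roughly like this, assuming the
`gl_generator`-style bindings used by `glupload` (a `gl::Gl` struct with an
unsafe `GetError()` method); the helper name and call site are made up:

    fn assert_no_gl_error(gl: &gl::Gl, after: &str) {
        // glGetError() returns and clears a pending error flag, so checking
        // right after a suspicious call pins the blame on the right spot.
        let err = unsafe { gl.GetError() };
        assert_eq!(
            err,
            gl::NO_ERROR,
            "GL error 0x{:x} raised at or before {}",
            err,
            after
        );
    }

    // e.g. right after linking:
    //     gl.LinkProgram(program);
    //     assert_no_gl_error(&gl, "glLinkProgram");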
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/merge_requests/1336>
`GstGLDisplayWayland` calls `GstGLDisplayEGL::from_gl_display()` under
the hood (which calls `GstGLDisplayEGL::from_native()`, which calls
`eglGetPlatformDisplay()`) to retrieve the underlying `EGLDisplay`
handle, which thus far happened to be the same value as the one used by
`glutin`. However, newer `glutin 0.31` passes attributes to this
function, resulting in a different handle and causing all kinds of
trouble further down the line when sharing resources between `glutin`
and `gstreamer-rs`, which now each operate on a distinct `EGLDisplay`.
Furthermore `GstGLDisplayEGL` thinks that it uniquely owns the
handle returned by `eglGetPlatformDisplay()` and _does not_ set
`.foreign_display = TRUE` (which `GstGLDisplayEGL::with_egl_display()`
would), causing it to call `eglTerminate()` as soon as the
`GstGLDisplay` is destroyed, leaving `glutin` dysfunctional.
To solve all of this, simply remove this wrongly-behaving class from the
example, as it is not suitable for sharing an `EGLDisplay` with `glutin`.
It might however be interesting to create a separate example that
showcases how to use raw window handles instead of EGL/GLX handles,
though only Wayland and EGL-based platforms such as Android (via
`GstGLDisplayEGL::from_native()`) support this.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/merge_requests/1336>
It is annoying to see only a single line of error when debugging a chain
of (e.g. `anyhow::Context::context()`-created) errors, and to get a zero
exit code while at it.
Instead, Rust supports returning a `Result` type straight from main,
which is `Debug`- instead of `Display`-printed, so that - in the case of
`anyhow::Error` - all nested (via `.source()`) `Error`s are printed in
backtrace-like form, and the exit code is appropriately non-zero.
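A minimal sketch of the pattern (the file name is only an example):

    use anyhow::Context;

    fn main() -> anyhow::Result<()> {
        // On failure this Debug-prints the whole context/source chain
        // ("Error: ...", "Caused by: ...") and exits with a non-zero status.
        let _config = std::fs::read_to_string("settings.toml")
            .context("failed to read the settings file")?;
        Ok(())
    }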
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/merge_requests/1336>
In preparation for a more specialized VideoFrameGL, this extracts
common helper functions valid for all VideoFrames into a trait that can
be implemented without too much code duplication.
Note that this is a breaking change: VideoFrame and VideoFrameRef can no
longer really be used without importing the gst_video prelude.
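For example, code along these lines now needs the prelude import to
compile (assuming the usual crate renames from the examples):

    // Without this import the shared accessors are no longer found.
    use gst_video::prelude::*;

    fn frame_size(frame: &gst_video::VideoFrameRef<&gst::BufferRef>) -> (u32, u32) {
        (frame.width(), frame.height())
    }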
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/merge_requests/1312>
Previously, with add_watch()/add_watch_local() you had to remember to
call remove_watch() in order not to leak the bus, the watch source
and two associated file descriptors. Now these methods instead return an
object of type BusWatchGuard that will automatically remove the bus
watch when the object is dropped.
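A minimal sketch of the new flow, assuming the usual crate renames (with
current glib the closure returns `glib::ControlFlow::Continue`; older
versions used `glib::Continue(true)`):

    use gst::prelude::*;

    fn watch_bus(
        pipeline: &gst::Pipeline,
    ) -> Result<gst::bus::BusWatchGuard, gst::glib::BoolError> {
        let bus = pipeline.bus().expect("pipeline without bus");
        // The returned guard owns the watch: dropping it removes the watch,
        // its source and the two file descriptors, no remove_watch() needed.
        bus.add_watch(|_bus, msg| {
            if let gst::MessageView::Error(err) = msg.view() {
                eprintln!("bus error: {}", err.error());
            }
            gst::glib::ControlFlow::Continue
        })
    }

The caller only has to keep the returned guard alive, e.g. bound to a
variable, for as long as the watch should stay installed.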
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/merge_requests/1248>
Users can change the video player zoom using the following keys:
* +: Zoom in
* -: Zoom out
* Up/Down/Right/Left: Move the frame
* r: Reset the zoom
Mouse navigation events can also be used for a better UX.
Furthermore, it works with a pipeline using other video compositing
elements such as glvideomixer. For instance:
glvideomixer \
name=mix background=1 \
sink_0::xpos=0 sink_0::ypos=0 sink_0::zorder=0 \
sink_0::width={WIDTH} sink_0::height={HEIGHT} \
! glimagesinkelement \
gltestsrc pattern=mandelbrot name=src \
! video/x-raw(memory:GLMemory),framerate=30/1,width={WIDTH},height={HEIGHT},pixel-aspect-ratio=1/1 \
! queue \
! mix.sink_0
A probe was added on the sink pad to get the navigation events directly,
without the transformation done by the mixer; a sketch of the idea
follows below. More info about it in the merge request [1].
[1] https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/1495
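Sketched in code, the idea looks roughly like this; `video_sink` is
assumed to be the `glimagesinkelement` instance, and the probe placement
and names are illustrative rather than taken verbatim from the example:

    use gst::prelude::*;

    fn add_navigation_probe(video_sink: &gst::Element) {
        // Navigation events originate at the video sink and travel upstream,
        // so a probe here sees the raw coordinates before the mixer remaps
        // them.
        let sink_pad = video_sink
            .static_pad("sink")
            .expect("video sink has no sink pad");
        let _probe = sink_pad.add_probe(gst::PadProbeType::EVENT_UPSTREAM, |_pad, info| {
            if let Some(gst::PadProbeData::Event(ev)) = &info.data {
                if let gst::EventView::Navigation(_nav) = ev.view() {
                    // Drive the zoom/pan state from the untransformed
                    // coordinates here.
                }
            }
            gst::PadProbeReturn::Ok
        });
    }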
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer-rs/-/merge_requests/1217>