The InPlace/Taken logic was introduced to avoid using an extra lock
around the session, but it places expectations on when a session may or
may not be taken that are not always obvious to meet. Any code that
needs access to the sessions at all times thus requires either extra
logic in the session wrapper, or maintaining part of the session's
state outside of the session (e.g. mids).

This commit removes that logic and wraps sessions in `Arc<Mutex<Session>>`.
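A minimal sketch of the difference, with hypothetical names (not the
actual code):

```rust
use std::sync::{Arc, Mutex};

struct Session; // placeholder for the actual session state

// Before: callers had to know whether the session was currently taken
// out of its slot or still in place.
enum SessionWrapper {
    InPlace(Session),
    Taken,
}

// After: the session is always reachable; concurrent access is
// serialized by the mutex instead of the take/put-back protocol.
type SharedSession = Arc<Mutex<Session>>;
```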
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/1859>
RFC 7273 related attributes are set in the SDP offer by passing them via
the transceiver's `codec-preferences`. These attributes are intended to
be set at the media level, so they must be prefixed with `a-` in the
`Caps`; otherwise they end up under `a=fmtp`.
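For illustration, a hedged sketch; the attribute values are placeholders
and depend on the actual clock configuration:

```rust
use gst::prelude::*;

fn set_rfc7273_prefs(transceiver: &gst_webrtc::WebRTCRTPTransceiver) {
    // Media-level attributes are prefixed with "a-" so that they are
    // serialized as their own a= lines instead of under a=fmtp.
    let prefs = gst::Caps::builder("application/x-rtp")
        .field("media", "audio")
        .field("encoding-name", "OPUS")
        .field("a-ts-refclk", "ntp=/traceable/")
        .field("a-mediaclk", "direct=0")
        .build();
    transceiver.set_property("codec-preferences", &prefs);
}
```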
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/1811>
webrtcsink was starting the negotiation process on Ready while
concurrently moving the consumer pipeline to Playing. When answering,
the remote description could be set so fast that input streams were
connected (and the time format set on appsrc) before the state change
to Paused had completed. gst_base_src_start() then ran afterwards and
reset the format back to bytes, so the time segment that came in next
caused:

basesrc gstbasesrc.c:4255:gst_base_src_push_segment:<video_0> segment format mismatched, ignore

and the consumer pipeline errored out.

The same issue existed in theory when webrtcsink was creating the
offer, but it was much harder to trigger as it required the remote
answer to arrive before the state change to Paused had completed.

This commit fixes the issue by simply waiting for the state change to
Paused to complete before negotiating.
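A hedged sketch of the idea (the actual code differs):

```rust
use gst::prelude::*;

// Wait for the asynchronous state change to complete before
// negotiating, so gst_base_src_start() cannot later reset the appsrc
// format back to bytes.
fn wait_paused_then_negotiate(pipeline: &gst::Pipeline) {
    let (res, current, _pending) = pipeline.state(gst::ClockTime::NONE);
    if res.is_ok() && current >= gst::State::Paused {
        // ... safe to create the offer / set the remote description ...
    }
}
```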
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/1738>
We can use `is_some_and(...)` instead of `map_or(false, ...)`.

Also, in a few places the factory was retrieved multiple times, once
with unwrapping and another time with the `None` case handled
correctly. Instead of unwrapping, move the code into the path that
already handles the `None` case.
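An illustrative example of the first change (element and factory names
are arbitrary):

```rust
use gst::prelude::*;

fn is_opus_depay(element: &gst::Element) -> bool {
    // Before:
    // element.factory().map_or(false, |f| f.name() == "rtpopusdepay")
    //
    // After: `is_some_and` reads as a single question and avoids the
    // stray `false` default.
    element
        .factory()
        .is_some_and(|f| f.name() == "rtpopusdepay")
}
```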
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/1630>
- generate a new session id for every new client and use the session id
  in the resource url
- remove the producer-peer-id property in the WhipServer signaler as it
  is redundant to have a producer id in a session that has only one
  producer
- read the 'producer-peer-id' property on the signaller conditionally:
  if it exists use it, else use the session id as the producer id (see
  the sketch below)
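A hedged sketch of the conditional read; names are illustrative:

```rust
use gst::prelude::*;

fn producer_id(signaller: &gst::glib::Object, session_id: &str) -> String {
    // Only read the property when the signaller actually exposes it;
    // with a single producer per session, the session id is a valid
    // fallback producer id.
    if signaller.find_property("producer-peer-id").is_some() {
        signaller.property::<String>("producer-peer-id")
    } else {
        session_id.to_string()
    }
}
```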
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/1339>
- Add a new structure `Session` (see the sketch below)
- manage each producer using a session
- avoid sending EOS when a session terminates; instead keep running,
  waiting for any new producer to connect
- Maintain a bin element per session
  - each session bin encapsulates webrtcbin and the decoder if needed,
    as well as the parser and filter if requested by the application
    (through request-encoded-filter)
  - this will be helpful to clean up the session's respective elements
    when the corresponding producer terminates the session
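A minimal sketch of the structure, with illustrative fields only:

```rust
struct Session {
    id: String,
    // Bin wrapping webrtcbin plus the optional decoder, parser and
    // encoded filter; removing this one bin from the pipeline cleans
    // up everything the session owns when its producer goes away.
    bin: gst::Bin,
    webrtcbin: gst::Element,
}
```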
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/1339>
This commit adds an Android `webrtcsrc` based example with the following
features:
* A first view allows retrieving the producer list from the signaller (peer ids
are uuids, which are too long to type, especially using an onscreen keyboard).
* Selecting a producer opens a second view. The first available video stream is
rendered on a native Surface. All the audio streams are rendered using
`autoaudiosink`.
Available Settings:
* Signaller URI.
* A toggle to prefer hardware decoding for OPUS; otherwise the app defaults to
raising `opusdec`'s rank. Hardware decoding is not preferred by default since
it was found to crash the app on all tested devices (2 smartphones, 1 TV).
**Warning**: in order to ease testing, this demonstration application enables
unencrypted network communication. See `AndroidManifest.xml`.
The application uses the technologies currently proposed by Android Studio when
creating a new project:
* Kotlin as the default language, which is fully interoperable with Java and
uses the same SDK.
* Gradle 8.6.
* The Kotlin DSL for Gradle scripts. The structure is mostly the same as with
the previously preferred Groovy dialect, for which examples can readily be
found online.
* However, JNI code generation still uses Makefiles (instead of CMake) due to
the need to call [`gstreamer-1.0.mk`] for `gstreamer_android` generation.
Note: ongoing work on that front:
- https://gitlab.freedesktop.org/gstreamer/cerbero/-/merge_requests/1466
- https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6794
Current limitations:
* x86 support is currently discarded as `gstreamer_android` libs generation
fails (observed with `gstreamer-1.0-android-universal-1.24.3`).
* A selector could be added to let the user choose the video streams and
possibly decide whether to render all audio streams or just a selected one.
Nice to have:
* Support for the synchronization features of the `webrtc-precise-sync-recv`
example (NTP clock, RFC 7273).
* It would be nice to use Rust for the app-specific native code.
[`gstreamer-1.0.mk`]: https://gitlab.freedesktop.org/gstreamer/cerbero/-/blob/main/data/ndk-build/gstreamer-1.0.mk
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/1578>
If a user constrained the supported caps, for instance using the `video-caps`
property:
```shell
gst-launch-1.0 videotestsrc ! video/x-raw,format=I420 ! x264enc \
! webrtcsink video-caps=video/x-vp8
```
... a panic would occur which was internally caught, without the user being
informed except for the following message written to stderr:

> thread 'tokio-runtime-worker' panicked at net/webrtc/src/webrtcsink/imp.rs:3533:22:
> expected audio or video raw caps: video/x-h264, [...]
> note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
The pipeline kept running.
This commit converts the panic into an `Error` which bubbles up as a
`StreamError::CodecNotFound` element error that the application can handle.
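Conceptually, the change looks like this hedged sketch (the real
function names and messages differ):

```rust
// Return an error instead of panicking, so the caller can post a
// StreamError::CodecNotFound element error on the bus.
fn check_discovery_caps(caps: &gst::Caps) -> Result<(), anyhow::Error> {
    let s = caps
        .structure(0)
        .ok_or_else(|| anyhow::anyhow!("empty caps"))?;
    if !s.name().starts_with("audio/x-raw") && !s.name().starts_with("video/x-raw") {
        anyhow::bail!("Unsupported caps: {caps}");
    }
    Ok(())
}
```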
With the above `gst-launch`, this terminates the pipeline with:
> [...] ERROR webrtcsink net/webrtc/src/webrtcsink/imp.rs:3771:gstrswebrtc::
> webrtcsink::imp::BaseWebRTCSink::start_stream_discovery_if_needed::{{closure}}:<webrtcsink0>
> Error running discovery: Unsupported caps: video/x-h264, [...]
> ERROR: from element /GstPipeline:pipeline0/GstWebRTCSink:webrtcsink0:
> There is no codec present that can handle the stream's type.
> Additional debug info:
> net/webrtc/src/webrtcsink/imp.rs(3772): gstrswebrtc::webrtcsink::imp::BaseWebRTCSink::
> start_stream_discovery_if_needed::{{closure}} (): /GstPipeline:pipeline0/GstWebRTCSink:webrtcsink0:
> Failed to look up output caps: Unsupported caps: video/x-h264, [...]
> Execution ended after 0:00:00.055716661
> Setting pipeline to NULL ...
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/1540>
Clippy caught the missing `signal` feature, which is used by the WebRTC precise
synchronization examples. When running `cargo check`, `build` or `clippy`
without `no-default-dependencies`, this feature was already present due to
dependent crates.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/1541>