README: link to blog post, document multiparty example
Also add TODO stubs for MCU and SFU
commit 492d13a7c9 (parent 2e5204ae3b)
1 changed file with 42 additions and 2 deletions
@@ -1,7 +1,9 @@
-## GStreamer WebRTC demos
+# GStreamer WebRTC demos
All demos use the same signalling server in the `signalling/` directory
## Build
The GStreamer WebRTC implementation has now been merged upstream, so all
you need is the latest GStreamer git master, as of 2 February 2018 or later:
@@ -18,11 +20,23 @@ Or with Meson gst-build:
https://cgit.freedesktop.org/gstreamer/gst-build/
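A typical gst-build setup is just a Meson/Ninja build; the sketch below assumes a clone URL derived from the cgit link above and an arbitrary `builddir` name:

```console
$ git clone https://anongit.freedesktop.org/git/gstreamer/gst-build
$ cd gst-build
$ meson builddir
$ ninja -C builddir
```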
You may need to install the following packages using your package manager:
json-glib, libsoup, libnice, libnice-gstreamer1 (the GStreamer plugin for libnice)
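For example, on a Fedora-style system this might look like the following (package names are assumptions and vary by distribution; Debian/Ubuntu ship the equivalent `-dev` packages and `gstreamer1.0-nice`):

```console
$ sudo dnf install json-glib-devel libsoup-devel libnice-devel libnice-gstreamer1
```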
## Documentation
Currently, the best way to understand the API is to read the examples. This blog post breaking down the API should help with that:
http://blog.nirbheek.in/2018/02/gstreamer-webrtc.html
## Examples
### sendrecv: Send and receive audio and video
* Serve the `js/` directory on the root of your website, or open https://webrtc.nirbheek.in
  - The JS code assumes the signalling server is on port 8443 of the same server serving the HTML
-* Build and run the sources in the `gst/` directory on your machine
+* Build the sources in the `gst/` directory on your machine
```console
$ gcc webrtc-sendrecv.c $(pkg-config --cflags --libs gstreamer-webrtc-1.0 gstreamer-sdp-1.0 libsoup-2.4 json-glib-1.0) -o webrtc-sendrecv
```
@@ -31,3 +45,29 @@ $ gcc webrtc-sendrecv.c $(pkg-config --cflags --libs gstreamer-webrtc-1.0 gstrea
* Open the website in a browser and ensure that the status is "Registered with server, waiting for call", and note the `id` too.
* Run `webrtc-sendrecv --peer-id=ID` with the `id` from the browser. You will see state changes and an SDP exchange; an example invocation is shown after this list.
* You will see a bouncing ball + hear red noise in the browser, and your browser's webcam + mic in the gst app
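For instance, if the browser shows an `id` of 1234 (an arbitrary example value), the invocation would be:

```console
$ ./webrtc-sendrecv --peer-id=1234
```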
TODO: Port to Python and Rust.
### multiparty-sendrecv: Multiparty audio conference with N peers
* Build the sources in the `gst/` directory on your machine
```console
$ gcc mp-webrtc-sendrecv.c $(pkg-config --cflags --libs gstreamer-webrtc-1.0 gstreamer-sdp-1.0 libsoup-2.4 json-glib-1.0) -o mp-webrtc-sendrecv
```
* Run `mp-webrtc-sendrecv --room-id=ID` with `ID` as a room name. The peer will connect to the signalling server and set up a conference room; see the example session after this list.
* Run this as many times as you like, each will spawn a peer that sends red noise and outputs the red noise it receives from other peers.
  - To change what a peer sends, find the `audiotestsrc` element in the source and change the `wave` property.
  - You can, of course, also replace `audiotestsrc` itself with `autoaudiosrc` (any platform) or `pulsesrc` (on Linux).
* TODO: implement JS to do the same, derived from the JS for the `sendrecv` example.
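As a quick sketch (the room name `testroom` is just an arbitrary example), you could start two peers in the same room and then check which waveforms `audiotestsrc` offers:

```console
$ ./mp-webrtc-sendrecv --room-id=testroom &
$ ./mp-webrtc-sendrecv --room-id=testroom &
$ gst-inspect-1.0 audiotestsrc    # the 'wave' property lists the available waveforms
```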
### TODO: Selective Forwarding Unit (SFU) example
* Server routes media between peers
* Participant sends 1 stream, receives n-1 streams
### TODO: Multipoint Control Unit (MCU) example
* Server mixes media from all participants
* Participant sends 1 stream, receives 1 stream