webrtc: Add gstwebrtc-api subproject in net/webrtc plugin

This subproject adds a high-level web API compatible with the GStreamer
webrtcsrc and webrtcsink elements and the corresponding signaling
server. It enables seamless bidirectional communication between the
HTML5 WebRTC API and native GStreamer.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/merge_requests/946>
Author: Loïc Le Page
Date: 2022-10-18 18:46:35 +02:00
Parent: 8845f6a4c6
Commit: f17622a1e1
30 changed files with 6342 additions and 4490 deletions


@@ -400,6 +400,16 @@ rustfmt:
- cargo fmt --version
- cargo fmt -- --color=always --check
gstwebrtc-api lint:
image: node:lts
stage: "lint"
rules:
- when: 'always'
script:
- cd net/webrtc/gstwebrtc-api
- npm install
- npm run check
check commits:
extends: .img-stable
stage: "lint"


@@ -1,13 +1,19 @@
# webrtcsink
# webrtcsink and webrtcsrc
All-batteries included GStreamer WebRTC producer, that tries its best to do The Right Thing™.
All-batteries included GStreamer WebRTC producer and consumer, that try their
best to do The Right Thing™.
It also provides a flexible and all-purposes WebRTC signalling server
([gst-webrtc-signalling-server](signalling/src/bin/server.rs)) and a Javascript
API ([gstwebrtc-api](gstwebrtc-api)) to produce and consume compatible WebRTC
streams from a web browser.
## Use case
The [webrtcbin] element in GStreamer is extremely flexible and powerful, but using
it can be a difficult exercise. When all you want to do is serve a fixed set of streams
to any number of consumers, `webrtcsink` (which wraps `webrtcbin` internally) can be a
useful alternative.
The [webrtcbin] element in GStreamer is extremely flexible and powerful, but
using it can be a difficult exercise. When all you want to do is serve a fixed
set of streams to any number of consumers, `webrtcsink` (which wraps
`webrtcbin` internally) can be a useful alternative.
[webrtcbin]: https://gstreamer.freedesktop.org/documentation/webrtc/index.html
@@ -15,25 +21,27 @@ useful alternative.
`webrtcsink` implements the following features:
* Built-in signaller: when using the default signalling server, this element will
perform signalling without requiring application interaction.
* Built-in signaller: when using the default signalling server, this element
will perform signalling without requiring application interaction.
This makes it usable directly from `gst-launch`.
* Application-provided signalling: `webrtcsink` can be instantiated by an application
with a custom signaller. That signaller must be a GObject, and must implement the
`Signallable` interface as defined [here](src/webrtcsink/mod.rs). The
[default signaller](src/signaller/mod.rs) can be used as an example.
* Application-provided signalling: `webrtcsink` can be instantiated by an
application with a custom signaller. That signaller must be a GObject, and
must implement the `Signallable` interface as defined
[here](src/webrtcsink/mod.rs). The [default signaller](src/signaller/mod.rs)
can be used as an example.
An [example project] is also available to use as a boilerplate for implementing
and using a custom signaller.
An [example project] is also available to use as a boilerplate for
implementing and using a custom signaller.
* Sandboxed consumers: when a consumer is added, its encoder / payloader / webrtcbin
elements run in a separately managed pipeline. This provides a certain level of
sandboxing, as opposed to having those elements running inside the element itself.
* Sandboxed consumers: when a consumer is added, its encoder / payloader /
webrtcbin elements run in a separately managed pipeline. This provides a
certain level of sandboxing, as opposed to having those elements running
inside the element itself.
It is important to note that at this moment, encoding is not shared between consumers.
While this is not on the roadmap at the moment, nothing in the design prevents
implementing this optimization.
It is important to note that at this moment, encoding is not shared between
consumers. While this is not on the roadmap at the moment, nothing in the
design prevents implementing this optimization.
* Congestion control: the element leverages transport-wide congestion control
feedback messages in order to adapt the bitrate of individual consumers' video
@@ -45,20 +53,20 @@ useful alternative.
* Packet loss mitigation: webrtcsink now supports sending protection packets for
Forward Error Correction, modulating the amount as a function of the available
bandwidth, and can honor retransmission requests. Both features can be disabled
via properties.
bandwidth, and can honor retransmission requests. Both features can be
disabled via properties.
It is important to note that full control over the individual elements used by
`webrtcsink` is *not* on the roadmap, as it will act as a black box in that respect,
for example `webrtcsink` wants to reserve control over the bitrate for congestion
control.
`webrtcsink` is *not* on the roadmap, as it will act as a black box in that
respect; for example, `webrtcsink` wants to reserve control over the bitrate
for congestion control.
However, a signal is now available for the application to provide the initial
configuration for the encoders `webrtcsink` instantiates.
If more granular control is required, applications should use `webrtcbin` directly,
`webrtcsink` will focus on trying to just do the right thing, although it might
expose more interfaces to guide and tune the heuristics it employs.
If more granular control is required, applications should use `webrtcbin`
directly; `webrtcsink` will focus on trying to just do the right thing,
although it might expose more interfaces to guide and tune the heuristics it
employs.
[example project]: https://github.com/centricular/webrtcsink-custom-signaller
@@ -74,38 +82,49 @@ cargo build
## Usage
Open three terminals. In the first, run:
Open three terminals. In the first one, run the signalling server:
``` shell
WEBRTCSINK_SIGNALLING_SERVER_LOG=debug cargo run --bin gst-webrtc-signalling-server
```
In the second, run:
In the second one, run a web browser client (which can produce and consume streams):
``` shell
python3 -m http.server -d www/
cd gstwebrtc-api
npm install
npm start
```
In the third, run:
In the third one, run a webrtcsink producer from a GStreamer pipeline:
``` shell
export GST_PLUGIN_PATH=$PWD/target/debug:$GST_PLUGIN_PATH
gst-launch-1.0 webrtcsink name=ws videotestsrc ! ws. audiotestsrc ! ws.
gst-launch-1.0 webrtcsink name=ws meta="meta,name=gst-stream" videotestsrc ! ws. audiotestsrc ! ws.
```
When the pipeline above is running successfully, open a browser and
point it to the http server:
The webrtcsink-produced stream will appear on the web page served above
(automatically opened at https://localhost:9090) under the name "gst-stream".
If you click on it, you should see a test video stream and hear a test tone.
You can also produce WebRTC streams from the web browser and consume them with
a GStreamer pipeline. Click on the "Start Capture" button and copy the
"Client ID" value.
Then open a new terminal and run:
``` shell
gio open http://127.0.0.1:8000
export GST_PLUGIN_PATH=$PWD/target/debug:$GST_PLUGIN_PATH
gst-launch-1.0 playbin uri=gstwebrtc://127.0.0.1:8443?peer-id=[Client ID]
```
You should see an identifier listed in the left-hand panel, click on
it. You should see a test video stream, and hear a test tone.
Replace the "peer-id" value with the previously copied "Client ID" value. You
should see the playbin element open a window showing the content produced by
the web page.
## Configuration
The element itself can be configured through its properties, see
The webrtcsink element itself can be configured through its properties, see
`gst-inspect-1.0 webrtcsink` for more information about that. In addition, the
default signaller also exposes properties for configuring it, in
particular for setting the signalling server address; those properties
@@ -122,15 +141,15 @@ gst-launch-1.0 webrtcsink signaller::address="ws://127.0.0.1:8443" ..
with the content, for example move with your mouse, entering keys with the
keyboard, etc... On top of that a `WebRTCDataChannel` based protocol has been
implemented and can be activated with the `enable-data-channel-navigation=true`
property. The [demo](www/) implements the protocol and you can easily test this
feature, using the [`wpesrc`] for example.
property. The [gstwebrtc-api](gstwebrtc-api) implements the protocol, and you
can easily test this feature using [`wpesrc`], for example.
As an example, the following pipeline allows you to navigate the GStreamer
documentation inside the video running within your web browser (in
http://127.0.0.1:8000 if you followed previous steps of that readme):
documentation inside the video running within your web browser (at
https://127.0.0.1:9090 if you followed the previous steps in this readme):
``` shell
gst-launch-1.0 wpesrc location=https://gstreamer.freedesktop.org/documentation/ ! webrtcsink enable-data-channel-navigation=true
gst-launch-1.0 wpesrc location=https://gstreamer.freedesktop.org/documentation/ ! queue ! webrtcsink enable-data-channel-navigation=true meta="meta,name=web-stream"
```
[`GstNavigation`]: https://gstreamer.freedesktop.org/documentation/video/gstnavigation.html
@@ -139,16 +158,16 @@ gst-launch-1.0 wpesrc location=https://gstreamer.freedesktop.org/documentation/
## Testing congestion control
For the purpose of testing congestion in a reproducible manner, a
[simple tool] has been used, I only used it on Linux but it is documented
as usable on MacOS too. I had to run the client browser on a separate
machine on my local network for congestion to actually be applied, I didn't
look into why that was necessary.
[simple tool] has been used. It has only been used on Linux, but it is
documented as usable on macOS too. The client web browser has to be launched
on a separate machine on the LAN to test for congestion, although specific
configurations may allow running it on the same machine.
My testing procedure was:
The testing procedure was:
* identify the server machine network interface (eg with `ifconfig` on Linux)
* identify the server machine network interface (e.g. with `ifconfig` on Linux)
* identify the client machine IP address (eg with `ifconfig` on Linux)
* identify the client machine IP address (e.g. with `ifconfig` on Linux)
* start the various services as explained in the Usage section (use
`GST_DEBUG=webrtcsink:7` to get detailed logs about congestion control)
@@ -171,7 +190,7 @@ My testing procedure was:
$HOME/go/bin/comcast --device=$SERVER_INTERFACE --stop
```
For comparison, the congestion control property can be set to disabled on
For comparison, the congestion control property can be set to "disabled" on
webrtcsink, then the above procedure applied again; the expected result is
for playback to simply grind to a halt until the bandwidth limitation
is lifted:
@@ -184,18 +203,18 @@ gst-launch-1.0 webrtcsink congestion-control=disabled
## Monitoring tool
An example server / client application for monitoring per-consumer stats
An example client/server application for monitoring per-consumer stats
can be found [here].
[here]: https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/tree/main/net/webrtc/examples
## License
All the rust code in this repository is licensed under the [Mozilla Public License Version 2.0].
All the Rust code in this repository is licensed under the
[Mozilla Public License Version 2.0].
Parts of the JavaScript code in the www/ example are licensed under the
[Apache License, Version 2.0]; the rest is licensed under the
[Mozilla Public License Version 2.0] unless otherwise stated in the header.
Code in [gstwebrtc-api](gstwebrtc-api) is also licensed under the
[Mozilla Public License Version 2.0].
## Using the AWS KVS signaller
@@ -212,4 +231,3 @@ AWS_ACCESS_KEY_ID="XXX" AWS_SECRET_ACCESS_KEY="XXX" gst-launch-1.0 videotestsrc
* Connect a viewer @ <https://awslabs.github.io/amazon-kinesis-video-streams-webrtc-sdk-js/examples/index.html>
[Mozilla Public License Version 2.0]: http://opensource.org/licenses/MPL-2.0
[Apache License, Version 2.0]: https://www.apache.org/licenses/LICENSE-2.0


@@ -0,0 +1,9 @@
root = true
[*]
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
charset = utf-8
indent_style = space
indent_size = 2


@@ -0,0 +1,61 @@
{
"root": true,
"parserOptions": {
"ecmaVersion": 2017,
"sourceType": "module"
},
"env": {
"browser": true,
"es6": true
},
"extends": "eslint:recommended",
"rules": {
"getter-return": "error",
"no-await-in-loop": "error",
"no-console": "off",
"no-extra-parens": "off",
"no-template-curly-in-string": "error",
"consistent-return": "error",
"curly": "error",
"eqeqeq": "error",
"no-eval": "error",
"no-extra-bind": "error",
"no-invalid-this": "error",
"no-labels": "error",
"no-lone-blocks": "error",
"no-loop-func": "error",
"no-multi-spaces": "error",
"no-return-assign": "error",
"no-return-await": "error",
"no-self-compare": "error",
"no-throw-literal": "error",
"no-unused-expressions": "error",
"no-useless-call": "error",
"no-useless-concat": "error",
"no-useless-return": "error",
"no-void": "error",
"no-shadow": "error",
"block-spacing": "error",
"brace-style": [
"error",
"1tbs",
{
"allowSingleLine": true
}
],
"camelcase": "error",
"comma-dangle": "error",
"eol-last": "error",
"indent": [
"error",
2
],
"linebreak-style": "error",
"new-parens": "error",
"no-lonely-if": "error",
"no-multiple-empty-lines": "error",
"no-trailing-spaces": "error",
"quotes": "error",
"semi": "error"
}
}


@@ -0,0 +1 @@
* text=auto eol=lf

net/webrtc/gstwebrtc-api/.gitignore (vendored, new file)

@@ -0,0 +1,4 @@
/dist/
/docs/
/node_modules/
/*.tgz


@@ -0,0 +1,5 @@
engine-strict = true
fund = false
git-tag-version = false
package-lock = false
save-exact = true


@@ -0,0 +1 @@
../../../LICENSE-MPL-2.0


@@ -0,0 +1,150 @@
# gstwebrtc-api
[![License: MPL 2.0](https://img.shields.io/badge/License-MPL_2.0-brightgreen.svg)](https://opensource.org/licenses/MPL-2.0)
JavaScript API used to integrate GStreamer WebRTC streams produced and consumed by the webrtcsink and webrtcsrc
elements into a web browser or a mobile WebView.
This API enables full bidirectional interconnection between GStreamer and web interfaces for real-time streaming
using the WebRTC protocol.
This API is released under the Mozilla Public License Version 2.0 (MPL-2.0) that can be found in the LICENSE-MPL-2.0
file or at https://opensource.org/licenses/MPL-2.0
Copyright (C) 2022 Igalia S.L. <<info@igalia.com>><br>
Author: Loïc Le Page <<llepage@igalia.com>>
It includes external source code from [webrtc-adapter](https://github.com/webrtcHacks/adapter) that is embedded with
the API. The webrtc-adapter BSD 3-Clause license is available at
https://github.com/webrtcHacks/adapter/blob/master/LICENSE.md
Webrtc-adapter is Copyright (c) 2014, The WebRTC project authors, All rights reserved.<br>
Copyright (c) 2018, The adapter.js project authors, All rights reserved.
It also includes Keyboard.js source code from the Apache [guacamole-client](https://github.com/apache/guacamole-client)
that is embedded with the API to enable remote control of the webrtcsink producer.
Keyboard.js is released under the Apache 2.0 license available at:
https://github.com/apache/guacamole-client/blob/master/LICENSE
## Building the API
The GstWebRTC API uses [Webpack](https://webpack.js.org/) to bundle all source files and dependencies together.
You only need to install [Node.js](https://nodejs.org/en/) to run all commands.
The first time, install the dependencies by calling:
```shell
$ npm install
```
Then build the bundle by calling:
```shell
$ npm run make
```
It will build and compress the code into the *dist/* folder, where you will find 2 files:
- *gstwebrtc-api-[version].min.js* which is the only file you need to include into your web application to use the API.
It already embeds all dependencies.
- *gstwebrtc-api-[version].min.js.map* which is useful for debugging the API code. Put it in the same folder as the
API script on your web server if you want to allow debugging; otherwise you can just ignore it.
The API documentation is generated in the *docs/* folder. It is automatically created when building the whole API.
If you want to build the documentation only, you can call:
```shell
$ npm run docs
```
If you only want to build the API without the documentation, you can call:
```shell
$ npm run build
```
## Packaging the API
You can create a portable package of the API by calling:
```shell
$ npm pack
```
It will create a *gstwebrtc-api-[version].tgz* file that contains all source code, documentation and built API. This
portable package can be installed as a dependency in any Node.js project by calling:
```shell
$ npm install gstwebrtc-api-[version].tgz
```
## Testing and debugging the API
To easily test and debug the GstWebRTC API, you just need to:
1. launch the webrtc signalling server by calling (from the repository *gst-plugins-rs* root folder):
```shell
$ cargo run --bin gst-webrtc-signalling-server
```
2. launch the GstWebRTC API server by calling (from the *net/webrtc/gstwebrtc-api* sub-folder):
```shell
$ npm start
```
It will launch a local HTTPS server listening on port 9090 and using an automatically generated self-signed
certificate.
With this server you can test the reference example shipped in *index.html* from a web browser on your local computer
or a mobile device.
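The reference page drives the API through a global `gstWebRTCConfig` object that the bundle reads when it is loaded.
The sketch below mirrors what *index.html* does, except that the server address is hardcoded to the development
server's defaults (an assumption for illustration; the real page derives it from `window.location`):

```javascript
// Configuration object read by the gstwebrtc-api bundle at load time.
// It must be set on window *before* the API script is executed.
const config = {
  // Free-form metadata attached to this peer on the signalling server.
  meta: { name: `WebClient-${Date.now()}` },
  // wss:// because the development server (npm start) serves HTTPS;
  // use ws:// when the page is served over plain HTTP.
  signalingServerUrl: "wss://127.0.0.1:9090/webrtc"
};

if (typeof window !== "undefined") {
  window.gstWebRTCConfig = config;
}
```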
## Interconnect with GStreamer pipelines
Once the signalling and gstwebrtc-api servers are launched, you can interconnect the streams produced and consumed
from the web browser with GStreamer pipelines using the webrtcsink and webrtcsrc elements.
### Consume a WebRTC stream produced by the gstwebrtc-api
On the web browser side, click on the *Start Capture* button and give access to the webcam. The gstwebrtc-api will
start producing a video stream.
The signalling server logs will show the registration of a new producer with the same *peer_id* as the *Client ID*
that appears on the webpage.
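Under the hood, the *Start Capture* button in *index.html* does roughly the following (a trimmed sketch based on the
page's own script; `gstWebRTCAPI` is the global exposed by the bundled API, and error handling is omitted):

```javascript
// Browser-only sketch: this is a no-op outside a page that loads the
// gstwebrtc-api bundle and has webcam access.
async function startProducing() {
  if (typeof gstWebRTCAPI === "undefined" || typeof navigator === "undefined") {
    return null; // not running inside a browser with the API loaded
  }
  // Ask for the webcam, then publish the captured stream to the
  // signalling server as a new producer.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { width: 1280, height: 720 }
  });
  const session = gstWebRTCAPI.createProducerSession(stream);
  session.addEventListener("stateChanged", (event) => {
    if (event.target.state === gstWebRTCAPI.SessionState.streaming) {
      console.log("stream is now listed on the signalling server");
    }
  });
  session.start();
  return session;
}
```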
Then launch the following GStreamer pipeline:
```shell
$ gst-launch-1.0 playbin uri=gstwebrtc://[signalling server]?peer-id=[client ID of the producer]
```
Using the local signalling server, it will look like this:
```shell
$ gst-launch-1.0 playbin uri=gstwebrtc://127.0.0.1:8443?peer-id=e54e5d6b-f597-4e8f-bc96-2cc3765b6567
```
The underlying *uridecodebin* element recognizes the *gstwebrtc://* scheme as a WebRTC stream compatible with the
gstwebrtc-api and will correctly use a *webrtcsrc* element to manage this stream.
The *gstwebrtc://* scheme is used for normal WebSocket connections to the signalling server, and the *gstwebrtcs://*
scheme for secured connections over SSL or TLS.
### Produce a GStreamer WebRTC stream consumed by the gstwebrtc-api
Launch the following GStreamer pipeline:
```shell
$ gst-launch-1.0 videotestsrc ! agingtv ! webrtcsink meta="meta,name=native-stream"
```
By default the *webrtcsink* element uses *ws://127.0.0.1:8443* as the signalling server address, so there is no need
for more arguments. If you're hosting the signalling server elsewhere, you can specify its address by adding
`signaller::address="ws[s]://[signalling server]"` to the list of *webrtcsink* properties.
Once the GStreamer pipeline is launched, you will see the registration of a new producer in the logs of the
signalling server, and a new remote stream, with the name *native-stream*, will appear on the webpage.
You just need to click on the corresponding entry to connect as a consumer to the remote native stream.
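In code, the webpage's click handler boils down to the following (a trimmed sketch of what *index.html* does with
`gstWebRTCAPI`; session cleanup and error handling are omitted):

```javascript
// Browser-only sketch: connect as a consumer to a producer listed by the API.
// producerId is one of the ids returned by gstWebRTCAPI.getAvailableProducers().
function consumeStream(producerId, videoElement) {
  if (typeof gstWebRTCAPI === "undefined") {
    return null; // the gstwebrtc-api bundle is not loaded
  }
  const session = gstWebRTCAPI.createConsumerSession(producerId);
  if (!session) {
    return null;
  }
  // When the remote media arrives, attach the first stream to the <video> tag.
  session.addEventListener("streamsChanged", () => {
    if (session.streams.length > 0) {
      videoElement.srcObject = session.streams[0];
      videoElement.play().catch(() => {});
    }
  });
  session.connect();
  return session;
}
```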
### Produce a GStreamer interactive WebRTC stream with remote control
Launch the following GStreamer pipeline:
```shell
$ gst-launch-1.0 wpesrc location=https://gstreamer.freedesktop.org/documentation ! queue ! webrtcsink enable-data-channel-navigation=true meta="meta,name=web-stream"
```
Once the GStreamer pipeline is launched, you will see a new producer with the name *web-stream*. When connecting to
this producer you will see the remote rendering of the web page. You can interact remotely with this web page:
controls are sent through a special WebRTC data channel while the rendering is done remotely by the
[wpesrc](https://gstreamer.freedesktop.org/documentation/wpe/wpesrc.html) element.
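On the consumer side, remote control is surfaced through the session's `remoteController` property, as used by the
reference *index.html*. The sketch below (the helper name is hypothetical) attaches the controller to the video
element so that local mouse and keyboard events are forwarded over the data channel:

```javascript
// Forward local input to the producer whenever the consumer session
// exposes a remote controller (i.e. the producer enabled the data channel).
function enableRemoteControl(session, videoElement) {
  if (!session || typeof session.addEventListener !== "function") {
    return false;
  }
  session.addEventListener("remoteControllerChanged", () => {
    const controller = session.remoteController;
    if (controller) {
      // Mouse and keyboard events on the video element are now sent
      // through the WebRTC data channel to the remote producer.
      controller.attachVideoElement(videoElement);
    }
  });
  return true;
}
```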


@@ -0,0 +1,456 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>GstWebRTC API</title>
<style>
body {
background-color: #3a3f44;
color: #c8c8c8;
}
section {
border-top: 2px solid #272b30;
}
main {
border-bottom: 2px solid #272b30;
padding-bottom: 1em;
}
.button {
cursor: pointer;
border-radius: 10px;
user-select: none;
}
.button:disabled {
cursor: default;
}
button.button {
box-shadow: 4px 4px 14px 1px #272b30;
border: none;
}
.spinner {
display: inline-block;
position: absolute;
width: 80px;
height: 80px;
}
.spinner>div {
box-sizing: border-box;
display: block;
position: absolute;
width: 64px;
height: 64px;
margin: 8px;
border: 8px solid #fff;
border-radius: 50%;
animation: spinner 1.2s cubic-bezier(0.5, 0, 0.5, 1) infinite;
border-color: #fff transparent transparent transparent;
}
.spinner div:nth-child(1) {
animation-delay: -0.45s;
}
.spinner div:nth-child(2) {
animation-delay: -0.3s;
}
.spinner div:nth-child(3) {
animation-delay: -0.15s;
}
@keyframes spinner {
0% {
transform: rotate(0deg);
}
100% {
transform: rotate(360deg);
}
}
video:focus-visible,
video:focus {
outline: none;
}
div.video {
position: relative;
display: inline-block;
margin: 1em;
}
div.video>div.fullscreen {
position: absolute;
top: 0;
right: 0;
width: 2.6em;
height: 2.6em;
}
div.video>div.fullscreen>span {
position: absolute;
top: 0.3em;
right: 0.4em;
font-size: 1.5em;
font-weight: bolder;
cursor: pointer;
user-select: none;
display: none;
text-shadow: 1px 1px 4px #272b30;
}
div.video>video {
width: 320px;
height: 240px;
background-color: #202020;
border-radius: 15px;
box-shadow: 4px 4px 14px 1px #272b30;
}
div.video>.spinner {
top: 80px;
left: 120px;
}
#capture {
padding-top: 1.2em;
}
#capture>.button {
vertical-align: top;
margin-top: 1.5em;
margin-left: 1em;
background-color: #98d35e;
width: 5em;
height: 5em;
}
#capture>.client-id {
display: block;
}
#capture>.client-id::before {
content: "Client ID:";
margin-right: 0.5em;
}
#capture.has-session>.button {
background-color: #e36868;
}
#capture>.button::after {
content: "Start Capture";
}
#capture.has-session>.button::after {
content: "Stop Capture";
}
#capture .spinner {
display: none;
}
#capture.starting .spinner {
display: inline-block;
}
#remote-streams {
list-style: none;
padding-left: 1em;
}
#remote-streams>li .button::before {
content: "\2799";
padding-right: 0.2em;
}
#remote-streams>li.has-session .button::before {
content: "\2798";
}
#remote-streams>li div.video {
display: none;
}
#remote-streams>li.has-session div.video {
display: inline-block;
}
#remote-streams>li.streaming .spinner {
display: none;
}
#remote-streams>li.streaming>div.video>div.fullscreen:hover>span {
display: block;
}
#remote-streams .remote-control {
display: none;
position: absolute;
top: 0.2em;
left: 0.3em;
font-size: 1.8em;
font-weight: bolder;
animation: blink 1s ease-in-out infinite alternate;
text-shadow: 1px 1px 4px #272b30;
}
@keyframes blink {
to {
opacity: 0;
}
}
#remote-streams>li.streaming.has-remote-control .remote-control {
display: block;
}
#remote-streams>li.streaming.has-remote-control>div.video>video {
width: 640px;
height: 480px;
}
</style>
<script>
const signalingProtocol = window.location.protocol.startsWith("https") ? "wss" : "ws";
window.gstWebRTCConfig = {
meta: { name: `WebClient-${Date.now()}` },
signalingServerUrl: `${signalingProtocol}://${window.location.host}/webrtc`
};
function initCapture() {
const captureSection = document.getElementById("capture");
const clientIdElement = captureSection.querySelector(".client-id");
const videoElement = captureSection.getElementsByTagName("video")[0];
const listener = {
connected: function(clientId) { clientIdElement.textContent = clientId; },
disconnected: function() { clientIdElement.textContent = "none"; }
};
gstWebRTCAPI.registerConnectionListener(listener);
document.getElementById("capture-button").addEventListener("click", (event) => {
event.preventDefault();
if (captureSection._producerSession) {
captureSection._producerSession.close();
} else if (!captureSection.classList.contains("starting")) {
captureSection.classList.add("starting");
const constraints = {
video: { width: 1280, height: 720 }
};
navigator.mediaDevices.getUserMedia(constraints).then((stream) => {
const session = gstWebRTCAPI.createProducerSession(stream);
if (session) {
captureSection._producerSession = session;
session.addEventListener("error", (event) => {
if (captureSection._producerSession === session) {
console.error(event.message, event.error);
}
});
session.addEventListener("closed", () => {
if (captureSection._producerSession === session) {
videoElement.pause();
videoElement.srcObject = null;
captureSection.classList.remove("has-session", "starting");
delete captureSection._producerSession;
}
});
session.addEventListener("stateChanged", (event) => {
if ((captureSection._producerSession === session) &&
(event.target.state === gstWebRTCAPI.SessionState.streaming)) {
videoElement.srcObject = stream;
videoElement.play().catch(() => {});
captureSection.classList.remove("starting");
}
});
session.addEventListener("clientConsumerAdded", (event) => {
if (captureSection._producerSession === session) {
console.info(`client consumer added: ${event.detail.peerId}`);
}
});
session.addEventListener("clientConsumerRemoved", (event) => {
if (captureSection._producerSession === session) {
console.info(`client consumer removed: ${event.detail.peerId}`);
}
});
captureSection.classList.add("has-session");
session.start();
} else {
for (const track of stream.getTracks()) {
track.stop();
}
captureSection.classList.remove("starting");
}
}).catch((error) => {
console.error("cannot have access to webcam and microphone", error);
captureSection.classList.remove("starting");
});
}
});
}
function initRemoteStreams() {
const remoteStreamsElement = document.getElementById("remote-streams");
const listener = {
producerAdded: function(producer) {
const producerId = producer.id;
if (!document.getElementById(producerId)) {
remoteStreamsElement.insertAdjacentHTML("beforeend",
`<li id="${producerId}">
<div class="button">${producer.meta.name || producerId}</div>
<div class="video">
<div class="spinner">
<div></div>
<div></div>
<div></div>
<div></div>
</div>
<span class="remote-control">&#xA9;</span>
<video></video>
<div class="fullscreen"><span title="Toggle fullscreen">&#x25A2;</span></div>
</div>
</li>`);
const entryElement = document.getElementById(producerId);
const videoElement = entryElement.getElementsByTagName("video")[0];
videoElement.addEventListener("playing", () => {
if (entryElement.classList.contains("has-session")) {
entryElement.classList.add("streaming");
}
});
entryElement.addEventListener("click", (event) => {
event.preventDefault();
if (!event.target.classList.contains("button")) {
return;
}
if (entryElement._consumerSession) {
entryElement._consumerSession.close();
} else {
const session = gstWebRTCAPI.createConsumerSession(producerId);
if (session) {
entryElement._consumerSession = session;
session.addEventListener("error", (event) => {
if (entryElement._consumerSession === session) {
console.error(event.message, event.error);
}
});
session.addEventListener("closed", () => {
if (entryElement._consumerSession === session) {
videoElement.pause();
videoElement.srcObject = null;
entryElement.classList.remove("has-session", "streaming", "has-remote-control");
delete entryElement._consumerSession;
}
});
session.addEventListener("streamsChanged", () => {
if (entryElement._consumerSession === session) {
const streams = session.streams;
if (streams.length > 0) {
videoElement.srcObject = streams[0];
videoElement.play().catch(() => {});
}
}
});
session.addEventListener("remoteControllerChanged", () => {
if (entryElement._consumerSession === session) {
const remoteController = session.remoteController;
if (remoteController) {
entryElement.classList.add("has-remote-control");
remoteController.attachVideoElement(videoElement);
} else {
entryElement.classList.remove("has-remote-control");
}
}
});
entryElement.classList.add("has-session");
session.connect();
}
}
});
}
},
producerRemoved: function(producer) {
const element = document.getElementById(producer.id);
if (element) {
if (element._consumerSession) {
element._consumerSession.close();
}
element.remove();
}
}
};
gstWebRTCAPI.registerProducersListener(listener);
for (const producer of gstWebRTCAPI.getAvailableProducers()) {
listener.producerAdded(producer);
}
}
window.addEventListener("DOMContentLoaded", () => {
document.addEventListener("click", (event) => {
if (event.target.matches("div.video>div.fullscreen:hover>span")) {
event.preventDefault();
event.target.parentNode.previousElementSibling.requestFullscreen();
}
});
initCapture();
initRemoteStreams();
});
</script>
</head>
<body>
<header>
<h1>GstWebRTC API</h1>
</header>
<main>
<section id="capture">
<span class="client-id">none</span>
<button class="button" id="capture-button"></button>
<div class="video">
<div class="spinner">
<div></div>
<div></div>
<div></div>
<div></div>
</div>
<video></video>
</div>
</section>
<section>
<h1>Remote Streams</h1>
<ul id="remote-streams"></ul>
</section>
</main>
</body>
</html>


@@ -0,0 +1,59 @@
{
"name": "gstwebrtc-api",
"version": "1.0.0",
"description": "Javascript API to integrate GStreamer WebRTC streams (webrtcsrc/webrtcsink) in a web browser",
"keywords": [
"webrtc",
"multimedia",
"realtime",
"gstreamer",
"audio",
"video"
],
"homepage": "https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/tree/main/net/webrtc/gstwebrtc-api",
"bugs": {
"url": "https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/issues"
},
"license": "MPL-2.0",
"author": {
"name": "Loïc Le Page",
"email": "llepage@igalia.com",
"url": "https://www.igalia.com/"
},
"repository": {
"type": "git",
"url": "https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs.git",
"directory": "net/webrtc/gstwebrtc-api"
},
"browser": "dist/gstwebrtc-api-${npm_package_version}.min.js",
"files": [
"dist/",
"docs/",
"src/",
"third-party/",
"index.html",
"webpack.config.js"
],
"devDependencies": {
"eslint": "8.37.0",
"html-webpack-plugin": "5.5.0",
"jsdoc": "4.0.2",
"rimraf": "4.4.1",
"terser-webpack-plugin": "5.3.7",
"webpack": "5.77.0",
"webpack-cli": "5.0.1",
"webpack-dev-server": "4.13.1"
},
"dependencies": {
"webrtc-adapter": "8.2.2"
},
"scripts": {
"check": "eslint src",
"format": "eslint --fix --fix-type layout src",
"build": "rimraf dist && webpack",
"docs": "rimraf docs && jsdoc src/*.js -d docs/ -p package.json -R README.md",
"make": "npm run check && npm run build && npm run docs",
"prepack": "npm run make",
"start": "webpack serve"
}
}


@@ -0,0 +1,334 @@
/*
* gstwebrtc-api
*
* Copyright (C) 2022 Igalia S.L. <info@igalia.com>
* Author: Loïc Le Page <llepage@igalia.com>
*
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
import ConsumerSession from "./consumer-session";
import ProducerSession from "./producer-session";
const SignallingServerMessageType = Object.freeze({
welcome: "welcome",
peerStatusChanged: "peerStatusChanged",
list: "list",
sessionStarted: "sessionStarted",
peer: "peer",
startSession: "startSession",
endSession: "endSession",
error: "error"
});
function normalizeProducer(producer, excludedId) {
if (!producer || (typeof (producer) !== "object")) {
return null;
}
const normalizedProducer = {
id: "",
meta: {}
};
if (producer.id && (typeof (producer.id) === "string")) {
normalizedProducer.id = producer.id;
} else if (producer.peerId && (typeof (producer.peerId) === "string")) {
normalizedProducer.id = producer.peerId;
} else {
return null;
}
if (normalizedProducer.id === excludedId) {
return null;
}
if (producer.meta && (typeof (producer.meta) === "object")) {
normalizedProducer.meta = producer.meta;
}
Object.freeze(normalizedProducer.meta);
return Object.freeze(normalizedProducer);
}
export default class ComChannel extends EventTarget {
constructor(url, meta, webrtcConfig) {
super();
this._meta = meta;
this._webrtcConfig = webrtcConfig;
this._ws = new WebSocket(url);
this._ready = false;
this._channelId = "";
this._producerSession = null;
this._consumerSessions = {};
this._ws.onerror = (event) => {
this.dispatchEvent(new ErrorEvent("error", {
message: event.message || "WebSocket error",
error: event.error || new Error(
this._ready ? "transport error" : "cannot connect to signaling server")
}));
this.close();
};
this._ws.onclose = () => {
this._ready = false;
this._channelId = "";
this._ws = null;
this.closeAllConsumerSessions();
if (this._producerSession) {
this._producerSession.close();
this._producerSession = null;
}
this.dispatchEvent(new Event("closed"));
};
this._ws.onmessage = (event) => {
try {
const msg = JSON.parse(event.data);
if (msg && (typeof (msg) === "object")) {
switch (msg.type) {
case SignallingServerMessageType.welcome:
this._channelId = msg.peerId;
try {
this._ws.send(JSON.stringify({
type: "setPeerStatus",
roles: ["listener"],
meta: meta
}));
} catch (ex) {
this.dispatchEvent(new ErrorEvent("error", {
message: "cannot initialize connection to signaling server",
error: ex
}));
this.close();
}
break;
case SignallingServerMessageType.peerStatusChanged:
if (msg.peerId === this._channelId) {
if (!this._ready && msg.roles.includes("listener")) {
this._ready = true;
this.dispatchEvent(new Event("ready"));
this.send({ type: "list" });
}
if (this._producerSession && msg.roles.includes("producer")) {
this._producerSession.onProducerRegistered();
}
} else {
const normalizedProducer = normalizeProducer(msg, this._channelId);
if (normalizedProducer) {
if (msg.roles.includes("producer")) {
this.dispatchEvent(new CustomEvent("producerAdded", { detail: normalizedProducer }));
} else {
this.dispatchEvent(new CustomEvent("producerRemoved", { detail: normalizedProducer }));
}
}
}
break;
case SignallingServerMessageType.list:
for (const producer of msg.producers) {
const normalizedProducer = normalizeProducer(producer, this._channelId);
if (normalizedProducer) {
this.dispatchEvent(new CustomEvent("producerAdded", { detail: normalizedProducer }));
}
}
break;
case SignallingServerMessageType.sessionStarted:
{
const session = this.getConsumerSession(msg.peerId);
if (session) {
delete this._consumerSessions[msg.peerId];
session.onSessionStarted(msg.peerId, msg.sessionId);
if (session.sessionId && !(session.sessionId in this._consumerSessions)) {
this._consumerSessions[session.sessionId] = session;
} else {
session.close();
}
}
}
break;
case SignallingServerMessageType.peer:
{
const session = this.getConsumerSession(msg.sessionId);
if (session) {
session.onSessionPeerMessage(msg);
} else if (this._producerSession) {
this._producerSession.onSessionPeerMessage(msg);
}
}
break;
case SignallingServerMessageType.startSession:
if (this._producerSession) {
this._producerSession.onStartSessionMessage(msg);
}
break;
case SignallingServerMessageType.endSession:
{
const session = this.getConsumerSession(msg.sessionId);
if (session) {
session.close();
} else if (this._producerSession) {
this._producerSession.onEndSessionMessage(msg);
}
}
break;
case SignallingServerMessageType.error:
this.dispatchEvent(new ErrorEvent("error", {
message: "error received from signaling server",
error: new Error(msg.details)
}));
break;
default:
throw new Error(`unknown message type: "${msg.type}"`);
}
}
} catch (ex) {
this.dispatchEvent(new ErrorEvent("error", {
message: "cannot parse incoming message from signaling server",
error: ex
}));
}
};
}
get meta() {
return this._meta;
}
get webrtcConfig() {
return this._webrtcConfig;
}
get ready() {
return this._ready;
}
get channelId() {
return this._channelId;
}
get producerSession() {
return this._producerSession;
}
createProducerSession(stream) {
if (!this._ready || !(stream instanceof MediaStream)) {
return null;
}
if (this._producerSession) {
if (this._producerSession.stream === stream) {
return this._producerSession;
} else {
return null;
}
}
const session = new ProducerSession(this, stream);
this._producerSession = session;
session.addEventListener("closed", () => {
if (this._producerSession === session) {
this._producerSession = null;
}
});
return session;
}
createConsumerSession(producerId) {
if (!this._ready || !producerId || (typeof (producerId) !== "string")) {
return null;
}
if (producerId in this._consumerSessions) {
return this._consumerSessions[producerId];
}
for (const session of Object.values(this._consumerSessions)) {
if (session.peerId === producerId) {
return session;
}
}
const session = new ConsumerSession(producerId, this);
this._consumerSessions[producerId] = session;
session.addEventListener("closed", (event) => {
let sessionId = event.target.sessionId;
if (!sessionId) {
sessionId = event.target.peerId;
}
if ((sessionId in this._consumerSessions) && (this._consumerSessions[sessionId] === session)) {
delete this._consumerSessions[sessionId];
}
});
return session;
}
getConsumerSession(sessionId) {
if (sessionId in this._consumerSessions) {
return this._consumerSessions[sessionId];
} else {
return null;
}
}
closeAllConsumerSessions() {
for (const session of Object.values(this._consumerSessions)) {
session.close();
}
this._consumerSessions = {};
}
send(data) {
if (this._ready && data && (typeof (data) === "object")) {
try {
this._ws.send(JSON.stringify(data));
return true;
} catch (ex) {
this.dispatchEvent(new ErrorEvent("error", {
message: "cannot send message to signaling server",
error: ex
}));
}
}
return false;
}
close() {
if (this._ws) {
this._ready = false;
this._channelId = "";
this._ws.close();
this.closeAllConsumerSessions();
if (this._producerSession) {
this._producerSession.close();
this._producerSession = null;
}
}
}
}
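
The module-private `normalizeProducer` helper above accepts both the `id` and the legacy `peerId` message fields, filters out the client's own identifier, and freezes the result. A standalone sketch of the same rules (an illustrative re-implementation — it is not exported by the API):

```javascript
// Sketch of the producer-normalization rules used by ComChannel
// (standalone re-implementation for illustration only).
function normalizeProducer(producer, excludedId) {
  if (!producer || (typeof (producer) !== "object")) {
    return null;
  }
  // Prefer "id", fall back to "peerId"; both must be non-empty strings.
  const id = ((typeof (producer.id) === "string") && producer.id) ? producer.id
    : ((typeof (producer.peerId) === "string") && producer.peerId) ? producer.peerId
      : null;
  if (!id || (id === excludedId)) {
    return null;
  }
  const meta = (producer.meta && (typeof (producer.meta) === "object")) ? producer.meta : {};
  return Object.freeze({ id, meta: Object.freeze(meta) });
}

// A producer advertised under the legacy "peerId" field is still accepted,
// while a message carrying the client's own id is rejected.
const producer = normalizeProducer({ peerId: "p1", meta: { name: "cam" } }, "me");
const self = normalizeProducer({ id: "me" }, "me"); // null: own id is excluded
```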


@ -0,0 +1,53 @@
/*
* gstwebrtc-api
*
* Copyright (C) 2022 Igalia S.L. <info@igalia.com>
* Author: Loïc Le Page <llepage@igalia.com>
*
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
/**
* GStreamer WebRTC configuration.
 * <p>You can override the default values by defining your configuration before the <i>DOMContentLoaded</i> event is received.<br>
 * Once the <i>DOMContentLoaded</i> event has fired, changing the configuration has no effect.</p>
* <p>For example:
* <pre>
* const signalingProtocol = window.location.protocol.startsWith("https") ? "wss" : "ws";
* window.gstWebRTCConfig = {
* meta: { name: `WebClient-${Date.now()}` },
* signalingServerUrl: `${signalingProtocol}://${window.location.host}/webrtc`
* };
* </pre></p>
* @typedef {object} gstWebRTCConfig
* @property {object} meta=null - Client free-form information that will be exchanged with all peers through the
* signaling <i>meta</i> property, its content depends on your application.
* @property {string} signalingServerUrl=ws://127.0.0.1:8443 - The WebRTC signaling server URL.
* @property {number} reconnectionTimeout=2500 - Timeout in milliseconds to reconnect to the signaling server in
* case of an unexpected disconnection.
* @property {object} webrtcConfig={iceServers...} - The WebRTC peer connection configuration passed to
* {@link external:RTCPeerConnection}. Default configuration only includes a list of free STUN servers
* (<i>stun[0-4].l.google.com:19302</i>).
*/
const defaultConfig = Object.freeze({
meta: null,
signalingServerUrl: "ws://127.0.0.1:8443",
reconnectionTimeout: 2500,
webrtcConfig: {
iceServers: [
{
urls: [
"stun:stun.l.google.com:19302",
"stun:stun1.l.google.com:19302",
"stun:stun2.l.google.com:19302",
"stun:stun3.l.google.com:19302",
"stun:stun4.l.google.com:19302"
]
}
]
}
});
export { defaultConfig as default };
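
The `start()` entry point in gstwebrtc-api.js shallow-merges the user-provided `window.gstWebRTCConfig` object over these defaults with `Object.assign`. A minimal sketch of that merge behavior using plain objects (the URL and client name are illustrative values, not real endpoints):

```javascript
// Sketch of the shallow Object.assign merge performed by start():
// user-provided keys win, untouched keys keep their default values.
const defaults = Object.freeze({
  meta: null,
  signalingServerUrl: "ws://127.0.0.1:8443",
  reconnectionTimeout: 2500,
  webrtcConfig: { iceServers: [{ urls: ["stun:stun.l.google.com:19302"] }] }
});

const userConfig = {
  meta: { name: "WebClient-demo" },
  signalingServerUrl: "wss://example.com/webrtc"
};

const config = Object.assign({}, defaults, userConfig);
// config.signalingServerUrl comes from userConfig,
// config.reconnectionTimeout keeps the 2500 ms default.
// Note: the merge is shallow, so overriding webrtcConfig replaces the
// whole object, including the default iceServers list.
```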


@ -0,0 +1,236 @@
/*
* gstwebrtc-api
*
* Copyright (C) 2022 Igalia S.L. <info@igalia.com>
* Author: Loïc Le Page <llepage@igalia.com>
*
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
import WebRTCSession from "./webrtc-session";
import SessionState from "./session-state";
import RemoteController from "./remote-controller";
/**
* Event name: "streamsChanged".<br>
* Triggered when the underlying media streams of a {@link gstWebRTCAPI.ConsumerSession} change.
* @event gstWebRTCAPI#StreamsChangedEvent
* @type {external:Event}
* @see gstWebRTCAPI.ConsumerSession#streams
*/
/**
* Event name: "remoteControllerChanged".<br>
* Triggered when the underlying remote controller of a {@link gstWebRTCAPI.ConsumerSession} changes.
* @event gstWebRTCAPI#RemoteControllerChangedEvent
* @type {external:Event}
* @see gstWebRTCAPI.ConsumerSession#remoteController
*/
/**
* @class gstWebRTCAPI.ConsumerSession
* @hideconstructor
* @classdesc Consumer session managing a peer-to-peer WebRTC channel between a remote producer and this client
* instance.
* <p>Call {@link gstWebRTCAPI#createConsumerSession} to create a ConsumerSession instance.</p>
* @extends {gstWebRTCAPI.WebRTCSession}
* @fires {@link gstWebRTCAPI#event:StreamsChangedEvent}
* @fires {@link gstWebRTCAPI#event:RemoteControllerChangedEvent}
*/
export default class ConsumerSession extends WebRTCSession {
constructor(peerId, comChannel) {
super(peerId, comChannel);
this._streams = [];
this._remoteController = null;
this.addEventListener("closed", () => {
this._streams = [];
if (this._remoteController) {
this._remoteController.close();
}
});
}
/**
* The array of remote media streams consumed locally through this WebRTC channel.
* @member {external:MediaStream[]} gstWebRTCAPI.ConsumerSession#streams
* @readonly
*/
get streams() {
return this._streams;
}
/**
* The remote controller associated with this WebRTC consumer session. Value may be null if consumer session
* has no remote controller.
* @member {gstWebRTCAPI.RemoteController} gstWebRTCAPI.ConsumerSession#remoteController
* @readonly
*/
get remoteController() {
return this._remoteController;
}
/**
* Connects the consumer session to its remote producer.<br>
* This method must be called after creating the consumer session in order to start receiving the remote streams.
 * It registers this consumer session with the signaling server and gets ready to receive the audio/video streams.
* <p>Even on success, streaming can fail later if any error occurs during or after connection. In order to know
* the effective streaming state, you should be listening to the [error]{@link gstWebRTCAPI#event:ErrorEvent},
* [stateChanged]{@link gstWebRTCAPI#event:StateChangedEvent} and/or [closed]{@link gstWebRTCAPI#event:ClosedEvent}
* events.</p>
* @method gstWebRTCAPI.ConsumerSession#connect
* @returns {boolean} true in case of success (may fail later during or after connection) or false in case of
* immediate error (wrong session state or no connection to the signaling server).
*/
connect() {
if (!this._comChannel || (this._state === SessionState.closed)) {
return false;
}
if (this._state !== SessionState.idle) {
return true;
}
const msg = {
type: "startSession",
peerId: this._peerId
};
if (!this._comChannel.send(msg)) {
this.dispatchEvent(new ErrorEvent("error", {
message: "cannot connect consumer session",
error: new Error("cannot send startSession message to signaling server")
}));
this.close();
return false;
}
this._state = SessionState.connecting;
this.dispatchEvent(new Event("stateChanged"));
return true;
}
onSessionStarted(peerId, sessionId) {
if ((this._peerId === peerId) && (this._state === SessionState.connecting) && !this._sessionId) {
this._sessionId = sessionId;
}
}
onSessionPeerMessage(msg) {
if ((this._state === SessionState.closed) || !this._comChannel || !this._sessionId) {
return;
}
if (!this._rtcPeerConnection) {
const connection = new RTCPeerConnection(this._comChannel.webrtcConfig);
this._rtcPeerConnection = connection;
connection.ontrack = (event) => {
if ((this._rtcPeerConnection === connection) && event.streams && (event.streams.length > 0)) {
if (this._state === SessionState.connecting) {
this._state = SessionState.streaming;
this.dispatchEvent(new Event("stateChanged"));
}
let streamsChanged = false;
for (const stream of event.streams) {
if (!this._streams.includes(stream)) {
this._streams.push(stream);
streamsChanged = true;
}
}
if (streamsChanged) {
this.dispatchEvent(new Event("streamsChanged"));
}
}
};
connection.ondatachannel = (event) => {
const rtcDataChannel = event.channel;
if (rtcDataChannel && (rtcDataChannel.label === "input")) {
if (this._remoteController) {
const previousController = this._remoteController;
this._remoteController = null;
previousController.close();
}
const remoteController = new RemoteController(rtcDataChannel, this);
this._remoteController = remoteController;
this.dispatchEvent(new Event("remoteControllerChanged"));
remoteController.addEventListener("closed", () => {
if (this._remoteController === remoteController) {
this._remoteController = null;
this.dispatchEvent(new Event("remoteControllerChanged"));
}
});
}
};
connection.onicecandidate = (event) => {
if ((this._rtcPeerConnection === connection) && event.candidate && this._comChannel) {
this._comChannel.send({
type: "peer",
sessionId: this._sessionId,
ice: event.candidate.toJSON()
});
}
};
this.dispatchEvent(new Event("rtcPeerConnectionChanged"));
}
if (msg.sdp) {
this._rtcPeerConnection.setRemoteDescription(msg.sdp).then(() => {
if (this._rtcPeerConnection) {
return this._rtcPeerConnection.createAnswer();
} else {
return null;
}
}).then((desc) => {
if (this._rtcPeerConnection && desc) {
return this._rtcPeerConnection.setLocalDescription(desc);
} else {
return null;
}
}).then(() => {
if (this._rtcPeerConnection && this._comChannel) {
const sdp = {
type: "peer",
sessionId: this._sessionId,
sdp: this._rtcPeerConnection.localDescription.toJSON()
};
if (!this._comChannel.send(sdp)) {
throw new Error("cannot send local SDP configuration to WebRTC peer");
}
}
}).catch((ex) => {
if (this._state !== SessionState.closed) {
this.dispatchEvent(new ErrorEvent("error", {
message: "an unrecoverable error occurred during SDP handshake",
error: ex
}));
this.close();
}
});
} else if (msg.ice) {
const candidate = new RTCIceCandidate(msg.ice);
this._rtcPeerConnection.addIceCandidate(candidate).catch((ex) => {
if (this._state !== SessionState.closed) {
this.dispatchEvent(new ErrorEvent("error", {
message: "an unrecoverable error occurred during ICE handshake",
error: ex
}));
this.close();
}
});
} else {
throw new Error(`invalid empty peer message received from consumer session ${this._sessionId}`);
}
}
}
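
As documented above, listeners for `error`, `stateChanged`, `streamsChanged` and `closed` should be attached before calling `connect()`, since the session starts dispatching events as soon as the handshake begins. A minimal sketch of that wiring, using a bare `EventTarget` in place of a real `ConsumerSession` so the pattern can be exercised outside a browser (the dispatched events are triggered manually here; a real session emits them itself during the SDP/ICE handshake):

```javascript
// Sketch of the listen-then-connect pattern; a plain EventTarget stands in
// for a real ConsumerSession, which extends EventTarget through WebRTCSession.
const session = new EventTarget();
const log = [];

session.addEventListener("error", (event) => log.push(`error: ${event.message || "unknown"}`));
session.addEventListener("stateChanged", () => log.push("state changed"));
session.addEventListener("streamsChanged", () => log.push("streams changed"));
session.addEventListener("closed", () => log.push("closed"));

// Manually dispatch the events a real session would emit over its lifetime.
session.dispatchEvent(new Event("stateChanged"));
session.dispatchEvent(new Event("streamsChanged"));
session.dispatchEvent(new Event("closed"));
```

With a real session, `session.connect()` would be called after this wiring, and a `streamsChanged` handler would typically attach `session.streams[0]` to an `HTMLVideoElement.srcObject`.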


@ -0,0 +1,379 @@
/*
* gstwebrtc-api
*
* Copyright (C) 2022 Igalia S.L. <info@igalia.com>
* Author: Loïc Le Page <llepage@igalia.com>
*
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
import defaultConfig from "./config";
import ComChannel from "./com-channel";
import SessionState from "./session-state";
const apiState = {
config: null,
channel: null,
producers: {},
connectionListeners: [],
producersListeners: []
};
/**
* @interface gstWebRTCAPI.ConnectionListener
*/
/**
* Callback method called when this client connects to the WebRTC signaling server.
* The callback implementation should not throw any exception.
* @method gstWebRTCAPI.ConnectionListener#connected
* @abstract
* @param {string} clientId - The unique identifier of this WebRTC client.<br>This identifier is provided by the
* signaling server to uniquely identify each connected peer.
*/
/**
* Callback method called when this client disconnects from the WebRTC signaling server.
* The callback implementation should not throw any exception.
* @method gstWebRTCAPI.ConnectionListener#disconnected
* @abstract
*/
/**
* Registers a connection listener that will be called each time the WebRTC API connects to or disconnects from the
* signaling server.
* @function
* @memberof gstWebRTCAPI
* @instance
* @param {gstWebRTCAPI.ConnectionListener} listener - The connection listener to register.
* @returns {boolean} true in case of success (or if the listener was already registered), or false if the listener
* doesn't implement all callback functions and cannot be registered.
*/
function registerConnectionListener(listener) {
if (!listener || (typeof (listener) !== "object") ||
(typeof (listener.connected) !== "function") ||
(typeof (listener.disconnected) !== "function")) {
return false;
}
if (!apiState.connectionListeners.includes(listener)) {
apiState.connectionListeners.push(listener);
}
return true;
}
/**
* Unregisters a connection listener.<br>
* The removed listener will never be called again and can be garbage collected.
* @function
* @memberof gstWebRTCAPI
* @instance
* @param {gstWebRTCAPI.ConnectionListener} listener - The connection listener to unregister.
* @returns {boolean} true if the listener is found and unregistered, or false if the listener was not previously
* registered.
*/
function unregisterConnectionListener(listener) {
const idx = apiState.connectionListeners.indexOf(listener);
if (idx >= 0) {
apiState.connectionListeners.splice(idx, 1);
return true;
}
return false;
}
/**
* Unregisters all previously registered connection listeners.
* @function
* @memberof gstWebRTCAPI
* @instance
*/
function unregisterAllConnectionListeners() {
apiState.connectionListeners = [];
}
/**
* Creates a new producer session.
 * <p>You can only create one producer session at a time.<br>
 * To stream from a different stream you will first need to close the previous producer session.</p>
* <p>You can only request a producer session while you are connected to the signaling server. You can use the
* {@link gstWebRTCAPI.ConnectionListener} interface and {@link gstWebRTCAPI#registerConnectionListener} function to
* listen to the connection state.</p>
* @function
* @memberof gstWebRTCAPI
* @instance
* @param {external:MediaStream} stream - The audio/video stream to offer as a producer through WebRTC.
* @returns {gstWebRTCAPI.ProducerSession} The created producer session or null in case of error. To start streaming,
* you still need to call {@link gstWebRTCAPI.ProducerSession#start} after adding on the returned session all the event
* listeners you may need.
*/
function createProducerSession(stream) {
if (apiState.channel) {
return apiState.channel.createProducerSession(stream);
} else {
return null;
}
}
/**
* Information about a remote producer registered by the signaling server.
* @typedef {object} gstWebRTCAPI.Producer
* @readonly
* @property {string} id - The remote producer unique identifier set by the signaling server (always non-empty).
* @property {object} meta - Free-form object containing extra information about the remote producer (always non-null,
* but may be empty). Its content depends on your application.
*/
/**
* Gets the list of all remote WebRTC producers available on the signaling server.
* <p>The remote producers list is only populated once you've connected to the signaling server. You can use the
* {@link gstWebRTCAPI.ConnectionListener} interface and {@link gstWebRTCAPI#registerConnectionListener} function to
* listen to the connection state.</p>
* @function
* @memberof gstWebRTCAPI
* @instance
* @returns {gstWebRTCAPI.Producer[]} The list of remote WebRTC producers available.
*/
function getAvailableProducers() {
return Object.values(apiState.producers);
}
/**
* @interface gstWebRTCAPI.ProducersListener
*/
/**
* Callback method called when a remote producer is added on the signaling server.
* The callback implementation should not throw any exception.
* @method gstWebRTCAPI.ProducersListener#producerAdded
* @abstract
* @param {gstWebRTCAPI.Producer} producer - The remote producer added on server-side.
*/
/**
* Callback method called when a remote producer is removed from the signaling server.
* The callback implementation should not throw any exception.
* @method gstWebRTCAPI.ProducersListener#producerRemoved
* @abstract
* @param {gstWebRTCAPI.Producer} producer - The remote producer removed on server-side.
*/
/**
* Registers a producers listener that will be called each time a producer is added or removed on the signaling
* server.
* @function
* @memberof gstWebRTCAPI
* @instance
* @param {gstWebRTCAPI.ProducersListener} listener - The producer listener to register.
* @returns {boolean} true in case of success (or if the listener was already registered), or false if the listener
* doesn't implement all callback functions and cannot be registered.
*/
function registerProducersListener(listener) {
if (!listener || (typeof (listener) !== "object") ||
(typeof (listener.producerAdded) !== "function") ||
(typeof (listener.producerRemoved) !== "function")) {
return false;
}
if (!apiState.producersListeners.includes(listener)) {
apiState.producersListeners.push(listener);
}
return true;
}
/**
* Unregisters a producers listener.<br>
* The removed listener will never be called again and can be garbage collected.
* @function
* @memberof gstWebRTCAPI
* @instance
* @param {gstWebRTCAPI.ProducersListener} listener - The producers listener to unregister.
* @returns {boolean} true if the listener is found and unregistered, or false if the listener was not previously
* registered.
*/
function unregisterProducersListener(listener) {
const idx = apiState.producersListeners.indexOf(listener);
if (idx >= 0) {
apiState.producersListeners.splice(idx, 1);
return true;
}
return false;
}
/**
* Unregisters all previously registered producers listeners.
* @function
* @memberof gstWebRTCAPI
* @instance
*/
function unregisterAllProducersListeners() {
apiState.producersListeners = [];
}
/**
* Creates a consumer session by connecting the local client to a remote WebRTC producer.
* <p>You can only create one consumer session per remote producer.</p>
* <p>You can only request a new consumer session while you are connected to the signaling server. You can use the
* {@link gstWebRTCAPI.ConnectionListener} interface and {@link gstWebRTCAPI#registerConnectionListener} function to
* listen to the connection state.</p>
* @function
* @memberof gstWebRTCAPI
* @instance
* @param {string} producerId - The unique identifier of the remote producer to connect to.
* @returns {gstWebRTCAPI.ConsumerSession} The WebRTC session between the selected remote producer and this local
* consumer, or null in case of error. To start connecting and receiving the remote streams, you still need to call
* {@link gstWebRTCAPI.ConsumerSession#connect} after adding on the returned session all the event listeners you may
* need.
*/
function createConsumerSession(producerId) {
if (apiState.channel) {
return apiState.channel.createConsumerSession(producerId);
} else {
return null;
}
}
/**
* The GStreamer WebRTC Javascript API.
* @namespace gstWebRTCAPI
*/
const gstWebRTCAPI = Object.freeze({
SessionState: SessionState,
registerConnectionListener: registerConnectionListener,
unregisterConnectionListener: unregisterConnectionListener,
unregisterAllConnectionListeners: unregisterAllConnectionListeners,
createProducerSession: createProducerSession,
getAvailableProducers: getAvailableProducers,
registerProducersListener: registerProducersListener,
unregisterProducersListener: unregisterProducersListener,
unregisterAllProducersListeners: unregisterAllProducersListeners,
createConsumerSession: createConsumerSession
});
function triggerConnected(clientId) {
for (const listener of apiState.connectionListeners) {
try {
listener.connected(clientId);
} catch (ex) {
console.error("a listener callback should not throw any exception", ex);
}
}
}
function triggerDisconnected() {
for (const listener of apiState.connectionListeners) {
try {
listener.disconnected();
} catch (ex) {
console.error("a listener callback should not throw any exception", ex);
}
}
}
function triggerProducerAdded(producer) {
if (producer.id in apiState.producers) {
return;
}
apiState.producers[producer.id] = producer;
for (const listener of apiState.producersListeners) {
try {
listener.producerAdded(producer);
} catch (ex) {
console.error("a listener callback should not throw any exception", ex);
}
}
}
function triggerProducerRemoved(producerId) {
if (producerId in apiState.producers) {
const producer = apiState.producers[producerId];
delete apiState.producers[producerId];
for (const listener of apiState.producersListeners) {
try {
listener.producerRemoved(producer);
} catch (ex) {
console.error("a listener callback should not throw any exception", ex);
}
}
}
}
function connectChannel() {
if (apiState.channel) {
const oldChannel = apiState.channel;
apiState.channel = null;
oldChannel.close();
for (const key in apiState.producers) {
triggerProducerRemoved(key);
}
apiState.producers = {};
triggerDisconnected();
}
apiState.channel = new ComChannel(
apiState.config.signalingServerUrl,
apiState.config.meta,
apiState.config.webrtcConfig);
apiState.channel.addEventListener("error", (event) => {
if (event.target === apiState.channel) {
console.error(event.message, event.error);
}
});
apiState.channel.addEventListener("closed", (event) => {
if (event.target === apiState.channel) {
apiState.channel = null;
for (const key in apiState.producers) {
triggerProducerRemoved(key);
}
apiState.producers = {};
triggerDisconnected();
if (apiState.config.reconnectionTimeout > 0) {
window.setTimeout(connectChannel, apiState.config.reconnectionTimeout);
}
}
});
apiState.channel.addEventListener("ready", (event) => {
if (event.target === apiState.channel) {
triggerConnected(apiState.channel.channelId);
}
});
apiState.channel.addEventListener("producerAdded", (event) => {
if (event.target === apiState.channel) {
triggerProducerAdded(event.detail);
}
});
apiState.channel.addEventListener("producerRemoved", (event) => {
if (event.target === apiState.channel) {
triggerProducerRemoved(event.detail.id);
}
});
}
function start(userConfig) {
if (apiState.config) {
throw new Error("GstWebRTC API is already started");
}
const config = Object.assign({}, defaultConfig);
if (userConfig && (typeof (userConfig) === "object")) {
Object.assign(config, userConfig);
}
if (typeof (config.meta) !== "object") {
config.meta = null;
}
apiState.config = config;
connectChannel();
}
export { gstWebRTCAPI, start };
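
`registerConnectionListener` and `registerProducersListener` both duck-type their argument: a valid listener is any object carrying the two expected callback methods, and registration returns `false` otherwise. A standalone sketch of that check (an illustrative re-implementation of the connection-listener variant — not part of the exported API):

```javascript
// Sketch of the duck-typing check used by registerConnectionListener:
// a valid listener is any object with connected() and disconnected() methods.
function isConnectionListener(listener) {
  return Boolean(listener) && (typeof (listener) === "object") &&
    (typeof (listener.connected) === "function") &&
    (typeof (listener.disconnected) === "function");
}

const goodListener = {
  connected: (clientId) => { /* e.g. refresh the producers list in the UI */ },
  disconnected: () => { /* e.g. grey out the UI until reconnection */ }
};
const badListener = { connected: () => {} }; // missing disconnected()
// isConnectionListener accepts goodListener and rejects badListener.
```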


@ -0,0 +1,57 @@
/*
* gstwebrtc-api
*
* Copyright (C) 2022 Igalia S.L. <info@igalia.com>
* Author: Loïc Le Page <llepage@igalia.com>
*
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
import "webrtc-adapter";
import { gstWebRTCAPI, start } from "./gstwebrtc-api";
/**
* @external MediaStream
* @see https://developer.mozilla.org/en-US/docs/Web/API/MediaStream
*/
/**
* @external RTCPeerConnection
* @see https://developer.mozilla.org/en-US/docs/Web/API/RTCPeerConnection
*/
/**
* @external RTCDataChannel
* @see https://developer.mozilla.org/en-US/docs/Web/API/RTCDataChannel
*/
/**
* @external EventTarget
* @see https://developer.mozilla.org/en-US/docs/Web/API/EventTarget
*/
/**
* @external Event
* @see https://developer.mozilla.org/en-US/docs/Web/API/Event
*/
/**
* @external ErrorEvent
* @see https://developer.mozilla.org/en-US/docs/Web/API/ErrorEvent
*/
/**
* @external CustomEvent
* @see https://developer.mozilla.org/en-US/docs/Web/API/CustomEvent
*/
/**
* @external Error
* @see https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error
*/
/**
* @external HTMLVideoElement
* @see https://developer.mozilla.org/en-US/docs/Web/API/HTMLVideoElement
*/
if (!window.gstWebRTCAPI) {
window.gstWebRTCAPI = gstWebRTCAPI;
window.addEventListener("DOMContentLoaded", () => {
start(window.gstWebRTCConfig);
});
}

File diff suppressed because it is too large


@ -0,0 +1,279 @@
/*
* gstwebrtc-api
*
* Copyright (C) 2022 Igalia S.L. <info@igalia.com>
* Author: Loïc Le Page <llepage@igalia.com>
*
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
import WebRTCSession from "./webrtc-session";
import SessionState from "./session-state";
/**
* @class gstWebRTCAPI.ClientSession
* @hideconstructor
* @classdesc Client session representing a link between a remote consumer and a local producer session.
* @extends {gstWebRTCAPI.WebRTCSession}
*/
class ClientSession extends WebRTCSession {
constructor(peerId, sessionId, comChannel, stream) {
super(peerId, comChannel);
this._sessionId = sessionId;
this._state = SessionState.streaming;
const connection = new RTCPeerConnection(this._comChannel.webrtcConfig);
this._rtcPeerConnection = connection;
for (const track of stream.getTracks()) {
connection.addTrack(track, stream);
}
connection.onicecandidate = (event) => {
if ((this._rtcPeerConnection === connection) && event.candidate && this._comChannel) {
this._comChannel.send({
type: "peer",
sessionId: this._sessionId,
ice: event.candidate.toJSON()
});
}
};
this.dispatchEvent(new Event("rtcPeerConnectionChanged"));
connection.setLocalDescription().then(() => {
if ((this._rtcPeerConnection === connection) && this._comChannel) {
const sdp = {
type: "peer",
sessionId: this._sessionId,
sdp: this._rtcPeerConnection.localDescription.toJSON()
};
if (!this._comChannel.send(sdp)) {
throw new Error("cannot send local SDP configuration to WebRTC peer");
}
}
}).catch((ex) => {
if (this._state !== SessionState.closed) {
this.dispatchEvent(new ErrorEvent("error", {
message: "an unrecoverable error occurred during SDP handshake",
error: ex
}));
this.close();
}
});
}
onSessionPeerMessage(msg) {
if ((this._state === SessionState.closed) || !this._rtcPeerConnection) {
return;
}
if (msg.sdp) {
this._rtcPeerConnection.setRemoteDescription(msg.sdp).catch((ex) => {
if (this._state !== SessionState.closed) {
this.dispatchEvent(new ErrorEvent("error", {
message: "an unrecoverable error occurred during SDP handshake",
error: ex
}));
this.close();
}
});
} else if (msg.ice) {
const candidate = new RTCIceCandidate(msg.ice);
this._rtcPeerConnection.addIceCandidate(candidate).catch((ex) => {
if (this._state !== SessionState.closed) {
this.dispatchEvent(new ErrorEvent("error", {
message: "an unrecoverable error occurred during ICE handshake",
error: ex
}));
this.close();
}
});
} else {
throw new Error(`invalid empty peer message received from producer's client session ${this._peerId}`);
}
}
}
/**
* Event name: "clientConsumerAdded".<br>
* Triggered when a remote consumer peer connects to a local {@link gstWebRTCAPI.ProducerSession}.
* @event gstWebRTCAPI#ClientConsumerAddedEvent
* @type {external:CustomEvent}
* @property {gstWebRTCAPI.ClientSession} detail - The WebRTC session associated with the added consumer peer.
* @see gstWebRTCAPI.ProducerSession
*/
/**
* Event name: "clientConsumerRemoved".<br>
* Triggered when a remote consumer peer disconnects from a local {@link gstWebRTCAPI.ProducerSession}.
* @event gstWebRTCAPI#ClientConsumerRemovedEvent
* @type {external:CustomEvent}
* @property {gstWebRTCAPI.ClientSession} detail - The WebRTC session associated with the removed consumer peer.
* @see gstWebRTCAPI.ProducerSession
*/
/**
* @class gstWebRTCAPI.ProducerSession
* @hideconstructor
* @classdesc Producer session managing the streaming out of a local {@link external:MediaStream}.<br>
* It manages all underlying WebRTC connections to each peer client consuming the stream.
* <p>Call {@link gstWebRTCAPI#createProducerSession} to create a ProducerSession instance.</p>
* @extends {external:EventTarget}
* @fires {@link gstWebRTCAPI#event:ErrorEvent}
* @fires {@link gstWebRTCAPI#event:StateChangedEvent}
* @fires {@link gstWebRTCAPI#event:ClosedEvent}
* @fires {@link gstWebRTCAPI#event:ClientConsumerAddedEvent}
* @fires {@link gstWebRTCAPI#event:ClientConsumerRemovedEvent}
*/
export default class ProducerSession extends EventTarget {
constructor(comChannel, stream) {
super();
this._comChannel = comChannel;
this._stream = stream;
this._state = SessionState.idle;
this._clientSessions = {};
}
/**
* The local stream produced by this session.
* @member {external:MediaStream} gstWebRTCAPI.ProducerSession#stream
* @readonly
*/
get stream() {
return this._stream;
}
/**
* The current producer session state.
* @member {gstWebRTCAPI.SessionState} gstWebRTCAPI.ProducerSession#state
* @readonly
*/
get state() {
return this._state;
}
/**
* Starts the producer session.<br>
* This method must be called after creating the producer session in order to start streaming. It registers this
* producer session to the signaling server and gets ready to serve peer requests from consumers.
* <p>Even on success, streaming can fail later if any error occurs during or after connection. In order to know
* the effective streaming state, you should be listening to the [error]{@link gstWebRTCAPI#event:ErrorEvent},
* [stateChanged]{@link gstWebRTCAPI#event:StateChangedEvent} and/or [closed]{@link gstWebRTCAPI#event:ClosedEvent}
* events.</p>
* @method gstWebRTCAPI.ProducerSession#start
* @returns {boolean} true in case of success (may fail later during or after connection) or false in case of
* immediate error (wrong session state or no connection to the signaling server).
*/
start() {
if (!this._comChannel || (this._state === SessionState.closed)) {
return false;
}
if (this._state !== SessionState.idle) {
return true;
}
const msg = {
type: "setPeerStatus",
roles: ["listener", "producer"],
meta: this._comChannel.meta
};
if (!this._comChannel.send(msg)) {
this.dispatchEvent(new ErrorEvent("error", {
message: "cannot start producer session",
error: new Error("cannot register producer to signaling server")
}));
this.close();
return false;
}
this._state = SessionState.connecting;
this.dispatchEvent(new Event("stateChanged"));
return true;
}
/**
* Terminates the producer session.<br>
* It immediately disconnects all peer consumers attached to this producer session and unregisters the producer
* from the signaling server.
* @method gstWebRTCAPI.ProducerSession#close
*/
close() {
if (this._state !== SessionState.closed) {
for (const track of this._stream.getTracks()) {
track.stop();
}
if ((this._state !== SessionState.idle) && this._comChannel) {
this._comChannel.send({
type: "setPeerStatus",
roles: ["listener"],
meta: this._comChannel.meta
});
}
this._state = SessionState.closed;
this.dispatchEvent(new Event("stateChanged"));
this._comChannel = null;
this._stream = null;
for (const clientSession of Object.values(this._clientSessions)) {
clientSession.close();
}
this._clientSessions = {};
this.dispatchEvent(new Event("closed"));
}
}
onProducerRegistered() {
if (this._state === SessionState.connecting) {
this._state = SessionState.streaming;
this.dispatchEvent(new Event("stateChanged"));
}
}
onStartSessionMessage(msg) {
if (this._comChannel && this._stream && !(msg.sessionId in this._clientSessions)) {
const session = new ClientSession(msg.peerId, msg.sessionId, this._comChannel, this._stream);
this._clientSessions[msg.sessionId] = session;
session.addEventListener("closed", (event) => {
const sessionId = event.target.sessionId;
if ((sessionId in this._clientSessions) && (this._clientSessions[sessionId] === session)) {
delete this._clientSessions[sessionId];
this.dispatchEvent(new CustomEvent("clientConsumerRemoved", { detail: session }));
}
});
session.addEventListener("error", (event) => {
this.dispatchEvent(new ErrorEvent("error", {
message: `error from client consumer ${event.target.peerId}: ${event.message}`,
error: event.error
}));
});
this.dispatchEvent(new CustomEvent("clientConsumerAdded", { detail: session }));
}
}
onEndSessionMessage(msg) {
if (msg.sessionId in this._clientSessions) {
this._clientSessions[msg.sessionId].close();
}
}
onSessionPeerMessage(msg) {
if (msg.sessionId in this._clientSessions) {
this._clientSessions[msg.sessionId].onSessionPeerMessage(msg);
}
}
}
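For reference, the registration handshake performed by `start()` reduces to a single JSON message on the signaling WebSocket. A minimal sketch of assembling that message outside the API (the `meta` payload is a hypothetical example, and the assumption that messages travel as JSON text is based on the message shapes above):

```javascript
// Sketch of the "setPeerStatus" registration message that
// ProducerSession.start() sends through the communication channel.
// The meta payload below is a hypothetical example.
const meta = { displayName: "demo-producer" };

const msg = {
  type: "setPeerStatus",
  roles: ["listener", "producer"],
  meta
};

// Assumption: messages are exchanged as JSON text over the WebSocket.
const wire = JSON.stringify(msg);
console.log(wire);
```

Closing the session later sends the same message with only the `"listener"` role, which unregisters the producer without dropping the signaling connection.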


@@ -0,0 +1,360 @@
/*
* gstwebrtc-api
*
* Copyright (C) 2022 Igalia S.L. <info@igalia.com>
* Author: Loïc Le Page <llepage@igalia.com>
*
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
import Guacamole from "../third-party/Keyboard";
import getKeysymString from "./keysyms";
const eventsNames = Object.freeze([
"wheel",
"contextmenu",
"mousemove",
"mousedown",
"mouseup",
"touchstart",
"touchend",
"touchmove",
"touchcancel"
]);
const mouseEventsNames = Object.freeze({
mousemove: "MouseMove",
mousedown: "MouseButtonPress",
mouseup: "MouseButtonRelease"
});
const touchEventsNames = Object.freeze({
touchstart: "TouchDown",
touchend: "TouchUp",
touchmove: "TouchMotion",
touchcancel: "TouchUp"
});
function getModifiers(event) {
const modifiers = [];
if (event.altKey) {
modifiers.push("alt-mask");
}
if (event.ctrlKey) {
modifiers.push("control-mask");
}
if (event.metaKey) {
modifiers.push("meta-mask");
}
if (event.shiftKey) {
modifiers.push("shift-mask");
}
return modifiers.join("+");
}
/**
* @class gstWebRTCAPI.RemoteController
* @hideconstructor
* @classdesc Manages a specific WebRTC data channel created by a remote GStreamer webrtcsink producer and offering
* remote control of the producer through
* [GstNavigation]{@link https://gstreamer.freedesktop.org/documentation/video/gstnavigation.html} events.
* <p>The remote control data channel is created by the GStreamer webrtcsink element on the producer side. Then it is
* announced through the consumer session thanks to the {@link gstWebRTCAPI#event:RemoteControllerChangedEvent}
* event.</p>
* <p>You can attach an {@link external:HTMLVideoElement} to the remote controller, then all mouse and keyboard events
* emitted by this element will be automatically relayed to the remote producer.</p>
* @extends {external:EventTarget}
* @fires {@link gstWebRTCAPI#event:ErrorEvent}
* @fires {@link gstWebRTCAPI#event:ClosedEvent}
* @see gstWebRTCAPI.ConsumerSession#remoteController
* @see gstWebRTCAPI.RemoteController#attachVideoElement
* @see https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/tree/main/net/webrtc/gstwebrtc-api#produce-a-gstreamer-interactive-webrtc-stream-with-remote-control
*/
export default class RemoteController extends EventTarget {
constructor(rtcDataChannel, consumerSession) {
super();
this._rtcDataChannel = rtcDataChannel;
this._consumerSession = consumerSession;
this._videoElement = null;
this._videoElementComputedStyle = null;
this._videoElementKeyboard = null;
this._lastTouchEventTimestamp = 0;
rtcDataChannel.addEventListener("close", () => {
if (this._rtcDataChannel === rtcDataChannel) {
this.close();
}
});
rtcDataChannel.addEventListener("error", (event) => {
if (this._rtcDataChannel === rtcDataChannel) {
const error = event.error;
this.dispatchEvent(new ErrorEvent("error", {
message: (error && error.message) || "Remote controller error",
error: error || new Error("unknown error on the remote controller data channel")
}));
}
});
}
/**
* The underlying WebRTC data channel connected to a remote GStreamer webrtcsink producer offering remote control.
* The value may be null if the remote controller has been closed.
* @member {external:RTCDataChannel} gstWebRTCAPI.RemoteController#rtcDataChannel
* @readonly
*/
get rtcDataChannel() {
return this._rtcDataChannel;
}
/**
* The consumer session associated with this remote controller.
* @member {gstWebRTCAPI.ConsumerSession} gstWebRTCAPI.RemoteController#consumerSession
* @readonly
*/
get consumerSession() {
return this._consumerSession;
}
/**
* The video element that is currently used to send all mouse and keyboard events to the remote producer. Value may
* be null if no video element is attached.
* @member {external:HTMLVideoElement} gstWebRTCAPI.RemoteController#videoElement
* @readonly
* @see gstWebRTCAPI.RemoteController#attachVideoElement
*/
get videoElement() {
return this._videoElement;
}
/**
* Associates a video element with this remote controller.<br>
* When a video element is attached to this remote controller, all mouse and keyboard events emitted by this
* element will be sent to the remote GStreamer webrtcsink producer.
* @method gstWebRTCAPI.RemoteController#attachVideoElement
* @param {external:HTMLVideoElement|null} element - the video element to use to relay mouse and keyboard events,
* or null to detach any previously attached element. If the provided element parameter is not null and not a
* valid instance of an {@link external:HTMLVideoElement}, then the method does nothing.
*/
attachVideoElement(element) {
if ((element instanceof HTMLVideoElement) && (element !== this._videoElement)) {
if (this._videoElement) {
this.attachVideoElement(null);
}
this._videoElement = element;
this._videoElementComputedStyle = window.getComputedStyle(element);
this._videoElementKeyboard = new Guacamole.Keyboard(element);
this._videoElementKeyboard.onkeydown = (keysym, modifierState) => {
this._sendGstNavigationEvent({
event: "KeyPress",
key: getKeysymString(keysym),
modifier_state: modifierState // eslint-disable-line camelcase
});
};
this._videoElementKeyboard.onkeyup = (keysym, modifierState) => {
this._sendGstNavigationEvent({
event: "KeyRelease",
key: getKeysymString(keysym),
modifier_state: modifierState // eslint-disable-line camelcase
});
};
for (const eventName of eventsNames) {
element.addEventListener(eventName, this);
}
element.setAttribute("tabindex", "0");
} else if ((element === null) && this._videoElement) {
const previousElement = this._videoElement;
previousElement.removeAttribute("tabindex");
this._videoElement = null;
this._videoElementComputedStyle = null;
this._videoElementKeyboard.onkeydown = null;
this._videoElementKeyboard.onkeyup = null;
this._videoElementKeyboard.reset();
this._videoElementKeyboard = null;
this._lastTouchEventTimestamp = 0;
for (const eventName of eventsNames) {
previousElement.removeEventListener(eventName, this);
}
}
}
/**
* Closes the remote controller channel.<br>
* It immediately shuts down the underlying WebRTC data channel connected to a remote GStreamer webrtcsink
* producer and detaches any video element that may be used to relay mouse and keyboard events.
* @method gstWebRTCAPI.RemoteController#close
*/
close() {
this.attachVideoElement(null);
const rtcDataChannel = this._rtcDataChannel;
this._rtcDataChannel = null;
if (rtcDataChannel) {
rtcDataChannel.close();
this.dispatchEvent(new Event("closed"));
}
}
_sendGstNavigationEvent(data) {
try {
if (!data || (typeof (data) !== "object")) {
throw new Error("invalid GstNavigation event");
}
if (!this._rtcDataChannel) {
throw new Error("remote controller data channel is closed");
}
this._rtcDataChannel.send(JSON.stringify(data));
} catch (ex) {
this.dispatchEvent(new ErrorEvent("error", {
message: `cannot send GstNavigation event over session ${this._consumerSession.sessionId} remote controller`,
error: ex
}));
}
}
_computeVideoMousePosition(event) {
const mousePos = { x: 0, y: 0 };
if (!this._videoElement || (this._videoElement.videoWidth <= 0) || (this._videoElement.videoHeight <= 0)) {
return mousePos;
}
const padding = {
left: parseFloat(this._videoElementComputedStyle.paddingLeft),
right: parseFloat(this._videoElementComputedStyle.paddingRight),
top: parseFloat(this._videoElementComputedStyle.paddingTop),
bottom: parseFloat(this._videoElementComputedStyle.paddingBottom)
};
if (("offsetX" in event) && ("offsetY" in event)) {
mousePos.x = event.offsetX - padding.left;
mousePos.y = event.offsetY - padding.top;
} else {
const clientRect = this._videoElement.getBoundingClientRect();
const border = {
left: parseFloat(this._videoElementComputedStyle.borderLeftWidth),
top: parseFloat(this._videoElementComputedStyle.borderTopWidth)
};
mousePos.x = event.clientX - clientRect.left - border.left - padding.left;
mousePos.y = event.clientY - clientRect.top - border.top - padding.top;
}
const videoOffset = {
x: this._videoElement.clientWidth - (padding.left + padding.right),
y: this._videoElement.clientHeight - (padding.top + padding.bottom)
};
const ratio = Math.min(videoOffset.x / this._videoElement.videoWidth, videoOffset.y / this._videoElement.videoHeight);
videoOffset.x = Math.max(0.5 * (videoOffset.x - this._videoElement.videoWidth * ratio), 0);
videoOffset.y = Math.max(0.5 * (videoOffset.y - this._videoElement.videoHeight * ratio), 0);
const invRatio = (ratio !== 0) ? (1 / ratio) : 0;
mousePos.x = (mousePos.x - videoOffset.x) * invRatio;
mousePos.y = (mousePos.y - videoOffset.y) * invRatio;
mousePos.x = Math.min(Math.max(mousePos.x, 0), this._videoElement.videoWidth);
mousePos.y = Math.min(Math.max(mousePos.y, 0), this._videoElement.videoHeight);
return mousePos;
}
handleEvent(event) {
if (!this._videoElement) {
return;
}
switch (event.type) {
case "wheel":
event.preventDefault();
{
const mousePos = this._computeVideoMousePosition(event);
this._sendGstNavigationEvent({
event: "MouseScroll",
x: mousePos.x,
y: mousePos.y,
delta_x: -event.deltaX, // eslint-disable-line camelcase
delta_y: -event.deltaY, // eslint-disable-line camelcase
modifier_state: getModifiers(event) // eslint-disable-line camelcase
});
}
break;
case "contextmenu":
event.preventDefault();
break;
case "mousemove":
case "mousedown":
case "mouseup":
event.preventDefault();
{
const mousePos = this._computeVideoMousePosition(event);
const data = {
event: mouseEventsNames[event.type],
x: mousePos.x,
y: mousePos.y,
modifier_state: getModifiers(event) // eslint-disable-line camelcase
};
if (event.type !== "mousemove") {
data.button = event.button + 1;
if ((event.type === "mousedown") && (event.button === 0)) {
this._videoElement.focus();
}
}
this._sendGstNavigationEvent(data);
}
break;
case "touchstart":
case "touchend":
case "touchmove":
case "touchcancel":
for (const touch of event.changedTouches) {
const mousePos = this._computeVideoMousePosition(touch);
const data = {
event: touchEventsNames[event.type],
identifier: touch.identifier,
x: mousePos.x,
y: mousePos.y,
modifier_state: getModifiers(event) // eslint-disable-line camelcase
};
if (("force" in touch) && ((event.type === "touchstart") || (event.type === "touchmove"))) {
data.pressure = touch.force;
}
this._sendGstNavigationEvent(data);
}
if (event.timeStamp > this._lastTouchEventTimestamp) {
this._lastTouchEventTimestamp = event.timeStamp;
this._sendGstNavigationEvent({
event: "TouchFrame",
modifier_state: getModifiers(event) // eslint-disable-line camelcase
});
}
break;
}
}
}
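Every branch of the `handleEvent` dispatch above funnels into `_sendGstNavigationEvent`, which ships a plain JSON object over the data channel. A minimal sketch of assembling such a GstNavigation payload, reusing the `getModifiers` helper from this file (the event object and coordinates are made up for illustration):

```javascript
// Sketch: building a GstNavigation mouse event the way RemoteController
// does before JSON-serializing it onto the WebRTC data channel.
// The fake event object and the coordinates are hypothetical.
function getModifiers(event) {
  const modifiers = [];
  if (event.altKey) {
    modifiers.push("alt-mask");
  }
  if (event.ctrlKey) {
    modifiers.push("control-mask");
  }
  if (event.metaKey) {
    modifiers.push("meta-mask");
  }
  if (event.shiftKey) {
    modifiers.push("shift-mask");
  }
  return modifiers.join("+");
}

const fakeEvent = { altKey: false, ctrlKey: true, metaKey: false, shiftKey: true };

const data = {
  event: "MouseMove",
  x: 320,
  y: 240,
  modifier_state: getModifiers(fakeEvent) // "control-mask+shift-mask"
};

console.log(JSON.stringify(data));
```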


@@ -0,0 +1,32 @@
/*
* gstwebrtc-api
*
* Copyright (C) 2022 Igalia S.L. <info@igalia.com>
* Author: Loïc Le Page <llepage@igalia.com>
*
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
/**
* Session states enumeration.<br>
* Session state always increases from idle to closed and never switches backwards.
* @typedef {enum} gstWebRTCAPI.SessionState
* @readonly
* @property {0} idle - Default state when creating a new session, goes to <i>connecting</i> when starting
* the session.
* @property {1} connecting - Session is trying to connect to remote peers, goes to <i>streaming</i> in case of
* success or <i>closed</i> in case of error.
* @property {2} streaming - Session is correctly connected to remote peers and currently streaming audio/video, goes
* to <i>closed</i> when any peer closes the session.
* @property {3} closed - Session is closed and can be garbage collected, state will not change anymore.
*/
const SessionState = Object.freeze({
idle: 0,
connecting: 1,
streaming: 2,
closed: 3
});
export { SessionState as default };
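Because the states are plain increasing integers, the "never switches backwards" invariant documented above can be checked with a single numeric comparison. A minimal sketch (the `isValidTransition` helper is hypothetical, not part of gstwebrtc-api):

```javascript
// Sketch: SessionState values only ever increase
// (idle -> connecting -> streaming -> closed).
// isValidTransition is a hypothetical helper, not part of the API.
const SessionState = Object.freeze({
  idle: 0,
  connecting: 1,
  streaming: 2,
  closed: 3
});

function isValidTransition(from, to) {
  // States may be skipped (e.g. idle -> closed), but never revisited.
  return to > from;
}

console.log(isValidTransition(SessionState.idle, SessionState.connecting)); // true
console.log(isValidTransition(SessionState.streaming, SessionState.connecting)); // false
```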


@@ -0,0 +1,148 @@
/*
* gstwebrtc-api
*
* Copyright (C) 2022 Igalia S.L. <info@igalia.com>
* Author: Loïc Le Page <llepage@igalia.com>
*
* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/.
*/
import SessionState from "./session-state";
/**
* Event name: "error".<br>
* Triggered when any kind of error occurs.
* <p>When emitted by a session, it is in general an unrecoverable error. Normally, the session is automatically closed
* but in the specific case of a {@link gstWebRTCAPI.ProducerSession}, when the error occurs on an underlying
* {@link gstWebRTCAPI.ClientSession} between the producer session and a remote client consuming the streamed media,
* then only the failing {@link gstWebRTCAPI.ClientSession} is closed. The producer session can keep on serving the
* other consumer peers.</p>
* @event gstWebRTCAPI#ErrorEvent
* @type {external:ErrorEvent}
* @property {string} message - The error message.
* @property {external:Error} error - The error exception.
* @see gstWebRTCAPI.WebRTCSession
* @see gstWebRTCAPI.RemoteController
*/
/**
* Event name: "stateChanged".<br>
* Triggered each time a session state changes.
* @event gstWebRTCAPI#StateChangedEvent
* @type {external:Event}
* @see gstWebRTCAPI.WebRTCSession#state
*/
/**
* Event name: "rtcPeerConnectionChanged".<br>
* Triggered each time a session internal {@link external:RTCPeerConnection} changes. This can occur during the session
* connecting state when the peer-to-peer WebRTC connection is established, and when closing the
* {@link gstWebRTCAPI.WebRTCSession}.
* @event gstWebRTCAPI#RTCPeerConnectionChangedEvent
* @type {external:Event}
* @see gstWebRTCAPI.WebRTCSession#rtcPeerConnection
*/
/**
* Event name: "closed".<br>
* Triggered when a session is definitively closed (then it can be garbage collected as session instances are not
* reusable).
* @event gstWebRTCAPI#ClosedEvent
* @type {external:Event}
*/
/**
* @class gstWebRTCAPI.WebRTCSession
* @hideconstructor
* @classdesc Manages a WebRTC session between a producer and a consumer (peer-to-peer channel).
* @extends {external:EventTarget}
* @fires {@link gstWebRTCAPI#event:ErrorEvent}
* @fires {@link gstWebRTCAPI#event:StateChangedEvent}
* @fires {@link gstWebRTCAPI#event:RTCPeerConnectionChangedEvent}
* @fires {@link gstWebRTCAPI#event:ClosedEvent}
* @see gstWebRTCAPI.ConsumerSession
* @see gstWebRTCAPI.ProducerSession
* @see gstWebRTCAPI.ClientSession
*/
export default class WebRTCSession extends EventTarget {
constructor(peerId, comChannel) {
super();
this._peerId = peerId;
this._sessionId = "";
this._comChannel = comChannel;
this._state = SessionState.idle;
this._rtcPeerConnection = null;
}
/**
* Unique identifier of the remote peer to which this session is connected.
* @member {string} gstWebRTCAPI.WebRTCSession#peerId
* @readonly
*/
get peerId() {
return this._peerId;
}
/**
* Unique identifier of this session (defined by the signaling server).<br>
* The local session ID equals "" until it is created on the server side. This is done during the connection handshake.
* The local session ID is guaranteed to be valid and to correctly reflect the signaling server value once
* session state has switched to {@link gstWebRTCAPI.SessionState#streaming}.
* @member {string} gstWebRTCAPI.WebRTCSession#sessionId
* @readonly
*/
get sessionId() {
return this._sessionId;
}
/**
* The current WebRTC session state.
* @member {gstWebRTCAPI.SessionState} gstWebRTCAPI.WebRTCSession#state
* @readonly
*/
get state() {
return this._state;
}
/**
* The internal {@link external:RTCPeerConnection} used to manage the underlying WebRTC connection with the session
* peer. Value may be null if the session has no active WebRTC connection. You can listen to the
* {@link gstWebRTCAPI#event:RTCPeerConnectionChangedEvent} event to be informed when the connection is established
* or destroyed.
* @member {external:RTCPeerConnection} gstWebRTCAPI.WebRTCSession#rtcPeerConnection
* @readonly
*/
get rtcPeerConnection() {
return this._rtcPeerConnection;
}
/**
* Terminates the WebRTC session.<br>
* It immediately disconnects the remote peer attached to this session and unregisters the session from the
* signaling server.
* @method gstWebRTCAPI.WebRTCSession#close
*/
close() {
if (this._state !== SessionState.closed) {
if ((this._state !== SessionState.idle) && this._comChannel && this._sessionId) {
this._comChannel.send({
type: "endSession",
sessionId: this._sessionId
});
}
this._state = SessionState.closed;
this.dispatchEvent(new Event("stateChanged"));
this._comChannel = null;
if (this._rtcPeerConnection) {
this._rtcPeerConnection.close();
this._rtcPeerConnection = null;
this.dispatchEvent(new Event("rtcPeerConnectionChanged"));
}
this.dispatchEvent(new Event("closed"));
}
}
}

File diff suppressed because it is too large


@@ -0,0 +1,33 @@
Keyboard.js is released under the Apache 2.0 license and downloaded from:
https://github.com/apache/guacamole-client/tree/master/guacamole-common-js/src/main/webapp/modules/Keyboard.js
```javascript
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
```
Line 20 has been modified to allow importing it with webpack, from:
```javascript
var Guacamole = Guacamole || {};
```
to:
```javascript
const Guacamole = {};
export { Guacamole as default };
```


@@ -0,0 +1,62 @@
"use strict";
/* eslint-disable */
const packageVersion = require("./package.json").version;
const webpack = require("webpack");
const HtmlWebpackPlugin = require("html-webpack-plugin");
const TerserWebpackPlugin = require("terser-webpack-plugin");
const isDevServer = process.argv.includes("serve");
/* eslint-enable */
const config = {
target: ["web", "es2017"],
mode: isDevServer ? "development" : "production",
devtool: isDevServer ? "eval" : "source-map",
entry: { "gstwebrtc-api": "./src" },
output: { filename: isDevServer ? "[name]-[contenthash].min.js" : `[name]-${packageVersion}.min.js` },
devServer: {
open: true,
static: false,
proxy: {
"/webrtc": {
target: "ws://127.0.0.1:8443",
ws: true
}
},
server: "https",
port: 9090
},
optimization: {
minimizer: [
new TerserWebpackPlugin({
extractComments: false,
terserOptions: {
ecma: 2017,
toplevel: true,
output: {
comments: false,
preamble: "/*! gstwebrtc-api (https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/-/tree/main/net/webrtc/gstwebrtc-api), MPL-2.0 License, Copyright (C) 2022 Igalia S.L. <info@igalia.com>, Author: Loïc Le Page <llepage@igalia.com> */\n" +
"/*! Contains embedded adapter from webrtc-adapter (https://github.com/webrtcHacks/adapter), BSD 3-Clause License, Copyright (c) 2014, The WebRTC project authors. All rights reserved. Copyright (c) 2018, The adapter.js project authors. All rights reserved. */\n" +
"/*! Contains embedded Keyboard.js from guacamole-client (https://github.com/apache/guacamole-client), Apache 2.0 License */"
}
}
})
]
},
plugins: [new webpack.ProgressPlugin()]
};
if (isDevServer) {
config.plugins.push(new HtmlWebpackPlugin({
template: "./index.html",
inject: "head",
minify: false
}));
}
module.exports = config; // eslint-disable-line no-undef


@@ -17,6 +17,7 @@ pub struct Peer {
/// Messages sent from the server to peers
pub enum OutgoingMessage {
/// Welcoming message, sets the Peer ID linked to a new connection
#[serde(rename_all = "camelCase")]
Welcome { peer_id: String },
/// Notifies listeners that a peer status has changed
PeerStatusChanged(PeerStatus),
@@ -27,13 +28,14 @@ pub enum OutgoingMessage {
#[serde(rename_all = "camelCase")]
SessionStarted { peer_id: String, session_id: String },
/// Signals that the session the peer was in was ended
#[serde(rename_all = "camelCase")]
EndSession(EndSessionMessage),
/// Messages directly forwarded from one peer to another
Peer(PeerMessage),
/// Provides the current list of consumer peers
#[serde(rename_all = "camelCase")]
List { producers: Vec<Peer> },
/// Notifies that an error occurred with the peer's current session
#[serde(rename_all = "camelCase")]
Error { details: String },
}
@@ -42,10 +44,8 @@ pub enum OutgoingMessage {
/// Register with a peer type
pub enum PeerRole {
/// Register as a producer
#[serde(rename_all = "camelCase")]
Producer,
/// Register as a listener
#[serde(rename_all = "camelCase")]
Listener,
}
@@ -83,11 +83,13 @@ pub struct StartSessionMessage {
/// Conveys a SDP
pub enum SdpMessage {
/// Conveys an offer
#[serde(rename_all = "camelCase")]
Offer {
/// The SDP
sdp: String,
},
/// Conveys an answer
#[serde(rename_all = "camelCase")]
Answer {
/// The SDP
sdp: String,
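These `rename_all = "camelCase"` attributes keep the Rust message fields in sync with what the JavaScript API reads (`msg.peerId`, `msg.sessionId` in the session classes above). A minimal sketch on the consuming side (the field values are hypothetical, and the full message envelope is not shown in this diff):

```javascript
// Sketch: with #[serde(rename_all = "camelCase")], the Rust signaling
// server emits camelCase JSON keys matching the JavaScript side.
// The values below are hypothetical.
const wire = '{"peerId":"p-1234","sessionId":"s-abcd"}';
const msg = JSON.parse(wire);
console.log(msg.peerId, msg.sessionId); // "p-1234 s-abcd"
```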


@@ -1,38 +0,0 @@
<!DOCTYPE html>
<!--
vim: set sts=2 sw=2 et :
Demo Javascript app for negotiating and streaming a sendrecv webrtc stream
with a GStreamer app. Runs only in passive mode, i.e., responds to offers
with answers, exchanges ICE candidates, and streams.
Author: Nirbheek Chauhan <nirbheek@centricular.com>
-->
<html>
<head>
<style>
.error { color: red; }
</style>
<link rel="stylesheet" type="text/css" href="theme.css">
<script src="https://webrtc.github.io/adapter/adapter-latest.js"></script>
<script src="keyboard.js"></script>
<script src="input.js"></script>
<script src="webrtc.js"></script>
<script>window.onload = setup;</script>
</head>
<body>
<div class="holygrail-body">
<div class="content">
<div id="sessions">
</div>
<div id="image-holder">
<img id="image"></img>
</div>
</div>
<ul class="nav" id="camera-list">
</ul>
</div>
</body>
</html>


@@ -1,482 +0,0 @@
/**
* Copyright 2019 Google LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
/*global GamepadManager*/
/*eslint no-unused-vars: ["error", { "vars": "local" }]*/
class Input {
/**
* Input handling for WebRTC web app
*
* @constructor
* @param {Element} [element]
* Video element to attach events to
* @param {function} [send]
* Function used to send input events to server.
*/
constructor(element, send) {
/**
* @type {Element}
*/
this.element = element;
/**
* @type {function}
*/
this.send = send;
/**
* @type {boolean}
*/
this.mouseRelative = false;
/**
* @type {Object}
*/
this.m = null;
/**
* @type {Keyboard}
*/
this.keyboard = null;
/**
* @type {GamepadManager}
*/
this.gamepadManager = null;
/**
* @type {Integer}
*/
this.x = 0;
/**
* @type {Integer}
*/
this.y = 0;
/**
* @type {Integer}
*/
this.lastTouch = 0;
/**
* @type {function}
*/
this.ongamepadconnected = null;
/**
* @type {function}
*/
this.ongamepaddisconneceted = null;
/**
* List of attached listeners, record keeping used to detach all.
* @type {Array}
*/
this.listeners = [];
/**
* @type {function}
*/
this.onresizeend = null;
// internal variables used by resize start/end functions.
this._rtime = null;
this._rtimeout = false;
this._rdelta = 200;
}
/**
* Handles mouse button and motion events and sends them to WebRTC app.
* @param {MouseEvent} event
*/
_mouseButtonMovement(event) {
const down = (event.type === 'mousedown' ? 1 : 0);
var data = {};
if (event.type === 'mousemove' && !this.m) return;
if (!document.pointerLockElement) {
if (this.mouseRelative)
event.target.requestPointerLock();
}
// Hotkey to enable pointer lock, CTRL-SHIFT-LeftButton
if (down && event.button === 0 && event.ctrlKey && event.shiftKey) {
event.target.requestPointerLock();
return;
}
if (document.pointerLockElement) {
// FIXME - mark as relative!
console.warn("FIXME: Make event relative!")
this.x = event.movementX;
this.y = event.movementY;
} else if (event.type === 'mousemove') {
this.x = this._clientToServerX(event.clientX);
this.y = this._clientToServerY(event.clientY);
data["event"] = "MouseMove"
}
if (event.type === 'mousedown') {
data["event"] = "MouseButtonPress";
} else if (event.type === 'mouseup') {
data["event"] = "MouseButtonRelease";
}
if (event.type === 'mousedown' || event.type === 'mouseup') {
data["button"] = event.button + 1;
}
data["x"] = this.x;
data["y"] = this.y;
data["modifier_state"] = this._modifierState(event);
this.send(data);
}
/**
* Handles touch events and sends them to WebRTC app.
* @param {TouchEvent} event
*/
_touch(event) {
var mod_state = this._modifierState(event);
// Use TouchUp for cancelled touch points
if (event.type === 'touchcancel') {
let data = {};
data["event"] = "TouchUp";
data["identifier"] = event.changedTouches[0].identifier;
data["x"] = this._clientToServerX(event.changedTouches[0].clientX);
data["y"] = this._clientToServerY(event.changedTouches[0].clientY);
data["modifier_state"] = mod_state;
this.send(data);
return;
}
if (event.type === 'touchstart') {
var event_name = "TouchDown";
} else if (event.type === 'touchmove') {
var event_name = "TouchMotion";
} else if (event.type === 'touchend') {
var event_name = "TouchUp";
}
for (let touch of event.changedTouches) {
let data = {};
data["event"] = event_name;
data["identifier"] = touch.identifier;
data["x"] = this._clientToServerX(touch.clientX);
data["y"] = this._clientToServerY(touch.clientY);
data["modifier_state"] = mod_state;
if (event.type !== 'touchend') {
if ('force' in touch) {
data["pressure"] = touch.force;
} else {
data["pressure"] = NaN;
}
}
this.send(data);
}
if (event.timeStamp > this.lastTouch) {
let data = {};
data["event"] = "TouchFrame";
data["modifier_state"] = mod_state;
this.send(data);
this.lastTouch = event.timeStamp;
}
event.preventDefault();
}
/**
* Handles mouse wheel events and sends them to WebRTC app.
* @param {MouseEvent} event
*/
_wheel(event) {
let data = {
"event": "MouseScroll",
"x": this.x,
"y": this.y,
"delta_x": -event.deltaX,
"delta_y": -event.deltaY,
"modifier_state": this._modifierState(event),
};
this.send(data);
event.preventDefault();
}
/**
* Captures mouse context menu (right-click) event and prevents event propagation.
* @param {MouseEvent} event
*/
_contextMenu(event) {
event.preventDefault();
}
/**
* Sends WebRTC app command to hide the remote pointer when exiting pointer lock.
*/
_exitPointerLock() {
document.exitPointerLock();
}
/**
* constructs the string representation for the active modifiers on the event
*/
_modifierState(event) {
let masks = []
if (event.altKey) masks.push("alt-mask");
if (event.ctrlKey) masks.push("control-mask");
if (event.metaKey) masks.push("meta-mask");
if (event.shiftKey) masks.push("shift-mask");
return masks.join('+')
}
/**
* Captures display and video dimensions required for computing mouse pointer position.
* This should be fired whenever the window size changes.
*/
_windowMath() {
const windowW = this.element.offsetWidth;
const windowH = this.element.offsetHeight;
const frameW = this.element.videoWidth;
const frameH = this.element.videoHeight;
const multi = Math.min(windowW / frameW, windowH / frameH);
const vpWidth = frameW * multi;
const vpHeight = (frameH * multi);
var elem = this.element;
var offsetLeft = 0;
var offsetTop = 0;
do {
if (!isNaN(elem.offsetLeft)) {
offsetLeft += elem.offsetLeft;
}
if (!isNaN(elem.offsetTop)) {
offsetTop += elem.offsetTop;
}
} while (elem = elem.offsetParent);
this.m = {
mouseMultiX: frameW / vpWidth,
mouseMultiY: frameH / vpHeight,
mouseOffsetX: Math.max((windowW - vpWidth) / 2.0, 0),
mouseOffsetY: Math.max((windowH - vpHeight) / 2.0, 0),
offsetLeft: offsetLeft,
offsetTop: offsetTop,
scrollX: window.scrollX,
scrollY: window.scrollY,
frameW,
frameH,
};
}
/**
* Translates pointer position X based on current window math.
* @param {number} clientX
*/
_clientToServerX(clientX) {
var serverX = Math.round((clientX - this.m.mouseOffsetX - this.m.offsetLeft + this.m.scrollX) * this.m.mouseMultiX);
if (serverX === this.m.frameW - 1) serverX = this.m.frameW;
if (serverX > this.m.frameW) serverX = this.m.frameW;
if (serverX < 0) serverX = 0;
return serverX;
}
/**
* Translates pointer position Y based on current window math.
* @param {number} clientY
*/
_clientToServerY(clientY) {
let serverY = Math.round((clientY - this.m.mouseOffsetY - this.m.offsetTop + this.m.scrollY) * this.m.mouseMultiY);
if (serverY === this.m.frameH - 1) serverY = this.m.frameH;
if (serverY > this.m.frameH) serverY = this.m.frameH;
if (serverY < 0) serverY = 0;
return serverY;
}
/**
* When fullscreen is entered, request keyboard and pointer lock.
*/
_onFullscreenChange() {
if (document.fullscreenElement !== null) {
// Enter fullscreen
this.requestKeyboardLock();
this.element.requestPointerLock();
}
// Reset local keyboard. When holding to exit full-screen the escape key can get stuck.
this.keyboard.reset();
// Reset stuck keys on server side.
// FIXME: How to implement resetting keyboard with the GstNavigation interface
// this.send("kr");
}
/**
* Called when window is being resized, used to detect when resize ends so new resolution can be sent.
*/
_resizeStart() {
this._rtime = new Date();
if (this._rtimeout === false) {
this._rtimeout = true;
setTimeout(() => { this._resizeEnd() }, this._rdelta);
}
}
/**
* Called in setTimeout loop to detect if window is done being resized.
*/
_resizeEnd() {
if (new Date() - this._rtime < this._rdelta) {
setTimeout(() => { this._resizeEnd() }, this._rdelta);
} else {
this._rtimeout = false;
if (this.onresizeend !== null) {
this.onresizeend();
}
}
}
/**
* Attaches input event handlers to the document, window and element.
*/
attach() {
this.listeners.push(addListener(this.element, 'resize', this._windowMath, this));
this.listeners.push(addListener(this.element, 'wheel', this._wheel, this));
this.listeners.push(addListener(this.element, 'contextmenu', this._contextMenu, this));
this.listeners.push(addListener(this.element.parentElement, 'fullscreenchange', this._onFullscreenChange, this));
this.listeners.push(addListener(window, 'resize', this._windowMath, this));
this.listeners.push(addListener(window, 'resize', this._resizeStart, this));
if ('ontouchstart' in window) {
console.warn("FIXME: Enabling mouse pointer display for touch devices.");
} else {
this.listeners.push(addListener(this.element, 'mousemove', this._mouseButtonMovement, this));
this.listeners.push(addListener(this.element, 'mousedown', this._mouseButtonMovement, this));
this.listeners.push(addListener(this.element, 'mouseup', this._mouseButtonMovement, this));
}
this.listeners.push(addListener(this.element, 'touchstart', this._touch, this));
this.listeners.push(addListener(this.element, 'touchend', this._touch, this));
this.listeners.push(addListener(this.element, 'touchmove', this._touch, this));
this.listeners.push(addListener(this.element, 'touchcancel', this._touch, this));
// Adjust for scroll offset
this.listeners.push(addListener(window, 'scroll', () => {
this.m.scrollX = window.scrollX;
this.m.scrollY = window.scrollY;
}, this));
// Using guacamole keyboard because it has the keysym translations.
this.keyboard = new Keyboard(this.element);
this.keyboard.onkeydown = (keysym, state) => {
this.send({"event": "KeyPress", "key": keysym, "modifier_state": state});
};
this.keyboard.onkeyup = (keysym, state) => {
this.send({"event": "KeyRelease", "key": keysym, "modifier_state": state});
};
this._windowMath();
}
detach() {
removeListeners(this.listeners);
this._exitPointerLock();
if (this.keyboard) {
this.keyboard.onkeydown = null;
this.keyboard.onkeyup = null;
this.keyboard.reset();
delete this.keyboard;
// FIXME: How to implement resetting keyboard with the GstNavigation interface
// this.send("kr");
}
}
/**
* Request keyboard lock, must be in fullscreen mode to work.
*/
requestKeyboardLock() {
// event codes: https://www.w3.org/TR/uievents-code/#key-alphanumeric-writing-system
const keys = [
"AltLeft",
"AltRight",
"Tab",
"Escape",
"ContextMenu",
"MetaLeft",
"MetaRight"
];
console.log("requesting keyboard lock");
navigator.keyboard.lock(keys).then(
() => {
console.log("keyboard lock success");
}
).catch(
(e) => {
console.log("keyboard lock failed: ", e);
}
)
}
getWindowResolution() {
return [
parseInt(this.element.offsetWidth * window.devicePixelRatio),
parseInt(this.element.offsetHeight * window.devicePixelRatio)
];
}
}
/**
* Helper function to keep track of attached event listeners.
* @param {Object} obj
* @param {string} name
* @param {function} func
* @param {Object} ctx
*/
function addListener(obj, name, func, ctx) {
const newFunc = ctx ? func.bind(ctx) : func;
obj.addEventListener(name, newFunc);
return [obj, name, newFunc];
}
/**
* Helper function to remove all attached event listeners.
* @param {Array} listeners
*/
function removeListeners(listeners) {
for (const listener of listeners)
listener[0].removeEventListener(listener[1], listener[2]);
}

File diff suppressed because it is too large

View file

@ -1,141 +0,0 @@
/* Reset CSS from Eric Meyer */
html, body, div, span, applet, object, iframe,
h1, h2, h3, h4, h5, h6, p, blockquote, pre,
a, abbr, acronym, address, big, cite, code,
del, dfn, em, img, ins, kbd, q, s, samp,
small, strike, strong, sub, sup, tt, var,
b, u, i, center,
dl, dt, dd, ol, ul, li,
fieldset, form, label, legend,
table, caption, tbody, tfoot, thead, tr, th, td,
article, aside, canvas, details, embed,
figure, figcaption, footer, header, hgroup,
menu, nav, output, ruby, section, summary,
time, mark, audio, video {
margin: 0;
padding: 0;
border: 0;
font-size: 100%;
font: inherit;
vertical-align: baseline;
}
/* HTML5 display-role reset for older browsers */
article, aside, details, figcaption, figure,
footer, header, hgroup, menu, nav, section {
display: block;
}
body {
line-height: 1;
}
ol, ul {
list-style: none;
}
blockquote, q {
quotes: none;
}
blockquote:before, blockquote:after,
q:before, q:after {
content: '';
content: none;
}
table {
border-collapse: collapse;
border-spacing: 0;
}
/* Our style */
body{
display: flex;
flex-direction: column;
min-height: 100vh;
background-color: #222;
color: white;
}
.holygrail-body {
flex: 1 0 auto;
display: flex;
}
.holygrail-body .content {
width: 100%;
}
#sessions {
display: flex;
flex-direction: row;
flex-wrap: wrap;
justify-content: space-around;
}
.holygrail-body .nav {
width: 220px;
list-style: none;
text-align: left;
order: -1;
background-color: #333;
margin: 0;
}
@media (max-width: 700px) {
.holygrail-body {
flex-direction: column;
}
.holygrail-body .nav {
width: 100%;
}
}
.session p span {
float: right;
}
.session p {
padding-top: 5px;
padding-bottom: 5px;
}
.stream {
background-color: black;
width: 480px;
}
#camera-list {
text-align: center;
}
.button {
border: none;
padding: 8px;
text-align: center;
text-decoration: none;
display: inline-block;
font-size: 16px;
-webkit-transition-duration: 0.4s; /* Safari */
transition-duration: 0.4s;
cursor: pointer;
margin: 5px auto;
width: 90%;
}
.button1 {
background-color: #222;
color: white;
border: 2px solid #4CAF50;
word-wrap: anywhere;
}
.button1:hover {
background-color: #4CAF50;
color: white;
}
#image-holder {
display: flex;
flex-direction: row;
flex-wrap: wrap;
justify-content: space-around;
}

View file

@ -1,466 +0,0 @@
/* vim: set sts=4 sw=4 et :
*
* Demo Javascript app for negotiating and streaming a sendrecv webrtc stream
* with a GStreamer app. Runs only in passive mode, i.e., responds to offers
* with answers, exchanges ICE candidates, and streams.
*
* Author: Nirbheek Chauhan <nirbheek@centricular.com>
*/
// Set this to override the automatic detection in websocketServerConnect()
var ws_server;
var ws_port;
// Override with your own STUN servers if you want
var rtc_configuration = {iceServers: [{urls: "stun:stun.l.google.com:19302"},
/* TODO: do not keep these static and in clear text in production,
* and instead use one of the mechanisms discussed in
* https://groups.google.com/forum/#!topic/discuss-webrtc/nn8b6UboqRA
*/
{'urls': 'turn:turn.homeneural.net:3478?transport=udp',
'credential': '1qaz2wsx',
'username': 'test'
}],
/* Uncomment the following line to ensure the turn server is used
* while testing. This should be kept commented out in production,
* as non-relay ice candidates should be preferred
*/
// iceTransportPolicy: "relay",
};
var sessions = {}
/* https://stackoverflow.com/questions/105034/create-guid-uuid-in-javascript */
function getOurId() {
return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function(c) {
var r = Math.random() * 16 | 0, v = c == 'x' ? r : (r & 0x3 | 0x8);
return v.toString(16);
});
}
function Uint8ToString(u8a){
var CHUNK_SZ = 0x8000;
var c = [];
for (var i=0; i < u8a.length; i+=CHUNK_SZ) {
c.push(String.fromCharCode.apply(null, u8a.subarray(i, i+CHUNK_SZ)));
}
return c.join("");
}
function Session(our_id, peer_id, closed_callback) {
this.id = null;
this.peer_connection = null;
this.ws_conn = null;
this.peer_id = peer_id;
this.our_id = our_id;
this.closed_callback = closed_callback;
this.data_channel = null;
this.input = null;
this.getVideoElement = function() {
return document.getElementById("stream-" + this.our_id);
};
this.resetState = function() {
if (this.peer_connection) {
this.peer_connection.close();
this.peer_connection = null;
}
var videoElement = this.getVideoElement();
if (videoElement) {
videoElement.pause();
videoElement.src = "";
}
var session_div = document.getElementById("session-" + this.our_id);
if (session_div) {
session_div.parentNode.removeChild(session_div);
}
if (this.ws_conn) {
this.ws_conn.close();
this.ws_conn = null;
}
this.input && this.input.detach();
this.data_channel = null;
};
this.handleIncomingError = function(error) {
this.resetState();
this.closed_callback(this.our_id);
};
this.setStatus = function(text) {
console.log(text);
var span = document.getElementById("status-" + this.our_id);
// Don't set the status if it already contains an error
if (!span.classList.contains('error'))
span.textContent = text;
};
this.setError = function(text) {
console.error(text);
var span = document.getElementById("status-" + this.our_id);
span.textContent = text;
span.classList.add('error');
this.resetState();
this.closed_callback(this.our_id);
};
// Local description was set, send it to peer
this.onLocalDescription = function(desc) {
console.log("Got local description: " + JSON.stringify(desc), this);
var thiz = this;
this.peer_connection.setLocalDescription(desc).then(() => {
this.setStatus("Sending SDP answer");
var sdp = {
'type': 'peer',
'sessionId': this.id,
'sdp': this.peer_connection.localDescription.toJSON()
};
this.ws_conn.send(JSON.stringify(sdp));
}).catch(function(e) {
thiz.setError(e);
});
};
this.onRemoteDescriptionSet = function() {
this.setStatus("Remote SDP set");
this.setStatus("Got SDP offer");
this.peer_connection.createAnswer()
.then(this.onLocalDescription.bind(this)).catch(this.setError);
}
// SDP offer received from peer, set remote description and create an answer
this.onIncomingSDP = function(sdp) {
var thiz = this;
this.peer_connection.setRemoteDescription(sdp)
.then(this.onRemoteDescriptionSet.bind(this))
.catch(function(e) {
thiz.setError(e)
});
};
// ICE candidate received from peer, add it to the peer connection
this.onIncomingICE = function(ice) {
var candidate = new RTCIceCandidate(ice);
var thiz = this;
this.peer_connection.addIceCandidate(candidate).catch(function(e) {
thiz.setError(e)
});
};
this.onServerMessage = function(event) {
console.log("Received " + event.data);
try {
var msg = JSON.parse(event.data);
} catch (e) {
if (e instanceof SyntaxError) {
this.handleIncomingError("Error parsing incoming JSON: " + event.data);
} else {
this.handleIncomingError("Unknown error parsing response: " + event.data);
}
return;
}
if (msg.type == "registered") {
this.setStatus("Registered with server");
this.connectPeer();
} else if (msg.type == "sessionStarted") {
this.setStatus("Session started");
this.id = msg.sessionId;
} else if (msg.type == "error") {
this.handleIncomingError(msg.details);
} else if (msg.type == "endSession") {
this.resetState();
this.closed_callback(this.our_id);
} else if (msg.type == "peer") {
// Incoming peer message signals the beginning of a call
if (!this.peer_connection)
this.createCall(msg);
if (msg.sdp != null) {
this.onIncomingSDP(msg.sdp);
} else if (msg.ice != null) {
this.onIncomingICE(msg.ice);
} else {
this.handleIncomingError("Unknown incoming JSON: " + msg);
}
}
};
this.streamIsPlaying = function(e) {
this.setStatus("Streaming");
};
this.onServerClose = function(event) {
this.resetState();
this.closed_callback(this.our_id);
};
this.onServerError = function(event) {
this.handleIncomingError('Server error');
};
this.websocketServerConnect = function() {
// Clear errors in the status span
var span = document.getElementById("status-" + this.our_id);
span.classList.remove('error');
span.textContent = '';
console.log("Our ID:", this.our_id);
ws_port = ws_port || '8443';
if (window.location.protocol.startsWith ("file")) {
ws_server = ws_server || "127.0.0.1";
} else if (window.location.protocol.startsWith ("http")) {
ws_server = ws_server || window.location.hostname;
} else {
throw new Error ("Don't know how to connect to the signalling server with uri " + window.location);
}
var ws_url = 'ws://' + ws_server + ':' + ws_port;
this.setStatus("Connecting to server " + ws_url);
this.ws_conn = new WebSocket(ws_url);
/* When connected, immediately start the session with the peer */
this.ws_conn.addEventListener('open', (event) => {
this.setStatus("Connecting to the peer");
this.connectPeer();
});
this.ws_conn.addEventListener('error', this.onServerError.bind(this));
this.ws_conn.addEventListener('message', this.onServerMessage.bind(this));
this.ws_conn.addEventListener('close', this.onServerClose.bind(this));
};
this.connectPeer = function() {
this.setStatus("Connecting " + this.peer_id);
this.ws_conn.send(JSON.stringify({
"type": "startSession",
"peerId": this.peer_id
}));
};
this.onRemoteStreamAdded = function(event) {
var videoTracks = event.stream.getVideoTracks();
var audioTracks = event.stream.getAudioTracks();
console.log(videoTracks);
if (videoTracks.length > 0) {
console.log('Incoming stream: ' + videoTracks.length + ' video tracks and ' + audioTracks.length + ' audio tracks');
this.getVideoElement().srcObject = event.stream;
this.getVideoElement().play();
} else {
this.handleIncomingError('Stream with unknown tracks added, resetting');
}
};
this.createCall = function(msg) {
console.log('Creating RTCPeerConnection');
this.peer_connection = new RTCPeerConnection(rtc_configuration);
this.peer_connection.onaddstream = this.onRemoteStreamAdded.bind(this);
this.peer_connection.ondatachannel = (event) => {
console.log(`Data channel created: ${event.channel.label}`);
this.data_channel = event.channel;
let video_element = this.getVideoElement();
if (video_element) {
this.input = new Input(video_element, (data) => {
if (this.data_channel) {
console.log(`Navigation data: ${data}`);
this.data_channel.send(JSON.stringify(data));
}
});
}
this.data_channel.onopen = (event) => {
console.log("Receive channel opened, attaching input");
this.input.attach();
}
this.data_channel.onclose = (event) => {
console.info("Receive channel closed");
this.input && this.input.detach();
this.data_channel = null;
}
this.data_channel.onerror = (event) => {
this.input && this.input.detach();
console.warn("Error on receive channel", event.data);
this.data_channel = null;
}
let buffer = [];
this.data_channel.onmessage = (event) => {
if (typeof event.data === 'string' || event.data instanceof String) {
if (event.data == 'BEGIN_IMAGE')
buffer = [];
else if (event.data == 'END_IMAGE') {
var decoder = new TextDecoder("ascii");
var array_buffer = new Uint8Array(buffer);
var str = decoder.decode(array_buffer);
let img = document.getElementById("image");
img.src = 'data:image/png;base64, ' + str;
}
} else {
var i, len = buffer.length
var view = new DataView(event.data);
for (i = 0; i < view.byteLength; i++) {
buffer[len + i] = view.getUint8(i);
}
}
}
}
this.peer_connection.onicecandidate = (event) => {
if (event.candidate == null) {
console.log("ICE Candidate was null, done");
return;
}
this.ws_conn.send(JSON.stringify({
"type": "peer",
"sessionId": this.id,
"ice": event.candidate.toJSON()
}));
};
this.setStatus("Created peer connection for call, waiting for SDP");
};
document.getElementById("stream-" + this.our_id).addEventListener("playing", this.streamIsPlaying.bind(this), false);
this.websocketServerConnect();
}
function startSession() {
var peer_id = document.getElementById("camera-id").value;
if (peer_id === "") {
return;
}
sessions[peer_id] = new Session(getOurId(), peer_id, session_closed);
}
function session_closed(peer_id) {
sessions[peer_id] = null;
}
function addPeer(peer_id, meta) {
console.log("Meta: ", JSON.stringify(meta));
var nav_ul = document.getElementById("camera-list");
meta = meta ? meta : {"display-name": peer_id};
let display_html = `${meta["display-name"] ? meta["display-name"] : peer_id}<ul>`;
for (const key in meta) {
if (key != "display-name") {
display_html += `<li>- ${key}: ${meta[key]}</li>`;
}
}
display_html += "</ul>"
var li_str = '<li id="peer-' + peer_id + '"><button class="button button1">' + display_html + '</button></li>';
nav_ul.insertAdjacentHTML('beforeend', li_str);
var li = document.getElementById("peer-" + peer_id);
li.onclick = function(e) {
var sessions_div = document.getElementById('sessions');
var our_id = getOurId();
var session_div_str = '<div class="session" id="session-' + our_id + '"><video preload="none" class="stream" id="stream-' + our_id + '"></video><p>Status: <span id="status-' + our_id + '">unknown</span></p></div>'
sessions_div.insertAdjacentHTML('beforeend', session_div_str);
sessions[peer_id] = new Session(our_id, peer_id, session_closed);
}
}
function clearPeers() {
var nav_ul = document.getElementById("camera-list");
while (nav_ul.firstChild) {
nav_ul.removeChild(nav_ul.firstChild);
}
}
function onServerMessage(event) {
console.log("Received " + event.data);
try {
var msg = JSON.parse(event.data);
} catch (e) {
if (e instanceof SyntaxError) {
console.error("Error parsing incoming JSON: " + event.data);
} else {
console.error("Unknown error parsing response: " + event.data);
}
return;
}
if (msg.type == "welcome") {
console.info(`Got welcomed with ID ${msg.peer_id}`);
ws_conn.send(JSON.stringify({
"type": "list"
}));
} else if (msg.type == "list") {
clearPeers();
for (var i = 0; i < msg.producers.length; i++) {
addPeer(msg.producers[i].id, msg.producers[i].meta);
}
} else if (msg.type == "peerStatusChanged") {
var li = document.getElementById("peer-" + msg.peerId);
if (msg.roles.includes("producer")) {
if (li == null) {
console.log('Adding peer');
addPeer(msg.peerId, msg.meta);
}
} else if (li != null) {
li.parentNode.removeChild(li);
}
} else {
console.error("Unsupported message: ", msg);
}
};
function clearConnection() {
ws_conn.removeEventListener('error', onServerError);
ws_conn.removeEventListener('message', onServerMessage);
ws_conn.removeEventListener('close', onServerClose);
ws_conn = null;
}
function onServerClose(event) {
clearConnection();
clearPeers();
console.log("Close");
window.setTimeout(connect, 1000);
};
function onServerError(event) {
clearConnection();
clearPeers();
console.log("Error", event);
window.setTimeout(connect, 1000);
};
function connect() {
ws_port = ws_port || '8443';
if (window.location.protocol.startsWith ("file")) {
ws_server = ws_server || "127.0.0.1";
} else if (window.location.protocol.startsWith ("http")) {
ws_server = ws_server || window.location.hostname;
} else {
throw new Error ("Don't know how to connect to the signalling server with uri " + window.location);
}
var ws_url = 'ws://' + ws_server + ':' + ws_port;
console.log("Connecting listener");
ws_conn = new WebSocket(ws_url);
ws_conn.addEventListener('open', (event) => {
ws_conn.send(JSON.stringify({
"type": "setPeerStatus",
"roles": ["listener"]
}));
});
ws_conn.addEventListener('error', onServerError);
ws_conn.addEventListener('message', onServerMessage);
ws_conn.addEventListener('close', onServerClose);
}
function setup() {
connect();
}