The sink can accept audio or video directly. If both should be
provided at once, it is necessary to place the ndisinkcombiner before
the ndisink to merge audio and video into the same stream.
Fixes https://github.com/teltek/gst-plugin-ndi/issues/10
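As an illustration (not taken from the commit itself), a combined audio/video pipeline might look like the following gst-launch-1.0 sketch. The ndi-name value and caps are placeholders, and the pad names assume the combiner exposes video and audio sink pads:

```
gst-launch-1.0 ndisinkcombiner name=combiner ! ndisink ndi-name="example" \
    videotestsrc is-live=true ! video/x-raw,format=UYVY ! combiner.video \
    audiotestsrc is-live=true ! combiner.audio
```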
This is in addition to the old mode based on the receive time and timestamp.
Also make the new mode the default, as it will usually give more
accurate results: the timestamp is just the send time, while the
timecode is usually set by the sender based on the media timestamps.
This allows keeping audio/video more in sync with how the sender was
sending it, while also handling network jitter and clock drift in a
reasonable way.
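As a rough illustration of the difference between the two modes (this is a sketch, not the plugin's actual code; all names and the nanosecond units are assumptions), PTS derived from the timecode preserves the sender's pacing, while PTS derived from the receive time inherits network jitter:

```rust
// Illustrative sketch: computing a buffer PTS from either the NDI
// timecode (sender media time, new default) or the receive time
// (old behaviour). All values are in nanoseconds.

#[derive(Clone, Copy)]
enum TimestampMode {
    ReceiveTime, // old: receive time minus first receive time
    Timecode,    // new default: sender timecode minus first timecode
}

fn buffer_pts(
    mode: TimestampMode,
    first_timecode: u64,
    timecode: u64,
    first_receive_time: u64,
    receive_time: u64,
) -> u64 {
    match mode {
        // Timecodes come from the sender's media clock, so their
        // differences preserve the sender's A/V pacing.
        TimestampMode::Timecode => timecode.saturating_sub(first_timecode),
        // Receive times include network jitter, so the PTS jitters too.
        TimestampMode::ReceiveTime => receive_time.saturating_sub(first_receive_time),
    }
}

fn main() {
    // A frame sent 40ms after the first frame but received 55ms later
    // due to jitter keeps its 40ms PTS in timecode mode.
    assert_eq!(buffer_pts(TimestampMode::Timecode, 0, 40_000_000, 0, 55_000_000), 40_000_000);
    assert_eq!(buffer_pts(TimestampMode::ReceiveTime, 0, 40_000_000, 0, 55_000_000), 55_000_000);
    println!("ok");
}
```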
This ensures that we'll be able to capture every frame even if
downstream of the source is blocking for a moment, and also allows us to
make all operations cancellable.
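The pattern described here can be sketched as a dedicated capture thread feeding a queue, with a shared cancellation flag. This is a minimal illustration with invented names, not the element's actual code:

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::mpsc;
use std::sync::Arc;
use std::thread;

// Sketch: a dedicated thread pulls frames and queues them, so a
// momentarily blocked consumer never makes the capture side miss a
// frame, and a shared flag makes the capture loop cancellable.
fn capture(n_frames: u32, cancel: Arc<AtomicBool>) -> Vec<u32> {
    let (tx, rx) = mpsc::channel::<u32>(); // u32 stands in for a frame
    let receiver_thread = thread::spawn(move || {
        for frame in 0..n_frames {
            if cancel.load(Ordering::SeqCst) {
                break; // cancellation point between captures
            }
            // In the real element this would be an NDI capture call;
            // the unbounded channel means a blocked consumer never
            // forces the capture side to drop a frame.
            tx.send(frame).unwrap();
        }
    });
    receiver_thread.join().unwrap();
    rx.iter().collect()
}

fn main() {
    // All frames are queued even though the consumer only drains later.
    assert_eq!(capture(5, Arc::new(AtomicBool::new(false))), vec![0, 1, 2, 3, 4]);
    // A set cancel flag stops the capture loop immediately.
    assert!(capture(5, Arc::new(AtomicBool::new(true))).is_empty());
    println!("ok");
}
```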
stream-name is called ndi-name everywhere in the NDI SDK and
documentation, and ip is called ip-address everywhere.
Rename loss-threshold to timeout and change it to be in milliseconds
instead of iterations.
Add a connect-timeout property for the timeout during connection establishment.
Add bandwidth and receiver-ndi-name properties, and initialize the
latter with a reasonable default value.
We can't just fixate to any close to what we receive right now but only
support exactly the caps we receive. So check the format of each frame
and negotiate exactly those caps as needed when receiving frames.
Also re-negotiate if the caps are ever changing.
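The per-frame caps check can be sketched as follows. This is an illustration with invented types, not the source's actual code; in the real element, a caps change would push a new caps event downstream:

```rust
// Sketch: remember the caps of the last frame and renegotiate only
// when a frame's format differs, instead of fixating caps upfront.

#[derive(Clone, PartialEq)]
struct FrameCaps {
    width: u32,
    height: u32,
    fps_n: u32,
    fps_d: u32,
}

struct Source {
    current_caps: Option<FrameCaps>,
    negotiations: u32,
}

impl Source {
    fn handle_frame(&mut self, caps: FrameCaps) {
        if self.current_caps.as_ref() != Some(&caps) {
            // In the real element this would send a caps event downstream.
            self.current_caps = Some(caps);
            self.negotiations += 1;
        }
    }
}

fn main() {
    let mut src = Source { current_caps: None, negotiations: 0 };
    let caps_720 = FrameCaps { width: 1280, height: 720, fps_n: 30, fps_d: 1 };
    let caps_1080 = FrameCaps { width: 1920, height: 1080, fps_n: 30, fps_d: 1 };
    src.handle_frame(caps_720.clone());
    src.handle_frame(caps_720); // same caps: no renegotiation
    src.handle_frame(caps_1080); // changed caps: renegotiate
    assert_eq!(src.negotiations, 2);
    println!("ok");
}
```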
Since it is possible to connect to two or more streams simultaneously whose clocks are not synchronized, the timestamp calculation has to be improved to handle this.
Prior to this commit, we saved the first timestamp that arrived and used it to calculate the running time (the pts field of the GStreamer buffer) for the rest of the frames in all of the streams. This led to problems when connecting to multiple streams from multiple computers whose clocks were not correctly synchronized.
To fix this, now we save a different initial timestamp for each stream.
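The fix can be sketched as keeping one initial timestamp per stream and subtracting it from each stream's own frames. Names and nanosecond units here are assumptions for illustration, not the commit's actual code:

```rust
use std::collections::HashMap;

// Sketch: a separate initial timestamp per stream, so each stream's
// running time (PTS) starts from its own first frame even when the
// senders' clocks are offset from each other.
struct Timestamper {
    first_ts: HashMap<String, u64>, // per-stream initial timestamp (ns)
}

impl Timestamper {
    fn pts(&mut self, stream: &str, ts: u64) -> u64 {
        let first = *self.first_ts.entry(stream.to_string()).or_insert(ts);
        ts.saturating_sub(first)
    }
}

fn main() {
    let mut t = Timestamper { first_ts: HashMap::new() };
    // Two senders whose clocks are offset by 1s both start at PTS 0.
    assert_eq!(t.pts("camera-a", 1_000_000_000), 0);
    assert_eq!(t.pts("camera-b", 2_000_000_000), 0);
    assert_eq!(t.pts("camera-a", 1_040_000_000), 40_000_000);
    assert_eq!(t.pts("camera-b", 2_040_000_000), 40_000_000);
    println!("ok");
}
```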
We could also go via the glib::Type, but that requires more steps unless
we also add a getter from the registered type to the audio/video source
modules.