mirror of https://github.com/matthew1000/gstreamer-cheat-sheet.git (synced 2024-11-25 01:31:03 +00:00)

Various tweaks from a few more months of learning

This commit is contained in: parent 98e56613d7 / commit 4e75636e01

14 changed files with 297 additions and 43 deletions
@@ -13,10 +13,11 @@ A few Python examples are also [included](python_examples/) for when you need GS

* [Images](images.md)
* [Queues](queues.md)
* [Writing to files](writing_to_files.md)
* [Capturing images](capturing_images.md)
* [Sending to multiple destinations (tee)](tee.md)
* [Sharing and receiving pipelines (including sending/receiving video from shared memory)](sharing_and_splitting_pipelines.md)
* [Network transfer](network_transfer.md) (including how to send so that VLC can preview)
* [RTP](rtp.md)
* [SRT](srt.md)

## Sources and references
@@ -50,6 +51,7 @@ Other good GStreamer Python resources that I've found:

* [Getting started with GStreamer with Python](https://www.jonobacon.com/2006/08/28/getting-started-with-gstreamer-with-python/)
* [Python GStreamer Tutorial](http://brettviren.github.io/pygst-tutorial-org/pygst-tutorial.html)
* [Function reference](http://lazka.github.io/pgi-docs/#Gst-1.0)
  * Including a useful [mapping from C](https://lazka.github.io/pgi-docs/Gst-1.0/mapping.html)
* [Nice example script](https://github.com/rabits/rstream/blob/master/rstream.py)

### C/C++ with GStreamer
@@ -59,3 +61,7 @@ My favourite reference is [Valadoc](https://valadoc.org/gstreamer-1.0/index.htm)

# Problems or suggestions with this guide?

If you spot anything incorrect or incomplete, reports are welcome, either using [issues](issues) or [pull requests](pulls).

# My GStreamer project

Creating this guide gave me enough GStreamer understanding to make [Brave](https://github.com/bbc/brave), a live video editor for the cloud.
@@ -20,13 +20,7 @@ gst-launch-1.0 playbin uri=file://$SRC

This works with video, audio, RTMP streams, and so much more.
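For instance, a remote HTTP source plays the same way (a sketch; the URL is just a placeholder):

```
gst-launch-1.0 playbin uri=https://example.com/some-video.mp4
```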
-The 'bin' in 'playbin' means that under-the-hood, it's a collection of elements. For example, we can achieve the same thing by going to the next level of elements, which separate the decoding part from the playing part:
-
-```
-gst-launch-1.0 filesrc location=$SRC ! decodebin ! playsink
-```
-
-Or, we can split down even further:
+The 'bin' in 'playbin' means that under-the-hood, it's a collection of elements. We could split it down into the individual components:

```
gst-launch-1.0 filesrc location=$SRC ! \
@@ -1,32 +0,0 @@
# Capturing images (GStreamer command-line cheat sheet)

### Capture an image as PNG

The `pngenc` element can create a single PNG:

```
gst-launch-1.0 videotestsrc ! pngenc ! filesink location=foo.png
```

### Capture an image as JPEG

The `jpegenc` element can create a single JPEG:

```
gst-launch-1.0 videotestsrc ! jpegenc ! filesink location=foo.jpg
```

### Capturing images every X seconds

This example captures one frame every 3 seconds, and places it in files with the format `img00001.jpg`.
It also displays the video (as the `tee` element sends the video to both `multifilesink` and `autovideosink`).
To change the frequency, change the `framerate=1/3`.
e.g. `framerate=2/1` would capture a frame twice a second.

```
gst-launch-1.0 videotestsrc ! \
    videoscale ! videoconvert ! video/x-raw,width=640,height=480 ! \
    queue ! decodebin ! tee name=t ! queue ! videoconvert ! videoscale ! \
    videorate ! video/x-raw,width=640,height=480,framerate=1/3 ! \
    jpegenc ! multifilesink location=img%05d.jpg t. ! queue ! autovideosink
```
35 images.md
@@ -1,7 +1,14 @@
# Images (GStreamer command-line cheat sheet)

Images can be added to video.
In addition, video can be converted into images.
This page looks at both types.

## Including images in video

GStreamer can show images on video using the `imagefreeze` element.

### Create a video from an image

![Example imagefreeze single image](images/imagefreeze_single.png "Example imagefreeze single image")
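The command for this example sits between the hunks, so it isn't shown here; as a rough, hedged sketch of what `imagefreeze` usage looks like (the PNG path is my own placeholder):

```
gst-launch-1.0 filesrc location=/tmp/pic.png ! pngdec ! imagefreeze ! videoconvert ! autovideosink
```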
@@ -29,3 +36,31 @@ gst-launch-1.0 \
    uridecodebin uri=$PIC ! videoscale ! video/x-raw, width=263, height=240 ! imagefreeze ! m.
```

## Capturing video output as images

### Capture an image as PNG

The `pngenc` element can create a single PNG:

```
gst-launch-1.0 videotestsrc ! pngenc ! filesink location=foo.png
```
### Capture an image as JPEG

The `jpegenc` element can create a single JPEG:

```
gst-launch-1.0 videotestsrc ! jpegenc ! filesink location=foo.jpg
```
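One caveat (my observation, not in the original): unlike `pngenc`, which by default takes a single snapshot and then ends the stream, `jpegenc` keeps appending encoded frames to the file for as long as the pipeline runs. Limiting the source with `num-buffers` guarantees exactly one frame:

```
gst-launch-1.0 videotestsrc num-buffers=1 ! jpegenc ! filesink location=foo.jpg
```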
### Capturing images every X seconds

This example captures one frame every 3 seconds, placing them in files named `file-00.jpg`, `file-01.jpg`, etc.
It also overlays the current time on each frame (via `clockoverlay`), so you can tell when each was captured.
To change the frequency, change the `framerate=1/3`.
e.g. `framerate=2/1` would capture a frame twice a second.

```
gst-launch-1.0 -v videotestsrc is-live=true ! clockoverlay font-desc=\"Sans, 48\" ! videoconvert ! videorate ! video/x-raw,framerate=1/3 ! jpegenc ! multifilesink location=file-%02d.jpg
```
13 mixing.md
@@ -255,3 +255,16 @@ gst-launch-1.0 \
    queue2 ! audioconvert ! audioresample ! audiomix. \
    demux2. ! queue2 ! decodebin ! videoconvert ! videoscale ! video/x-raw,width=320,height=180 ! videomix.
```

## Fading

It's often nicer to fade between sources than to abruptly cut between them. This can be done with both video (temporarily blending using the alpha channel) and audio (lowering the volume of one whilst raising it on another).

It's not possible to do this on the command line... although `alpha` and `volume` can be set, they can only be set to discrete values.

Programmatically, however, this is possible through [dynamically controlled parameters](https://gstreamer.freedesktop.org/documentation/application-development/advanced/dparams.html). With this, you tell GStreamer the 'from' and 'to' values (e.g. 0 and 1 to increase volume), and the time period over which to do it.
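The core of that mechanism, as a minimal hedged sketch (assuming a pipeline that contains a `volume` element named `vol`; `new_absolute` makes the control values absolute rather than normalised):

```
cs = GstController.InterpolationControlSource()
cs.set_property('mode', GstController.InterpolationMode.LINEAR)

# Bind the control source to the element's 'volume' property
binding = GstController.DirectControlBinding.new_absolute(vol, 'volume', cs)
vol.add_control_binding(binding)

# Fade from silent to full volume over the first two seconds
cs.set(0 * Gst.SECOND, 0.0)
cs.set(2 * Gst.SECOND, 1.0)
```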
See [`mix_with_fade.py`](python_examples/mix_with_fade.py) for a simple Python example.

As a slightly richer example, [`mix_with_other_props.py`](python_examples/mix_with_other_props.py) shows how other properties, such as the image's position and size, can also be changed this way.
@@ -1,5 +1,5 @@
#!/usr/bin/env python
# Shows how two pipelines can be connected, using interaudiosink/interaudiosrc.
# (Search and replace 'audio' with 'video' to get a video example.)
# It will output a test audio sound.
import gi
@@ -13,7 +13,7 @@ gi.require_version('Gst', '1.0')
from gi.repository import GObject, Gst
import os

-Gst.init()
+Gst.init(None)
mainloop = GObject.MainLoop()

pipeline = Gst.Pipeline.new("pipe")
63 python_examples/mix_with_fade.py (new file)
@@ -0,0 +1,63 @@
#!/usr/bin/env python

'''
An example of how fading can work into a compositor (mixer).
Similar to a Gist at https://gist.github.com/lsiden/649d8ef9e02758ffde30b2b10fbac45a
'''

import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstController', '1.0')
from gi.repository import GObject, Gst, GstController
import os
from time import sleep
from threading import Thread


def get_alpha_controller(pad):
    cs = GstController.InterpolationControlSource()
    cs.set_property('mode', GstController.InterpolationMode.LINEAR)

    cb = GstController.DirectControlBinding.new(pad, 'alpha', cs)
    pad.add_control_binding(cb)
    return cs


Gst.init(None)
mainloop = GObject.MainLoop()
pipe = Gst.parse_launch(
    "videotestsrc ! timeoverlay name=timeoverlay halignment=right font-desc=\"Sans, 50\" deltax=10 ! "
    "compositor name=compositor sink_1::width=100 ! "
    "autovideosink videotestsrc pattern=1 ! compositor.")
compositor = pipe.get_by_name('compositor')

sink1 = compositor.get_static_pad('sink_1')
cs = get_alpha_controller(sink1)
is_showing = True
# cs.set(0*Gst.SECOND, 0)
# cs.set(1*Gst.SECOND, 1)
# cs.set(2*Gst.SECOND, 0)
# cs.set(3*Gst.SECOND, 0)
# cs.set(4*Gst.SECOND, 1)

pipe.set_state(Gst.State.PLAYING)


# Separate thread to provide user interaction:
def separate_thread():
    global is_showing
    while True:
        input("Press enter to fade...")
        current = 0
        dest = 1
        if is_showing:
            current = 1
            dest = 0

        pos = pipe.query_position(Gst.Format.TIME).cur
        print('Moving from %d to %d' % (current, dest))
        cs.set(pos, current)
        cs.set(pos + 1*Gst.SECOND, dest)
        is_showing = not is_showing


myThread = Thread(target=separate_thread, args=())
myThread.start()

mainloop.run()
48 python_examples/mix_with_other_props.py (new file)
@@ -0,0 +1,48 @@
#!/usr/bin/env python

'''
An example of how dynamic values can work for properties such as size and position.
Here an image moves across (in the first second) and then enlarges (in seconds 1-3).
'''

import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstController', '1.0')
from gi.repository import GObject, Gst, GstController
import os
from time import sleep
from threading import Thread


def get_timing_controller(pad, field):
    cs = GstController.InterpolationControlSource()
    cs.set_property('mode', GstController.InterpolationMode.LINEAR)

    cb = GstController.DirectControlBinding.new_absolute(pad, field, cs)
    pad.add_control_binding(cb)
    return cs


Gst.init(None)
mainloop = GObject.MainLoop()
pipe = Gst.parse_launch(
    "videotestsrc pattern=4 ! "
    "timeoverlay valignment=top font-desc=\"Sans, 50\" ! "
    "compositor name=compositor sink_1::width=80 sink_1::height=45 sink_1::ypos=50 ! autovideosink"
    " videotestsrc pattern=5 ! compositor.")
compositor = pipe.get_by_name('compositor')

sink1 = compositor.get_static_pad('sink_1')
xpos_cs = get_timing_controller(sink1, 'xpos')
xpos_cs.set(0*Gst.SECOND, 150)
xpos_cs.set(1*Gst.SECOND, 50)

width_cs = get_timing_controller(sink1, 'width')
width_cs.set(1*Gst.SECOND, 80)
width_cs.set(3*Gst.SECOND, 160)

height_cs = get_timing_controller(sink1, 'height')
height_cs.set(1*Gst.SECOND, 45)
height_cs.set(3*Gst.SECOND, 90)

pipe.set_state(Gst.State.PLAYING)
mainloop.run()
70 python_examples/playbin_reliable.py (new file)
@@ -0,0 +1,70 @@
#!/usr/bin/env python
#
# Plays any URI to screen. And shows how to handle buffering.
#
# Make sure the environment variable SRC is set to a playable file
# e.g.
#   export SRC='file:///tmp/me.mp4'
#

import gi
gi.require_version('Gst', '1.0')
from gi.repository import GObject, Gst
import os

Gst.init(None)
mainloop = GObject.MainLoop()
pipeline = None


def on_state_change(bus, message):
    is_pipeline_state_change = isinstance(message.src, Gst.Pipeline)
    if is_pipeline_state_change:
        old_state, new_state, pending_state = message.parse_state_changed()
        print('State is now %s' % new_state.value_nick.upper())

        if new_state == Gst.State.PAUSED:
            consider_move_to_playing_if_not_buffering()


def consider_move_to_playing_if_not_buffering():
    query_buffer = Gst.Query.new_buffering(Gst.Format.PERCENT)
    result = pipeline.query(query_buffer)

    if result:
        result_parsed = query_buffer.parse_buffering_percent()
        buffering_is_busy = result_parsed.busy
        if not buffering_is_busy:
            pipeline.set_state(Gst.State.PLAYING)


def on_error(bus, message):
    print('ERROR:', message.parse_error())


def on_buffering(bus, message):
    if pipeline.get_state(0)[1] in [Gst.State.PAUSED, Gst.State.PLAYING]:
        buffering_percent = message.parse_buffering()
        print('Buffering %d%%' % buffering_percent)
        if pipeline.get_state(0)[1] == Gst.State.PAUSED and buffering_percent == 100:
            pipeline.set_state(Gst.State.PLAYING)

        # The percentage goes to 0 too soon. So setting back to PAUSED can cause a needless streaming blip.
        # if pipeline.get_state(0)[1] == Gst.State.PLAYING and buffering_percent == 0:
        #     pipeline.set_state(Gst.State.PAUSED)


def go():
    global pipeline
    pipeline = Gst.ElementFactory.make('playbin')
    pipeline.set_property('uri', os.environ['SRC'])

    # How large a buffer would you like?
    pipeline.set_property('buffer-duration', 3 * Gst.SECOND)

    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect('message::buffering', on_buffering)
    bus.connect('message::state-changed', on_state_change)
    bus.connect('message::error', on_error)

    pipeline.set_state(Gst.State.PAUSED)
    mainloop.run()


go()
2 rtmp.md
@@ -6,7 +6,7 @@ If you need your own RTMP server, [the Nginx RTMP extension](https://github.com/

### Play an RTMP stream

-To play from RTMP server, playbin can be used (as with files and HLS streams):
+To play from an RTMP server, playbin can be used (as with files, HLS streams, DASH streams, etc):

```
gst-launch-1.0 playbin uri=$RTMP_SRC
```
13 rtp.md (new file)
@@ -0,0 +1,13 @@
(Found this with help from https://gist.github.com/esrever10/7d39fe2d4163c5b2d7006495c3c911bb)

### To send AV over RTP:

```
gst-launch-1.0 -v avfvideosrc capture-screen=true ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay ! udpsink host=127.0.0.1 port=5000
```
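Note that `avfvideosrc` is the macOS screen-capture source. On Linux, a similar sender might swap in `ximagesrc` (an untested sketch on my part, not from the original):

```
gst-launch-1.0 -v ximagesrc use-damage=false ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay ! udpsink host=127.0.0.1 port=5000
```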
### To receive AV via RTP:

```
gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
```
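The same pattern works for audio. A hedged sketch using raw L16 audio (the element names `rtpL16pay`/`rtpL16depay` are standard; the exact caps here are my assumption):

```
gst-launch-1.0 audiotestsrc ! audioconvert ! audio/x-raw,format=S16BE,channels=2,rate=44100 ! rtpL16pay ! udpsink host=127.0.0.1 port=5001
```

And to receive it:

```
gst-launch-1.0 udpsrc port=5001 caps="application/x-rtp, media=(string)audio, clock-rate=(int)44100, encoding-name=(string)L16, channels=(int)2" ! rtpL16depay ! audioconvert ! autoaudiosink
```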
38 srt.md (new file)
@@ -0,0 +1,38 @@
# SRT (GStreamer command-line cheat sheet)

SRT has a client-server relationship. One side must be the server, the other the client.
The server can either be the side that is sending the audio/video (pushing) or the side that is
receiving (pulling). The server must have started before the client.

NOTE: THIS IS FOR GSTREAMER 1.14; IT HAS CHANGED IN 1.16.
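(As far as I know, 1.16 merged the client/server variants into unified `srtsrc` and `srtsink` elements, so under 1.16+ the receive commands below would look more like this sketch:)

```
gst-launch-1.0 -v srtsrc uri="srt://127.0.0.1:8888" ! decodebin ! autovideosink
```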
## Server sending the AV

Create a sender server like this:

```
gst-launch-1.0 videotestsrc ! video/x-raw, height=360, width=640 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtserversink uri=srt://127.0.0.1:8888/
```

And create a receiver client like this:

```
gst-launch-1.0 -v srtclientsrc uri="srt://127.0.0.1:8888" ! decodebin ! autovideosink
```

## Server receiving the AV

To have the server receiving, rather than sending, swap 'srtclientsrc' for 'srtserversrc'.
Likewise, to have the client sending rather than receiving, swap 'srtserversink' for 'srtclientsink'.

Create a receiver server like this:

```
gst-launch-1.0 -v srtserversrc uri="srt://127.0.0.1:8888" ! decodebin ! autovideosink
```

And a sender client like this:

```
gst-launch-1.0 videotestsrc ! video/x-raw, height=360, width=640 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtclientsink uri="srt://127.0.0.1:8888/"
```
@@ -21,6 +21,12 @@ gst-launch-1.0 -e videotestsrc pattern=ball ! \
    mp4mux name=mux ! filesink location=file.mp4
```

```
gst-launch-1.0 -e videotestsrc pattern=ball ! \
    video/x-raw,width=640,height=320 ! \
    x264enc ! mux. \
    audiotestsrc is-live=true freq=200 ! audioconvert ! audioresample ! faac bitrate=32000 ! mux. \
    mp4mux name=mux ! filesink location=file.mp4
```

### Other examples