and [gst\_element\_get\_contexts()](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstElement.html#gst-element-get-contexts). GstContexts are used to share context-specific configuration objects
between elements and can also be used by applications to set context-specific
configuration objects on elements, e.g. for OpenGL or Hardware-accelerated
video decoding.
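For illustration, a minimal sketch of how an application could set such a context on an element and later query it back; the context type "my.app.context" and its fields are purely hypothetical, real consumers such as the OpenGL elements define their own context types:

```c
#include <gst/gst.h>

/* Sketch only: "my.app.context" and its fields are made up for illustration;
 * real consumers (e.g. the OpenGL elements) define their own context types. */
static void
share_app_context (GstElement * element)
{
  GstContext *context = gst_context_new ("my.app.context", TRUE /* persistent */);
  GstStructure *s = gst_context_writable_structure (context);

  gst_structure_set (s, "app-id", G_TYPE_STRING, "example", NULL);
  gst_element_set_context (element, context);
  gst_context_unref (context);
}

static void
inspect_app_context (GstElement * element)
{
  /* new in 1.8: query a context that was set on or acquired by an element */
  GstContext *context = gst_element_get_context (element, "my.app.context");

  if (context != NULL) {
    const GstStructure *s = gst_context_get_structure (context);

    GST_INFO ("element has context: %" GST_PTR_FORMAT, s);
    gst_context_unref (context);
  }
}
```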
- New [GST\_BUFFER\_DTS\_OR\_PTS()](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstBuffer.html#GST-BUFFER-DTS-OR-PTS:CAPS)
convenience macro that returns the decode timestamp if one is set and
otherwise returns the presentation timestamp
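A small sketch of the kind of helper this simplifies, for example when computing running times for queue levels (the helper function itself is hypothetical):

```c
#include <gst/gst.h>

/* Hypothetical helper: pick the timestamp to use for level/running-time
 * calculations. GST_BUFFER_DTS_OR_PTS() returns the DTS if it is valid
 * (typical for encoded, reordered streams) and the PTS otherwise. */
static GstClockTime
buffer_running_time (GstBuffer * buffer, const GstSegment * segment)
{
  GstClockTime ts = GST_BUFFER_DTS_OR_PTS (buffer);

  if (!GST_CLOCK_TIME_IS_VALID (ts))
    return GST_CLOCK_TIME_NONE;

  return gst_segment_to_running_time (segment, GST_FORMAT_TIME, ts);
}
```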
- New GstPadEventFullFunc that returns a GstFlowReturn instead of a gboolean.
This new API is mostly for internal use and was added to fix a race condition
where occasionally internal flow error messages were posted on the bus when
sticky events were propagated at just the wrong moment whilst the pipeline
was shutting down. This happened primarily when the pipeline was shut down
immediately after starting it up. GStreamer would not know that the reason
the events could not be propagated was because the pipeline was shutting down
and not some other problem, and now the flow error allows GStreamer to know
the reason for the failure (and that there's no reason to post an error
message). This is particularly useful for queue-like elements which may need
to asynchronously propagate a previous flow return from downstream.
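As a rough sketch, an element-internal event function using the new signature could look like this; the flushing check is just a stand-in for whatever shutdown state a real element tracks:

```c
#include <gst/gst.h>

/* Sketch: with GstPadEventFullFunc the element can report *why* an event was
 * not handled instead of just returning FALSE. The flushing check here is a
 * stand-in for whatever state a real element tracks. */
static GstFlowReturn
my_sink_event_full (GstPad * pad, GstObject * parent, GstEvent * event)
{
  if (GST_PAD_IS_FLUSHING (pad)) {
    gst_event_unref (event);
    return GST_FLOW_FLUSHING;   /* shutting down, not an error */
  }

  if (!gst_pad_event_default (pad, parent, event))
    return GST_FLOW_ERROR;

  return GST_FLOW_OK;
}

/* registered in the element's init function with:
 *   gst_pad_set_event_full_function (pad, my_sink_event_full);
 */
```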
- Pipeline dumps in the form of "dot files" now also show pad properties that
differ from their default value, just as they already do for element
properties. This is useful for elements with pad subclasses that provide
additional properties, e.g. videomixer or compositor.
- Pad probes are now guaranteed to be called in the order they were added
(before they were called in reverse order, but no particular order was
documented or guaranteed)
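For example (sketch), two buffer probes added one after the other are now guaranteed to fire in that same order:

```c
#include <gst/gst.h>

static GstPadProbeReturn
log_probe (GstPad * pad, GstPadProbeInfo * info, gpointer user_data)
{
  GST_DEBUG ("probe '%s' called", (const gchar *) user_data);
  return GST_PAD_PROBE_OK;
}

static void
add_probes (GstPad * pad)
{
  /* "first" is now guaranteed to be called before "second" */
  gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER, log_probe,
      (gpointer) "first", NULL);
  gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER, log_probe,
      (gpointer) "second", NULL);
}
```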
- Plugins can now declare dependencies on device nodes (not just regular files)
and can also use a prefix filter. This is useful for plugins that expose
features (elements) based on the devices that are available, as the video4linux
plugin does with video decoders on certain embedded systems.
- gst\_segment\_to\_position() has been deprecated and replaced by the
better-named gst\_segment\_position\_from\_running\_time(). At the same time
gst\_segment\_position\_from\_stream\_time() was added, as well as \_full()
variants of both to deal with negative stream time.
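A minimal sketch of the replacement API in use (the wrapper function is hypothetical):

```c
#include <gst/gst.h>

/* Hypothetical helper: map a running time back to a stream position,
 * replacing the deprecated gst_segment_to_position(). The _full() variant
 * additionally reports a sign for running times that map to negative
 * stream time. */
static GstClockTime
position_for_running_time (const GstSegment * segment, GstClockTime running_time)
{
  return gst_segment_position_from_running_time (segment, GST_FORMAT_TIME,
      running_time);
}
```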
- GstController: the interpolation control source gained a new monotonic cubic
interpolation mode that, unlike the existing cubic mode, will never overshoot
the min/max y values set.
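A sketch of selecting the new mode, here assuming a hypothetical element with a double-typed "volume" property that is animated between two keyframes:

```c
#include <gst/gst.h>
#include <gst/controller/gstinterpolationcontrolsource.h>
#include <gst/controller/gstdirectcontrolbinding.h>

/* Sketch: animate a (hypothetical) "volume" property using the new
 * monotonic cubic mode, which never overshoots the keyframe values. */
static void
attach_volume_curve (GstElement * element)
{
  GstControlSource *cs = gst_interpolation_control_source_new ();

  g_object_set (cs, "mode", GST_INTERPOLATION_MODE_CUBIC_MONOTONIC, NULL);
  gst_timed_value_control_source_set (GST_TIMED_VALUE_CONTROL_SOURCE (cs),
      0 * GST_SECOND, 0.0);
  gst_timed_value_control_source_set (GST_TIMED_VALUE_CONTROL_SOURCE (cs),
      5 * GST_SECOND, 1.0);

  gst_object_add_control_binding (GST_OBJECT (element),
      gst_direct_control_binding_new (GST_OBJECT (element), "volume", cs));
  gst_object_unref (cs);
}
```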
- GstNetAddressMeta: can now be read from buffers in language bindings as well,
via the new gst\_buffer\_get\_net\_address\_meta() function
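A minimal sketch of reading the meta from a buffer, assuming buffers that actually carry it (e.g. buffers produced by udpsrc):

```c
#include <gst/gst.h>
#include <gst/net/gstnetaddressmeta.h>

/* Sketch: find out which peer a buffer came from. */
static void
print_sender (GstBuffer * buffer)
{
  GstNetAddressMeta *meta = gst_buffer_get_net_address_meta (buffer);

  if (meta != NULL && G_IS_INET_SOCKET_ADDRESS (meta->addr)) {
    GInetAddress *ia =
        g_inet_socket_address_get_address (G_INET_SOCKET_ADDRESS (meta->addr));
    gchar *s = g_inet_address_to_string (ia);

    g_print ("buffer received from %s\n", s);
    g_free (s);
  }
}
```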
- ID3 tag PRIV frames are now extracted into a new GST\_TAG\_PRIVATE\_DATA tag
- gst-launch-1.0 and gst\_parse\_launch() now warn in the most common case if
a dynamic pad link could not be resolved, instead of just silently
waiting to see if a suitable pad appears later, which is often perceived
by users as hanging -- they are now notified when this happens and can check
their pipeline.
- GstRTSPConnection now also parses custom RTSP message headers and retains
them for the application instead of just ignoring them
- rtspsrc handling of authentication over tunneled connections (e.g. RTSP over HTTP)
was fixed
- gst\_video\_convert\_sample() now crops if there is a crop meta on the input buffer
- The debugging system printf functions are now exposed for general use, which
supports special printf format specifiers such as GST\_PTR\_FORMAT and
GST\_SEGMENT\_FORMAT to print GStreamer-related objects. This is handy for
systems that want to prepare some debug log information to be output at a
later point in time. The GStreamer-OpenGL subsystem is making use of these
new functions, which are [gst\_info\_vasprintf()][gst_info_vasprintf],
[gst\_info\_strdup\_vprintf()][gst_info_strdup_vprintf] and
[gst\_info\_strdup\_printf()][gst_info_strdup_printf].
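As a sketch, such a pre-rendered message could be produced like this (the helper function is hypothetical):

```c
#include <gst/gst.h>

/* Hypothetical helper: render a debug string now, log or store it later.
 * GST_PTR_FORMAT and GST_SEGMENT_FORMAT are handled by these functions. */
static gchar *
describe_state (GstCaps * caps, const GstSegment * segment)
{
  return gst_info_strdup_printf ("negotiated %" GST_PTR_FORMAT
      " in segment %" GST_SEGMENT_FORMAT, caps, segment);
}
```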
### New elements
- [netsim](): a new (resurrected) element to simulate network jitter and
packet dropping / duplication.
- New VP9 RTP payloader/depayloader elements: rtpvp9pay/rtpvp9depay
- New [videoframe_audiolevel]() element that measures audio levels synchronized to video frames
- New spandsp-based tone generator source
- New NVIDIA NVENC-based H.264 encoder for GPU-accelerated video encoding on
suitable NVIDIA hardware
- [rtspclientsink](), a new RTSP RECORD sink element, was added to gst-rtsp-server
- [alsamidisrc](), a new ALSA MIDI sequencer source element
### Noteworthy element features and additions
- *identity*: new ["drop-buffer-flags"](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-identity.html#GstIdentity--drop-buffer-flags)
property to drop buffers based on buffer flags. This can be used to drop all
non-keyframe buffers, for example.
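For example (sketch), keeping only keyframes amounts to dropping every buffer flagged as a delta unit:

```c
#include <gst/gst.h>

/* Sketch: drop all non-keyframe buffers by dropping everything that carries
 * the DELTA_UNIT flag. */
static void
keep_only_keyframes (GstElement * identity)
{
  g_object_set (identity, "drop-buffer-flags", GST_BUFFER_FLAG_DELTA_UNIT, NULL);
}
```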
- *multiqueue*: various fixes and improvements, in particular special handling
for sparse streams such as subtitle streams, to make sure we don't overread
them any more. For sparse streams it can be normal that there's no buffer for
a long period of time, so having no buffer queued is perfectly normal. Before
we would often unnecessarily try to fill the subtitle stream queue, which
could lead to much more data being queued in multiqueue than necessary.
- *multiqueue*/*queue*: When dealing with time limits, these elements now use the
new ["GST_BUFFER_DTS_OR_PTS"](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstBuffer.html#GST-BUFFER-DTS-OR-PTS:CAPS)
and ["gst_segment_to_running_time_full()"](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/GstSegment.html#gst-segment-to-running-time-full)
API, resulting in more accurate levels, especially when dealing with non-raw
streams (where reordering happens, and we want to use the increasing DTS as
opposed to the non-continuously increasing PTS) and out-of-segment input/output.
Previously all encoded buffers before the segment start, which can happen when
doing ACCURATE seeks, were not taken into account in the queue level calculation.
- *multiqueue*: New ["use-interleave"](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-multiqueue.html#GstMultiQueue--use-interleave)
property which allows the size of the queues to be optimized based on the
interleave of the input streams. This should only be used with input streams
that are properly timestamped. It will be used by the upcoming decodebin3 element.
- *queue2*: new ["avg-in-rate"](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer-plugins/html/gstreamer-plugins-queue2.html#GstQueue2--avg-in-rate)
property that reports the average input rate in bytes per second
- audiotestsrc now supports all audio formats and is no longer artificially
limited with regard to the number of channels or sample rate
- gst-libav (ffmpeg codec wrapper): map and enable JPEG2000 decoder
- multisocketsink can, on request, send a custom GstNetworkMessage event
upstream whenever data is received from a client on a socket. Similarly,
socketsrc will, on request, pick up GstNetworkMessage events from downstream
and send any data contained within them via the socket. This allows for
simple bidirectional communication.
- matroska muxer and demuxer now support the ProRes video format
- Improved VP8/VP9 decoding performance on multi-core systems by enabling
multi-threaded decoding in the libvpx-based decoders on such systems
- appsink has a new ["wait-on-eos"](https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-appsink.html#GstAppSink--wait-on-eos)
property. In cases where it is uncertain whether the appsink will still have a
consumer for its buffers when it receives an EOS event, this can be set to
FALSE to ensure that the appsink does not hang.
- rtph264pay and rtph265pay have a new "config-interval" mode -1 that will
re-send the setup data (SPS/PPS/VPS) before every keyframe to ensure
optimal coverage and the shortest possible start-up time for new clients
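A minimal sketch of enabling this mode on a payloader instance:

```c
#include <gst/gst.h>

/* Sketch: -1 means "re-send SPS/PPS (and VPS for H.265) before every
 * keyframe" rather than at a fixed interval in seconds. */
static void
enable_per_keyframe_config (GstElement * payloader)
{
  g_object_set (payloader, "config-interval", -1, NULL);
}
```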
- mpegtsmux can now mux H.265/HEVC video as well
- The MXF muxer was ported to 1.x and now produces more standards-conformant
files that can be handled by a wider range of other software; the MXF demuxer
got improved support for seek tables (IndexTableSegments).
### Plugin moves
- The rtph265pay/depay RTP payloader/depayloader elements for H.265/HEVC video
from the rtph265 plugin in -bad have been moved into the existing rtp plugin
in gst-plugins-good.
- The mpg123 plugin containing a libmpg123 based audio decoder element has
been moved from -bad to -ugly.
- The Opus encoder/decoder elements have been moved to gst-plugins-base and
the RTP payloader to gst-plugins-good, both coming from gst-plugins-bad.
### New tracing tools for developers
A new tracing subsystem API has been added to GStreamer, which provides
external tracers with the possibility to strategically hook into GStreamer
internals and collect data that can be evaluated later. These tracers are a
new type of plugin feature, and GStreamer core ships with a few example
tracers (latency, stats, rusage, log) to start with. Tracers can be loaded
and configured at start-up via the GST\_TRACERS environment variable.
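For example, a minimal sketch of enabling the latency tracer from application code by setting the variables before gst\_init() (exporting them in the shell before running gst-launch-1.0 works just as well); the chosen tracer and debug level are only examples:

```c
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  /* Sketch: enable the latency tracer; its output appears in the debug log
   * under the GST_TRACER category. */
  g_setenv ("GST_TRACERS", "latency", TRUE);
  g_setenv ("GST_DEBUG", "GST_TRACER:7", TRUE);

  gst_init (&argc, &argv);

  /* ... build and run a pipeline as usual ... */

  return 0;
}
```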
Background: While GStreamer provides plenty of data on what's going on in a
pipeline via its debug log, that data is not necessarily structured enough to
be generally useful, and the overhead to enable logging output for all data
required might be too high in many cases. The new tracing system allows tracers
to just obtain the data needed at the right spot with as little overhead as
possible, which will be particularly useful on embedded systems.
Of course it has always been possible to do performance benchmarks and debug
memory leaks, memory consumption and invalid memory access using standard
operating system tools, but there are some things that are difficult to track
with the standard tools, and the new tracing system helps with that. Examples
are things such as latency handling, buffer flow, ownership transfer of
events and buffers from element to element, caps negotiation, etc.
For some background on the new tracing system, watch Stefan Sauer's
GStreamer Conference talk ["A new tracing subsystem for GStreamer"][tracer-0]
and for a more specific example of how it can be useful have a look at
Thiago Santos's lightning talk ["Analyzing caps negotiation using GstTracer"][tracer-1]
and his ["GstTracer experiments"][tracer-2] blog post. There was also a Google
Summer of Code project in 2015 that used the tracing system for a graphical