Since we started depending on GLib 2.44, we can be sure this macro is
defined (it will be a no-op on compilers that don't support it). For
plugins we should just start using `G_DECLARE_FINAL_TYPE` which means we
no longer need the macro there, but for most types in base/gst-libs we
don't want to break ABI, which means it's better to just keep it like it
is (and use the `#ifdef` instead).
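For illustration, a minimal header-side sketch of that pattern for a
hypothetical MyElement plugin type (not a type from the tree):

```c
/* my-element.h -- hypothetical type, shown only for illustration.
 * G_DECLARE_FINAL_TYPE (GLib >= 2.44) declares the MyElement typedef,
 * the MY_ELEMENT()/MY_IS_ELEMENT() cast/check macros,
 * my_element_get_type() and the g_autoptr cleanup function, so the
 * hand-written versions of those (and any #ifdef guards around them)
 * can be dropped in plugins. */
#include <gst/gst.h>

G_BEGIN_DECLS

#define MY_TYPE_ELEMENT (my_element_get_type ())
G_DECLARE_FINAL_TYPE (MyElement, my_element, MY, ELEMENT, GstElement)

G_END_DECLS
```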
Removing gstalsadeviceprobe.[ch] as it was a relic from the 0.10
era.
This doesn't implement device monitoring, only probing; monitoring
should be implemented in its own commit.
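For context, a minimal application-side sketch of the difference
between one-shot probing and monitoring, using the core
GstDeviceMonitor API (this is not the code added here, just an
illustration of the two concepts):

```c
#include <gst/gst.h>

static void
probe_then_monitor (void)
{
  GstDeviceMonitor *monitor = gst_device_monitor_new ();
  GList *devices, *l;

  gst_device_monitor_add_filter (monitor, "Audio/Sink", NULL);

  /* Probing: a one-shot snapshot of the devices available right now. */
  devices = gst_device_monitor_get_devices (monitor);
  for (l = devices; l != NULL; l = l->next) {
    gchar *name = gst_device_get_display_name (GST_DEVICE (l->data));
    g_print ("found device: %s\n", name);
    g_free (name);
  }
  g_list_free_full (devices, (GDestroyNotify) gst_object_unref);

  /* Monitoring: after start(), the monitor posts DEVICE_ADDED /
   * DEVICE_REMOVED bus messages as devices come and go. */
  gst_device_monitor_start (monitor);
}
```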
This means we can use some newer features and get rid of some boilerplate code using the G_DECLARE_* macros.
As discussed on IRC, 2.44 is old enough by now to start depending on it.
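As a rough sketch of the boilerplate reduction, the .c side of a
hypothetical MyElement type declared with G_DECLARE_FINAL_TYPE only
needs the instance struct and G_DEFINE_TYPE:

```c
/* my-element.c -- hypothetical example; with the final-type pattern
 * the instance struct stays private to the .c file, and no class
 * struct or get_type() boilerplate has to be written by hand. */
#include "my-element.h"

struct _MyElement
{
  GstElement parent;
  /* instance fields go here */
};

G_DEFINE_TYPE (MyElement, my_element, GST_TYPE_ELEMENT)

static void
my_element_class_init (MyElementClass * klass)
{
}

static void
my_element_init (MyElement * self)
{
}
```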
The problem is that GObject Introspection does not understand the
`const gfloat matrix[16]` parameter as an array of 16 gfloats, but as
just one gfloat. To fix this, I added the appropriate annotation to
the parameter descriptions.
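A sketch of the kind of annotation involved, shown on a hypothetical
function and assuming the standard (array fixed-size=16) gtk-doc
annotation (the actual functions touched are not reproduced here):

```c
/* Hypothetical function, shown only to illustrate the annotation. */

/**
 * my_transform_set_matrix:
 * @self: a #MyTransform
 * @matrix: (array fixed-size=16): a 4x4 transformation matrix
 *
 * The (array fixed-size=16) annotation tells g-ir-scanner that the
 * `const gfloat matrix[16]` parameter is an array of 16 floats rather
 * than a single gfloat.
 */
void my_transform_set_matrix (MyTransform * self, const gfloat matrix[16]);
```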
This came up in the case where v4l2 sets caps with colorimetry=NULL, and
then tries to parse back the colorimetry, causing a crash in
gst_video_get_colorimetry() because of g_str_equal(). We fix this by
making sure the only caller of the function never calls it with a null
colorimetry string.
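A minimal sketch of the kind of guard this implies, using the public
video library API (the actual caller is not reproduced here):

```c
#include <gst/video/video.h>

/* Only hand the "colorimetry" field to the parser when it is actually
 * present in the caps; otherwise fall back to a default instead of
 * letting a NULL string reach the string comparisons. */
static void
parse_colorimetry_field (const GstStructure * structure, GstVideoInfo * info)
{
  const gchar *s = gst_structure_get_string (structure, "colorimetry");

  if (s == NULL || !gst_video_colorimetry_from_string (&info->colorimetry, s))
    gst_video_colorimetry_from_string (&info->colorimetry,
        GST_VIDEO_COLORIMETRY_BT709);
}
```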
SMPTE ST 2084 transfer characteristics (a.k.a. ITU-R BT.2100-1
perceptual quantization, PQ) are used by various HDR standards.
With ST 2084 we can represent BT.2100 (Rec. 2100). BT.2100 defines
various aspects of HDR such as resolution, transfer functions, matrix
and primaries. It uses the BT.2020 color space (primaries and matrix)
with the PQ or HLG transfer functions.
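For reference, a minimal sketch of how BT.2100 PQ content can be
described with the existing GstVideoColorimetry API (enum names taken
from the video library; this is an illustration, not the code changed
here):

```c
#include <gst/video/video.h>

/* BT.2020 primaries and matrix combined with the SMPTE ST 2084 (PQ)
 * transfer function, serialized to a caps colorimetry string. */
static gchar *
make_bt2100_pq_colorimetry_string (void)
{
  GstVideoColorimetry cinfo;

  cinfo.range = GST_VIDEO_COLOR_RANGE_16_235;
  cinfo.matrix = GST_VIDEO_COLOR_MATRIX_BT2020;
  cinfo.transfer = GST_VIDEO_TRANSFER_SMPTE2084;
  cinfo.primaries = GST_VIDEO_COLOR_PRIMARIES_BT2020;

  return gst_video_colorimetry_to_string (&cinfo);
}
```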
Protocols that are in the stream_uris list should always be treated
as streams, no matter how they respond to the scheduling query. The
flag in the scheduling query is just another way to declare something
that needs buffering without the whitelist; the absence of the flag
shouldn't make us ignore our known protocol list.
Also, always set is_stream to a boolean rather than a mask value.
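A rough sketch of the combined check (hypothetical helper, not the
element's actual code):

```c
#include <gst/gst.h>

/* A URI source is treated as a stream if its protocol is on the known
 * stream-protocols whitelist OR if it reports bandwidth-limited
 * scheduling; the query flag never overrides the whitelist, and the
 * result is a proper boolean rather than a mask value. */
static gboolean
source_is_stream (gboolean protocol_is_whitelisted,
    GstQuery * scheduling_query)
{
  GstSchedulingFlags flags;
  gint minsize, maxsize, align;

  gst_query_parse_scheduling (scheduling_query, &flags, &minsize, &maxsize,
      &align);

  return protocol_is_whitelisted
      || ((flags & GST_SCHEDULING_FLAG_BANDWIDTH_LIMITED) != 0);
}
```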
The code for this is mostly lifted from audiobuffersplit. It allows
use cases such as keeping the buffers output by a compositor on one
branch and an audiomixer on another perfectly aligned, by requiring
the compositor to output an n/d frame rate and setting
output-buffer-duration to d/n on the audiomixer.
The old output-buffer-duration property now simply maps to its
fractional counterpart; the last property set wins.
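A minimal usage sketch; the property name
"output-buffer-duration-fraction" is an assumption based on the
description above:

```c
#include <gst/gst.h>

/* Align an audiomixer branch with a compositor running at 30000/1001
 * fps by requesting 1001/30000-second output buffers. The property
 * name used here is assumed, not taken from the actual change. */
static GstElement *
make_aligned_audiomixer (void)
{
  GstElement *mixer = gst_element_factory_make ("audiomixer", NULL);

  gst_util_set_object_arg (G_OBJECT (mixer),
      "output-buffer-duration-fraction", "1001/30000");

  return mixer;
}
```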
If the last WebVTT cue does not have an empty line after it, or if it
does not end with a newline at all, it does not get pushed out and it
won't be displayed.
gst_sub_parse_sink_event() already handles this issue for other
subtitle formats; enable the same handling for
GST_SUB_PARSE_FORMAT_VTT too.
While at it, also add a test for this case.
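For illustration, a minimal WebVTT input that triggers the case
described above: the final cue is not followed by an empty line (and
need not even end with a newline), so it was previously never pushed
out:

```
WEBVTT

00:00:00.000 --> 00:00:02.000
First cue, followed by a blank line as usual.

00:00:02.000 --> 00:00:04.000
Last cue with no trailing blank line or newline
```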
Packed format with 10 bits for each of the R, G and B channels and a
2-bit alpha channel in the most significant bits.
This format maps to Windows' DXGI_FORMAT_R10G10B10A2_UNORM format,
which is required for 10-bit HDR rendering.
Note that this RGB10A2_LE format is the R/B channel-swapped version
of BGR10A2_LE.
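A minimal caps sketch for the new format; "RGB10A2_LE" is the caps
format string corresponding to the description above:

```c
#include <gst/video/video.h>

/* Caps describing a packed 10-bit RGB frame with 2-bit alpha. */
static GstCaps *
make_rgb10a2_caps (void)
{
  return gst_caps_new_simple ("video/x-raw",
      "format", G_TYPE_STRING, "RGB10A2_LE",
      "width", G_TYPE_INT, 3840, "height", G_TYPE_INT, 2160, NULL);
}
```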