avwait can now be configured to stop when a given timecode has been
reached. It will start at the timecode indicated with start-timecode and
end at the timecode indicated with end-timecode. If end-timecode is
NULL (default), the previous functionality is preserved: keep going
and never end.
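A minimal sketch of configuring this from Python (assuming the
GStreamer Python overrides; the timecode values are illustrative):

  import gi
  gi.require_version('Gst', '1.0')
  gi.require_version('GstVideo', '1.0')
  from gi.repository import Gst, GstVideo

  Gst.init(None)
  avwait = Gst.ElementFactory.make('avwait', None)
  # Run from 01:00:00:00 to 01:00:10:00 at 25 fps
  start = GstVideo.VideoTimeCode.new(25, 1, None,
                                     GstVideo.VideoTimeCodeFlags.NONE,
                                     1, 0, 0, 0, 0)
  end = GstVideo.VideoTimeCode.new(25, 1, None,
                                   GstVideo.VideoTimeCodeFlags.NONE,
                                   1, 0, 10, 0, 0)
  avwait.set_property('start-timecode', start)
  avwait.set_property('end-timecode', end)  # leave unset to never end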
https://bugzilla.gnome.org/show_bug.cgi?id=789403
* Avoid copying the pending data and instead create a buffer directly from
that data with the appropriate offset.
* Locate the jp2k magic to determine the exact location of the (first) frame
data instead of assuming that the header is of an expected size.
https://bugzilla.gnome.org/show_bug.cgi?id=786111
The jp2k specification (ITU-T T.800) specifies that the 'brat' box
has two fields and the second one (AUF2) can be set to 0 for progressive
streams.
The problem is that the mpeg-ts specification (ITU-T H.222.0 06/2012)
says that the AUF2 field is only present if the stream is interlaced.
To cope with both situations, accept the next 32 bits if the stream is
marked as progressive and those bits contain 0.
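A rough Python sketch of the resulting check (big-endian fields as in
the box format; the function name is illustrative):

  import struct

  def parse_brat(payload, interlaced):
      # AUF1 is always present
      (auf1,) = struct.unpack_from('>I', payload, 0)
      auf2 = None
      if interlaced:
          (auf2,) = struct.unpack_from('>I', payload, 4)
      elif len(payload) >= 8:
          (candidate,) = struct.unpack_from('>I', payload, 4)
          # Progressive streams may still carry AUF2, but only as 0
          if candidate == 0:
              auf2 = candidate
      return auf1, auf2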
https://bugzilla.gnome.org/show_bug.cgi?id=786111
Crossfading is a bit more complex than just having two pads with the
right keyframes, as the blending is not exactly the same.
The difference is in the way we compute the alpha channel: in the case
of crossfading, we have to compute an additive operation between the
destination and the source (factored by both the input pad's alpha
property and the crossfading ratio), basically so that the crossfade
result of 2 opaque frames is also fully opaque at any point in the
crossfading process, avoiding bleed-through from the layer blending.
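As a worked example, the per-pixel alpha under this scheme could look
like the following sketch (a plain linear crossfade; the real code
also factors in each pad's alpha property):

  def crossfade_alpha(src_alpha, dst_alpha, ratio):
      # Additive blend of source and destination alphas, weighted by
      # the crossfading ratio; two opaque inputs stay opaque for any
      # ratio in [0, 1]: 1.0 * r + 1.0 * (1 - r) == 1.0
      return min(1.0, src_alpha * ratio + dst_alpha * (1.0 - ratio))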
Some rationale can be found in https://phabricator.freedesktop.org/T7773.
https://bugzilla.gnome.org/show_bug.cgi?id=784827
These elements allow splitting a pipeline across several processes,
with communication done by the ipcpipelinesink and ipcpipelinesrc
elements. The main use case is to split a playback pipeline into
one process that runs the networking, parser and demuxer, and another
process that runs the decoder and sink, for security reasons.
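A hedged sketch of the master side in Python, assuming fdin/fdout
properties on the ipcpipeline elements (the slave process would mirror
this with ipcpipelinesrc; fd inheritance is left out for brevity):

  import socket
  import gi
  gi.require_version('Gst', '1.0')
  from gi.repository import Gst

  Gst.init(None)
  # Bidirectional channel between the two processes
  master_sock, slave_sock = socket.socketpair()

  # Master process: networking + parsing/demuxing, no decoding
  master = Gst.parse_launch(
      'souphttpsrc location=http://example.com/a.ts ! tsdemux ! '
      'ipcpipelinesink name=sink')
  sink = master.get_by_name('sink')
  sink.set_property('fdin', master_sock.fileno())
  sink.set_property('fdout', master_sock.fileno())
  master.set_state(Gst.State.PLAYING)
  # Slave process side (sketch):
  #   ipcpipelinesrc fdin=... fdout=... ! decodebin ! autovideosink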
https://bugzilla.gnome.org/show_bug.cgi?id=752214
This allows us to know exactly where in the material track we are, and
how to convert from a PTS for a source track to the actual PTS of the
material track (i.e. by adding the component start position).
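In other words, the mapping is a simple offset (sketch):

  def material_pts(source_pts, component_start):
      # Position in the material track = position in the source track
      # plus the start position of the component that references it
      return source_pts + component_start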
https://bugzilla.gnome.org/show_bug.cgi?id=785119
While the size in the packet is only 16 bits, we need to handle bigger
sizes without overflowing. For video streams this can happen; in that
case, 0 is written to the stream instead.
This fixes muxing of buffers >= 2**16.
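A sketch of the resulting rule, relying on the PES convention that a
packet_length of 0 means "unbounded":

  PES_MAX_PACKET_LENGTH = (1 << 16) - 1

  def pes_packet_length(payload_len):
      # A value of 0 means "unbounded", which the PES spec only
      # permits for video elementary streams
      if payload_len > PES_MAX_PACKET_LENGTH:
          return 0
      return payload_len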
In this case, we assume that the format is jpc, and we infer the color
space from the number of components. This allows the parser to process a
jpc disk file coming from a filesrc element.
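The inference could look roughly like this (the exact mapping used by
the parser is an assumption here):

  def guess_colorspace(num_components):
      # No jp2 header to tell us the colour space, so guess from
      # the component count of the jpc codestream
      if num_components == 1:
          return 'GRAYSCALE'
      if num_components == 3:
          return 'sYUV'
      return None  # leave undetermined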
https://bugzilla.gnome.org/show_bug.cgi?id=783291
It is only relevant in deciding whether or not to send SEGMENT_DONE.
In this case, not detecting EOS leads to a busy loop when encountering
the originally recorded end-of-file of a file that is still growing.
While only filler packets should be allowed, for good measure also skip
any other KLV packets in the range where there could be index table
segments.
This fixes parsing of partitions with multiple index table segments,
which are separated by a filler packet, or other packets.
This is needed to know the PTS; without it we only know the DTS, and
using that also for the PTS is wrong unless we have an intra-only codec.
If we can't get the temporal reordering from the index table, don't set
any PTS for non-intra-only codecs and let decoders figure out something.
https://bugzilla.gnome.org/show_bug.cgi?id=784027
Retrieving the `mxfdemux.structure` property leads to an assertion,
as the metadata needs to be resolved for the call to
mxf_metadata_base_to_structure to be valid.
The RSIZ capabilities tag stores the JPEG 2000 profile. In the case of
broadcast profiles, it also stores the broadcast main level, which
specifies the bit rate.
https://bugzilla.gnome.org/show_bug.cgi?id=782337
Also swap the linktype after we have detected that we need to do
byteswapping. This fixes a problem with reading pcap files generated
on a machine with a different endianness.
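A minimal sketch of the detection, using the standard pcap magic
number:

  import struct

  def read_linktype(data):
      (magic,) = struct.unpack_from('<I', data, 0)
      # Reading 0xd4c3b2a1 means the file was written with the
      # opposite byte order: swap everything, linktype included
      endian = '<' if magic == 0xa1b2c3d4 else '>'
      (magic, vmaj, vmin, thiszone, sigfigs,
       snaplen, linktype) = struct.unpack_from(endian + 'IHHiIII', data, 0)
      return linktype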
When the caps change while streaming, the new caps were getting
processed immediately in videoaggregator, but the next buffer in the
queue that corresponds to the new caps was not necessarily used
immediately, which sometimes resulted in using an old buffer with the
new caps. There used to be a separate buffer_vinfo for mapping a
buffer with its own caps, but in compositor the GstVideoConverter was
still using the wrong info, which resulted in invalid reads and
corrupt output.
The approach here is safer: we delay using the new caps until we
actually select the next buffer in the queue for use. This way we
also eliminate the need for buffer_vinfo, since pad->info is always
in sync with the format of the selected buffer.
https://bugzilla.gnome.org/show_bug.cgi?id=780682
When there are more than 64 channels, we don't want to exceed the
bounds of the ordering_map buffer, and in these cases we don't want to
remap at all. Here we avoid doing that.
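Conceptually (a sketch; names are illustrative):

  MAX_CHANNELS = 64  # capacity of the ordering_map table

  def maybe_remap(channels, ordering_map, samples):
      if channels > MAX_CHANNELS:
          # Would index past the end of ordering_map: skip remapping
          return samples
      return [samples[ordering_map[c]] for c in range(channels)]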
Based on a patch originally for plugins-good/interleave in
https://bugzilla.gnome.org/show_bug.cgi?id=780331
This duplicated property is no longer needed as there is now API to
allow bindings to access GST_TYPE_ARRAY (see
gst_util_get/set/object_array). Additionally, Python has proper
overrides which make this look like Python. A 2x2 matrix would be set
this way:

  matrix = Gst.ValueArray([Gst.ValueArray([1.0, -1.0]),
                           Gst.ValueArray([1.0, -1.0])])

Notice that you need to "cast" each array to Gst.ValueArray, otherwise
there is an ambiguity between the Gst.ValueArray and Gst.ValueList
list types. Fortunately, Gst.ValueArray implements the Sequence
interface, so it can be indexed like a normal Python matrix.
Insert an AU delimiter by default if it is missing from upstream.
This should be done only in case of byte-stream format.
Note that:
We have to compensate for the new bytes added for the AU delimiter,
otherwise insertion of PPS/SPS will use wrong offsets and overwrite the
wrong data.
Also mark the AU delimiter blob const, and use frame->out_buffer for
storing the output to keep baseparse assumptions valid.
Original-Patch-By: Michal Lazo <michal.lazo@mdragon.org>
Helped by Sebastian Dröge <sebastian@centricular.com>
https://bugzilla.gnome.org/show_bug.cgi?id=736213
Those are the rules:
In the SPS:
* if frame_mbs_only_flag=1 => all frames are progressive
* if frame_mbs_only_flag=0 => field_pic_flag defines if each frame is
progressive or interlaced, thus the mode is 'mixed' in GStreamer
terms.
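Expressed as a sketch:

  def interlace_mode(frame_mbs_only_flag):
      # frame_mbs_only_flag=1: every picture is a progressive frame
      if frame_mbs_only_flag:
          return 'progressive'
      # Otherwise each picture's field_pic_flag decides, so the
      # stream as a whole is 'mixed' in GStreamer terms
      return 'mixed'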
https://bugzilla.gnome.org/show_bug.cgi?id=779309
Doing lazy conversion of PCR values doesn't work right
when a PCR discont is encountered. Instead, convert PCR
values to the continuous timestamp domain as soon as we
encounter them, and store that.
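A much-simplified sketch of the eager conversion, with an offset
accumulated at each discontinuity (names and the discont test are
illustrative only):

  class PcrTracker:
      def __init__(self):
          self.offset = 0
          self.last_pcr = None

      def to_continuous(self, pcr):
          # On a discont, fold the jump into the running offset so
          # stored values stay on one continuous timeline
          if self.last_pcr is not None and pcr < self.last_pcr:
              self.offset += self.last_pcr - pcr
          self.last_pcr = pcr
          return pcr + self.offset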
This element transforms a given number of input channels into a given number of
output channels according to a given transformation matrix. The matrix
coefficients must be between -1 and 1. In the auto mode, input/output channels
are automatically negotiated and the transformation matrix is a truncated or
zero-padded identity matrix.
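For example, swapping left and right on a stereo stream might look
like this from Python (a sketch; property names as described for this
element, Gst.ValueArray as in the matrix example earlier):

  import gi
  gi.require_version('Gst', '1.0')
  from gi.repository import Gst

  Gst.init(None)
  mix = Gst.ElementFactory.make('audiomixmatrix', None)
  mix.set_property('in-channels', 2)
  mix.set_property('out-channels', 2)
  # Swap left and right: coefficients must be between -1 and 1
  mix.set_property('matrix', Gst.ValueArray([
      Gst.ValueArray([0.0, 1.0]),
      Gst.ValueArray([1.0, 0.0]),
  ]))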
https://bugzilla.gnome.org/show_bug.cgi?id=777376
See https://bugzilla.gnome.org/show_bug.cgi?id=773666
This would ideally be solved in baseparse but that requires further
thought at this point, and in the meantime it would be good to have
rawbaseparse not assert on this but handle it gracefully instead.