This was completely broken before and could only work on a very constrained
set of files. After these changes it should work except for situations where
PTS != DTS, which is not handled at all in mxfdemux currently.
https://bugzilla.gnome.org/show_bug.cgi?id=759118
According to S377-1-2009c 9.2 the local tags must not be resolved from the
primer pack, which means that there cannot be any tags other than the
statically assigned ones.
This is to support byte-stream decoders that do not remember the
PPS/SPS after a flush. It is not needed by all decoders, but is
harmless for those that do remember them.
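A minimal sketch of the idea, assuming a parser-internal flag; the type and
member names here are illustrative, not the actual h264parse internals:

  /* On FLUSH_STOP, request that SPS/PPS be re-inserted in front of the
   * next output frame, so decoders that forget them across a flush can
   * resume decoding. */
  static gboolean
  my_parse_sink_event (GstBaseParse * parse, GstEvent * event)
  {
    MyParse *self = (MyParse *) parse;   /* hypothetical subclass */

    if (GST_EVENT_TYPE (event) == GST_EVENT_FLUSH_STOP)
      self->push_codec = TRUE;           /* resend SPS/PPS with next frame */

    return GST_BASE_PARSE_CLASS (parent_class)->sink_event (parse, event);
  }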
https://bugzilla.gnome.org/show_bug.cgi?id=758405
The order in which a program switch must happen is:
1) Drain all data on the old pads (but don't push EOS)
2) Add the new pads (but don't push any data on them)
3) Push EOS and remove the old pads
4) Start pushing data on the new pads
There was one caveat in this implementation: when we activate a sparse pad
(step 2) we would push a GAP event. The problem is that, while being an
event, it is actually *data*.
We therefore need to make sure that pushing those GAP events happens at the
step where we start pushing data (step 4).
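A rough sketch of step 4; the per-stream fields (need_gap, gap_position) are
illustrative, not the actual tsdemux structures:

  /* Step 4: only now start pushing data; a pending GAP event on a sparse
   * stream counts as data, so it is sent here rather than in step 2. */
  if (stream->need_gap) {
    gst_pad_push_event (stream->pad,
        gst_event_new_gap (stream->gap_position, GST_CLOCK_TIME_NONE));
    stream->need_gap = FALSE;
  }
  gst_pad_push (stream->pad, buffer);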
https://bugzilla.gnome.org/show_bug.cgi?id=750402
Before we add any streams, make sure we drain all streams. This ensures
consistency, so that only "new" data will be pushed out in buffers once
the new pads are added.
https://bugzilla.gnome.org/show_bug.cgi?id=750402
When changing programs, the order of events needs to be the following:
* add pads from new program
* send EOS on old pads
* remove old pads
* emit 'no-more-pads'
Previously tsdemux was not doing that: it was first deactivating and
removing the old pads before adding the new ones.
We fix this by allowing subclasses of mpegtsbase to handle the
deactivation of programs themselves. In that case tsdemux will
properly deactivate the old program once it has activated the new one.
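Sketched with the core GStreamer calls involved; the demux pointer and the
iteration over old/new pads are simplified for illustration:

  /* add pads from the new program */
  gst_element_add_pad (GST_ELEMENT (demux), new_pad);
  /* send EOS on the old pads, then remove them */
  gst_pad_push_event (old_pad, gst_event_new_eos ());
  gst_element_remove_pad (GST_ELEMENT (demux), old_pad);
  /* finally signal that no further pads will be added */
  gst_element_no_more_pads (GST_ELEMENT (demux));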
https://bugzilla.gnome.org/show_bug.cgi?id=750402
The videoframe-audiolevel element acts like a synchronized audio/video "level"
element. For each video frame, it posts a level-style message containing the
RMS value of the corresponding audio frames. This element needs both video and
audio to pass through it. Furthermore, it needs a queue after its video
source.
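A sketch of how an application might pick up those messages on the bus; the
message name and field layout are assumptions, check the element
documentation for the exact structure:

  static gboolean
  bus_watch (GstBus * bus, GstMessage * msg, gpointer user_data)
  {
    if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_ELEMENT) {
      const GstStructure *s = gst_message_get_structure (msg);
      /* message name assumed here */
      if (gst_structure_has_name (s, "videoframe-audiolevel")) {
        /* read the per-channel RMS values for this video frame */
      }
    }
    return TRUE;
  }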
https://bugzilla.gnome.org/show_bug.cgi?id=748259
0x04 signifies an MPEG elementary stream, but according to RP2008, 0x10 should
be used for an H.264 byte-stream. This also fixes compatibility of our files
with ffmpeg.
If packet->payload_unit_start_indicator is true and the pointer is 0, there is
no discontinuity check. Therefore there could be a previous, incomplete
section that needs to be cleared.
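Roughly the intended handling; the helper and fields are illustrative, not
the exact packetizer internals:

  if (packet->payload_unit_start_indicator && pointer == 0) {
    /* No discontinuity was detected, but a partially accumulated section
     * may still be pending: drop it so the new section does not get
     * appended to stale data. */
    if (stream->section_length > 0)
      clear_pending_section (stream);   /* illustrative helper */
  }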
https://bugzilla.gnome.org/show_bug.cgi?id=758010
The values of channel_mapping are copied by gst_codec_utils_opus_create_caps(),
but it doesn't free or take ownership of the g_new0()-allocated memory, so this
needs to be freed before going out of scope.
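The fix amounts to freeing the array once the caps have been created, roughly
as follows (the surrounding variable names are illustrative):

  guint8 *channel_mapping = g_new0 (guint8, channels);
  /* ... fill in channel_mapping ... */
  caps = gst_codec_utils_opus_create_caps (rate, channels,
      channel_mapping_family, stream_count, coupled_count, channel_mapping);
  /* the caps only hold a copy, so the allocation must be released here */
  g_free (channel_mapping);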
CID 1338692
buf cannot be NULL inside the block that is conditional on the buffer size
being bigger than (G_MAXUINT16 - 3). Plus, gst_buffer_unref() checks whether
the buffer is NULL and does nothing if it is.
CID 1338693
If tsdemux never receives data for a stream, the corresponding pad will never
be added and stream->active will remain FALSE. When the stream is removed, the
pad will not be unreffed and will be leaked.
https://bugzilla.gnome.org/show_bug.cgi?id=757873
The current implementation for detecting resolution changes
on key frames is based on vp8 bitstream alignment. Avoid this
width and height parsing for the vp9 bitstream, which requires proper
frame header parsing in order to detect resolution changes (FIXME).
https://bugzilla.gnome.org/show_bug.cgi?id=757825
It is up to the element handling the seek to send flush events
downstream, otherwise we end up in a situation where upstream
would get an unexpected GST_FLOW_FLUSHING.
The ONVIF Streaming Specification specifies that the NTP timestamps
in the ONVIF extension header indicate the absolute UTC time associated
with the access unit. But by using running time we cannot achieve that,
since a frame's running time depends on the played interval, whether a
non-flushing seek was done, etc. Instead we have to use the stream time.
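In GStreamer terms that means deriving the timestamp from the segment's
stream time rather than its running time, roughly:

  /* stream time, not running time, as the basis for the NTP timestamp
   * in the ONVIF extension header */
  guint64 stream_time = gst_segment_to_stream_time (&segment,
      GST_FORMAT_TIME, GST_BUFFER_PTS (buffer));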
https://bugzilla.gnome.org/show_bug.cgi?id=757688
It is now possible to update the currently used ntp-offset with a
custom serialized downstream event. The element will read the ntp-offset
property when doing the state transition from READY to PAUSED and
use that offset until it receives a "GstNtpOffset" event, which also
has an "ntp-offset" attribute in its structure. In case the
property is not set and no event has been received, the element will
guess the ntp-offset with help of the clock. If no clock can be
retrieved, the element will error out and stop the data flow.
The same event is also used for updating the D/E-bits in the RTP
extension header. The discont flag in a buffer can be set whenever a
live/network source loses a frame, but that is not the type of
discontinuity that the ONVIF extension header should reflect. The
header is mainly used for playback of a recorded track, in which
gaps can be present, and it is those kinds of gaps that should be
highlighted with the D- and E-bits.
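A sketch of how such an event could be constructed and sent; the field names
beyond "ntp-offset" (here "discont") and their types are assumptions based on
the description above:

  GstStructure *s = gst_structure_new ("GstNtpOffset",
      "ntp-offset", G_TYPE_UINT64, (guint64) ntp_offset,
      "discont", G_TYPE_BOOLEAN, TRUE,   /* drives the D/E-bits, assumed */
      NULL);
  gst_pad_push_event (srcpad,
      gst_event_new_custom (GST_EVENT_CUSTOM_DOWNSTREAM, s));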
https://bugzilla.gnome.org/show_bug.cgi?id=757688
If a buffer or a buffer list is cached, no events serialized with the
data stream should get through. The cached buffers and events should
be purged when we stop flushing.
https://bugzilla.gnome.org/show_bug.cgi?id=757688