Seekability, like duration etc., is unlikely to change (frequently), and
the default assumption covers most cases, so let the subclass set it when needed.
At the same time, allow subclass to indicate if it has seek-metadata (table)
available, and possibly have it provide an average bitrate.
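As an illustration only (not necessarily the exact API added here), a subclass
reporting an average bitrate could look like this, assuming the
gst_base_parse_set_average_bitrate() helper; the surrounding function is made up:

    /* sketch: hypothetical subclass helper reporting the average bitrate
     * so baseparse can use it for duration and seek position estimates */
    static void
    my_parse_report_bitrate (GstBaseParse * parse, guint avg_bitrate)
    {
      gst_base_parse_set_average_bitrate (parse, avg_bitrate);
    }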
This allows the child class to chain its event handler with
GstBaseParse, so that subclasses don't have to duplicate all the default
event handling logic.
https://bugzilla.gnome.org/show_bug.cgi?id=622276
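For illustration, a minimal sketch of the chaining pattern this enables, using
the sink_event vfunc as found in later GstBaseParse; the subclass names are
made up:

    static gboolean
    my_parse_sink_event (GstBaseParse * parse, GstEvent * event)
    {
      /* subclass-specific handling would go here */

      /* chain up so the default GstBaseParse event handling still runs */
      return GST_BASE_PARSE_CLASS (parent_class)->sink_event (parse, event);
    }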
We wait to parse a minimum number of frames (10, arbitrarily) before
emitting bitrate tags so that our early estimates are not wildly
inaccurate for streams that start with silence. If the stream ends
before that, we just emit the tags anyway.
While it _would_ be nicer to specify the threshold for starting to push
the tags in terms of duration, this would introduce more complexity than
it merits.
https://bugzilla.gnome.org/show_bug.cgi?id=614991
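Sketched out, the intended logic is roughly the following (constant, field and
helper names are hypothetical, not the actual baseparse code):

    #define MIN_FRAMES_FOR_BITRATE_TAGS 10  /* arbitrary, see above */

    /* post bitrate tags once enough frames have been seen, or at EOS */
    if (framecount >= MIN_FRAMES_FOR_BITRATE_TAGS || at_eos)
      post_bitrate_tags (parse);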
This is optional because it's quite an expensive operation and it's very
unlikely that a non-frame is detected as a frame after the header CRC check
and after checking all bits for valid values. The overall frame checksums are
mainly useful to detect inconsistencies in the encoded payload.
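Assuming this is exposed as a boolean property named check-frame-checksums
(disabled by default), enabling it would look like:

    /* hypothetical property name; off by default because it is expensive */
    g_object_set (flacparse, "check-frame-checksums", TRUE, NULL);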
When called from the GST_FLAC_PARSE_STATE_HEADERS case,
gst_flac_parse_handle_headers() does a gst_buffer_set_caps() on a buffer
with refcount > 1. This change handles that case by making the buffer
metadata writable.
https://bugzilla.gnome.org/show_bug.cgi?id=614037
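In 0.10 terms the fix boils down to this pattern (variable names hypothetical):

    /* don't touch metadata of a buffer that is still shared (refcount > 1) */
    buffer = gst_buffer_make_metadata_writable (buffer);
    gst_buffer_set_caps (buffer, caps);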
This patch adds the get_frame_overhead() vfunc so that baseparse can
accurately calculate the min/avg/max bitrates for aacparse.
Note: The bitrate was being incorrectly calculated for ADTS streams
(it's not in the header as the code suggests).
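A sketch of what such a vfunc could look like for ADTS, assuming it takes the
frame buffer and returns the per-frame overhead in bytes (the exact signature
and the subclass field names here are made up):

    static gint
    my_aac_parse_get_frame_overhead (GstBaseParse * parse, GstBuffer * buffer)
    {
      MyAacParse *aacparse = MY_AAC_PARSE (parse);

      /* an ADTS frame carries a 7-byte header, 9 bytes with the optional CRC;
       * that overhead should not count towards the payload bitrate */
      if (aacparse->header_type == HEADER_ADTS)
        return aacparse->adts_crc_present ? 9 : 7;

      return 0;
    }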
This makes baseparse keep a running average of the stream bitrate, as
well as the minimum and maximum bitrates. Subclasses can override a
vfunc to make sure that per-frame overhead from the container is not
accounted for in the bitrate calculation.
We take care not to override the bitrate, minimum-bitrate, and
maximum-bitrate tags if they have been posted upstream. We also
rate-limit the emission of the bitrate tag so that it is only triggered by a
change of >10 kbps.
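Roughly, the per-frame bookkeeping amounts to the following (a sketch with
hypothetical field and helper names, not the literal baseparse code):

    /* instantaneous bitrate of this frame in bits per second */
    guint64 frame_bitrate =
        gst_util_uint64_scale (8 * frame_size, GST_SECOND, frame_duration);

    bitrate_sum += frame_bitrate;
    framecount++;
    avg_bitrate = bitrate_sum / framecount;
    min_bitrate = MIN (min_bitrate, frame_bitrate);
    max_bitrate = MAX (max_bitrate, frame_bitrate);

    /* only re-post tags when the average moved by more than 10 kbps */
    if (ABS ((gint64) avg_bitrate - (gint64) posted_avg_bitrate) > 10000)
      post_bitrate_tags (parse);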
Include config.h first: it defines __MSVCRT_VERSION__, which should be
defined before inclusion of any system header.
Also fixes mpegdemux Makefile.am LIBADD typo.
Fixes #606665
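The include-order point comes down to the usual pattern at the top of each
affected source file:

    /* config.h must come first so __MSVCRT_VERSION__ is already defined
     * when the system headers are included */
    #ifdef HAVE_CONFIG_H
    #include "config.h"
    #endif

    #include <stdio.h>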
Perform sanity check on type of seek, and only perform one that is
appropriately supported. Adjust downstream newsegment event
to first buffer timestamp that is sent downstream.
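The kind of sanity check meant here, as a sketch (not the literal baseparse
code):

    gdouble rate;
    GstFormat format;
    GstSeekFlags flags;
    GstSeekType start_type, stop_type;
    gint64 start, stop;

    gst_event_parse_seek (event, &rate, &format, &flags,
        &start_type, &start, &stop_type, &stop);

    /* only handle forward TIME seeks to an absolute position */
    if (format != GST_FORMAT_TIME || rate < 0.0 ||
        start_type != GST_SEEK_TYPE_SET)
      return FALSE;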
In particular, consider DISCONT == !sync, and allow subclass to query
sync state, as it may want to perform additional checks depending
on whether sync was achieved earlier on.
Also arrange for subclass to query whether leftover data is being drained.
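In later GstBaseParse terms, a subclass can query both states via the macros
in gstbaseparse.h (assuming GST_BASE_PARSE_LOST_SYNC and
GST_BASE_PARSE_DRAINING); roughly:

    /* be stricter right after losing sync, e.g. require a second
     * consecutive valid header before accepting a frame */
    if (GST_BASE_PARSE_LOST_SYNC (parse))
      needed_confirmations = 2;

    /* be more lenient while leftover data is being drained at EOS */
    if (GST_BASE_PARSE_DRAINING (parse))
      accept_partial_frame = TRUE;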
In particular, (optionally) provide baseparse with a notion of frames per second
(and therefore also frame duration) and have it track frame and byte counts.
This way, subclass can provide baseparse with fps and have it provide default
buffer time metadata and conversions, though subclass can still install
callbacks to handle such itself.
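In current API terms this corresponds to something like the following (a
sketch; the lead-in/lead-out values are made up):

    /* tell baseparse the frame rate so it can fill in default timestamps,
     * durations and time<->byte conversions for us */
    gst_base_parse_set_frame_rate (parse, fps_num, fps_den, 0, 0);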
After all, the stream is as-is, and there is little molding to downstream's
taste that can be done. If the subclass can and wants to do so, it can
still override this behaviour.
Also handle the case gracefully where the subclass decides to drop
the first buffers and has no caps set yet. It's still required to
have valid caps set when the first buffer should be passed downstream.
In one case we extracted the sample rate index from the codec data
and saved it as the sample rate rather than getting the real sample
rate from the table. Fix that, and also make sure we don't access
non-existent table entries by adding a small helper function that
guards against out-of-bounds access in case of invalid input data.
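The helper is essentially a bounds-checked lookup in the standard AAC
sampling-frequency table; as a sketch (the actual function name in aacparse
may differ):

    static guint
    my_aac_parse_get_sample_rate_from_index (guint sr_idx)
    {
      static const guint aac_sample_rates[] = {
        96000, 88200, 64000, 48000, 44100, 32000,
        24000, 22050, 16000, 12000, 11025, 8000
      };

      if (sr_idx < G_N_ELEMENTS (aac_sample_rates))
        return aac_sample_rates[sr_idx];

      /* invalid index in the input data: don't read past the table */
      return 0;
    }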
Create output caps from input caps, so we maintain any fields we
might get on the input caps, such as codec_data or rate and channels.
Set channels and rate on the output caps if we don't have input caps
or they don't contain such fields. We do this partly because we can,
but also because some muxers need this information. Tagreadbin will
also be happy about this.
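Roughly, in 0.10-style code with hypothetical variable names:

    GstCaps *src_caps;

    /* start from the input caps so fields like codec_data are preserved */
    if (sink_caps)
      src_caps = gst_caps_copy (sink_caps);
    else
      src_caps = gst_caps_new_simple ("audio/mpeg",
          "mpegversion", G_TYPE_INT, 4, NULL);

    /* add rate/channels if the input caps did not provide them */
    if (!gst_structure_has_field (gst_caps_get_structure (src_caps, 0), "rate"))
      gst_caps_set_simple (src_caps,
          "rate", G_TYPE_INT, sample_rate,
          "channels", G_TYPE_INT, channels, NULL);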
Sending the flush-start event forward before taking the stream lock actually
works, in contrast to deadlocking in downstream preroll_wait (hunk 1).
After that, the chain function gets stuck in a busy loop. This is fixed
by updating the minimum frame size inside the synchronization loop because the
subclass asks for more data in this way (hunk 2).
Finally, this leads to a very probable crash because the subclass can find a
valid frame with a size greater than the currently available data in the
adapter. This makes the subsequent gst_adapter_take_buffer call return NULL,
which is not expected (hunk 3).
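The third fix amounts to never asking the adapter for more data than it
currently holds, e.g. (sketch, hypothetical variable names):

    if (fsize > gst_adapter_available (adapter)) {
      /* frame is larger than what we have buffered: ask for more data
       * instead of letting gst_adapter_take_buffer() return NULL */
      gst_base_parse_set_min_frame_size (parse, fsize);
    } else {
      outbuf = gst_adapter_take_buffer (adapter, fsize);
    }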
The problem is that after a discont, set_min_frame_size(1024) is called when
detect_stream returns FALSE. However, detect_stream calls check_adts_frame
which sets the frame size on its own to something larger than 1024. This is the
same situation as in the beginning, so the base class ends up calling
check_valid_frame in an endless loop.
Baseparse internally breaks the semantics of a _chain function by calling it with
buffer==NULL. The reason I believed it was okay to remove it was that there is
also an unchecked access to buffer later in _chain. Actually that code is wrong,
as it most probably wants to set discont on the outgoing buffer.