In push mode, we determine the duration by seeking to the end of the
stream. However, a skeleton stream with an index causes the duration
to be known already, so we never set the push_time_duration variable,
which we use to know that the duration has been determined.
https://bugzilla.gnome.org/show_bug.cgi?id=662049
The codec setup headers are a lot more likely to have correct
information, especially since it is easy to remux a skeleton into a
file whose streams don't have the same parameters (I've even seen a
file with two skeletons).
Still, the skeleton information is useful when we have a codec we
can't decode, since it at least (theoretically) lets us convert
granpos to time; so we discard it if the codec setup has already
provided that information.
This fixes playback of (at least) the original archive.org encoding
of "The Night of the Living Dead" (now replaced by another encoding).
https://bugzilla.gnome.org/show_bug.cgi?id=612443
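For illustration, a minimal sketch of the resulting priority; the
pad field names here are hypothetical stand-ins, not the actual
gstoggdemux names:

    /* only take timing information from the skeleton fisbone when
     * the codec setup headers did not already provide it */
    if (pad->granulerate_n == 0 && pad->granulerate_d == 0) {
      pad->granulerate_n = fisbone_rate_n;
      pad->granulerate_d = fisbone_rate_d;
    }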
This could happen when testing with navseek and pressing right and
left at roughly the same time. The current chain was temporarily
moved away, which caused the flush events not to be sent to the
source pads; the data queues downstream would then reject incoming
data after the seek and shut down, wedging the pipeline.
Now, I can't really decide whether this is a nasty steaming
hack or a good fix, but it certainly does fix the issue, and
does not seem to break anything else so far.
https://bugzilla.gnome.org/show_bug.cgi?id=621897
This patch implements seeking in push mode (e.g., over the network)
in Ogg, using the double bisection method (sketched below).
As a side effect, it also fixes duration determination for network
streams, by seeking to the end to check the actual duration.
Known issues:
- Getting an EOS while seeking stops the streaming task; I can't
  find a way to prevent this (e.g., by issuing a seek from the event
  handler).
- Seeking twice in VERY short succession with playbin2 fails for
  streams with subtitles: we end up pushing into a dataqueue which is
  flushing. Rare in normal use, AFAICT.
- Seeking is slow on slow links - byte range guesses could be made
  better, decreasing the number of required requests.
- If no granule position is found in the last 64 KB of a stream, the
  duration will be left unknown (should be pretty rare).
https://bugzilla.gnome.org/show_bug.cgi?id=621897
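For illustration, a minimal sketch of one bisection pass over byte
offsets; fetch_time_at_offset() and BISECT_TOLERANCE_BYTES are
hypothetical stand-ins for the ranged pull and the granpos-to-time
conversion, and the real code runs a coarse pass followed by a finer
one (hence "double bisection"):

    #include <gst/gst.h>

    #define BISECT_TOLERANCE_BYTES (64 * 1024)  /* hypothetical */

    /* hypothetical: pulls a byte range at 'offset', finds the next
     * Ogg page, and converts its granulepos to time */
    static GstClockTime fetch_time_at_offset (gint64 offset);

    static gint64
    bisect_to_byte_offset (GstClockTime target, gint64 lo, gint64 hi)
    {
      while (hi - lo > BISECT_TOLERANCE_BYTES) {
        gint64 mid = lo + (hi - lo) / 2;
        GstClockTime t = fetch_time_at_offset (mid);

        if (t < target)
          lo = mid;   /* target lies in the upper half */
        else
          hi = mid;   /* target lies in the lower half */
      }
      return lo;
    }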
The first packet of a sparse stream may arrive after an initial
delay in the stream. If ogg_stream_packetout reports a discontinuity
in a sparse stream, do not propagate it to other streams in the
chain unnecessarily.
https://bugzilla.gnome.org/show_bug.cgi?id=621897
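A sketch of the intended behaviour; is_sparse and the two mark_*
helpers are hypothetical, but ogg_stream_packetout() returning -1 to
signal a gap is real libogg API:

    #include <ogg/ogg.h>

    ogg_packet packet;
    int ret;

    ret = ogg_stream_packetout (&pad->os, &packet);
    if (ret == -1) {
      if (pad->is_sparse) {
        /* an initial delay before the first packet of a sparse
         * stream (e.g. subtitles) is expected: flag only this pad */
        mark_pad_discont (pad);
      } else {
        /* real data loss: every stream in the chain is affected */
        mark_chain_discont (pad->chain);
      }
    }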
If both quality and bitrate are set, libtheora will try to meet
both constraints, causing it to prefer emitting a smaller number
of good frames over emitting the full number of frames that would
not meet the requested quality. This causes a slideshow effect
when the bitrate is low and the quality is high. And theoraenc's
default quality is high (48/63).
So only set quality when it is explicitly requested, and leave it
unset otherwise.
https://bugzilla.gnome.org/show_bug.cgi?id=658443
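A sketch of the fix against libtheora's th_info; the enc struct and
its quality_was_set flag (recording whether the property was
explicitly set) are hypothetical:

    #include <theora/theoraenc.h>

    th_info info;

    th_info_init (&info);
    /* ... frame size, frame rate, etc. ... */
    info.target_bitrate = enc->bitrate;
    if (enc->quality_was_set)
      info.quality = enc->quality;  /* 0..63; the old default was 48 */
    /* otherwise leave info.quality alone, so libtheora does not try
     * to satisfy a bitrate and a quality constraint at once */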
Remove the _ in front of the endianness prefix.
Remove the _3 postfix for the 24 bits formats.
Add a _32 postfix to the formats that occupy extra space beyond their
natural size.
The result is that the GST_AUDIO_NE() macro can simply append the endianness
after all formats and that we only specify a different sample width when it is
different from the natural size of the sample. This makes things more consistent
and follows the pulseaudio conventions instead of the alsa ones.
After all, we do hope to find actual data for these streams.
However, warn if we could not set up a chain when we find a
non-BOS page, as that means we don't have a valid Ogg stream.
https://bugzilla.gnome.org/show_bug.cgi?id=657151
While the casual reader might end up bewildered by just why this
change might increase clarity, it just happens that, in the libogg
and associated sources, op is the canonical name for an ogg_packet
while og is the canonical name for an ogg_page, and reading this
code confuses me.
https://bugzilla.gnome.org/show_bug.cgi?id=657151
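That is, following the libogg convention:

    ogg_packet op;  /* 'op' is libogg's canonical name for a packet */
    ogg_page og;    /* 'og' is libogg's canonical name for a page */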
Headers are inherently durationless.
Instead, set duration to 0 to avoid increasing tracked granpos,
and do not warn about it, since it is totally expected.
https://bugzilla.gnome.org/show_bug.cgi?id=657151
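A sketch of the change; packet_is_header() and packet_duration() are
hypothetical stand-ins for the actual checks:

    /* headers have no duration; 0 keeps the tracked granpos where it
     * is, and no warning is emitted since this is expected */
    if (packet_is_header (packet))
      duration = 0;
    else if ((duration = packet_duration (pad, packet)) < 0)
      GST_WARNING_OBJECT (pad, "could not determine packet duration");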
The version written is 3.0.
Base times are left empty for now.
Content-Type should be the MIME type of the stream. It is set to
the GStreamer media type for now, which is probably the same for
the streams oggmux supports.
https://bugzilla.gnome.org/show_bug.cgi?id=563251
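A sketch of the fishead fields in question, following the Skeleton
3.0 layout (offsets per my reading of the spec; buffer handling
simplified, and media_type is a hypothetical variable holding the
stream's caps name):

    #include <string.h>
    #include <gst/gst.h>

    guint8 data[64] = { 0 };

    memcpy (data, "fishead\0", 8);
    GST_WRITE_UINT16_LE (data + 8, 3);   /* version major */
    GST_WRITE_UINT16_LE (data + 10, 0);  /* version minor */
    /* presentation time and base time num/denom at offsets 12..43
     * are left at 0 for now */

    /* per-stream fisbone message header; the GStreamer media type
     * stands in for the MIME type for now */
    gchar *content_type =
        g_strdup_printf ("Content-Type: %s\r\n", media_type);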
Make enums for the chroma siting for easier use in the videoinfo.
Make enums for the color range, color matrix, transfer function and the
color primaries. Add these values to the video info structure in a Colorimetry
structure. These values define the exact colors and are needed to perform
correct colorspace conversion. Use a couple of predefined colorimetry specs
because in practice only a few combinations are in use.
Add view_id to the video frames to identify the view this frame represents in
multiview video.
Remove old gst_video_parse_caps_framerate, use the videoinfo for this.
Port elements to new colorimetry info.
Remove deprecated colorspace property from videotestsrc.
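A sketch of how the pieces fit together, using the names as they
later appeared in the public API (exact names at the time of this
commit may differ):

    #include <gst/video/video.h>

    GstVideoInfo info;

    gst_video_info_init (&info);
    gst_video_info_set_format (&info, GST_VIDEO_FORMAT_I420, 1280, 720);

    /* pick one of the predefined colorimetry specs, since only a few
     * combinations are used in practice */
    gst_video_colorimetry_from_string (&info.colorimetry, "bt709");

    /* the individual enums remain directly accessible */
    info.colorimetry.range = GST_VIDEO_COLOR_RANGE_16_235;
    info.colorimetry.matrix = GST_VIDEO_COLOR_MATRIX_BT709;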
vorbisenc currently reacts in a rather draconian fashion if input
timestamps are more than 1/2 sample off what it considers ideal. If
data is 'too late' it truncates buffers; if it is 'too soon' it
completely shuts down the encoder and restarts it. This is causing
vorbisenc to produce corrupt output when encoding data produced by
sources with bugs that produce a sample or two of jitter (e.g.,
flacdec).
If ints are 64 bits, 32-bit values should get promoted in varargs
anyway, and we don't care about 16-bit ints.
This makes the code a lot more readable, and still gets us nice
hexadecimal 32 bit serialnos.
https://bugzilla.gnome.org/show_bug.cgi?id=656775
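That is, something along these lines (the serialno variable is
hypothetical):

    /* a guint32 serialno gets promoted to int in varargs, so a plain
     * %08x prints a nice 32-bit hex value on any platform */
    GST_DEBUG ("allocated stream for serial %08x", serialno);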