It contains videocrop ! videoscale ! capsfilter and implements digital
zooming.
At this moment it is a private element of the camerabin plugin.
This removes some code from wrappercamerabinsrc, making it clearer,
and digitalzoom could potentially be used by other applications in
the future, as it has nothing camerabin-specific in it.
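For illustration, a minimal sketch of how such a chain could be wired
up and how the crop could be derived from the zoom level. Names and
structure are assumptions for the example, not the actual digitalzoom
code:

  #include <gst/gst.h>

  /* Sketch only: crop symmetrically so the visible area is 1/zoom of
   * the input, then let videoscale restore the original dimensions
   * enforced by the capsfilter. */
  static void
  setup_zoom (GstBin * bin, gint width, gint height, gfloat zoom)
  {
    GstElement *crop = gst_element_factory_make ("videocrop", NULL);
    GstElement *scale = gst_element_factory_make ("videoscale", NULL);
    GstElement *filter = gst_element_factory_make ("capsfilter", NULL);
    GstCaps *caps = gst_caps_new_simple ("video/x-raw",
        "width", G_TYPE_INT, width, "height", G_TYPE_INT, height, NULL);
    gint crop_x = width - (gint) (width / zoom);
    gint crop_y = height - (gint) (height / zoom);

    g_object_set (filter, "caps", caps, NULL);
    gst_caps_unref (caps);
    gst_bin_add_many (bin, crop, scale, filter, NULL);
    gst_element_link_many (crop, scale, filter, NULL);
    g_object_set (crop, "left", crop_x / 2, "right", crop_x - crop_x / 2,
        "top", crop_y / 2, "bottom", crop_y - crop_y / 2, NULL);
  }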
Skip the byte alignment bits as per the logic of byte_alignment()
described in the HEVC specification. This fixes the calculation of
the slice header size.
https://bugzilla.gnome.org/show_bug.cgi?id=747613
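For reference, byte_alignment() in the spec reads one bit equal to 1
and then zero bits until the position is byte-aligned. A sketch of
that logic, using GstBitReader for illustration (the parser uses its
own NAL reader):

  #include <gst/base/gstbitreader.h>

  /* Skip the alignment bits: alignment_bit_equal_to_one followed by
   * alignment_bit_equal_to_zero up to the next byte boundary. */
  static gboolean
  skip_byte_alignment (GstBitReader * br)
  {
    guint8 bit = 0;

    if (!gst_bit_reader_get_bits_uint8 (br, &bit, 1) || bit != 1)
      return FALSE;
    while (gst_bit_reader_get_pos (br) % 8 != 0) {
      if (!gst_bit_reader_get_bits_uint8 (br, &bit, 1) || bit != 0)
        return FALSE;
    }
    return TRUE;
  }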
If 95% of the code of an example app consists of GObject
code, maybe that's defeating the point a little. So just
remove a lot of that and trim down the example to the
absolute minimum. This also removes the last remaining
GPL3-licensed code in -bad.
GstPhotography enables new paths in wrappercamerabinsrc that allow
the source to be notified about the capture caps and to provide
alternative caps if desired, bypassing negotiation (which doesn't
seem like a good idea these days). This test was added to make sure
those paths keep working until we remove them from the API in favor
of standard caps negotiation features.
It adds 3 extra tests with a simple test source (see the sketch
below) that will:
1) Test that capturing with ANY caps works
2) Test that capturing with fixed caps works
3) Test that capturing with fixed caps, where the source picks a
   different resolution via the GstPhotography API, works by having
   wrappercamerabinsrc crop the capture to the final requested
   dimensions
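A minimal sketch of how such a capture test could drive camerabin,
using its "image-capture-caps" property and "start-capture" action
signal; setup, message handling and assertions are omitted:

  #include <gst/gst.h>

  /* Sketch only: trigger a capture with the given caps. A real test
   * would also wait for the image-done bus message and check the
   * dimensions of the saved image. */
  static void
  capture_with_caps (GstElement * camerabin, GstCaps * caps)
  {
    g_object_set (camerabin, "image-capture-caps", caps, NULL);
    gst_element_set_state (camerabin, GST_STATE_PLAYING);
    g_signal_emit_by_name (camerabin, "start-capture");
  }

Test 1 would pass gst_caps_new_any (), tests 2 and 3 fixed caps such
as 640x480.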
wrappercamerabinsrc has a videocrop element that is used both for
zooming and for cropping when the input caps differ from the
requested ones when used with the GstPhotography interface. The
zooming part needs the following elements:
capsfilter ! videocrop ! videoscale ! capsfilter
The capsfilters should always have the same caps so that zooming
preserves the dimensions (e.g. with 640x480 on both capsfilters and
2x zoom, videocrop crops to the centered 320x240 region and
videoscale scales it back to 640x480). Only when extra cropping is
needed because of the input dimensions do those caps have to be
modified accordingly to preserve the output dimensions. This,
however, makes it hard to get caps negotiation to work properly, as
the capsfilters then need different caps to account for the extra
cropping. That would be simple for fixed caps but gets tricky with
unfixed ones.
To solve this, this patch splits the zooming and the
dimension-reduction cropping into 2 separate videocrop elements.
The first one does the dimension cropping, which is only needed when
the GstPhotography API is used and the source provides caps
different from what was requested, while the second is dedicated to
zoom cropping only.
The first part of the pipeline goes from:
src ! videoconvert ! capsfilter ! videocrop ! videoscale ! capsfilter
to
src ! videocrop ! videoconvert ! capsfilter ! videocrop ! videoscale ! capsfilter
This might add some overhead to image capture, as the image might
need to be cropped twice, but that can be solved by making videocrop
use crop metas, so that only the later element does the real
cropping.
It also makes the code a bit simpler.
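For illustration, the crop-meta approach mentioned above (not part
of this patch) would mean attaching a GstVideoCropMeta instead of
copying pixels, leaving the actual cropping to a downstream element:

  #include <gst/video/video.h>

  /* Sketch only: mark the crop region on the buffer; a downstream
   * element that understands GstVideoCropMeta performs the single
   * real crop. */
  static void
  mark_crop_region (GstBuffer * buf, guint x, guint y, guint w, guint h)
  {
    GstVideoCropMeta *meta = gst_buffer_add_video_crop_meta (buf);

    meta->x = x;
    meta->y = y;
    meta->width = w;
    meta->height = h;
  }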
Even for "live" streams we are not live in the GStreamer meaning of the word.
We don't produce buffers that are timestamped based on their "capture time"
and our clock, but just based on whatever timestamps the stream might contain.
Also, even if we wanted to claim to be live, that wouldn't work
well, as we would have to return GST_STATE_CHANGE_NO_PREROLL when
going from READY to PAUSED, which we can't: we first need data to
know whether we are "live" or not.
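For comparison, a genuinely live element signals this in its
state-change handler, which is exactly what we cannot do before
having seen any data (standard GstElement subclass pattern, with
parent_class as set up by G_DEFINE_TYPE):

  static GstStateChangeReturn
  my_change_state (GstElement * element, GstStateChange transition)
  {
    GstStateChangeReturn ret =
        GST_ELEMENT_CLASS (parent_class)->change_state (element,
        transition);

    /* A live source reports NO_PREROLL here, before any data flows */
    if (ret != GST_STATE_CHANGE_FAILURE &&
        transition == GST_STATE_CHANGE_READY_TO_PAUSED)
      ret = GST_STATE_CHANGE_NO_PREROLL;
    return ret;
  }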
It's better to just select some random variant playlist instead of
stopping: chances are that it's still working, and we might just
have to select a different variant again later.
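A hypothetical sketch of that fallback, with the variants kept as an
opaque GList; this is not hlsdemux's actual code:

  #include <glib.h>

  /* Sketch only: pick any variant other than the one that just
   * failed, instead of erroring out. */
  static gpointer
  pick_other_variant (GList * variants, gpointer failed)
  {
    GList *l;

    for (l = variants; l != NULL; l = l->next) {
      if (l->data != failed)
        return l->data;
    }
    return NULL; /* nothing left to try */
  }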
We should only refresh the currently selected variant playlist (if any,
otherwise the main playlist), not the main playlist. And only try to
refresh the main playlist if updating the variant playlist fails.
Some servers (e.g. Wowza) use the request for the main playlist to
create a "session", which then becomes part of the URI of the
variant playlist and also of the fragments. Refreshing the main
playlist would generate a new session, which servers usually
rate-limit, and after a few retries the server just kicks us out.
As a side effect, we now use the same downloader for all playlists,
so we only have 2 instead of 3 connections to the server. Previously
we also just ignored the downloaded data from the main playlist that
the base class gave to us.
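A hypothetical sketch of the resulting refresh order; the helper
names are placeholders for hlsdemux's internals:

  /* Sketch only: refresh the selected variant (or the main playlist
   * if there are no variants); re-fetch the main playlist, and with
   * it a new server session, only if that update fails. */
  static gboolean
  update_playlists (GstHLSDemux * demux)
  {
    if (refresh_variant_playlist (demux))
      return TRUE;
    if (!refresh_main_playlist (demux))
      return FALSE;
    return refresh_variant_playlist (demux);
  }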