Don't emit the NAL HRD parameters in the SPS headers for now, because the
SEI buffering_period() and picture_timing() messages are not handled
yet. Some additional changes are necessary to get this right.
https://bugzilla.gnome.org/show_bug.cgi?id=722734
Fix the default CPB buffer size to something more reasonable (1500 ms)
that still fits the level limits. This is a non-configurable property
for now. The initial CPB removal delay is also fixed at 750 ms.
https://bugzilla.gnome.org/show_bug.cgi?id=722087
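For illustration, a minimal sketch of how such a CPB window could translate
into an actual buffer size clamped to the level limits; the helper name is
hypothetical and the level limit is assumed to be passed in already
expressed in bits:

    #include <glib.h>

    /* Sketch: derive the CPB size in bits from the target bitrate and a
     * CPB window in milliseconds (1500 ms by default), then clamp it to
     * the maximum CPB size allowed by the selected level. */
    static guint
    derive_cpb_size_bits (guint bitrate_bps, guint cpb_window_ms,
        guint level_max_cpb_bits)
    {
      guint64 cpb_bits = (guint64) bitrate_bps * cpb_window_ms / 1000;

      return (guint) MIN (cpb_bits, (guint64) level_max_cpb_bits);
    }
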
Round down the calculated, or supplied, bitrate (kbps) to a multiple
of the HRD bitrate scale factor. Use a bitrate scale factor of 64 so
as to lose less precision. Likewise, don't round up, because the supplied
bitrate could be a strict constraint imposed by the user.
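A minimal sketch of the rounding, assuming the scale factor is expressed as
the exponent of a power of two (2^6 = 64); the names are hypothetical:

    /* Sketch: round the bitrate down to a multiple of the HRD bitrate
     * scale factor (64), so the value is exactly representable in the
     * HRD parameters without ever exceeding what the user supplied. */
    #define SX_BITRATE 6    /* assumed: scale factor 2^6 = 64 */

    static unsigned int
    round_down_bitrate (unsigned int bitrate)
    {
      return bitrate & ~((1U << SX_BITRATE) - 1);
    }
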
Fix the level calculation involving bitrate limits. Since we are
targeting NAL HRD conformance, the check against MaxBR from the
Table A-1 limits shall involve cpbBrNalFactor, which depends on the
active profile.
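For reference, a sketch of the intended check; the level table lookup is
left out, and the cpbBrNalFactor values come from the specification:

    /* Sketch: for NAL HRD conformance, the bitrate limit of a level is
     * MaxBR (Table A-1) multiplied by cpbBrNalFactor, which depends on
     * the active profile (1200 for Baseline/Main/Extended, 1500 for
     * High). E.g. level 3.1 with MaxBR = 14000 and a Main profile
     * stream allows up to 14000 * 1200 = 16.8 Mbit/s. */
    static int
    level_fits_bitrate (unsigned int bitrate_bps, unsigned int max_br,
        unsigned int cpb_br_nal_factor)
    {
      return bitrate_bps <= max_br * cpb_br_nal_factor;
    }
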
Submit sequence parameter buffers only once, or when the bitstream
was reconfigured in a way that requires it. Always submit packed
sequence parameter buffers at the I-frame period, if the VA driver
needs them.
https://bugzilla.gnome.org/show_bug.cgi?id=722737
Make sure to submit the packed headers only if the underlying VA driver
requires those. Currently, only handle packed sequence and picture
headers.
https://bugzilla.gnome.org/show_bug.cgi?id=722737
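In practice this boils down to querying the VAConfigAttribEncPackedHeaders
attribute; a sketch of such a query using standard VA-API calls, with a
hypothetical helper name:

    #include <va/va.h>

    /* Sketch: ask the VA driver which packed headers it expects the
     * application to submit; the returned value is a bitmask of
     * VA_ENC_PACKED_HEADER_* flags (e.g. VA_ENC_PACKED_HEADER_SEQUENCE,
     * VA_ENC_PACKED_HEADER_PICTURE). */
    static unsigned int
    get_packed_headers_mask (VADisplay dpy, VAProfile profile,
        VAEntrypoint entrypoint)
    {
      VAConfigAttrib attrib;

      attrib.type = VAConfigAttribEncPackedHeaders;
      if (vaGetConfigAttributes (dpy, profile, entrypoint, &attrib, 1) !=
          VA_STATUS_SUCCESS || attrib.value == VA_ATTRIB_NOT_SUPPORTED)
        return 0;
      return attrib.value;
    }
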
The VAEncSequenceParameterBuffer.ip_period value represents the distance
between the I-frame and the next P-frame, so it also accounts for any
B-frames in between (e.g. an I B B P pattern yields an ip_period of 3).
This fixes rate control heuristics for certain VA drivers.
https://bugzilla.gnome.org/show_bug.cgi?id=722735
Fix level characterisation when the bitrate is automatically computed
from the active coding tools, i.e. derive the bitrate once the profile
is completely characterized, but before the level calculation process.
Document and rename a few functions here and there. Drop the code that
caps the num_bframes variable in reset_properties(), since it shall
have been checked beforehand, during properties initialization.
Clean up the GstBitWriter-related utility functions and simplify notations.
While we are at it, also make bitstream writing more robust should an
overflow occur. We could later optimize the writing of headers capped to
their maximum possible size by using the _unchecked() helper variants.
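As an illustration of the checked writing pattern (assuming the checked
GstBitWriter helpers return FALSE on overflow, as hinted by the
_unchecked() variants above; the bit writer header in use is the project's
private copy, so includes are omitted and the helper name is hypothetical):

    /* Sketch: accumulate the return values of the checked writers so a
     * single overflow makes the whole header write fail cleanly. */
    static gboolean
    write_rbsp_trailing_bits (GstBitWriter * bs)
    {
      gboolean ok = TRUE;

      ok &= gst_bit_writer_put_bits_uint32 (bs, 1, 1);  /* rbsp_stop_one_bit */
      ok &= gst_bit_writer_align_bytes (bs, 0);         /* alignment zero bits */
      return ok;
    }
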
Drop private header since it was originally used to expose internals
to the plugin element. The proper interface is now the properties API,
thus rendering private headers totally obsolete.
Various clean-ups to improve consistency and readability: rename some
variables, drop unused macro definitions, drop initialization of vars
that are zero-initialized from the base class, drop unnecessary casts,
allocate GPtrArrays with a destroy function.
Fix the frame cropping rectangle calculation to handle horizontal
resolutions that are not a multiple of 16 pixels, and also fix the
vertical cropping, which was incorrectly computed for progressive
sequences.
https://bugzilla.gnome.org/show_bug.cgi?id=722089
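A sketch of the corrected computation for 4:2:0 content, using the
CropUnitX/CropUnitY definitions from the H.264 specification; the helper
name is hypothetical:

    /* Sketch: the coded size is the macroblock-aligned size, so the
     * cropping offsets recover the display size. For 4:2:0,
     * CropUnitX = 2 and CropUnitY = 2 * (2 - frame_mbs_only_flag),
     * i.e. 2 for progressive and 4 for interlaced sequences. */
    static void
    fill_frame_cropping (unsigned int width, unsigned int height,
        unsigned int mb_width, unsigned int mb_height,
        unsigned int frame_mbs_only_flag,
        unsigned int *crop_right, unsigned int *crop_bottom)
    {
      const unsigned int crop_unit_x = 2;
      const unsigned int crop_unit_y = 2 * (2 - frame_mbs_only_flag);

      *crop_right = (16 * mb_width - width) / crop_unit_x;
      *crop_bottom = (16 * mb_height - height) / crop_unit_y;
    }
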
For non "Constant-QP" modes, we could provide more reasonable heuristics
for the target bitrate. In general, 48 bits per macroblock with all the
useful coding tools enable looks safe enough. Then, this rate is raised
by +10% to +15% for each coding tool that is disabled.
https://bugzilla.gnome.org/show_bug.cgi?id=719699
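A sketch of this heuristic; the exact per-tool percentages and the set of
tools considered are assumptions:

    #include <glib.h>

    /* Sketch: ~48 bits per macroblock per frame with all coding tools
     * enabled, inflated by 10-15% for every tool that is disabled. */
    static guint
    estimate_bitrate_bps (guint mb_width, guint mb_height, gdouble fps,
        gboolean use_cabac, gboolean use_bframes, gboolean use_dct8x8)
    {
      gdouble bits_per_mb = 48.0;

      if (!use_cabac)
        bits_per_mb *= 1.15;
      if (!use_bframes)
        bits_per_mb *= 1.10;
      if (!use_dct8x8)
        bits_per_mb *= 1.10;

      return (guint) (mb_width * mb_height * bits_per_mb * fps);
    }
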
Add support for "high-compression" tuning option. First, determine the
largest supported profile by the hardware. Next, check any target limit
set by the user. Then, enable each individual coding tool based on the
resulting profile_idc value to use.
https://bugzilla.gnome.org/show_bug.cgi?id=719696
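A sketch of the last step, enabling tools from the resulting profile_idc
(66 = Baseline, 77 = Main, 100 = High); the set of tools and their mapping
are simplified assumptions:

    #include <glib.h>

    /* Sketch: turn on each coding tool only if the selected profile
     * permits it: CABAC and B-frames from Main, 8x8 transform from High. */
    static void
    apply_high_compression_tuning (guint profile_idc, gboolean * use_cabac,
        gboolean * use_bframes, gboolean * use_dct8x8)
    {
      *use_cabac = (profile_idc >= 77);
      *use_bframes = (profile_idc >= 77);
      *use_dct8x8 = (profile_idc >= 100);
    }
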
Allow the user to specify the largest profile to use for encoding, due
to target decoder constraints. For instance, if CABAC entropy coding
is requested but only the "constrained-baseline" profile is desired,
then an error is returned during codec configuration.
Also make sure that the suitable profile we derived actually matches
what the HW can cope with.
https://bugzilla.gnome.org/show_bug.cgi?id=719694
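A sketch of that validation; names are hypothetical and the real check
likely covers more coding tools:

    #include <glib.h>

    /* Sketch: reject configurations where a requested coding tool needs
     * a higher profile than the user-imposed limit allows, e.g. CABAC
     * with a "constrained-baseline" limit (profile_idc < 77). */
    static gboolean
    validate_profile_limit (guint max_profile_idc, gboolean use_cabac,
        gboolean use_bframes)
    {
      if ((use_cabac || use_bframes) && max_profile_idc < 77)
        return FALSE;
      return TRUE;
    }
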
Refine the heuristic to determine the maximum size of a coded buffer
to account for the exact number of slices. set_context_info() is the
last step during codec reconfiguration, no additional change is done
afterwards, so re-using the num_slices field here is fine.
https://bugzilla.gnome.org/show_bug.cgi?id=719953
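A sketch of the refined estimate; the per-slice overhead constant is an
assumption:

    #include <glib.h>

    /* Sketch: start from the macroblock-layer worst case and add room
     * for each slice header (plus start code), now that the exact
     * number of slices is known. */
    static guint
    estimate_coded_buffer_size (guint mb_width, guint mb_height,
        guint max_bits_per_mb, guint num_slices)
    {
      guint size = mb_width * mb_height * max_bits_per_mb / 8;

      size += num_slices * (4 /* start code */ + 64 /* slice header, assumed */);
      return size;
    }
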
Automatically derive the minimum profile and level to be used for
encoding, based on the activated coding tools. By default, the encoder
will try to generate a bitstream that has the best chance of being
decoded on most platforms.
Also change the default profile to "constrained-baseline" so as to
ensure maximum compatibility when the stream is decoded.
https://bugzilla.gnome.org/show_bug.cgi?id=719691
Fix the lookup for a suitable HW profile, i.e. the one to be used by the
underlying hardware, based on the heuristics that characterize the SW
profile, i.e. the one used by the SW-level encoding logic.
Also fix constraint_set0_flag (A.2.1) and constraint_set1_flag (A.2.2)
so that they respectively match the Baseline and Main profiles.
https://bugzilla.gnome.org/show_bug.cgi?id=719827
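A sketch of the corrected flag derivation (66 = Baseline, 77 = Main; the
helper and field names are assumed):

    #include <glib.h>

    /* Sketch: constraint_set0_flag asserts Baseline constraints (A.2.1),
     * constraint_set1_flag asserts Main constraints (A.2.2); a
     * constrained-baseline stream sets both. */
    static void
    fill_constraint_set_flags (guint profile_idc,
        gboolean constrained_baseline, guint * set0_flag, guint * set1_flag)
    {
      *set0_flag = (profile_idc == 66) || constrained_baseline;
      *set1_flag = (profile_idc == 77) || constrained_baseline;
    }
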
The libgstvaapi core encoders are meant to support raw bitstreams only.
Hence, we always produce a stream in "byte-stream" format. However,
the "codec-data" buffer, which holds the SPS and PPS headers, is always
available. The "lengthSizeMinusOne" field is always set to 3 so that an
in-place "byte-stream" to "avc" format conversion can be performed.
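For illustration, a sketch of such an in-place conversion, assuming every
NAL unit in the buffer is prefixed with a 4-byte start code (which is what
makes lengthSizeMinusOne = 3 work); the helper name is hypothetical:

    #include <gst/gst.h>

    /* Sketch: overwrite each 4-byte Annex B start code (00 00 00 01)
     * with the 4-byte big-endian size of the NAL unit that follows it,
     * converting "byte-stream" data to "avc" format in place. */
    static void
    convert_byte_stream_to_avc (guint8 * data, gsize size)
    {
      gsize pos = 0;

      while (pos + 4 <= size) {
        gsize next = pos + 4;

        /* locate the next start code, or the end of the buffer */
        while (next + 4 <= size && !(data[next] == 0 && data[next + 1] == 0 &&
                data[next + 2] == 0 && data[next + 3] == 1))
          next++;
        if (next + 4 > size)
          next = size;

        GST_WRITE_UINT32_BE (data + pos, next - pos - 4);
        pos = next;
      }
    }
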
Various clean-ups to improve consistency and readability: rename some
variables, drop unused macro definitions, drop initialization of vars
that are zero-initialized from the base class, drop unnecessary casts.
Add encoder "tune" option to override the default behaviour that is to
favor maximum decoder compatibility at the expense of lower compression
ratios.
Expected tuning options to be developed are:
- "high-compression": improve compression, target best-in-class decoders;
- "low-latency": tune for low-latency decoding;
- "low-power": tune for encoding in low power / resources conditions.
Add the gst_vaapi_encoder_set_keyframe_period() interface to allow the
user to control the maximum distance between two keyframes. This new
property can only be set prior to gst_vaapi_encoder_set_codec_state().
A value of zero for "keyframe-period" causes it to be re-evaluated to
the actual framerate during encoder reconfiguration.
Improve codec reconfiguration so that it is performed only through a
single function. That is, remove the _set_context_info() hook, as
subclasses should not alter the parent GstVaapiContextInfo itself.
Besides, the VA context is constructed only at the final stages of
reconfigure().
Add an interface to communicate the encoder resolution and related info
like framerate, interlaced vs. progressive, etc. This new interface
supersedes gst_vaapi_encoder_set_format() and doesn't use any GstCaps,
but rather uses GstVideoCodecState.
Note that gst_vaapi_encoder_set_codec_state() is also a synchronization
point for codec config. This means that the encoder is reconfigured
there to match the latest properties.
Add interface to communicate configurable properties to the encoder.
This covers both the common ones (rate-control, bitrate), and the
codec specific properties.
https://bugzilla.gnome.org/show_bug.cgi?id=719529
Add the gst_vaapi_encoder_set_bitrate() interface to allow the user to
control the bitrate for encoding. Currently, changing this parameter is
only valid before the first frame is encoded. Should the value be modified
afterwards, then GST_VAAPI_ENCODER_STATUS_ERROR_OPERATION_FAILED is
returned.
https://bugzilla.gnome.org/show_bug.cgi?id=719529
Add gst_vaapi_encoder_set_rate_control() interface to request a new
rate control mode for encoding. Changing the rate control mode is
only valid prior to encoding the very first frame. Afterwards, an
error ("operation-failed") is issued.
https://bugzilla.gnome.org/show_bug.cgi?id=719529
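A hedged usage sketch combining the two calls above; the exact signatures,
enum values and bitrate unit are assumed from the descriptions, not taken
from the actual headers:

    /* Sketch: both calls must happen before the first frame is encoded,
     * otherwise "operation-failed" is returned. */
    static gboolean
    configure_rate_control (GstVaapiEncoder * encoder, guint bitrate_kbps)
    {
      if (gst_vaapi_encoder_set_rate_control (encoder,
              GST_VAAPI_RATECONTROL_CBR) != GST_VAAPI_ENCODER_STATUS_SUCCESS)
        return FALSE;
      if (gst_vaapi_encoder_set_bitrate (encoder, bitrate_kbps) !=
          GST_VAAPI_ENCODER_STATUS_SUCCESS)
        return FALSE;
      return TRUE;
    }
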
The previous fix was only valid to express the maximum size of the
macroblock layer, i.e. without any headers. Now, also account for
the slice headers and the picture header, as well as any other header
we might stuff into the VA coded buffer, e.g. sequence headers.
Fix the coded buffer size for each codec. A generic issue was that the
number of macroblocks was incorrectly computed. The second issue was
specific to MPEG-2, where the max number of bits per macroblock, as
defined by the standard, was incorrectly mapped to the (lower) H.264
requirement, i.e. 4608 bits vs. a 3200 bits limit.
Change get_context_info() into a set_context_info() function that
initializes common defaults into the base class, thus allowing the
subclasses to specialize the context info further on.
The set_context_info() hook is also the location where additional
context-specific data could be initialized. At this point, we are
guaranteed to have a valid video resolution and framerate, i.e.
gst_vaapi_encoder_set_format() was called beforehand.
Clean the public APIs up so as to better align with the decoder APIs.
Most importantly, gst_vaapi_encoder_get_buffer() is changed to only
return the VA coded buffer proxy. Also provide useful documentation
for the public APIs.
Fix the GstVaapiEncoderClass parent class type. Make sure to validate
subclass hooks as early as possible, i.e. in gst_vaapi_encoder_init(),
thus avoiding useless run-time checks. Also simplify the subclass
initialization process to be less error prone.
Refactor the GstVaapiCodedBuffer APIs so as to more clearly separate
the public and private interfaces. Besides, the map/unmap APIs should
not be exposed as is; appropriate accessors should be provided instead.
* GstVaapiCodedBuffer: VA coded buffer abstraction
- gst_vaapi_coded_buffer_get_size(): get coded buffer size.
- gst_vaapi_coded_buffer_copy_into(): copy coded buffer into GstBuffer
* GstVaapiCodedBufferPool: pool of VA coded buffer objects
- gst_vaapi_coded_buffer_pool_new(): create a pool of coded buffers of
the specified max size, and bound to the supplied encoder
* GstVaapiCodedBufferProxy: pool-allocated VA coded buffer object proxy
- gst_vaapi_coded_buffer_proxy_new_from_pool(): create coded buf from pool
- gst_vaapi_coded_buffer_proxy_get_buffer(): get underlying coded buffer
- gst_vaapi_coded_buffer_proxy_get_buffer_size(): get coded buffer size
Rationale: more optimized transfer functions might be provided in the
future, thus rendering the map/unmap mechanism obsolete or sub-optimal.
A usage sketch of the new interfaces follows this entry.
https://bugzilla.gnome.org/show_bug.cgi?id=719775
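A hedged usage sketch of the interfaces listed above; the argument order
of _copy_into() and the proxy unref call are assumed from the descriptions
rather than the actual headers:

    /* Sketch: pull a coded buffer proxy from the pool and copy its
     * contents into a plain GstBuffer via the new accessor, without
     * touching map/unmap directly. */
    static GstBuffer *
    fetch_coded_data (GstVaapiCodedBufferPool * pool)
    {
      GstVaapiCodedBufferProxy *proxy;
      GstVaapiCodedBuffer *coded_buf;
      GstBuffer *buffer;

      proxy = gst_vaapi_coded_buffer_proxy_new_from_pool (pool);
      if (!proxy)
        return NULL;

      coded_buf = gst_vaapi_coded_buffer_proxy_get_buffer (proxy);
      buffer = gst_buffer_new_and_alloc (
          gst_vaapi_coded_buffer_get_size (coded_buf));
      if (!gst_vaapi_coded_buffer_copy_into (buffer, coded_buf)) {
        gst_buffer_unref (buffer);
        buffer = NULL;
      }

      gst_vaapi_coded_buffer_proxy_unref (proxy);
      return buffer;
    }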