Change the hard-coded range of the quality-level property from {1-8}
to {1-7}, since that is the range the Intel open-source driver supports.
Also perform the range clamping only if the user-provided quality level
is greater than the maximum supported by the driver, because there could
be non-Intel drivers reporting a lower value than the hard-coded maximum
of 7.
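A minimal sketch of the intended clamping, with illustrative names (not
the actual gstreamer-vaapi helpers):

    #include <glib.h>

    /* Hypothetical helper: clamp only when the user value exceeds the
     * driver's maximum, so drivers with a smaller range than the
     * hard-coded 7 still get the value they report. */
    static guint
    clamp_quality_level (guint user_level, guint driver_max)
    {
      if (driver_max > 0 && user_level > driver_max)
        return driver_max;
      return user_level;
    }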
https://bugzilla.gnome.org/show_bug.cgi?id=783567
Refactor the set_config() virtual method with a cleaner approach to
allocator instantiation, when the allocator is not set or is not valid
for the pool.
https://bugzilla.gnome.org/show_bug.cgi?id=783599
The vaapi video decoders might have allocation caps different from the
negotiation caps, thus the GstVideoMeta shall use the negotiation caps,
not the allocation caps.
This was done before by reusing gst_allocator_get_vaapi_video_info(),
storing there the negotiation caps if they differed from the allocation
ones, but this strategy fell short when the allocator had to be reset
in the vaapi buffer pool, since we need both.
This patch adds gst_allocator_set_vaapi_negotiated_video_info() and
gst_allocator_get_vaapi_negotiated_video_info() to store the
negotiated video info in the allocator, and distinguish it from
the allocation video info.
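A minimal sketch of how such helpers could look, assuming the negotiated
GstVideoInfo is attached to the allocator as qdata (the quark name and
the storage mechanism are assumptions, not the actual implementation):

    #include <gst/video/video.h>

    #define NEGOTIATED_VINFO_QUARK \
      g_quark_from_static_string ("GstVaapiNegotiatedVideoInfo")

    /* Hypothetical setter: keep a copy of the negotiated video info on
     * the allocator, separate from the allocation video info. */
    static void
    allocator_set_negotiated_vinfo (GstAllocator * allocator,
        const GstVideoInfo * vinfo)
    {
      g_object_set_qdata_full (G_OBJECT (allocator), NEGOTIATED_VINFO_QUARK,
          vinfo ? gst_video_info_copy (vinfo) : NULL,
          (GDestroyNotify) gst_video_info_free);
    }

    /* Hypothetical getter: NULL means no negotiated info was stored. */
    static const GstVideoInfo *
    allocator_get_negotiated_vinfo (GstAllocator * allocator)
    {
      return g_object_get_qdata (G_OBJECT (allocator), NEGOTIATED_VINFO_QUARK);
    }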
https://bugzilla.gnome.org/show_bug.cgi?id=783599
Renamed local video info structures in the set_config() virtual
method. The purpose of the renaming is to clarify the origin of those
structures: whether they come from the passed caps parameter
(new_allocation_vinfo) or from the configured allocator
(allocator_vinfo).
https://bugzilla.gnome.org/show_bug.cgi?id=783599
Renamed the private GstVideoInfo structures video_info to allocation_vinfo
and alloc_info to vmeta_vinfo.
The purpose of this renaming is to clarify the origin and purpose of
these private variables:
video_info (now allocation_vinfo) comes from the bufferpool
configuration. It describes the physical video resolution to be
allocated by the allocator, which may be different from the
negotiated one.
alloc_info (now vmeta_vinfo) comes from the negotiated caps in
the pipeline. It represents how the frame is going to be mapped
using the video meta.
In Intel's VA-API backend, the allocation_vinfo resolution is
bigger than the vmeta_vinfo one.
https://bugzilla.gnome.org/show_bug.cgi?id=783599
Only set the framerate parameter if the framerate numerator and
denominator are greater than zero.
Otherwise, with the Intel Gen6 driver, a warning is raised and the
bitrate control is disabled.
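A minimal sketch of the guard, assuming the encoder keeps the configured
framerate in a GstVideoInfo (helper name is illustrative):

    #include <gst/video/video.h>

    /* Hypothetical check: the framerate misc parameter is only submitted
     * when both numerator and denominator are greater than zero. */
    static gboolean
    framerate_is_usable (const GstVideoInfo * vinfo)
    {
      return GST_VIDEO_INFO_FPS_N (vinfo) > 0
          && GST_VIDEO_INFO_FPS_D (vinfo) > 0;
    }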
Original-patch-by: Hyunjun Ko <zzoon@igalia.com>
https://bugzilla.gnome.org/show_bug.cgi?id=783532
Instead of recalculating the miscellaneous buffer parameters for
every buffer, they are now calculated only once, when the encoder is
configured, and for every buffer the same structures are simply copied.
https://bugzilla.gnome.org/show_bug.cgi?id=783449
This patch intends to decouple the assignment of values in the
parameter structures from the setting of the VA buffer's parameters.
It may lead to some issues, since HRD, framerate or control rate may
not be handled by the specific encoder while still being set in the
VA buffer's parameters.
It is left like this because this is just a transitional patch.
https://bugzilla.gnome.org/show_bug.cgi?id=783449
According to the VA documentation:
The framerate is specified as a number of frames per second,
as a fraction. The denominator of the fraction is given in
the top half (the high two bytes) of the framerate field, and
the numerator is given in the bottom half (the low two bytes).
For example, if framerate is set to (100 << 16 | 750), this is
750 / 100, hence 7.5fps.
If the denominator is zero (the high two bytes are both zero)
then it takes the value one instead, so the framerate is just
the integer in the low 2 bytes.
This patch fixes the framerate calculation in the VP8 encoder
accordingly.
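A minimal sketch of the packing described above (helper name is
illustrative):

    #include <stdint.h>

    /* Pack a fraction into the VA framerate field: numerator in the low
     * two bytes, denominator in the high two bytes.  A zero denominator
     * is interpreted by the driver as 1. */
    static uint32_t
    pack_va_framerate (uint32_t num, uint32_t den)
    {
      return (den << 16) | (num & 0xffff);
    }

    /* pack_va_framerate (750, 100) == (100 << 16 | 750), i.e. 7.5 fps */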
https://bugzilla.gnome.org/show_bug.cgi?id=783449
Move the frame-rate parameter from ensure_misc_params() to
ensure_control_rate_params(), since it is only meaningful when the
control rate is either VBR or CBR.
https://bugzilla.gnome.org/show_bug.cgi?id=783449
Move the Hypothetical Reference Decoder (HRD) parameter from
ensure_misc_params() to ensure_control_rate_params(), since it shall
only be defined when the control rate is either VBR or CBR.
https://bugzilla.gnome.org/show_bug.cgi?id=783449
Instead of filling the control rate parameter in ensure_misc_params(),
this patch refactors it out, as a first step towards merging the same
code for all the encoders.
https://bugzilla.gnome.org/show_bug.cgi?id=783449
Instead of using a proxy to store the buffer quality level, the
encoder now uses the native VA structure, which is copied into the
dynamically allocated VAEncMiscParameterBuffer.
This approach is computationally less expensive.
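A minimal sketch of the idea, assuming a plain libva call sequence
(names and error handling are simplified, not the actual encoder code):

    #include <stdlib.h>
    #include <string.h>
    #include <va/va.h>

    /* Hypothetical submit: the natively kept quality-level structure is
     * copied verbatim into a dynamically allocated misc parameter buffer. */
    static VABufferID
    submit_quality_level (VADisplay dpy, VAContextID ctx,
        const VAEncMiscParameterBufferQualityLevel * quality)
    {
      size_t size = sizeof (VAEncMiscParameterBuffer) + sizeof (*quality);
      VAEncMiscParameterBuffer *misc = calloc (1, size);
      VABufferID buf_id = VA_INVALID_ID;

      misc->type = VAEncMiscParameterTypeQualityLevel;
      memcpy (misc->data, quality, sizeof (*quality));

      vaCreateBuffer (dpy, ctx, VAEncMiscParameterBufferType, size, 1, misc,
          &buf_id);
      free (misc);
      return buf_id;
    }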
Right now, the H264 and HEVC encoders expose the number of slices to
process as a property. But each driver may set a maximum number of
slices, depending on the supported profile and entrypoint.
This patch checks the requested num_slices against the maximum
permitted by the driver and by the media size.
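A minimal sketch of the driver-side check, assuming a direct
VAConfigAttribEncMaxSlices query (the media-size bound is passed in by
the caller here; names are illustrative):

    #include <va/va.h>

    /* Hypothetical clamp of the requested slice count against the maximum
     * reported by the driver for the given profile/entrypoint. */
    static unsigned int
    clamp_num_slices (VADisplay dpy, VAProfile profile,
        VAEntrypoint entrypoint, unsigned int requested,
        unsigned int media_limit)
    {
      VAConfigAttrib attrib = { .type = VAConfigAttribEncMaxSlices };

      if (vaGetConfigAttributes (dpy, profile, entrypoint, &attrib, 1)
          == VA_STATUS_SUCCESS && attrib.value != VA_ATTRIB_NOT_SUPPORTED
          && attrib.value > 0 && requested > attrib.value)
        requested = attrib.value;

      return requested < media_limit ? requested : media_limit;
    }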
https://bugzilla.gnome.org/show_bug.cgi?id=780955
- Use gst_element_send_event() instead of gst_pad_push_event()
- Don't zero the App structure
- Check for pipeline parsing errors
- Only fetch vaapisink when setting its properties
When the state of vaapisink is changed from PLAYING to NULL, the
handle_events flag is set to FALSE and never restored, so the event
thread never runs again.
Hence we should only allow setting the flag when the user requests it.
https://bugzilla.gnome.org/show_bug.cgi?id=782543
Since we started using VPP in VaapiWindowX11, we need to handle the
case where the source rectangle and the window's size are different.
So, once VPP has converted the frame to another format, we should use
the size of the VPP's surface as the source rectangle. Otherwise, the
frame is cropped according to the previous size of the source rectangle.
https://bugzilla.gnome.org/show_bug.cgi?id=782542
This implements a pipeline to show the difference between ROI and
non-ROI regions.
See the comments in the code for details.
https://bugzilla.gnome.org/show_bug.cgi?id=768248
Signed-off-by: Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
Handle the new custom event GstVaapiEncoderRegionOfInterest to
enable/disable an ROI region.
Document how to use the new event.
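A minimal sketch of how an application might emit the event; the
structure field names ("roi-x", "roi-y", "roi-width", "roi-height",
"roi-value") and the event direction are assumptions, only the event
name comes from this patch:

    #include <gst/gst.h>

    /* Hypothetical sender: push a custom ROI event to the encoder. */
    static gboolean
    send_roi_event (GstElement * encoder, gint x, gint y,
        guint width, guint height, gint delta_qp)
    {
      GstStructure *s = gst_structure_new ("GstVaapiEncoderRegionOfInterest",
          "roi-x", G_TYPE_INT, x,
          "roi-y", G_TYPE_INT, y,
          "roi-width", G_TYPE_UINT, width,
          "roi-height", G_TYPE_UINT, height,
          "roi-value", G_TYPE_INT, delta_qp, NULL);

      return gst_element_send_event (encoder,
          gst_event_new_custom (GST_EVENT_CUSTOM_DOWNSTREAM, s));
    }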
https://bugzilla.gnome.org/show_bug.cgi?id=768248
Signed-off-by: Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
$ simple-encoder -r inputfile.y4m
And you'll get an H264 output file with two regions of interest.
https://bugzilla.gnome.org/show_bug.cgi?id=768248
Signed-off-by: Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
Set the ROI parameters, which are provided via
gst_vaapi_encoder_add_roi(), while encoding each frame.
https://bugzilla.gnome.org/show_bug.cgi?id=768248
Signed-off-by: Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
Query whether the driver supports "Region of Interest" (ROI) during
the config creation.
This attribute conveys whether the driver supports region-of-interest
(ROI) encoding, based on user-provided ROI rectangles. The attribute value is
partitioned into fields as defined in the VAConfigAttribValEncROI union.
If ROI encoding is supported, the ROI information is passed to the driver
using VAEncMiscParameterTypeROI.
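A minimal sketch of the query, with an illustrative helper name:

    #include <va/va.h>

    /* Query VAConfigAttribEncROI for the given profile/entrypoint and
     * return the number of supported ROI regions (0 means unsupported). */
    static unsigned int
    query_max_roi_regions (VADisplay dpy, VAProfile profile,
        VAEntrypoint entrypoint)
    {
      VAConfigAttrib attrib = { .type = VAConfigAttribEncROI };
      VAConfigAttribValEncROI roi;

      if (vaGetConfigAttributes (dpy, profile, entrypoint, &attrib, 1)
          != VA_STATUS_SUCCESS || attrib.value == VA_ATTRIB_NOT_SUPPORTED)
        return 0;

      roi.value = attrib.value;
      return roi.bits.num_roi_regions;
    }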
https://bugzilla.gnome.org/show_bug.cgi?id=768248
Signed-off-by: Víctor Manuel Jáquez Leal <vjaquez@igalia.com>
This patch adds the handling of VAEncMiscParameterTypeQualityLevel,
in gstreamer-vaapi encoders:
The encoding quality could be set through this structure, if the
implementation supports multiple quality levels. The quality level set
through this structure is persistent over the entire coded sequence, or
until a new structure is being sent. The quality level range can be queried
through the VAConfigAttribEncQualityRange attribute. A lower value means
higher quality, and a value of 1 represents the highest quality. The quality
level setting is used as a trade-off between quality and speed/power
consumption, with higher quality corresponding to lower speed and higher
power consumption.
The quality level is set by the element's "quality-level" parameter, with a
hard-coded range of 1 to 8.
Later, when the encoder is configured at run time, just before it starts
processing, the quality level is scaled to the codec range. If
VAConfigAttribEncQualityRange is not available in the used VA backend, then
the quality level is set to zero, which means "disabled".
All the available codecs now process this parameter if it is available.
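A minimal sketch of the scaling step, with illustrative names and
rounding (not the actual encoder code):

    #include <va/va.h>

    /* Hypothetical scaling of the "quality-level" property (1..8) onto
     * the driver range reported by VAConfigAttribEncQualityRange; 0 means
     * the quality level is disabled. */
    static unsigned int
    scale_quality_level (VADisplay dpy, VAProfile profile,
        VAEntrypoint entrypoint, unsigned int user_level)
    {
      VAConfigAttrib attrib = { .type = VAConfigAttribEncQualityRange };
      unsigned int codec_max;

      if (vaGetConfigAttributes (dpy, profile, entrypoint, &attrib, 1)
          != VA_STATUS_SUCCESS || attrib.value == VA_ATTRIB_NOT_SUPPORTED
          || attrib.value <= 1)
        return 0;

      codec_max = attrib.value;   /* 1 is the highest quality, codec_max the lowest */
      return (user_level * codec_max + 4) / 8;  /* map 1..8 onto 1..codec_max */
    }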
https://bugzilla.gnome.org/show_bug.cgi?id=778733
Signed-off-by: Víctor Manuel Jáquez Leal <vjaquez@igalia.com>