Compare commits

...

54 commits

Author SHA1 Message Date
Yury Shatz 1e9d7c54fb Merge branch 'tsmux-dont-write-start-unit-indicator-twice' into 'main'
tsmux: do not write start_unit_indicator twice

See merge request gstreamer/gstreamer!6289
2024-04-27 19:42:41 +00:00
Nirbheek Chauhan d7eeb62f38 meson: Fix Python library searching on Windows
Neither LIBDIR nor LIBPL is set with the native Windows Python
(unlike MSYS2), so we need to use `prefix`, which takes us to the
root dir of the Python installation.

The name is also different: it's python312.dll, not python3.12.dll.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6734>
2024-04-27 01:30:21 +00:00
Nirbheek Chauhan 753aeccde7 meson: Fix Python library name fetching on Windows
`python.get_variable('FOO', [])` becomes `python.get_variable('FOO')`
due to how Meson treats empty arrays in arguments, which breaks the
fallback feature of get_variable().

So we need to actually check whether the variable exists before trying
to fetch it.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6734>
2024-04-27 01:30:21 +00:00
Tim-Philipp Müller 7074849c5c exif: add debug category
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6741>
2024-04-27 00:19:30 +00:00
Xavier Claessens f0ef33d018 unixfd: Close file descriptors on error
After calling g_unix_fd_list_steal_fds() and before calling
gst_fd_allocator_alloc(), we are responsible for closing those fds.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6532>
2024-04-26 18:52:19 +00:00
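A minimal C sketch of the ownership rule from the commit above; the function name, the single-fd assumption and the allocator handling are illustrative, not the element's actual code.

```c
#include <unistd.h>
#include <gio/gunixfdlist.h>
#include <gst/allocators/gstfdmemory.h>

/* Sketch only: after g_unix_fd_list_steal_fds() the descriptors are ours,
 * so any error path must close them before bailing out. */
static GstMemory *
alloc_from_single_fd (GstAllocator * allocator, GUnixFDList * fd_list, gsize size)
{
  gint n_fds = 0;
  gint *fds = g_unix_fd_list_steal_fds (fd_list, &n_fds);
  GstMemory *mem = NULL;

  if (n_fds == 1)
    mem = gst_fd_allocator_alloc (allocator, fds[0], size,
        GST_FD_MEMORY_FLAG_NONE);

  if (mem == NULL) {
    /* we still own the fds, so close them ourselves */
    for (gint i = 0; i < n_fds; i++)
      close (fds[i]);
  }

  g_free (fds);
  return mem;
}
```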
Xavier Claessens 1f8accbc8d unixfdsink: Take segment into account when converting timestamps
Also rename `calculate_timestamp()` to `to_monotonic()` and
`from_monotonic()`, which better describe what they do.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6532>
2024-04-26 18:52:19 +00:00
Xavier Claessens 7f47dba299 unixfd: Allow sending buffers with no memories
There is no reason not to allow it, and it is useful for simple unit
tests.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6532>
2024-04-26 18:52:18 +00:00
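For illustration, a buffer with zero memories only needs core GstBuffer API; the pad used to push it is an assumption.

```c
GstBuffer *buf = gst_buffer_new ();                    /* no GstMemory attached */
g_assert_cmpuint (gst_buffer_n_memory (buf), ==, 0);
gst_pad_push (srcpad, buf);                            /* 'srcpad' is illustrative */
```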
Víctor Manuel Jáquez Leal 1f080391ed vulkan: replace gst_vulkan_queue_create_decoder() with gst_vulkan_decoder_new_from_queue()
The purpose of this refactor is to hide decoding code from public API.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6723>
2024-04-26 16:24:22 +00:00
Víctor Manuel Jáquez Leal 18c32272bd vulkan: conceal unused decoder symbols
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6723>
2024-04-26 16:24:22 +00:00
Víctor Manuel Jáquez Leal 668b395a38 vulkan: conceal decoder from public API
Since we don't want to expose video decoding API outside of GStreamer, the
header is removed from installation and both source files are renamed as
-private.

The header must remain in gst-libs because it is referred to by GstVulkanQueue,
which is the decoder factory.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6723>
2024-04-26 16:24:22 +00:00
Víctor Manuel Jáquez Leal 547e2899d1 vaallocator: disable derived mapping altogether for Mesa <23.3
First, derived mapping was disabled for P010 formats, but there's also an
issue with interlaced frames.

It would be possible to disable derived mapping only for interlaced content
(the H.264 decoder and vadeinterlace), but that would spread the hacks across
the code. It's simpler and more contained to disable derived mapping completely
for Mesa <23.3.

Fixes: #3450
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6729>
2024-04-26 15:41:39 +00:00
Víctor Manuel Jáquez Leal 7eb08feeee va: videoformat: use video library to get DRM fourcc
Instead of duplicating the GStreamer format to DRM fourcc mapping, this patch
uses the GstVideo library helpers. This doubles the cost of a lookup, since
both lists are traversed, but it's less error prone.

Partially reverts commit 547f3e8622.

Fixes: #3354
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6731>
2024-04-26 15:10:36 +02:00
Edward Hervey 8737b9ca84 playbin3: Handle combiner update in case of errors
The assertion that was present before is a bit too harsh, since there is now
an (understandable) use-case where this could happen.

In a gapless use-case with two files containing the same stream type (e.g. audio),
the first one *does* expose a collection with an audio stream, but decoding
fails (for whatever reason).

That would cause us to have configured an audio combiner which was never
used (i.e. not active).

Then the second file plays and we (wrongly) assume it should be activated
... whereas the combiner was indeed already present.

Demote the assertion to a warning and properly handle it.

Fixes https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3389

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6737>
2024-04-26 11:31:32 +00:00
Tim Blechmann ff7b41ac86 soup: fix thread name
Thread names should be shorter than 16 characters, otherwise they won't be
shown on Linux.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6736>
2024-04-26 09:45:49 +08:00
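A small sketch of the Linux limit being referenced; the thread names used here are illustrative, not the ones the commit picked.

```c
#define _GNU_SOURCE
#include <pthread.h>

/* On Linux a thread name may be at most 16 bytes including the trailing NUL,
 * i.e. 15 visible characters; longer names fail with ERANGE and the thread
 * keeps its previous name. */
static void
name_current_thread (void)
{
  pthread_setname_np (pthread_self (), "gstsouploop");                  /* fits */
  /* pthread_setname_np (pthread_self (), "gst-soup-session-io-thread");   rejected */
}
```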
Hou Qi be7ba5ac51 wlwindow: free staged buffer when do gst_wl_window_finalize
If the buffer rate received by waylandsink is high enough to cause frame
drops, the cached staged buffer is replaced when the next buffer needs to
be rendered and is freed after the redraw. But there is a chance of a memory
leak if we end without a redraw, so the staged buffer needs to be freed in
gst_wl_window_finalize().

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6670>
2024-04-25 23:55:42 +00:00
Seungha Yang 4ac46ce82b d3d12screencapturesrc: Performance improvement
Process captured frame using d3d11 instead of d3d12, and use shared
fence when copying processed d3d11 texture to d3d12 resource.
In this way, capture CPU thread does not need to wait for fence signal.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6733>
2024-04-25 22:51:01 +00:00
Xavier Claessens 364d0ff45d pad: gst_pad_set_offset is only reliable on source pads
Setting an offset on sink pads won't re-push the segment event, which means
buffer running times won't be adjusted. Better to warn about this than to
silently not work.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6464>
2024-04-25 13:49:03 +00:00
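A short usage sketch of the behaviour described above; the pad variables are illustrative.

```c
/* Reliable: on a source pad, the next buffer re-pushes the sticky events
 * (notably the segment) with the adjusted running time. */
gst_pad_set_offset (srcpad, 500 * GST_MSECOND);

/* Unreliable (now warns): on a sink pad the upstream segment is not
 * re-pushed, so buffer running times stay unadjusted. */
gst_pad_set_offset (sinkpad, 500 * GST_MSECOND);
```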
Nirbheek Chauhan e7598ed521 rsvg: Disable deprecations instead of porting to new librsvg API
`rsvg_handle_get_dimensions()` and `rsvg_handle_render_cairo()` are
deprecated, and the replacement librsvg functions as specified in the
migration guide are `rsvg_handle_get_intrinsic_size_in_pixels()` and
`rsvg_handle_render_document()`.

However, those are not drop-in replacements, and actually have
breaking semantics for our use-case:

1. `intrinsic_size_in_pixels()` requires SVGs to have width+height or
   the viewBox attribute, but `get_dimensions()` does not. It will
   calculate the geometry based on element extents recursively.
2. `render_cairo()` simply renders the SVG at its intrinsic size on
   the specified surface starting at the top-left, maintaining
   whatever transformations have been applied to the cairo surface,
   including distorted aspect ratio.
   However, `render_document()` does not do that, it is specifically
   for rendering at the specified aspect ratio inside the specified
   viewport, and if you specify a viewPort that does not match the
   aspect ratio of the SVG, librsvg will center it.

Matching the old behaviour with the new APIs is a lot of work for no
benefit. We'd be duplicating code that is already there in librsvg in
one case and undoing work that librsvg is doing in the other case.

The aspect ratio handling in this element is also kinda atrocious.
There is no option to scale the SVG while maintaining the aspect
ratio. Overall, the element needs a rewrite.

Let's just disable deprecations. The API is not going anywhere.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6726>
2024-04-25 12:43:07 +00:00
Nirbheek Chauhan 49f9a1e224 Revert "rsvgdec: Fix uses of librsvg functions deprecated since 2.52"
This reverts commit b8db473955.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6726>
2024-04-25 12:43:07 +00:00
Jordan Petridis 83694a1094 ci: Remove pip install version limits for meson/hotdoc
We used to have them pinned to avoid unexpected issues when we wanted to
update the image; however, we haven't needed the pins lately and we should
be fine always installing the latest stable version.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6656>
2024-04-24 15:01:27 +00:00
Jordan Petridis 472d1b52d3 ci: Add a simple build job based on debian
The gstreamer-rs repos already use Debian-based images, which we can
later base on this one. Additionally, it's good to have another distro
target so we avoid weird Fedora-isms where possible.

It will also be simpler to keep this image up to date, as we don't need
to run the test suite against this build as well.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6656>
2024-04-24 15:01:27 +00:00
Víctor Manuel Jáquez Leal d2c8593b2e vkswapper: choose color space according with format
The swapper's surface formats contain the color space for each supported format.
Instead of hard-coding the color space, return the value associated with the
negotiated Vulkan format.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6725>
2024-04-24 13:53:18 +00:00
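A sketch of that selection logic in plain Vulkan terms; the helper name and the fallback value are assumptions, only the VkSurfaceFormatKHR pairing of format and color space comes from the Vulkan API.

```c
#include <stdint.h>
#include <vulkan/vulkan.h>

static VkColorSpaceKHR
color_space_for_format (const VkSurfaceFormatKHR * formats, uint32_t n_formats,
    VkFormat negotiated)
{
  for (uint32_t i = 0; i < n_formats; i++) {
    if (formats[i].format == negotiated)
      return formats[i].colorSpace;
  }
  /* fall back to the value that used to be hard-coded */
  return VK_COLOR_SPACE_SRGB_NONLINEAR_KHR;
}
```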
Víctor Manuel Jáquez Leal f62574cb5f tests: vulkan: split decoder test and parameters
Thus they can be reused for the encoder test.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6721>
2024-04-24 12:54:12 +00:00
Jordan Petridis d72d79d85a pre-commit: Avoid prefixing string comparisons with x
This was working around bugs in various shells, but this
problem has been fixed for a decade now.

https://www.shellcheck.net/wiki/SC2268

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6652>
2024-04-24 12:37:31 +00:00
Jordan Petridis 490deafcbe git-hooks/pre-commit-python.hook: Fix typos
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6652>
2024-04-24 12:37:31 +00:00
Jordan Petridis a12881e2e4 git-hooks/pre-commit-python.hook: Specify encoding for the open call
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6652>
2024-04-24 12:37:31 +00:00
Jordan Petridis 7b81d081ad pre-commit: Stop using legacy backticks in shell
We can replace them with the simpler $(foo) syntax instead.

https://www.shellcheck.net/wiki/SC2006

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6652>
2024-04-24 12:37:31 +00:00
Edward Hervey 2aba1c86e9 videodecoder: Use a frame duration for QoS
We prefer to use the frame stop position when checking for QoS, since we don't
want to drop a frame which is only partially late.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6689>
2024-04-24 09:19:22 +00:00
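In sketch form, ignoring the segment/running-time conversion the real code performs; the frame fields are from GstVideoCodecFrame and 'earliest_time' stands in for the deadline from the latest QoS event.

```c
GstClockTime stop = frame->pts;
if (GST_CLOCK_TIME_IS_VALID (frame->duration))
  stop += frame->duration;

/* Only drop when even the *end* of the frame misses the deadline,
 * i.e. the frame is fully late, not just partially late. */
if (GST_CLOCK_TIME_IS_VALID (earliest_time) && stop < earliest_time) {
  /* drop the frame */
}
```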
Elliot Chen a7d0b07406 gstplay: query seek information again in playing state for live stream
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6681>
2024-04-24 07:01:21 +00:00
Edward Hervey 9e4cb46bd4 validate/flvdemux: Stop spamming audio/video on test
Use the sinks specified by the runner

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6727>
2024-04-24 07:47:42 +02:00
Haihua Hu 560dc511f7 wlwindow: clear configure mutex and cond when finalize
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6722>
2024-04-24 01:02:15 +09:00
Edward Hervey 5d705ed923 ges/tools: Use new GstEncodingProfile function from pbutils
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6712>
2024-04-23 14:27:00 +00:00
Edward Hervey ad8c42ba06 encoding-profile: Make (de)serialization functions public
This is more convenient and cheaper than going through the `g_value_convert()`
hoops

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6712>
2024-04-23 14:27:00 +00:00
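A usage sketch of the now-public helpers (gst_encoding_profile_from_string() and gst_encoding_profile_to_string(), also visible in the GIR additions further down); the profile string is the Ogg/Theora/Vorbis default used by ges-launch below.

```c
GstEncodingProfile *profile =
    gst_encoding_profile_from_string ("application/ogg:video/x-theora:audio/x-vorbis");

if (profile != NULL) {
  gchar *str = gst_encoding_profile_to_string (profile);
  g_print ("%s\n", str);
  g_free (str);
  g_object_unref (profile);
}
```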
Edward Hervey b3c9f598aa bad/utils: Simplify get_file_extension
By using g_strrstr

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6715>
2024-04-23 12:08:19 +00:00
Edward Hervey 131679b9d0 mpegtsbase: Fix Program equality check
There was an issue with this equality check, whose purpose was to figure out what
to do with PCR PIDs (whether or not they were part of the streams present) and
whether we ignore PCR or not.

Turns out ... we already took care of that further up in the function.

The length check can be simplified by just checking whether the lengths of
the *original* PMT and the new PMT are identical. Since we don't store "magic"
PCR streams in those, we can just use them as-is.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6713>
2024-04-23 11:15:53 +00:00
Seungha Yang cecb0f2148 d3d12decoder: Lock DPB while building command
Since the DPB resource can be modified from the output thread, protect
it while building the command list.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6709>
2024-04-23 10:08:19 +00:00
Seungha Yang 27c02a0b80 d3d12decoder: Hold reference pictures in fence data
Keep reference pictures alive while decoding commands are being executed.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6709>
2024-04-23 10:08:19 +00:00
Seungha Yang 0f5f170a40 d3d12vp9dec: Disallow resolution change to larger size on non-keyframe
Intel GPUs seem to crash when this happens.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6709>
2024-04-23 10:08:18 +00:00
Edward Hervey 376aaa828d decodebin3: Remove custom stream-start field if present
This field is added by urisourcebin so that we can avoid double-parsing. It's
no longer needed after that.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6690>
2024-04-23 07:09:21 +00:00
Edward Hervey 049665ccaa urisourcebin2: Adaptive demuxers don't require another parsebin
By setting the same field on the GST_EVENT_STREAM_START, decodebin3 will be
able to avoid plugging in an extra parsebin.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6690>
2024-04-23 07:09:21 +00:00
Edward Hervey 4e5a54612e adaptivedemux2: Answer GST_QUERY_CAPS
If we have generic caps, we can answer the query.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6690>
2024-04-23 07:09:21 +00:00
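A sketch of what answering GST_QUERY_CAPS looks like in a pad query handler; the static pad template stands in for the element's "generic caps" and is an assumption.

```c
static gboolean
src_query (GstPad * pad, GstObject * parent, GstQuery * query)
{
  switch (GST_QUERY_TYPE (query)) {
    case GST_QUERY_CAPS:{
      /* 'src_template' is an assumed GstStaticPadTemplate */
      GstCaps *caps = gst_static_pad_template_get_caps (&src_template);
      gst_query_set_caps_result (query, caps);
      gst_caps_unref (caps);
      return TRUE;
    }
    default:
      return gst_pad_query_default (pad, parent, query);
  }
}
```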
Edward Hervey 6b43e4e19f adaptivedemux2: Refactor output slot creation
Set as much information as possible on the slot (including the associated
track) *before* the associated source pad is added to the element.

We need this so that incoming events/queries can be replied to if they are
received while adding the pad.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6690>
2024-04-23 07:09:21 +00:00
Johan Sternerup deddcbdc66 basesrc: protect segment_seqnum/pending with object lock
In a few places the object lock was not taken when writing to
segment_pending and segment_seqnum.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6671>
2024-04-22 18:13:46 +00:00
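The locking pattern in sketch form; the priv field names are basesrc internals quoted from the commit, shown for illustration only.

```c
GST_OBJECT_LOCK (basesrc);
basesrc->priv->segment_pending = TRUE;
basesrc->priv->segment_seqnum = gst_util_seqnum_next ();
GST_OBJECT_UNLOCK (basesrc);
```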
Johan Sternerup a3f8f036fe gstbasesrc: Do not hold LIVE_LOCK while sending events
An application that triggers a state transition from PLAYING to PAUSED
needs to acquire the LIVE_LOCK. Consequently the LIVE_LOCK must not be
taken while pushing anything on the pads because this operation might
get blocked by something that cannot be unblocked without the
application being able to proceed with the state transitions for other
elements in the pipeline. This commit extends the previous behaviour
where the live lock was released before pushing buffers (indirectly
through the unlock before subclass->create) to now also include
unlocking before pushing events.

The issue was discovered in a case for WebRTC where the application
tried to shut down a pipeline but an event originating from a video
source element (based on basesrc) was in the process of being pushed
down the pipeline when it got stuck on the STREAM_LOCK for the pad after
the rtpgccbwe element. This lock in turn was held by the rtpgccbwe
element as it was in the process of pushing data down the pipeline but
was stuck on the blocking probes installed on dtlssrtpenc to prevent
data from flowing before dtls keys had been negotiated. What should have
happened here is that the blocking probes should be removed, but that
can only happen if the application may continue driving the state
transitions.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6671>
2024-04-22 18:13:46 +00:00
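The shape of the change, as a sketch: release the LIVE_LOCK around any downstream push, events included (previously only buffer pushes were unlocked).

```c
GST_LIVE_UNLOCK (basesrc);
ret = gst_pad_push_event (basesrc->srcpad, event);
GST_LIVE_LOCK (basesrc);
```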
Seungha Yang 700c00eda3 d3d12decoder: Fix potential use after free
A DPB buffer held by the codec picture object may not be writable at that
moment, in which case gst_buffer_make_writable() will unref the passed buffer.

Specifically, the use after free or double free can happen if:
* crop meta of the buffer copy is required because of a non-zero
  top-left crop position
* zero-copy is possible with crop meta
* a picture was duplicated, e.g. for an interlaced h264 stream

An interlaced h264 stream with a non-zero top-left crop position
is not very common, but it's a possible configuration in theory.

Thus gst_buffer_make_writable() should be called with
GstVideoCodecFrame.output_buffer directly.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6706>
2024-04-22 13:28:06 +00:00
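The broken versus fixed call pattern, sketched; 'picture_buffer' is an illustrative name for the DPB buffer referenced by the codec picture.

```c
/* gst_buffer_make_writable() takes ownership: if the buffer is not writable
 * it unrefs it and returns a copy, while the picture still points at the
 * original, now possibly freed, buffer. */
frame->output_buffer = gst_buffer_make_writable (picture_buffer);   /* risky */

/* fixed: only ever operate on the frame's own reference */
frame->output_buffer = gst_buffer_make_writable (frame->output_buffer);
```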
Seungha Yang b9e51facdd d3d11decoder: Fix potential use after free
A DPB buffer held by the codec picture object may not be writable at that
moment, in which case gst_buffer_make_writable() will unref the passed buffer.

Specifically, the use after free or double free can happen if:
* crop meta of the buffer copy is required because of a non-zero
  top-left crop position
* zero-copy is possible with crop meta
* a picture was duplicated, e.g. for an interlaced h264 stream

An interlaced h264 stream with a non-zero top-left crop position
is not very common, but it's a possible configuration in theory.

Thus gst_buffer_make_writable() should be called with
GstVideoCodecFrame.output_buffer directly.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6706>
2024-04-22 13:28:06 +00:00
Edward Hervey 6d228c420c tsdemux: Disable smart program update
The goal of this code was, for programs which were updated (i.e. adding/removing
streams but not changing completely), to allow dynamic addition/removal of
streams without removing everything first.

But this wasn't 100% tested and there are a bunch of issues which make it fail
in plenty of ways.

For now, disable that feature and force the legacy "add all pads again and then
remove old ones" behaviour to make switching work.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6651>
2024-04-22 11:56:30 +00:00
Loïc Le Page 9fd0f44492 gst-editing-services: add input channels reorder
- whitelist corresponding properties from audioconvert
- add input channels reorder validation test in gst-integration-testsuites/ges/scenarios

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5923>
2024-04-22 12:06:11 +02:00
Loïc Le Page 8fb96253be audioconvert: add possibility to reorder input channels
When audioconvert has unpositioned audio channels as input,
it can now use reordering configurations to automatically
position those channels.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5923>
2024-04-22 12:06:11 +02:00
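A usage sketch from C; the property names match the GES whitelist below and the values mirror the validate scenario added alongside.

```c
GstElement *conv = gst_element_factory_make ("audioconvert", NULL);

/* reinterpret unpositioned input channels using the SMPTE ordering,
 * forcing the reorder as the scenario below does */
gst_util_set_object_arg (G_OBJECT (conv), "input-channels-reorder", "smpte");
gst_util_set_object_arg (G_OBJECT (conv), "input-channels-reorder-mode", "force");
```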
Seungha Yang 40f7d7f1f7 d3d11device: Add device-removed-reason property
In addition to the device-removed status monitoring in the gst_d3d11_result()
method, an event handle will be used for device-removed status updates if the
ID3D11Device4 interface is available.
The "device-removed" signal is removed, since applications can monitor
the device-removed status via GObject notify.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6699>
2024-04-21 12:06:03 +00:00
Seungha Yang b12b04eeef d3d12utils: Fix documentation
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6699>
2024-04-21 12:06:03 +00:00
Seungha Yang 6efeeb8300 d3d12device: Add device-removed-reason property
Add a new property in order to notify users of the device-removed status.
Once a device-removed status is detected, the application should release
all ID3D12Device objects corresponding to the adapter, including the
GstD3D12Device object. Otherwise the D3D12CreateDevice() call for the
adapter will fail.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6699>
2024-04-21 12:06:03 +00:00
Seungha Yang b9feb47de5 mediafoundation: Fix infinite loop in device provider
Initialize the source state with GST_MF_DEVICE_NOT_FOUND to terminate the
loop immediately if no capture device is available.

Fixes: https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/3492
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6704>
2024-04-21 19:21:54 +09:00
Jurijs Satcs 3b4e4cb9ba tsmux: do not write start_unit_indicator twice 2024-03-07 13:17:40 +01:00
80 changed files with 4000 additions and 1718 deletions


@ -33,6 +33,7 @@ variables:
value: ""
FEDORA_AMD64_SUFFIX: 'amd64/fedora'
DEBIAN_AMD64_SUFFIX: 'amd64/debian'
INDENT_AMD64_SUFFIX: 'amd64/gst-indent'
COMMITLINT_AMD64_SUFFIX: 'amd64/commitlint'
WINDOWS_AMD64_SUFFIX: 'amd64/windows'
@ -124,6 +125,7 @@ trigger:
.fedora image:
variables:
CCACHE_DIR: '/cache/gstreamer/gstreamer/ccache/'
FDO_BASE_IMAGE: 'registry.fedoraproject.org/fedora-toolbox:34'
FDO_DISTRIBUTION_VERSION: '34'
FDO_REPO_SUFFIX: "$FEDORA_AMD64_SUFFIX"
@ -140,6 +142,25 @@ fedora amd64 docker:
# (which has faster network connectivity to the registry).
tags: [ 'placeholder-job' ]
.debian image:
variables:
CCACHE_DIR: '/cache/gstreamer/gstreamer/ccache_debian/'
FDO_BASE_IMAGE: "quay.io/toolbx-images/debian-toolbox:12"
FDO_DISTRIBUTION_VERSION: '12'
FDO_REPO_SUFFIX: "$DEBIAN_AMD64_SUFFIX"
FDO_DISTRIBUTION_TAG: "$DEBIAN_TAG-$GST_UPSTREAM_BRANCH"
FDO_DISTRIBUTION_EXEC: 'GIT_BRANCH=$CI_COMMIT_REF_NAME GIT_URL=$CI_REPOSITORY_URL bash ci/docker/debian/prepare.sh'
debian amd64 docker:
extends:
- '.debian image'
- '.fdo.container-build@debian'
stage: 'preparation'
needs: []
# Note: assumption is that placeholder jobs run on a packet runner
# (which has faster network connectivity to the registry).
tags: [ 'placeholder-job' ]
.gst-indent image:
tags: [ 'placeholder-job' ]
variables:
@ -264,7 +285,6 @@ commitlint:
CCACHE_COMPILERCHECK: 'content'
CCACHE_COMPRESS: 'true'
CCACHE_BASEDIR: '/cache/gstreamer/gstreamer'
CCACHE_DIR: '/cache/gstreamer/gstreamer/ccache/'
# shared across everything really
CCACHE_MAXSIZE: '10G'
@ -329,15 +349,10 @@ commitlint:
- changes:
- subprojects/gstreamer-vaapi/**/*
.build fedora x86_64:
.build simple:
extends:
- '.fedora image'
- '.fdo.suffixed-image@fedora'
- '.build'
- '.build_ccache_vars'
needs:
- "trigger"
- "fedora amd64 docker"
variables:
GST_WERROR: "true"
MESON_ARGS: "${SIMPLE_BUILD}"
@ -350,6 +365,15 @@ commitlint:
- meson install --destdir $CI_PROJECT_DIR/destdir -C build
- rm -rf $CI_PROJECT_DIR/destdir
.build fedora x86_64:
extends:
- '.fedora image'
- '.fdo.suffixed-image@fedora'
- '.build simple'
needs:
- "trigger"
- "fedora amd64 docker"
build fedora gcc:
extends: '.build fedora x86_64'
variables:
@ -405,6 +429,18 @@ build fedora clang:
-Dgstreamer-sharp:tests=disabled
--force-fallback-for=glib
build debian x86_64:
extends:
- '.debian image'
- '.fdo.suffixed-image@debian'
- '.build simple'
needs:
- "trigger"
- "debian amd64 docker"
variables:
BUILD_TYPE: "--default-library=shared"
BUILD_GST_DEBUG: "-Dgstreamer:gst_debug=true"
.build windows:
image: $WINDOWS_IMAGE
stage: 'build'


@ -5,7 +5,9 @@ variables:
# If you are hacking on them or need a them to rebuild, its enough
# to change any part of the string of the image you want.
###
FEDORA_TAG: '2024-04-10.0'
FEDORA_TAG: '2024-04-22.4'
DEBIAN_TAG: '2024-04-22.4'
INDENT_TAG: '2023-08-24.3'

283
ci/docker/debian/deps.txt Normal file

@ -0,0 +1,283 @@
apertium-regtest
appstream-util
autopoint
bash-completion
bat
bison
bubblewrap
busybox
ccache
clang
clang-tools
cmake
coinor-libcgl-dev
curl
desktop-file-utils
docutils-common
doxygen
dwz
elfutils
emscripten
ffmpeg
flex
flite1-dev
g++
gcc
gdb
gettext
git
git-lfs
glslc
googletest
graphviz
gtk-doc-tools
guile-cairo-dev
iproute2
iso-codes
itstool
ladspa-sdk
liba52-0.7.4-dev
libaa1-dev
liballeggl4-dev
libaom-dev
libasound2-dev
libass-dev
libatk1.0-dev
libavahi-client-dev
libavahi-common-dev
libavc1394-dev
libavcodec-dev
libavdevice-dev
libavfilter-dev
libavformat-dev
libavif-dev
libavutil-dev
libbluetooth-dev
libboost-system-dev
libbs2b-dev
libcaca-dev
libcairo2-dev
libcamera-dev
libcanberra-dev
libcap-dev
libcdio-dev
libcdparanoia-dev
libchromaprint-dev
libclang-dev
libcoap3-dev
libcurl4-openssl-dev
libdbus-glib2.0-cil-dev
libdca-dev
libde265-dev
libdirectfb-dev
libdrm-dev
libdrumstick-dev
libdv4-dev
libdvdnav-dev
libdvdread-dev
libdw-dev
libebur128-dev
libegl-dev
libespeak-ng-dev
libespeak-ng-libespeak-dev
libevdev-dev
libevent-dev
libexempi-dev
libexif-dev
libfaad-dev
libfftw3-bin
libfftw3-dev
libflac-dev
libfluidsynth-dev
libframe-dev
libfreeaptx-dev
libftgl-dev
libgbm-dev
libgdk-pixbuf-2.0-dev
libgeocode-glib-dev
libgirepository1.0-dev
libgl-dev
libgl1-mesa-dev
libgles-dev
libglib2.0-dev
libglib2.0-doc
libglx-dev
libgme-dev
libgnutls28-dev
libgraphene-1.0-dev
libgridsite-dev
libgsl-dev
libgsm1-dev
libgssdp-1.6-dev
libgtest-dev
libgtk-3-dev
libgtk-4-dev
libgtkmm-3.0-dev
libgudev-1.0-dev
libgupnp-igd-1.0-dev
libiec61883-dev
libinput-dev
libiptcdata0-dev
libjack-jackd2-dev
libjpeg62-turbo-dev
libjson-glib-dev
libjwt-gnutls-dev
libkate-dev
liblc3-dev
liblcms2-dev
libldacbt-abr-dev
libldacbt-enc-dev
liblilv-dev
liblogg4-dev
libltc-dev
liblttng-ust-dev
liblxi-dev
libmfx-dev
libmjpegtools-dev
libmodplug-dev
libmono-cil-dev
libmp3lame-dev
libmpcdec-dev
libmpeg2-4-dev
libmpeg3-dev
libmpg123-dev
libneon27-dev
libngtcp2-crypto-gnutls-dev
libnice-dev
libnx-x11-dev
libogg-dev
libopenal-dev
libopencore-amrnb-dev
libopencore-amrwb-dev
libopencv-dev
libopenexr-dev
libopengl-dev
libopenh264-dev
libopenjp2-7-dev
libopenmpt-dev
libopenni2-dev
libopus-dev
libpango1.0-dev
libpng-dev
libpolkit-gobject-1-dev
libpulse-dev
libpython3-all-dev
libqrencode-dev
libqt5waylandclient5-dev
libqt5x11extras5-dev
libraw1394-dev
librsvg2-dev
librtmp-dev
librust-wayland-protocols-dev
libsbc-dev
libsdl2-dev
libshaderc1
libshout-dev
libsidplay1-dev
libsigc++-2.0-dev
libsndfile1-dev
libsndifsdl2-dev
libsoundtouch-dev
libsoup-3.0-dev
libsoup2.4-dev
libspandsp-dev
libspeex-dev
libsphinxbase-dev
libspice-client-glib-2.0-dev
libsrt-openssl-dev
libsrtp2-dev
libssh2-1-dev
libssl-dev
libsvtav1-dev
libsvtav1dec-dev
libsvtav1enc-dev
libsysprof-4-dev
libtag1-dev
libtaoframework-openal-cil-dev
libtaoframework-opengl-cil-dev
libtheora-dev
libtwolame-dev
libudev-dev
libunwind-dev
liburcu-dev
libusb-1.0-0-dev
libv4l-dev
libva-dev
libvisual-0.4-dev
libvo-aacenc-dev
libvo-amrwbenc-dev
libvorbis-dev
libvpx-dev
libvulkan-dev
libwacom-dev
libwavpack-dev
libwayland-dev
libwebp-dev
libwebrtc-audio-processing-dev
libwildmidi-dev
libwpe-1.0-dev
libwpebackend-fdo-1.0-dev
libwpewebkit-1.1-dev
libx11-dev
libx11-xcb-dev
libx264-dev
libx265-dev
libxcb-dri3-dev
libxcb-glx0-dev
libxcb-xfixes0-dev
libxcb-xv0-dev
libxcb1-dev
libxdamage-dev
libxext-dev
libxfixes-dev
libxi-dev
libxkbcommon-dev
libxkbcommon-x11-dev
libxml2-dev
libxmlsec1-dev
libxrandr-dev
libxslt1-dev
libxtst-dev
libxv-dev
libxvidcore-dev
libyajl-dev
libyaml-dev
libz-mingw-w64-dev
libzbar-dev
libzita-convolver-dev
libzvbi-dev
libzxing-dev
libzxingcore-dev
llvm-dev
lua-zlib-dev
make
modemmanager-dev
mono-complete
mono-devel
nasm
nettle-dev
ninja-build
patch
python-gi-dev
python3-all-dev
python3-cairo-dev
python3-dev
python3-pip
qconf
qt5-qmake
qtbase5-dev
qtbase5-private-dev
qtdeclarative5-dev
qtdeclarative5-dev-tools
qttools5-dev-tools
sudo
svt-av1
valgrind
wayland-protocols
x11-xserver-utils
xdg-utils
xfonts-jmk
xfonts-kaname
xvfb
yasm
zlib1g-dev


@ -0,0 +1,10 @@
#! /bin/bash
set -eux
apt update -y && apt full-upgrade -y
apt install -y $(<./ci/docker/debian/deps.txt)
pip3 install --break-system-packages meson hotdoc python-gitlab tomli junitparser
apt clean all


@ -0,0 +1,13 @@
#! /bin/bash
set -eux
bash ./ci/docker/debian/install-deps.sh
bash ./ci/scripts/install-rust.sh
# Configure git for various usage
git config --global user.email "gstreamer@gstreamer.net"
git config --global user.name "Gstbuild Runner"
bash ./ci/scripts/create-subprojects-cache.sh


@ -32,7 +32,7 @@ dnf builddep -y gstreamer1 \
python3-gstreamer1
dnf remove -y meson -x ninja-build
pip3 install meson==1.2.3 hotdoc==0.16 python-gitlab tomli junitparser
pip3 install meson hotdoc python-gitlab tomli junitparser
# Remove gst-devel packages installed by builddep above
dnf remove -y "gstreamer1*devel"


@ -12,8 +12,8 @@ bash ./ci/docker/fedora/install-gdk-pixbuf.sh
bash ./ci/docker/fedora/install-wayland-protocols.sh
bash ./ci/docker/fedora/install-rust.sh
bash ./ci/scripts/install-rust.sh
bash ./ci/docker/fedora/virtme-fluster-setup.sh
bash ./ci/docker/fedora/create-subprojects-cache.sh
bash ./ci/scripts/create-subprojects-cache.sh


@ -30985,7 +30985,10 @@ of the peer sink pad, if present.</doc>
</parameters>
</method>
<method name="set_offset" c:identifier="gst_pad_set_offset">
<doc xml:space="preserve" filename="../subprojects/gstreamer/gst/gstpad.c">Set the offset that will be applied to the running time of @pad.</doc>
<doc xml:space="preserve" filename="../subprojects/gstreamer/gst/gstpad.c">Set the offset that will be applied to the running time of @pad. Upon next
buffer, every sticky events (notably segment) will be pushed again with
their running time adjusted. For that reason this is only reliable on
source pads.</doc>
<source-position filename="../subprojects/gstreamer/gst/gstpad.h"/>
<return-value transfer-ownership="none">
<type name="none" c:type="void"/>


@ -1733,6 +1733,23 @@ subtitles), are currently ignored.</doc>
</parameter>
</parameters>
</function>
<function name="from_string" c:identifier="gst_encoding_profile_from_string" version="1.26">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.c">Converts a string in the "encoding profile serialization format" into a
GstEncodingProfile. Refer to the encoding-profile documentation for details
on the format.</doc>
<source-position filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.h"/>
<return-value transfer-ownership="full">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.c">A newly created GstEncodingProfile or NULL if the
input string is not a valid encoding profile serialization format.</doc>
<type name="EncodingProfile" c:type="GstEncodingProfile*"/>
</return-value>
<parameters>
<parameter name="string" transfer-ownership="none">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.c">The string to convert into a GstEncodingProfile.</doc>
<type name="utf8" c:type="const gchar*"/>
</parameter>
</parameters>
</function>
<method name="copy" c:identifier="gst_encoding_profile_copy" version="1.12">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.c">Makes a deep copy of @self</doc>
<source-position filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.h"/>
@ -2175,6 +2192,22 @@ single segment before the encoder, #FALSE otherwise.</doc>
</parameter>
</parameters>
</method>
<method name="to_string" c:identifier="gst_encoding_profile_to_string" version="1.26">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.c">Converts a GstEncodingProfile to a string in the "Encoding Profile
serialization format".</doc>
<source-position filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.h"/>
<return-value transfer-ownership="full">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.c">A string representation of the GstEncodingProfile,
or NULL if the input is invalid.</doc>
<type name="utf8" c:type="gchar*"/>
</return-value>
<parameters>
<instance-parameter name="profile" transfer-ownership="none">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.c">The GstEncodingProfile to convert.</doc>
<type name="EncodingProfile" c:type="GstEncodingProfile*"/>
</instance-parameter>
</parameters>
</method>
<property name="element-properties" version="1.20" writable="1" transfer-ownership="none">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-base/gst-libs/gst/pbutils/encoding-profile.c">A #GstStructure defining the properties to be set to the element
the profile represents.

View file

@ -1314,21 +1314,6 @@ need to use this function.</doc>
<record name="VulkanCommandPoolPrivate" c:type="GstVulkanCommandPoolPrivate" disguised="1">
<source-position filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/vulkan_fwd.h"/>
</record>
<record name="VulkanDecoder" c:type="GstVulkanDecoder" disguised="1">
<source-position filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/vulkan_fwd.h"/>
</record>
<record name="VulkanDecoderClass" c:type="GstVulkanDecoderClass" disguised="1">
<source-position filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/vulkan_fwd.h"/>
</record>
<union name="VulkanDecoderParameters" c:type="GstVulkanDecoderParameters">
<source-position filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/vulkan_fwd.h"/>
</union>
<record name="VulkanDecoderPicture" c:type="GstVulkanDecoderPicture" disguised="1">
<source-position filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/vulkan_fwd.h"/>
</record>
<record name="VulkanDecoderPrivate" c:type="GstVulkanDecoderPrivate" disguised="1">
<source-position filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/vulkan_fwd.h"/>
</record>
<class name="VulkanDescriptorCache" c:symbol-prefix="vulkan_descriptor_cache" c:type="GstVulkanDescriptorCache" version="1.18" parent="VulkanHandlePool" glib:type-name="GstVulkanDescriptorCache" glib:get-type="gst_vulkan_descriptor_cache_get_type" glib:type-struct="VulkanDescriptorCacheClass">
<source-position filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/gstvkdescriptorcache.h"/>
<constructor name="new" c:identifier="gst_vulkan_descriptor_cache_new" version="1.18">
@ -5178,24 +5163,6 @@ surrounding elements of @element.</doc>
</instance-parameter>
</parameters>
</method>
<method name="create_decoder" c:identifier="gst_vulkan_queue_create_decoder" version="1.24" introspectable="0">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/gstvkqueue.c">Creates a #GstVulkanDecoder object if @codec decoding is supported by @queue</doc>
<source-position filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/gstvkqueue.h"/>
<return-value transfer-ownership="full" nullable="1">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/gstvkqueue.c">the #GstVulkanDecoder object</doc>
<type name="VulkanDecoder" c:type="GstVulkanDecoder*"/>
</return-value>
<parameters>
<instance-parameter name="queue" transfer-ownership="none">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/gstvkqueue.c">a #GstVulkanQueue</doc>
<type name="VulkanQueue" c:type="GstVulkanQueue*"/>
</instance-parameter>
<parameter name="codec" transfer-ownership="none">
<doc xml:space="preserve" filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/gstvkqueue.c">the VkVideoCodecOperationFlagBitsKHR to decode</doc>
<type name="guint" c:type="guint"/>
</parameter>
</parameters>
</method>
<method name="get_device" c:identifier="gst_vulkan_queue_get_device" version="1.18">
<source-position filename="../subprojects/gst-plugins-bad/gst-libs/gst/vulkan/gstvkqueue.h"/>
<return-value transfer-ownership="full" nullable="1">


@ -11,12 +11,12 @@ NOT_PYCODESTYLE_COMPLIANT_MESSAGE_PRE = \
NOT_PYCODESTYLE_COMPLIANT_MESSAGE_POST = \
"Please fix these errors and commit again, you can do so "\
"from the root directory automatically like this, assuming the whole "\
"file is to be commited:"
"file is to be committed:"
NO_PYCODESTYLE_MESSAGE = \
"You should install the pycodestyle style checker to be able"\
" to commit in this repo.\nIt allows us to garantee that "\
"anything that is commited respects the pycodestyle coding style "\
" to commit in this repo.\nIt allows us to guarantee that "\
"anything that is committed respects the pycodestyle coding style "\
"standard.\nYou can install it:\n"\
" * on ubuntu, debian: $sudo apt-get install pycodestyle \n"\
" * on fedora: #yum install python3-pycodestyle \n"\
@ -40,7 +40,7 @@ def copy_files_to_tmp_dir(files):
filepath = os.path.dirname(filename)
if not os.path.exists(filepath):
os.makedirs(filepath)
with open(filename, 'w') as f:
with open(filename, 'w', encoding="utf-8") as f:
system('git', 'show', ':' + name, stdout=f)
return tempdir


@ -7,12 +7,12 @@
# On some *bsd systems the binary seems to be called gnunindent,
# so check for that first.
version=`gnuindent --version 2>/dev/null`
if test "x$version" = "x"; then
version=`gindent --version 2>/dev/null`
if test "x$version" = "x"; then
version=`indent --version 2>/dev/null`
if test "x$version" = "x"; then
version=$(gnuindent --version 2>/dev/null)
if test -z "$version"; then
version=$(gindent --version 2>/dev/null)
if test -z "$version"; then
version=$(indent --version 2>/dev/null)
if test -z "$version"; then
echo "GStreamer git pre-commit hook:"
echo "Did not find GNU indent, please install it before continuing."
exit 1
@ -26,7 +26,7 @@ else
INDENT=gnuindent
fi
case `$INDENT --version` in
case $($INDENT --version) in
GNU*)
;;
default)
@ -52,11 +52,11 @@ INDENT_PARAMETERS="--braces-on-if-line \
--leave-preprocessor-space"
echo "--Checking style--"
for file in `git diff-index --cached --name-only HEAD --diff-filter=ACMR| grep "\.c$"` ; do
for file in $(git diff-index --cached --name-only HEAD --diff-filter=ACMR| grep "\.c$") ; do
# nf is the temporary checkout. This makes sure we check against the
# revision in the index (and not the checked out version).
nf=`git checkout-index --temp ${file} | cut -f 1`
newfile=`mktemp /tmp/${nf}.XXXXXX` || exit 1
nf=$(git checkout-index --temp ${file} | cut -f 1)
newfile=$(mktemp /tmp/${nf}.XXXXXX) || exit 1
$INDENT ${INDENT_PARAMETERS} \
$nf -o $newfile 2>> /dev/null
# FIXME: Call indent twice as it tends to do line-breaks
@ -91,10 +91,10 @@ echo "==========================================================================
exit 1
fi
csharp_files=` git diff-index --cached --name-only HEAD --diff-filter=ACMR| grep "^subprojects/gstreamer-sharp/.*cs$" `
if test "x$csharp_files" != "x"; then
version=`dotnet-format --version 2>/dev/null`
if test "x$version" = "x"; then
csharp_files=$( git diff-index --cached --name-only HEAD --diff-filter=ACMR| grep "^subprojects/gstreamer-sharp/.*cs$" )
if test -n "$csharp_files"; then
version=$(dotnet-format --version 2>/dev/null)
if test -z "$version"; then
echo "GStreamer git pre-commit hook:"
echo "Did not find dotnet-format required to format C# files, please install it before continuing."
exit 1


@ -19449,23 +19449,6 @@
"GstVulkanCommandPool.pool",
"GstVulkanCommandPool.queue",
"GstVulkanCommandPoolClass.parent_class",
"GstVulkanDecoder",
"GstVulkanDecoder.codec",
"GstVulkanDecoder.dedicated_dpb",
"GstVulkanDecoder.input_buffer",
"GstVulkanDecoder.layered_buffer",
"GstVulkanDecoder.layered_dpb",
"GstVulkanDecoder.parent",
"GstVulkanDecoder.profile",
"GstVulkanDecoder.queue",
"GstVulkanDecoderClass.parent",
"GstVulkanDecoderPicture",
"GstVulkanDecoderPicture.dpb",
"GstVulkanDecoderPicture.img_view_out",
"GstVulkanDecoderPicture.img_view_ref",
"GstVulkanDecoderPicture.out",
"GstVulkanDecoderPicture.refs",
"GstVulkanDecoderPicture.slice_offs",
"GstVulkanDescriptorCache",
"GstVulkanDescriptorCache.parent",
"GstVulkanDescriptorCache.pool",
@ -44795,22 +44778,6 @@
"gst_vulkan_command_pool_lock",
"gst_vulkan_command_pool_unlock",
"gst_vulkan_create_shader",
"gst_vulkan_decoder_append_slice",
"gst_vulkan_decoder_caps",
"gst_vulkan_decoder_create_dpb_pool",
"gst_vulkan_decoder_decode",
"gst_vulkan_decoder_flush",
"gst_vulkan_decoder_is_started",
"gst_vulkan_decoder_out_format",
"gst_vulkan_decoder_picture_create_view",
"gst_vulkan_decoder_picture_init",
"gst_vulkan_decoder_picture_release",
"gst_vulkan_decoder_profile_caps",
"gst_vulkan_decoder_start",
"gst_vulkan_decoder_stop",
"gst_vulkan_decoder_update_video_session_parameters",
"gst_vulkan_decoder_update_ycbcr_sampler",
"gst_vulkan_decoder_wait",
"gst_vulkan_descriptor_cache_acquire",
"gst_vulkan_descriptor_cache_new",
"gst_vulkan_descriptor_pool_create",
@ -44965,7 +44932,6 @@
"gst_vulkan_physical_device_type_to_string",
"gst_vulkan_present_mode_to_string",
"gst_vulkan_queue_create_command_pool",
"gst_vulkan_queue_create_decoder",
"gst_vulkan_queue_flags_to_string",
"gst_vulkan_queue_get_device",
"gst_vulkan_queue_handle_context_query",


@ -128,7 +128,10 @@ ges_audio_source_create_element (GESTrackElement * trksrc)
GPtrArray *elements;
GESSourceClass *source_class = GES_SOURCE_GET_CLASS (trksrc);
const gchar *volume_props[] = { "volume", "mute", NULL };
const gchar *audioconvert_props[] = { "mix-matrix", NULL };
const gchar *audioconvert_props[] = {
"mix-matrix", "input-channels-reorder",
"input-channels-reorder-mode", NULL
};
GESAudioSource *self = GES_AUDIO_SOURCE (trksrc);
g_assert (source_class->create_source);


@ -618,10 +618,10 @@ _set_rendering_details (GESLauncher * self)
smart_profile = TRUE;
else {
opts->format = get_file_extension (opts->outputuri);
prof = parse_encoding_profile (opts->format);
prof = gst_encoding_profile_from_string (opts->format);
}
} else {
prof = parse_encoding_profile (opts->format);
prof = gst_encoding_profile_from_string (opts->format);
if (!prof) {
ges_printerr ("Invalid format specified: %s", opts->format);
goto done;
@ -636,7 +636,7 @@ _set_rendering_details (GESLauncher * self)
opts->format =
g_strdup ("application/ogg:video/x-theora:audio/x-vorbis");
prof = parse_encoding_profile (opts->format);
prof = gst_encoding_profile_from_string (opts->format);
}
if (!prof) {
@ -649,7 +649,8 @@ _set_rendering_details (GESLauncher * self)
GstEncodingProfile *new_prof;
GList *tmp;
if (!(new_prof = parse_encoding_profile (opts->container_profile))) {
if (!(new_prof =
gst_encoding_profile_from_string (opts->container_profile))) {
ges_printerr ("Failed to parse container profile %s",
opts->container_profile);
gst_object_unref (prof);


@ -183,26 +183,6 @@ ensure_uri (const gchar * location)
return gst_filename_to_uri (location, NULL);
}
GstEncodingProfile *
parse_encoding_profile (const gchar * format)
{
GstEncodingProfile *profile;
GValue value = G_VALUE_INIT;
g_value_init (&value, GST_TYPE_ENCODING_PROFILE);
if (!gst_value_deserialize (&value, format)) {
g_value_reset (&value);
return NULL;
}
profile = g_value_dup_object (&value);
g_value_reset (&value);
return profile;
}
void
print_enum (GType enum_type)
{


@ -58,7 +58,6 @@ typedef struct
gchar * sanitize_timeline_description (gchar **args, GESLauncherParsedOptions *opts);
gboolean get_flags_from_string (GType type, const gchar * str_flags, guint *val);
gchar * ensure_uri (const gchar * location);
GstEncodingProfile * parse_encoding_profile (const gchar * format);
void print_enum (GType enum_type);
void ges_print (GstDebugColorFlags c, gboolean err, gboolean nline, const gchar * format, va_list var_args);


@ -0,0 +1,58 @@
set-globals, media_file="$(test_dir)/../../medias/defaults/ogg/audio_5.1_separated_frequencies.ogg"
meta,
tool = "ges-launch-$(gst_api_version)",
handles-states=true,
seek=true,
needs_preroll=true,
args = {
--track-types, audio,
--audio-caps, "audio/x-raw, channels=2",
--audiosink, "$(audiosink) name=audiosink",
},
configs = {
"$(validateflow), pad=audiosink:sink, buffers-checksum=true, ignored-fields=\"stream-start={stream-id,group-id,stream}\"",
}
# Add a 5.1 audio clip with forced default order (gst)
add-clip, name=clip, asset-id="file://$(media_file)", layer-priority=0, type=GESUriClip
set-child-properties, element-name=clip, input-channels-reorder-mode=force
checkpoint, text="Checking default GST order."
pause
# Change order to smpte
checkpoint, text="Checking SMPTE order."
set-state, state=null
set-child-properties, element-name=clip, input-channels-reorder=smpte
pause
# Change order to cine
checkpoint, text="Checking CINE order."
set-state, state=null
set-child-properties, element-name=clip, input-channels-reorder=cine
pause
# Change order to ac3
checkpoint, text="Checking AC3 order."
set-state, state=null
set-child-properties, element-name=clip, input-channels-reorder=ac3
pause
# Change order to aac
checkpoint, text="Checking AAC order."
set-state, state=null
set-child-properties, element-name=clip, input-channels-reorder=aac
pause
# Change order to mono
checkpoint, text="Checking MONO order."
set-state, state=null
set-child-properties, element-name=clip, input-channels-reorder=mono
pause
# Change order to alternate
checkpoint, text="Checking ALTERNATE order."
set-state, state=null
set-child-properties, element-name=clip, input-channels-reorder=alternate
pause
stop


@ -0,0 +1,49 @@
CHECKPOINT: Checking default GST order.
event stream-start: GstEventStreamStart, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE;
event caps: audio/x-raw, channel-mask=(bitmask)0x0000000000000003, channels=(int)2, format=(string)F32LE, layout=(string)interleaved, rate=(int)48000;
event segment: format=TIME, start=0:00:00.000000000, offset=0:00:00.000000000, stop=0:00:30.000000000, flags=0x01, time=0:00:00.000000000, base=0:00:00.000000000, position=0:00:00.000000000
buffer: checksum=a8466458c3dc435159d9df2f7ca37345017013fb, pts=0:00:00.000000000, dur=0:00:00.010000000
CHECKPOINT: Checking SMPTE order.
event stream-start: GstEventStreamStart, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE;
event caps: audio/x-raw, channel-mask=(bitmask)0x0000000000000003, channels=(int)2, format=(string)F32LE, layout=(string)interleaved, rate=(int)48000;
event segment: format=TIME, start=0:00:00.000000000, offset=0:00:00.000000000, stop=0:00:30.000000000, flags=0x01, time=0:00:00.000000000, base=0:00:00.000000000, position=0:00:00.000000000
buffer: checksum=a8466458c3dc435159d9df2f7ca37345017013fb, pts=0:00:00.000000000, dur=0:00:00.010000000
CHECKPOINT: Checking CINE order.
event stream-start: GstEventStreamStart, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE;
event caps: audio/x-raw, channel-mask=(bitmask)0x0000000000000003, channels=(int)2, format=(string)F32LE, layout=(string)interleaved, rate=(int)48000;
event segment: format=TIME, start=0:00:00.000000000, offset=0:00:00.000000000, stop=0:00:30.000000000, flags=0x01, time=0:00:00.000000000, base=0:00:00.000000000, position=0:00:00.000000000
buffer: checksum=203fce4efef89b9526b608b0d4f5f5c0097be3a3, pts=0:00:00.000000000, dur=0:00:00.010000000
CHECKPOINT: Checking AC3 order.
event stream-start: GstEventStreamStart, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE;
event caps: audio/x-raw, channel-mask=(bitmask)0x0000000000000003, channels=(int)2, format=(string)F32LE, layout=(string)interleaved, rate=(int)48000;
event segment: format=TIME, start=0:00:00.000000000, offset=0:00:00.000000000, stop=0:00:30.000000000, flags=0x01, time=0:00:00.000000000, base=0:00:00.000000000, position=0:00:00.000000000
buffer: checksum=18ea51cc5086625305ea18e4d0abbabc66616136, pts=0:00:00.000000000, dur=0:00:00.010000000
CHECKPOINT: Checking AAC order.
event stream-start: GstEventStreamStart, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE;
event caps: audio/x-raw, channel-mask=(bitmask)0x0000000000000003, channels=(int)2, format=(string)F32LE, layout=(string)interleaved, rate=(int)48000;
event segment: format=TIME, start=0:00:00.000000000, offset=0:00:00.000000000, stop=0:00:30.000000000, flags=0x01, time=0:00:00.000000000, base=0:00:00.000000000, position=0:00:00.000000000
buffer: checksum=70620194e45b76eb1469b7221f8dec05a5505c93, pts=0:00:00.000000000, dur=0:00:00.010000000
CHECKPOINT: Checking MONO order.
event stream-start: GstEventStreamStart, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE;
event caps: audio/x-raw, channel-mask=(bitmask)0x0000000000000003, channels=(int)2, format=(string)F32LE, layout=(string)interleaved, rate=(int)48000;
event segment: format=TIME, start=0:00:00.000000000, offset=0:00:00.000000000, stop=0:00:30.000000000, flags=0x01, time=0:00:00.000000000, base=0:00:00.000000000, position=0:00:00.000000000
buffer: checksum=3cffd73a8ddcafa0d0cd17151af4d2bd6c26f620, pts=0:00:00.000000000, dur=0:00:00.010000000
CHECKPOINT: Checking ALTERNATE order.
event stream-start: GstEventStreamStart, flags=(GstStreamFlags)GST_STREAM_FLAG_NONE;
event caps: audio/x-raw, channel-mask=(bitmask)0x0000000000000003, channels=(int)2, format=(string)F32LE, layout=(string)interleaved, rate=(int)48000;
event segment: format=TIME, start=0:00:00.000000000, offset=0:00:00.000000000, stop=0:00:30.000000000, flags=0x01, time=0:00:00.000000000, base=0:00:00.000000000, position=0:00:00.000000000
buffer: checksum=e1472a1300859dea1e3ecf4ac6a16298d6fb4e63, pts=0:00:00.000000000, dur=0:00:00.010000000

@ -1 +1 @@
Subproject commit cba2a17a35049f76b5a1501da877b550dd795bef
Subproject commit fa0b7ec4a3e48371ff25d608ff5278e6fac1060c


@ -764,3 +764,4 @@ ges.test.videoscale_effect
ges.test.backward_playback_with_start
ges.test.check_reverse_source
ges.test.timelineelement.set_child_prop_on_all_instances
ges.test.input_audio_channels_reorder


@ -4,6 +4,6 @@ meta,
"change-issue-severity, issue-id=event::eos-has-wrong-seqnum, new-severity=critical",
},
args = {
"playbin uri=file://$(media_dir)/defaults/flv/819290236.flv",
"playbin uri=file://$(media_dir)/defaults/flv/819290236.flv audio-sink=\"$(audiosink)\" video-sink=\"$(videosink)\"",
},
scenario=seek_with_stop


@ -145,12 +145,8 @@ gst_rsvg_decode_image (GstRsvgDec * rsvg, GstBuffer * buffer,
cairo_surface_t *surface;
RsvgHandle *handle;
GError *error = NULL;
#if LIBRSVG_MAJOR_VERSION > (2) || (LIBRSVG_MAJOR_VERSION == (2) && LIBRSVG_MINOR_VERSION > (52))
RsvgRectangle viewport;
#else
RsvgDimensionData dimension;
gdouble scalex, scaley;
#endif
GstRsvgDimension dimension;
GstMapInfo minfo;
GstVideoFrame vframe;
GstVideoCodecState *output_state;
@ -167,12 +163,10 @@ gst_rsvg_decode_image (GstRsvgDec * rsvg, GstBuffer * buffer,
g_error_free (error);
return GST_FLOW_ERROR;
}
#if LIBRSVG_MAJOR_VERSION > (2) || (LIBRSVG_MAJOR_VERSION == (2) && LIBRSVG_MINOR_VERSION > (52))
rsvg_handle_get_intrinsic_size_in_pixels (handle, &dimension.width,
&dimension.height);
#else
G_GNUC_BEGIN_IGNORE_DEPRECATIONS;
rsvg_handle_get_dimensions (handle, &dimension);
#endif
G_GNUC_END_IGNORE_DEPRECATIONS;
output_state = gst_video_decoder_get_output_state (decoder);
if ((output_state == NULL)
@ -188,9 +182,8 @@ gst_rsvg_decode_image (GstRsvgDec * rsvg, GstBuffer * buffer,
gst_pad_peer_query_caps (GST_VIDEO_DECODER_SRC_PAD (rsvg), templ_caps);
GST_DEBUG_OBJECT (rsvg,
"Trying to negotiate for SVG resolution %" G_GUINT64_FORMAT "x %"
G_GUINT64_FORMAT " with downstream caps %" GST_PTR_FORMAT,
(guint64) dimension.width, (guint64) dimension.height, peer_caps);
"Trying to negotiate for SVG resolution %ux%u with downstream caps %"
GST_PTR_FORMAT, dimension.width, dimension.height, peer_caps);
source_caps = gst_caps_make_writable (g_steal_pointer (&templ_caps));
gst_caps_set_simple (source_caps, "width", G_TYPE_INT, dimension.width,
@ -272,21 +265,6 @@ gst_rsvg_decode_image (GstRsvgDec * rsvg, GstBuffer * buffer,
cairo_set_operator (cr, CAIRO_OPERATOR_OVER);
cairo_set_source_rgba (cr, 0.0, 0.0, 0.0, 1.0);
#if LIBRSVG_MAJOR_VERSION > (2) || (LIBRSVG_MAJOR_VERSION == (2) && LIBRSVG_MINOR_VERSION > (52))
viewport.x = 0;
viewport.y = 0;
viewport.width = GST_VIDEO_INFO_WIDTH (&output_state->info);
viewport.height = GST_VIDEO_INFO_HEIGHT (&output_state->info);
if (!rsvg_handle_render_document (handle, cr, &viewport, &error)) {
GST_ERROR_OBJECT (rsvg, "Failed to render SVG image: %s", error->message);
g_error_free (error);
g_object_unref (handle);
cairo_destroy (cr);
cairo_surface_destroy (surface);
gst_video_codec_state_unref (output_state);
return GST_FLOW_ERROR;
}
#else
scalex = scaley = 1.0;
if (GST_VIDEO_INFO_WIDTH (&output_state->info) != dimension.width) {
scalex =
@ -300,8 +278,9 @@ gst_rsvg_decode_image (GstRsvgDec * rsvg, GstBuffer * buffer,
}
cairo_scale (cr, scalex, scaley);
G_GNUC_BEGIN_IGNORE_DEPRECATIONS;
rsvg_handle_render_cairo (handle, cr);
#endif
G_GNUC_END_IGNORE_DEPRECATIONS;
g_object_unref (handle);
cairo_destroy (cr);


@ -44,15 +44,6 @@ G_BEGIN_DECLS
typedef struct _GstRsvgDec GstRsvgDec;
typedef struct _GstRsvgDecClass GstRsvgDecClass;
#if LIBRSVG_MAJOR_VERSION > (2) || (LIBRSVG_MAJOR_VERSION == (2) && LIBRSVG_MINOR_VERSION > (52))
typedef struct
{
gdouble width, height;
} GstRsvgDimension;
#else
typedef RsvgDimensionData GstRsvgDimension;
#endif
struct _GstRsvgDec
{
GstVideoDecoder decoder;
@ -66,7 +57,7 @@ struct _GstRsvgDec
guint64 frame_count;
GstVideoCodecState *input_state;
GstRsvgDimension dimension;
RsvgDimensionData dimension;
GstSegment segment;
gboolean need_newsegment;


@ -163,7 +163,9 @@ gst_rsvg_overlay_set_svg_data (GstRsvgOverlay * overlay, const gchar * data,
} else {
/* Get SVG dimension. */
RsvgDimensionData svg_dimension;
G_GNUC_BEGIN_IGNORE_DEPRECATIONS;
rsvg_handle_get_dimensions (overlay->handle, &svg_dimension);
G_GNUC_END_IGNORE_DEPRECATIONS;
overlay->svg_width = svg_dimension.width;
overlay->svg_height = svg_dimension.height;
gst_base_transform_set_passthrough (btrans, FALSE);
@ -421,7 +423,9 @@ gst_rsvg_overlay_transform_frame_ip (GstVideoFilter * vfilter,
cairo_scale (cr, (double) applied_width / overlay->svg_width,
(double) applied_height / overlay->svg_height);
}
G_GNUC_BEGIN_IGNORE_DEPRECATIONS;
rsvg_handle_render_cairo (overlay->handle, cr);
G_GNUC_END_IGNORE_DEPRECATIONS;
GST_RSVG_UNLOCK (overlay);
cairo_destroy (cr);


@ -26,6 +26,7 @@
#include <gst/video/video.h>
#include <gst/vulkan/vulkan.h>
#include "gst/vulkan/gstvkdecoder-private.h"
#include "gstvulkanelements.h"
typedef struct _GstVulkanH264Decoder GstVulkanH264Decoder;
@ -161,7 +162,7 @@ gst_vulkan_h264_decoder_open (GstVideoDecoder * decoder)
return FALSE;
}
self->decoder = gst_vulkan_queue_create_decoder (self->decode_queue,
self->decoder = gst_vulkan_decoder_new_from_queue (self->decode_queue,
VK_VIDEO_CODEC_OPERATION_DECODE_H264_BIT_KHR);
if (!self->decoder) {
GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND,


@ -25,6 +25,7 @@
#include <gst/video/video.h>
#include <gst/vulkan/vulkan.h>
#include "gst/vulkan/gstvkdecoder-private.h"
#include "gstvulkanelements.h"
@ -219,7 +220,7 @@ gst_vulkan_h265_decoder_open (GstVideoDecoder * decoder)
return FALSE;
}
self->decoder = gst_vulkan_queue_create_decoder (self->decode_queue,
self->decoder = gst_vulkan_decoder_new_from_queue (self->decode_queue,
VK_VIDEO_CODEC_OPERATION_DECODE_H265_BIT_KHR);
if (!self->decoder) {
GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND,


@ -64,8 +64,8 @@ GST_D3D11_API
HRESULT gst_d3d11_device_get_rasterizer_msaa (GstD3D11Device * device,
ID3D11RasterizerState ** rasterizer);
/* Used internally by gstd3d11utils.cpp */
void gst_d3d11_device_mark_removed (GstD3D11Device * device, HRESULT reason);
GST_D3D11_API
void gst_d3d11_device_check_device_removed (GstD3D11Device * device);
G_END_DECLS


@ -96,24 +96,31 @@ enum
PROP_DESCRIPTION,
PROP_CREATE_FLAGS,
PROP_ADAPTER_LUID,
PROP_DEVICE_REMOVED_REASON,
};
static GParamSpec *pspec_removed_reason = nullptr;
#define DEFAULT_ADAPTER 0
#define DEFAULT_CREATE_FLAGS 0
enum
{
/* signals */
SIGNAL_DEVICE_REMOVED,
LAST_SIGNAL
};
static guint gst_d3d11_device_signals[LAST_SIGNAL] = { 0, };
/* *INDENT-OFF* */
struct _GstD3D11DevicePrivate
{
_GstD3D11DevicePrivate ()
{
device_removed_event =
CreateEventEx (nullptr, nullptr, 0, EVENT_ALL_ACCESS);
cancallable =
CreateEventEx (nullptr, nullptr, 0, EVENT_ALL_ACCESS);
}
~_GstD3D11DevicePrivate ()
{
CloseHandle (device_removed_event);
CloseHandle (cancallable);
}
guint adapter = 0;
guint device_id = 0;
guint vendor_id = 0;
@ -123,6 +130,7 @@ struct _GstD3D11DevicePrivate
gint64 adapter_luid = 0;
ID3D11Device *device = nullptr;
ID3D11Device4 *device4 = nullptr;
ID3D11Device5 *device5 = nullptr;
ID3D11DeviceContext *device_context = nullptr;
ID3D11DeviceContext4 *device_context4 = nullptr;
@ -157,7 +165,11 @@ struct _GstD3D11DevicePrivate
IDXGIInfoQueue *dxgi_info_queue = nullptr;
#endif
gboolean device_removed = FALSE;
DWORD device_removed_cookie = 0;
GThread *device_removed_monitor_thread = nullptr;
HANDLE device_removed_event;
HANDLE cancallable;
std::atomic<HRESULT> removed_reason = { S_OK };
};
/* *INDENT-ON* */
@ -430,18 +442,18 @@ gst_d3d11_device_class_init (GstD3D11DeviceClass * klass)
G_MININT64, G_MAXINT64, 0, readable_flags));
/**
* GstD3D11Device::device-removed:
* @device: the #d3d11device
* GstD3D11Device:device-removed-reason:
*
* Emitted when the D3D11Device gets suspended by the DirectX (error
* DXGI_ERROR_DEVICE_REMOVED or DXGI_ERROR_DEVICE_RESET have been returned
* after one of the DirectX operations).
* Device removed reason HRESULT code
*
* Since: 1.26
*/
gst_d3d11_device_signals[SIGNAL_DEVICE_REMOVED] =
g_signal_new ("device-removed", G_TYPE_FROM_CLASS (klass),
G_SIGNAL_RUN_LAST, 0, NULL, NULL, NULL, G_TYPE_NONE, 1, G_TYPE_INT);
pspec_removed_reason =
g_param_spec_int ("device-removed-reason", "Device Removed Reason",
"HRESULT code returned from ID3D11Device::GetDeviceRemovedReason",
G_MININT32, G_MAXINT32, 0, readable_flags);
g_object_class_install_property (gobject_class, PROP_DEVICE_REMOVED_REASON,
pspec_removed_reason);
gst_d3d11_memory_init_once ();
}
@ -700,6 +712,9 @@ gst_d3d11_device_get_property (GObject * object, guint prop_id,
case PROP_ADAPTER_LUID:
g_value_set_int64 (value, priv->adapter_luid);
break;
case PROP_DEVICE_REMOVED_REASON:
g_value_set_int (value, priv->removed_reason);
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
break;
@ -741,7 +756,14 @@ gst_d3d11_device_dispose (GObject * object)
GST_LOG_OBJECT (self, "dispose");
if (priv->device4 && priv->device_removed_monitor_thread) {
priv->device4->UnregisterDeviceRemoved (priv->device_removed_cookie);
SetEvent (priv->cancallable);
g_clear_pointer (&priv->device_removed_monitor_thread, g_thread_join);
}
AcquireSRWLockExclusive (&_device_creation_rwlock);
priv->ps_cache.clear ();
priv->vs_cache.clear ();
priv->sampler_cache.clear ();
@ -749,6 +771,7 @@ gst_d3d11_device_dispose (GObject * object)
GST_D3D11_CLEAR_COM (priv->rs);
GST_D3D11_CLEAR_COM (priv->rs_msaa);
GST_D3D11_CLEAR_COM (priv->device5);
GST_D3D11_CLEAR_COM (priv->device4);
GST_D3D11_CLEAR_COM (priv->device_context4);
GST_D3D11_CLEAR_COM (priv->video_device);
GST_D3D11_CLEAR_COM (priv->video_context);
@ -977,12 +1000,47 @@ gst_d3d11_device_setup_debug_layer (GstD3D11Device * self)
#endif
}
void
gst_d3d11_device_check_device_removed (GstD3D11Device * self)
{
auto priv = self->priv;
auto removed_reason = priv->device->GetDeviceRemovedReason ();
if (removed_reason == S_OK)
return;
HRESULT expected = S_OK;
if (std::atomic_compare_exchange_strong (&priv->removed_reason,
&expected, removed_reason)) {
auto error_text = g_win32_error_message ((guint) priv->removed_reason);
GST_ERROR_OBJECT (self, "DeviceRemovedReason: 0x%x, %s",
(guint) priv->removed_reason, GST_STR_NULL (error_text));
g_free (error_text);
g_object_notify_by_pspec (G_OBJECT (self), pspec_removed_reason);
}
}
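Since the device-removed signal is replaced by a read-only property, client code can watch it through the regular GObject notify machinery instead. A rough sketch, assuming a GstD3D11Device reference obtained elsewhere (the callback name is made up):

static void
on_device_removed_reason (GObject * obj, GParamSpec * pspec, gpointer user_data)
{
  gint hr = 0;

  /* S_OK (0) means the device is still alive; anything else is the HRESULT
   * from ID3D11Device::GetDeviceRemovedReason() */
  g_object_get (obj, "device-removed-reason", &hr, NULL);
  GST_ERROR ("D3D11 device removed, reason 0x%x", (guint) hr);
}

  /* ... after obtaining the device ... */
  g_signal_connect (device, "notify::device-removed-reason",
      G_CALLBACK (on_device_removed_reason), NULL);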
static gpointer
gst_d3d11_device_removed_monitor_thread (GstD3D11Device * self)
{
auto priv = self->priv;
HANDLE waitables[] = { priv->device_removed_event, priv->cancallable };
auto ret = WaitForMultipleObjects (2, waitables, FALSE, INFINITE);
if (ret == WAIT_OBJECT_0)
gst_d3d11_device_check_device_removed (self);
return nullptr;
}
static GstD3D11Device *
gst_d3d11_device_new_internal (const GstD3D11DeviceConstructData * data)
{
ComPtr < IDXGIAdapter1 > adapter;
ComPtr < IDXGIFactory1 > factory;
ComPtr < ID3D11Device > device;
ComPtr < ID3D11Device4 > device4;
ComPtr < ID3D11Device5 > device5;
ComPtr < ID3D11DeviceContext > device_context;
ComPtr < ID3D11DeviceContext4 > device_context4;
@ -1115,6 +1173,18 @@ gst_d3d11_device_new_internal (const GstD3D11DeviceConstructData * data)
priv = self->priv;
hr = device.As (&device4);
if (SUCCEEDED (hr)) {
hr = device4->RegisterDeviceRemovedEvent (priv->device_removed_event,
&priv->device_removed_cookie);
if (SUCCEEDED (hr)) {
priv->device4 = device4.Detach ();
priv->device_removed_monitor_thread =
g_thread_new ("d3d11-removed-monitor",
(GThreadFunc) gst_d3d11_device_removed_monitor_thread, self);
}
}
hr = device.As (&device5);
if (SUCCEEDED (hr))
hr = device_context.As (&device_context4);
@ -1430,18 +1500,6 @@ gst_d3d11_device_get_format (GstD3D11Device * device, GstVideoFormat format,
return TRUE;
}
void
gst_d3d11_device_mark_removed (GstD3D11Device * device, HRESULT reason)
{
g_return_if_fail (GST_IS_D3D11_DEVICE (device));
if (!device->priv->device_removed) {
g_signal_emit (device, gst_d3d11_device_signals[SIGNAL_DEVICE_REMOVED], 0,
reason);
device->priv->device_removed = TRUE;
}
}
GST_DEFINE_MINI_OBJECT_TYPE (GstD3D11Fence, gst_d3d11_fence);
struct _GstD3D11FencePrivate

View file

@ -551,26 +551,6 @@ gst_d3d11_luid_to_int64 (const LUID * luid)
return val.QuadPart;
}
#ifndef GST_DISABLE_GST_DEBUG
static void
gst_d3d11_log_gpu_remove_reason (HRESULT hr, GstD3D11Device * device,
GstDebugCategory * cat, const gchar * file, const gchar * function,
gint line)
{
gchar *error_text = g_win32_error_message ((guint) hr);
gst_debug_log (cat, GST_LEVEL_ERROR, file, function, line,
NULL, "DeviceRemovedReason: 0x%x, %s", (guint) hr,
GST_STR_NULL (error_text));
if (hr != DXGI_ERROR_DEVICE_REMOVED)
g_critical ("D3D11Device have been removed. Reason (0x%x): %s",
(guint) hr, GST_STR_NULL (error_text));
g_free (error_text);
gst_d3d11_device_log_live_objects (device, file, function, line);
}
#endif
/**
* _gst_d3d11_result:
* @result: HRESULT D3D11 API return code
@ -591,11 +571,14 @@ _gst_d3d11_result (HRESULT hr, GstD3D11Device * device, GstDebugCategory * cat,
const gchar * file, const gchar * function, gint line)
{
#ifndef GST_DISABLE_GST_DEBUG
gboolean ret = TRUE;
#if (HAVE_D3D11SDKLAYERS_H || HAVE_DXGIDEBUG_H)
if (device) {
gst_d3d11_device_d3d11_debug (device, file, function, line);
gst_d3d11_device_dxgi_debug (device, file, function, line);
}
#endif
if (FAILED (hr)) {
gchar *error_text = NULL;
error_text = g_win32_error_message ((guint) hr);
/* g_win32_error_message() doesn't cover all HERESULT return code,
* so it could be empty string, or null if there was an error
@ -604,29 +587,16 @@ _gst_d3d11_result (HRESULT hr, GstD3D11Device * device, GstDebugCategory * cat,
NULL, "D3D11 call failed: 0x%x, %s", (guint) hr,
GST_STR_NULL (error_text));
g_free (error_text);
if (device) {
ID3D11Device *device_handle = gst_d3d11_device_get_device_handle (device);
hr = device_handle->GetDeviceRemovedReason ();
if (hr != S_OK) {
gst_d3d11_log_gpu_remove_reason (hr, device, cat, file, function, line);
gst_d3d11_device_mark_removed (device, hr);
}
}
ret = FALSE;
}
#if (HAVE_D3D11SDKLAYERS_H || HAVE_DXGIDEBUG_H)
if (device) {
gst_d3d11_device_d3d11_debug (device, file, function, line);
gst_d3d11_device_dxgi_debug (device, file, function, line);
}
#endif
return ret;
#else
return SUCCEEDED (hr);
#endif
if (SUCCEEDED (hr))
return TRUE;
if (device)
gst_d3d11_device_check_device_removed (device);
return FALSE;
}
/**

View file

@ -61,5 +61,8 @@ void gst_d3d12_device_11on12_lock (GstD3D12Device * device);
GST_D3D12_API
void gst_d3d12_device_11on12_unlock (GstD3D12Device * device);
GST_D3D12_API
void gst_d3d12_device_check_device_removed (GstD3D12Device * device);
G_END_DECLS

View file

@ -41,6 +41,7 @@
#include <unordered_map>
#include <thread>
#include <gmodule.h>
#include <atomic>
GST_DEBUG_CATEGORY_STATIC (gst_d3d12_sdk_debug);
@ -91,13 +92,21 @@ enum
PROP_VENDOR_ID,
PROP_HARDWARE,
PROP_DESCRIPTION,
PROP_DEVICE_REMOVED_REASON,
};
static GParamSpec *pspec_removed_reason = nullptr;
/* *INDENT-OFF* */
using namespace Microsoft::WRL;
struct DeviceInner
{
DeviceInner ()
{
dev_removed_event = CreateEventEx (nullptr, nullptr, 0, EVENT_ALL_ACCESS);
}
~DeviceInner ()
{
Drain ();
@ -114,7 +123,13 @@ struct DeviceInner
factory = nullptr;
adapter = nullptr;
ReportLiveObjects ();
if (removed_reason == S_OK)
ReportLiveObjects ();
if (dev_removed_monitor_handle)
UnregisterWait (dev_removed_monitor_handle);
CloseHandle (dev_removed_event);
}
void Drain ()
@ -169,6 +184,24 @@ struct DeviceInner
info_queue->ClearStoredMessages ();
}
void AddClient (GstD3D12Device * client)
{
std::lock_guard <std::mutex> lk (lock);
clients.push_back (client);
}
void RemoveClient (GstD3D12Device * client)
{
std::lock_guard <std::mutex> lk (lock);
auto it = clients.begin ();
for (auto it = clients.begin (); it != clients.end(); it++) {
if (*it == client) {
clients.erase (it);
return;
}
}
}
ComPtr<ID3D12Device> device;
ComPtr<IDXGIAdapter1> adapter;
ComPtr<IDXGIFactory2> factory;
@ -196,6 +229,13 @@ struct DeviceInner
guint vendor_id = 0;
std::string description;
gint64 adapter_luid = 0;
HANDLE dev_removed_monitor_handle = nullptr;
HANDLE dev_removed_event;
ComPtr<ID3D12Fence> dev_removed_fence;
std::atomic<HRESULT> removed_reason = { S_OK };
std::vector<GstD3D12Device*> clients;
};
typedef std::shared_ptr<DeviceInner> DeviceInnerPtr;
@ -241,7 +281,7 @@ public:
GstD3D12Device * GetDevice (const GstD3D12DeviceConstructData * data)
{
std::lock_guard <std::mutex> lk (lock_);
std::lock_guard <std::recursive_mutex> lk (lock_);
auto it = std::find_if (list_.begin (), list_.end (),
[&] (const auto & device) {
if (data->type == GST_D3D12_DEVICE_CONSTRUCT_FOR_INDEX)
@ -261,6 +301,8 @@ public:
GST_DEBUG_OBJECT (device, "Reusing created device");
device->priv->inner->AddClient (device);
return device;
}
@ -275,12 +317,14 @@ public:
list_.push_back (device->priv->inner);
device->priv->inner->AddClient (device);
return device;
}
void ReleaseDevice (gint64 luid)
{
std::lock_guard <std::mutex> lk (lock_);
std::lock_guard <std::recursive_mutex> lk (lock_);
for (const auto & it : list_) {
if (it->adapter_luid == luid) {
if (it.use_count () == 1) {
@ -292,6 +336,52 @@ public:
}
}
void OnDeviceRemoved (gint64 luid)
{
std::lock_guard <std::recursive_mutex> lk (lock_);
DeviceInnerPtr ptr;
{
auto it = std::find_if (list_.begin (), list_.end (),
[&] (const auto & device) {
return device->adapter_luid == luid;
});
if (it == list_.end ())
return;
ptr = *it;
list_.erase (it);
}
UnregisterWait (ptr->dev_removed_monitor_handle);
ptr->dev_removed_monitor_handle = nullptr;
ptr->removed_reason = ptr->device->GetDeviceRemovedReason ();
if (SUCCEEDED (ptr->removed_reason))
ptr->removed_reason = DXGI_ERROR_DEVICE_REMOVED;
auto error_text = g_win32_error_message ((guint) ptr->removed_reason);
GST_ERROR ("Adapter LUID: %" G_GINT64_FORMAT
", DeviceRemovedReason: 0x%x, %s", ptr->adapter_luid,
(guint) ptr->removed_reason, GST_STR_NULL (error_text));
g_free (error_text);
std::vector<GstD3D12Device *> clients;
{
std::lock_guard<std::mutex> client_lk (ptr->lock);
for (auto it : ptr->clients) {
gst_object_ref (it);
clients.push_back (it);
}
}
for (auto it : clients) {
g_object_notify_by_pspec (G_OBJECT (it), pspec_removed_reason);
gst_object_unref (it);
}
}
private:
DeviceCacheManager () {}
~DeviceCacheManager () {}
@ -312,12 +402,20 @@ private:
}
private:
std::mutex lock_;
std::recursive_mutex lock_;
std::vector<DeviceInnerPtr> list_;
std::unordered_map<UINT,UINT> name_map_;
};
/* *INDENT-ON* */
static VOID NTAPI
on_device_removed (PVOID context, BOOLEAN unused)
{
DeviceInner *inner = (DeviceInner *) context;
auto manager = DeviceCacheManager::GetInstance ();
manager->OnDeviceRemoved (inner->adapter_luid);
}
static gboolean
gst_d3d12_device_enable_debug (void)
{
@ -373,6 +471,7 @@ gst_d3d12_device_enable_debug (void)
#define gst_d3d12_device_parent_class parent_class
G_DEFINE_TYPE (GstD3D12Device, gst_d3d12_device, GST_TYPE_OBJECT);
static void gst_d3d12_device_dispose (GObject * object);
static void gst_d3d12_device_finalize (GObject * object);
static void gst_d3d12_device_get_property (GObject * object, guint prop_id,
GValue * value, GParamSpec * pspec);
@ -385,6 +484,7 @@ gst_d3d12_device_class_init (GstD3D12DeviceClass * klass)
GParamFlags readable_flags =
(GParamFlags) (G_PARAM_READABLE | G_PARAM_STATIC_STRINGS);
gobject_class->dispose = gst_d3d12_device_dispose;
gobject_class->finalize = gst_d3d12_device_finalize;
gobject_class->get_property = gst_d3d12_device_get_property;
@ -409,6 +509,13 @@ gst_d3d12_device_class_init (GstD3D12DeviceClass * klass)
g_object_class_install_property (gobject_class, PROP_DESCRIPTION,
g_param_spec_string ("description", "Description",
"Human readable device description", nullptr, readable_flags));
pspec_removed_reason =
g_param_spec_int ("device-removed-reason", "Device Removed Reason",
"HRESULT code returned from ID3D12Device::GetDeviceRemovedReason",
G_MININT32, G_MAXINT32, 0, readable_flags);
g_object_class_install_property (gobject_class, PROP_DEVICE_REMOVED_REASON,
pspec_removed_reason);
}
static void
@ -417,6 +524,19 @@ gst_d3d12_device_init (GstD3D12Device * self)
self->priv = new GstD3D12DevicePrivate ();
}
static void
gst_d3d12_device_dispose (GObject * object)
{
auto self = GST_D3D12_DEVICE (object);
GST_DEBUG_OBJECT (self, "Dispose");
if (self->priv->inner)
self->priv->inner->RemoveClient (self);
G_OBJECT_CLASS (parent_class)->dispose (object);
}
static void
gst_d3d12_device_finalize (GObject * object)
{
@ -459,6 +579,9 @@ gst_d3d12_device_get_property (GObject * object, guint prop_id,
case PROP_DESCRIPTION:
g_value_set_string (value, priv->description.c_str ());
break;
case PROP_DEVICE_REMOVED_REASON:
g_value_set_int (value, priv->removed_reason);
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
break;
@ -795,6 +918,26 @@ gst_d3d12_device_new_internal (const GstD3D12DeviceConstructData * data)
GST_OBJECT_FLAG_SET (priv->copy_cl_pool, GST_OBJECT_FLAG_MAY_BE_LEAKED);
GST_OBJECT_FLAG_SET (priv->copy_ca_pool, GST_OBJECT_FLAG_MAY_BE_LEAKED);
hr = device->CreateFence (0,
D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS (&priv->dev_removed_fence));
if (FAILED (hr)) {
GST_ERROR_OBJECT (self, "Couldn't create device removed monitor fence");
gst_object_unref (self);
return nullptr;
}
hr = priv->dev_removed_fence->SetEventOnCompletion (G_MAXUINT64,
priv->dev_removed_event);
if (FAILED (hr)) {
GST_ERROR_OBJECT (self, "SetEventOnCompletion failed");
gst_object_unref (self);
return nullptr;
}
RegisterWaitForSingleObject (&priv->dev_removed_monitor_handle,
priv->dev_removed_event, on_device_removed, priv.get (), INFINITE,
WT_EXECUTEONLYONCE);
return self;
error:
@ -1445,3 +1588,16 @@ gst_d3d12_device_11on12_unlock (GstD3D12Device * device)
auto priv = device->priv->inner;
priv->device11on12_lock.unlock ();
}
void
gst_d3d12_device_check_device_removed (GstD3D12Device * device)
{
g_return_if_fail (GST_IS_D3D12_DEVICE (device));
auto priv = device->priv->inner;
auto hr = priv->device->GetDeviceRemovedReason ();
if (FAILED (hr)) {
auto manager = DeviceCacheManager::GetInstance ();
manager->OnDeviceRemoved (priv->adapter_luid);
}
}
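The monitoring set up in gst_d3d12_device_new_internal() relies on D3D12 signalling fences to UINT64_MAX once a device is removed, so a completion event armed at that value fires exactly once on device loss. A stripped-down C sketch of the same pattern (COBJMACROS and dxguid for IID_ID3D12Fence assumed; device, on_removed_cb and ctx are placeholders, error handling omitted):

  HANDLE event = CreateEventEx (NULL, NULL, 0, EVENT_ALL_ACCESS);
  HANDLE wait_handle = NULL;
  ID3D12Fence *fence = NULL;

  if (SUCCEEDED (ID3D12Device_CreateFence (device, 0, D3D12_FENCE_FLAG_NONE,
          &IID_ID3D12Fence, (void **) &fence))) {
    /* never reached by normal fence signalling, only by device removal */
    ID3D12Fence_SetEventOnCompletion (fence, G_MAXUINT64, event);
    /* on_removed_cb runs on a thread-pool thread once the event fires */
    RegisterWaitForSingleObject (&wait_handle, event, on_removed_cb, ctx,
        INFINITE, WT_EXECUTEONLYONCE);
  }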

View file

@ -649,6 +649,7 @@ gst_d3d12_buffer_copy_into (GstBuffer * dest, GstBuffer * src,
* @file: the file that checking the result code
* @function: the function that checking the result code
* @line: the line that checking the result code
* @level: #GstDebugLevel
*
* Prints debug message if @result code indicates the operation was failed.
*
@ -661,14 +662,11 @@ _gst_d3d12_result (HRESULT hr, GstD3D12Device * device, GstDebugCategory * cat,
const gchar * file, const gchar * function, gint line, GstDebugLevel level)
{
#ifndef GST_DISABLE_GST_DEBUG
gboolean ret = TRUE;
if (device)
gst_d3d12_device_d3d12_debug (device, file, function, line);
if (FAILED (hr)) {
gchar *error_text = nullptr;
error_text = g_win32_error_message ((guint) hr);
/* g_win32_error_message() doesn't cover all HERESULT return code,
* so it could be empty string, or nullptr if there was an error
@ -677,12 +675,14 @@ _gst_d3d12_result (HRESULT hr, GstD3D12Device * device, GstDebugCategory * cat,
nullptr, "D3D12 call failed: 0x%x, %s", (guint) hr,
GST_STR_NULL (error_text));
g_free (error_text);
ret = FALSE;
}
return ret;
#else
return SUCCEEDED (hr);
#endif
if (SUCCEEDED (hr))
return TRUE;
if (device)
gst_d3d12_device_check_device_removed (device);
return FALSE;
}

View file

@ -1452,6 +1452,28 @@ state_changed_cb (G_GNUC_UNUSED GstBus * bus, GstMessage * msg,
on_duration_changed (self, duration);
}
}
/* Try to query seek information again for live stream */
if (self->is_live) {
gboolean seekable = FALSE;
gboolean updated = FALSE;
GstQuery *query = gst_query_new_seeking (GST_FORMAT_TIME);
if (gst_element_query (self->playbin, query))
gst_query_parse_seeking (query, NULL, &seekable, NULL, NULL);
gst_query_unref (query);
g_mutex_lock (&self->lock);
if (self->media_info && seekable != self->media_info->seekable) {
self->media_info->seekable = seekable;
updated = TRUE;
}
g_mutex_unlock (&self->lock);
if (updated) {
on_media_info_updated (self);
}
}
/* api_bus_post_message (self, GST_PLAY_MESSAGE_POSITION_UPDATED, */
/* GST_PLAY_MESSAGE_DATA_POSITION, GST_TYPE_CLOCK_TIME, 0, NULL); */

View file

@ -1327,7 +1327,7 @@ static inline gboolean
_is_old_mesa (GstVaAllocator * va_allocator)
{
return GST_VA_DISPLAY_IS_IMPLEMENTATION (va_allocator->display, MESA_GALLIUM)
&& !gst_va_display_check_version (va_allocator->display, 23, 2);
&& !gst_va_display_check_version (va_allocator->display, 23, 3);
}
#endif /* G_OS_WIN32 */
@ -1376,13 +1376,15 @@ _update_image_info (GstVaAllocator * va_allocator,
}
va_allocator->use_derived = FALSE;
#else
/* XXX: Derived in Mesa <23.3 can't use derived images for P010 format
/* XXX: radeonsi in Mesa <23.3 can't use derived images in several
* cases
* https://gitlab.freedesktop.org/mesa/mesa/-/merge_requests/24381
* https://gitlab.freedesktop.org/mesa/mesa/-/merge_requests/26174
*/
if (va_allocator->img_format == GST_VIDEO_FORMAT_P010_10LE
&& _is_old_mesa (va_allocator)) {
if (_is_old_mesa (va_allocator)) {
if (feat_use_derived != GST_VA_FEATURE_DISABLED) {
GST_INFO_OBJECT (va_allocator, "Disable image derive on old Mesa.");
GST_INFO_OBJECT (va_allocator,
"Disable image derive on old Mesa (< 23.3).");
feat_use_derived = GST_VA_FEATURE_DISABLED;
}
va_allocator->use_derived = FALSE;

View file

@ -23,9 +23,6 @@
#endif
#include "gstvavideoformat.h"
#ifndef G_OS_WIN32
#include <libdrm/drm_fourcc.h>
#endif
#define GST_CAT_DEFAULT gst_va_display_debug
GST_DEBUG_CATEGORY_EXTERN (gst_va_display_debug);
@ -38,73 +35,62 @@ static struct FormatMap
GstVideoFormat format;
guint va_rtformat;
VAImageFormat va_format;
/* The drm fourcc may have different definition from VA */
guint drm_fourcc;
} format_map[] = {
#ifndef G_OS_WIN32
#define F(format, drm, fourcc, rtformat, order, bpp, depth, r, g, b, a) { \
#define F(format, fourcc, rtformat, order, bpp, depth, r, g, b, a) { \
G_PASTE (GST_VIDEO_FORMAT_, format), \
G_PASTE (VA_RT_FORMAT_, rtformat), \
{ VA_FOURCC fourcc, G_PASTE (G_PASTE (VA_, order), _FIRST), \
bpp, depth, r, g, b, a }, G_PASTE (DRM_FORMAT_, drm) }
#else
#define F(format, drm, fourcc, rtformat, order, bpp, depth, r, g, b, a) { \
G_PASTE (GST_VIDEO_FORMAT_, format), \
G_PASTE (VA_RT_FORMAT_, rtformat), \
{ VA_FOURCC fourcc, G_PASTE (G_PASTE (VA_, order), _FIRST), \
bpp, depth, r, g, b, a }, 0 /* DRM_FORMAT_INVALID */ }
#endif
#define G(format, drm, fourcc, rtformat, order, bpp) \
F (format, drm, fourcc, rtformat, order, bpp, 0, 0, 0 ,0, 0)
G (NV12, NV12, ('N', 'V', '1', '2'), YUV420, NSB, 12),
G (NV21, NV21, ('N', 'V', '2', '1'), YUV420, NSB, 21),
G (VUYA, AYUV, ('A', 'Y', 'U', 'V'), YUV444, LSB, 32),
F (RGBA, RGBA8888, ('R', 'G', 'B', 'A'), RGB32, LSB, 32, 32, 0x000000ff,
bpp, depth, r, g, b, a } }
#define G(format, fourcc, rtformat, order, bpp) \
F (format, fourcc, rtformat, order, bpp, 0, 0, 0 ,0, 0)
G (NV12, ('N', 'V', '1', '2'), YUV420, NSB, 12),
G (NV21, ('N', 'V', '2', '1'), YUV420, NSB, 21),
G (VUYA, ('A', 'Y', 'U', 'V'), YUV444, LSB, 32),
F (RGBA, ('R', 'G', 'B', 'A'), RGB32, LSB, 32, 32, 0x000000ff,
0x0000ff00, 0x00ff0000, 0xff000000),
F (RGBx, RGBX8888, ('R', 'G', 'B', 'X'), RGB32, LSB, 32, 24, 0x000000ff,
F (RGBx, ('R', 'G', 'B', 'X'), RGB32, LSB, 32, 24, 0x000000ff,
0x0000ff00, 0x00ff0000, 0x00000000),
F (BGRA, BGRA8888, ('B', 'G', 'R', 'A'), RGB32, LSB, 32, 32, 0x00ff0000,
F (BGRA, ('B', 'G', 'R', 'A'), RGB32, LSB, 32, 32, 0x00ff0000,
0x0000ff00, 0x000000ff, 0xff000000),
F (ARGB, ARGB8888, ('A', 'R', 'G', 'B'), RGB32, LSB, 32, 32, 0x0000ff00,
F (ARGB, ('A', 'R', 'G', 'B'), RGB32, LSB, 32, 32, 0x0000ff00,
0x00ff0000, 0xff000000, 0x000000ff),
F (xRGB, XRGB8888, ('X', 'R', 'G', 'B'), RGB32, LSB, 32, 24, 0x0000ff00,
F (xRGB, ('X', 'R', 'G', 'B'), RGB32, LSB, 32, 24, 0x0000ff00,
0x00ff0000, 0xff000000, 0x00000000),
F (ABGR, ABGR8888, ('A', 'B', 'G', 'R'), RGB32, LSB, 32, 32, 0xff000000,
F (ABGR, ('A', 'B', 'G', 'R'), RGB32, LSB, 32, 32, 0xff000000,
0x00ff0000, 0x0000ff00, 0x000000ff),
F (xBGR, XBGR8888, ('X', 'B', 'G', 'R'), RGB32, LSB, 32, 24, 0xff000000,
F (xBGR, ('X', 'B', 'G', 'R'), RGB32, LSB, 32, 24, 0xff000000,
0x00ff0000, 0x0000ff00, 0x00000000),
F (BGRx, BGRX8888, ('B', 'G', 'R', 'X'), RGB32, LSB, 32, 24, 0x00ff0000,
F (BGRx, ('B', 'G', 'R', 'X'), RGB32, LSB, 32, 24, 0x00ff0000,
0x0000ff00, 0x000000ff, 0x00000000),
G (UYVY, UYVY, ('U', 'Y', 'V', 'Y'), YUV422, NSB, 16),
G (YUY2, YUYV, ('Y', 'U', 'Y', '2'), YUV422, NSB, 16),
G (AYUV, AYUV, ('A', 'Y', 'U', 'V'), YUV444, LSB, 32),
G (UYVY, ('U', 'Y', 'V', 'Y'), YUV422, NSB, 16),
G (YUY2, ('Y', 'U', 'Y', '2'), YUV422, NSB, 16),
G (AYUV, ('A', 'Y', 'U', 'V'), YUV444, LSB, 32),
/* F (????, NV11), */
G (YV12, YVU420, ('Y', 'V', '1', '2'), YUV420, NSB, 12),
G (YV12, ('Y', 'V', '1', '2'), YUV420, NSB, 12),
/* F (????, P208), */
G (I420, YUV420, ('I', '4', '2', '0'), YUV420, NSB, 12),
G (I420, ('I', '4', '2', '0'), YUV420, NSB, 12),
/* F (????, YV24), */
/* F (????, YV32), */
/* F (????, Y800), */
/* F (????, IMC3), */
/* F (????, 411P), */
/* F (????, 411R), */
G (Y42B, YUV422, ('4', '2', '2', 'H'), YUV422, LSB, 16),
G (Y42B, ('4', '2', '2', 'H'), YUV422, LSB, 16),
/* F (????, 422V), */
/* F (????, 444P), */
/* No RGBP support in drm fourcc */
G (RGBP, INVALID, ('R', 'G', 'B', 'P'), RGBP, LSB, 8),
G (RGBP, ('R', 'G', 'B', 'P'), RGBP, LSB, 8),
/* F (????, BGRP), */
/* F (????, RGB565), */
/* F (????, BGR565), */
G (Y210, Y210, ('Y', '2', '1', '0'), YUV422_10, NSB, 32),
G (Y210, ('Y', '2', '1', '0'), YUV422_10, NSB, 32),
/* F (????, Y216), */
G (Y410, Y410, ('Y', '4', '1', '0'), YUV444_10, NSB, 32),
G (Y212_LE, Y212, ('Y', '2', '1', '2'), YUV422_12, NSB, 32),
G (Y412_LE, Y412, ('Y', '4', '1', '2'), YUV444_12, NSB, 32),
G (Y410, ('Y', '4', '1', '0'), YUV444_10, NSB, 32),
G (Y212_LE, ('Y', '2', '1', '2'), YUV422_12, NSB, 32),
G (Y412_LE, ('Y', '4', '1', '2'), YUV444_12, NSB, 32),
/* F (????, Y416), */
/* F (????, YV16), */
G (P010_10LE, P010, ('P', '0', '1', '0'), YUV420_10, NSB, 24),
G (P012_LE, P012, ('P', '0', '1', '2'), YUV420_12, NSB, 24),
G (P010_10LE, ('P', '0', '1', '0'), YUV420_10, NSB, 24),
G (P012_LE, ('P', '0', '1', '2'), YUV420_12, NSB, 24),
/* F (P016_LE, P016, ????), */
/* F (????, I010), */
/* F (????, IYUV), */
@ -112,20 +98,19 @@ static struct FormatMap
/* F (????, A2B10G10R10), */
/* F (????, X2R10G10B10), */
/* F (????, X2B10G10R10), */
/* No GRAY8 support in drm fourcc */
G (GRAY8, INVALID, ('Y', '8', '0', '0'), YUV400, NSB, 8),
G (Y444, YUV444, ('4', '4', '4', 'P'), YUV444, NSB, 24),
G (GRAY8, ('Y', '8', '0', '0'), YUV400, NSB, 8),
G (Y444, ('4', '4', '4', 'P'), YUV444, NSB, 24),
/* F (????, Y16), */
/* G (VYUY, VYUY, YUV422), */
/* G (YVYU, YVYU, YUV422), */
/* F (ARGB64, ARGB64, ????), */
/* F (????, ABGR64), */
F (RGB16, RGB565, ('R', 'G', '1', '6'), RGB16, NSB, 16, 16, 0x0000f800,
F (RGB16, ('R', 'G', '1', '6'), RGB16, NSB, 16, 16, 0x0000f800,
0x000007e0, 0x0000001f, 0x00000000),
F (RGB, RGB888, ('R', 'G', '2', '4'), RGB32, NSB, 32, 24, 0x00ff0000,
F (RGB, ('R', 'G', '2', '4'), RGB32, NSB, 32, 24, 0x00ff0000,
0x0000ff00, 0x000000ff, 0x00000000),
F (BGR10A2_LE, ARGB2101010, ('A', 'R', '3', '0'), RGB32, LSB, 32, 30,
0x3ff00000, 0x000ffc00, 0x000003ff, 0x30000000),
F (BGR10A2_LE, ('A', 'R', '3', '0'), RGB32, LSB, 32, 30, 0x3ff00000,
0x000ffc00, 0x000003ff, 0x30000000),
#undef F
#undef G
};
@ -133,54 +118,47 @@ static struct FormatMap
static const struct RBG32FormatMap
{
GstVideoFormat format;
guint drm_fourcc;
VAImageFormat va_format[2];
} rgb32_format_map[] = {
#define F(fourcc, order, bpp, depth, r, g, b, a) \
{ VA_FOURCC fourcc, G_PASTE (G_PASTE (VA_, order), _FIRST), bpp, depth, r, g, b, a }
#define A(fourcc, order, r, g, b, a) F (fourcc, order, 32, 32, r, g, b, a)
#define X(fourcc, order, r, g, b) F (fourcc, order, 32, 24, r, g, b, 0x0)
#ifndef G_OS_WIN32
#define D(format, drm_fourcc) G_PASTE (GST_VIDEO_FORMAT_, format), G_PASTE (DRM_FORMAT_, drm_fourcc)
#else
#define D(format, drm_fourcc) G_PASTE (GST_VIDEO_FORMAT_, format), 0 /* DRM_FORMAT_INVALID */
#endif
{ D (ARGB, BGRA8888), {
{ GST_VIDEO_FORMAT_ARGB, {
A (('B', 'G', 'R', 'A'), LSB, 0x0000ff00, 0x00ff0000, 0xff000000, 0x000000ff),
A (('A', 'R', 'G', 'B'), MSB, 0x00ff0000, 0x0000ff00, 0x000000ff, 0xff000000),
} },
{ D (RGBA, ABGR8888), {
{ GST_VIDEO_FORMAT_RGBA, {
A (('A', 'B', 'G', 'R'), LSB, 0x000000ff, 0x0000ff00, 0x00ff0000, 0xff000000),
A (('R', 'G', 'B', 'A'), MSB, 0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff),
} },
{ D (ABGR, RGBA8888), {
{ GST_VIDEO_FORMAT_ABGR, {
A (('R', 'G', 'B', 'A'), LSB, 0xff000000, 0x00ff0000, 0x0000ff00, 0x000000ff),
A (('A', 'B', 'G', 'R'), MSB, 0x000000ff, 0x0000ff00, 0x00ff0000, 0xff000000),
} },
{ D (BGRA, ARGB8888), {
{ GST_VIDEO_FORMAT_BGRA, {
A (('A', 'R', 'G', 'B'), LSB, 0x00ff0000, 0x0000ff00, 0x000000ff, 0xff000000),
A (('B', 'G', 'R', 'A'), MSB, 0x0000ff00, 0x00ff0000, 0xff000000, 0x000000ff),
} },
{ D (xRGB, BGRX8888), {
{ GST_VIDEO_FORMAT_xRGB, {
X (('B', 'G', 'R', 'X'), LSB, 0x0000ff00, 0x00ff0000, 0xff000000),
X (('X', 'R', 'G', 'B'), MSB, 0x00ff0000, 0x0000ff00, 0x000000ff),
} },
{ D (RGBx, XBGR8888), {
{ GST_VIDEO_FORMAT_RGBx, {
X (('X', 'B', 'G', 'R'), LSB, 0x000000ff, 0x0000ff00, 0x00ff0000),
X (('R', 'G', 'B', 'X'), MSB, 0xff000000, 0x00ff0000, 0x0000ff00),
} },
{ D (xBGR, RGBX8888), {
{ GST_VIDEO_FORMAT_xBGR, {
X (('R', 'G', 'B', 'X'), LSB, 0xff000000, 0x00ff0000, 0x0000ff00),
X (('X', 'B', 'G', 'R'), MSB, 0x000000ff, 0x0000ff00, 0x00ff0000),
} },
{ D (BGRx, XRGB8888), {
{ GST_VIDEO_FORMAT_BGRx, {
X (('X', 'R', 'G', 'B'), LSB, 0x00ff0000, 0x0000ff00, 0x000000ff),
X (('B', 'G', 'R', 'X'), MSB, 0x0000ff00, 0x00ff0000, 0xff000000),
} },
#undef X
#undef A
#undef F
#undef D
};
/* *INDENT-ON* */
@ -201,9 +179,11 @@ static const struct FormatMap *
get_format_map_from_drm_fourcc (guint drm_fourcc)
{
int i;
GstVideoFormat format;
format = gst_video_dma_drm_fourcc_to_format (drm_fourcc);
for (i = 0; i < G_N_ELEMENTS (format_map); i++) {
if (format_map[i].drm_fourcc == drm_fourcc)
if (format_map[i].format == format)
return &format_map[i];
}
@ -263,6 +243,20 @@ get_format_map_from_va_image_format (const VAImageFormat * va_format)
return NULL;
}
/**
* XXX: there are two potentially different fourcc: the VA and the DRM.
*
* Normally they should be the same, but there are a couple formats where VA's
* fourcc is different from the DRM's fourcc. One example is
* GST_VIDEO_FORMAT_I420, where VA's fourcc is ('I', '4', '2', '0') while DRM's
* is ('Y', 'U', '1', '2').
*
* That's the reason there are two functions:
* gst_va_video_format_from_va_fourcc() and
* gst_va_video_format_from_drm_fourcc(). They should be used depending on
* where the value is going to be used: for VA concerns use the former; for
* DMABuf export, the latter.
*/
GstVideoFormat
gst_va_video_format_from_va_fourcc (guint fourcc)
{
@ -292,7 +286,7 @@ gst_va_drm_fourcc_from_video_format (GstVideoFormat format)
{
const struct FormatMap *map = get_format_map_from_video_format (format);
return map ? map->drm_fourcc : 0;
return map ? gst_video_dma_drm_fourcc_from_format (format) : 0;
}
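A small illustration of why the table above no longer needs its own DRM fourcc column: the VA fourcc still comes from the VAImageFormat, while the DRM fourcc is now derived from the GstVideoFormat by the generic video library, and the two can disagree, as the comment above notes for I420. A sketch, assuming a Linux build with the VA and DRM headers available:

  GstVideoFormat fmt;
  guint drm_fourcc;

  /* VA side: I420 uses the fourcc 'I420' */
  fmt = gst_va_video_format_from_va_fourcc (VA_FOURCC ('I', '4', '2', '0'));
  /* fmt == GST_VIDEO_FORMAT_I420 */

  /* DRM side: the same format maps to DRM_FORMAT_YUV420, i.e. 'YU12' */
  drm_fourcc = gst_va_drm_fourcc_from_video_format (fmt);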
guint
@ -406,15 +400,13 @@ gst_va_dma_drm_info_to_video_info (const GstVideoInfoDmaDrm * drm_info,
}
static GstVideoFormat
find_gst_video_format_in_rgb32_map (VAImageFormat * image_format,
guint * drm_fourcc)
find_gst_video_format_in_rgb32_map (VAImageFormat * image_format)
{
guint i, j;
for (i = 0; i < G_N_ELEMENTS (rgb32_format_map); i++) {
for (j = 0; j < G_N_ELEMENTS (rgb32_format_map[i].va_format); j++) {
if (va_format_is_same (&rgb32_format_map[i].va_format[j], image_format)) {
*drm_fourcc = rgb32_format_map[i].drm_fourcc;
return rgb32_format_map[i].format;
}
}
@ -436,14 +428,13 @@ fix_map (gpointer data)
GstVideoFormat format;
VAImageFormat *image_format;
struct FormatMap *map;
guint drm_fourcc = 0;
guint i;
for (i = 0; i < args->len; i++) {
image_format = &args->image_formats[i];
if (!va_format_is_rgb (image_format))
continue;
format = find_gst_video_format_in_rgb32_map (image_format, &drm_fourcc);
format = find_gst_video_format_in_rgb32_map (image_format);
if (format == GST_VIDEO_FORMAT_UNKNOWN)
continue;
map = get_format_map_from_video_format (format);
@ -453,14 +444,11 @@ fix_map (gpointer data)
continue;
map->va_format = *image_format;
map->drm_fourcc = drm_fourcc;
GST_INFO ("GST_VIDEO_FORMAT_%s => { fourcc %" GST_FOURCC_FORMAT ", "
"drm fourcc %" GST_FOURCC_FORMAT ", %s, bpp %d, depth %d, "
"R %#010x, G %#010x, B %#010x, A %#010x }",
gst_video_format_to_string (map->format),
GST_INFO ("GST_VIDEO_FORMAT_%s => { fourcc %"
GST_FOURCC_FORMAT ", %s, bpp %d, depth %d, R %#010x, G %#010x, "
"B %#010x, A %#010x }", gst_video_format_to_string (map->format),
GST_FOURCC_ARGS (map->va_format.fourcc),
GST_FOURCC_ARGS (map->drm_fourcc),
(map->va_format.byte_order == 1) ? "LSB" : "MSB",
map->va_format.bits_per_pixel, map->va_format.depth,
map->va_format.red_mask, map->va_format.green_mask,

View file

@ -22,7 +22,7 @@
#include "config.h"
#endif
#include "gstvkdecoder.h"
#include "gstvkdecoder-private.h"
#include "gstvkoperation.h"
#include "gstvkphysicaldevice-private.h"
@ -39,6 +39,7 @@
* Since: 1.24
*/
typedef struct _GstVulkanDecoderPrivate GstVulkanDecoderPrivate;
struct _GstVulkanDecoderPrivate
{
GstVulkanHandle *empty_params;
@ -1309,3 +1310,72 @@ gst_vulkan_decoder_wait (GstVulkanDecoder * self)
return TRUE;
}
/**
* gst_vulkan_decoder_new_from_queue:
* @queue: a #GstVulkanQueue
* @codec: (type guint): the VkVideoCodecOperationFlagBitsKHR to decode
*
* Creates a #GstVulkanDecoder object if @codec decoding is supported by @queue
*
* Returns: (transfer full) (nullable): the #GstVulkanDecoder object
*/
GstVulkanDecoder *
gst_vulkan_decoder_new_from_queue (GstVulkanQueue * queue, guint codec)
{
GstVulkanPhysicalDevice *device;
GstVulkanDecoder *decoder;
guint flags, expected_flag, supported_video_ops;
const char *extension;
g_return_val_if_fail (GST_IS_VULKAN_QUEUE (queue), NULL);
device = queue->device->physical_device;
expected_flag = VK_QUEUE_VIDEO_DECODE_BIT_KHR;
flags = device->queue_family_props[queue->family].queueFlags;
supported_video_ops = device->queue_family_ops[queue->family].video;
if (device->properties.apiVersion < VK_MAKE_VERSION (1, 3, 238)) {
GST_WARNING_OBJECT (queue,
"Driver API version [%d.%d.%d] doesn't support Video extensions",
VK_VERSION_MAJOR (device->properties.apiVersion),
VK_VERSION_MINOR (device->properties.apiVersion),
VK_VERSION_PATCH (device->properties.apiVersion));
return NULL;
}
switch (codec) {
case VK_VIDEO_CODEC_OPERATION_DECODE_H264_BIT_KHR:
extension = VK_KHR_VIDEO_DECODE_H264_EXTENSION_NAME;
break;
case VK_VIDEO_CODEC_OPERATION_DECODE_H265_BIT_KHR:
extension = VK_KHR_VIDEO_DECODE_H265_EXTENSION_NAME;
break;
default:
GST_WARNING_OBJECT (queue, "Unsupported codec %u", codec);
return NULL;
}
if ((flags & expected_flag) != expected_flag) {
GST_WARNING_OBJECT (queue, "Queue doesn't support decoding");
return NULL;
}
if ((supported_video_ops & codec) != codec) {
GST_WARNING_OBJECT (queue, "Queue doesn't support codec %u decoding",
codec);
return NULL;
}
if (!(gst_vulkan_device_is_extension_enabled (queue->device,
VK_KHR_VIDEO_QUEUE_EXTENSION_NAME)
&& gst_vulkan_device_is_extension_enabled (queue->device,
VK_KHR_VIDEO_DECODE_QUEUE_EXTENSION_NAME)
&& gst_vulkan_device_is_extension_enabled (queue->device, extension)))
return NULL;
decoder = g_object_new (GST_TYPE_VULKAN_DECODER, NULL);
gst_object_ref_sink (decoder);
decoder->queue = gst_object_ref (queue);
decoder->codec = codec;
return decoder;
}
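For in-tree elements the change is mechanical: include gstvkdecoder-private.h and call the new constructor, as the H.264/H.265 decoder hunks above do. A condensed sketch of the open() path, assuming self->decode_queue was already selected and the error message is illustrative:

  self->decoder = gst_vulkan_decoder_new_from_queue (self->decode_queue,
      VK_VIDEO_CODEC_OPERATION_DECODE_H264_BIT_KHR);
  if (!self->decoder) {
    GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND,
        ("Device doesn't support H.264 decoding"), (NULL));
    return FALSE;
  }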

View file

@ -21,7 +21,6 @@
#pragma once
#include <gst/vulkan/gstvkqueue.h>
#include <gst/vulkan/gstvkvideoutils.h>
G_BEGIN_DECLS
@ -34,6 +33,11 @@ G_BEGIN_DECLS
GST_VULKAN_API
GType gst_vulkan_decoder_get_type (void);
typedef struct _GstVulkanDecoder GstVulkanDecoder;
typedef struct _GstVulkanDecoderClass GstVulkanDecoderClass;
typedef struct _GstVulkanDecoderPicture GstVulkanDecoderPicture;
typedef union _GstVulkanDecoderParameters GstVulkanDecoderParameters;
/**
* GstVulkanDecoderPicture:
* @out: output buffer
@ -132,6 +136,9 @@ union _GstVulkanDecoderParameters
G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanDecoder, gst_object_unref)
GST_VULKAN_API
GstVulkanDecoder * gst_vulkan_decoder_new_from_queue (GstVulkanQueue * queue,
guint codec);
GST_VULKAN_API
gboolean gst_vulkan_decoder_start (GstVulkanDecoder * self,
GstVulkanVideoProfile * profile,

View file

@ -23,6 +23,9 @@
#endif
#include "gstvkqueue.h"
#if GST_VULKAN_HAVE_VIDEO_EXTENSIONS
#include "gstvkdecoder-private.h"
#endif
/**
* SECTION:vkqueue
@ -154,81 +157,6 @@ error:
return NULL;
}
/**
* gst_vulkan_queue_create_decoder:
* @queue: a #GstVulkanQueue
* @codec: (type guint): the VkVideoCodecOperationFlagBitsKHR to decode
*
* Creates a #GstVulkanDecoder object if @codec decoding is supported by @queue
*
* Returns: (transfer full) (nullable): the #GstVulkanDecoder object
*
* Since: 1.24
*/
GstVulkanDecoder *
gst_vulkan_queue_create_decoder (GstVulkanQueue * queue, guint codec)
{
#if GST_VULKAN_HAVE_VIDEO_EXTENSIONS
GstVulkanPhysicalDevice *device;
GstVulkanDecoder *decoder;
guint flags, expected_flag, supported_video_ops;
const char *extension;
g_return_val_if_fail (GST_IS_VULKAN_QUEUE (queue), NULL);
device = queue->device->physical_device;
expected_flag = VK_QUEUE_VIDEO_DECODE_BIT_KHR;
flags = device->queue_family_props[queue->family].queueFlags;
supported_video_ops = device->queue_family_ops[queue->family].video;
if (device->properties.apiVersion < VK_MAKE_VERSION (1, 3, 238)) {
GST_WARNING_OBJECT (queue,
"Driver API version [%d.%d.%d] doesn't support Video extensions",
VK_VERSION_MAJOR (device->properties.apiVersion),
VK_VERSION_MINOR (device->properties.apiVersion),
VK_VERSION_PATCH (device->properties.apiVersion));
return NULL;
}
switch (codec) {
case VK_VIDEO_CODEC_OPERATION_DECODE_H264_BIT_KHR:
extension = VK_KHR_VIDEO_DECODE_H264_EXTENSION_NAME;
break;
case VK_VIDEO_CODEC_OPERATION_DECODE_H265_BIT_KHR:
extension = VK_KHR_VIDEO_DECODE_H265_EXTENSION_NAME;
break;
default:
GST_WARNING_OBJECT (queue, "Unsupported codec %u", codec);
return NULL;
}
if ((flags & expected_flag) != expected_flag) {
GST_WARNING_OBJECT (queue, "Queue doesn't support decoding");
return NULL;
}
if ((supported_video_ops & codec) != codec) {
GST_WARNING_OBJECT (queue, "Queue doesn't support codec %u decoding",
codec);
return NULL;
}
if (!(gst_vulkan_device_is_extension_enabled (queue->device,
VK_KHR_VIDEO_QUEUE_EXTENSION_NAME)
&& gst_vulkan_device_is_extension_enabled (queue->device,
VK_KHR_VIDEO_DECODE_QUEUE_EXTENSION_NAME)
&& gst_vulkan_device_is_extension_enabled (queue->device, extension)))
return NULL;
decoder = g_object_new (GST_TYPE_VULKAN_DECODER, NULL);
gst_object_ref_sink (decoder);
decoder->queue = gst_object_ref (queue);
decoder->codec = codec;
return decoder;
#else
return NULL;
#endif
}
/**
* gst_context_set_vulkan_queue:
* @context: a #GstContext

View file

@ -23,9 +23,6 @@
#include <gst/vulkan/gstvkdevice.h>
#include <gst/vulkan/gstvkcommandpool.h>
#if GST_VULKAN_HAVE_VIDEO_EXTENSIONS
#include <gst/vulkan/gstvkdecoder.h>
#endif
#define GST_TYPE_VULKAN_QUEUE (gst_vulkan_queue_get_type())
#define GST_VULKAN_QUEUE(o) (G_TYPE_CHECK_INSTANCE_CAST((o), GST_TYPE_VULKAN_QUEUE, GstVulkanQueue))
@ -83,15 +80,12 @@ struct _GstVulkanQueueClass
G_DEFINE_AUTOPTR_CLEANUP_FUNC (GstVulkanQueue, gst_object_unref)
GST_VULKAN_API
GST_VULKAN_API
GstVulkanDevice * gst_vulkan_queue_get_device (GstVulkanQueue * queue);
GST_VULKAN_API
GstVulkanCommandPool * gst_vulkan_queue_create_command_pool (GstVulkanQueue * queue,
GError ** error);
GST_VULKAN_API
GstVulkanDecoder * gst_vulkan_queue_create_decoder (GstVulkanQueue * queue,
guint codec);
GST_VULKAN_API
void gst_vulkan_queue_submit_lock (GstVulkanQueue * queue);

View file

@ -178,8 +178,18 @@ _get_function_table (GstVulkanSwapper * swapper)
}
static VkColorSpaceKHR
_vk_color_space_from_video_info (GstVideoInfo * v_info)
_vk_color_space_from_vk_format (GstVulkanSwapper * swapper, VkFormat format)
{
GstVulkanSwapperPrivate *priv = GET_PRIV (swapper);
int i;
for (i = 0; i < priv->n_surf_formats; i++) {
if (format != priv->surf_formats[i].format)
continue;
return priv->surf_formats[i].colorSpace;
}
GST_WARNING_OBJECT (swapper, "unsupported format for swapper's colorspaces");
return VK_COLORSPACE_SRGB_NONLINEAR_KHR;
}
@ -790,7 +800,7 @@ _allocate_swapchain (GstVulkanSwapper * swapper, GstCaps * caps,
}
format = gst_vulkan_format_from_video_info (&priv->v_info, 0);
color_space = _vk_color_space_from_video_info (&priv->v_info);
color_space = _vk_color_space_from_vk_format (swapper, format);
if ((priv->surf_props.supportedCompositeAlpha &
VK_COMPOSITE_ALPHA_OPAQUE_BIT_KHR) != 0) {

View file

@ -333,12 +333,9 @@ static StdVideoH265PictureParameterSet h265_pps;
endif
if have_vk_video
vulkan_conf.set('GST_VULKAN_HAVE_VIDEO_EXTENSIONS', 1)
vulkan_priv_sources += files('gstvkvideo-private.c')
vulkan_sources += files(
'gstvkdecoder.c',
)
vulkan_headers += files(
'gstvkdecoder.h'
vulkan_priv_sources += files(
'gstvkvideo-private.c',
'gstvkdecoder-private.c',
)
elif get_option('vulkan-video').enabled()
error('Vulkan Video extensions headers not found')
@ -501,4 +498,3 @@ if enabled_vulkan_winsys.contains('wayland')
sources : vulkan_wayland_gir)
meson.override_dependency('gstreamer-vulkan-wayland-1.0', gstvulkanwayland_dep)
endif

View file

@ -119,12 +119,6 @@ typedef struct _GstVulkanOperation GstVulkanOperation;
typedef struct _GstVulkanOperationClass GstVulkanOperationClass;
typedef struct _GstVulkanOperationPrivate GstVulkanOperationPrivate;
typedef struct _GstVulkanDecoder GstVulkanDecoder;
typedef struct _GstVulkanDecoderClass GstVulkanDecoderClass;
typedef struct _GstVulkanDecoderPrivate GstVulkanDecoderPrivate;
typedef union _GstVulkanDecoderParameters GstVulkanDecoderParameters;
typedef struct _GstVulkanDecoderPicture GstVulkanDecoderPicture;
G_END_DECLS
#endif /* __GST_VULKAN_FWD_H__ */

View file

@ -196,6 +196,13 @@ gst_wl_window_finalize (GObject * gobject)
gst_wl_display_callback_destroy (priv->display, &priv->frame_callback);
gst_wl_display_callback_destroy (priv->display, &priv->commit_callback);
if (priv->staged_buffer)
gst_wl_buffer_unref_buffer (priv->staged_buffer);
g_cond_clear (&priv->configure_cond);
g_mutex_clear (&priv->configure_mutex);
g_mutex_clear (&priv->window_lock);
if (priv->xdg_toplevel)
xdg_toplevel_destroy (priv->xdg_toplevel);
if (priv->xdg_surface)

View file

@ -684,6 +684,7 @@ mpegts_base_program_remove_stream (MpegTSBase * base,
program->streams[pid] = NULL;
}
#if 0 /* Smart-update disabled */
/* Check if pmtstream is already present in the program */
static inline gboolean
_stream_in_pmt (const GstMpegtsPMT * pmt, MpegTSBaseStream * stream)
@ -768,7 +769,7 @@ mpegts_base_update_program (MpegTSBase * base, MpegTSBaseProgram * program,
g_list_free (toremove);
return TRUE;
}
#endif
static gboolean
_stream_is_private_section (const GstMpegtsPMT * pmt,
@ -814,7 +815,6 @@ mpegts_base_is_same_program (MpegTSBase * base, MpegTSBaseProgram * oldprogram,
{
guint i, nbstreams;
MpegTSBaseStream *oldstream;
gboolean sawpcrpid = FALSE;
if (oldprogram->pmt_pid != new_pmt_pid) {
GST_DEBUG ("Different pmt_pid (new:0x%04x, old:0x%04x)", new_pmt_pid,
@ -828,7 +828,6 @@ mpegts_base_is_same_program (MpegTSBase * base, MpegTSBaseProgram * oldprogram,
return FALSE;
}
/* Check the streams */
nbstreams = new_pmt->streams->len;
for (i = 0; i < nbstreams; ++i) {
GstMpegtsPMTStream *stream = g_ptr_array_index (new_pmt->streams, i);
@ -844,17 +843,14 @@ mpegts_base_is_same_program (MpegTSBase * base, MpegTSBaseProgram * oldprogram,
stream->pid, stream->stream_type, oldstream->stream_type);
return FALSE;
}
if (stream->pid == oldprogram->pcr_pid)
sawpcrpid = TRUE;
}
/* If we have a PCR PID and the pcr is not shared with an existing stream, we'll have one extra stream */
if (!sawpcrpid && !base->ignore_pcr)
nbstreams += 1;
/* We can now just check the number of streams from each PMT. The check for
* PCR was already done previously */
if (nbstreams != g_list_length (oldprogram->stream_list)) {
GST_DEBUG ("Different number of streams (new:%d, old:%d)",
nbstreams, g_list_length (oldprogram->stream_list));
if (nbstreams != oldprogram->pmt->streams->len) {
GST_DEBUG ("Different number of streams (new:%d, old:%u)",
nbstreams, oldprogram->pmt->streams->len);
return FALSE;
}
@ -862,6 +858,7 @@ mpegts_base_is_same_program (MpegTSBase * base, MpegTSBaseProgram * oldprogram,
return TRUE;
}
#if 0 /* Smart-update disabled */
/* Return TRUE if program is an update
*
* A program is equal if:
@ -923,6 +920,7 @@ mpegts_base_is_program_update (MpegTSBase * base,
GST_DEBUG ("Program is not an update of the previous one");
return FALSE;
}
#endif
static void
mpegts_base_deactivate_program (MpegTSBase * base, MpegTSBaseProgram * program)
@ -1208,6 +1206,8 @@ mpegts_base_apply_pmt (MpegTSBase * base, GstMpegtsSection * section)
pmt)))
goto same_program;
#if 0
/* parsebin doesn't support program update properly. Disable this feature for now */
if (base->streams_aware
&& mpegts_base_is_program_update (base, old_program, section->pid, pmt)) {
GST_FIXME ("We are streams_aware and new program is an update");
@ -1215,6 +1215,7 @@ mpegts_base_apply_pmt (MpegTSBase * base, GstMpegtsSection * section)
mpegts_base_update_program (base, old_program, section, pmt);
goto beach;
}
#endif
/* If the current program is active, this means we have a new program */
if (old_program->active) {
@ -1247,7 +1248,9 @@ mpegts_base_apply_pmt (MpegTSBase * base, GstMpegtsSection * section)
mpegts_base_activate_program (base, program, section->pid, section, pmt,
initial_program);
#if 0 /* Smart-update disabled */
beach:
#endif
GST_DEBUG ("Done activating program");
return TRUE;

View file

@ -1681,6 +1681,12 @@ tsmux_write_stream_packet (TsMux * mux, TsMuxStream * stream)
gst_buffer_unmap (buf, &map);
GST_DEBUG ("Writing PES of size %d", (int) gst_buffer_get_size (buf));
/* In some cases tsmux_packet_out() calls tsmux_write_ts_header(), which
* may produce two sequential buffers with packet_start_unit_indicator
* set and break some decoders.
*/
pi->packet_start_unit_indicator = FALSE;
res = tsmux_packet_out (mux, buf, new_pcr);
/* Reset all dynamic flags */

View file

@ -428,12 +428,22 @@ send_command_to_all (GstUnixFdSink * self, CommandType type, GUnixFDList * fds,
}
static GstClockTime
calculate_timestamp (GstClockTime timestamp, GstClockTime base_time,
GstClockTime latency, GstClockTimeDiff clock_diff)
to_monotonic (GstClockTime timestamp, const GstSegment * segment,
GstClockTime base_time, GstClockTime latency, GstClockTimeDiff clock_diff)
{
if (GST_CLOCK_TIME_IS_VALID (timestamp)) {
/* Convert running time to pipeline clock time */
timestamp += base_time;
gint res =
gst_segment_to_running_time_full (segment, GST_FORMAT_TIME, timestamp,
&timestamp);
if (res == 0)
return GST_CLOCK_TIME_NONE;
else if (res > 0)
timestamp += base_time;
else if (base_time > timestamp)
timestamp = base_time - timestamp;
else
timestamp = 0;
if (GST_CLOCK_TIME_IS_VALID (latency))
timestamp += latency;
/* Convert to system monotonic clock time */
@ -485,11 +495,11 @@ gst_unix_fd_sink_render (GstBaseSink * bsink, GstBuffer * buffer)
* id so we know which buffer to unref. */
new_buffer->id = (guint64) buffer;
new_buffer->pts =
calculate_timestamp (GST_BUFFER_PTS (buffer), base_time, latency,
clock_diff);
to_monotonic (GST_BUFFER_PTS (buffer),
&GST_BASE_SINK_CAST (self)->segment, base_time, latency, clock_diff);
new_buffer->dts =
calculate_timestamp (GST_BUFFER_DTS (buffer), base_time, latency,
clock_diff);
to_monotonic (GST_BUFFER_DTS (buffer),
&GST_BASE_SINK_CAST (self)->segment, base_time, latency, clock_diff);
new_buffer->duration = GST_BUFFER_DURATION (buffer);
new_buffer->offset = GST_BUFFER_OFFSET (buffer);
new_buffer->offset_end = GST_BUFFER_OFFSET_END (buffer);
@ -498,6 +508,15 @@ gst_unix_fd_sink_render (GstBaseSink * bsink, GstBuffer * buffer)
new_buffer->n_memory = n_memory;
new_buffer->n_meta = n_meta;
if ((GST_BUFFER_PTS_IS_VALID (buffer)
&& !GST_CLOCK_TIME_IS_VALID (new_buffer->pts))
|| (GST_BUFFER_DTS_IS_VALID (buffer)
&& !GST_CLOCK_TIME_IS_VALID (new_buffer->dts))) {
GST_ERROR_OBJECT (self,
"Could not convert buffer timestamp to running time");
return GST_FLOW_ERROR;
}
gboolean dmabuf_count = 0;
GUnixFDList *fds = g_unix_fd_list_new ();
for (int i = 0; i < n_memory; i++) {

View file

@ -89,31 +89,34 @@ typedef struct
guint n_memory;
} BufferContext;
static void
release_buffer (GstUnixFdSrc * self, guint64 id)
{
/* Notify that we are not using this buffer anymore */
ReleaseBufferPayload payload = { id };
GError *error = NULL;
if (!gst_unix_fd_send_command (self->socket, COMMAND_TYPE_RELEASE_BUFFER,
NULL, (guint8 *) & payload, sizeof (payload), &error)) {
GST_WARNING_OBJECT (self, "Failed to send release-buffer command: %s",
error->message);
g_clear_error (&error);
}
}
static void
memory_weak_ref_cb (GstUnixFdSrc * self, GstMemory * mem)
{
GST_OBJECT_LOCK (self);
BufferContext *ctx = g_hash_table_lookup (self->memories, mem);
if (ctx == NULL)
goto out;
if (--ctx->n_memory == 0) {
/* Notify that we are not using this buffer anymore */
ReleaseBufferPayload payload = { ctx->id };
GError *error = NULL;
if (!gst_unix_fd_send_command (self->socket, COMMAND_TYPE_RELEASE_BUFFER,
NULL, (guint8 *) & payload, sizeof (payload), &error)) {
GST_WARNING_OBJECT (self, "Failed to send release-buffer command: %s",
error->message);
g_clear_error (&error);
if (ctx != NULL) {
if (--ctx->n_memory == 0) {
release_buffer (self, ctx->id);
g_free (ctx);
}
g_free (ctx);
g_hash_table_remove (self->memories, mem);
}
g_hash_table_remove (self->memories, mem);
out:
GST_OBJECT_UNLOCK (self);
}
@ -279,7 +282,7 @@ gst_unix_fd_src_unlock_stop (GstBaseSrc * bsrc)
}
static GstClockTime
calculate_timestamp (GstClockTime timestamp, GstClockTime base_time,
from_monotonic (GstClockTime timestamp, GstClockTime base_time,
GstClockTimeDiff clock_diff)
{
if (GST_CLOCK_TIME_IS_VALID (timestamp)) {
@ -295,6 +298,14 @@ calculate_timestamp (GstClockTime timestamp, GstClockTime base_time,
return timestamp;
}
static void
close_and_free_fds (gint * fds, gint fds_len)
{
for (int i = 0; i < fds_len; i++)
g_close (fds[i], NULL);
g_free (fds);
}
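close_and_free_fds() exists because g_unix_fd_list_steal_fds() hands ownership of the descriptors to the caller: once stolen they are no longer closed when the GUnixFDList is freed, so every error path has to close them explicitly. In isolation (list being a received GUnixFDList):

  #include <glib/gstdio.h>        /* g_close() */

  gint fds_len = 0;
  gint *fds = g_unix_fd_list_steal_fds (list, &fds_len);

  /* ... if anything fails before the fds are wrapped into GstMemory ... */
  for (gint i = 0; i < fds_len; i++)
    g_close (fds[i], NULL);
  g_free (fds);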
static GstFlowReturn
gst_unix_fd_src_create (GstPushSrc * psrc, GstBuffer ** outbuf)
{
@ -335,33 +346,26 @@ again:
goto on_error;
}
if (fds == NULL) {
GST_ERROR_OBJECT (self,
"Received new buffer command without file descriptors");
return GST_FLOW_ERROR;
}
if (g_unix_fd_list_get_length (fds) != new_buffer->n_memory) {
gint fds_arr_len = 0;
gint *fds_arr =
(fds != NULL) ? g_unix_fd_list_steal_fds (fds, &fds_arr_len) : NULL;
if (fds_arr_len != new_buffer->n_memory) {
GST_ERROR_OBJECT (self,
"Received new buffer command with %d file descriptors instead of "
"%d", g_unix_fd_list_get_length (fds), new_buffer->n_memory);
"%d", fds_arr_len, new_buffer->n_memory);
ret = GST_FLOW_ERROR;
close_and_free_fds (fds_arr, fds_arr_len);
goto on_error;
}
if (new_buffer->type >= MEMORY_TYPE_LAST) {
GST_ERROR_OBJECT (self, "Unknown buffer type %d", new_buffer->type);
ret = GST_FLOW_ERROR;
close_and_free_fds (fds_arr, fds_arr_len);
goto on_error;
}
GstAllocator *allocator = self->allocators[new_buffer->type];
gint *fds_arr = g_unix_fd_list_steal_fds (fds, NULL);
BufferContext *ctx = g_new0 (BufferContext, 1);
ctx->id = new_buffer->id;
ctx->n_memory = new_buffer->n_memory;
*outbuf = gst_buffer_new ();
GstClockTime base_time =
@ -373,9 +377,9 @@ again:
}
GST_BUFFER_PTS (*outbuf) =
calculate_timestamp (new_buffer->pts, base_time, clock_diff);
from_monotonic (new_buffer->pts, base_time, clock_diff);
GST_BUFFER_DTS (*outbuf) =
calculate_timestamp (new_buffer->dts, base_time, clock_diff);
from_monotonic (new_buffer->dts, base_time, clock_diff);
GST_BUFFER_DURATION (*outbuf) = new_buffer->duration;
GST_BUFFER_OFFSET (*outbuf) = new_buffer->offset;
GST_BUFFER_OFFSET_END (*outbuf) = new_buffer->offset_end;
@ -388,24 +392,35 @@ again:
if (consumed == 0) {
GST_ERROR_OBJECT (self, "Malformed meta serialization");
ret = GST_FLOW_ERROR;
close_and_free_fds (fds_arr, fds_arr_len);
gst_clear_buffer (outbuf);
goto on_error;
}
payload_off += consumed;
}
GST_OBJECT_LOCK (self);
for (int i = 0; i < new_buffer->n_memory; i++) {
GstMemory *mem = gst_fd_allocator_alloc (allocator, fds_arr[i],
new_buffer->memories[i].size, GST_FD_MEMORY_FLAG_NONE);
gst_memory_resize (mem, new_buffer->memories[i].offset,
new_buffer->memories[i].size);
GST_MINI_OBJECT_FLAG_SET (mem, GST_MEMORY_FLAG_READONLY);
if (new_buffer->n_memory > 0) {
BufferContext *ctx = g_new0 (BufferContext, 1);
ctx->id = new_buffer->id;
ctx->n_memory = new_buffer->n_memory;
for (int i = 0; i < new_buffer->n_memory; i++) {
GstMemory *mem = gst_fd_allocator_alloc (allocator, fds_arr[i],
new_buffer->memories[i].size, GST_FD_MEMORY_FLAG_NONE);
gst_memory_resize (mem, new_buffer->memories[i].offset,
new_buffer->memories[i].size);
GST_MINI_OBJECT_FLAG_SET (mem, GST_MEMORY_FLAG_READONLY);
g_hash_table_insert (self->memories, mem, ctx);
gst_mini_object_weak_ref (GST_MINI_OBJECT_CAST (mem),
(GstMiniObjectNotify) memory_weak_ref_cb, self);
g_hash_table_insert (self->memories, mem, ctx);
gst_mini_object_weak_ref (GST_MINI_OBJECT_CAST (mem),
(GstMiniObjectNotify) memory_weak_ref_cb, self);
gst_buffer_append_memory (*outbuf, mem);
gst_buffer_append_memory (*outbuf, mem);
}
} else {
/* This buffer has no memories, we can release it immediately otherwise
* it gets leaked. */
release_buffer (self, new_buffer->id);
}
GST_OBJECT_UNLOCK (self);
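The key detail in to_monotonic() above is the return value of gst_segment_to_running_time_full(): 0 means the timestamp lies outside the segment and cannot be converted, a positive return gives a running time to add to the base time, and a negative return gives a running-time magnitude to subtract from it. A condensed sketch of just that step, before the latency and monotonic-clock offsets are applied (buffer, segment and base_time as in gst_unix_fd_sink_render()):

  GstClockTime ts = GST_BUFFER_PTS (buffer);
  GstClockTime running;
  gint res;

  res = gst_segment_to_running_time_full (segment, GST_FORMAT_TIME, ts,
      &running);
  if (res == 0) {
    ts = GST_CLOCK_TIME_NONE;           /* outside the segment */
  } else if (res > 0) {
    ts = running + base_time;           /* pipeline clock time */
  } else {
    /* negative running time: clamp at zero pipeline clock time */
    ts = base_time > running ? base_time - running : 0;
  }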

View file

@ -1551,6 +1551,7 @@ gst_d3d11_decoder_output_picture (GstD3D11Decoder * decoder,
{
GstFlowReturn ret = GST_FLOW_OK;
GstBuffer *view_buffer;
bool attach_crop_meta = false;
if (picture->discont_state) {
g_clear_pointer (&decoder->input_state, gst_video_codec_state_unref);
@ -1593,21 +1594,8 @@ gst_d3d11_decoder_output_picture (GstD3D11Decoder * decoder,
mem = gst_buffer_peek_memory (view_buffer, 0);
GST_MINI_OBJECT_FLAG_SET (mem, GST_D3D11_MEMORY_TRANSFER_NEED_DOWNLOAD);
if (decoder->need_crop) {
GstVideoCropMeta *crop_meta;
view_buffer = gst_buffer_make_writable (view_buffer);
crop_meta = gst_buffer_get_video_crop_meta (view_buffer);
if (!crop_meta)
crop_meta = gst_buffer_add_video_crop_meta (view_buffer);
crop_meta->x = decoder->offset_x;
crop_meta->y = decoder->offset_y;
crop_meta->width = decoder->info.width;
crop_meta->height = decoder->info.height;
GST_TRACE_OBJECT (decoder, "Attatching crop meta");
}
if (decoder->need_crop)
attach_crop_meta = true;
frame->output_buffer = gst_buffer_ref (view_buffer);
} else {
@ -1628,6 +1616,18 @@ gst_d3d11_decoder_output_picture (GstD3D11Decoder * decoder,
GST_BUFFER_FLAG_SET (frame->output_buffer, buffer_flags);
gst_codec_picture_unref (picture);
if (attach_crop_meta) {
frame->output_buffer = gst_buffer_make_writable (frame->output_buffer);
auto crop_meta = gst_buffer_add_video_crop_meta (frame->output_buffer);
crop_meta->x = decoder->offset_x;
crop_meta->y = decoder->offset_y;
crop_meta->width = decoder->info.width;
crop_meta->height = decoder->info.height;
GST_TRACE_OBJECT (decoder, "Attatching crop meta");
}
return gst_video_decoder_finish_frame (videodec, frame);
error:

View file

@ -138,7 +138,6 @@ public:
subresources_[id] = 0;
heaps_[id] = nullptr;
cond_.notify_one ();
}
@ -150,24 +149,14 @@ public:
frames.ppHeaps = heaps_.data ();
}
guint8 GetSize ()
void Lock ()
{
return size_;
lock_.lock ();
}
ID3D12Resource ** GetTextures ()
void Unlock ()
{
return &textures_[0];
}
UINT * GetSubresources ()
{
return &subresources_[0];
}
ID3D12VideoDecoderHeap ** GetHeaps ()
{
return &heaps_[0];
lock_.unlock ();
}
private:
@ -915,6 +904,34 @@ gst_d3d12_decoder_new_picture (GstD3D12Decoder * decoder,
return GST_FLOW_OK;
}
GstFlowReturn
gst_d3d12_decoder_new_picture_with_size (GstD3D12Decoder * decoder,
GstVideoDecoder * videodec, GstCodecPicture * picture, guint width,
guint height)
{
g_return_val_if_fail (GST_IS_D3D12_DECODER (decoder), GST_FLOW_ERROR);
g_return_val_if_fail (GST_IS_VIDEO_DECODER (videodec), GST_FLOW_ERROR);
g_return_val_if_fail (picture != nullptr, GST_FLOW_ERROR);
auto priv = decoder->priv;
if (!priv->session) {
GST_ERROR_OBJECT (decoder, "No session configured");
return GST_FLOW_ERROR;
}
if (priv->session->coded_width >= width &&
priv->session->coded_height >= height) {
return gst_d3d12_decoder_new_picture (decoder, videodec, picture);
}
/* FIXME: D3D12_VIDEO_DECODE_CONFIGURATION_FLAG_ALLOW_RESOLUTION_CHANGE_ON_NON_KEY_FRAME
* supported GPU can decode stream with mixed decoder heap */
GST_ERROR_OBJECT (decoder,
"Non-keyframe resolution change with larger size is not supported");
return GST_FLOW_ERROR;
}
static inline GstD3D12DecoderPicture *
get_decoder_picture (GstCodecPicture * picture)
{
@ -1253,6 +1270,7 @@ gst_d3d12_decoder_end_picture (GstD3D12Decoder * decoder,
in_args.CompressedBitstream.Size = args->bitstream_size;
in_args.pHeap = decoder_pic->heap.Get ();
priv->session->dpb->Lock ();
priv->session->dpb->GetReferenceFrames (in_args.ReferenceFrames);
priv->cmd->cl->DecodeFrame (priv->session->decoder.Get (),
@ -1262,6 +1280,8 @@ gst_d3d12_decoder_end_picture (GstD3D12Decoder * decoder,
priv->cmd->cl->ResourceBarrier (post_barriers.size (), &post_barriers[0]);
hr = priv->cmd->cl->Close ();
priv->session->dpb->Unlock ();
if (!gst_d3d12_result (hr, decoder->device)) {
GST_ERROR_OBJECT (decoder, "Couldn't record decoding command");
gst_d3d12_command_allocator_unref (gst_ca);
@ -1284,6 +1304,11 @@ gst_d3d12_decoder_end_picture (GstD3D12Decoder * decoder,
gst_d3d12_fence_data_pool_acquire (priv->fence_data_pool, &fence_data);
gst_d3d12_fence_data_add_notify_mini_object (fence_data,
gst_mini_object_ref (decoder_pic));
for (guint i = 0; i < ref_pics->len; i++) {
auto ref_pic = (GstCodecPicture *) g_ptr_array_index (ref_pics, i);
gst_d3d12_fence_data_add_notify_mini_object (fence_data,
gst_codec_picture_ref (ref_pic));
}
gst_d3d12_fence_data_add_notify_mini_object (fence_data, gst_ca);
gst_d3d12_command_queue_set_notify (priv->cmd->queue, priv->cmd->fence_val,
@ -1360,6 +1385,7 @@ gst_d3d12_decoder_process_output (GstD3D12Decoder * self,
ID3D12Resource *resource;
UINT subresource[2];
HRESULT hr;
bool attach_crop_meta = false;
auto priv = self->priv;
@ -1405,21 +1431,8 @@ gst_d3d12_decoder_process_output (GstD3D12Decoder * self,
GST_MINI_OBJECT_FLAG_SET (dmem, GST_D3D12_MEMORY_TRANSFER_NEED_DOWNLOAD);
GST_MINI_OBJECT_FLAG_UNSET (dmem, GST_D3D12_MEMORY_TRANSFER_NEED_UPLOAD);
if (priv->session->need_crop) {
GstVideoCropMeta *crop_meta;
buffer = gst_buffer_make_writable (buffer);
crop_meta = gst_buffer_get_video_crop_meta (buffer);
if (!crop_meta)
crop_meta = gst_buffer_add_video_crop_meta (buffer);
crop_meta->x = priv->session->crop_x;
crop_meta->y = priv->session->crop_y;
crop_meta->width = priv->session->info.width;
crop_meta->height = priv->session->info.height;
GST_TRACE_OBJECT (self, "Attatching crop meta");
}
if (priv->session->need_crop)
attach_crop_meta = true;
frame->output_buffer = gst_buffer_ref (buffer);
} else {
@ -1550,6 +1563,18 @@ gst_d3d12_decoder_process_output (GstD3D12Decoder * self,
GST_BUFFER_FLAG_SET (frame->output_buffer, buffer_flags);
gst_codec_picture_unref (picture);
if (attach_crop_meta) {
frame->output_buffer = gst_buffer_make_writable (frame->output_buffer);
auto crop_meta = gst_buffer_add_video_crop_meta (frame->output_buffer);
crop_meta->x = priv->session->crop_x;
crop_meta->y = priv->session->crop_y;
crop_meta->width = priv->session->info.width;
crop_meta->height = priv->session->info.height;
GST_TRACE_OBJECT (self, "Attatching crop meta");
}
return gst_video_decoder_finish_frame (videodec, frame);
error:

View file

@ -130,6 +130,12 @@ GstFlowReturn gst_d3d12_decoder_new_picture (GstD3D12Decoder * decoder,
GstVideoDecoder * videodec,
GstCodecPicture * picture);
GstFlowReturn gst_d3d12_decoder_new_picture_with_size (GstD3D12Decoder * decoder,
GstVideoDecoder * videodec,
GstCodecPicture * picture,
guint width,
guint height);
GstFlowReturn gst_d3d12_decoder_duplicate_picture (GstD3D12Decoder * decoder,
GstCodecPicture * src,
GstCodecPicture * dst);

File diff suppressed because it is too large


@ -777,11 +777,13 @@ gst_d3d12_screen_capture_src_decide_allocation (GstBaseSrc * bsrc,
if (!params) {
params = gst_d3d12_allocation_params_new (self->device, &vinfo,
GST_D3D12_ALLOCATION_FLAG_DEFAULT, resource_flags,
D3D12_HEAP_FLAG_NONE);
D3D12_HEAP_FLAG_SHARED);
} else {
gst_d3d12_allocation_params_set_resource_flags (params, resource_flags);
gst_d3d12_allocation_params_unset_resource_flags (params,
D3D12_RESOURCE_FLAG_DENY_SHADER_RESOURCE);
gst_d3d12_allocation_params_set_heap_flags (params,
D3D12_HEAP_FLAG_SHARED);
}
gst_buffer_pool_config_set_d3d12_allocation_params (config, params);
@ -815,7 +817,7 @@ gst_d3d12_screen_capture_src_decide_allocation (GstBaseSrc * bsrc,
auto params = gst_d3d12_allocation_params_new (self->device, &vinfo,
GST_D3D12_ALLOCATION_FLAG_DEFAULT, resource_flags,
D3D12_HEAP_FLAG_NONE);
D3D12_HEAP_FLAG_SHARED);
gst_buffer_pool_config_set_d3d12_allocation_params (config, params);
gst_d3d12_allocation_params_free (params);


@ -266,9 +266,11 @@ gst_d3d12_vp9_dec_new_picture (GstDxvaVp9Decoder * decoder,
GstCodecPicture * picture)
{
auto self = GST_D3D12_VP9_DEC (decoder);
auto vp9pic = GST_VP9_PICTURE (picture);
return gst_d3d12_decoder_new_picture (self->decoder,
GST_VIDEO_DECODER (decoder), picture);
return gst_d3d12_decoder_new_picture_with_size (self->decoder,
GST_VIDEO_DECODER (decoder), picture, vp9pic->frame_hdr.width,
vp9pic->frame_hdr.height);
}
static GstFlowReturn


@ -109,6 +109,7 @@ gst_mf_source_object_init (GstMFSourceObject * self)
{
self->device_index = DEFAULT_DEVICE_INDEX;
self->source_type = DEFAULT_SOURCE_TYPE;
self->source_state = GST_MF_DEVICE_NOT_FOUND;
g_weak_ref_init (&self->client, nullptr);
}


@ -25,6 +25,7 @@
#include <gst/gst.h>
#include <gst/check/gstcheck.h>
#include <gst/app/app.h>
#include <glib/gstdio.h>
static void
@ -134,6 +135,92 @@ GST_START_TEST (test_unixfd_videotestsrc)
GST_END_TEST;
GST_START_TEST (test_unixfd_segment)
{
GError *error = NULL;
/* Ensure we don't have a socket left over from a previous failed test */
gchar *socket_path =
g_strdup_printf ("%s/unixfd-test-socket", g_get_user_runtime_dir ());
if (g_file_test (socket_path, G_FILE_TEST_EXISTS)) {
g_unlink (socket_path);
}
GstCaps *caps = gst_caps_new_empty_simple ("video/x-raw");
/* Setup service */
gchar *pipeline_str =
g_strdup_printf
("appsrc name=src format=time handle-segment-change=true ! unixfdsink socket-path=%s sync=false async=false",
socket_path);
GstElement *pipeline_service = gst_parse_launch (pipeline_str, &error);
g_assert_no_error (error);
fail_unless (gst_element_set_state (pipeline_service,
GST_STATE_PLAYING) == GST_STATE_CHANGE_SUCCESS);
GstElement *appsrc = gst_bin_get_by_name (GST_BIN (pipeline_service), "src");
gst_object_unref (appsrc);
g_free (pipeline_str);
/* Setup client */
pipeline_str =
g_strdup_printf
("unixfdsrc socket-path=%s ! appsink name=sink sync=false async=false",
socket_path);
GstElement *pipeline_client = gst_parse_launch (pipeline_str, &error);
g_assert_no_error (error);
fail_unless (gst_element_set_state (pipeline_client,
GST_STATE_PLAYING) == GST_STATE_CHANGE_SUCCESS);
GstElement *appsink = gst_bin_get_by_name (GST_BIN (pipeline_client), "sink");
gst_object_unref (appsink);
g_free (pipeline_str);
/* Send a buffer with PTS=30s */
GstSegment segment;
gst_segment_init (&segment, GST_FORMAT_TIME);
GstBuffer *buf = gst_buffer_new ();
GST_BUFFER_PTS (buf) = 30 * GST_SECOND;
GstSample *sample = gst_sample_new (buf, caps, &segment, NULL);
gst_app_src_push_sample (GST_APP_SRC (appsrc), sample);
gst_sample_unref (sample);
gst_buffer_unref (buf);
/* Wait for it */
sample = gst_app_sink_pull_sample (GST_APP_SINK (appsink));
buf = gst_sample_get_buffer (sample);
GstClockTime first_pts = GST_BUFFER_PTS (buf);
gst_sample_unref (sample);
/* Send a buffer with PTS=1s but with 30s offset in the segment */
segment.base = 30 * GST_SECOND;
buf = gst_buffer_new ();
GST_BUFFER_PTS (buf) = 1 * GST_SECOND;
sample = gst_sample_new (buf, caps, &segment, NULL);
gst_app_src_push_sample (GST_APP_SRC (appsrc), sample);
gst_sample_unref (sample);
gst_buffer_unref (buf);
/* Wait for it */
sample = gst_app_sink_pull_sample (GST_APP_SINK (appsink));
buf = gst_sample_get_buffer (sample);
GstClockTime second_pts = GST_BUFFER_PTS (buf);
gst_sample_unref (sample);
/* They should be 1s apart */
fail_unless_equals_uint64 (second_pts - first_pts, GST_SECOND);
/* Teardown */
fail_unless (gst_element_set_state (pipeline_client,
GST_STATE_NULL) == GST_STATE_CHANGE_SUCCESS);
fail_unless (gst_element_set_state (pipeline_service,
GST_STATE_NULL) == GST_STATE_CHANGE_SUCCESS);
gst_object_unref (pipeline_service);
gst_object_unref (pipeline_client);
g_free (socket_path);
gst_caps_unref (caps);
}
GST_END_TEST;
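The 1 s difference asserted above follows from running-time arithmetic: the first buffer (PTS = 30 s, default segment, base = 0) maps to a running time of 30 s, while the second (PTS = 1 s, base = 30 s) maps to 31 s. A minimal sketch of that computation with GstSegment, reusing the values from the test (illustration only, not part of the patch):

#include <gst/gst.h>

/* Sketch: reproduces the running-time arithmetic behind test_unixfd_segment. */
static void
segment_running_time_sketch (void)
{
  GstSegment segment;
  gst_segment_init (&segment, GST_FORMAT_TIME);

  /* First buffer: PTS = 30s in a default segment (base = 0) -> 30s */
  GstClockTime first =
      gst_segment_to_running_time (&segment, GST_FORMAT_TIME, 30 * GST_SECOND);

  /* Second buffer: PTS = 1s, but the segment carries a 30s base -> 31s */
  segment.base = 30 * GST_SECOND;
  GstClockTime second =
      gst_segment_to_running_time (&segment, GST_FORMAT_TIME, 1 * GST_SECOND);

  /* 31s - 30s == 1s, matching the assertion in the test above */
  g_assert_cmpuint (second - first, ==, GST_SECOND);
}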
static Suite *
unixfd_suite (void)
{
@ -142,6 +229,7 @@ unixfd_suite (void)
suite_add_tcase (s, tc);
tcase_add_test (tc, test_unixfd_videotestsrc);
tcase_add_test (tc, test_unixfd_segment);
return s;
}


@ -0,0 +1,185 @@
/* GStreamer
* Copyright (C) 2024 Seungha Yang <seungha@centricular.com>
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 51 Franklin St, Fifth Floor,
* Boston, MA 02110-1301, USA.
*/
#ifdef HAVE_CONFIG_H
#include "config.h"
#endif
#include <gst/gst.h>
#include <gst/check/gstcheck.h>
#include <gst/d3d12/gstd3d12.h>
#include <wrl.h>
#include <mutex>
#include <condition_variable>
/* *INDENT-OFF* */
using namespace Microsoft::WRL;
/* *INDENT-ON* */
GST_START_TEST (test_device_equal)
{
auto device = gst_d3d12_device_new (0);
fail_unless (GST_IS_D3D12_DEVICE (device));
auto other_device = gst_d3d12_device_new (0);
fail_unless (GST_IS_D3D12_DEVICE (other_device));
fail_unless (gst_d3d12_device_is_equal (device, other_device));
auto handle = gst_d3d12_device_get_device_handle (device);
auto other_handle = gst_d3d12_device_get_device_handle (other_device);
fail_unless_equals_pointer (handle, other_handle);
gst_object_unref (device);
gst_object_unref (other_device);
}
GST_END_TEST;
struct DeviceRemovedData
{
std::mutex lock;
std::condition_variable cond;
guint removed_count = 0;
};
static void
on_device_removed (GstD3D12Device * device, GParamSpec * pspec,
DeviceRemovedData * data)
{
HRESULT hr = S_OK;
g_object_get (device, "device-removed-reason", &hr, nullptr);
fail_unless (FAILED (hr));
std::lock_guard <std::mutex> lk (data->lock);
data->removed_count++;
data->cond.notify_all ();
}
GST_START_TEST (test_device_removed)
{
auto device = gst_d3d12_device_new (0);
fail_unless (GST_IS_D3D12_DEVICE (device));
ComPtr<ID3D12Device5> device5;
auto handle = gst_d3d12_device_get_device_handle (device);
fail_unless (handle != nullptr);
handle->QueryInterface (IID_PPV_ARGS (&device5));
if (!device5) {
gst_object_unref (device);
return;
}
auto other_device = gst_d3d12_device_new (0);
DeviceRemovedData data;
g_signal_connect (device, "notify::device-removed-reason",
G_CALLBACK (on_device_removed), &data);
g_signal_connect (other_device, "notify::device-removed-reason",
G_CALLBACK (on_device_removed), &data);
/* Emulate device removed case */
device5->RemoveDevice ();
device5 = nullptr;
/* Callback will be called from another thread */
{
std::unique_lock <std::mutex> lk (data.lock);
while (data.removed_count != 2)
data.cond.wait (lk);
}
/* This will fail since we are still holding the removed device */
auto null_device = gst_d3d12_device_new (0);
fail_if (null_device);
gst_object_unref (device);
gst_object_unref (other_device);
/* After releasing all devices, creating a device should succeed again */
device = gst_d3d12_device_new (0);
fail_unless (GST_IS_D3D12_DEVICE (device));
gst_object_unref (device);
}
GST_END_TEST;
static gboolean
check_d3d12_available (void)
{
auto device = gst_d3d12_device_new (0);
if (!device)
return FALSE;
gst_object_unref (device);
return TRUE;
}
/* ID3D12Device5::RemoveDevice requires Windows 10 build 20348 or newer */
static gboolean
check_remove_device_supported (void)
{
OSVERSIONINFOEXW osverinfo = { };
typedef NTSTATUS (WINAPI fRtlGetVersion) (PRTL_OSVERSIONINFOEXW);
gboolean ret = FALSE;
memset (&osverinfo, 0, sizeof (OSVERSIONINFOEXW));
osverinfo.dwOSVersionInfoSize = sizeof (OSVERSIONINFOEXW);
auto hmodule = LoadLibraryW (L"ntdll.dll");
if (!hmodule)
return FALSE;
auto RtlGetVersion = (fRtlGetVersion *) GetProcAddress (hmodule, "RtlGetVersion");
if (RtlGetVersion) {
RtlGetVersion (&osverinfo);
if (osverinfo.dwMajorVersion > 10 ||
(osverinfo.dwMajorVersion == 10 && osverinfo.dwBuildNumber >= 20348)) {
ret = TRUE;
}
}
FreeLibrary (hmodule);
return ret;
}
static Suite *
d3d12device_suite (void)
{
Suite *s = suite_create ("d3d12device");
TCase *tc_basic = tcase_create ("general");
suite_add_tcase (s, tc_basic);
if (!check_d3d12_available ())
return s;
tcase_add_test (tc_basic, test_device_equal);
if (check_remove_device_supported ())
tcase_add_test (tc_basic, test_device_removed);
return s;
}
GST_CHECK_MAIN (d3d12device);


@ -0,0 +1,230 @@
/* GStreamer
*
* Copyright (C) 2024 Igalia, S.L.
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Library General Public
* License as published by the Free Software Foundation; either
* version 2 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Library General Public License for more details.
*
* You should have received a copy of the GNU Library General Public
* License along with this library; if not, write to the
* Free Software Foundation, Inc., 59 Temple Place - Suite 330,
* Boston, MA 02111-1307, USA.
*/
/* H264 DECODER: 1 frame 320x240 blue box */
static StdVideoH264HrdParameters h264_std_hrd = {
.cpb_cnt_minus1 = 0,
.bit_rate_scale = 4,
.cpb_size_scale = 0,
.reserved1 = 0,
.bit_rate_value_minus1 = {2928,},
.cpb_size_value_minus1 = {74999,},
.cbr_flag = {0,},
.initial_cpb_removal_delay_length_minus1 = 23,
.cpb_removal_delay_length_minus1 = 23,
.dpb_output_delay_length_minus1 = 23,
.time_offset_length = 24,
};
static StdVideoH264SequenceParameterSetVui h264_std_vui = {
.flags = {
.aspect_ratio_info_present_flag = 1,
.overscan_info_present_flag = 0,
.overscan_appropriate_flag = 0,
.video_signal_type_present_flag = 0,
.video_full_range_flag = 1,
.color_description_present_flag = 0,
.chroma_loc_info_present_flag = 1,
.timing_info_present_flag = 1,
.fixed_frame_rate_flag = 1,
.bitstream_restriction_flag = 1,
.nal_hrd_parameters_present_flag = 1,
.vcl_hrd_parameters_present_flag = 0,
},
.aspect_ratio_idc = STD_VIDEO_H264_ASPECT_RATIO_IDC_SQUARE,
.sar_width = 1,
.sar_height = 1,
.video_format = 0,
.colour_primaries = 0,
.transfer_characteristics = 0,
.matrix_coefficients = 2,
.num_units_in_tick = 1,
.time_scale = 60,
.max_num_reorder_frames = 0,
.max_dec_frame_buffering = 2,
.chroma_sample_loc_type_top_field = 0,
.chroma_sample_loc_type_bottom_field = 0,
.pHrdParameters = &h264_std_hrd,
};
static StdVideoH264SequenceParameterSet h264_std_sps = {
.flags = {
.constraint_set0_flag = 0,
.constraint_set1_flag = 0,
.constraint_set2_flag = 0,
.constraint_set3_flag = 0,
.constraint_set4_flag = 0,
.constraint_set5_flag = 0,
.direct_8x8_inference_flag = 1,
.mb_adaptive_frame_field_flag = 0,
.frame_mbs_only_flag = 1,
.delta_pic_order_always_zero_flag = 0,
.separate_colour_plane_flag = 0,
.gaps_in_frame_num_value_allowed_flag = 0,
.qpprime_y_zero_transform_bypass_flag = 0,
.frame_cropping_flag = 0,
.seq_scaling_matrix_present_flag = 0,
.vui_parameters_present_flag = 1,
},
.profile_idc = STD_VIDEO_H264_PROFILE_IDC_MAIN,
.level_idc = STD_VIDEO_H264_LEVEL_IDC_2_1,
.chroma_format_idc = STD_VIDEO_H264_CHROMA_FORMAT_IDC_420,
.seq_parameter_set_id = 0,
.bit_depth_luma_minus8 = 0,
.bit_depth_chroma_minus8 = 0,
.log2_max_frame_num_minus4 = 4,
.pic_order_cnt_type = STD_VIDEO_H264_POC_TYPE_2,
.offset_for_non_ref_pic = 0,
.offset_for_top_to_bottom_field = 0,
.log2_max_pic_order_cnt_lsb_minus4 = 0,
.num_ref_frames_in_pic_order_cnt_cycle = 0,
.max_num_ref_frames = 2,
.pic_width_in_mbs_minus1 = 19,
.pic_height_in_map_units_minus1 = 14,
.frame_crop_left_offset = 0,
.frame_crop_right_offset = 0,
.frame_crop_top_offset = 0,
.frame_crop_bottom_offset = 0,
.pOffsetForRefFrame = NULL,
.pScalingLists = NULL,
.pSequenceParameterSetVui = &h264_std_vui,
};
static StdVideoH264PictureParameterSet h264_std_pps = {
.flags = {
.transform_8x8_mode_flag = 0,
.redundant_pic_cnt_present_flag = 0,
.constrained_intra_pred_flag = 0,
.deblocking_filter_control_present_flag = 1,
.weighted_pred_flag = 0,
.bottom_field_pic_order_in_frame_present_flag = 0,
.entropy_coding_mode_flag = 1,
.pic_scaling_matrix_present_flag = 0,
},
.seq_parameter_set_id = 0,
.pic_parameter_set_id = 0,
.num_ref_idx_l0_default_active_minus1 = 0,
.num_ref_idx_l1_default_active_minus1 = 0,
.weighted_bipred_idc = STD_VIDEO_H264_WEIGHTED_BIPRED_IDC_DEFAULT,
.pic_init_qp_minus26 = 0,
.pic_init_qs_minus26 = 0,
.chroma_qp_index_offset = 0,
.second_chroma_qp_index_offset = 0,
.pScalingLists = NULL,
};
static const uint8_t h264_slice[] = {
0x25, 0x88, 0x80, 0x4f, 0xb8, 0x15, 0x59, 0xd0, 0x00, 0x3d, 0xe7, 0xfe, 0x6e,
0x22, 0xeb, 0xb9, 0x72, 0x7b, 0x52, 0x8d, 0xd8, 0xf7, 0x14, 0x97, 0xc7, 0xa3,
0x62, 0xb7, 0x6a, 0x61, 0x8e, 0xd2, 0xec, 0x64, 0xbc, 0xa4, 0x00, 0x00, 0x05,
0x93, 0xa2, 0x39, 0xa9, 0x99, 0x1e, 0xc5, 0x01, 0x4a, 0x00, 0x0c, 0x03, 0x0d,
0x75, 0x45, 0x2a, 0xe3, 0x3d, 0x7f, 0x10, 0x03, 0x82
};
/* H265 DECODER: 1 frame 320x240 blue box */
static StdVideoH265HrdParameters h265_std_hrd = { 0, };
static StdVideoH265ProfileTierLevel h265_std_ptl = {
.flags = {
.general_progressive_source_flag = 1,
.general_frame_only_constraint_flag = 1,
},
.general_profile_idc = STD_VIDEO_H265_PROFILE_IDC_MAIN,
.general_level_idc = STD_VIDEO_H265_LEVEL_IDC_6_0,
};
static StdVideoH265DecPicBufMgr h265_std_pbm = {
.max_latency_increase_plus1 = {5, 0,},
.max_dec_pic_buffering_minus1 = {4, 0,},
.max_num_reorder_pics = {2, 0,},
};
static StdVideoH265VideoParameterSet h265_std_vps = {
.flags = {
.vps_temporal_id_nesting_flag = 1,
.vps_sub_layer_ordering_info_present_flag = 1,
},
.vps_video_parameter_set_id = 0,
.pDecPicBufMgr = &h265_std_pbm,
.pHrdParameters = &h265_std_hrd,
.pProfileTierLevel = &h265_std_ptl,
};
static StdVideoH265SequenceParameterSetVui h265_std_sps_vui = {
.flags = {
.video_signal_type_present_flag = 1,
.vui_timing_info_present_flag = 1,
},
.aspect_ratio_idc = STD_VIDEO_H265_ASPECT_RATIO_IDC_UNSPECIFIED,
.video_format = 5,
.colour_primaries = 2,
.transfer_characteristics = 2,
.matrix_coeffs = 2,
.vui_num_units_in_tick = 1,
.vui_time_scale = 25,
.pHrdParameters = &h265_std_hrd,
};
static StdVideoH265SequenceParameterSet h265_std_sps = {
.flags = {
.sps_temporal_id_nesting_flag = 1,
.sps_sub_layer_ordering_info_present_flag = 1,
.sample_adaptive_offset_enabled_flag = 1,
.sps_temporal_mvp_enabled_flag = 1,
.strong_intra_smoothing_enabled_flag = 1,
.vui_parameters_present_flag = 1,
.sps_extension_present_flag = 1,
},
.chroma_format_idc = STD_VIDEO_H265_CHROMA_FORMAT_IDC_420,
.pic_width_in_luma_samples = 320,
.pic_height_in_luma_samples = 240,
.sps_video_parameter_set_id = 0,
.sps_seq_parameter_set_id = 0,
.log2_max_pic_order_cnt_lsb_minus4 = 4,
.log2_diff_max_min_luma_coding_block_size = 3,
.log2_diff_max_min_luma_transform_block_size = 3,
.pProfileTierLevel = &h265_std_ptl,
.pDecPicBufMgr = &h265_std_pbm,
.pSequenceParameterSetVui = &h265_std_sps_vui,
};
static StdVideoH265PictureParameterSet h265_std_pps = {
.flags = {
.sign_data_hiding_enabled_flag = 1,
.cu_qp_delta_enabled_flag = 1,
.weighted_pred_flag = 1,
.entropy_coding_sync_enabled_flag = 1,
.uniform_spacing_flag = 1,
.loop_filter_across_tiles_enabled_flag = 1,
.pps_loop_filter_across_slices_enabled_flag = 1,
},
.pps_pic_parameter_set_id = 0,
.pps_seq_parameter_set_id = 0,
.sps_video_parameter_set_id = 0,
.diff_cu_qp_delta_depth = 1,
};
static const uint8_t h265_slice[] = {
0x28, 0x01, 0xaf, 0x1d, 0x21, 0x6a, 0x83, 0x40, 0xf7, 0xcf, 0x80, 0xff, 0xf8,
0x90, 0xfa, 0x3b, 0x77, 0x87, 0x96, 0x96, 0xba, 0xfa, 0xcd, 0x61, 0xb5, 0xe3,
0xc1, 0x02, 0x2d, 0xe0, 0xa8, 0x17, 0x96, 0x03, 0x4c, 0x4e, 0x1a, 0x9e, 0xd0,
0x93, 0x0b, 0x93, 0x40, 0x00, 0x05, 0xec, 0x87, 0x00, 0x00, 0x03, 0x00, 0x00,
0x03, 0x00, 0x56, 0x40
};


@ -26,6 +26,8 @@
#include <gst/check/gstcheck.h>
#include <gst/vulkan/vulkan.h>
#include "gst/vulkan/gstvkdecoder-private.h"
static GstVulkanInstance *instance;
static GstVulkanDevice *device;
static GstVulkanQueue *video_queue = NULL;
@ -273,120 +275,9 @@ download_and_check_output_buffer (GstVulkanDecoder * dec, VkFormat vk_format,
gst_object_unref (out_pool);
}
#include "vkcodecparams.c"
/* DECODER: 1 frame 320x240 blue box */
StdVideoH264HrdParameters h264_std_hrd = {
.cpb_cnt_minus1 = 0,
.bit_rate_scale = 4,
.cpb_size_scale = 0,
.reserved1 = 0,
.bit_rate_value_minus1 = {2928,},
.cpb_size_value_minus1 = {74999,},
.cbr_flag = {0,},
.initial_cpb_removal_delay_length_minus1 = 23,
.cpb_removal_delay_length_minus1 = 23,
.dpb_output_delay_length_minus1 = 23,
.time_offset_length = 24,
};
StdVideoH264SequenceParameterSetVui h264_std_vui = {
.flags = {
.aspect_ratio_info_present_flag = 1,
.overscan_info_present_flag = 0,
.overscan_appropriate_flag = 0,
.video_signal_type_present_flag = 0,
.video_full_range_flag = 1,
.color_description_present_flag = 0,
.chroma_loc_info_present_flag = 1,
.timing_info_present_flag = 1,
.fixed_frame_rate_flag = 1,
.bitstream_restriction_flag = 1,
.nal_hrd_parameters_present_flag = 1,
.vcl_hrd_parameters_present_flag = 0,
},
.aspect_ratio_idc = STD_VIDEO_H264_ASPECT_RATIO_IDC_SQUARE,
.sar_width = 1,
.sar_height = 1,
.video_format = 0,
.colour_primaries = 0,
.transfer_characteristics = 0,
.matrix_coefficients = 2,
.num_units_in_tick = 1,
.time_scale = 60,
.max_num_reorder_frames = 0,
.max_dec_frame_buffering = 2,
.chroma_sample_loc_type_top_field = 0,
.chroma_sample_loc_type_bottom_field = 0,
.pHrdParameters = &h264_std_hrd,
};
StdVideoH264SequenceParameterSet h264_std_sps = {
.flags = {
.constraint_set0_flag = 0,
.constraint_set1_flag = 0,
.constraint_set2_flag = 0,
.constraint_set3_flag = 0,
.constraint_set4_flag = 0,
.constraint_set5_flag = 0,
.direct_8x8_inference_flag = 1,
.mb_adaptive_frame_field_flag = 0,
.frame_mbs_only_flag = 1,
.delta_pic_order_always_zero_flag = 0,
.separate_colour_plane_flag = 0,
.gaps_in_frame_num_value_allowed_flag = 0,
.qpprime_y_zero_transform_bypass_flag = 0,
.frame_cropping_flag = 0,
.seq_scaling_matrix_present_flag = 0,
.vui_parameters_present_flag = 1,
},
.profile_idc = STD_VIDEO_H264_PROFILE_IDC_MAIN,
.level_idc = STD_VIDEO_H264_LEVEL_IDC_2_1,
.chroma_format_idc = STD_VIDEO_H264_CHROMA_FORMAT_IDC_420,
.seq_parameter_set_id = 0,
.bit_depth_luma_minus8 = 0,
.bit_depth_chroma_minus8 = 0,
.log2_max_frame_num_minus4 = 4,
.pic_order_cnt_type = STD_VIDEO_H264_POC_TYPE_2,
.offset_for_non_ref_pic = 0,
.offset_for_top_to_bottom_field = 0,
.log2_max_pic_order_cnt_lsb_minus4 = 0,
.num_ref_frames_in_pic_order_cnt_cycle = 0,
.max_num_ref_frames = 2,
.pic_width_in_mbs_minus1 = 19,
.pic_height_in_map_units_minus1 = 14,
.frame_crop_left_offset = 0,
.frame_crop_right_offset = 0,
.frame_crop_top_offset = 0,
.frame_crop_bottom_offset = 0,
.pOffsetForRefFrame = NULL,
.pScalingLists = NULL,
.pSequenceParameterSetVui = &h264_std_vui,
};
StdVideoH264PictureParameterSet h264_std_pps = {
.flags = {
.transform_8x8_mode_flag = 0,
.redundant_pic_cnt_present_flag = 0,
.constrained_intra_pred_flag = 0,
.deblocking_filter_control_present_flag = 1,
.weighted_pred_flag = 0,
.bottom_field_pic_order_in_frame_present_flag = 0,
.entropy_coding_mode_flag = 1,
.pic_scaling_matrix_present_flag = 0,
},
.seq_parameter_set_id = 0,
.pic_parameter_set_id = 0,
.num_ref_idx_l0_default_active_minus1 = 0,
.num_ref_idx_l1_default_active_minus1 = 0,
.weighted_bipred_idc = STD_VIDEO_H264_WEIGHTED_BIPRED_IDC_DEFAULT,
.pic_init_qp_minus26 = 0,
.pic_init_qs_minus26 = 0,
.chroma_qp_index_offset = 0,
.second_chroma_qp_index_offset = 0,
.pScalingLists = NULL,
};
VkVideoDecodeH264SessionParametersAddInfoKHR h264_params = {
static VkVideoDecodeH264SessionParametersAddInfoKHR h264_params = {
.sType = VK_STRUCTURE_TYPE_VIDEO_DECODE_H264_SESSION_PARAMETERS_ADD_INFO_KHR,
.stdSPSCount = 1,
.pStdSPSs = &h264_std_sps,
@ -438,7 +329,7 @@ GST_START_TEST (test_h264_decoder)
return;
}
dec = gst_vulkan_queue_create_decoder (video_queue,
dec = gst_vulkan_decoder_new_from_queue (video_queue,
VK_VIDEO_CODEC_OPERATION_DECODE_H264_BIT_KHR);
if (!dec) {
GST_WARNING ("Unable to create a vulkan decoder");
@ -460,21 +351,8 @@ GST_START_TEST (test_h264_decoder)
get_output_buffer (dec, format_prop.format, &pic);
/* get input buffer */
{
static const guint8 slice[] = {
0x25, 0x88, 0x80, 0x4f, 0xb8, 0x15, 0x59, 0xd0, 0x00, 0x3d, 0xe7, 0xfe,
0x6e, 0x22, 0xeb, 0xb9,
0x72, 0x7b, 0x52, 0x8d, 0xd8, 0xf7, 0x14, 0x97, 0xc7, 0xa3, 0x62, 0xb7,
0x6a, 0x61, 0x8e, 0xd2,
0xec, 0x64, 0xbc, 0xa4, 0x00, 0x00, 0x05, 0x93, 0xa2, 0x39, 0xa9, 0x99,
0x1e, 0xc5, 0x01, 0x4a,
0x00, 0x0c, 0x03, 0x0d, 0x75, 0x45, 0x2a, 0xe3, 0x3d, 0x7f, 0x10, 0x03,
0x82
};
fail_unless (gst_vulkan_decoder_append_slice (dec, &pic, slice,
sizeof (slice), TRUE));
}
fail_unless (gst_vulkan_decoder_append_slice (dec, &pic, h264_slice,
sizeof (h264_slice), TRUE));
/* decode */
{
@ -563,89 +441,7 @@ GST_START_TEST (test_h264_decoder)
GST_END_TEST;
StdVideoH265HrdParameters h265_std_hrd = { 0, };
StdVideoH265ProfileTierLevel h265_std_ptl = {
.flags = {
.general_progressive_source_flag = 1,
.general_frame_only_constraint_flag = 1,
},
.general_profile_idc = STD_VIDEO_H265_PROFILE_IDC_MAIN,
.general_level_idc = STD_VIDEO_H265_LEVEL_IDC_6_0,
};
StdVideoH265DecPicBufMgr h265_std_pbm = {
.max_latency_increase_plus1 = {5, 0,},
.max_dec_pic_buffering_minus1 = {4, 0,},
.max_num_reorder_pics = {2, 0,},
};
StdVideoH265VideoParameterSet h265_std_vps = {
.flags = {
.vps_temporal_id_nesting_flag = 1,
.vps_sub_layer_ordering_info_present_flag = 1,
},
.vps_video_parameter_set_id = 0,
.pDecPicBufMgr = &h265_std_pbm,
.pHrdParameters = &h265_std_hrd,
.pProfileTierLevel = &h265_std_ptl,
};
StdVideoH265SequenceParameterSetVui h265_std_sps_vui = {
.flags = {
.video_signal_type_present_flag = 1,
.vui_timing_info_present_flag = 1,
},
.aspect_ratio_idc = STD_VIDEO_H265_ASPECT_RATIO_IDC_UNSPECIFIED,
.video_format = 5,
.colour_primaries = 2,
.transfer_characteristics = 2,
.matrix_coeffs = 2,
.vui_num_units_in_tick = 1,
.vui_time_scale = 25,
.pHrdParameters = &h265_std_hrd,
};
StdVideoH265SequenceParameterSet h265_std_sps = {
.flags = {
.sps_temporal_id_nesting_flag = 1,
.sps_sub_layer_ordering_info_present_flag = 1,
.sample_adaptive_offset_enabled_flag = 1,
.sps_temporal_mvp_enabled_flag = 1,
.strong_intra_smoothing_enabled_flag = 1,
.vui_parameters_present_flag = 1,
.sps_extension_present_flag = 1,
},
.chroma_format_idc = STD_VIDEO_H265_CHROMA_FORMAT_IDC_420,
.pic_width_in_luma_samples = 320,
.pic_height_in_luma_samples = 240,
.sps_video_parameter_set_id = 0,
.sps_seq_parameter_set_id = 0,
.log2_max_pic_order_cnt_lsb_minus4 = 4,
.log2_diff_max_min_luma_coding_block_size = 3,
.log2_diff_max_min_luma_transform_block_size = 3,
.pProfileTierLevel = &h265_std_ptl,
.pDecPicBufMgr = &h265_std_pbm,
.pSequenceParameterSetVui = &h265_std_sps_vui,
};
StdVideoH265PictureParameterSet h265_std_pps = {
.flags = {
.sign_data_hiding_enabled_flag = 1,
.cu_qp_delta_enabled_flag = 1,
.weighted_pred_flag = 1,
.entropy_coding_sync_enabled_flag = 1,
.uniform_spacing_flag = 1,
.loop_filter_across_tiles_enabled_flag = 1,
.pps_loop_filter_across_slices_enabled_flag = 1,
},
.pps_pic_parameter_set_id = 0,
.pps_seq_parameter_set_id = 0,
.sps_video_parameter_set_id = 0,
.diff_cu_qp_delta_depth = 1,
};
VkVideoDecodeH265SessionParametersAddInfoKHR h265_params = {
static VkVideoDecodeH265SessionParametersAddInfoKHR h265_params = {
.sType = VK_STRUCTURE_TYPE_VIDEO_DECODE_H265_SESSION_PARAMETERS_ADD_INFO_KHR,
.stdVPSCount = 1,
.pStdVPSs = &h265_std_vps,
@ -701,7 +497,7 @@ GST_START_TEST (test_h265_decoder)
return;
}
dec = gst_vulkan_queue_create_decoder (video_queue,
dec = gst_vulkan_decoder_new_from_queue (video_queue,
VK_VIDEO_CODEC_OPERATION_DECODE_H265_BIT_KHR);
if (!dec) {
GST_WARNING ("Unable to create a vulkan decoder");
@ -723,18 +519,8 @@ GST_START_TEST (test_h265_decoder)
get_output_buffer (dec, format_prop.format, &pic);
/* get input buffer */
{
static const guint8 slice[] = {
0x28, 0x01, 0xaf, 0x1d, 0x21, 0x6a, 0x83, 0x40, 0xf7, 0xcf, 0x80, 0xff,
0xf8, 0x90, 0xfa, 0x3b, 0x77, 0x87, 0x96, 0x96, 0xba, 0xfa, 0xcd, 0x61,
0xb5, 0xe3, 0xc1, 0x02, 0x2d, 0xe0, 0xa8, 0x17, 0x96, 0x03, 0x4c, 0x4e,
0x1a, 0x9e, 0xd0, 0x93, 0x0b, 0x93, 0x40, 0x00, 0x05, 0xec, 0x87, 0x00,
0x00, 0x03, 0x00, 0x00, 0x03, 0x00, 0x56, 0x40
};
fail_unless (gst_vulkan_decoder_append_slice (dec, &pic, slice,
sizeof (slice), TRUE));
}
fail_unless (gst_vulkan_decoder_append_slice (dec, &pic, h265_slice,
sizeof (h265_slice), TRUE));
/* decode */
{


@ -112,6 +112,7 @@ base_tests = [
[['libs/d3d11device.cpp'], not gstd3d11_dep.found(), [gstd3d11_dep, gstvideo_dep]],
[['libs/d3d11memory.c'], not gstd3d11_dep.found(), [gstd3d11_dep]],
[['libs/cudamemory.c'], not gstcuda_dep.found(), [gstcuda_dep, gstcuda_stub_dep]],
[['libs/d3d12device.cpp'], not gstd3d12_dep.found(), [gstd3d12_dep]],
]
# Make sure our headers are C++ clean


@ -85,22 +85,12 @@ ensure_uri (const gchar * location)
gchar *
get_file_extension (gchar * uri)
{
size_t len;
gint find;
gchar *dot = g_strrstr (uri, ".");
len = strlen (uri);
find = len - 1;
while (find >= 0) {
if (uri[find] == '.')
break;
find--;
}
if (find < 0)
if (!dot)
return NULL;
return &uri[find + 1];
return dot + 1;
}
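The simplification above relies on g_strrstr() returning the last occurrence of the separator, so names containing several dots still resolve to the final extension. A small usage sketch (the URI is a hypothetical example, not taken from the patch):

/* Sketch: the last '.' wins, and the returned pointer aliases the input. */
gchar uri[] = "/videos/recording.backup.mp4";
gchar *ext = get_file_extension (uri);

if (ext != NULL)
  g_print ("extension: %s\n", ext);   /* prints "mp4" */
else
  g_print ("no extension\n");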
GList *


@ -865,6 +865,30 @@
"type": "guint",
"writable": true
},
"input-channels-reorder": {
"blurb": "The positions configuration to use to reorder the input channels consecutively according to their index.",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "gst (0)",
"mutable": "null",
"readable": true,
"type": "GstAudioConvertInputChannelsReorder",
"writable": true
},
"input-channels-reorder-mode": {
"blurb": "The input channels reordering mode used to apply the selected positions configuration.",
"conditionally-available": false,
"construct": false,
"construct-only": false,
"controllable": false,
"default": "none (0)",
"mutable": "null",
"readable": true,
"type": "GstAudioConvertInputChannelsReorderMode",
"writable": true
},
"mix-matrix": {
"blurb": "Transformation matrix for input/output channels.",
"conditionally-available": false,
@ -894,7 +918,68 @@
},
"filename": "gstaudioconvert",
"license": "LGPL",
"other-types": {},
"other-types": {
"GstAudioConvertInputChannelsReorder": {
"kind": "enum",
"values": [
{
"desc": "Reorder the input channels using the default GStreamer order",
"name": "gst",
"value": "0"
},
{
"desc": "Reorder the input channels using the SMPTE order",
"name": "smpte",
"value": "1"
},
{
"desc": "Reorder the input channels using the CINE order",
"name": "cine",
"value": "2"
},
{
"desc": "Reorder the input channels using the AC3 order",
"name": "ac3",
"value": "3"
},
{
"desc": "Reorder the input channels using the AAC order",
"name": "aac",
"value": "4"
},
{
"desc": "Reorder and mix all input channels to a single mono channel",
"name": "mono",
"value": "5"
},
{
"desc": "Reorder and mix all input channels to a single left and a single right stereo channels alternately",
"name": "alternate",
"value": "6"
}
]
},
"GstAudioConvertInputChannelsReorderMode": {
"kind": "enum",
"values": [
{
"desc": "Never reorder the input channels",
"name": "none",
"value": "0"
},
{
"desc": "Reorder the input channels only if they are unpositioned",
"name": "unpositioned",
"value": "1"
},
{
"desc": "Always reorder the input channels according to the selected configuration",
"name": "force",
"value": "2"
}
]
}
},
"package": "GStreamer Base Plug-ins",
"source": "gst-plugins-base",
"tracers": {},


@ -631,6 +631,75 @@ gst_audio_channel_mixer_fill_special (gfloat ** matrix, gint in_channels,
* Automagically generate conversion matrix.
*/
typedef enum
{
GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_NONE = 0,
GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_MONO,
GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_STEREO
} GstAudioChannelMixerVirtualInput;
/* Detects the specific input channel configurations introduced in the
* audioconvert element (since version 1.26) with the
* `GstAudioConvertInputChannelsReorder` configurations.
*
* If all input channels are positioned to GST_AUDIO_CHANNEL_POSITION_MONO,
* the automatic mixing matrix should be configured as if there were only one
* virtual input mono channel. This virtual mono channel is the mix of all the
* real mono channels.
*
* If all input channels with an even index are positioned to
* GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT and all input channels with an odd
* index are positioned to GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, then the
* automatic mixing matrix should be configured as if there were only one
* virtual input left channel and one virtual input right channel. This virtual
* left or right channel is the mix of all the real left or right channels.
*/
static gboolean
gst_audio_channel_mixer_detect_virtual_input_channels (gint channels,
GstAudioChannelPosition * position,
GstAudioChannelMixerVirtualInput * virtual_input)
{
g_return_val_if_fail (position != NULL, FALSE);
g_return_val_if_fail (virtual_input != NULL, FALSE);
*virtual_input = GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_NONE;
if (channels < 2)
return FALSE;
static const GstAudioChannelPosition alternate_positions[2] =
{ GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT,
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT
};
gboolean is_mono = TRUE;
gboolean is_alternate = TRUE;
for (gint i = 0; i < channels; ++i) {
if (position[i] != GST_AUDIO_CHANNEL_POSITION_MONO)
is_mono = FALSE;
if (position[i] != alternate_positions[i % 2])
is_alternate = FALSE;
if (!is_mono && !is_alternate)
return FALSE;
}
if (is_mono) {
g_assert (!is_alternate);
*virtual_input = GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_MONO;
return TRUE;
}
if (is_alternate && (channels > 2)) {
g_assert (!is_mono);
*virtual_input = GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_STEREO;
return TRUE;
}
return FALSE;
}
static void
gst_audio_channel_mixer_fill_matrix (gfloat ** matrix,
GstAudioChannelMixerFlags flags, gint in_channels,
@ -641,15 +710,71 @@ gst_audio_channel_mixer_fill_matrix (gfloat ** matrix,
out_channels, out_position))
return;
gst_audio_channel_mixer_fill_identical (matrix, in_channels, in_position,
/* If all input channels are positioned to mono, the mix matrix should be
* configured as if there were only one virtual input mono channel. This
* virtual mono channel is the mix of all the real input mono channels.
*
* If all input channels are positioned to left and right alternately, the mix
* matrix should be configured as if there were only two virtual input
* channels: one left and one right. This virtual left or right channel is the
* mix of all the real input left or right channels.
*/
gint in_size = in_channels;
GstAudioChannelMixerVirtualInput virtual_input =
GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_NONE;
if (gst_audio_channel_mixer_detect_virtual_input_channels (in_size,
in_position, &virtual_input)) {
switch (virtual_input) {
case GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_MONO:
in_size = 1;
break;
case GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_STEREO:
in_size = 2;
break;
default:
break;
}
}
gst_audio_channel_mixer_fill_identical (matrix, in_size, in_position,
out_channels, out_position, flags);
if (!(flags & GST_AUDIO_CHANNEL_MIXER_FLAGS_UNPOSITIONED_IN)) {
gst_audio_channel_mixer_fill_compatible (matrix, in_channels, in_position,
gst_audio_channel_mixer_fill_compatible (matrix, in_size, in_position,
out_channels, out_position);
gst_audio_channel_mixer_fill_others (matrix, in_channels, in_position,
gst_audio_channel_mixer_fill_others (matrix, in_size, in_position,
out_channels, out_position);
gst_audio_channel_mixer_fill_normalize (matrix, in_channels, out_channels);
gst_audio_channel_mixer_fill_normalize (matrix, in_size, out_channels);
}
switch (virtual_input) {
case GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_MONO:{
for (gint out = 0; out < out_channels; ++out)
matrix[0][out] /= in_channels;
for (gint in = 1; in < in_channels; ++in)
memcpy (matrix[in], matrix[0], out_channels * sizeof (gfloat));
break;
}
case GST_AUDIO_CHANNEL_MIXER_VIRTUAL_INPUT_STEREO:{
gint right_channels = in_channels >> 1;
gint left_channels = right_channels + (in_channels % 2);
for (gint out = 0; out < out_channels; ++out) {
matrix[0][out] /= left_channels;
matrix[1][out] /= right_channels;
}
for (gint in = 2; in < in_channels; ++in)
memcpy (matrix[in], matrix[in % 2], out_channels * sizeof (gfloat));
break;
}
default:
break;
}
}
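As a worked example of the expansion in the switch above (hypothetical coefficients, not taken from the patch): with four mono inputs mixed to stereo, the matrix is first filled for a single virtual mono input, each coefficient of that row is divided by in_channels, and the scaled row is copied to every real input so the four channels together contribute the level of one mono input.

#include <glib.h>
#include <string.h>

/* Sketch: mirrors the MONO branch above for 4 mono inputs -> stereo.
 * Row 0 stands for whatever fill_identical()/fill_compatible() produced for
 * the single virtual mono channel; 1.0 to both L and R is an assumed value. */
static void
virtual_mono_expansion_sketch (void)
{
  const gint in_channels = 4, out_channels = 2;
  gfloat matrix[4][2] = { {1.0f, 1.0f}, };

  /* Scale the virtual-mono row by 1/in_channels... */
  for (gint out = 0; out < out_channels; ++out)
    matrix[0][out] /= in_channels;

  /* ...and hand the same scaled row to every real input channel. */
  for (gint in = 1; in < in_channels; ++in)
    memcpy (matrix[in], matrix[0], out_channels * sizeof (gfloat));

  /* Every coefficient is now 0.25, so the four mono inputs sum to the same
   * level a single mono input would have produced. */
  g_assert_cmpfloat (matrix[3][1], ==, 0.25f);
}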


@ -2068,8 +2068,21 @@ error:
goto done;
}
static GstEncodingProfile *
profile_from_string (const gchar * string)
/**
* gst_encoding_profile_from_string:
* @string: The string to convert into a GstEncodingProfile.
*
* Converts a string in the "encoding profile serialization format" into a
* GstEncodingProfile. Refer to the encoding-profile documentation for details
* on the format.
*
* Since: 1.26
*
* Returns: (transfer full): A newly created GstEncodingProfile, or NULL if the
* input string is not a valid serialized encoding profile.
*/
GstEncodingProfile *
gst_encoding_profile_from_string (const gchar * string)
{
GstEncodingProfile *profile;
gchar *filename_end;
@ -2117,7 +2130,7 @@ string_to_profile_transform (const GValue * src_value, GValue * dest_value)
profilename = g_value_get_string (src_value);
profile = profile_from_string (profilename);
profile = gst_encoding_profile_from_string (profilename);
if (profile)
g_value_take_object (dest_value, (GObject *) profile);
@ -2156,23 +2169,45 @@ serialize_profile (GString * res, GstEncodingProfile * profile)
}
}
static gchar *
gst_encoding_profile_serialize_valfunc (GValue * value)
/**
* gst_encoding_profile_to_string:
* @profile: (transfer none): The GstEncodingProfile to convert.
*
* Converts a GstEncodingProfile to a string in the "Encoding Profile
* serialization format".
*
* Since: 1.26
*
* Returns: (transfer full): A string representation of the GstEncodingProfile,
* or NULL if the input is invalid.
*/
gchar *
gst_encoding_profile_to_string (GstEncodingProfile * profile)
{
GString *res = g_string_new (NULL);
GstEncodingProfile *profile = g_value_get_object (value);
GString *res;
g_return_val_if_fail (profile != NULL, NULL);
res = g_string_new (NULL);
serialize_profile (res, profile);
return g_string_free (res, FALSE);
}
static gchar *
gst_encoding_profile_serialize_valfunc (GValue * value)
{
GstEncodingProfile *profile = g_value_get_object (value);
return gst_encoding_profile_to_string (profile);
}
static gboolean
gst_encoding_profile_deserialize_valfunc (GValue * value, const gchar * s)
{
GstEncodingProfile *profile;
profile = profile_from_string (s);
profile = gst_encoding_profile_from_string (s);
if (profile) {
g_value_take_object (value, (GObject *) profile);


@ -266,6 +266,12 @@ GstEncodingProfile * gst_encoding_profile_from_discoverer (GstDiscovererInfo *in
GST_PBUTILS_API
GstEncodingProfile * gst_encoding_profile_copy (GstEncodingProfile *self);
GST_PBUTILS_API
GstEncodingProfile * gst_encoding_profile_from_string (const gchar *string);
GST_PBUTILS_API
gchar * gst_encoding_profile_to_string (GstEncodingProfile *profile);
GST_PBUTILS_API
void gst_encoding_profile_set_element_properties (GstEncodingProfile *self,
GstStructure *element_properties);
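With gst_encoding_profile_from_string() and gst_encoding_profile_to_string() now public, a round trip looks roughly like the sketch below. The profile description is only an assumed example of the serialization format; see the encoding-profile documentation for the exact syntax.

#include <gst/pbutils/pbutils.h>

/* Sketch: parse a serialized profile and serialize it back. */
static void
encoding_profile_roundtrip_sketch (void)
{
  GstEncodingProfile *profile =
      gst_encoding_profile_from_string ("application/ogg:video/x-theora:audio/x-vorbis");

  if (profile == NULL)
    return;                     /* not a valid serialized profile */

  gchar *str = gst_encoding_profile_to_string (profile);
  g_print ("profile: %s\n", str);

  g_free (str);
  g_object_unref (profile);
}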


@ -65,6 +65,25 @@ typedef struct _GstExifWriter GstExifWriter;
typedef struct _GstExifReader GstExifReader;
typedef struct _GstExifTagData GstExifTagData;
#define GST_CAT_DEFAULT gst_exif_tag_ensure_debug_category()
static GstDebugCategory *
gst_exif_tag_ensure_debug_category (void)
{
static gsize cat_gonce = 0;
if (g_once_init_enter (&cat_gonce)) {
GstDebugCategory *cat = NULL;
GST_DEBUG_CATEGORY_INIT (cat, "exif-tags", 0, "EXIF tag parsing");
g_once_init_leave (&cat_gonce, (gsize) cat);
}
return (GstDebugCategory *) cat_gonce;
}
typedef void (*GstExifSerializationFunc) (GstExifWriter * writer,
const GstTagList * taglist, const GstExifTagMatch * exiftag);
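With GST_CAT_DEFAULT mapped to the lazily created category above, the plain logging macros used in this file now land in the "exif-tags" category, which can be enabled at runtime with e.g. GST_DEBUG=exif-tags:6. A short sketch (the message is illustrative only):

/* Sketch: inside this file, GST_DEBUG() and friends now pick up the
 * "exif-tags" category through the GST_CAT_DEFAULT define above. */
static void
log_exif_parsing_sketch (void)
{
  GST_DEBUG ("parsing an EXIF IFD entry");
}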


@ -3652,11 +3652,15 @@ gst_video_decoder_clip_and_push_buf (GstVideoDecoder * decoder, GstBuffer * buf)
goto done;
}
/* Is buffer too late (QoS) ? */
if (priv->do_qos && GST_CLOCK_TIME_IS_VALID (priv->earliest_time)
&& GST_CLOCK_TIME_IS_VALID (cstart)) {
GstClockTime deadline =
gst_segment_to_running_time (segment, GST_FORMAT_TIME, cstart);
/* Check if the buffer is too late (QoS). */
if (priv->do_qos && GST_CLOCK_TIME_IS_VALID (priv->earliest_time)) {
GstClockTime deadline = GST_CLOCK_TIME_NONE;
/* Prefer the frame stop position for the QoS check, since we don't want to
* drop a frame that is only partially late */
if (GST_CLOCK_TIME_IS_VALID (cstop))
deadline = gst_segment_to_running_time (segment, GST_FORMAT_TIME, cstop);
else if (GST_CLOCK_TIME_IS_VALID (cstart))
deadline = gst_segment_to_running_time (segment, GST_FORMAT_TIME, cstart);
if (GST_CLOCK_TIME_IS_VALID (deadline) && deadline < priv->earliest_time) {
GST_WARNING_OBJECT (decoder,
"Dropping frame due to QoS. start:%" GST_TIME_FORMAT " deadline:%"


@ -90,6 +90,31 @@
* |[
* gst-launch-1.0 -v audiotestsrc ! audio/x-raw,channels=8 ! audioconvert mix-matrix="<>" ! audio/x-raw,channels=16,channel-mask=\(bitmask\)0x0000000000000000 ! fakesink
* ]|
*
* If input channels are unpositioned but follow a standard layout, they can be
* automatically positioned according to their index using one of the reorder
* configurations.
*
* ## Example with unpositioned input channels reordering
* |[
* gst-launch-1.0 -v audiotestsrc ! audio/x-raw,channels=6,channel-mask=\(bitmask\)0x0000000000000000 ! audioconvert input-channels-reorder-mode=unpositioned input-channels-reorder=smpte ! fakesink
* ]|
* In this case the input channels will be automatically positioned to the
* SMPTE order (left, right, center, lfe, rear-left and rear-right).
*
* The input channels reorder configurations can also be used to force the
* repositioning of the input channels when needed, for example when channels'
* positions are not correctly identified in an encoded file.
*
* ## Example with the forced reordering of input channels wrongly positioned
* |[
* gst-launch-1.0 -v audiotestsrc ! audio/x-raw,channels=3,channel-mask=\(bitmask\)0x0000000000000034 ! audioconvert input-channels-reorder-mode=force input-channels-reorder=aac ! fakesink
* ]|
* In this case the input channels are positioned upstream as center,
* rear-left and rear-right in this order. Using the "force" reorder mode and
* the "aac" order, the input channels are going to be repositioned to left,
* right and lfe, ignoring the actual value of the `channel-mask` in the input
* caps.
*/
/*
@ -158,7 +183,9 @@ enum
PROP_DITHERING,
PROP_NOISE_SHAPING,
PROP_MIX_MATRIX,
PROP_DITHERING_THRESHOLD
PROP_DITHERING_THRESHOLD,
PROP_INPUT_CHANNELS_REORDER,
PROP_INPUT_CHANNELS_REORDER_MODE
};
#define DEBUG_INIT \
@ -192,6 +219,80 @@ GST_STATIC_PAD_TEMPLATE ("sink",
static GQuark meta_tag_audio_quark;
/*** TYPE FUNCTIONS ***********************************************************/
#define GST_TYPE_AUDIO_CONVERT_INPUT_CHANNELS_REORDER (gst_audio_convert_input_channels_reorder_get_type ())
static GType
gst_audio_convert_input_channels_reorder_get_type (void)
{
static GType reorder_type = 0;
if (g_once_init_enter (&reorder_type)) {
static GEnumValue reorder_types[] = {
{GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST,
"Reorder the input channels using the default GStreamer order",
"gst"},
{GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_SMPTE,
"Reorder the input channels using the SMPTE order",
"smpte"},
{GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE,
"Reorder the input channels using the CINE order",
"cine"},
{GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AC3,
"Reorder the input channels using the AC3 order",
"ac3"},
{GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AAC,
"Reorder the input channels using the AAC order",
"aac"},
{GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MONO,
"Reorder and mix all input channels to a single mono channel",
"mono"},
{GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE,
"Reorder and mix all input channels to a single left and a single right stereo channels alternately",
"alternate"},
{0, NULL, NULL},
};
GType type = g_enum_register_static ("GstAudioConvertInputChannelsReorder",
reorder_types);
g_once_init_leave (&reorder_type, type);
}
return reorder_type;
}
#define GST_TYPE_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE (gst_audio_convert_input_channels_reorder_mode_get_type ())
static GType
gst_audio_convert_input_channels_reorder_mode_get_type (void)
{
static GType reorder_mode_type = 0;
if (g_once_init_enter (&reorder_mode_type)) {
static GEnumValue reorder_mode_types[] = {
{GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_NONE,
"Never reorder the input channels",
"none"},
{GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_UNPOSITIONED,
"Reorder the input channels only if they are unpositioned",
"unpositioned"},
{GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_FORCE,
"Always reorder the input channels according to the selected configuration",
"force"},
{0, NULL, NULL},
};
GType type =
g_enum_register_static ("GstAudioConvertInputChannelsReorderMode",
reorder_mode_types);
g_once_init_leave (&reorder_mode_type, type);
}
return reorder_mode_type;
}
static void
gst_audio_convert_class_init (GstAudioConvertClass * klass)
{
@ -246,6 +347,53 @@ gst_audio_convert_class_init (GstAudioConvertClass * klass)
"Threshold for the output bit depth at/below which to apply dithering.",
0, 32, 20, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));
/**
* GstAudioConvert:input-channels-reorder:
*
* The positions configuration to use to reorder the input channels
* consecutively according to their index. If a `mix-matrix` is specified,
* this configuration is ignored.
*
* When the input channels reordering is activated (because the
* `input-channels-reorder-mode` property is
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_FORCE or the input channels
* are unpositioned and the reorder mode is
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_UNPOSITIONED), input
* channels will be reordered consecutively according to their index
* independently of the `channel-mask` value in the sink pad audio caps.
*
* Since: 1.26
*/
g_object_class_install_property (gobject_class,
PROP_INPUT_CHANNELS_REORDER,
g_param_spec_enum ("input-channels-reorder",
"Input Channels Reorder",
"The positions configuration to use to reorder the input channels consecutively according to their index.",
GST_TYPE_AUDIO_CONVERT_INPUT_CHANNELS_REORDER,
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST,
G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));
gst_type_mark_as_plugin_api (GST_TYPE_AUDIO_CONVERT_INPUT_CHANNELS_REORDER,
0);
/**
* GstAudioConvert:input-channels-reorder-mode:
*
* The input channels reordering mode used to apply the selected positions
* configuration.
*
* Since: 1.26
*/
g_object_class_install_property (gobject_class,
PROP_INPUT_CHANNELS_REORDER_MODE,
g_param_spec_enum ("input-channels-reorder-mode",
"Input Channels Reorder Mode",
"The input channels reordering mode used to apply the selected positions configuration.",
GST_TYPE_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE,
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_NONE,
G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));
gst_type_mark_as_plugin_api
(GST_TYPE_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE, 0);
gst_element_class_add_static_pad_template (element_class,
&gst_audio_convert_src_template);
gst_element_class_add_static_pad_template (element_class,
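The two properties can also be configured programmatically using the enum nicks registered above; a hedged sketch (pipeline wiring and error handling elided):

#include <gst/gst.h>

/* Sketch: reinterpret unpositioned input channels in SMPTE order, using the
 * property names and enum nicks defined in this file. */
static GstElement *
make_reordering_audioconvert (void)
{
  GstElement *convert = gst_element_factory_make ("audioconvert", NULL);

  if (convert == NULL)
    return NULL;

  gst_util_set_object_arg (G_OBJECT (convert),
      "input-channels-reorder-mode", "unpositioned");
  gst_util_set_object_arg (G_OBJECT (convert),
      "input-channels-reorder", "smpte");

  return convert;
}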
@ -304,6 +452,574 @@ gst_audio_convert_dispose (GObject * obj)
G_OBJECT_CLASS (parent_class)->dispose (obj);
}
/*** INPUT CHANNELS REORDER FUNCTIONS *****************************************/
typedef struct
{
gboolean has_stereo;
gboolean lfe_as_last_channel;
} GstAudioConvertInputChannelsReorderConfig;
static const GstAudioConvertInputChannelsReorderConfig
input_channels_reorder_config[] = {
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST
{TRUE, FALSE},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_SMPTE
{TRUE, FALSE},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE
{TRUE, TRUE},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AC3
{TRUE, TRUE},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AAC
{TRUE, TRUE},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MONO
{FALSE, FALSE},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE
{TRUE, FALSE}
};
#define GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_NB G_N_ELEMENTS (input_channels_reorder_config)
static const GstAudioChannelPosition
channel_position_per_reorder_config
[GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_NB][64] = {
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST
{
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT,
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT,
GST_AUDIO_CHANNEL_POSITION_FRONT_CENTER,
GST_AUDIO_CHANNEL_POSITION_LFE1,
GST_AUDIO_CHANNEL_POSITION_REAR_LEFT,
GST_AUDIO_CHANNEL_POSITION_REAR_RIGHT,
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT_OF_CENTER,
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT_OF_CENTER,
GST_AUDIO_CHANNEL_POSITION_REAR_CENTER,
GST_AUDIO_CHANNEL_POSITION_LFE2,
GST_AUDIO_CHANNEL_POSITION_SIDE_LEFT,
GST_AUDIO_CHANNEL_POSITION_SIDE_RIGHT,
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_LEFT,
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_RIGHT,
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_CENTER,
GST_AUDIO_CHANNEL_POSITION_TOP_CENTER,
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_LEFT,
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_RIGHT,
GST_AUDIO_CHANNEL_POSITION_TOP_SIDE_LEFT,
GST_AUDIO_CHANNEL_POSITION_TOP_SIDE_RIGHT,
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_CENTER,
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_CENTER,
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_LEFT,
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_RIGHT,
GST_AUDIO_CHANNEL_POSITION_WIDE_LEFT,
GST_AUDIO_CHANNEL_POSITION_WIDE_RIGHT,
GST_AUDIO_CHANNEL_POSITION_SURROUND_LEFT,
GST_AUDIO_CHANNEL_POSITION_SURROUND_RIGHT,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_SMPTE (see: https://www.sis.se/api/document/preview/919377/)
{
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // Left front (L)
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // Right front (R)
GST_AUDIO_CHANNEL_POSITION_FRONT_CENTER, // Center front (C)
GST_AUDIO_CHANNEL_POSITION_LFE1, // Low frequency enhancement (LFE)
GST_AUDIO_CHANNEL_POSITION_REAR_LEFT, // Left surround (Ls)
GST_AUDIO_CHANNEL_POSITION_REAR_RIGHT, // Right surround (Rs)
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT_OF_CENTER, // Left front center (Lc)
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT_OF_CENTER, // Right front center (Rc)
GST_AUDIO_CHANNEL_POSITION_SURROUND_LEFT, // Rear surround left (Lsr)
GST_AUDIO_CHANNEL_POSITION_SURROUND_RIGHT, // Rear surround right (Rsr)
GST_AUDIO_CHANNEL_POSITION_REAR_CENTER, // Rear center (Cs)
GST_AUDIO_CHANNEL_POSITION_SIDE_LEFT, // Left side surround (Lss)
GST_AUDIO_CHANNEL_POSITION_SIDE_RIGHT, // Right side surround (Rss)
GST_AUDIO_CHANNEL_POSITION_WIDE_LEFT, // Left wide front (Lw)
GST_AUDIO_CHANNEL_POSITION_WIDE_RIGHT, // Right wide front (Rw)
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_LEFT, // Left front vertical height (Lv)
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_RIGHT, // Right front vertical height (Rv)
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_CENTER, // Center front vertical height (Cv)
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_LEFT, // Left surround vertical height rear (Lvr)
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_RIGHT, // Right surround vertical height rear (Rvr)
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_CENTER, // Center vertical height rear (Cvr)
GST_AUDIO_CHANNEL_POSITION_TOP_SIDE_LEFT, // Left vertical height side surround (Lvss)
GST_AUDIO_CHANNEL_POSITION_TOP_SIDE_RIGHT, // Right vertical height side surround (Rvss)
GST_AUDIO_CHANNEL_POSITION_TOP_CENTER, // Top center surround (Ts)
GST_AUDIO_CHANNEL_POSITION_LFE2, // Low frequency enhancement 2 (LFE2)
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_LEFT, // Left front vertical bottom (Lb)
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_RIGHT, // Right front vertical bottom (Rb)
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_CENTER, // Center front vertical bottom (Cb)
GST_AUDIO_CHANNEL_POSITION_INVALID, // Left vertical height surround (Lvs)
GST_AUDIO_CHANNEL_POSITION_INVALID, // Right vertical height surround (Rvs)
GST_AUDIO_CHANNEL_POSITION_INVALID, // Reserved
GST_AUDIO_CHANNEL_POSITION_INVALID, // Reserved
GST_AUDIO_CHANNEL_POSITION_INVALID, // Reserved
GST_AUDIO_CHANNEL_POSITION_INVALID, // Reserved
GST_AUDIO_CHANNEL_POSITION_INVALID, // Low frequency enhancement 3 (LFE3)
GST_AUDIO_CHANNEL_POSITION_INVALID, // Left edge of screen (Leos)
GST_AUDIO_CHANNEL_POSITION_INVALID, // Right edge of screen (Reos)
GST_AUDIO_CHANNEL_POSITION_INVALID, // Half-way between center of screen and left edge of screen (Hwbcal)
GST_AUDIO_CHANNEL_POSITION_INVALID, // Half-way between center of screen and right edge of screen (Hwbcar)
GST_AUDIO_CHANNEL_POSITION_INVALID, // Left back surround (Lbs)
GST_AUDIO_CHANNEL_POSITION_INVALID, // Right back surround (Rbs)
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE
{
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_CENTER, // C
GST_AUDIO_CHANNEL_POSITION_REAR_LEFT, // Ls
GST_AUDIO_CHANNEL_POSITION_REAR_RIGHT, // Rs
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT_OF_CENTER, // Lc
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT_OF_CENTER, // Rc
GST_AUDIO_CHANNEL_POSITION_SURROUND_LEFT, // Lsr
GST_AUDIO_CHANNEL_POSITION_SURROUND_RIGHT, // Rsr
GST_AUDIO_CHANNEL_POSITION_REAR_CENTER, // Cs
GST_AUDIO_CHANNEL_POSITION_TOP_CENTER, // Ts
GST_AUDIO_CHANNEL_POSITION_WIDE_LEFT, // Lw
GST_AUDIO_CHANNEL_POSITION_WIDE_RIGHT, // Rw
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_LEFT, // Lv
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_RIGHT, // Rv
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_CENTER, // Cv
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_LEFT, // Lvr
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_RIGHT, // Rvr
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_CENTER, // Cvr
GST_AUDIO_CHANNEL_POSITION_SIDE_LEFT, // Lss
GST_AUDIO_CHANNEL_POSITION_SIDE_RIGHT, // Rss
GST_AUDIO_CHANNEL_POSITION_TOP_SIDE_LEFT, // Lvss
GST_AUDIO_CHANNEL_POSITION_TOP_SIDE_RIGHT, // Rvss
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_LEFT, // Lb
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_RIGHT, // Rb
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_CENTER, // Cb
GST_AUDIO_CHANNEL_POSITION_LFE2, // LFE2
GST_AUDIO_CHANNEL_POSITION_LFE1, // LFE1
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AC3
{
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_CENTER, // C
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_REAR_LEFT, // Ls
GST_AUDIO_CHANNEL_POSITION_REAR_RIGHT, // Rs
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT_OF_CENTER, // Lc
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT_OF_CENTER, // Rc
GST_AUDIO_CHANNEL_POSITION_SURROUND_LEFT, // Lsr
GST_AUDIO_CHANNEL_POSITION_SURROUND_RIGHT, // Rsr
GST_AUDIO_CHANNEL_POSITION_REAR_CENTER, // Cs
GST_AUDIO_CHANNEL_POSITION_TOP_CENTER, // Ts
GST_AUDIO_CHANNEL_POSITION_WIDE_LEFT, // Lw
GST_AUDIO_CHANNEL_POSITION_WIDE_RIGHT, // Rw
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_LEFT, // Lv
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_RIGHT, // Rv
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_CENTER, // Cv
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_LEFT, // Lvr
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_RIGHT, // Rvr
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_CENTER, // Cvr
GST_AUDIO_CHANNEL_POSITION_SIDE_LEFT, // Lss
GST_AUDIO_CHANNEL_POSITION_SIDE_RIGHT, // Rss
GST_AUDIO_CHANNEL_POSITION_TOP_SIDE_LEFT, // Lvss
GST_AUDIO_CHANNEL_POSITION_TOP_SIDE_RIGHT, // Rvss
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_LEFT, // Lb
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_RIGHT, // Rb
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_CENTER, // Cb
GST_AUDIO_CHANNEL_POSITION_LFE2, // LFE2
GST_AUDIO_CHANNEL_POSITION_LFE1, // LFE1
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AAC
{
GST_AUDIO_CHANNEL_POSITION_FRONT_CENTER, // C
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_REAR_LEFT, // Ls
GST_AUDIO_CHANNEL_POSITION_REAR_RIGHT, // Rs
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT_OF_CENTER, // Lc
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT_OF_CENTER, // Rc
GST_AUDIO_CHANNEL_POSITION_SURROUND_LEFT, // Lsr
GST_AUDIO_CHANNEL_POSITION_SURROUND_RIGHT, // Rsr
GST_AUDIO_CHANNEL_POSITION_REAR_CENTER, // Cs
GST_AUDIO_CHANNEL_POSITION_TOP_CENTER, // Ts
GST_AUDIO_CHANNEL_POSITION_WIDE_LEFT, // Lw
GST_AUDIO_CHANNEL_POSITION_WIDE_RIGHT, // Rw
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_LEFT, // Lv
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_RIGHT, // Rv
GST_AUDIO_CHANNEL_POSITION_TOP_FRONT_CENTER, // Cv
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_LEFT, // Lvr
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_RIGHT, // Rvr
GST_AUDIO_CHANNEL_POSITION_TOP_REAR_CENTER, // Cvr
GST_AUDIO_CHANNEL_POSITION_SIDE_LEFT, // Lss
GST_AUDIO_CHANNEL_POSITION_SIDE_RIGHT, // Rss
GST_AUDIO_CHANNEL_POSITION_TOP_SIDE_LEFT, // Lvss
GST_AUDIO_CHANNEL_POSITION_TOP_SIDE_RIGHT, // Rvss
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_LEFT, // Lb
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_RIGHT, // Rb
GST_AUDIO_CHANNEL_POSITION_BOTTOM_FRONT_CENTER, // Cb
GST_AUDIO_CHANNEL_POSITION_LFE2, // LFE2
GST_AUDIO_CHANNEL_POSITION_LFE1, // LFE1
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
GST_AUDIO_CHANNEL_POSITION_INVALID,
},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MONO
{
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
GST_AUDIO_CHANNEL_POSITION_MONO,
},
// GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE
{
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT, // L
GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT, // R
}
};
static const gchar *
gst_audio_convert_input_channels_reorder_to_string (GstAudioConvertInputChannelsReorder reorder)
{
switch (reorder) {
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST:
return "GST";
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_SMPTE:
return "SMPTE";
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE:
return "CINE";
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AC3:
return "AC3";
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AAC:
return "AAC";
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MONO:
return "MONO";
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE:
return "ALTERNATE";
default:
return "UNKNOWN";
}
}
static gboolean
gst_audio_convert_position_channels_from_reorder_configuration (gint channels,
GstAudioConvertInputChannelsReorder reorder,
GstAudioChannelPosition * position)
{
g_return_val_if_fail (channels > 0, FALSE);
g_return_val_if_fail (reorder >= 0
&& reorder < GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_NB, FALSE);
g_return_val_if_fail (position != NULL, FALSE);
GST_DEBUG ("ordering %d audio channel(s) according to the %s configuration",
channels, gst_audio_convert_input_channels_reorder_to_string (reorder));
if (channels == 1) {
position[0] = GST_AUDIO_CHANNEL_POSITION_MONO;
return TRUE;
}
if (channels == 2 && input_channels_reorder_config[reorder].has_stereo) {
position[0] = GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT;
position[1] = GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT;
return TRUE;
}
for (gint i = 0; i < channels; ++i) {
if (i < G_N_ELEMENTS (channel_position_per_reorder_config[reorder]))
position[i] = channel_position_per_reorder_config[reorder][i];
else
position[i] = GST_AUDIO_CHANNEL_POSITION_INVALID;
}
if (channels > 2
&& input_channels_reorder_config[reorder].lfe_as_last_channel) {
position[channels - 1] = GST_AUDIO_CHANNEL_POSITION_LFE1;
if (channels == 3 && input_channels_reorder_config[reorder].has_stereo) {
position[0] = GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT;
position[1] = GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT;
}
}
return TRUE;
}
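A minimal sketch of how the helper above behaves for the AC3 configuration with 3 unpositioned input channels (the caller shown here is hypothetical, not part of the patch): the last channel becomes LFE1 and a 2- or 3-channel input keeps a stereo front pair.
static void
example_reorder_3_channels_as_ac3 (void)
{
  GstAudioChannelPosition pos[3];
  /* 3 channels, AC3 ordering: expect FRONT_LEFT, FRONT_RIGHT, LFE1 */
  if (gst_audio_convert_position_channels_from_reorder_configuration (3,
          GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AC3, pos)) {
    g_assert (pos[0] == GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT);
    g_assert (pos[1] == GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT);
    g_assert (pos[2] == GST_AUDIO_CHANNEL_POSITION_LFE1);
  }
}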
/*** GSTREAMER FUNCTIONS ******************************************************/
/* BaseTransform vmethods */
@ -352,12 +1068,14 @@ remove_channels_from_structure (GstCapsFeatures * features, GstStructure * s,
{
guint64 mask;
gint channels;
GstAudioConvert *this = GST_AUDIO_CONVERT (user_data);
gboolean force_removing = *(gboolean *) user_data;
/* Only remove the channels and channel-mask if a (empty) mix matrix was manually specified,
* if no channel-mask is specified, for non-NONE channel layouts or for a single channel layout
/* Only remove the channels and channel-mask if a mix matrix was manually
* specified or an input channels reordering is applied, or if no
* channel-mask is specified, for non-NONE channel layouts or for a single
* channel layout.
*/
if (this->mix_matrix_is_set ||
if (force_removing ||
!gst_structure_get (s, "channel-mask", GST_TYPE_BITMASK, &mask, NULL) ||
(mask != 0 || (gst_structure_get_int (s, "channels", &channels)
&& channels == 1))) {
@ -393,7 +1111,12 @@ gst_audio_convert_transform_caps (GstBaseTransform * btrans,
gst_caps_map_in_place (tmp, remove_format_from_structure, NULL);
gst_caps_map_in_place (tmp, remove_layout_from_structure, NULL);
gst_caps_map_in_place (tmp, remove_channels_from_structure, btrans);
gboolean force_removing = this->mix_matrix_is_set
|| (direction == GST_PAD_SINK
&& this->input_channels_reorder_mode !=
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_NONE);
gst_caps_map_in_place (tmp, remove_channels_from_structure, &force_removing);
/* We can infer the required input / output channels based on the
* matrix dimensions */
@ -796,11 +1519,46 @@ gst_audio_convert_set_caps (GstBaseTransform * base, GstCaps * incaps,
GST_AUDIO_CONVERTER_OPT_NOISE_SHAPING_METHOD,
GST_TYPE_AUDIO_NOISE_SHAPING_METHOD, this->ns, NULL);
if (this->mix_matrix_is_set)
if (this->mix_matrix_is_set) {
gst_structure_set_value (config, GST_AUDIO_CONVERTER_OPT_MIX_MATRIX,
&this->mix_matrix);
this->convert = gst_audio_converter_new (0, &in_info, &out_info, config);
} else if (this->input_channels_reorder_mode !=
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_NONE) {
GstAudioFlags in_flags;
GstAudioChannelPosition in_position[64];
gboolean restore_in = FALSE;
if (this->input_channels_reorder_mode ==
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_FORCE
|| GST_AUDIO_INFO_IS_UNPOSITIONED (&in_info)) {
in_flags = GST_AUDIO_INFO_FLAGS (&in_info);
memcpy (in_position, in_info.position,
GST_AUDIO_INFO_CHANNELS (&in_info) *
sizeof (GstAudioChannelPosition));
if (gst_audio_convert_position_channels_from_reorder_configuration
(GST_AUDIO_INFO_CHANNELS (&in_info), this->input_channels_reorder,
in_info.position)) {
GST_AUDIO_INFO_FLAGS (&in_info) &= ~GST_AUDIO_FLAG_UNPOSITIONED;
restore_in = TRUE;
}
}
this->convert = gst_audio_converter_new (0, &in_info, &out_info, config);
if (restore_in) {
GST_AUDIO_INFO_FLAGS (&in_info) = in_flags;
memcpy (in_info.position, in_position,
GST_AUDIO_INFO_CHANNELS (&in_info) *
sizeof (GstAudioChannelPosition));
}
} else {
this->convert = gst_audio_converter_new (0, &in_info, &out_info, config);
}
if (this->convert == NULL)
goto no_converter;
@ -1033,6 +1791,12 @@ gst_audio_convert_set_property (GObject * object, guint prop_id,
}
}
break;
case PROP_INPUT_CHANNELS_REORDER:
this->input_channels_reorder = g_value_get_enum (value);
break;
case PROP_INPUT_CHANNELS_REORDER_MODE:
this->input_channels_reorder_mode = g_value_get_enum (value);
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
break;
@ -1059,6 +1823,12 @@ gst_audio_convert_get_property (GObject * object, guint prop_id,
if (this->mix_matrix_is_set)
g_value_copy (&this->mix_matrix, value);
break;
case PROP_INPUT_CHANNELS_REORDER:
g_value_set_enum (value, this->input_channels_reorder);
break;
case PROP_INPUT_CHANNELS_REORDER_MODE:
g_value_set_enum (value, this->input_channels_reorder_mode);
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
break;

View file

@ -26,6 +26,150 @@
#include <gst/base/gstbasetransform.h>
#include <gst/audio/audio.h>
/**
* GstAudioConvertInputChannelsReorder:
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST: reorder input channels
* according to the default ordering in GStreamer: FRONT_LEFT, FRONT_RIGHT,
* FRONT_CENTER, LFE1 and then the other channels. If there is only one
* input channel available, it will be positioned to MONO.
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_SMPTE: reorder input channels
* according to the SMPTE standard: FRONT_LEFT, FRONT_RIGHT, FRONT_CENTER,
* LFE1 and then the other channels (the ordering is slightly different from
* the default GStreamer order). This channel ordering is the only one that
* is officially standardized and it is used by default in much audio
* software (see: https://www.sis.se/api/document/preview/919377/). If
* there is only one input channel available, it will be positioned to MONO.
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE: reorder input channels as it
* is commonly used in the cinema industry: FRONT_LEFT, FRONT_RIGHT,
* FRONT_CENTER, the other channels and then LFE1. This configuration is not
* standardized but usually appears in the literature related to the cinema
* industry and as an alternate ordering in various audio software. On
* some websites, this configuration and the
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AC3 ordering are swapped. If
* there is only one input channel available, it will be positioned to
* MONO. If the number of available input channels is > 2, the last channel
* will always be positioned to LFE1.
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AC3: reorder input channels in the
* same order as the default order of the AC3 format: FRONT_LEFT,
* FRONT_CENTER, FRONT_RIGHT, the other channels (same order as in the
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE policy) and then LFE1.
* This configuration is also commonly used in the cinema industry and in
* professional audio software (like Pro Tools, under the name "FILM"
* ordering). The only difference with the
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE configuration is the
* order of the first 3 channels. If there is only one input channel
* available, it will be positioned to MONO. If the number of available
* input channels is > 2, the last channel will always be positioned to
* LFE1. If the number of available input channels is 2 or 3, the first two
* channels will be positioned to FRONT_LEFT and FRONT_RIGHT.
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AAC: reorder input channels in the
* same order as the default order of the AAC format: FRONT_CENTER,
* FRONT_LEFT, FRONT_RIGHT, the other channels (same order as in the
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE configuration) and then
* LFE1. The only difference with the
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE configuration is the
* order of the first 3 channels. If there is only one input channel
* available, it will be positioned to MONO. If the number of available
* input channels is > 2, the last channel will always be positioned to
* LFE1. If the number of available input channels is 2 or 3, the first two
* channels will be positioned to FRONT_LEFT and FRONT_RIGHT.
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MONO: reorder all input channels
* to MONO. All input channels are mixed together at the same level into a
* single virtual mono channel. For `n` input channels, the virtual output
* sample value is computed as:
* `output_sample[MONO] = (1/n) x sum(input_sample_for_channel(i))` with
* `0 <= i < n`. A concrete use case for this configuration is, for example,
* importing audio from an array of multiple mono microphones that you
* want to use as a single mono channel.
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE: reorder all input
* channels to FRONT_LEFT and FRONT_RIGHT channels alternately (or MONO if
* there is only one input channel available). All left input channels are
* mixed together, at the same level, into a single virtual FRONT_LEFT
* channel and all right input channels are mixed together into a single
* virtual FRONT_RIGHT channel. For `2n` input channels the FRONT_LEFT and
* FRONT_RIGHT virtual output samples are computed as:
* `output_sample[FRONT_LEFT] = (1/n) x sum(input_sample_for_channel(2i))` and
* `output_sample[FRONT_RIGHT] = (1/n) x sum(input_sample_for_channel(2i+1))`
* with `0 <= i < n` (with an odd number of input channels the principle is
* the same, but with one extra left input channel). A concrete use case for
* this configuration is, for example, importing audio from an array of
* multiple stereo microphones that you want to use as a simple pair of
* stereo channels.
*
* Input audio channels reordering configurations.
*
* It defines different ways of reordering input audio channels when they are
* not positioned by GStreamer. As a general matter, channels are always ordered
* in the @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST order and the
* `channel-mask` field in the audio caps allows specifying which channels are
* active.
*
* Depending on the selected mode (see:
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_UNPOSITIONED), input channels
* can be automatically positioned when the `channel-mask` is not specified or
* equals 0. In this case, all input channels will be positioned according to
* the selected reordering configuration and the index of each input channel.
* This can be useful when importing audio from an array of independent
* microphones for example.
*
* The reordering configuration can also be forced (see:
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_FORCE) to reposition all
* input channels according to each channel index. In this case the
* `channel-mask` will be totally ignored and input channels will be reordered
* as if they were unpositioned. This can be useful when importing
* multichannel audio with errors in the channel positioning.
*
* For any of the above configurations, when the reordering is applied
* (input channels are unpositioned or the "force" mode is active):
* - When there is only one input channel available, it is always positioned
* to MONO, independently of the selected configuration.
* - When there are 2 input channels available, they are positioned to
* FRONT_LEFT and FRONT_RIGHT (except for the
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MONO configuration where all
* input channels are positioned to MONO).
*
* Since: 1.26
*/
typedef enum {
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST = 0,
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_SMPTE,
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE,
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AC3,
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AAC,
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MONO,
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE
} GstAudioConvertInputChannelsReorder;
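The MONO and ALTERNATE configurations above boil down to an equal-level average of the selected input channels; a minimal illustrative sketch of that computation (the function name and sample type are assumptions, not part of this header):
/* Equal-level downmix of n input channels into one virtual MONO sample:
 * output_sample[MONO] = (1/n) x sum(input_sample_for_channel(i)), 0 <= i < n */
static gdouble
mono_downmix_sample (const gdouble * channel_samples, guint n_channels)
{
  gdouble sum = 0.0;
  guint i;

  for (i = 0; i < n_channels; i++)
    sum += channel_samples[i];

  return sum / (gdouble) n_channels;
}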
/**
* GstAudioConvertInputChannelsReorderMode:
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_NONE: never reorder the input
* channels. If input channels are unpositioned and there are at least 3
* input channels, an error will be generated.
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_UNPOSITIONED: automatically
* reorder the input channels according to the selected
* #GstAudioConvertInputChannelsReorder configuration when, and only when,
* they are unpositioned (the `channel-mask` equals 0 or is not specified
* in the input caps).
* @GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_FORCE: always reorder the
* input channels according to the selected
* #GstAudioConvertInputChannelsReorder configuration. The `channel-mask`
* value in the input caps is completely ignored. Input channels are always
* reordered as if they were unpositioned independently of the input caps.
*
* The different usage modes of the input channels reordering configuration.
*
* Independently of the selected mode, the explicit definition of a mix matrix
* takes precedence over the reorder configuration: in that case, the provided
* mix matrix overrides it.
*
* Since: 1.26
*/
typedef enum {
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_NONE = 0,
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_UNPOSITIONED,
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_FORCE
} GstAudioConvertInputChannelsReorderMode;
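A minimal usage sketch, modeled on the unit-test setup further down in this diff (it assumes the audioconvert element built with these new properties and a caller that can include this header):
#include <gst/gst.h>
#include <gst/audioconvert/gstaudioconvert.h>

/* Interpret unpositioned multichannel input using the SMPTE ordering; input
 * that already carries a channel-mask is not reordered in this mode. */
static void
configure_input_reorder (GstElement * audioconvert)
{
  g_object_set (audioconvert,
      "input-channels-reorder", GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_SMPTE,
      "input-channels-reorder-mode",
      GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_UNPOSITIONED, NULL);
}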
#define GST_TYPE_AUDIO_CONVERT (gst_audio_convert_get_type())
G_DECLARE_FINAL_TYPE (GstAudioConvert, gst_audio_convert,
GST, AUDIO_CONVERT, GstBaseTransform);
@ -45,6 +189,8 @@ struct _GstAudioConvert
GstAudioNoiseShapingMethod ns;
GValue mix_matrix;
gboolean mix_matrix_is_set;
GstAudioConvertInputChannelsReorder input_channels_reorder;
GstAudioConvertInputChannelsReorderMode input_channels_reorder_mode;
GstAudioInfo in_info;
GstAudioInfo out_info;

View file

@ -1404,6 +1404,14 @@ sink_event_function (GstPad * sinkpad, GstDecodebin3 * dbin, GstEvent * event)
input->input_is_parsed = s
&& gst_structure_has_field (s, "urisourcebin-parsed-data");
if (input->input_is_parsed) {
/* We remove the custom field from stream-start so as not to pollute
* downstream */
event = gst_event_make_writable (event);
s = gst_event_get_structure (event);
gst_structure_remove_field ((GstStructure *) s,
"urisourcebin-parsed-data");
}
/* Make sure group ids will be recalculated */
input->group_id = GST_GROUP_ID_INVALID;

View file

@ -2614,25 +2614,38 @@ reconfigure_output (GstPlayBin3 * playbin)
GST_DEBUG_OBJECT (playbin, "Stream type '%s' is now requested",
gst_stream_type_get_name (combine->stream_type));
g_assert (combine->sinkpad == NULL);
if (combine->sinkpad) {
/* This was previously an assert but is now just a WARNING.
*
* This *theoretically* should never happen, but it is possible if there was
* a failure within (uri)decodebin3 where the collection was posted (by the
* demuxer for example) but the decoding failed (no decoder, bad stream,
* etc...).
*
* In that case, we could have a combiner already prepared for that type
* that never got activated.
**/
GST_WARNING_OBJECT (playbin, "Combiner already configured");
} else {
/* Request playsink sink pad */
combine->sinkpad =
gst_play_sink_request_pad (playbin->playsink,
gst_play_sink_type_from_stream_type (combine->stream_type));
gst_object_ref (combine->sinkpad);
/* Create combiner if needed and link it */
create_combiner (playbin, combine);
if (combine->combiner) {
res = gst_pad_link (combine->srcpad, combine->sinkpad);
GST_DEBUG_OBJECT (playbin, "linked type %s, result: %d",
gst_stream_type_get_name (combine->stream_type), res);
if (res != GST_PAD_LINK_OK) {
GST_ELEMENT_ERROR (playbin, CORE, PAD,
("Internal playbin error."),
("Failed to link combiner to sink. Error %d", res));
}
/* Request playsink sink pad */
combine->sinkpad =
gst_play_sink_request_pad (playbin->playsink,
gst_play_sink_type_from_stream_type (combine->stream_type));
gst_object_ref (combine->sinkpad);
/* Create combiner if needed and link it */
create_combiner (playbin, combine);
if (combine->combiner) {
res = gst_pad_link (combine->srcpad, combine->sinkpad);
GST_DEBUG_OBJECT (playbin, "linked type %s, result: %d",
gst_stream_type_get_name (combine->stream_type), res);
if (res != GST_PAD_LINK_OK) {
GST_ELEMENT_ERROR (playbin, CORE, PAD,
("Internal playbin error."),
("Failed to link combiner to sink. Error %d", res));
}
}
}
}

View file

@ -951,7 +951,8 @@ demux_pad_events (GstPad * pad, GstPadProbeInfo * info, OutputSlotInfo * slot)
{
/* This is a temporary hack to notify downstream decodebin3 to *not*
* plug in an extra parsebin */
if (slot->linked_info && slot->linked_info->demuxer_is_parsebin) {
if (urisrc->is_adaptive || (slot->linked_info
&& slot->linked_info->demuxer_is_parsebin)) {
GstStructure *s;
GST_PAD_PROBE_INFO_DATA (info) = ev = gst_event_make_writable (ev);
s = (GstStructure *) gst_event_get_structure (ev);

View file

@ -26,6 +26,7 @@
#include <gst/check/gstcheck.h>
#include <gst/audio/audio.h>
#include <gst/audioconvert/gstaudioconvert.h>
/* For ease of programming we use globals to keep refs for our floating
* src and sink pads we create; otherwise we always have to do get_pad,
@ -82,6 +83,46 @@ setup_audioconvert (GstCaps * outcaps, gboolean use_mix_matrix,
return audioconvert;
}
static GstElement *
setup_audioconvert_with_input_channels_reorder (GstCaps * outcaps,
GstAudioConvertInputChannelsReorder reorder)
{
GstPadTemplate *sinktemplate;
static GstStaticPadTemplate srctemplate = GST_STATIC_PAD_TEMPLATE ("src",
GST_PAD_SRC,
GST_PAD_ALWAYS,
GST_STATIC_CAPS (CONVERT_CAPS_TEMPLATE_STRING)
);
GstElement *audioconvert;
ASSERT_CAPS_REFCOUNT (outcaps, "outcaps", 1);
sinktemplate =
gst_pad_template_new ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, outcaps);
GST_DEBUG ("setup_audioconvert with caps %" GST_PTR_FORMAT, outcaps);
audioconvert = gst_check_setup_element ("audioconvert");
g_object_set (G_OBJECT (audioconvert), "dithering", 0, NULL);
g_object_set (G_OBJECT (audioconvert), "noise-shaping", 0, NULL);
g_object_set (G_OBJECT (audioconvert), "input-channels-reorder",
reorder, "input-channels-reorder-mode",
GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MODE_UNPOSITIONED, NULL);
mysrcpad = gst_check_setup_src_pad (audioconvert, &srctemplate);
mysinkpad =
gst_check_setup_sink_pad_from_template (audioconvert, sinktemplate);
/* this installs a getcaps func that will always return the caps we set
* later */
gst_pad_use_fixed_caps (mysinkpad);
gst_pad_set_active (mysrcpad, TRUE);
gst_pad_set_active (mysinkpad, TRUE);
gst_object_unref (sinktemplate);
ASSERT_CAPS_REFCOUNT (outcaps, "outcaps", 2);
return audioconvert;
}
static void
cleanup_audioconvert (GstElement * audioconvert)
{
@ -94,6 +135,41 @@ cleanup_audioconvert (GstElement * audioconvert)
gst_check_teardown_element (audioconvert);
}
static gchar *
format_input_channels_reorder_test_name (const gchar * test_name,
GstAudioConvertInputChannelsReorder reorder)
{
const gchar *reorder_name = "unknown";
switch (reorder) {
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST:
reorder_name = "gst";
break;
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_SMPTE:
reorder_name = "smpte";
break;
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_CINE:
reorder_name = "cine";
break;
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AC3:
reorder_name = "ac3";
break;
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_AAC:
reorder_name = "aac";
break;
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_MONO:
reorder_name = "mono";
break;
case GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE:
reorder_name = "alternate";
break;
default:
break;
}
return g_strdup_printf ("%s with input channels %s reorder", test_name,
reorder_name);
}
/* returns a newly allocated caps */
static GstCaps *
get_int_caps (guint channels, gint endianness, guint width,
@ -159,6 +235,29 @@ get_float_caps (guint channels, gint endianness, guint width,
return caps;
}
static GstCaps *
get_unpositioned_input_caps (guint channels)
{
GstCaps *caps;
GstAudioInfo info;
gst_audio_info_init (&info);
gst_audio_info_set_format (&info,
gst_audio_format_build_integer (TRUE, G_BYTE_ORDER, 16, 16),
GST_AUDIO_DEF_RATE, channels, NULL);
info.layout = GST_AUDIO_LAYOUT_INTERLEAVED;
info.flags = GST_AUDIO_FLAG_UNPOSITIONED;
for (guint i = 0; i < MIN (64, channels); ++i)
info.position[i] = GST_AUDIO_CHANNEL_POSITION_NONE;
caps = gst_audio_info_to_caps (&info);
fail_unless (caps != NULL);
GST_DEBUG ("returning caps %" GST_PTR_FORMAT, caps);
return caps;
}
/* Copied from vorbis; the particular values used don't matter */
static GstAudioChannelPosition channelpositions[][6] = {
{ /* Mono */
@ -413,7 +512,8 @@ static void
verify_convert (const gchar * which, void *in, int inlength,
GstCaps * incaps, void *out, int outlength, GstCaps * outcaps,
GstFlowReturn expected_flow, gboolean in_place_allowed,
gboolean use_mix_matrix, GValue * mix_matrix)
gboolean use_mix_matrix, GValue * mix_matrix,
GstElement * custom_audioconvert)
{
GstBuffer *inbuffer, *outbuffer;
GstElement *audioconvert;
@ -423,8 +523,13 @@ verify_convert (const gchar * which, void *in, int inlength,
GST_DEBUG ("incaps: %" GST_PTR_FORMAT, incaps);
GST_DEBUG ("outcaps: %" GST_PTR_FORMAT, outcaps);
ASSERT_CAPS_REFCOUNT (incaps, "incaps", 1);
ASSERT_CAPS_REFCOUNT (outcaps, "outcaps", 1);
audioconvert = setup_audioconvert (outcaps, use_mix_matrix, mix_matrix);
if (custom_audioconvert) {
audioconvert = custom_audioconvert;
} else {
ASSERT_CAPS_REFCOUNT (outcaps, "outcaps", 1);
audioconvert = setup_audioconvert (outcaps, use_mix_matrix, mix_matrix);
}
ASSERT_CAPS_REFCOUNT (outcaps, "outcaps", 2);
fail_unless (gst_element_set_state (audioconvert,
@ -510,22 +615,36 @@ done:
#define RUN_CONVERSION(which, inarray, in_get_caps, outarray, out_get_caps) \
verify_convert (which, inarray, sizeof (inarray), \
in_get_caps, outarray, sizeof (outarray), out_get_caps, GST_FLOW_OK, \
TRUE, FALSE, &(GValue) G_VALUE_INIT);
TRUE, FALSE, &(GValue) G_VALUE_INIT, NULL);
#define RUN_CONVERSION_WITH_MATRIX(which, inarray, in_get_caps, outarray, out_get_caps, mix_matrix) \
verify_convert (which, inarray, sizeof (inarray), \
in_get_caps, outarray, sizeof (outarray), out_get_caps, GST_FLOW_OK, \
TRUE, TRUE, mix_matrix);
TRUE, TRUE, mix_matrix, NULL);
#define RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER(which, inarray, \
in_channels, reorder, outarray, out_channels) \
{ \
GstCaps *in_get_caps = get_unpositioned_input_caps (in_channels); \
GstCaps *out_get_caps = get_int_mc_caps (out_channels, G_BYTE_ORDER, 16, \
16, TRUE, GST_AUDIO_LAYOUT_INTERLEAVED, NULL); \
verify_convert ( \
which, inarray, sizeof (inarray), in_get_caps, outarray, \
sizeof (outarray), out_get_caps, GST_FLOW_OK, TRUE, FALSE, \
&(GValue) G_VALUE_INIT, \
setup_audioconvert_with_input_channels_reorder (out_get_caps, \
reorder)); \
}
#define RUN_CONVERSION_TO_FAIL(which, inarray, in_caps, outarray, out_caps) \
verify_convert (which, inarray, sizeof (inarray), \
in_caps, outarray, sizeof (outarray), out_caps, \
GST_FLOW_NOT_NEGOTIATED, TRUE, FALSE, &(GValue) G_VALUE_INIT);
GST_FLOW_NOT_NEGOTIATED, TRUE, FALSE, &(GValue) G_VALUE_INIT, NULL);
#define RUN_CONVERSION_NOT_INPLACE(which, inarray, in_get_caps, outarray, out_get_caps) \
verify_convert (which, inarray, sizeof (inarray), \
in_get_caps, outarray, sizeof (outarray), out_get_caps, GST_FLOW_OK, \
FALSE, FALSE, &(GValue) G_VALUE_INIT);
FALSE, FALSE, &(GValue) G_VALUE_INIT, NULL);
#define INTERLEAVED GST_AUDIO_LAYOUT_INTERLEAVED
#define PLANAR GST_AUDIO_LAYOUT_NON_INTERLEAVED
@ -1094,6 +1213,245 @@ GST_START_TEST (test_multichannel_conversion)
GST_END_TEST;
GST_START_TEST (test_multichannel_crossmixing_with_input_channels_reorder)
{
{
gint16 in[] = { 12400, -120 };
gint16 out[] = { 12400, -120 };
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("1 channel to 1", reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 1, reorder,
out, 1);
g_free (test_name);
}
}
{
gint16 in[] = { 12400, -120 };
gint16 out[] = { 12400, 12400, -120, -120 };
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("1 channel to 2", reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 1, reorder,
out, 2);
g_free (test_name);
}
}
{
gint16 in[] = { 12400, -120 };
gint16 out[] = { 12400, 12400, 8767, -120, -120, -85 };
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("1 channel to 3", reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 1, reorder,
out, 3);
g_free (test_name);
}
}
{
gint16 in[] = { 12400, -120, -10844, 5842 };
gint16 out[] = { 6140, -2501 };
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("2 channels to 1", reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 2, reorder,
out, 1);
g_free (test_name);
}
}
{
gint16 in[] = { 12400, -120, -10844, 5842 };
gint16 out[][4] = {
{12400, -120, -10844, 5842},
{12400, -120, -10844, 5842},
{12400, -120, -10844, 5842},
{12400, -120, -10844, 5842},
{12400, -120, -10844, 5842},
{6140, 6140, -2501, -2501},
{12400, -120, -10844, 5842}
};
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("2 channels to 2", reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 2, reorder,
out[reorder], 2);
g_free (test_name);
}
}
{
gint16 in[] = { 12400, -120, -10844, 5842 };
gint16 out[][6] = {
{8767, -85, 6140, -7667, 4130, -2501},
{8767, -85, 6140, -7667, 4130, -2501},
{8767, -85, 6140, -7667, 4130, -2501},
{8767, -85, 6140, -7667, 4130, -2501},
{8767, -85, 6140, -7667, 4130, -2501},
{6140, 6140, 4341, -2501, -2501, -1768},
{8767, -85, 6140, -7667, 4130, -2501}
};
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("2 channels to 3", reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 2, reorder,
out[reorder], 3);
g_free (test_name);
}
}
{
gint16 in[] = { 12400, -120, 1120, -10844, 5842, -48 };
gint16 out[][2] = {
{4825, -1859},
{4825, -1859},
{4462, -1682},
{4462, -1682},
{4462, -1682},
{4462, -1682},
{3320, 198}
};
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("3 channels to 1", reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 3, reorder,
out[reorder], 1);
g_free (test_name);
}
}
{
gint16 in[] = { 12400, -120, 1120, -10844, 5842, -48 };
gint16 out[][4] = {
{7717, 394, -6363, 3397},
{7717, 394, -6363, 3397},
{6760, 500, -5446, 2897},
{6760, 500, -5446, 2897},
{6760, 500, -5446, 2897},
{4462, 4462, -1682, -1682},
{6760, -120, -5446, 5842}
};
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("3 channels to 2", reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 3, reorder,
out[reorder], 2);
g_free (test_name);
}
}
{
gint16 in[] = { 12400, -120, 1120, -10844, 5842, -48 };
gint16 out[][6] = {
{12400, -120, 1120, -10844, 5842, -48},
{12400, -120, 1120, -10844, 5842, -48},
{6364, 471, 4462, -5127, 2727, -1682},
{6364, 471, 4462, -5127, 2727, -1682},
{6364, 471, 4462, -5127, 2727, -1682},
{4462, 4462, 3154, -1682, -1682, -1189},
{4780, -85, 3320, -3850, 4130, 198}
};
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("3 channels to 3", reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 3, reorder,
out[reorder], 3);
g_free (test_name);
}
}
}
GST_END_TEST;
GST_START_TEST
(test_multichannel_downmixing_to_stereo_with_input_channels_reorder) {
{
gint16 in[] =
{ 12400, -120, 1248, 10140, 368, -32124, 8145, 7411, -212, -5489, 18523,
10003
};
gint16 out[][4] = {
{7353, -1592, 3657, 2105},
{7353, -1592, 3657, 2105},
{-4296, -9713, 4755, 8254},
{-4596, -9588, 6430, 7555},
{-5746, -6837, 6362, 7716},
{-1343, -1343, 6372, 6372},
{4667, -7361, 8810, 3971}
};
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("5.1 channels to stereo",
reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 6, reorder,
out[reorder], 2);
g_free (test_name);
}
}
{
gint16 in[] =
{ 12400, -120, 1248, 10140, 368, -32124, 1247, -458, 8145, 7411, -212,
-5489, 18523, 10003, 789, -5557
};
gint16 out[][4] = {
{6739, -1645, 3467, 813},
{6739, -1645, 3467, 813},
{-1482, 260, 1921, 3242},
{-1617, 508, 2673, 1857},
{-3891, 1743, 2539, 1930},
{-912, -912, 4202, 4202},
{3816, -5640, 6811, 1592}
};
for (gint reorder = GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_GST;
reorder <= GST_AUDIO_CONVERT_INPUT_CHANNELS_REORDER_ALTERNATE;
++reorder) {
gchar *test_name =
format_input_channels_reorder_test_name ("7.1 channels to stereo",
reorder);
RUN_CONVERSION_WITH_INPUT_CHANNELS_REORDER (test_name, in, 8, reorder,
out[reorder], 2);
g_free (test_name);
}
}
}
GST_END_TEST;
GST_START_TEST (test_passthrough)
{
/* int 8 bit */
@ -1894,6 +2252,10 @@ audioconvert_suite (void)
tcase_add_test (tc_chain, test_float_conversion);
tcase_add_test (tc_chain, test_int_float_conversion);
tcase_add_test (tc_chain, test_multichannel_conversion);
tcase_add_test (tc_chain,
test_multichannel_crossmixing_with_input_channels_reorder);
tcase_add_test (tc_chain,
test_multichannel_downmixing_to_stereo_with_input_channels_reorder);
tcase_add_test (tc_chain, test_passthrough);
tcase_add_test (tc_chain, test_caps_negotiation);
tcase_add_test (tc_chain, test_convert_undefined_multichannel);

View file

@ -740,13 +740,13 @@ gst_adaptive_demux_output_slot_free (GstAdaptiveDemux * demux,
static OutputSlot *
gst_adaptive_demux_output_slot_new (GstAdaptiveDemux * demux,
GstStreamType streamtype)
GstAdaptiveDemuxTrack * track)
{
OutputSlot *slot;
GstPadTemplate *tmpl;
gchar *name;
switch (streamtype) {
switch (track->type) {
case GST_STREAM_TYPE_AUDIO:
name = g_strdup_printf ("audio_%02u", demux->priv->n_audio_streams++);
tmpl =
@ -770,7 +770,8 @@ gst_adaptive_demux_output_slot_new (GstAdaptiveDemux * demux,
}
slot = g_new0 (OutputSlot, 1);
slot->type = streamtype;
slot->type = track->type;
slot->track = gst_adaptive_demux_track_ref (track);
slot->pushed_timed_data = FALSE;
/* Create and activate new pads */
@ -778,10 +779,6 @@ gst_adaptive_demux_output_slot_new (GstAdaptiveDemux * demux,
g_free (name);
gst_object_unref (tmpl);
gst_element_add_pad (GST_ELEMENT_CAST (demux), slot->pad);
gst_flow_combiner_add_pad (demux->priv->flowcombiner, slot->pad);
gst_pad_set_active (slot->pad, TRUE);
gst_pad_set_query_function (slot->pad,
GST_DEBUG_FUNCPTR (gst_adaptive_demux_src_query));
gst_pad_set_event_function (slot->pad,
@ -789,6 +786,10 @@ gst_adaptive_demux_output_slot_new (GstAdaptiveDemux * demux,
gst_pad_set_element_private (slot->pad, slot);
gst_element_add_pad (GST_ELEMENT_CAST (demux), slot->pad);
gst_flow_combiner_add_pad (demux->priv->flowcombiner, slot->pad);
gst_pad_set_active (slot->pad, TRUE);
GST_INFO_OBJECT (demux, "Created output slot %s:%s",
GST_DEBUG_PAD_NAME (slot->pad));
return slot;
@ -2649,6 +2650,17 @@ gst_adaptive_demux_src_query (GstPad * pad, GstObject * parent,
GST_TIME_FORMAT, ret ? "TRUE" : "FALSE", GST_TIME_ARGS (duration));
break;
}
case GST_QUERY_CAPS:
{
OutputSlot *slot = gst_pad_get_element_private (pad);
if (slot->track && slot->track->generic_caps) {
GST_DEBUG_OBJECT (demux, "Answering caps query %" GST_PTR_FORMAT,
slot->track->generic_caps);
gst_query_set_caps_result (query, slot->track->generic_caps);
ret = TRUE;
}
break;
}
case GST_QUERY_LATENCY:{
gst_query_set_latency (query, FALSE, 0, -1);
ret = TRUE;
@ -3171,13 +3183,12 @@ check_and_handle_selection_update_locked (GstAdaptiveDemux * demux)
}
} else {
/* 2. There is no compatible replacement slot, create a new one */
slot = gst_adaptive_demux_output_slot_new (demux, track->type);
slot = gst_adaptive_demux_output_slot_new (demux, track);
GST_DEBUG_OBJECT (demux, "Created slot for track '%s'", track->id);
demux->priv->outputs = g_list_append (demux->priv->outputs, slot);
track->update_next_segment = TRUE;
slot->track = gst_adaptive_demux_track_ref (track);
track->active = TRUE;
gst_adaptive_demux_send_initial_events (demux, slot);
}

View file

@ -1202,7 +1202,7 @@ gst_soup_http_src_session_open (GstSoupHTTPSrc * src)
/* now owned by the loop */
g_main_context_unref (ctx);
src->session->thread = g_thread_try_new ("souphttpsrc-thread",
src->session->thread = g_thread_try_new ("souphttpsrc",
thread_func, src, NULL);
if (!src->session->thread) {

View file

@ -45,35 +45,53 @@ pylib_loc = get_option('libpython-dir')
fsmod = import('fs')
pylib_prefix = 'lib'
pylib_suffix = 'so'
pylib_ver = python_dep.version()
pylib_locs = []
if host_system == 'windows'
if cc.get_argument_syntax() == 'msvc'
pylib_prefix = ''
endif
pylib_suffix = 'dll'
pylib_ver = pylib_ver.replace('.', '')
elif host_system == 'darwin'
pylib_suffix = 'dylib'
endif
pylib_fnames = []
# Library name with soversion, non-devel package
pylib_fnames += python.get_variable('INSTSONAME', [])
if python.has_variable('INSTSONAME')
# For example, libpython3.12.so.1.0 (Linux), libpython3.11.dll.a (MSYS2), etc.
pylib_fnames += python.get_variable('INSTSONAME')
endif
# Library name without soversion, devel package, framework, etc.
pylib_fnames += python.get_variable('LDLIBRARY', [])
if python.has_variable('LDLIBRARY')
# For example, libpython3.12.so (Linux), libpython3.11.dll.a (MSYS2), etc.
pylib_fnames += python.get_variable('LDLIBRARY')
endif
# Manually construct name as a fallback
pylib_fnames += [
pylib_prefix + 'python' + python_dep.version() + python_abi_flags + '.' + pylib_suffix
pylib_prefix + 'python' + pylib_ver + python_abi_flags + '.' + pylib_suffix
]
if pylib_loc != ''
pylib_locs = [pylib_loc]
else
pylib_locs = [
python.get_variable('LIBDIR', ''),
python.get_variable('LIBPL', ''),
]
if python.has_variable('LIBDIR')
pylib_locs += python.get_variable('LIBDIR')
endif
if python.has_variable('LIBPL')
pylib_locs += python.get_variable('LIBPL')
endif
# On Windows, python312.dll is in the rootdir where Python is installed,
# which is configured as the "prefix" in sysconfig.
if host_system == 'windows'
pylib_locs += python.get_variable('prefix')
endif
endif
pylib_fname = ''
foreach loc: pylib_locs
foreach fname: pylib_fnames
if fsmod.exists(loc / fname)
fpath = loc / fname
debug(f'Looking for Python library at: @fpath@')
if fsmod.exists(fpath)
pylib_fname = fname
message(f'PY_LIB_FNAME = @fname@ (@loc@)')
break
@ -81,12 +99,7 @@ foreach loc: pylib_locs
endforeach
endforeach
if pylib_fname == ''
error_msg = 'Could not find python library to load'
if python_opt.enabled()
error(error_msg)
else
message(error_msg)
endif
message('Could not find python library to load, will try loading at runtime')
endif
pygi_override_dir = get_option('pygi-overrides-dir')

View file

@ -4014,13 +4014,24 @@ mark_event_not_received (GstPad * pad, PadEvent * ev, gpointer user_data)
* @pad: a #GstPad
* @offset: the offset
*
* Set the offset that will be applied to the running time of @pad.
* Set the offset that will be applied to the running time of @pad. Upon the
* next buffer, every sticky event (notably the segment) will be pushed again
* with its running time adjusted. For that reason this is only reliable on
* source pads.
*/
void
gst_pad_set_offset (GstPad * pad, gint64 offset)
{
g_return_if_fail (GST_IS_PAD (pad));
/* Setting pad offset on a sink pad does not work reliably:
* https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6464 */
if (GST_PAD_IS_SINK (pad)) {
/* Make it a non-fatal warning for backward compatibility. */
GST_WARNING_OBJECT (pad,
"Setting pad offset only works reliably on source pads");
}
GST_OBJECT_LOCK (pad);
/* if nothing changed, do nothing */
if (pad->offset == offset)
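A minimal sketch of the usage the new warning points to, applying the offset on a source pad (assumes an element exposing a static "src" pad; the 500 ms value is arbitrary):
/* Shift everything leaving this source pad 500 ms later in running time;
 * the adjusted segment is re-pushed with the next buffer. */
static void
delay_source_by_500ms (GstElement * src)
{
  GstPad *srcpad = gst_element_get_static_pad (src, "src");

  gst_pad_set_offset (srcpad, 500 * GST_MSECOND);
  gst_object_unref (srcpad);
}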

View file

@ -1844,8 +1844,10 @@ gst_base_src_perform_seek (GstBaseSrc * src, GstEvent * event, gboolean unlock)
gst_element_post_message (GST_ELEMENT (src), message);
}
GST_OBJECT_LOCK (src);
src->priv->segment_pending = TRUE;
src->priv->segment_seqnum = seqnum;
GST_OBJECT_UNLOCK (src);
}
src->priv->discont = TRUE;
@ -1905,9 +1907,9 @@ gst_base_src_send_event (GstElement * element, GstEvent * event)
/* For external flush, restart the task .. */
GST_LIVE_LOCK (src);
src->priv->segment_pending = TRUE;
GST_OBJECT_LOCK (src->srcpad);
src->priv->segment_pending = TRUE;
start = (GST_PAD_MODE (src->srcpad) == GST_PAD_MODE_PUSH);
GST_OBJECT_UNLOCK (src->srcpad);
@ -2885,6 +2887,7 @@ gst_base_src_loop (GstPad * pad)
gboolean eos;
guint blocksize;
GList *pending_events = NULL, *tmp;
GstEvent *seg_event = NULL;
eos = FALSE;
@ -2973,14 +2976,17 @@ gst_base_src_loop (GstPad * pad)
/* push events to close/start our segment before we push the buffer. */
if (G_UNLIKELY (src->priv->segment_pending)) {
GstEvent *seg_event = gst_event_new_segment (&src->segment);
/* generate the event but do not send until outside of live_lock */
seg_event = gst_event_new_segment (&src->segment);
GST_OBJECT_LOCK (src);
gst_event_set_seqnum (seg_event, src->priv->segment_seqnum);
src->priv->segment_seqnum = gst_util_seqnum_next ();
gst_pad_push_event (pad, seg_event);
src->priv->segment_pending = FALSE;
GST_OBJECT_UNLOCK (src);
}
/* collect any pending events */
if (g_atomic_int_get (&src->priv->have_events)) {
GST_OBJECT_LOCK (src);
/* take the events */
@ -2989,8 +2995,13 @@ gst_base_src_loop (GstPad * pad)
g_atomic_int_set (&src->priv->have_events, FALSE);
GST_OBJECT_UNLOCK (src);
}
GST_LIVE_UNLOCK (src);
/* Push out pending events if any */
/* now outside the live_lock we can push the segment event */
if (G_UNLIKELY (seg_event))
gst_pad_push_event (pad, seg_event);
/* and the pending events if any */
if (G_UNLIKELY (pending_events != NULL)) {
for (tmp = pending_events; tmp; tmp = g_list_next (tmp)) {
GstEvent *ev = (GstEvent *) tmp->data;
@ -3070,7 +3081,6 @@ gst_base_src_loop (GstPad * pad)
GST_BUFFER_FLAG_SET (buf, GST_BUFFER_FLAG_DISCONT);
src->priv->discont = FALSE;
}
GST_LIVE_UNLOCK (src);
/* push buffer or buffer list */
if (src->priv->pending_bufferlist != NULL) {