Hide most symbols of the GstD3D11Memory object.
GstD3D11Memory is one of the primary resources of the incoming d3d11 library
and it is expected to be an extensible feature.
Hiding the implementation details will be helpful for later use cases.
Summary of this commit:
* All native Direct3D11 resources are now private to GstD3D11Memory.
To access the native resources, the getter methods need to be used
or the generic map API (e.g., gst_memory_map) should be called,
apart from a few exceptional cases such as the d3d11decoder
(see the sketch below).
* Various helper methods are added for GstBuffer-related operations
and to remove duplicated code.
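Access would then look roughly like the following sketch; the getter name,
map flag, and header path are taken as assumptions from this commit's
description and may not match the final API exactly:

  #include <gst/gst.h>
  #include <d3d11.h>
  #include <gst/d3d11/gstd3d11.h>   /* assumed install path of the d3d11 library header */

  /* Option 1: dedicated getter (borrowed pointer, no ownership transferred) */
  static ID3D11Texture2D *
  get_texture_via_getter (GstMemory * mem)
  {
    if (!gst_is_d3d11_memory (mem))
      return NULL;

    return gst_d3d11_memory_get_texture_handle (GST_D3D11_MEMORY_CAST (mem));
  }

  /* Option 2: generic map API with a D3D11-specific map flag */
  static ID3D11Texture2D *
  get_texture_via_map (GstMemory * mem)
  {
    GstMapInfo info;
    ID3D11Texture2D *texture = NULL;

    if (!gst_memory_map (mem, &info,
            (GstMapFlags) (GST_MAP_READ | GST_MAP_D3D11)))
      return NULL;

    /* info.data carries the native resource when mapped with GST_MAP_D3D11 */
    texture = (ID3D11Texture2D *) info.data;
    gst_memory_unmap (mem, &info);

    return texture;
  }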
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1892>
... instead of in the READY state. READY is too early for setting the
overlay window handle, especially in the playbin/playsink scenario,
since playsink will set the given overlay handle on the videosink only
once the videosink's READY state change has completed.
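For reference, a minimal application-side sketch of handing a window handle
to playbin through the GstVideoOverlay interface (playbin proxies the
interface to its videosink); the handle value comes from the application's
windowing code:

  #include <gst/gst.h>
  #include <gst/video/videooverlay.h>

  static void
  attach_window (GstElement * playbin, guintptr window_handle)
  {
    /* playbin implements GstVideoOverlay and forwards the handle to the
     * videosink; per this change the sink is expected to apply it no
     * earlier than its READY->PAUSED transition. */
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (playbin),
        window_handle);
  }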
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1893>
Unlike a software MFT (Media Foundation Transform), which is synchronous
in terms of processing input and output data, a hardware MFT works
in asynchronous mode: output data might not be available right after
we push input data into the MFT.
Note that an async MFT fires two events: "METransformNeedInput",
which is raised when the MFT can accept more input data,
and "METransformHaveOutput", which signals that there is
pending data that can be output immediately.
To listen for these events, we can either wait synchronously via
IMFMediaEventGenerator::GetEvent() or use an IMFAsyncCallback
object, which is the asynchronous way; in that case events are
notified from Media Foundation's internal worker queue thread.
To handle this asynchronous operation, the previous flow was
as follows (using IMFMediaEventGenerator::GetEvent()); a sketch of
the event-pulling step follows the list:
- Check whether there is pending output data and push it downstream.
- Pull events (from the streaming thread) until there is at least
one pending "METransformNeedInput" event.
- Then push one input buffer into the MFT from the streaming thread.
- Check again whether there is a pending "METransformHaveOutput" event.
If there is, push the new output data downstream
(it is unlikely that output data is pending at this point).
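A minimal sketch of that event-pulling step with the synchronous API,
assuming `gen` is the hardware MFT queried for its IMFMediaEventGenerator
interface:

  #include <windows.h>
  #include <mfapi.h>
  #include <mftransform.h>
  #include <wrl/client.h>

  using Microsoft::WRL::ComPtr;

  /* Block until the MFT signals it can accept input; count any output
   * that became ready meanwhile so the caller can drain it afterwards. */
  static HRESULT
  wait_for_need_input (IMFMediaEventGenerator * gen, UINT * have_output_count)
  {
    *have_output_count = 0;

    while (true) {
      ComPtr<IMFMediaEvent> event;
      HRESULT hr = gen->GetEvent (0, &event);   /* blocks the streaming thread */
      if (FAILED (hr))
        return hr;

      MediaEventType type = MEUnknown;
      hr = event->GetType (&type);
      if (FAILED (hr))
        return hr;

      if (type == METransformNeedInput)
        return S_OK;                  /* caller may now call ProcessInput() */
      if (type == METransformHaveOutput)
        (*have_output_count)++;       /* caller drains these via ProcessOutput() */
    }
  }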
The above flow ran on the upstream streaming thread. That means
even if output data is available, it might only be output later,
when the next buffer is pushed from the upstream streaming thread.
This introduces at least one frame of latency in the case of a live stream.
To reduce that latency, this commit makes the flow fully
asynchronous, the way hardware MFTs were designed to be used, so that
encoded data can be output whenever it becomes available. More
specifically, an IMFAsyncCallback object is used to handle the
"METransformNeedInput" and "METransformHaveOutput" events from
Media Foundation's internal thread, and new output data is also
pushed downstream from that Media Foundation thread.
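A minimal sketch of such a callback object, assuming `gen` is the MFT
queried for its IMFMediaEventGenerator interface; the actual
ProcessInput()/ProcessOutput() handling is left as comments and the class
name is illustrative:

  #include <windows.h>
  #include <mfapi.h>
  #include <mftransform.h>
  #include <wrl/client.h>

  using Microsoft::WRL::ComPtr;

  class MFTEventCallback : public IMFAsyncCallback
  {
  public:
    explicit MFTEventCallback (IMFMediaEventGenerator * gen)
        : ref_count_ (1), gen_ (gen) {}

    /* IUnknown */
    STDMETHODIMP QueryInterface (REFIID riid, void ** obj)
    {
      if (riid == IID_IUnknown || riid == __uuidof (IMFAsyncCallback)) {
        *obj = static_cast<IMFAsyncCallback *> (this);
        AddRef ();
        return S_OK;
      }
      *obj = nullptr;
      return E_NOINTERFACE;
    }
    STDMETHODIMP_ (ULONG) AddRef () { return InterlockedIncrement (&ref_count_); }
    STDMETHODIMP_ (ULONG) Release ()
    {
      ULONG count = InterlockedDecrement (&ref_count_);
      if (count == 0)
        delete this;
      return count;
    }

    /* IMFAsyncCallback */
    STDMETHODIMP GetParameters (DWORD *, DWORD *) { return E_NOTIMPL; }

    STDMETHODIMP Invoke (IMFAsyncResult * result)
    {
      ComPtr<IMFMediaEvent> event;
      HRESULT hr = gen_->EndGetEvent (result, &event);
      if (FAILED (hr))
        return hr;

      MediaEventType type = MEUnknown;
      event->GetType (&type);

      if (type == METransformNeedInput) {
        /* Pop one queued input and feed it via IMFTransform::ProcessInput() */
      } else if (type == METransformHaveOutput) {
        /* Call IMFTransform::ProcessOutput() and push the encoded data
         * downstream directly from this Media Foundation thread */
      }

      /* Re-arm so the next event is delivered to this callback as well */
      return gen_->BeginGetEvent (this, nullptr);
    }

    /* Request delivery of the first event */
    HRESULT Start () { return gen_->BeginGetEvent (this, nullptr); }

  private:
    volatile LONG ref_count_;
    ComPtr<IMFMediaEventGenerator> gen_;
  };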
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1520>
When carrying over existing GstStream objects to a new GstStreamCollection we
need to check whether they *actually* were being used in the previous
collection. This avoids adding unknown streams (metadata, PSI, etc.) to the
collection on updates.
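A minimal sketch of that kind of filtering; the stream_was_in_use()
predicate is a placeholder for whatever per-stream state the demuxer
actually tracks:

  #include <gst/gst.h>

  /* Placeholder predicate: a real demuxer would consult its own
   * bookkeeping; here unknown-typed streams (metadata, PSI, ...) are
   * treated as unused. */
  static gboolean
  stream_was_in_use (GstStream * stream)
  {
    return gst_stream_get_stream_type (stream) != GST_STREAM_TYPE_UNKNOWN;
  }

  static GstStreamCollection *
  carry_over_streams (GstStreamCollection * old_collection,
      const gchar * upstream_id)
  {
    GstStreamCollection *collection = gst_stream_collection_new (upstream_id);
    guint i, len = gst_stream_collection_get_size (old_collection);

    for (i = 0; i < len; i++) {
      GstStream *stream = gst_stream_collection_get_stream (old_collection, i);

      /* Only carry over streams that were actually used previously */
      if (stream_was_in_use (stream))
        gst_stream_collection_add_stream (collection,
            (GstStream *) gst_object_ref (stream));
    }

    return collection;
  }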
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1880>
Instead of implementing exactly the same thing ourselves while leaving
`GstBus` unaware that this is the case.
Since we are *sure* that the bus cannot have been accessed at the point
where we add the watch, we are guaranteed that the current thread's
main context is going to be used.
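For reference, a minimal sketch of the pattern; the bus_message_cb()
handler is hypothetical, and gst_bus_add_watch() attaches its source to the
thread-default main context of the calling thread:

  #include <gst/gst.h>

  static gboolean
  bus_message_cb (GstBus * bus, GstMessage * message, gpointer user_data)
  {
    /* handle the message ... */
    return G_SOURCE_CONTINUE;
  }

  static void
  attach_bus_watch (GstElement * pipeline)
  {
    GstBus *bus = gst_element_get_bus (pipeline);

    /* As long as nothing has touched the bus yet, the watch ends up in
     * the calling thread's main context, and GstBus knows a watch is
     * installed (unlike an equivalent, manually attached GSource). */
    gst_bus_add_watch (bus, bus_message_cb, NULL);
    gst_object_unref (bus);
  }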
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1870>
Otherwise, when rtmp2src cancels an in-flight operation that has a queued
message stored, the RTMP connection operation is not stopped.
If the cancellation occurs during RTMP connection start-up, then
rtmp2src does not have any way of accessing the connection object, as it
has not been returned yet. As a result, rtmp2src will cancel, the
connection will still be processing things, and the
GMainContext/GMainLoop associated with the outstanding operation will be
destroyed. All outstanding operations and the rtmpconnection object will
therefore be leaked in this case.
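A minimal sketch of the general pattern for covering that window, with a
hypothetical context structure; this illustrates the idea rather than the
exact fix:

  #include <gio/gio.h>

  /* Hypothetical state for a connection attempt that is still in flight
   * and has not yet been handed back to the caller. */
  typedef struct {
    GMainLoop *loop;   /* loop driving the in-progress operation */
  } ConnectAttempt;

  static void
  on_cancelled (GCancellable * cancellable, gpointer user_data)
  {
    ConnectAttempt *attempt = (ConnectAttempt *) user_data;

    /* Even though the caller never received the connection object, the
     * cancellation still has to stop the in-progress machinery; otherwise
     * the context/loop and the connection leak. */
    g_main_loop_quit (attempt->loop);
  }

  static gulong
  watch_cancellation (ConnectAttempt * attempt, GCancellable * cancellable)
  {
    return g_cancellable_connect (cancellable, G_CALLBACK (on_cancelled),
        attempt, NULL);
  }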
Fixes: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/issues/1425
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1862>
This function takes the sock lock. This can result in a deadlock when
another thread holding the sock lock is trying to take the object lock.
Thread A (Holds object lock, wants sock lock):
#2 gst_srt_object_get_stats at gst-plugins-bad/ext/srt/gstsrtobject.c:1753
#3 gst_srt_object_get_property_helper at gst-plugins-bad/ext/srt/gstsrtobject.c:409
#4 gst_srt_sink_get_property at gst-plugins-bad/ext/srt/gstsrtsink.c:95
#5 g_object_get_property from libgobject-2.0.so.0
Thread B (Holds sock lock, wants object lock):
#2 gst_element_post_message_default at gstreamer/gst/gstelement.c:2069
#3 gst_element_post_message at gstreamer/gst/gstelement.c:2123
#4 gst_element_message_full_with_details at gstreamer/gst/gstelement.c:2259
#5 gst_element_message_full at gstreamer/gst/gstelement.c:2298
#6 gst_srt_object_send_headers at gst-plugins-bad/ext/srt/gstsrtobject.c:1407
#7 gst_srt_object_send_headers at gst-plugins-bad/ext/srt/gstsrtobject.c:1444
#8 gst_srt_object_write_to_callers at gst-plugins-bad/ext/srt/gstsrtobject.c:1444
#9 gst_srt_object_write at gst-plugins-bad/ext/srt/gstsrtobject.c:1598
#10 gst_srt_sink_render at gst-plugins-bad/ext/srt/gstsrtsink.c:179
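For illustration, a minimal sketch of this kind of lock-order inversion
(generic names, not the actual srtobject code):

  #include <glib.h>

  static GMutex object_lock;   /* stands in for the GstObject lock */
  static GMutex sock_lock;     /* stands in for the SRT socket lock */

  static gpointer
  thread_a (gpointer data)
  {
    g_mutex_lock (&object_lock);   /* e.g. property getter */
    g_mutex_lock (&sock_lock);     /* e.g. reading socket stats: can block */
    g_mutex_unlock (&sock_lock);
    g_mutex_unlock (&object_lock);
    return NULL;
  }

  static gpointer
  thread_b (gpointer data)
  {
    g_mutex_lock (&sock_lock);     /* e.g. writing to callers */
    g_mutex_lock (&object_lock);   /* e.g. posting an element message: can block */
    g_mutex_unlock (&object_lock);
    g_mutex_unlock (&sock_lock);
    return NULL;
  }

  /* If these run concurrently, each thread can end up holding one lock and
   * waiting forever for the other: a classic ABBA deadlock. */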
Fixes d2d00e07ac.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1861>
Various software, including ffmpeg's Decklink support, fails to parse CDP
packets that contain anything but CC data.
Depending on this property, timecodes are not written into the CDP packets
even if they are present.
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1833>
Add a new property "render-stats" to allow rendering statistics
data on the window for debugging and/or development purposes.
Text rendering is GPU-accelerated, since this implementation
uses the Direct2D/DirectWrite API and Direct3D interop for minimal overhead.
Specifically, the text is rendered onto the swapchain backbuffer
directly, without any copy or allocation of an extra texture.
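A minimal sketch of that interop path; it assumes the D3D11 device was
created with BGRA support and that the swapchain backbuffer format is
D2D-compatible, and the font/rectangle values are illustrative:

  #include <windows.h>
  #include <d2d1.h>
  #include <dwrite.h>
  #include <dxgi.h>
  #include <wrl/client.h>
  #include <wchar.h>

  using Microsoft::WRL::ComPtr;

  /* Draw a line of statistics text directly onto the swapchain backbuffer,
   * avoiding any intermediate texture copy. */
  static HRESULT
  draw_stats (IDXGISwapChain * swapchain, ID2D1Factory * d2d_factory,
      IDWriteFactory * dwrite_factory, const wchar_t * text)
  {
    ComPtr<IDXGISurface> backbuffer;
    HRESULT hr = swapchain->GetBuffer (0, IID_PPV_ARGS (&backbuffer));
    if (FAILED (hr))
      return hr;

    /* Wrap the backbuffer in a Direct2D render target (no extra texture) */
    D2D1_RENDER_TARGET_PROPERTIES props = D2D1::RenderTargetProperties (
        D2D1_RENDER_TARGET_TYPE_DEFAULT,
        D2D1::PixelFormat (DXGI_FORMAT_UNKNOWN, D2D1_ALPHA_MODE_PREMULTIPLIED));

    ComPtr<ID2D1RenderTarget> target;
    hr = d2d_factory->CreateDxgiSurfaceRenderTarget (backbuffer.Get (),
        &props, &target);
    if (FAILED (hr))
      return hr;

    ComPtr<IDWriteTextFormat> format;
    hr = dwrite_factory->CreateTextFormat (L"Consolas", nullptr,
        DWRITE_FONT_WEIGHT_NORMAL, DWRITE_FONT_STYLE_NORMAL,
        DWRITE_FONT_STRETCH_NORMAL, 14.0f, L"en-us", &format);
    if (FAILED (hr))
      return hr;

    ComPtr<ID2D1SolidColorBrush> brush;
    hr = target->CreateSolidColorBrush (D2D1::ColorF (D2D1::ColorF::White),
        &brush);
    if (FAILED (hr))
      return hr;

    target->BeginDraw ();
    target->DrawText (text, (UINT32) wcslen (text), format.Get (),
        D2D1::RectF (8.0f, 8.0f, 600.0f, 200.0f), brush.Get ());
    return target->EndDraw ();
  }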
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1830>
Context creation and retrieval are required, and the symbols are exported,
but the header was missing. Users most likely define GST_USE_UNSTABLE_API,
so they are aware of the implications of using a header that might change
between releases.
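For reference, a minimal usage sketch; the header path below is a
placeholder, not the name of the header installed by this commit:

  /* Acknowledge that the API in the unstable header may change
   * between releases before including it. */
  #define GST_USE_UNSTABLE_API

  #include <gst/gst.h>
  /* #include <gst/.../context.h>   placeholder for the newly installed header */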
Signed-off-by: Marius Vlad <marius.vlad@collabora.com>
Part-of: <https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/-/merge_requests/1688>
Instead of waiting, so that we can simply use a clocksync element as
the filter; otherwise we won't know that the pipeline is live, as it won't
return NO_PREROLL as one would expect in that case.
Adding it right away shouldn't create any issue; both ways are fine.
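For reference, a minimal sketch of how liveness shows up through the
state-change return value:

  #include <gst/gst.h>

  static gboolean
  pipeline_is_live (GstElement * pipeline)
  {
    /* A live pipeline reports NO_PREROLL when asked to go to PAUSED */
    GstStateChangeReturn ret = gst_element_set_state (pipeline, GST_STATE_PAUSED);

    return ret == GST_STATE_CHANGE_NO_PREROLL;
  }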