Commit graph

14 commits

Olivier Crête
4295386804 tensors: Use full GstTensorDataType type name in type members
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6000>
2024-11-08 14:58:49 +00:00
Daniel Morin
7c925eae61 analytics: Move batch to GstTensor
- batch_size is required to interpret the tensor, depending on the tensor format.
  Batches are not necessarily separate memory planes, so it is preferable to keep
  batch_size inside GstTensor.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6000>
2024-11-08 14:58:49 +00:00
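
For illustration, a minimal sketch of a GstTensor that carries its own batch size (field names here are assumptions, not the exact upstream layout):

    /* Illustrative sketch only -- the real GstTensor lives in the GStreamer
     * analytics library and its exact fields may differ. */
    typedef struct _GstTensor
    {
      GQuark id;                    /* identifier, e.g. the model output name */
      gsize batch_size;             /* kept here: batches are not necessarily
                                     * separate memory planes, and the value is
                                     * needed to interpret the data */
      GstTensorDataType data_type;  /* member typed with the full enum name */
      GstBuffer *data;              /* raw tensor bytes */
      gsize num_dims;
      gsize *dims;                  /* dims[num_dims] */
    } GstTensor;
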
Daniel Morin
43c7e524ce analytics: Decouple GstTensor from GstTensorMeta
- To support transporting tensors as GstMeta, Analytics-Meta and Media, we need to
  decouple GstTensor from GstTensorMeta.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6000>
2024-11-08 14:58:49 +00:00
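
A minimal sketch of what the decoupling implies (assumed shapes, not the exact upstream definitions): GstTensorMeta becomes a thin container that only points at independently allocated GstTensor objects, so the same tensors can also travel attached to analytics meta or as media:

    /* Illustrative assumption of the decoupled layout. */
    typedef struct _GstTensorMeta
    {
      GstMeta meta;         /* parent GstMeta */
      gsize num_tensors;
      GstTensor **tensors;  /* tensors are allocated and owned separately, so
                             * they can also be carried outside this meta */
    } GstTensorMeta;
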
Olivier Crête
13de5160be onnx: Add more tensor data types
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/6001>
2024-02-02 18:43:21 -05:00
Olivier Crête
b1ac114ca5 onnxinference: Return caps based on model preference when possible
This should enable zero-copy when the model has the right type

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5885>
2024-01-13 22:29:41 +00:00
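
A hedged sketch of the idea (not the actual implementation, and direction handling is omitted): in the transform_caps vfunc, offer caps matching the model's preferred input format first, so upstream can negotiate it directly and the frame can be fed to the model without conversion or copy. Names such as self->model_format and the GST_ONNX_INFERENCE cast are assumptions.

    /* Sketch: prefer the model's native format during caps negotiation. */
    static GstCaps *
    gst_onnx_inference_transform_caps (GstBaseTransform * trans,
        GstPadDirection direction, GstCaps * caps, GstCaps * filter)
    {
      GstOnnxInference *self = GST_ONNX_INFERENCE (trans);
      GstCaps *preferred, *result;

      /* self->model_format: the video format the model expects (assumption) */
      preferred = gst_caps_new_simple ("video/x-raw",
          "format", G_TYPE_STRING, self->model_format, NULL);

      /* Offer the preferred format first, then whatever else we can convert. */
      result = gst_caps_merge (preferred, gst_caps_copy (caps));

      if (filter) {
        GstCaps *tmp =
            gst_caps_intersect_full (filter, result, GST_CAPS_INTERSECT_FIRST);
        gst_caps_unref (result);
        result = tmp;
      }

      return result;
    }
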
Olivier Crête
83c2d30438 onnx: Use the element pointer for debug message
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5885>
2024-01-13 22:29:41 +00:00
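
For reference, the kind of change this describes (generic example, not the exact line touched): logging through the element pointer ties the message to the instance that emitted it.

    /* Before: no object context attached to the log line */
    GST_DEBUG ("running inference on frame");

    /* After: the message is associated with this element instance */
    GST_DEBUG_OBJECT (self, "running inference on frame");
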
Olivier Crête
54b361c554 onnx: Extract data type from the model itself
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5885>
2024-01-13 22:29:41 +00:00
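
A hedged sketch of reading the element type directly from the model with the ONNX Runtime C++ API (the real code may query it elsewhere or differently):

    // Sketch: query the model's first input tensor type instead of assuming it.
    #include <onnxruntime_cxx_api.h>

    static ONNXTensorElementDataType
    query_model_input_type (Ort::Session & session)
    {
      Ort::TypeInfo type_info = session.GetInputTypeInfo (0);
      auto tensor_info = type_info.GetTensorTypeAndShapeInfo ();
      return tensor_info.GetElementType ();  // e.g. ONNX_TENSOR_ELEMENT_DATA_TYPE_FLOAT
    }
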
Olivier Crête
e19428a802 onnx: Update build instructions to use onnx-runtime 0.16.3
This synchronizes it with the meson.build file

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5861>
2023-12-22 14:43:23 -05:00
Aaron Boxer
e2ee207367 onnx: add README outlining install and test instructions
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5816>
2023-12-20 14:48:41 -05:00
Daniel Morin
d3ebaa316d onnx: avoid leak on failure and cleanup
- Unmap buffer on Ort exception
- Avoid retrieving unused videometa

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5761>
2023-12-05 16:54:45 +00:00
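
A minimal sketch of the cleanup described above (assumed code shape; run_inference is a hypothetical helper): the mapped buffer must be unmapped even when ONNX Runtime throws.

    GstMapInfo map;

    if (!gst_buffer_map (buf, &map, GST_MAP_READ))
      return GST_FLOW_ERROR;

    try {
      run_inference (map.data, map.size);    /* hypothetical helper */
    } catch (const Ort::Exception & e) {
      GST_ERROR_OBJECT (self, "ONNX Runtime error: %s", e.what ());
      gst_buffer_unmap (buf, &map);          /* previously leaked on this path */
      return GST_FLOW_ERROR;
    }

    gst_buffer_unmap (buf, &map);
    return GST_FLOW_OK;
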
Daniel Morin
15e5866e51 onnx: add offset and scale properties
- Offset each datapoint by the value set on offset property.
- Scale each datapoint by the value set on scale property.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5761>
2023-12-05 16:54:45 +00:00
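
Illustratively (property names and the order of operations are assumptions), each input datapoint x becomes (x + offset) * scale before it is fed to the model, with both values read from the new properties:

    /* Sketch of the per-datapoint normalization; ordering is an assumption. */
    static inline gfloat
    normalize_datapoint (guint8 x, gfloat offset, gfloat scale)
    {
      return ((gfloat) x + offset) * scale;
    }

With hypothetical property values, setting offset to -127.5 and scale to 1/127.5 would map 8-bit input to roughly [-1, 1].
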
Daniel Morin
48e3836482 onnx: Add support for float datatype
This is a bit of a hack, as I think the correct solution is to
expose the model caps on the sink pad (eventually sink pads). Until then, I think
this is reasonable.

- Add a property to onnxinference to set datatype.
- Fix internal buffer allocation size based on datatype.
- Extract a method that removes the alpha channel and converts to a planar image
  when requested. Also template the method to support writing to buffers
  of different data types.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/5761>
2023-12-05 16:54:45 +00:00
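
A rough sketch of the extracted conversion (layout and helper name are assumptions): drop the alpha channel of an interleaved RGBA frame and write a planar image, with the destination type templated so the same code can fill uint8 or float buffers.

    /* Sketch: interleaved RGBA (HWC) -> planar RGB (CHW), templated on the
     * destination type so float output is supported. */
    template <typename T>
    static void
    convert_to_planar (const guint8 * src, T * dst, guint width, guint height)
    {
      const gsize plane = (gsize) width * height;

      for (gsize i = 0; i < plane; i++) {
        dst[0 * plane + i] = (T) src[i * 4 + 0];  /* R */
        dst[1 * plane + i] = (T) src[i * 4 + 1];  /* G */
        dst[2 * plane + i] = (T) src[i * 4 + 2];  /* B */
        /* src[i * 4 + 3] is the alpha channel and is dropped */
      }
    }

The internal buffer then needs plane * 3 * sizeof (T) bytes, which is the allocation-size fix mentioned in the commit.
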
Olivier Crête
d0b587eb15 onnx: Remove enums file
Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4916>
2023-10-20 00:33:29 +00:00
Aaron Boxer
1ff585233a onnx: add gstonnxinference element
This element refactors functionality out of the existing ONNX element,
namely separating out the ONNX inference from the subsequent analysis.

The new element runs an ONNX model on each video frame, and then
attaches a TensorMeta meta with the output tensor data. This tensor data
will then be consumed by downstream elements such as gstobjectdetector.

At the moment, a provisional TensorMeta is used just in the ONNX
plugin, but in the future this will be upgraded to a GStreamer API for other
plugins to consume.

Part-of: <https://gitlab.freedesktop.org/gstreamer/gstreamer/-/merge_requests/4916>
2023-10-20 00:33:29 +00:00
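
To illustrate the intended flow (the meta was still plugin-internal here, so GST_TENSOR_META_API_TYPE and the access pattern below are assumptions): onnxinference attaches the meta to each frame, and a downstream element such as gstobjectdetector reads it back off the buffer.

    /* Sketch of the downstream side; the API type name is an assumption. */
    static GstFlowReturn
    consume_tensors (GstBuffer * buf)
    {
      GstMeta *meta = gst_buffer_get_meta (buf, GST_TENSOR_META_API_TYPE);

      if (meta == NULL)
        return GST_FLOW_OK;       /* no inference result on this frame */

      /* Decode boxes, labels, ... from the attached tensor data here. */
      return GST_FLOW_OK;
    }
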