Belle II Software development
ONNXTensorView Class Reference

View a Dataset's m_input as an ONNX tensor and also set up output buffers/tensors.

#include <ONNX.h>

Public Member Functions

 ONNXTensorView (Dataset &dataset, int nOutputs)
 Construct a new ONNXTensorView.
 
Ort::Value * inputTensor ()
 Get a pointer to the inputTensor.
 
Ort::Value * outputTensor ()
 Get a pointer to the outputTensor.
 
std::vector< float > outputData ()
 Get a vector of the output values (which may have been filled by a call to Ort::Session::Run).
 

Private Attributes

std::vector< int64_t > m_inputShape
 Shape of the input Tensor.
 
std::vector< float > m_outputData
 Output Tensor buffer.
 
std::vector< int64_t > m_outputShape
 Shape of the output Tensor.
 
Ort::MemoryInfo m_memoryInfo
 MemoryInfo object used when constructing the ONNX tensors; specifies the device (CPU).
 
Ort::Value m_inputTensor
 The input Tensor.
 
Ort::Value m_outputTensor
 The output Tensor.
 

Detailed Description

View a Dataset's m_input as an ONNX tensor and also set up output buffers/tensors.

These views will be passed to Ort::Session::Run.

Definition at line 83 of file ONNX.h.
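
A minimal usage sketch (not part of the generated reference): it assumes an existing Ort::Session named session, a Dataset named dataset, and the tensor names "input" and "output"; the actual names depend on the exported model and on the code driving the session.

    // One output value per event is assumed here for illustration.
    ONNXTensorView view(dataset, 1);

    const char* inputNames[] = {"input"};    // assumed model input name
    const char* outputNames[] = {"output"};  // assumed model output name

    // Run writes directly into the buffers wrapped by the view, since the
    // output Ort::Value points at the view's internal m_outputData buffer.
    session.Run(Ort::RunOptions{nullptr},
                inputNames, view.inputTensor(), 1,
                outputNames, view.outputTensor(), 1);

    std::vector<float> result = view.outputData();  // copy of the filled outputs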

Constructor & Destructor Documentation

◆ ONNXTensorView()

ONNXTensorView ( Dataset & dataset,
int nOutputs )
inline

Construct a new ONNXTensorView.

Parameters
    dataset     Dataset whose input vector is supposed to be viewed as an ONNX tensor
    nOutputs    Number of output values

Definition at line 90 of file ONNX.h.

    : m_inputShape{1, dataset.getNumberOfFeatures()}, m_outputData(nOutputs),
      m_outputShape{1, nOutputs},
      m_memoryInfo(Ort::MemoryInfo::CreateCpu(OrtDeviceAllocator, OrtMemTypeCPU)),
      m_inputTensor(Ort::Value::CreateTensor<float>(
          m_memoryInfo, dataset.m_input.data(), dataset.m_input.size(),
          m_inputShape.data(), m_inputShape.size())),
      m_outputTensor(Ort::Value::CreateTensor<float>(
          m_memoryInfo, m_outputData.data(), m_outputData.size(),
          m_outputShape.data(), m_outputShape.size())) {}
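
Note that Ort::Value::CreateTensor<float> wraps the existing buffers rather than copying them: the input tensor points at dataset.m_input and the output tensor at the view's own m_outputData. A short illustration of the resulting lifetime behaviour, assuming a Dataset instance named dataset whose input vector is already filled:

    ONNXTensorView view(dataset, 1);

    // The input tensor references dataset.m_input directly, so refilling the
    // Dataset's input for the next event is immediately visible through
    // view.inputTensor(). Both dataset and view must stay alive for as long
    // as any Ort::Session::Run call uses these tensors.
    dataset.m_input[0] = 0.5f;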

Member Function Documentation

◆ inputTensor()

Ort::Value * inputTensor ( )
inline

Get a pointer to the inputTensor.

Definition at line 103 of file ONNX.h.

{ return &m_inputTensor; }

◆ outputData()

std::vector< float > outputData ( )
inline

Get a vector of the output values (which may have been filled by a call to Ort::Session::Run).

Definition at line 113 of file ONNX.h.

{ return m_outputData; }

◆ outputTensor()

Ort::Value * outputTensor ( )
inline

Get a pointer to the outputTensor.

Definition at line 108 of file ONNX.h.

{ return &m_outputTensor; }

Member Data Documentation

◆ m_inputShape

std::vector<int64_t> m_inputShape
private

Shape of the input Tensor.

Definition at line 118 of file ONNX.h.

◆ m_inputTensor

Ort::Value m_inputTensor
private

The input Tensor.

Definition at line 138 of file ONNX.h.

◆ m_memoryInfo

Ort::MemoryInfo m_memoryInfo
private

MemoryInfo object used when constructing the ONNX tensors; specifies the device (CPU).

Definition at line 133 of file ONNX.h.

◆ m_outputData

std::vector<float> m_outputData
private

Output Tensor buffer.

Definition at line 123 of file ONNX.h.

◆ m_outputShape

std::vector<int64_t> m_outputShape
private

Shape of the output Tensor.

Definition at line 128 of file ONNX.h.

◆ m_outputTensor

Ort::Value m_outputTensor
private

The output Tensor.

Definition at line 143 of file ONNX.h.


The documentation for this class was generated from the following file:

    ONNX.h