Belle II Software prerelease-10-00-00a
PriorModel Class Reference
Inheritance diagram for PriorModel:
Collaboration diagram for PriorModel:

Public Member Functions

 __init__ (self, int n_output)
 
torch.tensor forward (self, torch.tensor x)
 

Public Attributes

 hidden1 = Linear(9, 128)
 Linear layer with 9 inputs and 128 outputs.
 
 act1 = ReLU()
 ReLU activation layer.
 
 hidden2 = BatchNorm1d(128)
Batch normalization layer.
 
 hidden3 = Linear(128, 64)
 Linear layer with 128 inputs and 64 outputs.
 
 act2 = ReLU()
 ReLU activation layer.
 
 hidden4 = BatchNorm1d(64)
Batch normalization layer.
 
 hidden5 = Linear(64, 32)
 Linear layer with 64 inputs and 32 outputs.
 
 act3 = ReLU()
 ReLU activation layer.
 
 hidden6 = BatchNorm1d(32)
Batch normalization layer.
 
 hidden7 = Linear(32, n_output)
Linear layer with 32 inputs and one output for each particle in the particle list.
 
 act4 = Softmax(dim=1)
 Softmax activation layer.
 

Detailed Description

PyTorch model for PID prior probability calculation.

Attributes:
    hidden1: Linear layer with 9 inputs and 128 outputs.
    act1: A ReLU activation layer.
    hidden2: A batch normalization layer.
    hidden3: Linear layer with 128 inputs and 64 outputs.
    act2: A ReLU activation layer.
    hidden4: A batch normalization layer.
    hidden5: Linear layer with 64 inputs and 32 outputs.
    act3: A ReLU activation layer.
    hidden6: A batch normalization layer.
    hidden7: Linear layer with 32 inputs and n_output outputs.
    act4: A softmax activation layer.

Definition at line 105 of file priorDataLoaderAndModel.py.
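The layer stack described above can be summarized as follows. This is an orientation sketch only, not code from priorDataLoaderAndModel.py: it expresses the same layer ordering applied in forward() using torch.nn.Sequential and omits the weight initialization performed in the constructor.

import torch.nn as nn

def prior_model_sketch(n_output: int) -> nn.Sequential:
    # Same ordering as PriorModel.forward(): Linear -> ReLU -> BatchNorm1d
    # for the three hidden blocks, then Linear -> Softmax for the output.
    return nn.Sequential(
        nn.Linear(9, 128), nn.ReLU(), nn.BatchNorm1d(128),
        nn.Linear(128, 64), nn.ReLU(), nn.BatchNorm1d(64),
        nn.Linear(64, 32), nn.ReLU(), nn.BatchNorm1d(32),
        nn.Linear(32, n_output), nn.Softmax(dim=1),
    )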

Constructor & Destructor Documentation

◆ __init__()

__init__ ( self,
int n_output )
Initialize the PID prior probability model.

Parameter:
    n_output (int): Number of output nodes.

Definition at line 124 of file priorDataLoaderAndModel.py.

124 def __init__(self, n_output: int):
125     """
126     Initialize the PID prior probability model.
127
128     Parameter:
129         n_output (int): Number of output nodes.
130
131     """
132     super().__init__()
133
134     self.hidden1 = Linear(9, 128)
135     kaiming_uniform_(self.hidden1.weight, nonlinearity="relu")
136
137     self.act1 = ReLU()
138
139     self.hidden2 = BatchNorm1d(128)
140
141     self.hidden3 = Linear(128, 64)
142     kaiming_uniform_(self.hidden3.weight, nonlinearity="relu")
143
144     self.act2 = ReLU()
145
146     self.hidden4 = BatchNorm1d(64)
147
148     self.hidden5 = Linear(64, 32)
149     kaiming_uniform_(self.hidden5.weight, nonlinearity="relu")
150
151     self.act3 = ReLU()
152
153     self.hidden6 = BatchNorm1d(32)
154
155     self.hidden7 = Linear(32, n_output)
156     xavier_uniform_(self.hidden7.weight)
157
158     self.act4 = Softmax(dim=1)
159
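The listing above refers to PyTorch layer classes and initializers by their bare names, and super().__init__() implies the class derives from a PyTorch base, presumably torch.nn.Module. A minimal sketch of the imports such code relies on is shown below; the exact import statements in priorDataLoaderAndModel.py may differ:

# Sketch of the imports assumed by the constructor listing above.
from torch.nn import Module, Linear, ReLU, BatchNorm1d, Softmax
from torch.nn.init import kaiming_uniform_, xavier_uniform_

Note the initialization choices visible in the constructor: the hidden Linear layers that feed ReLU activations use Kaiming uniform initialization, while the final Linear layer that feeds the Softmax uses Xavier uniform initialization.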

Member Function Documentation

◆ forward()

torch.tensor forward ( self,
torch.tensor x )
Gives PID prior probabilities for the input features.

Parameter:
    x (torch.tensor): A 2D tensor containing features for a particle as a row.

Returns:
    A torch tensor containing PID prior probabilities for the provided features.

Definition at line 160 of file priorDataLoaderAndModel.py.

160 def forward(self, x: torch.tensor) -> torch.tensor:
161     """
162     Gives PID prior probabilities for the input features.
163
164     Parameter:
165         x (torch.tensor): A 2D tensor containing features for a particle as a row.
166
167     Returns:
168         A torch tensor containing PID prior probabilities for the provided features.
169     """
170     x = self.hidden1(x)
171     x = self.act1(x)
172     x = self.hidden2(x)
173     x = self.hidden3(x)
174     x = self.act2(x)
175     x = self.hidden4(x)
176     x = self.hidden5(x)
177     x = self.act3(x)
178     x = self.hidden6(x)
179     x = self.hidden7(x)
180     x = self.act4(x)
181     return x
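A short usage sketch for the forward pass is given below. The value 6 for n_output is illustrative only; the input must be a 2D tensor with 9 columns, since hidden1 is Linear(9, 128) and BatchNorm1d expects a batch dimension:

import torch

model = PriorModel(n_output=6)   # hypothetical number of output classes
model.eval()                     # use BatchNorm1d running statistics
x = torch.rand(3, 9)             # dummy batch: 3 particles, 9 features each
with torch.no_grad():
    priors = model.forward(x)    # equivalently model(x); shape (3, 6)
print(priors.sum(dim=1))         # each row sums to ~1 because of Softmax(dim=1)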

Member Data Documentation

◆ act1

act1 = ReLU()

ReLU activation layer.

Definition at line 137 of file priorDataLoaderAndModel.py.

◆ act2

act2 = ReLU()

ReLU activation layer.

Definition at line 144 of file priorDataLoaderAndModel.py.

◆ act3

act3 = ReLU()

ReLU activation layer.

Definition at line 151 of file priorDataLoaderAndModel.py.

◆ act4

act4 = Softmax(dim=1)

Softmax activation layer.

Definition at line 158 of file priorDataLoaderAndModel.py.

◆ hidden1

hidden1 = Linear(9, 128)

Linear layer with 9 inputs and 128 outputs.

Definition at line 134 of file priorDataLoaderAndModel.py.

◆ hidden2

hidden2 = BatchNorm1d(128)

Batch normalization layer.

Definition at line 139 of file priorDataLoaderAndModel.py.

◆ hidden3

hidden3 = Linear(128, 64)

Linear layer with 128 inputs and 64 outputs.

Definition at line 141 of file priorDataLoaderAndModel.py.

◆ hidden4

hidden4 = BatchNorm1d(64)

Batch normalization layer.

Definition at line 146 of file priorDataLoaderAndModel.py.

◆ hidden5

hidden5 = Linear(64, 32)

Linear layer with 64 inputs and 32 outputs.

Definition at line 148 of file priorDataLoaderAndModel.py.

◆ hidden6

hidden6 = BatchNorm1d(32)

Batch normalization layer.

Definition at line 153 of file priorDataLoaderAndModel.py.

◆ hidden7

hidden7 = Linear(32, n_output)

Linear layer with 32 inputs and one output for each particle in the particle list.

Definition at line 155 of file priorDataLoaderAndModel.py.


The documentation for this class was generated from the following file:
priorDataLoaderAndModel.py