Belle II Software development
Public Member Functions

__init__ (self, int n_output)

torch.tensor forward (self, torch.tensor x)

Public Attributes

hidden1 = Linear(9, 128)
Linear layer with 9 inputs and 128 outputs.

act1 = ReLU()
ReLU activation layer.

hidden2 = BatchNorm1d(128)
Batch normalization layer.

hidden3 = Linear(128, 64)
Linear layer with 128 inputs and 64 outputs.

act2 = ReLU()
ReLU activation layer.

hidden4 = BatchNorm1d(64)
Batch normalization layer.

hidden5 = Linear(64, 32)
Linear layer with 64 inputs and 32 outputs.

act3 = ReLU()
ReLU activation layer.

hidden6 = BatchNorm1d(32)
Batch normalization layer.

hidden7 = Linear(32, n_output)
Linear layer with 32 inputs and one output per particle type in the particle list.

act4 = Softmax(dim=1)
Softmax activation layer.
PyTorch model for PID prior probability calculation.

Attributes:
    hidden1: Linear layer with 9 inputs and 128 outputs.
    act1: A ReLU activation layer.
    hidden2: A batch normalization layer.
    hidden3: Linear layer with 128 inputs and 64 outputs.
    act2: A ReLU activation layer.
    hidden4: A batch normalization layer.
    hidden5: Linear layer with 64 inputs and 32 outputs.
    act3: A ReLU activation layer.
    hidden6: A batch normalization layer.
    hidden7: Linear layer with 32 inputs and one output per particle type.
    act4: A softmax activation layer.
Definition at line 105 of file priorDataLoaderAndModel.py.
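The documented attributes imply the following structure. This is a minimal sketch reconstructed from the attribute list above, not the source itself: the class name PriorModel is an assumption (the page does not state it), and the ReLU-then-BatchNorm ordering in forward() is inferred from the attribute definition order.

```python
import torch
from torch.nn import Linear, ReLU, BatchNorm1d, Softmax


class PriorModel(torch.nn.Module):  # class name assumed
    """PyTorch model for PID prior probability calculation."""

    def __init__(self, n_output: int):
        super().__init__()
        self.hidden1 = Linear(9, 128)        # 9 input features -> 128
        self.act1 = ReLU()
        self.hidden2 = BatchNorm1d(128)
        self.hidden3 = Linear(128, 64)
        self.act2 = ReLU()
        self.hidden4 = BatchNorm1d(64)
        self.hidden5 = Linear(64, 32)
        self.act3 = ReLU()
        self.hidden6 = BatchNorm1d(32)
        self.hidden7 = Linear(32, n_output)  # one output per particle type
        self.act4 = Softmax(dim=1)           # rows become probabilities

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.hidden2(self.act1(self.hidden1(x)))
        x = self.hidden4(self.act2(self.hidden3(x)))
        x = self.hidden6(self.act3(self.hidden5(x)))
        return self.act4(self.hidden7(x))
```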
__init__ (self, int n_output)
Initialize the PID prior probability model.

Parameters:
    n_output (int): Number of output nodes.
Definition at line 124 of file priorDataLoaderAndModel.py.
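For example, assuming the sketch above, a model for three particle hypotheses (a hypothetical choice of n_output) would be built as:

```python
# Hypothetical: n_output = 3 for three particle hypotheses.
model = PriorModel(n_output=3)
```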
torch.tensor forward (self, torch.tensor x)
Gives PID prior probabilities for the input features.

Parameters:
    x (torch.tensor): A 2D tensor containing the features for a particle as a row.

Returns:
    A torch tensor containing the PID prior probabilities for the provided features.
Definition at line 160 of file priorDataLoaderAndModel.py.
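A usage sketch, again assuming the reconstructed PriorModel above; the batch size, n_output, and input values are placeholders:

```python
import torch

model = PriorModel(n_output=3)  # assumed class name and output count
model.eval()                    # use BatchNorm1d running stats for inference

# One row of 9 features per particle; random values as placeholders.
x = torch.randn(5, 9)

with torch.no_grad():
    priors = model(x)           # shape (5, 3)

print(priors.sum(dim=1))        # each row sums to ~1.0 via Softmax(dim=1)
```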
act1 = ReLU()
ReLU activation layer.
Definition at line 137 of file priorDataLoaderAndModel.py.
act2 = ReLU()
ReLU activation layer.
Definition at line 144 of file priorDataLoaderAndModel.py.
act3 = ReLU()
ReLU activation layer.
Definition at line 151 of file priorDataLoaderAndModel.py.
act4 = Softmax(dim=1)
Softmax activation layer.
Definition at line 158 of file priorDataLoaderAndModel.py.
hidden1 = Linear(9, 128)
Linear layer with 9 inputs and 128 outputs.
Definition at line 134 of file priorDataLoaderAndModel.py.
hidden2 = BatchNorm1d(128)
Batch normalization layer.
Definition at line 139 of file priorDataLoaderAndModel.py.
hidden3 = Linear(128, 64)
Linear layer with 128 inputs and 64 outputs.
Definition at line 141 of file priorDataLoaderAndModel.py.
hidden4 = BatchNorm1d(64)
Batch normalization layer.
Definition at line 146 of file priorDataLoaderAndModel.py.
hidden5 = Linear(64, 32)
Linear layer with 64 inputs and 32 outputs.
Definition at line 148 of file priorDataLoaderAndModel.py.
hidden6 = BatchNorm1d(32)
Batch normalization layer.
Definition at line 153 of file priorDataLoaderAndModel.py.
hidden7 = Linear(32, n_output)
Linear layer with 32 inputs and one output per particle type in the particle list.
Definition at line 155 of file priorDataLoaderAndModel.py.