Belle II Software development
GRLMLP Class Reference

Class to keep all parameters of an expert MLP for the neuro trigger. More...

#include <GRLMLP.h>

Inheritance diagram for GRLMLP:

Public Member Functions

 GRLMLP ()
 default constructor.
 
 GRLMLP (std::vector< unsigned short > &nodes, unsigned short targets, const std::vector< float > &outputscale)
 constructor to set all parameters (not the weights).
 
 ~GRLMLP ()
 destructor, empty because we don't allocate memory anywhere.
 
bool isTrained () const
 check if weights are default values or set by some trainer
 
unsigned getNumberOfLayers () const
 get number of layers
 
unsigned getNumberOfNodesLayer (unsigned iLayer) const
 get number of nodes in a layer
 
unsigned getNumberOfWeights () const
 get number of weights from length of weights vector
 
unsigned nWeightsCal () const
 calculate number of weights from number of nodes
 
unsigned nBiasCal () const
 calculate number of bias values from number of nodes
 
std::vector< float > getWeights () const
 get weights vector
 
std::vector< float > getBias () const
 get bias vector
 
void setWeights (std::vector< float > &weights)
 set weights vector
 
void setBias (std::vector< float > &bias)
 set bias vector
 
void Trained (bool trained)
 set the flag indicating whether the weights have been set by some trainer
 

Private Member Functions

 ClassDef (GRLMLP, 2)
 Needed to make the ROOT object storable.
 

Private Attributes

std::vector< unsigned short > m_nNodes
 Number of nodes in each layer, not including bias nodes.
 
std::vector< float > m_weights
 Weights of the network.
 
std::vector< float > m_bias
 bias of the network.
 
bool m_trained
 Indicator whether the weights are just default values or have been set by some trainer (set to true via Trained()).
 
unsigned short m_targetVars
 output variables: 1: z, 2: theta, 3: (z, theta)
 
std::vector< float > m_outputScale
 Output[i] of the MLP is scaled from [-1, 1] to [outputScale[2i], outputScale[2i+1]].
 

Friends

class GRLNeuroTrainerModule
 

Detailed Description

Class to keep all parameters of an expert MLP for the neuro trigger.

Definition at line 21 of file GRLMLP.h.
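
A minimal usage sketch based on the public interface listed on this page; the weight and bias values are placeholders for what a trainer (e.g. GRLNeuroTrainerModule) would produce, and the snippet assumes the class lives in the Belle2 namespace as usual for Belle II software.

 std::vector<unsigned short> nodes = {19, 20, 20, 1};   // input layer, two hidden layers, one output node
 Belle2::GRLMLP mlp(nodes, 1, {0., 1.});                // 1 target variable, output scaled to [0, 1]

 std::vector<float> weights(mlp.nWeightsCal(), 0.f);    // placeholder values; normally produced by a trainer
 std::vector<float> bias(mlp.nBiasCal(), 0.f);
 mlp.setWeights(weights);
 mlp.setBias(bias);
 mlp.Trained(true);                                     // mark the network as trained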

Constructor & Destructor Documentation

◆ GRLMLP() [1/2]

GRLMLP ( )

default constructor.

Definition at line 14 of file GRLMLP.cc.

GRLMLP::GRLMLP() :
  m_nNodes{19, 20, 20, 1}, m_trained(false), m_targetVars(1), m_outputScale{0., 1.}
{
  m_weights.assign(nWeightsCal(), 0.);
  m_bias.assign(nBiasCal(), 0.);
}
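
With the default topology {19, 20, 20, 1} this gives nWeightsCal() = 19*20 + 20*20 + 20*1 = 800 weights and nBiasCal() = 20 + 20 + 1 = 41 biases, all initialized to zero.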

◆ GRLMLP() [2/2]

GRLMLP ( std::vector< unsigned short > &  nodes,
unsigned short  targets,
const std::vector< float > &  outputscale 
)

constructor to set all parameters (not the weights).

Definition at line 21 of file GRLMLP.cc.

GRLMLP::GRLMLP(std::vector<unsigned short>& nodes, unsigned short targets, const std::vector<float>& outputscale) :
  m_nNodes(nodes), m_trained(false), m_targetVars(targets), m_outputScale(outputscale)
{
  m_weights.assign(nWeightsCal(), 0.);
  m_bias.assign(nBiasCal(), 0.);
}
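
As an illustration (the concrete numbers are invented for this example), a network predicting both z and theta would use two output nodes, targets = 3 and one [low, high] scaling pair per output:

 std::vector<unsigned short> nodes = {19, 30, 2};       // one hidden layer, two output nodes
 Belle2::GRLMLP mlp(nodes, 3, {-50., 50., 0., 3.2});    // targets = 3: (z, theta); outputscale holds a [low, high] pair per output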

◆ ~GRLMLP()

~GRLMLP ( )
inline

destructor, empty because we don't allocate memory anywhere.

Definition at line 34 of file GRLMLP.h.

{ }

Member Function Documentation

◆ getBias()

std::vector< float > getBias ( ) const
inline

get bias vector

Definition at line 51 of file GRLMLP.h.

{ return m_bias; }

◆ getNumberOfLayers()

unsigned getNumberOfLayers ( ) const
inline

get number of layers

Definition at line 39 of file GRLMLP.h.

{ return m_nNodes.size(); }

◆ getNumberOfNodesLayer()

unsigned getNumberOfNodesLayer ( unsigned  iLayer) const
inline

get number of nodes in a layer

Definition at line 41 of file GRLMLP.h.

{ return m_nNodes[iLayer]; }

◆ getNumberOfWeights()

unsigned getNumberOfWeights ( ) const
inline

get number of weights from length of weights vector

Definition at line 43 of file GRLMLP.h.

{ return m_weights.size(); }

◆ getWeights()

std::vector< float > getWeights ( ) const
inline

get weights vector

Definition at line 49 of file GRLMLP.h.

{ return m_weights; }

◆ isTrained()

bool isTrained ( ) const
inline

check if weights are default values or set by some trainer

Definition at line 37 of file GRLMLP.h.

{ return m_trained; }

◆ nBiasCal()

unsigned nBiasCal ( ) const

calculate number of bias values from number of nodes

Definition at line 45 of file GRLMLP.cc.

unsigned GRLMLP::nBiasCal() const
{
  unsigned nbias = 0;
  if (getNumberOfLayers() > 1) {
    for (unsigned il = 1; il < getNumberOfLayers(); ++il) {
      nbias += m_nNodes[il];
    }
  }
  return nbias;
}
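
For the default topology {19, 20, 20, 1} this gives 20 + 20 + 1 = 41: one bias per node in every layer after the input layer.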

◆ nWeightsCal()

unsigned nWeightsCal ( ) const

calculate number of weights from number of nodes

Definition at line 32 of file GRLMLP.cc.

unsigned GRLMLP::nWeightsCal() const
{
  unsigned nWeights = 0;
  if (getNumberOfLayers() > 1) {
    nWeights = m_nNodes[0] * m_nNodes[1];
    for (unsigned il = 1; il < getNumberOfLayers() - 1; ++il) {
      nWeights += m_nNodes[il] * m_nNodes[il + 1];
    }
  }
  return nWeights;
}
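
For the default topology {19, 20, 20, 1} this gives 19*20 + 20*20 + 20*1 = 800: one weight for each connection between consecutive, fully connected layers.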

◆ setBias()

void setBias ( std::vector< float > &  bias)
inline

set bias vector

Definition at line 55 of file GRLMLP.h.

{ m_bias = bias; }

◆ setWeights()

void setWeights ( std::vector< float > &  weights)
inline

set weights vector

Definition at line 53 of file GRLMLP.h.

{ m_weights = weights; }

◆ Trained()

void Trained ( bool  trained)
inline

set the flag indicating whether the weights have been set by some trainer

Definition at line 58 of file GRLMLP.h.

{ m_trained = trained; }

Friends And Related Function Documentation

◆ GRLNeuroTrainerModule

friend class GRLNeuroTrainerModule
friend

Definition at line 24 of file GRLMLP.h.

Member Data Documentation

◆ m_bias

std::vector<float> m_bias
private

bias of the network.

Definition at line 66 of file GRLMLP.h.

◆ m_nNodes

std::vector<unsigned short> m_nNodes
private

Number of nodes in each layer, not including bias nodes.

Definition at line 62 of file GRLMLP.h.

◆ m_outputScale

std::vector<float> m_outputScale
private

Output[i] of the MLP is scaled from [-1, 1] to [outputScale[2i], outputScale[2i+1]].

Definition at line 75 of file GRLMLP.h.
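
Assuming the usual linear mapping (applied by the code that evaluates the network, not by GRLMLP itself), an output value out[i] in [-1, 1] would be rescaled as

 scaled[i] = outputScale[2*i] + 0.5f * (out[i] + 1.f) * (outputScale[2*i + 1] - outputScale[2*i]);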

◆ m_targetVars

unsigned short m_targetVars
private

output variables: 1: z, 2: theta, 3: (z, theta)

Definition at line 72 of file GRLMLP.h.

◆ m_trained

bool m_trained
private

Indicator whether the weights are just default values or have been set by some trainer (set to true via Trained()).

Definition at line 69 of file GRLMLP.h.

◆ m_weights

std::vector<float> m_weights
private

Weights of the network.

Definition at line 64 of file GRLMLP.h.


The documentation for this class was generated from the following files:

GRLMLP.h
GRLMLP.cc