Belle II Software development
GRLNeuroTrainerModule Class Reference

The trainer module for the neural networks of the CDC trigger. More...

#include <GRLNeuroTrainerModule.h>

Inheritance diagram for GRLNeuroTrainerModule:
GRLNeuroTrainerModule → Module → PathElement

Public Types

enum  EModulePropFlags {
  c_Input = 1 ,
  c_Output = 2 ,
  c_ParallelProcessingCertified = 4 ,
  c_HistogramManager = 8 ,
  c_InternalSerializer = 16 ,
  c_TerminateInAllProcesses = 32 ,
  c_DontCollectStatistics = 64
}
 Each module can be tagged with property flags, which indicate certain features of the module. More...
 
typedef ModuleCondition::EAfterConditionPath EAfterConditionPath
 Forward the EAfterConditionPath definition from the ModuleCondition.
 

Public Member Functions

 GRLNeuroTrainerModule ()
 Constructor, for setting module description and parameters.
 
virtual ~GRLNeuroTrainerModule ()
 Destructor.
 
virtual void initialize () override
 Initialize the module.
 
virtual void event () override
 Called once for each event.
 
virtual void terminate () override
 Do the training for all sectors.
 
void updateRelevantID (unsigned isector)
 calculate and set the relevant id range for given sector based on hit counters of the track segments.
 
void train (unsigned isector)
 Train a single MLP.
 
void saveTraindata (const std::string &filename, const std::string &arrayname="trainSets")
 Save all training samples.
 
bool loadTraindata (const std::string &filename, const std::string &arrayname="trainSets")
 Load saved training samples.
 
virtual std::vector< std::string > getFileNames (bool outputFiles)
 Return a list of output filenames for this module.
 
virtual void beginRun ()
 Called when entering a new run.
 
virtual void endRun ()
 This method is called if the current run ends.
 
const std::string & getName () const
 Returns the name of the module.
 
const std::string & getType () const
 Returns the type of the module (i.e. the class name).
 
const std::string & getPackage () const
 Returns the package this module is in.
 
const std::string & getDescription () const
 Returns the description of the module.
 
void setName (const std::string &name)
 Set the name of the module.
 
void setPropertyFlags (unsigned int propertyFlags)
 Sets the flags for the module properties.
 
LogConfig & getLogConfig ()
 Returns the log system configuration.
 
void setLogConfig (const LogConfig &logConfig)
 Set the log system configuration.
 
void setLogLevel (int logLevel)
 Configure the log level.
 
void setDebugLevel (int debugLevel)
 Configure the debug messaging level.
 
void setAbortLevel (int abortLevel)
 Configure the abort log level.
 
void setLogInfo (int logLevel, unsigned int logInfo)
 Configure the printed log information for the given level.
 
void if_value (const std::string &expression, const std::shared_ptr< Path > &path, EAfterConditionPath afterConditionPath=EAfterConditionPath::c_End)
 Add a condition to the module.
 
void if_false (const std::shared_ptr< Path > &path, EAfterConditionPath afterConditionPath=EAfterConditionPath::c_End)
 A simplified version to add a condition to the module.
 
void if_true (const std::shared_ptr< Path > &path, EAfterConditionPath afterConditionPath=EAfterConditionPath::c_End)
 A simplified version to set the condition of the module.
 
bool hasCondition () const
 Returns true if at least one condition was set for the module.
 
const ModuleCondition * getCondition () const
 Return a pointer to the first condition (or nullptr, if none was set).
 
const std::vector< ModuleCondition > & getAllConditions () const
 Return all set conditions for this module.
 
bool evalCondition () const
 If at least one condition was set, it is evaluated and true returned if at least one condition returns true.
 
std::shared_ptr< Path > getConditionPath () const
 Returns the path of the last true condition (if there is at least one, else return a null pointer).
 
Module::EAfterConditionPath getAfterConditionPath () const
 What to do after the conditional path is finished.
 
std::vector< std::shared_ptr< Path > > getAllConditionPaths () const
 Return all condition paths currently set (no matter if the condition is true or not).
 
bool hasProperties (unsigned int propertyFlags) const
 Returns true if all specified property flags are available in this module.
 
bool hasUnsetForcedParams () const
 Returns true and prints error message if the module has unset parameters which the user has to set in the steering file.
 
const ModuleParamList & getParamList () const
 Return module param list.
 
template<typename T >
ModuleParam< T > & getParam (const std::string &name) const
 Returns a reference to a parameter.
 
bool hasReturnValue () const
 Return true if this module has a valid return value set.
 
int getReturnValue () const
 Return the return value set by this module.
 
std::shared_ptr< PathElement > clone () const override
 Create an independent copy of this module.
 
std::shared_ptr< boost::python::list > getParamInfoListPython () const
 Returns a python list of all parameters.
 

Static Public Member Functions

static void exposePythonAPI ()
 Exposes methods of the Module class to Python.
 

Protected Member Functions

virtual void def_initialize ()
 Wrappers to make the methods without "def_" prefix callable from Python.
 
virtual void def_beginRun ()
 Wrapper method for the virtual function beginRun() that has the implementation to be used in a call from Python.
 
virtual void def_event ()
 Wrapper method for the virtual function event() that has the implementation to be used in a call from Python.
 
virtual void def_endRun ()
 Wrapper method for the virtual function endRun() that has the implementation to be used in a call from Python.
 
virtual void def_terminate ()
 Wrapper method for the virtual function terminate() that has the implementation to be used in a call from Python.
 
void setDescription (const std::string &description)
 Sets the description of the module.
 
void setType (const std::string &type)
 Set the module type.
 
template<typename T >
void addParam (const std::string &name, T &paramVariable, const std::string &description, const T &defaultValue)
 Adds a new parameter to the module.
 
template<typename T >
void addParam (const std::string &name, T &paramVariable, const std::string &description)
 Adds a new enforced parameter to the module.
 
void setReturnValue (int value)
 Sets the return value for this module as integer.
 
void setReturnValue (bool value)
 Sets the return value for this module as bool.
 
void setParamList (const ModuleParamList &params)
 Replace existing parameter list.
 

Protected Attributes

std::string m_TrgECLClusterName
 Name of the StoreArray containing the ECL clusters.
 
std::string m_2DfinderCollectionName
 Name of the StoreArray containing the input 2D tracks.
 
std::string m_GRLCollectionName
 Name of the StoreObj containing the input GRL.
 
std::string m_filename
 Name of the file where network weights etc. are saved.
 
std::string m_trainFilename
 Name of file where training samples are stored.
 
std::string m_logFilename
 Name of file where training log is stored.
 
std::string m_arrayname
 Name of the TObjArray holding the networks.
 
std::string m_trainArrayname
 Name of the TObjArray holding the training samples.
 
bool m_saveDebug
 If true, save training curve and parameter distribution of training data.
 
bool m_load
 Switch to load saved parameters from a previous run.
 
GRLNeuro::Parameters m_parameters
 Parameters for the NeuroTrigger.
 
double m_nTrainMin
 Minimal number of training samples.
 
double m_nTrainMax
 Maximal number of training samples.
 
bool m_multiplyNTrain
 Switch to multiply number of samples with number of weights.
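If m_multiplyNTrain is set, nTrainMin and nTrainMax are interpreted as factors on the number of network weights rather than as absolute sample counts. A minimal Python sketch of that interpretation (the function name is illustrative, not part of basf2):

```python
def effective_sample_limits(n_train_min, n_train_max, n_weights, multiply_n_train):
    """Turn the configured limits into absolute sample counts.

    When multiply_n_train is True, the limits are factors on the
    number of MLP weights; otherwise they are taken as-is.
    """
    if multiply_n_train:
        return n_train_min * n_weights, n_train_max * n_weights
    return n_train_min, n_train_max

# With the defaults (nTrainMin = nTrainMax = 10, multiplyNTrain = True),
# a network with 300 weights trains on exactly 3000 samples:
lo, hi = effective_sample_limits(10., 10., 300, True)
print(lo, hi)  # 3000.0 3000.0
```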
 
int m_nValid
 Number of validation samples.
 
int m_nTest
 Number of test samples.
 
double m_wMax
 Limit for weights.
 
int m_nThreads
 Number of threads for training.
 
int m_checkInterval
 Training is stopped if the validation error is higher than it was checkInterval epochs ago, i.e. either the validation error is increasing or the gain is less than the fluctuations.
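The early-stopping rule can be sketched as: abort once the newest validation error is no better than the error checkInterval epochs earlier (names are illustrative; the actual logic lives in train()):

```python
def should_stop(val_errors, check_interval):
    """Stop when the newest validation error is >= the one check_interval epochs ago."""
    if len(val_errors) <= check_interval:
        return False  # not enough history yet to compare
    return val_errors[-1] >= val_errors[-1 - check_interval]

# Error stopped improving over the last 2 epochs -> stop:
print(should_stop([1.0, 0.8, 0.7, 0.69, 0.71], 2))  # True
# Too little history -> keep training:
print(should_stop([1.0, 0.8], 2))                   # False
```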
 
int m_maxEpochs
 Maximal number of training epochs.
 
int m_repeatTrain
 Number of training runs with different random start weights.
 
GRLNeuro m_GRLNeuro
 Instance of the NeuroTrigger.
 
std::vector< GRLMLPData > m_trainSets
 Sets of training data for all sectors.
 
std::vector< int > TCThetaID
 
std::vector< float > TCPhiLab
 
std::vector< float > TCcotThetaLab
 
std::vector< float > TCPhiCOM
 
std::vector< float > TCThetaCOM
 
std::vector< float > TC1GeV
 
double radtodeg = 0
 convert radian to degree
 
int n_cdc_sector = 0
 Number of CDC sectors.
 
int n_ecl_sector = 0
 Number of ECL sectors.
 
int n_sector = 0
 Total number of sectors.
 
std::vector< TH1D * > h_cdc2d_phi_sig
 Histograms for monitoring.
 
std::vector< TH1D * > h_cdc2d_pt_sig
 
std::vector< TH1D * > h_selE_sig
 
std::vector< TH1D * > h_selPhi_sig
 
std::vector< TH1D * > h_selTheta_sig
 
std::vector< TH1D * > h_result_sig
 
std::vector< TH1D * > h_cdc2d_phi_bg
 
std::vector< TH1D * > h_cdc2d_pt_bg
 
std::vector< TH1D * > h_selE_bg
 
std::vector< TH1D * > h_selPhi_bg
 
std::vector< TH1D * > h_selTheta_bg
 
std::vector< TH1D * > h_result_bg
 
std::vector< TH1D * > h_ncdc_sig
 
std::vector< TH1D * > h_ncdcf_sig
 
std::vector< TH1D * > h_ncdcs_sig
 
std::vector< TH1D * > h_ncdci_sig
 
std::vector< TH1D * > h_necl_sig
 
std::vector< TH1D * > h_ncdc_bg
 
std::vector< TH1D * > h_ncdcf_bg
 
std::vector< TH1D * > h_ncdcs_bg
 
std::vector< TH1D * > h_ncdci_bg
 
std::vector< TH1D * > h_necl_bg
 
std::vector< int > scale_bg
 BG scale factor for training.
 

Private Member Functions

std::list< ModulePtr > getModules () const override
 no submodules, return empty list
 
std::string getPathString () const override
 return the module name.
 
void setParamPython (const std::string &name, const boost::python::object &pyObj)
 Implements a method for setting boost::python objects.
 
void setParamPythonDict (const boost::python::dict &dictionary)
 Implements a method for reading the parameter values from a boost::python dictionary.
 

Private Attributes

std::string m_name
 The name of the module, saved as a string (user-modifiable)
 
std::string m_type
 The type of the module, saved as a string.
 
std::string m_package
 Package this module is found in (may be empty).
 
std::string m_description
 The description of the module.
 
unsigned int m_propertyFlags
 The properties of the module as bitwise or (with |) of EModulePropFlags.
 
LogConfig m_logConfig
 The log system configuration of the module.
 
ModuleParamList m_moduleParamList
 List storing and managing all parameter of the module.
 
bool m_hasReturnValue
 True, if the return value is set.
 
int m_returnValue
 The return value.
 
std::vector< ModuleCondition > m_conditions
 Module condition, only non-null if set.
 

Detailed Description

The trainer module for the neural networks of the CDC trigger.

Prepare training data for several neural networks and train them using the Fast Artificial Neural Network library (FANN). For documentation of FANN see http://leenissen.dk/fann/wp/

Definition at line 29 of file GRLNeuroTrainerModule.h.

Member Typedef Documentation

◆ EAfterConditionPath

Forward the EAfterConditionPath definition from the ModuleCondition.

Definition at line 88 of file Module.h.

Member Enumeration Documentation

◆ EModulePropFlags

enum EModulePropFlags
inherited

Each module can be tagged with property flags, which indicate certain features of the module.

Enumerator
c_Input 

This module is an input module (reads data).

c_Output 

This module is an output module (writes data).

c_ParallelProcessingCertified 

This module can be run in parallel processing mode safely (All I/O must be done through the data store, in particular, the module must not write any files.)

c_HistogramManager 

This module is used to manage histograms accumulated by other modules.

c_InternalSerializer 

This module is an internal serializer/deserializer for parallel processing.

c_TerminateInAllProcesses 

When using parallel processing, call this module's terminate() function in all processes.

This will also ensure that there is exactly one process (single-core if no parallel modules found) or at least one input, one main and one output process.

c_DontCollectStatistics 

No statistics are collected for this module.

Definition at line 77 of file Module.h.

77 {
78 c_Input = 1,
79 c_Output = 2,
80 c_ParallelProcessingCertified = 4,
81 c_HistogramManager = 8,
82 c_InternalSerializer = 16,
83 c_TerminateInAllProcesses = 32,
84 c_DontCollectStatistics = 64
85 };
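The property flags are powers of two and are combined with bitwise OR (as stored in m_propertyFlags), which is what hasProperties() tests against. A minimal Python sketch of that check (the helper name is illustrative, not part of the basf2 API):

```python
# Flag values mirroring EModulePropFlags above.
c_Input = 1
c_Output = 2
c_ParallelProcessingCertified = 4
c_HistogramManager = 8
c_InternalSerializer = 16
c_TerminateInAllProcesses = 32
c_DontCollectStatistics = 64

def has_properties(module_flags: int, wanted: int) -> bool:
    """True only if every bit set in 'wanted' is also set in module_flags."""
    return (module_flags & wanted) == wanted

# A module tagged as a parallel-certified input module:
flags = c_Input | c_ParallelProcessingCertified
print(has_properties(flags, c_ParallelProcessingCertified))  # True
print(has_properties(flags, c_Input | c_Output))             # False
```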

Constructor & Destructor Documentation

◆ GRLNeuroTrainerModule()

Constructor, for setting module description and parameters.

Definition at line 46 of file GRLNeuroTrainerModule.cc.

46 : Module()
47{
48 setDescription(
49 "The NeuroTriggerTrainer module of the GRL.\n"
50 "Takes CDC track and ECL cluster to prepare input data\n"
51 "for the training of a neural network.\n"
52 "Networks are trained after the event loop and saved."
53 );
54 // parameters for saving / loading
55 addParam("TRGECLClusters", m_TrgECLClusterName,
56 "Name of the StoreArray holding the information of trigger ecl clusters ",
57 string("TRGECLClusters"));
58 addParam("2DfinderCollection", m_2DfinderCollectionName,
59 "Name of the StoreArray holding the tracks made by the 2D finder to be used as input.",
60 string("TRGCDC2DFinderTracks"));
61 addParam("GRLCollection", m_GRLCollectionName,
62 "Name of the StoreArray holding the tracks made by the GRL to be used as input.",
63 string("TRGGRLUnpackerStore"));
64 addParam("filename", m_filename,
65 "Name of the root file where the NeuroTrigger parameters will be saved.",
66 string("GRLNeuroTrigger.root"));
67 addParam("trainFilename", m_trainFilename,
68 "Name of the root file where the generated training samples will be saved.",
69 string("GRLNeuroTrigger.root"));
70 addParam("arrayname", m_arrayname,
71 "Name of the TObjArray to hold the NeuroTrigger parameters.",
72 string("MLPs"));
73 addParam("trainArrayname", m_trainArrayname,
74 "Name of the TObjArray to hold the training samples.",
75 string("trainSets"));
76 addParam("saveDebug", m_saveDebug,
77 "If true, save parameter distribution of training data "
78 "in train file and training curve in log file.", true);
79 addParam("load", m_load,
80 "Switch to load saved parameters if existing. "
81 "Take care not to duplicate training sets!", false);
82 // NeuroTrigger parameters
83 addParam("nMLP", m_parameters.nMLP,
84 "Number of expert MLPs.", m_parameters.nMLP);
85 addParam("n_cdc_sector", m_parameters.n_cdc_sector,
86 "Number of expert CDC MLPs.", m_parameters.n_cdc_sector);
87 addParam("n_ecl_sector", m_parameters.n_ecl_sector,
88 "Number of expert ECL MLPs.", m_parameters.n_ecl_sector);
89 addParam("i_cdc_sector", m_parameters.i_cdc_sector,
90 "#cdc track of expert MLPs.", m_parameters.i_cdc_sector);
91 addParam("i_ecl_sector", m_parameters.i_ecl_sector,
92 "#ecl cluster of expert MLPs.", m_parameters.i_ecl_sector);
93 addParam("nHidden", m_parameters.nHidden,
94 "Number of nodes in each hidden layer for all networks "
95 "or factor to multiply with number of inputs (1 list or nMLP lists). "
96 "The number of layers is derived from the shape.", m_parameters.nHidden);
97 addParam("multiplyHidden", m_parameters.multiplyHidden,
98 "If true, multiply nHidden with number of input nodes.",
99 m_parameters.multiplyHidden);
100 addParam("outputScale", m_parameters.outputScale,
101 "Output scale for all networks (1 value list or nMLP value lists). "
102 "Output[i] of the MLP is scaled from [-1, 1] "
103 "to [outputScale[2*i], outputScale[2*i+1]]. "
104 "(units: z[cm] / theta[degree])", m_parameters.outputScale);
105 addParam("nTrainMin", m_nTrainMin,
106 "Minimal number of training samples "
107 "or factor to multiply with number of weights. "
108 "If the minimal number of samples is not reached, "
109 "all samples are saved but no training is started.", 10.);
110 addParam("nTrainMax", m_nTrainMax,
111 "Maximal number of training samples "
112 "or factor to multiply with number of weights. "
113 "When the maximal number of samples is reached, "
114 "no further samples are added.", 10.);
115 addParam("multiplyNTrain", m_multiplyNTrain,
116 "If true, multiply nTrainMin and nTrainMax with number of weights.",
117 true);
118 addParam("nValid", m_nValid,
119 "Number of validation samples for training.", 1000);
120 addParam("nTest", m_nTest,
121 "Number of test samples to get resolution after training.", 5000);
122 addParam("wMax", m_wMax,
123 "Weights are limited to [-wMax, wMax] after each training epoch "
124 "(for convenience of the FPGA implementation).",
125 63.);
126 addParam("nThreads", m_nThreads,
127 "Number of threads for parallel training.", 1);
128 addParam("checkInterval", m_checkInterval,
129 "Training is stopped if validation error is higher than "
130 "checkInterval epochs ago, i.e. either the validation error is increasing "
131 "or the gain is less than the fluctuations.", 500);
132 addParam("maxEpochs", m_maxEpochs,
133 "Maximum number of training epochs.", 10000);
134 addParam("repeatTrain", m_repeatTrain,
135 "If >1, training is repeated several times with different start weights. "
136 "The weights which give the best resolution on the test samples are kept.", 1);
137}
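As described for the outputScale parameter above, output i of the MLP is scaled from [-1, 1] to [outputScale[2*i], outputScale[2*i+1]]. A sketch of that linear mapping (the function name is illustrative, not part of basf2):

```python
def scale_output(raw, output_scale, i):
    """Linearly map raw in [-1, 1] to [output_scale[2*i], output_scale[2*i+1]]."""
    lo, hi = output_scale[2 * i], output_scale[2 * i + 1]
    return lo + (raw + 1.0) * 0.5 * (hi - lo)

# Map output 0 to a hypothetical z range of [-50, 50] cm:
scale = [-50.0, 50.0]
print(scale_output(0.0, scale, 0))   # 0.0
print(scale_output(-1.0, scale, 0))  # -50.0
```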

◆ ~GRLNeuroTrainerModule()

virtual ~GRLNeuroTrainerModule ( )
inlinevirtual

Destructor.

Definition at line 35 of file GRLNeuroTrainerModule.h.

35{}

Member Function Documentation

◆ beginRun()

virtual void beginRun ( void  )
inlinevirtualinherited

Called when entering a new run.

Called at the beginning of each run, the method gives you the chance to change run dependent constants like alignment parameters, etc.

This method can be implemented by subclasses.

Reimplemented in ARICHBackgroundModule, BeamabortModule, BgoModule, CaveModule, ClawModule, CLAWSModule, DosiModule, FANGSModule, He3tubeModule, MicrotpcModule, Ph1bpipeModule, Ph1sustrModule, PindiodeModule, PlumeModule, QcsmonitorModule, SrsensorModule, GetEventFromSocketModule, CalibrationCollectorModule, EventsOfDoomBusterModule, CosmicsAlignmentValidationModule, EnergyBiasCorrectionModule, ChargedPidMVAModule, ChargedPidMVAMulticlassModule, CurlTaggerModule, LowEnergyPi0IdentificationExpertModule, LowEnergyPi0VetoExpertModule, ParticleVertexFitterModule, PhotonEfficiencySystematicsModule, TagVertexModule, TreeFitterModule, arichBtestModule, ARICHDigitizerModule, ARICHDQMModule, ARICHRateCalModule, ARICHReconstructorModule, B2BIIMCParticlesMonitorModule, B2BIIConvertBeamParamsModule, B2BIIConvertMdstModule, B2BIIFixMdstModule, B2BIIMdstInputModule, BelleMCOutputModule, BeamBkgGeneratorModule, BeamBkgHitRateMonitorModule, BeamBkgMixerModule, BeamBkgTagSetterModule, BGOverlayInputModule, AnalysisPhase1StudyModule, NtuplePhase1_v6Module, ReprocessorModule, BeamabortStudyModule, BeamDigitizerModule, BgoDigitizerModule, BgoStudyModule, ClawDigitizerModule, ClawStudyModule, ClawsDigitizerModule, ClawsStudyModule, CsiDigitizer_v2Module, CsIDigitizerModule, CsiModule, CsiStudy_v2Module, CsIStudyModule, DosiDigitizerModule, DosiStudyModule, FANGSDigitizerModule, FANGSStudyModule, He3DigitizerModule, He3tubeStudyModule, MicrotpcStudyModule, TpcDigitizerModule, PinDigitizerModule, PindiodeStudyModule, PlumeDigitizerModule, QcsmonitorDigitizerModule, QcsmonitorStudyModule, CDCCosmicAnalysisModule, CDCCRTestModule, cdcDQM7Module, CDCDQMModule, CDCPackerModule, CDCRecoTrackFilterModule, CDCUnpackerModule, DAQPerfModule, RxSocketModule, TxSocketModule, DqmHistoManagerModule, MonitorDataModule, TrackAnaModule, Ds2SampleModule, ReceiveEventModule, HLTDQM2ZMQModule, ElapsedTimeModule, DeSerializerPXDModule, GenRawSendModule, SerializerModule, CertifyParallelModule, Ds2RawModule, 
Ds2RbufModule, EvReductionModule, FastRbuf2DsModule, Raw2DsModule, RawInputModule, Rbuf2DsModule, Rbuf2RbufModule, Ds2RawFileModule, PartialSeqRootReaderModule, SeqRootMergerModule, StorageDeserializerModule, StorageSerializerModule, IPDQMModule, PhysicsObjectsDQMModule, PhysicsObjectsMiraBelleBhabhaModule, PhysicsObjectsMiraBelleDst2Module, PhysicsObjectsMiraBelleDstModule, PhysicsObjectsMiraBelleHadronModule, PhysicsObjectsMiraBelleModule, ECLBackgroundModule, ECLChargedPIDModule, ECLChargedPIDDataAnalysisModule, ECLChargedPIDDataAnalysisValidationModule, ECLChargedPIDMVAModule, ECLClusterPSDModule, ECLCovarianceMatrixModule, ECLCRFinderModule, ECLDataAnalysisModule, ECLDigitCalibratorModule, ECLDigitizerModule, ECLDigitizerPureCsIModule, EclDisplayModule, ECLDQMModule, ECLDQMConnectedRegionsModule, ECLDQMEXTENDEDModule, ECLDQMOutOfTimeDigitsModule, ECLFinalizerModule, ECLHitDebugModule, ECLLocalMaximumFinderModule, ECLLocalRunCalibratorModule, ECLLOMModule, ECLPackerModule, ECLShowerCorrectorModule, ECLShowerShapeModule, ECLSplitterN1Module, ECLSplitterN2Module, ECLUnpackerModule, ECLWaveformFitModule, HistoModule, SubEventModule, SwitchDataStoreModule, EventInfoPrinterModule, EventLimiterModule, IoVDependentConditionModule, ProgressModule, RandomBarrierModule, GearboxModule, HistoManagerModule, StatisticsSummaryModule, SeqRootInputModule, SeqRootOutputModule, RxModule, TxModule, EvtGenDecayModule, EvtGenInputModule, OverrideGenerationFlagsModule, KKGenInputModule, CreateFieldMapModule, ExportGeometryModule, SoftwareTriggerModule, SoftwareTriggerHLTDQMModule, StatisticsTimingHLTDQMModule, BKLMAnaModule, BKLMDigitAnalyzerModule, BKLMSimHistogrammerModule, BKLMTrackingModule, EKLMDataCheckerModule, KLMClusterAnaModule, KLMClusterEfficiencyModule, KLMClustersReconstructorModule, KLMDigitizerModule, KLMDigitTimeShifterModule, KLMDQMModule, KLMDQM2Module, KLMPackerModule, KLMReconstructorModule, KLMScintillatorSimulatorModule, KLMUnpackerModule, MVAExpertModule, 
MVAMultipleExpertsModule, MVAPrototypeModule, AWESOMEBasicModule, PXDBackgroundModule, PXDRawDQMChipsModule, PXDClustersFromTracksModule, PXDPerformanceModule, PXDClusterizerModule, Convert2RawDetModule, CDCDedxDQMModule, CDCDedxValidationModule, EventT0DQMModule, EventT0ValidationModule, DataWriterModule, ECLExpertModule, KLMExpertModule, KlongValidationModule, KLMMuonIDDNNExpertModule, FullSimModule, MaterialScanModule, SVDBackgroundModule, SVDClusterCalibrationsMonitorModule, SVDHotStripFinderModule, SVDLatencyCalibrationModule, SVDLocalCalibrationsCheckModule, SVDLocalCalibrationsMonitorModule, SVDPositionErrorScaleFactorImporterModule, SVDTimeCalibrationsMonitorModule, SVDDQMHitTimeModule, svdDumpModule, SVDPackerModule, SVDB4CommissioningPlotsModule, SVDClusterEvaluationModule, SVDClusterEvaluationTrueInfoModule, SVDClusterFilterModule, SVDMaxStripTTreeModule, SVDOccupancyAnalysisModule, SVDPerformanceModule, SVDPerformanceTTreeModule, SVDShaperDigitsFromTracksModule, SVDClusterizerModule, SVDCoGTimeEstimatorModule, SVDDataFormatCheckModule, SVDMissingAPVsClusterCreatorModule, SVDRecoDigitCreatorModule, SVD3SamplesEmulatorModule, SVDDigitizerModule, SVDEventInfoSetterModule, SVDTriggerQualityGeneratorModule, SVDSpacePointCreatorModule, SVDTimeGroupingModule, SVDUnpackerModule, TOPBackgroundModule, TOPBunchFinderModule, TOPChannelMaskerModule, TOPChannelT0MCModule, TOPDigitizerModule, TOPTriggerDigitizerModule, TOPDoublePulseGeneratorModule, TOPDQMModule, TOPGainEfficiencyCalculatorModule, TOPLaserHitSelectorModule, TOPInterimFENtupleModule, TOPLaserCalibratorModule, TOPMCTrackMakerModule, TOPModuleT0CalibratorModule, TOPNtupleModule, TOPPackerModule, TOPRawDigitConverterModule, TOPTBCComparatorModule, TOPTimeBaseCalibratorModule, TOPTimeRecalibratorModule, TOPUnpackerModule, TOPWaveformFeatureExtractorModule, TOPXTalkChargeShareSetterModule, DQMHistoModuleBase, SVDEventT0EstimatorModule, ExtModule, FlipQualityModule, BeamSpotMonitorModule, KinkFinderModule, 
MCV0MatcherModule, MCTrackCandClassifierModule, MuidModule, PXDROIFinderModule, SVDROIFinderAnalysisModule, SVDROIFinderModule, SPTCmomentumSeedRetrieverModule, SPTCvirtualIPRemoverModule, TrackCreatorModule, TrackFinderMCTruthRecoTracksModule, EffPlotsModule, HitXPModule, TrackingPerformanceEvaluationModule, V0findingPerformanceEvaluationModule, TrackQETrainingDataCollectorModule, TrackQualityEstimatorMVAModule, SecMapTrainerBaseModule, SecMapTrainerVXDTFModule, TrackFinderVXDAnalizerModule, VXDSimpleClusterizerModule, QualityEstimatorVXDModule, VXDQETrainingDataCollectorModule, VXDQualityEstimatorMVAModule, SectorMapBootstrapModule, SegmentNetworkProducerModule, TrackFinderVXDBasicPathFinderModule, TrackFinderVXDCellOMatModule, VXDTFTrainingDataCollectorModule, FindletModule< AFindlet >, FindletModule< HitBasedT0Extractor >, FindletModule< CKFToSVDSeedFindlet >, FindletModule< CKFToSVDFindlet >, FindletModule< CosmicsTrackMergerFindlet >, FindletModule< DATCONFPGAFindlet >, FindletModule< MCVXDCDCTrackMergerFindlet >, FindletModule< vxdHoughTracking::SVDHoughTracking >, FindletModule< CKFToCDCFindlet >, FindletModule< CKFToCDCFromEclFindlet >, FindletModule< CKFToPXDFindlet >, FindletModule< AsicBackgroundLibraryCreator >, FindletModule< CDCTrackingEventLevelMdstInfoFillerFromHitsFindlet >, FindletModule< CDCTrackingEventLevelMdstInfoFillerFromSegmentsFindlet >, FindletModule< AxialSegmentPairCreator >, FindletModule< AxialStraightTrackFinder >, FindletModule< AxialTrackCreatorMCTruth >, FindletModule< AxialTrackCreatorSegmentHough >, FindletModule< AxialTrackFinderHough >, FindletModule< AxialTrackFinderLegendre >, FindletModule< ClusterBackgroundDetector >, FindletModule< ClusterPreparer >, FindletModule< ClusterRefiner< BridgingWireHitRelationFilter > >, FindletModule< FacetCreator >, FindletModule< HitReclaimer >, FindletModule< MonopoleAxialTrackFinderLegendre >, FindletModule< MonopoleStereoHitFinder >, FindletModule< MonopoleStereoHitFinderQuadratic >, 
FindletModule< SegmentCreatorFacetAutomaton >, FindletModule< SegmentCreatorMCTruth >, FindletModule< SegmentFinderFacetAutomaton >, FindletModule< SegmentFitter >, FindletModule< SegmentLinker >, FindletModule< SegmentOrienter >, FindletModule< SegmentPairCreator >, FindletModule< SegmentRejecter >, FindletModule< SegmentTrackCombiner >, FindletModule< SegmentTripleCreator >, FindletModule< StereoHitFinder >, FindletModule< SuperClusterCreator >, FindletModule< TrackCombiner >, FindletModule< TrackCreatorSegmentPairAutomaton >, FindletModule< TrackCreatorSegmentTripleAutomaton >, FindletModule< TrackCreatorSingleSegments >, FindletModule< TrackExporter >, FindletModule< TrackFinderAutomaton >, FindletModule< TrackFinderCosmics >, FindletModule< TrackFinder >, FindletModule< TrackFinderSegmentPairAutomaton >, FindletModule< TrackFinderSegmentTripleAutomaton >, FindletModule< TrackFlightTimeAdjuster >, FindletModule< TrackLinker >, FindletModule< TrackOrienter >, FindletModule< TrackQualityAsserter >, FindletModule< TrackQualityEstimator >, FindletModule< TrackRejecter >, FindletModule< WireHitBackgroundDetector >, FindletModule< WireHitCreator >, FindletModule< WireHitPreparer >, CDCTriggerNeuroDQMModule, CDCTriggerNeuroDQMOnlineModule, CDCTriggerNDFinderModule, CDCTriggerTSFModule, TRGCDCModule, TRGCDCETFUnpackerModule, TRGCDCT2DDQMModule, TRGCDCT3DConverterModule, TRGCDCT3DDQMModule, TRGCDCT3DUnpackerModule, TRGCDCTSFDQMModule, TRGCDCTSFUnpackerModule, TRGCDCTSStreamModule, CDCTriggerUnpackerModule, MCMatcherTRGECLModule, TRGECLFAMModule, TRGECLModule, TRGECLBGTCHitModule, TRGECLDQMModule, TRGECLEventTimingDQMModule, TRGECLQAMModule, TRGECLRawdataAnalysisModule, TRGECLTimingCalModule, TRGECLUnpackerModule, TRGGDLModule, TRGEFFDQMModule, TRGGDLDQMModule, TRGGDLDSTModule, TRGGDLSummaryModule, TRGGDLUnpackerModule, TRGGRLMatchModule, TRGGRLModule, TRGGRLProjectsModule, TRGGRLDQMModule, TRGGRLUnpackerModule, KLMTriggerModule, TRGTOPDQMModule, 
TRGTOPTRD2TTSConverterModule, TRGTOPUnpackerModule, TRGTOPUnpackerWaveformModule, TRGTOPWaveformPlotterModule, TRGRAWDATAModule, VXDMisalignmentModule, DQMHistAnalysisARICHModule, DQMHistAnalysisCDCDedxModule, DQMHistAnalysisCDCEpicsModule, DQMHistAnalysisCDCMonObjModule, DQMHistAnalysisDAQMonObjModule, DQMHistAnalysisECLModule, DQMHistAnalysisECLConnectedRegionsModule, DQMHistAnalysisECLShapersModule, DQMHistAnalysisECLSummaryModule, DQMHistAnalysisEpicsExampleModule, DQMHistAnalysisEventT0EfficiencyModule, DQMHistAnalysisEventT0TriggerJitterModule, DQMHistAnalysisExampleModule, DQMHistAnalysisExampleFlagsModule, DQMHistAnalysisHLTModule, DQMHistAnalysisInput2Module, DQMHistAnalysisInputPVSrvModule, DQMHistAnalysisInputRootFileModule, DQMHistAnalysisInputTestModule, DQMHistAnalysisKLMModule, DQMHistAnalysisKLM2Module, DQMHistAnalysisMiraBelleModule, DQMHistAnalysisOutputMonObjModule, DQMHistAnalysisOutputRelayMsgModule, DQMHistAnalysisPeakModule, DQMHistAnalysisPXDERModule, DQMHistAnalysisPXDFitsModule, DQMHistAnalysisSVDClustersOnTrackModule, DQMHistAnalysisSVDDoseModule, DQMHistAnalysisSVDEfficiencyModule, DQMHistAnalysisSVDGeneralModule, DQMHistAnalysisSVDOccupancyModule, DQMHistAnalysisSVDOnMiraBelleModule, DQMHistAnalysisSVDUnpackerModule, DQMHistAnalysisTOPModule, DQMHistAnalysisTrackingAbortModule, DQMHistAnalysisTrackingHLTModule, DQMHistAnalysisTRGECLModule, DQMHistAutoCanvasModule, DQMHistComparitorModule, DQMHistDeltaHistoModule, DQMHistReferenceModule, DQMHistSnapshotsModule, DAQMonitorModule, DelayDQMModule, V0ObjectsDQMModule, ECLDQMInjectionModule, PyModule, PXDBgTupleProducerModule, PXDMCBgTupleProducerModule, PXDDAQDQMModule, PXDDQMClustersModule, PXDDQMCorrModule, PXDDQMEfficiencyModule, PXDDQMEfficiencySelftrackModule, PXDDQMExpressRecoModule, PXDGatedDHCDQMModule, PXDGatedModeDQMModule, PXDInjectionDQMModule, PXDRawDQMCorrModule, PXDRawDQMModule, PXDROIDQMModule, PXDTrackClusterDQMModule, PXDDigitizerModule, PXDPackerModule, PXDUnpackerModule, 
TTDDQMModule, DetectorOccupanciesDQMModule, SVDDQMClustersOnTrackModule, SVDDQMDoseModule, SVDDQMExpressRecoModule, SVDDQMInjectionModule, SVDUnpackerDQMModule, PXDclusterFilterModule, PXDdigiFilterModule, PXDROIFinderAnalysisModule, TrackingAbortDQMModule, VXDDQMExpressRecoModule, vxdDigitMaskingModule, DQMHistAnalysisDeltaEpicsMonObjExampleModule, DQMHistAnalysisDeltaTestModule, DQMHistAnalysisEpicsOutputModule, DQMHistAnalysisPhysicsModule, DQMHistAnalysisPXDChargeModule, DQMHistAnalysisPXDCMModule, DQMHistAnalysisPXDDAQModule, DQMHistAnalysisPXDEffModule, DQMHistAnalysisPXDInjectionModule, DQMHistAnalysisPXDReductionModule, DQMHistAnalysisPXDTrackChargeModule, DQMHistAnalysisRooFitExampleModule, DQMHistAnalysisRunNrModule, DQMHistAnalysisTRGModule, DQMHistInjectionModule, and DQMHistOutputToEPICSModule.

Definition at line 147 of file Module.h.

147{};

◆ clone()

std::shared_ptr< PathElement > clone ( ) const
overridevirtualinherited

Create an independent copy of this module.

Note that parameters are shared, so changing them on a cloned module will also affect the original module.

Implements PathElement.

Definition at line 179 of file Module.cc.

180{
181 ModulePtr newModule = ModuleManager::Instance().registerModule(getType());
182 newModule->m_moduleParamList.setParameters(getParamList());
183 newModule->setName(getName());
184 newModule->m_package = m_package;
185 newModule->m_propertyFlags = m_propertyFlags;
186 newModule->m_logConfig = m_logConfig;
187 newModule->m_conditions = m_conditions;
188
189 return newModule;
190}
std::shared_ptr< Module > registerModule(const std::string &moduleName, std::string sharedLibPath="") noexcept(false)
Creates an instance of a module and registers it to the ModuleManager.
static ModuleManager & Instance()
Exception is thrown if the requested module could not be created by the ModuleManager.
const ModuleParamList & getParamList() const
Return module param list.
Definition: Module.h:363
const std::string & getName() const
Returns the name of the module.
Definition: Module.h:187
const std::string & getType() const
Returns the type of the module (i.e. class name minus 'Module').
Definition: Module.cc:41
unsigned int m_propertyFlags
The properties of the module as bitwise or (with |) of EModulePropFlags.
Definition: Module.h:512
LogConfig m_logConfig
The log system configuration of the module.
Definition: Module.h:514
std::vector< ModuleCondition > m_conditions
Module condition, only non-null if set.
Definition: Module.h:521
std::string m_package
Package this module is found in (may be empty).
Definition: Module.h:510
std::shared_ptr< Module > ModulePtr
Defines a pointer to a module object as a boost shared pointer.
Definition: Module.h:43
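The note above, that parameters are shared between a module and its clone, can be illustrated with a small self-contained Python sketch (hypothetical classes, not the basf2 API): both objects refer to the same parameter container, so a change made through the clone is visible on the original.

```python
class ParamList:
    """Hypothetical stand-in for ModuleParamList: a parameter container."""
    def __init__(self):
        self.params = {}

class SketchModule:
    """Hypothetical stand-in for Module, mimicking the documented clone() semantics."""
    def __init__(self, name):
        self.name = name
        self.param_list = ParamList()

    def clone(self):
        # Like Module::clone(): the new module gets the same name, but the
        # parameter storage is shared with the original, not deep-copied.
        new_module = SketchModule(self.name)
        new_module.param_list = self.param_list  # shared, not copied
        return new_module

original = SketchModule("GRLNeuroTrainer")
copy = original.clone()
copy.param_list.params["nMLP"] = 5
# The parameter set through the clone is now visible on the original as well.
```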

◆ def_beginRun()

virtual void def_beginRun ( )
inlineprotectedvirtualinherited

Wrapper method for the virtual function beginRun() that has the implementation to be used in a call from Python.

Reimplemented in PyModule.

Definition at line 426 of file Module.h.

426{ beginRun(); }
virtual void beginRun()
Called when entering a new run.
Definition: Module.h:147

◆ def_endRun()

virtual void def_endRun ( )
inlineprotectedvirtualinherited

Wrapper method for the virtual function endRun(), used to signal from the Python side that the current run ends.

For regular C++ modules it forwards the call to the regular endRun() method.

Reimplemented in PyModule.

Definition at line 439 of file Module.h.

439{ endRun(); }
virtual void endRun()
This method is called if the current run ends.
Definition: Module.h:166

◆ def_event()

virtual void def_event ( )
inlineprotectedvirtualinherited

Wrapper method for the virtual function event() that has the implementation to be used in a call from Python.

Reimplemented in PyModule.

Definition at line 432 of file Module.h.

432{ event(); }
virtual void event()
This method is the core of the module.
Definition: Module.h:157

◆ def_initialize()

virtual void def_initialize ( )
inlineprotectedvirtualinherited

Wrapper method for the virtual function initialize() that has the implementation to be used in a call from Python.

Together with the other "def_" wrappers, it makes the methods without the "def_" prefix callable from Python.

Reimplemented in PyModule.

Definition at line 420 of file Module.h.

420{ initialize(); }
virtual void initialize()
Initialize the Module.
Definition: Module.h:109

◆ def_terminate()

virtual void def_terminate ( )
inlineprotectedvirtualinherited

Wrapper method for the virtual function terminate() that has the implementation to be used in a call from Python.

Reimplemented in PyModule.

Definition at line 445 of file Module.h.

445{ terminate(); }
virtual void terminate()
This method is called at the end of the event processing.
Definition: Module.h:176
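All of the def_* wrappers above follow the same pattern: a virtual def_X() that simply forwards to X(), so the Python side can intercept the call by overriding the wrapper. A minimal Python analogue of that pattern (hypothetical class names, not the basf2 classes):

```python
class Base:
    def def_event(self):
        # Forwarding wrapper, mirroring "virtual void def_event() { event(); }"
        # in Module.h: by default the call goes straight to event().
        return self.event()

    def event(self):
        return "C++ event()"

class PyLike(Base):
    # A Python-side module overrides the def_* wrapper itself, analogous to
    # the "Reimplemented in PyModule" notes above, so the call never reaches
    # the base-class event().
    def def_event(self):
        return "Python event()"
```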

◆ endRun()

virtual void endRun ( void  )
inlinevirtualinherited

This method is called if the current run ends.

Use this method to store information that should be aggregated over one run.

This method can be implemented by subclasses.

Reimplemented in BeamabortModule, BgoModule, CaveModule, ClawModule, CLAWSModule, DosiModule, FANGSModule, He3tubeModule, MicrotpcModule, Ph1bpipeModule, Ph1sustrModule, PindiodeModule, PlumeModule, QcsmonitorModule, SrsensorModule, GetEventFromSocketModule, CalibrationCollectorModule, AlignDQMModule, CosmicsAlignmentValidationModule, CurlTaggerModule, LowEnergyPi0IdentificationExpertModule, LowEnergyPi0VetoExpertModule, arichBtestModule, ARICHDQMModule, B2BIIMCParticlesMonitorModule, B2BIIConvertMdstModule, B2BIIMdstInputModule, BelleMCOutputModule, BeamBkgGeneratorModule, BeamBkgHitRateMonitorModule, BeamBkgMixerModule, BeamBkgTagSetterModule, BGOverlayInputModule, AnalysisPhase1StudyModule, NtuplePhase1_v6Module, ReprocessorModule, BeamabortStudyModule, BeamDigitizerModule, BgoDigitizerModule, BgoStudyModule, ClawDigitizerModule, ClawStudyModule, ClawsDigitizerModule, ClawsStudyModule, CsiDigitizer_v2Module, CsIDigitizerModule, CsiModule, CsiStudy_v2Module, CsIStudyModule, DosiDigitizerModule, DosiStudyModule, FANGSDigitizerModule, FANGSStudyModule, He3DigitizerModule, He3tubeStudyModule, MicrotpcStudyModule, TpcDigitizerModule, TPCStudyModule, PinDigitizerModule, PindiodeStudyModule, PlumeDigitizerModule, QcsmonitorDigitizerModule, QcsmonitorStudyModule, CDCCosmicAnalysisModule, CDCCRTestModule, cdcDQM7Module, CDCDQMModule, CDCPackerModule, CDCRecoTrackFilterModule, CDCUnpackerModule, DAQPerfModule, RxSocketModule, TxSocketModule, DqmHistoManagerModule, MonitorDataModule, TrackAnaModule, Ds2SampleModule, ReceiveEventModule, HLTDQM2ZMQModule, HLTDs2ZMQModule, ElapsedTimeModule, DeSerializerPXDModule, GenRawSendModule, Root2RawModule, SerializerModule, CertifyParallelModule, Ds2RawModule, Ds2RbufModule, EvReductionModule, FastRbuf2DsModule, Raw2DsModule, RawInputModule, Rbuf2DsModule, Rbuf2RbufModule, Ds2RawFileModule, PartialSeqRootReaderModule, SeqRootMergerModule, StorageDeserializerModule, StorageRootOutputModule, StorageSerializerModule, 
PhysicsObjectsDQMModule, PhysicsObjectsMiraBelleBhabhaModule, PhysicsObjectsMiraBelleDst2Module, PhysicsObjectsMiraBelleDstModule, PhysicsObjectsMiraBelleHadronModule, PhysicsObjectsMiraBelleModule, ECLBackgroundModule, ECLChargedPIDModule, ECLChargedPIDDataAnalysisModule, ECLChargedPIDDataAnalysisValidationModule, ECLClusterPSDModule, ECLCovarianceMatrixModule, ECLCRFinderModule, ECLDataAnalysisModule, ECLDigitCalibratorModule, ECLDigitizerModule, ECLDigitizerPureCsIModule, EclDisplayModule, ECLDQMModule, ECLDQMEXTENDEDModule, ECLFinalizerModule, ECLHitDebugModule, ECLLocalMaximumFinderModule, ECLLocalRunCalibratorModule, ECLLOMModule, ECLPackerModule, ECLShowerCorrectorModule, ECLShowerShapeModule, ECLSplitterN1Module, ECLSplitterN2Module, ECLUnpackerModule, ECLWaveformFitModule, HistoModule, SubEventModule, SwitchDataStoreModule, EventInfoPrinterModule, RandomBarrierModule, HistoManagerModule, StatisticsSummaryModule, SeqRootInputModule, SeqRootOutputModule, RxModule, TxModule, ZMQTxInputModule, ZMQTxWorkerModule, EvtGenDecayModule, OverrideGenerationFlagsModule, BKLMAnaModule, BKLMDigitAnalyzerModule, BKLMSimHistogrammerModule, BKLMTrackingModule, EKLMDataCheckerModule, KLMClusterEfficiencyModule, KLMClustersReconstructorModule, KLMDigitizerModule, KLMDQMModule, KLMDQM2Module, KLMPackerModule, KLMReconstructorModule, KLMScintillatorSimulatorModule, KLMUnpackerModule, AWESOMEBasicModule, PXDBackgroundModule, PXDClustersFromTracksModule, PXDPerformanceModule, Convert2RawDetModule, PrintDataModule, PrintEventRateModule, Root2BinaryModule, CDCDedxDQMModule, CDCDedxValidationModule, EventT0ValidationModule, DataWriterModule, KlongValidationModule, KLMMuonIDDNNExpertModule, FullSimModule, SVDBackgroundModule, SVDClusterCalibrationsMonitorModule, SVDHotStripFinderModule, SVDLatencyCalibrationModule, SVDLocalCalibrationsMonitorModule, SVDPositionErrorScaleFactorImporterModule, SVDTimeCalibrationsMonitorModule, svdDumpModule, SVDPackerModule, 
SVDB4CommissioningPlotsModule, SVDClusterEvaluationModule, SVDClusterEvaluationTrueInfoModule, SVDClusterFilterModule, SVDOccupancyAnalysisModule, SVDPerformanceModule, SVDShaperDigitsFromTracksModule, SVDClusterizerModule, SVDCoGTimeEstimatorModule, SVDDataFormatCheckModule, SVDRecoDigitCreatorModule, SVD3SamplesEmulatorModule, SVDTriggerQualityGeneratorModule, SVDUnpackerModule, TOPBackgroundModule, TOPChannelT0MCModule, TOPTriggerDigitizerModule, TOPDoublePulseGeneratorModule, TOPGainEfficiencyCalculatorModule, TOPLaserHitSelectorModule, TOPInterimFENtupleModule, TOPLaserCalibratorModule, TOPMCTrackMakerModule, TOPNtupleModule, TOPPackerModule, TOPRawDigitConverterModule, TOPTBCComparatorModule, TOPTimeBaseCalibratorModule, TOPUnpackerModule, TOPWaveformFeatureExtractorModule, TOPWaveformQualityPlotterModule, TOPXTalkChargeShareSetterModule, ExtModule, GenfitVisModule, MCV0MatcherModule, MCTrackCandClassifierModule, MuidModule, MCSlowPionPXDROICreatorModule, PXDROIFinderModule, SVDROIDQMModule, SVDROIFinderAnalysisModule, SVDROIFinderModule, RT2SPTCConverterModule, SPTCmomentumSeedRetrieverModule, SPTCvirtualIPRemoverModule, TrackFinderMCTruthRecoTracksModule, EffPlotsModule, HitXPModule, TrackingPerformanceEvaluationModule, V0findingPerformanceEvaluationModule, SecMapTrainerBaseModule, SecMapTrainerVXDTFModule, TrackFinderVXDAnalizerModule, VXDSimpleClusterizerModule, NoKickCutsEvalModule, SectorMapBootstrapModule, VXDTFTrainingDataCollectorModule, FindletModule< AFindlet >, FindletModule< HitBasedT0Extractor >, FindletModule< CKFToSVDSeedFindlet >, FindletModule< CKFToSVDFindlet >, FindletModule< CosmicsTrackMergerFindlet >, FindletModule< DATCONFPGAFindlet >, FindletModule< MCVXDCDCTrackMergerFindlet >, FindletModule< vxdHoughTracking::SVDHoughTracking >, FindletModule< CKFToCDCFindlet >, FindletModule< CKFToCDCFromEclFindlet >, FindletModule< CKFToPXDFindlet >, FindletModule< AsicBackgroundLibraryCreator >, FindletModule< 
CDCTrackingEventLevelMdstInfoFillerFromHitsFindlet >, FindletModule< CDCTrackingEventLevelMdstInfoFillerFromSegmentsFindlet >, FindletModule< AxialSegmentPairCreator >, FindletModule< AxialStraightTrackFinder >, FindletModule< AxialTrackCreatorMCTruth >, FindletModule< AxialTrackCreatorSegmentHough >, FindletModule< AxialTrackFinderHough >, FindletModule< AxialTrackFinderLegendre >, FindletModule< ClusterBackgroundDetector >, FindletModule< ClusterPreparer >, FindletModule< ClusterRefiner< BridgingWireHitRelationFilter > >, FindletModule< FacetCreator >, FindletModule< HitReclaimer >, FindletModule< MonopoleAxialTrackFinderLegendre >, FindletModule< MonopoleStereoHitFinder >, FindletModule< MonopoleStereoHitFinderQuadratic >, FindletModule< SegmentCreatorFacetAutomaton >, FindletModule< SegmentCreatorMCTruth >, FindletModule< SegmentFinderFacetAutomaton >, FindletModule< SegmentFitter >, FindletModule< SegmentLinker >, FindletModule< SegmentOrienter >, FindletModule< SegmentPairCreator >, FindletModule< SegmentRejecter >, FindletModule< SegmentTrackCombiner >, FindletModule< SegmentTripleCreator >, FindletModule< StereoHitFinder >, FindletModule< SuperClusterCreator >, FindletModule< TrackCombiner >, FindletModule< TrackCreatorSegmentPairAutomaton >, FindletModule< TrackCreatorSegmentTripleAutomaton >, FindletModule< TrackCreatorSingleSegments >, FindletModule< TrackExporter >, FindletModule< TrackFinderAutomaton >, FindletModule< TrackFinderCosmics >, FindletModule< TrackFinder >, FindletModule< TrackFinderSegmentPairAutomaton >, FindletModule< TrackFinderSegmentTripleAutomaton >, FindletModule< TrackFlightTimeAdjuster >, FindletModule< TrackLinker >, FindletModule< TrackOrienter >, FindletModule< TrackQualityAsserter >, FindletModule< TrackQualityEstimator >, FindletModule< TrackRejecter >, FindletModule< WireHitBackgroundDetector >, FindletModule< WireHitCreator >, FindletModule< WireHitPreparer >, CDCTriggerNeuroDQMModule, CDCTriggerNeuroDQMOnlineModule, 
CDCTriggerNDFinderModule, TRGCDCModule, TRGCDCETFUnpackerModule, TRGCDCT2DDQMModule, TRGCDCT3DConverterModule, TRGCDCT3DDQMModule, TRGCDCT3DUnpackerModule, TRGCDCTSFDQMModule, TRGCDCTSFUnpackerModule, TRGCDCTSStreamModule, MCMatcherTRGECLModule, TRGECLFAMModule, TRGECLModule, TRGECLBGTCHitModule, TRGECLDQMModule, TRGECLQAMModule, TRGECLRawdataAnalysisModule, TRGECLTimingCalModule, TRGECLUnpackerModule, TRGGDLModule, TRGEFFDQMModule, TRGGDLDQMModule, TRGGDLDSTModule, TRGGDLSummaryModule, TRGGDLUnpackerModule, TRGGRLMatchModule, TRGGRLModule, TRGGRLProjectsModule, TRGGRLDQMModule, TRGGRLUnpackerModule, KLMTriggerModule, TRGTOPDQMModule, TRGTOPTRD2TTSConverterModule, TRGTOPUnpackerModule, TRGTOPUnpackerWaveformModule, TRGTOPWaveformPlotterModule, TRGRAWDATAModule, DQMHistAnalysisARICHModule, DQMHistAnalysisARICHMonObjModule, DQMHistAnalysisCDCDedxModule, DQMHistAnalysisCDCEpicsModule, DQMHistAnalysisCDCMonObjModule, DQMHistAnalysisDAQMonObjModule, DQMHistAnalysisECLModule, DQMHistAnalysisECLConnectedRegionsModule, DQMHistAnalysisECLOutOfTimeDigitsModule, DQMHistAnalysisECLShapersModule, DQMHistAnalysisECLSummaryModule, DQMHistAnalysisEpicsExampleModule, DQMHistAnalysisExampleModule, DQMHistAnalysisExampleFlagsModule, DQMHistAnalysisHLTMonObjModule, DQMHistAnalysisInput2Module, DQMHistAnalysisInputPVSrvModule, DQMHistAnalysisInputTestModule, DQMHistAnalysisKLMModule, DQMHistAnalysisKLM2Module, DQMHistAnalysisMiraBelleModule, DQMHistAnalysisMonObjModule, DQMHistAnalysisOutputFileModule, DQMHistAnalysisOutputMonObjModule, DQMHistAnalysisOutputRelayMsgModule, DQMHistAnalysisPXDFitsModule, DQMHistAnalysisSVDClustersOnTrackModule, DQMHistAnalysisSVDDoseModule, DQMHistAnalysisSVDEfficiencyModule, DQMHistAnalysisSVDGeneralModule, DQMHistAnalysisSVDOccupancyModule, DQMHistAnalysisSVDOnMiraBelleModule, DQMHistAnalysisSVDUnpackerModule, DQMHistAnalysisTOPModule, DQMHistAnalysisTRGECLModule, DQMHistAnalysisTRGEFFModule, DQMHistAnalysisTRGGDLModule, DQMHistComparitorModule, 
DQMHistDeltaHistoModule, DQMHistReferenceModule, DQMHistSnapshotsModule, PyModule, SVDUnpackerDQMModule, TrackSetEvaluatorHopfieldNNDEVModule, vxdDigitMaskingModule, DQMHistAnalysisDeltaEpicsMonObjExampleModule, DQMHistAnalysisDeltaTestModule, DQMHistAnalysisEpicsOutputModule, DQMHistAnalysisPhysicsModule, DQMHistAnalysisPXDChargeModule, DQMHistAnalysisPXDTrackChargeModule, DQMHistAnalysisRooFitExampleModule, DQMHistAnalysisTRGModule, and DQMHistOutputToEPICSModule.

Definition at line 166 of file Module.h.

166{};
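As a sketch of the intended use, aggregating information over one run and storing it when the run ends, here is a hypothetical module-like Python class (not a basf2 module) that counts events per run and records the total in endRun():

```python
class RunCounter:
    """Hypothetical sketch of run-wise aggregation via beginRun()/endRun()."""
    def __init__(self):
        self.per_run_totals = []
        self.n_events = 0

    def beginRun(self):
        self.n_events = 0          # reset the run-wise accumulator

    def event(self):
        self.n_events += 1         # aggregate over the current run

    def endRun(self):
        # The run ends: store the information aggregated over this run.
        self.per_run_totals.append(self.n_events)

counter = RunCounter()
for run_length in (3, 5):          # two runs with 3 and 5 events
    counter.beginRun()
    for _ in range(run_length):
        counter.event()
    counter.endRun()
```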

◆ evalCondition()

bool evalCondition ( ) const
inherited

Evaluates the conditions set for this module and returns true if at least one of them is fulfilled.

If no condition or return value was defined, the method returns false. To speed up the evaluation, the condition strings were already parsed in the method if_value().

Returns
True if at least one condition and return value exists and at least one condition expression was evaluated to true.

Definition at line 96 of file Module.cc.

97{
98 if (m_conditions.empty()) return false;
99
100 //okay, a condition was set for this Module...
101 if (!m_hasReturnValue) {
102 B2FATAL("A condition was set for '" << getName() << "', but the module did not set a return value!");
103 }
104
105 for (const auto& condition : m_conditions) {
106 if (condition.evaluate(m_returnValue)) {
107 return true;
108 }
109 }
110 return false;
111}
int m_returnValue
The return value.
Definition: Module.h:519
bool m_hasReturnValue
True, if the return value is set.
Definition: Module.h:518
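The evaluation logic above can be sketched in a few lines of Python (a hypothetical helper, not the basf2 implementation): no conditions means false, a condition without a return value is an error, and otherwise any fulfilled condition wins.

```python
import operator

# Comparison operators as allowed in if_value() expressions.
_OPS = {"<": operator.lt, ">": operator.gt, "<=": operator.le,
        ">=": operator.ge, "==": operator.eq, "!=": operator.ne}

def eval_condition(conditions, return_value):
    """Sketch of Module::evalCondition(): 'conditions' is a list of
    (operator_string, threshold) pairs, pre-parsed as in if_value()."""
    if not conditions:
        return False
    if return_value is None:
        # Mirrors the B2FATAL: a condition was set but no return value.
        raise RuntimeError("A condition was set, but no return value")
    return any(_OPS[op](return_value, threshold)
               for op, threshold in conditions)
```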

◆ event()

void event ( void  )
overridevirtual

Called once for each event.

Prepare input and target for each track and store it.

Reimplemented from Module.

Definition at line 233 of file GRLNeuroTrainerModule.cc.

234{
235 //inputs and outputs
236 std::vector<float> input;
237 std::vector<float> output;
238
240 std::vector<float> cdc2d_phi;
241 std::vector<float> cdc2d_pt;
242
243 //GRL input
245 int n_cdcf = 0;
246 int n_cdcs = 0;
247 int n_cdci = 0;
248 int n_cdc = 0;
249 int map_cdcf[36];
250 int map_cdcs[36];
251 int map_cdci[36];
252 for (int i = 0; i < 36; i++) {
253 map_cdcf[i] = 0;
254 map_cdcs[i] = 0;
255 map_cdci[i] = 0;
256 }
257
258 //full track
259 for (int i = 0; i < 36; i++) {
260 if (GRLStore->get_phi_CDC(i)) {
261 map_cdcf[i] = 1;
262 }
263 }
264
265 //short track
266 for (int i = 0; i < 64; i++) {
267 if (GRLStore->get_map_ST2(i)) {
268 int j = i * (36. / 64.);
269 map_cdcs[j] = 1;
270 }
271 }
272
273 //inner track
274 for (int i = 0; i < 64; i++) {
275 if (GRLStore->get_map_TSF0(i)) {
276 int j = i * (36. / 64.);
277 int j1 = i - 4;
278 if (j1 < 0) j1 = j1 + 64;
279 int j2 = i - 3;
280 if (j2 < 0) j2 = j2 + 64;
281 int j3 = i - 2;
282 if (j3 < 0) j3 = j3 + 64;
283 int j4 = i - 1;
284 if (j4 < 0) j4 = j4 + 64;
285 int j5 = i;
286 int j6 = i + 1;
287 if (j6 > 63) j6 = j6 - 64;
288 int j7 = i + 2;
289 if (j7 > 63) j7 = j7 - 64;
290 if (
291 (GRLStore->get_map_TSF1(j1) || GRLStore->get_map_TSF1(j2) || GRLStore->get_map_TSF1(j3) || GRLStore->get_map_TSF1(j4)
292 || GRLStore->get_map_TSF1(j5))
293 &&
294 (GRLStore->get_map_TSF2(j3) || GRLStore->get_map_TSF2(j4) || GRLStore->get_map_TSF2(j5) || GRLStore->get_map_TSF2(j6)
295 || GRLStore->get_map_TSF2(j7))
296 )
297 map_cdci[j] = 1;
298 }
299 }
300
301 //avoid overlap
302 for (int i = 0; i < 36; i++) {
303 if (map_cdcf[i] == 1) {
304 int i1 = i - 2;
305 if (i1 < 0) i1 = i1 + 36;
306 int i2 = i - 1;
307 if (i2 < 0) i2 = i2 + 36;
308 int i3 = i;
309 int i4 = i + 1;
310 // cppcheck-suppress knownConditionTrueFalse
311 if (i4 > 36) i4 = i4 - 36;
312 int i5 = i + 2;
313 if (i5 > 36) i5 = i5 - 36;
314 //map_cdcs[i1]=0;
315 map_cdcs[i2] = 0;
316 map_cdcs[i3] = 0;
317 map_cdcs[i4] = 0;
318 //map_cdcs[i5]=0;
319 //map_cdci[i1]=0;
320 map_cdci[i2] = 0;
321 map_cdci[i3] = 0;
322 map_cdci[i4] = 0;
323 //map_cdci[i5]=0;
324 }
325 }
326 for (int i = 0; i < 36; i++) {
327 if (map_cdcs[i] == 1) {
328 int i1 = i - 2;
329 if (i1 < 0) i1 = i1 + 36;
330 int i2 = i - 1;
331 if (i2 < 0) i2 = i2 + 36;
332 int i3 = i;
333 int i4 = i + 1;
334 // cppcheck-suppress knownConditionTrueFalse
335 if (i4 > 36) i4 = i4 - 36;
336 int i5 = i + 2;
337 if (i5 > 36) i5 = i5 - 36;
338 //map_cdci[i1]=0;
339 map_cdci[i2] = 0;
340 map_cdci[i3] = 0;
341 map_cdci[i4] = 0;
342 //map_cdci[i5]=0;
343 }
344 }
345
347 //for (int i = 0; i < 36; i++) {
348 // std::cout << map_cdcf[i] << " " ;
349 //}
350 //std::cout << std::endl;
351 //for (int i = 0; i < 36; i++) {
352 // std::cout << map_cdcs[i] << " " ;
353 //}
354 //std::cout << std::endl;
355 //for (int i = 0; i < 36; i++) {
356 // std::cout << map_cdci[i] << " " ;
357 //}
358 //std::cout << std::endl;
359
360 //count
361 for (int i = 0; i < 36; i++) {
362 if (map_cdcf[i] == 1) {n_cdcf++; n_cdc++;}
363 if (map_cdcs[i] == 1) {n_cdcs++; n_cdc++;}
364 if (map_cdci[i] == 1) {n_cdci++; n_cdc++;}
365 }
366
367 //input
368 for (int i = 0; i < 36; i++) {
369 input.push_back((map_cdcf[i] - 0.5) * 2);
370 }
371 for (int i = 0; i < 36; i++) {
372 input.push_back((map_cdcs[i] - 0.5) * 2);
373 }
374 for (int i = 0; i < 36; i++) {
375 input.push_back((map_cdci[i] - 0.5) * 2);
376 }
377
378 //ECL input
379 //..Use only clusters within 100 ns of event timing (from ECL).
380 StoreArray<TRGECLTrg> trgArray;
382 int ntrgArray = trgArray.getEntries();
383 double EventTiming = -9999.;
384 if (ntrgArray > 0) {EventTiming = trgArray[0]->getEventTiming();}
385 std::vector<int> selTC;
386 std::vector<float> selTheta;
387 std::vector<float> selPhi;
388 std::vector<float> selE;
389 for (int ic = 0; ic < eclTrgClusterArray.getEntries(); ic++) {
390 double tcT = abs(eclTrgClusterArray[ic]->getTimeAve() - EventTiming);
391 //if (tcT < 100.) {
392 int TC = eclTrgClusterArray[ic]->getMaxTCId();
393 selTC.push_back(TC);
394 selTheta.push_back(TCcotThetaLab[TC - 1]);
395 selPhi.push_back(TCPhiLab[TC - 1]);
396 selE.push_back(eclTrgClusterArray[ic]->getEnergyDep() * 0.001);
397 input.push_back(TCcotThetaLab[TC - 1] / TMath::Pi());
398 input.push_back(TCPhiLab[TC - 1] / TMath::Pi());
399 input.push_back((eclTrgClusterArray[ic]->getEnergyDep() * 0.001 - 3.5) / 3.5);
400 //}
401 B2DEBUG(50, "InputECL " << ic << " " << tcT << " " << TC << " " << TCcotThetaLab[TC - 1] << " " << TCPhiLab[TC - 1] << " " <<
402 eclTrgClusterArray[ic]->getEnergyDep() << " " << EventTiming);
403 }
404
405 //output
406 bool accepted_signal = false;
407 bool accepted_bg = false;
408 bool accepted_hadron = false;
409 bool accepted_filter = false;
410 bool accepted_bhabha = false;
412 if (result_soft.isValid()) {
413 const std::map<std::string, int>& skim_map = result_soft->getResults();
414 if (skim_map.find("software_trigger_cut&skim&accept_hadronb2") != skim_map.end()) {
415 accepted_hadron = (result_soft->getResult("software_trigger_cut&skim&accept_hadronb2") == SoftwareTriggerCutResult::c_accept);
416 }
417 if (skim_map.find("software_trigger_cut&filter&total_result") != skim_map.end()) {
418 accepted_filter = (result_soft->getResult("software_trigger_cut&filter&total_result") == SoftwareTriggerCutResult::c_accept);
419 }
420 if (skim_map.find("software_trigger_cut&skim&accept_bhabha") != skim_map.end()) {
421 accepted_bhabha = (result_soft->getResult("software_trigger_cut&skim&accept_bhabha") == SoftwareTriggerCutResult::c_accept);
422 }
423 }
424
425 accepted_signal = accepted_hadron && accepted_filter;
426 accepted_bg = !accepted_filter;
427
428 //input and output for NN training
429 int cdc_sector = cdc2d_phi.size();
430 int ecl_sector = selTC.size();
431 int isector = cdc_sector * n_ecl_sector + ecl_sector;
432 B2DEBUG(50, "Input " << cdc_sector << " " << ecl_sector << " " << accepted_signal << " " << accepted_bg);
433 if (accepted_signal
434 && !accepted_filter)B2DEBUG(50, "Input " << cdc_sector << " " << ecl_sector << " " << accepted_signal << " " << accepted_filter <<
435 " " << accepted_bhabha);
436
437 if (accepted_signal) {
438 output.push_back(1);
439 } else if (accepted_bg) {
440 scale_bg[isector]++;
441 if (isector == 3) {
442 if (scale_bg[isector] == 100) {
443 output.push_back(-1);
444 scale_bg[isector] = 1;
445 } else return;
446 }
447 if (isector == 4) {
448 if (scale_bg[isector] == 5) {
449 output.push_back(-1);
450 scale_bg[isector] = 1;
451 } else return;
452 } else {
453 output.push_back(-1);
454 }
455 } else {
456 return;
457 }
458
459
460
461 if (cdc_sector < n_cdc_sector && ecl_sector < n_ecl_sector) {
462 m_trainSets[isector].addSample(input, output);
463 if (m_saveDebug) {
464 if (accepted_signal) {
465 for (int i = 0; i < cdc_sector; i++) h_cdc2d_phi_sig[isector]->Fill(cdc2d_phi[i]);
466 for (int i = 0; i < cdc_sector; i++) h_cdc2d_pt_sig[isector]->Fill(cdc2d_pt[i]);
467 for (int i = 0; i < ecl_sector; i++) h_selTheta_sig[isector]->Fill(selTheta[i]);
468 for (int i = 0; i < ecl_sector; i++) h_selPhi_sig[isector]->Fill(selPhi[i]);
469 for (int i = 0; i < ecl_sector; i++) h_selE_sig[isector]->Fill(selE[i]);
470 h_ncdcf_sig[0]->Fill(n_cdcf);
471 h_ncdcs_sig[0]->Fill(n_cdcs);
472 h_ncdci_sig[0]->Fill(n_cdci);
473 h_ncdc_sig[0]->Fill(n_cdc);
474 h_necl_sig[0]->Fill(ecl_sector);
475 } else if (accepted_bg) {
476 for (int i = 0; i < cdc_sector; i++) h_cdc2d_phi_bg[isector]->Fill(cdc2d_phi[i]);
477 for (int i = 0; i < cdc_sector; i++) h_cdc2d_pt_bg[isector]->Fill(cdc2d_pt[i]);
478 for (int i = 0; i < ecl_sector; i++) h_selTheta_bg[isector]->Fill(selTheta[i]);
479 for (int i = 0; i < ecl_sector; i++) h_selPhi_bg[isector]->Fill(selPhi[i]);
480 for (int i = 0; i < ecl_sector; i++) h_selE_bg[isector]->Fill(selE[i]);
481 h_ncdcf_bg[0]->Fill(n_cdcf);
482 h_ncdcs_bg[0]->Fill(n_cdcs);
483 h_ncdci_bg[0]->Fill(n_cdci);
484 h_ncdc_bg[0]->Fill(n_cdc);
485 h_necl_bg[0]->Fill(ecl_sector);
486 }
487 }
488 }
489}
int n_cdc_sector
Number of CDC sectors.
int n_ecl_sector
Number of ECL sectors.
std::vector< TH1D * > h_cdc2d_phi_sig
Histograms for monitoring.
std::vector< GRLMLPData > m_trainSets
Sets of training data for all sectors.
std::vector< int > scale_bg
BG scale factor for training.
Accessor to arrays stored in the data store.
Definition: StoreArray.h:113
int getEntries() const
Get the number of objects in the array.
Definition: StoreArray.h:216
Type-safe access to single objects in the data store.
Definition: StoreObjPtr.h:96
bool isValid() const
Check whether the object was created.
Definition: StoreObjPtr.h:111
@ c_accept
Accept this event.
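The hit-map handling in event() combines three recurring idioms: mapping the 64 track-segment bins onto 36 phi sectors, wrap-around index windows for the TSF0/TSF1/TSF2 coincidence, and rescaling {0,1} hit flags to {-1,+1} network inputs. A condensed Python sketch of just these idioms (function and variable names are ours; the 36/64 mapping factor and the window ranges are taken from the listing above):

```python
N_PHI = 36   # CDC phi sectors used for the network input
N_TS = 64    # track-segment bins in the inner layers

def wrap(i, n):
    """Wrap-around indexing on a circular detector layer, equivalent to
    the 'if (j < 0) j += 64;' / 'if (j > 63) j -= 64;' pattern above."""
    return i % n

def inner_track_map(tsf0, tsf1, tsf2):
    """Sketch of the inner-track coincidence: a TSF0 hit at bin i is kept
    only if TSF1 fires in the window [i-4, i] and TSF2 in [i-2, i+2]."""
    map_cdci = [0] * N_PHI
    for i in range(N_TS):
        if not tsf0[i]:
            continue
        tsf1_ok = any(tsf1[wrap(i + d, N_TS)] for d in range(-4, 1))
        tsf2_ok = any(tsf2[wrap(i + d, N_TS)] for d in range(-2, 3))
        if tsf1_ok and tsf2_ok:
            # 'int j = i * (36. / 64.);' -- project 64 bins onto 36 sectors.
            map_cdci[int(i * (N_PHI / N_TS))] = 1
    return map_cdci

def to_inputs(hit_map):
    """Rescale {0,1} hit flags to {-1,+1}, as in '(map[i] - 0.5) * 2'."""
    return [(h - 0.5) * 2 for h in hit_map]
```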

◆ exposePythonAPI()

void exposePythonAPI ( )
staticinherited

Exposes methods of the Module class to Python.

Definition at line 325 of file Module.cc.

326{
327 // to avoid confusion between std::arg and boost::python::arg we want a shorthand namespace as well
328 namespace bp = boost::python;
329
330 docstring_options options(true, true, false); //userdef, py sigs, c++ sigs
331
332 void (Module::*setReturnValueInt)(int) = &Module::setReturnValue;
333
334 enum_<Module::EAfterConditionPath>("AfterConditionPath",
335 R"(Determines execution behaviour after a conditional path has been executed:
336
337.. attribute:: END
338
339 End processing of this path after the conditional path. (this is the default for if_value() etc.)
340
341.. attribute:: CONTINUE
342
343 After the conditional path, resume execution after this module.)")
344 .value("END", Module::EAfterConditionPath::c_End)
345 .value("CONTINUE", Module::EAfterConditionPath::c_Continue)
346 ;
347
348 /* Do not change the names of >, <, ... we use them to serialize conditional paths */
349 enum_<Belle2::ModuleCondition::EConditionOperators>("ConditionOperator")
356 ;
357
358 enum_<Module::EModulePropFlags>("ModulePropFlags",
359 R"(Flags to indicate certain low-level features of modules, see :func:`Module.set_property_flags()`, :func:`Module.has_properties()`. Most useful flags are:
360
361.. attribute:: PARALLELPROCESSINGCERTIFIED
362
363 This module can be run in parallel processing mode safely (All I/O must be done through the data store, in particular, the module must not write any files.)
364
365.. attribute:: HISTOGRAMMANAGER
366
367 This module is used to manage histograms accumulated by other modules
368
369.. attribute:: TERMINATEINALLPROCESSES
370
371 When using parallel processing, call this module's terminate() function in all processes. This will also ensure that there is exactly one process (single-core if no parallel modules found) or at least one input, one main and one output process.
372)")
373 .value("INPUT", Module::EModulePropFlags::c_Input)
374 .value("OUTPUT", Module::EModulePropFlags::c_Output)
375 .value("PARALLELPROCESSINGCERTIFIED", Module::EModulePropFlags::c_ParallelProcessingCertified)
376 .value("HISTOGRAMMANAGER", Module::EModulePropFlags::c_HistogramManager)
377 .value("INTERNALSERIALIZER", Module::EModulePropFlags::c_InternalSerializer)
378 .value("TERMINATEINALLPROCESSES", Module::EModulePropFlags::c_TerminateInAllProcesses)
379 ;
380
381 //Python class definition
382 class_<Module, PyModule> module("Module", R"(
383Base class for Modules.
384
385A module is the smallest building block of the framework.
386A typical event processing chain consists of a Path containing
387modules. By inheriting from this base class, various types of
388modules can be created. To use a module, please refer to
389:func:`Path.add_module()`. A list of modules is available by running
390``basf2 -m`` or ``basf2 -m package``, detailed information on parameters is
391given by e.g. ``basf2 -m RootInput``.
392
393The 'Module Development' section in the manual provides detailed information
394on how to create modules, setting parameters, or using return values/conditions:
395https://xwiki.desy.de/xwiki/rest/p/f4fa4/#HModuleDevelopment
396
397)");
398 module
399 .def("__str__", &Module::getPathString)
400 .def("name", &Module::getName, return_value_policy<copy_const_reference>(),
401 "Returns the name of the module. Can be changed via :func:`set_name() <Module.set_name()>`, use :func:`type() <Module.type()>` for identifying a particular module class.")
402 .def("type", &Module::getType, return_value_policy<copy_const_reference>(),
403 "Returns the type of the module (i.e. class name minus 'Module')")
404 .def("set_name", &Module::setName, args("name"), R"(
405Set custom name, e.g. to distinguish multiple modules of the same type.
406
407>>> path.add_module('EventInfoSetter')
408>>> ro = path.add_module('RootOutput', branchNames=['EventMetaData'])
409>>> ro.set_name('RootOutput_metadata_only')
410>>> print(path)
411[EventInfoSetter -> RootOutput_metadata_only]
412
413)")
414 .def("description", &Module::getDescription, return_value_policy<copy_const_reference>(),
415 "Returns the description of this module.")
416 .def("package", &Module::getPackage, return_value_policy<copy_const_reference>(),
417 "Returns the package this module belongs to.")
418 .def("available_params", &_getParamInfoListPython,
419 "Return list of all module parameters as `ModuleParamInfo` instances")
420 .def("has_properties", &Module::hasProperties, (bp::arg("properties")),
421 R"DOCSTRING(Allows to check if the module has the given properties out of `ModulePropFlags` set.
422
423>>> if module.has_properties(ModulePropFlags.PARALLELPROCESSINGCERTIFIED):
424>>> ...
425
426Parameters:
427 properties (int): bitmask of `ModulePropFlags` to check for.
428)DOCSTRING")
429 .def("set_property_flags", &Module::setPropertyFlags, args("property_mask"),
430 "Set module properties in the form of an OR combination of `ModulePropFlags`.");
431 {
432 // python signature is too crowded, make ourselves
433 docstring_options subOptions(true, false, false); //userdef, py sigs, c++ sigs
434 module
435 .def("if_value", &Module::if_value,
436 (bp::arg("expression"), bp::arg("condition_path"), bp::arg("after_condition_path")= Module::EAfterConditionPath::c_End),
437 R"DOCSTRING(if_value(expression, condition_path, after_condition_path=AfterConditionPath.END)
438
439Sets a conditional sub path which will be executed after this
440module if the return value set in the module passes the given ``expression``.
441
442Modules can define a return value (int or bool) using ``setReturnValue()``,
443which can be used in the steering file to split the Path based on this value, for example
444
445>>> module_with_condition.if_value("<1", another_path)
446
447In case the return value of the ``module_with_condition`` for a given event is
448less than 1, the execution will be diverted into ``another_path`` for this event.
449
450You could for example set a special return value if an error occurs, and divert
451the execution into a path containing :b2:mod:`RootOutput` if it is found;
452saving only the data producing/produced by the error.
453
454After a conditional path has executed, basf2 will by default stop processing
455the path for this event. This behaviour can be changed by setting the
456``after_condition_path`` argument.
457
458Parameters:
459 expression (str): Expression to determine if the conditional path should be executed.
460 This should be one of the comparison operators ``<``, ``>``, ``<=``,
461 ``>=``, ``==``, or ``!=`` followed by a numerical value for the return value
462 condition_path (Path): path to execute in case the expression is fulfilled
463 after_condition_path (AfterConditionPath): What to do once the ``condition_path`` has been executed.
464)DOCSTRING")
465 .def("if_false", &Module::if_false,
466 (bp::arg("condition_path"), bp::arg("after_condition_path")= Module::EAfterConditionPath::c_End),
467 R"DOC(if_false(condition_path, after_condition_path=AfterConditionPath.END)
468
469Sets a conditional sub path which will be executed after this module if
470the return value of the module evaluates to False. This is equivalent to
471calling `if_value` with ``expression="<1"``)DOC")
472 .def("if_true", &Module::if_true,
473 (bp::arg("condition_path"), bp::arg("after_condition_path")= Module::EAfterConditionPath::c_End),
474 R"DOC(if_true(condition_path, after_condition_path=AfterConditionPath.END)
475
476Sets a conditional sub path which will be executed after this module if
477the return value of the module evaluates to True. It is equivalent to
478calling `if_value` with ``expression=">=1"``)DOC");
479 }
480 module
481 .def("has_condition", &Module::hasCondition,
482 "Return true if a conditional path has been set for this module "
483 "using `if_value`, `if_true` or `if_false`")
484 .def("get_all_condition_paths", &_getAllConditionPathsPython,
485 "Return a list of all conditional paths set for this module using "
486 "`if_value`, `if_true` or `if_false`")
487 .def("get_all_conditions", &_getAllConditionsPython,
488 "Return a list of all conditional path expressions set for this module using "
489 "`if_value`, `if_true` or `if_false`")
490 .add_property("logging", make_function(&Module::getLogConfig, return_value_policy<reference_existing_object>()),
Comparison operators accepted in condition expressions:
c_GT: greater than, ">"
c_GE: greater than or equal, ">="
c_ST: smaller than, "<"
c_SE: smaller than or equal, "<="
c_EQ: equal, "=" or "=="
c_NE: not equal, "!="
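Putting the operators together: a condition expression is one of the comparison symbols above followed by a numeric value, and it is tested against the module's return value. A minimal Python sketch of that evaluation (the helper `evaluate_condition` is illustrative, not part of the basf2 API):

```python
import operator

# Comparison operators from ModuleCondition; two-character symbols come
# first so that "<=" is matched before "<" when parsing the expression.
_OPERATORS = [
    ("<=", operator.le),  # c_SE
    (">=", operator.ge),  # c_GE
    ("==", operator.eq),  # c_EQ
    ("!=", operator.ne),  # c_NE
    ("<", operator.lt),   # c_ST
    (">", operator.gt),   # c_GT
    ("=", operator.eq),   # c_EQ, single '=' is also accepted
]

def evaluate_condition(expression, return_value):
    """Evaluate an if_value-style expression, e.g. "<1", against a module return value."""
    for symbol, op in _OPERATORS:
        if expression.startswith(symbol):
            return op(return_value, int(expression[len(symbol):]))
    raise ValueError("invalid condition expression: " + repr(expression))
```

For example, `evaluate_condition("<1", 0)` is true, so a module that set a return value of 0 would divert processing into its condition path.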

◆ getAfterConditionPath()

Module::EAfterConditionPath getAfterConditionPath ( ) const
inherited

What to do after the conditional path is finished.

(defaults to c_End if no condition is set)

Definition at line 133 of file Module.cc.

134{
135 if (m_conditions.empty()) return EAfterConditionPath::c_End;
136
137 //okay, a condition was set for this Module...
138 if (!m_hasReturnValue) {
139 B2FATAL("A condition was set for '" << getName() << "', but the module did not set a return value!");
140 }
141
142 for (const auto& condition : m_conditions) {
143 if (condition.evaluate(m_returnValue)) {
144 return condition.getAfterConditionPath();
145 }
146 }
147
148 return EAfterConditionPath::c_End;
149}

◆ getAllConditionPaths()

std::vector< std::shared_ptr< Path > > getAllConditionPaths ( ) const
inherited

Return all condition paths currently set (no matter if the condition is true or not).

Definition at line 150 of file Module.cc.

151{
152 std::vector<std::shared_ptr<Path>> allConditionPaths;
153 for (const auto& condition : m_conditions) {
154 allConditionPaths.push_back(condition.getPath());
155 }
156
157 return allConditionPaths;
158}

◆ getAllConditions()

const std::vector< ModuleCondition > & getAllConditions ( ) const
inlineinherited

Return all set conditions for this module.

Definition at line 324 of file Module.h.

325 {
326 return m_conditions;
327 }

◆ getCondition()

const ModuleCondition * getCondition ( ) const
inlineinherited

Return a pointer to the first condition (or nullptr, if none was set)

Definition at line 314 of file Module.h.

315 {
316 if (m_conditions.empty()) {
317 return nullptr;
318 } else {
319 return &m_conditions.front();
320 }
321 }

◆ getConditionPath()

std::shared_ptr< Path > getConditionPath ( ) const
inherited

Returns the path of the first condition that evaluates to true (or a null pointer if none does).


Definition at line 113 of file Module.cc.

114{
115 PathPtr p;
116 if (m_conditions.empty()) return p;
117
118 //okay, a condition was set for this Module...
119 if (!m_hasReturnValue) {
120 B2FATAL("A condition was set for '" << getName() << "', but the module did not set a return value!");
121 }
122
123 for (const auto& condition : m_conditions) {
124 if (condition.evaluate(m_returnValue)) {
125 return condition.getPath();
126 }
127 }
128
129 // if none of the conditions were true, return a null pointer.
130 return p;
131}
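The same selection logic in plain Python: conditions are checked in the order they were added and the path of the first satisfied one wins (a sketch; `get_condition_path` and the predicate/path pairs are illustrative, not basf2 API):

```python
def get_condition_path(conditions, return_value):
    """Return the path of the first condition satisfied by the return value.

    conditions is a list of (predicate, path) pairs in the order they were
    added; None is returned if nothing matches, mirroring the null PathPtr
    returned by Module::getConditionPath().
    """
    for predicate, path in conditions:
        if predicate(return_value):
            return path
    return None
```

For example, with `[(lambda v: v < 1, error_path), (lambda v: v >= 1, main_path)]` a return value of 0 selects `error_path`.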

◆ getDescription()

const std::string & getDescription ( ) const
inlineinherited

Returns the description of the module.

Definition at line 202 of file Module.h.

202{return m_description;}

◆ getFileNames()

virtual std::vector< std::string > getFileNames ( bool  outputFiles)
inlinevirtualinherited

Return a list of output filenames for this module.

This will be called when basf2 is run with "--dry-run" if the module has set either the c_Input or c_Output properties.

If the parameter outputFiles is false (for modules with c_Input) the list of input filenames should be returned (if any). If outputFiles is true (for modules with c_Output) the list of output files should be returned (if any).

If a module has set both properties, this member is called twice, once for each property.

The module should return the actual list of requested input or produced output filenames (including handling of input/output overrides) so that the grid system can handle input/output files correctly.

This function should return the same value when called multiple times. This is especially important when taking the input/output overrides from the Environment, as they are consumed when obtained, so the finalized list of output files should be stored for subsequent calls.

Reimplemented in RootInputModule, StorageRootOutputModule, and RootOutputModule.

Definition at line 134 of file Module.h.

135 {
136 return std::vector<std::string>();
137 }
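Since the overrides are consumed when obtained, an implementation typically resolves the file list once and replays the cached result on every later call. A small illustrative sketch of that pattern (not basf2 API):

```python
class FileNameCache:
    """Compute the (override-resolved) file list at most once and replay
    it on every subsequent call, as getFileNames() implementations are
    expected to do."""

    def __init__(self, resolve_overrides):
        # resolve_overrides() may consume global state, so it is called
        # lazily and at most once
        self._resolve = resolve_overrides
        self._cached = None

    def get_file_names(self):
        if self._cached is None:
            self._cached = list(self._resolve())
        return self._cached
```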

◆ getLogConfig()

LogConfig & getLogConfig ( )
inlineinherited

Returns the log system configuration.

Definition at line 225 of file Module.h.

225{return m_logConfig;}

◆ getModules()

std::list< ModulePtr > getModules ( ) const
inlineoverrideprivatevirtualinherited

no submodules, return empty list

Implements PathElement.

Definition at line 506 of file Module.h.

506{ return std::list<ModulePtr>(); }

◆ getName()

const std::string & getName ( ) const
inlineinherited

Returns the name of the module.

This can be changed via e.g. set_name() in the steering file to give more useful names if there is more than one module of the same type.

For identifying the type of a module, using getType() (or type() in Python) is recommended.

Definition at line 187 of file Module.h.

187{return m_name;}

◆ getPackage()

const std::string & getPackage ( ) const
inlineinherited

Returns the package this module is in.

Definition at line 197 of file Module.h.

197{return m_package;}

◆ getParamInfoListPython()

std::shared_ptr< boost::python::list > getParamInfoListPython ( ) const
inherited

Returns a python list of all parameters.

Each item in the list consists of the name of the parameter, a string describing its type, a python list of all default values and the description of the parameter.

Returns
A python list containing the parameters of this parameter list.

Definition at line 279 of file Module.cc.

280{
282}

◆ getParamList()

const ModuleParamList & getParamList ( ) const
inlineinherited

Return module param list.

Definition at line 363 of file Module.h.

363{ return m_moduleParamList; }

◆ getPathString()

std::string getPathString ( ) const
overrideprivatevirtualinherited

return the module name.

Implements PathElement.

Definition at line 192 of file Module.cc.

193{
194
195 std::string output = getName();
196
197 for (const auto& condition : m_conditions) {
198 output += condition.getString();
199 }
200
201 return output;
202}

◆ getReturnValue()

int getReturnValue ( ) const
inlineinherited

Return the return value set by this module.

This value is only meaningful if hasReturnValue() is true

Definition at line 381 of file Module.h.

381{ return m_returnValue; }

◆ getType()

const std::string & getType ( ) const
inherited

Returns the type of the module (i.e.

class name minus 'Module')

Definition at line 41 of file Module.cc.

42{
43 if (m_type.empty())
44 B2FATAL("Module type not set for " << getName());
45 return m_type;
46}

◆ hasCondition()

bool hasCondition ( ) const
inlineinherited

Returns true if at least one condition was set for the module.

Definition at line 311 of file Module.h.

311{ return not m_conditions.empty(); };

◆ hasProperties()

bool hasProperties ( unsigned int  propertyFlags) const
inherited

Returns true if all specified property flags are available in this module.

Parameters
propertyFlags: ORed EModulePropFlags which should be compared with the module flags.

Definition at line 160 of file Module.cc.

161{
162 return (propertyFlags & m_propertyFlags) == propertyFlags;
163}
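The check is a standard bitmask-subset test: the module has the requested properties exactly when every bit of the query mask is also set in `m_propertyFlags`. In Python, with flag values taken from `EModulePropFlags`:

```python
# Flag values from EModulePropFlags.
c_Input = 1
c_Output = 2
c_ParallelProcessingCertified = 4

def has_properties(module_flags, property_flags):
    """True if every bit of property_flags is set in module_flags,
    i.e. (propertyFlags & m_propertyFlags) == propertyFlags."""
    return (property_flags & module_flags) == property_flags
```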

◆ hasReturnValue()

bool hasReturnValue ( ) const
inlineinherited

Return true if this module has a valid return value set.

Definition at line 378 of file Module.h.

378{ return m_hasReturnValue; }

◆ hasUnsetForcedParams()

bool hasUnsetForcedParams ( ) const
inherited

Returns true and prints an error message if the module has unset parameters which the user has to set in the steering file.

Definition at line 166 of file Module.cc.

167{
169 std::string allMissing = "";
170 for (const auto& s : missing)
171 allMissing += s + " ";
172 if (!missing.empty())
173 B2ERROR("The following required parameters of Module '" << getName() << "' were not specified: " << allMissing <<
174 "\nPlease add them to your steering file.");
175 return !missing.empty();
176}

◆ if_false()

void if_false ( const std::shared_ptr< Path > &  path,
EAfterConditionPath  afterConditionPath = EAfterConditionPath::c_End 
)
inherited

A simplified version to add a condition to the module.

Please note that successive calls of this function will add more than one condition to the module. If more than one condition evaluates to true, only the first of them will be used.

Please be careful: Avoid creating cyclic paths, e.g. by linking a condition to a path which is processed before the path where this module is located in.

It is equivalent to the if_value() method, using the expression "<1". This method is meant to be used together with the setReturnValue(bool value) method.

Parameters
path: Shared pointer to the Path which will be executed if the return value is false.
afterConditionPath: What to do after executing 'path'.

Definition at line 85 of file Module.cc.

86{
87 if_value("<1", path, afterConditionPath);
88}

◆ if_true()

void if_true ( const std::shared_ptr< Path > &  path,
EAfterConditionPath  afterConditionPath = EAfterConditionPath::c_End 
)
inherited

A simplified version to set the condition of the module.

Please note that successive calls of this function will add more than one condition to the module. If more than one condition evaluates to true, only the first of them will be used.

Please be careful: Avoid creating cyclic paths, e.g. by linking a condition to a path which is processed before the path where this module is located in.

It is equivalent to the if_value() method, using the expression ">=1". This method is meant to be used together with the setReturnValue(bool value) method.

Parameters
path: Shared pointer to the Path which will be executed if the return value is true.
afterConditionPath: What to do after executing 'path'.

Definition at line 90 of file Module.cc.

91{
92 if_value(">=1", path, afterConditionPath);
93}

◆ if_value()

void if_value ( const std::string &  expression,
const std::shared_ptr< Path > &  path,
EAfterConditionPath  afterConditionPath = EAfterConditionPath::c_End 
)
inherited

Add a condition to the module.

Please note that successive calls of this function will add more than one condition to the module. If more than one condition evaluates to true, only the first of them will be used.

See https://xwiki.desy.de/xwiki/rest/p/a94f2 or ModuleCondition for a description of the syntax.

Please be careful: Avoid creating cyclic paths, e.g. by linking a condition to a path which is processed before the path where this module is located in.

Parameters
expression: The expression of the condition.
path: Shared pointer to the Path which will be executed if the condition evaluates to true.
afterConditionPath: What to do after executing 'path'.

Definition at line 79 of file Module.cc.

80{
81 m_conditions.emplace_back(expression, path, afterConditionPath);
82}

◆ initialize()

void initialize ( void  )
overridevirtual

Initialize the module.

Initialize the networks and register datastore objects.

Reimplemented from Module.

Definition at line 141 of file GRLNeuroTrainerModule.cc.

142{
143 //initialize with input parameter
148 m_trainSets.clear();
149 for (unsigned iMLP = 0; iMLP < (unsigned)n_sector; ++iMLP) {
150 m_trainSets.push_back(GRLMLPData());
151 scale_bg.push_back(0);
152 }
153 if (m_nTrainMin > m_nTrainMax) {
155 B2WARNING("nTrainMin set to " << m_nTrainMin << " (was larger than nTrainMax)");
156 }
157
158 //initialize histograms
159 for (int isector = 0; isector < n_sector; isector++) {
160 h_cdc2d_phi_sig .push_back(new TH1D(("h_cdc2d_phi_sig_" + to_string(isector)).c_str(),
161 ("h_cdc2d_phi_sig_" + to_string(isector)).c_str(), 64, -3.2, 3.2));
162 h_cdc2d_pt_sig .push_back(new TH1D(("h_cdc2d_pt_sig_" + to_string(isector)).c_str(),
163 ("h_cdc2d_pt_sig_" + to_string(isector)).c_str(), 100, -5, 5));
164 h_selTheta_sig.push_back(new TH1D(("h_selTheta_sig_" + to_string(isector)).c_str(),
165 ("h_selTheta_sig_" + to_string(isector)).c_str(), 64, -3.2, 3.2));
166 h_selPhi_sig .push_back(new TH1D(("h_selPhi_sig_" + to_string(isector)).c_str(), ("h_selPhi_sig_" + to_string(isector)).c_str(),
167 64, -3.2, 3.2));
168 h_selE_sig .push_back(new TH1D(("h_selE_sig_" + to_string(isector)).c_str(), ("h_selE_sig_" + to_string(isector)).c_str(),
169 100, 0, 10));
170 h_result_sig .push_back(new TH1D(("h_result_sig_" + to_string(isector)).c_str(), ("h_result_sig_" + to_string(isector)).c_str(),
171 100, -1, 1));
172 h_cdc2d_phi_bg .push_back(new TH1D(("h_cdc2d_phi_bg_" + to_string(isector)).c_str(),
173 ("h_cdc2d_phi_bg_" + to_string(isector)).c_str(), 64, -3.2, 3.2));
174 h_cdc2d_pt_bg .push_back(new TH1D(("h_cdc2d_pt_bg_" + to_string(isector)).c_str(),
175 ("h_cdc2d_pt_bg_" + to_string(isector)).c_str(), 100, -5, 5));
176 h_selTheta_bg .push_back(new TH1D(("h_selTheta_bg_" + to_string(isector)).c_str(), ("h_selTheta_bg_" + to_string(isector)).c_str(),
177 64, -3.2, 3.2));
178 h_selPhi_bg .push_back(new TH1D(("h_selPhi_bg_" + to_string(isector)).c_str(), ("h_selPhi_bg_" + to_string(isector)).c_str(),
179 64, -3.2, 3.2));
180 h_selE_bg .push_back(new TH1D(("h_selE_bg_" + to_string(isector)).c_str(), ("h_selE_bg_" + to_string(isector)).c_str(),
181 100, 0, 10));
182 h_result_bg .push_back(new TH1D(("h_result_bg_" + to_string(isector)).c_str(), ("h_result_bg_" + to_string(isector)).c_str(),
183 100, -1, 1));
184 }
185 h_ncdcf_sig.push_back(new TH1D("h_ncdcf_sig", "h_ncdcf_sig", 10, 0, 10));
186 h_ncdcs_sig.push_back(new TH1D("h_ncdcs_sig", "h_ncdcs_sig", 10, 0, 10));
187 h_ncdci_sig.push_back(new TH1D("h_ncdci_sig", "h_ncdci_sig", 10, 0, 10));
188 h_ncdc_sig.push_back(new TH1D("h_ncdc_sig", "h_ncdc_sig", 10, 0, 10));
189 h_necl_sig.push_back(new TH1D("h_necl_sig", "h_necl_sig", 10, 0, 10));
190 h_ncdcf_bg.push_back(new TH1D("h_ncdcf_bg", "h_ncdcf_bg", 10, 0, 10));
191 h_ncdcs_bg.push_back(new TH1D("h_ncdcs_bg", "h_ncdcs_bg", 10, 0, 10));
192 h_ncdci_bg.push_back(new TH1D("h_ncdci_bg", "h_ncdci_bg", 10, 0, 10));
193 h_ncdc_bg.push_back(new TH1D("h_ncdc_bg", "h_ncdc_bg", 10, 0, 10));
194 h_necl_bg.push_back(new TH1D("h_necl_bg", "h_necl_bg", 10, 0, 10));
195
196 //..Trigger ThetaID for each trigger cell. Could be replaced by getMaxThetaId() for newer MC
197 TrgEclMapping* trgecl_obj = new TrgEclMapping();
198 for (int tc = 1; tc <= 576; tc++) {
199 TCThetaID.push_back(trgecl_obj->getTCThetaIdFromTCId(tc));
200 }
201
202 //-----------------------------------------------------------------------------------------
203 //..ECL look up tables
204 PCmsLabTransform boostrotate;
205 for (int tc = 1; tc <= 576; tc++) {
206
207 //..Four vector of a 1 GeV lab photon at this TC
208 ROOT::Math::XYZVector CellPosition = trgecl_obj->getTCPosition(tc);
209 ROOT::Math::PxPyPzEVector CellLab;
210 CellLab.SetPx(CellPosition.Unit().X());
211 CellLab.SetPy(CellPosition.Unit().Y());
212 CellLab.SetPz(CellPosition.Unit().Z());
213 CellLab.SetE(1.);
214
215 //..cotan Theta and phi in lab
216 TCPhiLab.push_back(CellPosition.Phi());
217 double tantheta = tan(CellPosition.Theta());
218 TCcotThetaLab.push_back(1. / tantheta);
219
220 //..Corresponding 4 vector in the COM frame
221 ROOT::Math::PxPyPzEVector CellCOM = boostrotate.rotateLabToCms() * CellLab;
222 TCThetaCOM.push_back(CellCOM.Theta()*TMath::RadToDeg());
223 TCPhiCOM.push_back(CellCOM.Phi()*TMath::RadToDeg());
224
225 //..Scale to give 1 GeV in the COM frame
226 TC1GeV.push_back(1. / CellCOM.E());
227 }
228
229 delete trgecl_obj;
230}
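The ECL lookup-table loop stores, per trigger cell, the factor TC1GeV = 1/E_CMS that rescales a 1 GeV lab photon to 1 GeV in the centre-of-mass frame. A simplified Python sketch of that factor, assuming a pure boost along +z with velocity beta (the real `PCmsLabTransform` rotation also includes the beam crossing angle, so this is an approximation, not the module's actual transform):

```python
import math

def cms_scale_factor(theta_lab, beta):
    """1/E_CMS for a massless 1 GeV lab photon emitted at polar angle
    theta_lab, after a boost along +z with velocity beta."""
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    pz = math.cos(theta_lab)           # z component of the unit momentum, E = 1
    e_cms = gamma * (1.0 - beta * pz)  # boosted photon energy
    return 1.0 / e_cms
```

With beta = 0 the factor is 1, and a photon emitted along the boost direction needs a factor larger than 1.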

◆ loadTraindata()

bool loadTraindata ( const std::string &  filename,
const std::string &  arrayname = "trainSets" 
)

Load saved training samples.

Parameters
filename: name of the TFile to read from
arrayname: name of the TObjArray holding the training samples in the file
Returns
true if the training sets were loaded correctly

◆ saveTraindata()

void saveTraindata ( const std::string &  filename,
const std::string &  arrayname = "trainSets" 
)

Save all training samples.

Parameters
filename: name of the TFile to write to
arrayname: name of the TObjArray holding the training samples in the file

Definition at line 653 of file GRLNeuroTrainerModule.cc.

654{
655 B2INFO("Saving traindata to file " << filename << ", array " << arrayname);
656 TFile datafile(filename.c_str(), "RECREATE");
657 //TObjArray* trainSets = new TObjArray(m_trainSets.size());
658 for (int isector = 0; isector < n_sector; ++isector) {
659 //trainSets->Add(&m_trainSets[isector]);
660 if (m_saveDebug) {
661 h_cdc2d_phi_sig[isector]->Write();
662 h_cdc2d_pt_sig[isector]->Write();
663 h_selTheta_sig[isector]->Write();
664 h_selPhi_sig[isector]->Write();
665 h_selE_sig[isector]->Write();
666 h_result_sig[isector]->Write();
667 h_cdc2d_phi_bg[isector]->Write();
668 h_cdc2d_pt_bg[isector]->Write();
669 h_selTheta_bg[isector]->Write();
670 h_selPhi_bg[isector]->Write();
671 h_selE_bg[isector]->Write();
672 h_result_bg[isector]->Write();
673 }
674 h_ncdcf_sig[0]->Write();
675 h_ncdcs_sig[0]->Write();
676 h_ncdci_sig[0]->Write();
677 h_ncdc_sig[0]->Write();
678 h_necl_sig[0]->Write();
679 h_ncdcf_bg[0]->Write();
680 h_ncdcs_bg[0]->Write();
681 h_ncdci_bg[0]->Write();
682 h_ncdc_bg[0]->Write();
683 h_necl_bg[0]->Write();
684 }
685 //trainSets->Write(arrayname.c_str(), TObject::kSingleKey | TObject::kOverwrite);
686 //datafile.Close();
687 //trainSets->Clear();
688 //delete trainSets;
689 for (int isector = 0; isector < n_sector; ++ isector) {
690 delete h_cdc2d_phi_sig[isector];
691 delete h_cdc2d_pt_sig[isector];
692 delete h_selTheta_sig[isector];
693 delete h_selPhi_sig[isector];
694 delete h_selE_sig[isector];
695 delete h_result_sig[isector];
696 delete h_cdc2d_phi_bg[isector];
697 delete h_cdc2d_pt_bg[isector];
698 delete h_selTheta_bg[isector];
699 delete h_selPhi_bg[isector];
700 delete h_selE_bg[isector];
701 delete h_result_bg[isector];
702 }
703 delete h_ncdcf_sig[0];
704 delete h_ncdcs_sig[0];
705 delete h_ncdci_sig[0];
706 delete h_ncdc_sig[0];
707 delete h_necl_sig[0];
708 delete h_ncdcf_bg[0];
709 delete h_ncdcs_bg[0];
710 delete h_ncdci_bg[0];
711 delete h_ncdc_bg[0];
712 delete h_necl_bg[0];
713 h_cdc2d_phi_sig.clear();
714 h_cdc2d_pt_sig.clear();
715 h_selTheta_sig.clear();
716 h_selPhi_sig.clear();
717 h_selE_sig.clear();
718 h_result_sig.clear();
719 h_cdc2d_phi_bg.clear();
720 h_cdc2d_pt_bg.clear();
721 h_selTheta_bg.clear();
722 h_selPhi_bg.clear();
723 h_selE_bg.clear();
724 h_result_bg.clear();
725 h_ncdcf_sig.clear();
726 h_ncdcs_sig.clear();
727 h_ncdci_sig.clear();
728 h_ncdc_sig.clear();
729 h_necl_sig.clear();
730 h_ncdcf_bg.clear();
731 h_ncdcs_bg.clear();
732 h_ncdci_bg.clear();
733 h_necl_bg.clear();
734}

◆ setAbortLevel()

void setAbortLevel ( int  abortLevel)
inherited

Configure the abort log level.

Definition at line 67 of file Module.cc.

68{
69 m_logConfig.setAbortLevel(static_cast<LogConfig::ELogLevel>(abortLevel));
70}

◆ setDebugLevel()

void setDebugLevel ( int  debugLevel)
inherited

Configure the debug messaging level.

Definition at line 61 of file Module.cc.

62{
63 m_logConfig.setDebugLevel(debugLevel);
64}

◆ setDescription()

void setDescription ( const std::string &  description)
protectedinherited

Sets the description of the module.

Parameters
description: A description of the module.

Definition at line 214 of file Module.cc.

215{
216 m_description = description;
217}

◆ setLogConfig()

void setLogConfig ( const LogConfig logConfig)
inlineinherited

Set the log system configuration.

Definition at line 230 of file Module.h.

230{m_logConfig = logConfig;}

◆ setLogInfo()

void setLogInfo ( int  logLevel,
unsigned int  logInfo 
)
inherited

Configure the printed log information for the given level.

Parameters
logLevel: The log level (one of LogConfig::ELogLevel)
logInfo: What kind of info should be printed? ORed combination of LogConfig::ELogInfo flags.

Definition at line 73 of file Module.cc.

74{
75 m_logConfig.setLogInfo(static_cast<LogConfig::ELogLevel>(logLevel), logInfo);
76}

◆ setLogLevel()

void setLogLevel ( int  logLevel)
inherited

Configure the log level.

Definition at line 55 of file Module.cc.

56{
57 m_logConfig.setLogLevel(static_cast<LogConfig::ELogLevel>(logLevel));
58}

◆ setName()

void setName ( const std::string &  name)
inlineinherited

Set the name of the module.

Note
The module name is set when using the REG_MODULE macro, but the module can be renamed before calling process() using the set_name() function in your steering file.
Parameters
name: The name of the module

Definition at line 214 of file Module.h.

214{ m_name = name; };

◆ setParamList()

void setParamList ( const ModuleParamList params)
inlineprotectedinherited

Replace existing parameter list.

Definition at line 501 of file Module.h.

501{ m_moduleParamList = params; }

◆ setParamPython()

void setParamPython ( const std::string &  name,
const boost::python::object &  pyObj 
)
privateinherited

Implements a method for setting boost::python objects.

The method supports the following types: list, dict, int, double, string, bool The conversion of the python object to the C++ type and the final storage of the parameter value is done in the ModuleParam class.

Parameters
name: The unique name of the parameter.
pyObj: The object which should be converted and stored as the parameter value.

Definition at line 234 of file Module.cc.

235{
236 LogSystem& logSystem = LogSystem::Instance();
237 logSystem.updateModule(&(getLogConfig()), getName());
238 try {
240 } catch (std::runtime_error& e) {
241 throw std::runtime_error("Cannot set parameter '" + name + "' for module '"
242 + m_name + "': " + e.what());
243 }
244
245 logSystem.updateModule(nullptr);
246}

◆ setParamPythonDict()

void setParamPythonDict ( const boost::python::dict &  dictionary)
privateinherited

Implements a method for reading the parameter values from a boost::python dictionary.

The key of the dictionary has to be the name of the parameter and the value has to be of one of the supported parameter types.

Parameters
dictionary: The python dictionary from which the parameter values are read.

Definition at line 249 of file Module.cc.

250{
251
252 LogSystem& logSystem = LogSystem::Instance();
253 logSystem.updateModule(&(getLogConfig()), getName());
254
255 boost::python::list dictKeys = dictionary.keys();
256 int nKey = boost::python::len(dictKeys);
257
258 //Loop over all keys in the dictionary
259 for (int iKey = 0; iKey < nKey; ++iKey) {
260 boost::python::object currKey = dictKeys[iKey];
261 boost::python::extract<std::string> keyProxy(currKey);
262
263 if (keyProxy.check()) {
264 const boost::python::object& currValue = dictionary[currKey];
265 setParamPython(keyProxy, currValue);
266 } else {
267 B2ERROR("Setting the module parameters from a python dictionary: invalid key in dictionary!");
268 }
269 }
270
271 logSystem.updateModule(nullptr);
272}
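The dictionary walk above can be mirrored in plain Python: every key must be a string naming a parameter, and each value is forwarded to the single-parameter setter (a sketch; `set_param` stands in for the C++ `setParamPython`):

```python
def set_params_from_dict(set_param, params):
    """Apply a {name: value} dict of parameters, rejecting non-string
    keys, mirroring Module::setParamPythonDict."""
    for key, value in params.items():
        if not isinstance(key, str):
            raise TypeError("invalid key in parameter dictionary: " + repr(key))
        set_param(key, value)
```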

◆ setPropertyFlags()

void setPropertyFlags ( unsigned int  propertyFlags)
inherited

Sets the flags for the module properties.

Parameters
propertyFlags: bitwise OR of EModulePropFlags

Definition at line 208 of file Module.cc.

209{
210 m_propertyFlags = propertyFlags;
211}

◆ setReturnValue() [1/2]

void setReturnValue ( bool  value)
protectedinherited

Sets the return value for this module as bool.

The bool value is saved as an integer with the convention 1 meaning true and 0 meaning false. The value can be used in the steering file to divide the analysis chain into several paths.

Parameters
value: The value of the return value.

Definition at line 227 of file Module.cc.

228{
229 m_hasReturnValue = true;
230 m_returnValue = value;
231}

◆ setReturnValue() [2/2]

void setReturnValue ( int  value)
protectedinherited

Sets the return value for this module as integer.

The value can be used in the steering file to divide the analysis chain into several paths.

Parameters
value: The value of the return value.

Definition at line 220 of file Module.cc.

221{
222 m_hasReturnValue = true;
223 m_returnValue = value;
224}

◆ setType()

void setType ( const std::string &  type)
protectedinherited

Set the module type.

Only for use by internal modules (which don't use the normal REG_MODULE mechanism).

Definition at line 48 of file Module.cc.

49{
50 if (!m_type.empty())
51 B2FATAL("Trying to change module type from " << m_type << " is not allowed, the value is assumed to be fixed.");
52 m_type = type;
53}

◆ terminate()

void terminate ( void  )
overridevirtual

Do the training for all sectors.

Reimplemented from Module.

Definition at line 492 of file GRLNeuroTrainerModule.cc.

493{
494 // do training for all sectors with sufficient training samples
495 for (unsigned isector = 0; isector < m_GRLNeuro.nSectors(); ++isector) {
496 // skip sectors that have already been trained
497 if (m_GRLNeuro[isector].isTrained())
498 continue;
499 float nTrainMin = m_multiplyNTrain ? m_nTrainMin * m_GRLNeuro[isector].getNumberOfWeights() : m_nTrainMin;
500 std::cout << m_nTrainMin << " " << m_nValid << " " << m_nTest << std::endl;
501 if (m_trainSets[isector].getNumberOfSamples() < (nTrainMin + m_nValid + m_nTest)) {
502 B2WARNING("Not enough training samples for sector " << isector << " (" << (nTrainMin + m_nValid + m_nTest)
503 << " requested, " << m_trainSets[isector].getNumberOfSamples() << " found)");
504 continue;
505 }
506 train(isector);
507 m_GRLNeuro[isector].Trained(true);
508 // save all networks (including the newly trained)
509 //m_GRLNeuro.save(m_filename, m_arrayname);
510 }
511
512 // save the training data
514}
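The sufficiency check in `terminate()` boils down to a small predicate: the required number of training samples is either an absolute minimum or, when the trainer is configured to scale with network size, `nTrainMin` times the number of weights, plus the validation and test samples. An illustrative Python sketch (names mirror the member variables, but this is not basf2 API):

```python
def has_enough_samples(n_samples, n_train_min, n_valid, n_test,
                       n_weights, multiply_n_train=False):
    """True if a sector has enough samples to be trained, mirroring the
    check in GRLNeuroTrainerModule::terminate()."""
    required_train = n_train_min * n_weights if multiply_n_train else n_train_min
    return n_samples >= required_train + n_valid + n_test
```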

◆ train()

void train ( unsigned  isector)

Train a single MLP.

Definition at line 519 of file GRLNeuroTrainerModule.cc.

520{
521#ifdef HAS_OPENMP
522 B2INFO("Training network for sector " << isector << " with OpenMP");
523#else
524 B2INFO("Training network for sector " << isector << " without OpenMP");
525#endif
526 // initialize network
527 unsigned nLayers = m_GRLNeuro[isector].getNumberOfLayers();
528 unsigned* nNodes = new unsigned[nLayers];
529 for (unsigned il = 0; il < nLayers; ++il) {
530 nNodes[il] = m_GRLNeuro[isector].getNumberOfNodesLayer(il);
531 }
532 struct fann* ann = fann_create_standard_array(nLayers, nNodes);
533 // initialize training and validation data
534 GRLMLPData currentData = m_trainSets[isector];
535 // train set
536 unsigned nTrain = m_trainSets[isector].getNumberOfSamples() - m_nValid - m_nTest;
537 struct fann_train_data* train_data =
538 fann_create_train(nTrain, nNodes[0], nNodes[nLayers - 1]);
539 for (unsigned i = 0; i < nTrain; ++i) {
540 vector<float> input = currentData.getInput(i);
541 for (unsigned j = 0; j < input.size(); ++j) {
542 train_data->input[i][j] = input[j];
543 }
544 vector<float> target = currentData.getTarget(i);
545 for (unsigned j = 0; j < target.size(); ++j) {
546 train_data->output[i][j] = target[j];
547 }
548 }
549 // validation set
550 struct fann_train_data* valid_data =
551 fann_create_train(m_nValid, nNodes[0], nNodes[nLayers - 1]);
552 for (unsigned i = nTrain; i < nTrain + m_nValid; ++i) {
553 vector<float> input = currentData.getInput(i);
554 for (unsigned j = 0; j < input.size(); ++j) {
555 valid_data->input[i - nTrain][j] = input[j];
556 }
557 vector<float> target = currentData.getTarget(i);
558 for (unsigned j = 0; j < target.size(); ++j) {
559 valid_data->output[i - nTrain][j] = target[j];
560 }
561 }
562 // set network parameters
563 fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
564 fann_set_activation_function_output(ann, FANN_SIGMOID_SYMMETRIC);
565 fann_set_training_algorithm(ann, FANN_TRAIN_RPROP);
566 // keep train error of optimum for all runs
567 vector<double> trainOptLog = {};
568 vector<double> validOptLog = {};
569 // repeat training several times with different random start weights
570 for (int irun = 0; irun < m_repeatTrain; ++irun) {
571 double bestValid = 999.;
572 vector<double> trainLog = {};
573 vector<double> validLog = {};
574 trainLog.assign(m_maxEpochs, 0.);
575 validLog.assign(m_maxEpochs, 0.);
576 int breakEpoch = 0;
577 int bestEpoch = 0;
578 vector<fann_type> bestWeights = {};
579 bestWeights.assign(m_GRLNeuro[isector].getNumberOfWeights(), 0.);
580 fann_randomize_weights(ann, -0.1, 0.1);
581 // train and save the network
582 for (int epoch = 1; epoch <= m_maxEpochs; ++epoch) {
583#ifdef HAS_OPENMP
584 double mse = parallel_fann::train_epoch_irpropm_parallel(ann, train_data, m_nThreads);
585#else
586 double mse = fann_train_epoch(ann, train_data);
587#endif
588 trainLog[epoch - 1] = mse;
589 // reduce weights that got too large
590 for (unsigned iw = 0; iw < ann->total_connections; ++iw) {
591 if (ann->weights[iw] > m_wMax)
592 ann->weights[iw] = m_wMax;
593 else if (ann->weights[iw] < -m_wMax)
594 ann->weights[iw] = -m_wMax;
595 }
596 // evaluate validation set
597 fann_reset_MSE(ann);
598#ifdef HAS_OPENMP
599 double valid_mse = parallel_fann::test_data_parallel(ann, valid_data, m_nThreads);
600#else
601 double valid_mse = fann_test_data(ann, valid_data);
602#endif
603 validLog[epoch - 1] = valid_mse;
604 // keep weights for lowest validation error
605 if (valid_mse < bestValid) {
606 bestValid = valid_mse;
607 for (unsigned iw = 0; iw < ann->total_connections; ++iw) {
608 bestWeights[iw] = ann->weights[iw];
609 }
610 bestEpoch = epoch;
611 }
612 // break when validation error increases
613 if (epoch > m_checkInterval && valid_mse > validLog[epoch - m_checkInterval]) {
614 B2INFO("Training run " << irun << " stopped in epoch " << epoch);
615 B2INFO("Train error: " << mse << ", valid error: " << valid_mse <<
616 ", best valid: " << bestValid);
617 breakEpoch = epoch;
618 break;
619 }
620 // print current status
621 if (epoch == 1 || (epoch < 100 && epoch % 10 == 0) || epoch % 100 == 0) {
622 B2INFO("Epoch " << epoch << ": Train error = " << mse <<
623 ", valid error = " << valid_mse << ", best valid = " << bestValid);
624 }
625 }
626 if (breakEpoch == 0) {
627 B2INFO("Training run " << irun << " finished in epoch " << m_maxEpochs);
628 breakEpoch = m_maxEpochs;
629 }
630 trainOptLog.push_back(trainLog[bestEpoch - 1]);
631 validOptLog.push_back(validLog[bestEpoch - 1]);
632 vector<float> oldWeights = m_GRLNeuro[isector].getWeights();
633 m_GRLNeuro[isector].m_weights = bestWeights;
634 }
635 if (m_saveDebug) {
636 for (unsigned i = nTrain + m_nValid; i < m_trainSets[isector].getNumberOfSamples(); ++i) {
637 float output = m_GRLNeuro.runMLP(isector, m_trainSets[isector].getInput(i));
638 vector<float> target = m_trainSets[isector].getTarget(i);
639 //for (unsigned iout = 0; iout < output.size(); ++iout) {
640 if (((int)target[0]) == 1) h_result_sig[isector]->Fill(output);
641 else h_result_bg[isector]->Fill(output);
642 //}
643 }
644 }
645 // free memory
646 fann_destroy_train(train_data);
647 fann_destroy_train(valid_data);
648 fann_destroy(ann);
649 delete[] nNodes;
650}
const std::vector< float > & getInput(unsigned i) const
get input vector of sample i
Definition: GRLMLPData.h:37
const std::vector< float > & getTarget(unsigned i) const
get target value of sample i
Definition: GRLMLPData.h:39
float runMLP(unsigned isector, const std::vector< float > &input)
Run an expert MLP.
Definition: GRLNeuro.cc:87

Member Data Documentation

◆ h_cdc2d_phi_bg

std::vector<TH1D*> h_cdc2d_phi_bg
protected

Definition at line 146 of file GRLNeuroTrainerModule.h.

◆ h_cdc2d_phi_sig

std::vector<TH1D*> h_cdc2d_phi_sig
protected

Histograms for monitoring.

Definition at line 140 of file GRLNeuroTrainerModule.h.

◆ h_cdc2d_pt_bg

std::vector<TH1D*> h_cdc2d_pt_bg
protected

Definition at line 147 of file GRLNeuroTrainerModule.h.

◆ h_cdc2d_pt_sig

std::vector<TH1D*> h_cdc2d_pt_sig
protected

Definition at line 141 of file GRLNeuroTrainerModule.h.

◆ h_ncdc_bg

std::vector<TH1D*> h_ncdc_bg
protected

Definition at line 157 of file GRLNeuroTrainerModule.h.

◆ h_ncdc_sig

std::vector<TH1D*> h_ncdc_sig
protected

Definition at line 152 of file GRLNeuroTrainerModule.h.

◆ h_ncdcf_bg

std::vector<TH1D*> h_ncdcf_bg
protected

Definition at line 158 of file GRLNeuroTrainerModule.h.

◆ h_ncdcf_sig

std::vector<TH1D*> h_ncdcf_sig
protected

Definition at line 153 of file GRLNeuroTrainerModule.h.

◆ h_ncdci_bg

std::vector<TH1D*> h_ncdci_bg
protected

Definition at line 160 of file GRLNeuroTrainerModule.h.

◆ h_ncdci_sig

std::vector<TH1D*> h_ncdci_sig
protected

Definition at line 155 of file GRLNeuroTrainerModule.h.

◆ h_ncdcs_bg

std::vector<TH1D*> h_ncdcs_bg
protected

Definition at line 159 of file GRLNeuroTrainerModule.h.

◆ h_ncdcs_sig

std::vector<TH1D*> h_ncdcs_sig
protected

Definition at line 154 of file GRLNeuroTrainerModule.h.

◆ h_necl_bg

std::vector<TH1D*> h_necl_bg
protected

Definition at line 161 of file GRLNeuroTrainerModule.h.

◆ h_necl_sig

std::vector<TH1D*> h_necl_sig
protected

Definition at line 156 of file GRLNeuroTrainerModule.h.

◆ h_result_bg

std::vector<TH1D*> h_result_bg
protected

Definition at line 151 of file GRLNeuroTrainerModule.h.

◆ h_result_sig

std::vector<TH1D*> h_result_sig
protected

Definition at line 145 of file GRLNeuroTrainerModule.h.

◆ h_selE_bg

std::vector<TH1D*> h_selE_bg
protected

Definition at line 148 of file GRLNeuroTrainerModule.h.

◆ h_selE_sig

std::vector<TH1D*> h_selE_sig
protected

Definition at line 142 of file GRLNeuroTrainerModule.h.

◆ h_selPhi_bg

std::vector<TH1D*> h_selPhi_bg
protected

Definition at line 149 of file GRLNeuroTrainerModule.h.

◆ h_selPhi_sig

std::vector<TH1D*> h_selPhi_sig
protected

Definition at line 143 of file GRLNeuroTrainerModule.h.

◆ h_selTheta_bg

std::vector<TH1D*> h_selTheta_bg
protected

Definition at line 150 of file GRLNeuroTrainerModule.h.

◆ h_selTheta_sig

std::vector<TH1D*> h_selTheta_sig
protected

Definition at line 144 of file GRLNeuroTrainerModule.h.

◆ m_2DfinderCollectionName

std::string m_2DfinderCollectionName
protected

Name of the StoreArray containing the input 2D tracks.

Definition at line 75 of file GRLNeuroTrainerModule.h.

◆ m_arrayname

std::string m_arrayname
protected

Name of the TObjArray holding the networks.

Definition at line 85 of file GRLNeuroTrainerModule.h.

◆ m_checkInterval

int m_checkInterval
protected

Training is stopped if the validation error is higher than it was checkInterval epochs ago, i.e.

either the validation error is increasing or the gain is less than the fluctuations.

Definition at line 111 of file GRLNeuroTrainerModule.h.

◆ m_conditions

std::vector<ModuleCondition> m_conditions
privateinherited

Module condition, only non-null if set.

Definition at line 521 of file Module.h.

◆ m_description

std::string m_description
privateinherited

The description of the module.

Definition at line 511 of file Module.h.

◆ m_filename

std::string m_filename
protected

Name of the file where network weights etc. are stored after training.

Definition at line 79 of file GRLNeuroTrainerModule.h.

◆ m_GRLCollectionName

std::string m_GRLCollectionName
protected

Name of the StoreObj containing the input GRL.

Definition at line 77 of file GRLNeuroTrainerModule.h.

◆ m_GRLNeuro

GRLNeuro m_GRLNeuro
protected

Instance of the NeuroTrigger.

Definition at line 118 of file GRLNeuroTrainerModule.h.

◆ m_hasReturnValue

bool m_hasReturnValue
privateinherited

True, if the return value is set.

Definition at line 518 of file Module.h.

◆ m_load

bool m_load
protected

Switch to load saved parameters from a previous run.

Definition at line 91 of file GRLNeuroTrainerModule.h.

◆ m_logConfig

LogConfig m_logConfig
privateinherited

The log system configuration of the module.

Definition at line 514 of file Module.h.

◆ m_logFilename

std::string m_logFilename
protected

Name of file where training log is stored.

Definition at line 83 of file GRLNeuroTrainerModule.h.

◆ m_maxEpochs

int m_maxEpochs
protected

Maximal number of training epochs.

Definition at line 113 of file GRLNeuroTrainerModule.h.

◆ m_moduleParamList

ModuleParamList m_moduleParamList
privateinherited

List storing and managing all parameter of the module.

Definition at line 516 of file Module.h.

◆ m_multiplyNTrain

bool m_multiplyNTrain
protected

Switch to multiply number of samples with number of weights.

Definition at line 99 of file GRLNeuroTrainerModule.h.

◆ m_name

std::string m_name
privateinherited

The name of the module, saved as a string (user-modifiable).

Definition at line 508 of file Module.h.

◆ m_nTest

int m_nTest
protected

Number of test samples.

Definition at line 103 of file GRLNeuroTrainerModule.h.

◆ m_nThreads

int m_nThreads
protected

Number of threads for training.

Definition at line 107 of file GRLNeuroTrainerModule.h.

◆ m_nTrainMax

double m_nTrainMax
protected

Maximal number of training samples.

Definition at line 97 of file GRLNeuroTrainerModule.h.

◆ m_nTrainMin

double m_nTrainMin
protected

Minimal number of training samples.

Definition at line 95 of file GRLNeuroTrainerModule.h.

◆ m_nValid

int m_nValid
protected

Number of validation samples.

Definition at line 101 of file GRLNeuroTrainerModule.h.

◆ m_package

std::string m_package
privateinherited

Package this module is found in (may be empty).

Definition at line 510 of file Module.h.

◆ m_parameters

GRLNeuro::Parameters m_parameters
protected

Parameters for the NeuroTrigger.

Definition at line 93 of file GRLNeuroTrainerModule.h.

◆ m_propertyFlags

unsigned int m_propertyFlags
privateinherited

The properties of the module as bitwise or (with |) of EModulePropFlags.

Definition at line 512 of file Module.h.

◆ m_repeatTrain

int m_repeatTrain
protected

Number of training runs with different random start weights.

Definition at line 115 of file GRLNeuroTrainerModule.h.

◆ m_returnValue

int m_returnValue
privateinherited

The return value.

Definition at line 519 of file Module.h.

◆ m_saveDebug

bool m_saveDebug
protected

If true, save training curve and parameter distribution of training data.

Definition at line 89 of file GRLNeuroTrainerModule.h.

◆ m_trainArrayname

std::string m_trainArrayname
protected

Name of the TObjArray holding the training samples.

Definition at line 87 of file GRLNeuroTrainerModule.h.

◆ m_trainFilename

std::string m_trainFilename
protected

Name of file where training samples are stored.

Definition at line 81 of file GRLNeuroTrainerModule.h.

◆ m_trainSets

std::vector<GRLMLPData> m_trainSets
protected

Sets of training data for all sectors.

Definition at line 120 of file GRLNeuroTrainerModule.h.

◆ m_TrgECLClusterName

std::string m_TrgECLClusterName
protected

Name of the StoreArray containing the ECL clusters.

Definition at line 73 of file GRLNeuroTrainerModule.h.

◆ m_type

std::string m_type
privateinherited

The type of the module, saved as a string.

Definition at line 509 of file Module.h.

◆ m_wMax

double m_wMax
protected

Limit for weights.

Definition at line 105 of file GRLNeuroTrainerModule.h.

◆ n_cdc_sector

int n_cdc_sector = 0
protected

Number of CDC sectors.

Definition at line 133 of file GRLNeuroTrainerModule.h.

◆ n_ecl_sector

int n_ecl_sector = 0
protected

Number of ECL sectors.

Definition at line 135 of file GRLNeuroTrainerModule.h.

◆ n_sector

int n_sector = 0
protected

Total number of sectors.

Definition at line 137 of file GRLNeuroTrainerModule.h.

◆ radtodeg

double radtodeg = 0
protected

Conversion factor from radians to degrees.

Definition at line 130 of file GRLNeuroTrainerModule.h.

◆ scale_bg

std::vector<int> scale_bg
protected

BG scale factor for training.

Definition at line 164 of file GRLNeuroTrainerModule.h.

◆ TC1GeV

std::vector<float> TC1GeV
protected

Definition at line 128 of file GRLNeuroTrainerModule.h.

◆ TCcotThetaLab

std::vector<float> TCcotThetaLab
protected

Definition at line 125 of file GRLNeuroTrainerModule.h.

◆ TCPhiCOM

std::vector<float> TCPhiCOM
protected

Definition at line 126 of file GRLNeuroTrainerModule.h.

◆ TCPhiLab

std::vector<float> TCPhiLab
protected

Definition at line 124 of file GRLNeuroTrainerModule.h.

◆ TCThetaCOM

std::vector<float> TCThetaCOM
protected

Definition at line 127 of file GRLNeuroTrainerModule.h.

◆ TCThetaID

std::vector<int> TCThetaID
protected

Definition at line 123 of file GRLNeuroTrainerModule.h.


The documentation for this class was generated from the following files: