Belle II Software  light-2403-persian
EdgeLayer Class Reference
Inheritance diagram for EdgeLayer:
Collaboration diagram for EdgeLayer:

Public Member Functions

def __init__ (self, nfeat_in_dim, efeat_in_dim, gfeat_in_dim, efeat_hid_dim, efeat_out_dim, num_hid_layers, dropout, normalize=True)
 
def forward (self, src, dest, edge_attr, u, batch)
 

Public Attributes

 nonlin_function
 Non-linear activation.
 
 num_hid_layers
 Number of hidden layers.
 
 dropout_prob
 Dropout probability.
 
 normalize
 Whether batch normalization is applied.
 
 lin_in
 Linear input layer.
 
 lins_hid
 Intermediate linear layers.
 
 lin_out
 Output linear layer.
 
 norm
 Batch normalization.
 

Detailed Description

Updates edge features in MetaLayer:

.. math::
    e_{ij}^{'} = \phi^{e}(e_{ij}, v_{i}, v_{j}, u),

where :math:`\phi^{e}` is a neural network of the form

.. figure:: figs/MLP_structure.png
    :width: 42em
    :align: center

Args:
    nfeat_in_dim (int): Node features input dimension (number of node features in input).
    efeat_in_dim (int): Edge features input dimension (number of edge features in input).
    gfeat_in_dim (int): Global features input dimension (number of global features in input).
    efeat_hid_dim (int): Edge features dimension in hidden layers.
    efeat_out_dim (int): Edge features output dimension.
    num_hid_layers (int): Number of hidden layers.
    dropout (float): Dropout rate :math:`r \in [0,1]`.
    normalize (bool): Whether to apply batch normalization to the output (default: True).


:return: Updated edge features tensor.
:rtype: `Tensor <https://pytorch.org/docs/stable/tensors.html#torch.Tensor>`_

Definition at line 30 of file geometric_layers.py.
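
As a quick orientation, the sketch below wires an EdgeLayer into a MetaLayer from PyTorch Geometric, which calls it once per edge with the signature documented under forward(). The import path of EdgeLayer and all dimensions are illustrative assumptions, not taken from this file.

import torch
from torch_geometric.nn import MetaLayer

# Assumed import path; adjust to wherever geometric_layers.py lives in your setup.
from geometric_layers import EdgeLayer

# Illustrative dimensions: F_x = 8 node features, F_e = 4 edge features, F_u = 2 global features.
edge_model = EdgeLayer(
    nfeat_in_dim=8,
    efeat_in_dim=4,
    gfeat_in_dim=2,
    efeat_hid_dim=32,
    efeat_out_dim=16,
    num_hid_layers=2,
    dropout=0.1,
    normalize=True,
)
meta = MetaLayer(edge_model=edge_model)

# A toy graph: N = 5 nodes, E = 10 edges, B = 1 graph in the batch.
x = torch.randn(5, 8)
edge_index = torch.randint(0, 5, (2, 10))
edge_attr = torch.randn(10, 4)
u = torch.randn(1, 2)
batch = torch.zeros(5, dtype=torch.long)

x, edge_attr, u = meta(x, edge_index, edge_attr, u, batch)
print(edge_attr.shape)  # torch.Size([10, 16]): updated edge features e'_ij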

Constructor & Destructor Documentation

◆ __init__()

def __init__(self, nfeat_in_dim, efeat_in_dim, gfeat_in_dim, efeat_hid_dim, efeat_out_dim, num_hid_layers, dropout, normalize=True)
Initialization.

Definition at line 58 of file geometric_layers.py.

def __init__(
    self,
    nfeat_in_dim,
    efeat_in_dim,
    gfeat_in_dim,
    efeat_hid_dim,
    efeat_out_dim,
    num_hid_layers,
    dropout,
    normalize=True,
):
    """
    Initialization.
    """
    super(EdgeLayer, self).__init__()

    # Non-linear activation used between layers.
    self.nonlin_function = F.elu
    # Number of hidden layers.
    self.num_hid_layers = num_hid_layers
    # Dropout probability.
    self.dropout_prob = dropout
    # Whether batch normalization is applied to the output.
    self.normalize = normalize

    # Input layer: acts on the concatenated edge, source-node, destination-node
    # and global features, hence the input width.
    self.lin_in = nn.Linear(
        efeat_in_dim + 2 * nfeat_in_dim + gfeat_in_dim, efeat_hid_dim
    )
    # Hidden layers of constant width efeat_hid_dim.
    self.lins_hid = nn.ModuleList(
        [
            nn.Linear(efeat_hid_dim, efeat_hid_dim)
            for _ in range(self.num_hid_layers)
        ]
    )
    # Output layer; the bias is dropped when batch normalization follows.
    self.lin_out = nn.Linear(efeat_hid_dim, efeat_out_dim, bias=not normalize)

    if normalize:
        self.norm = nn.BatchNorm1d(efeat_out_dim)

    _init_weights(self, normalize)
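
For orientation, a minimal sketch of what the constructor builds for one particular choice of dimensions (the numbers and the EdgeLayer import path are assumptions for illustration):

# Assumed import path for illustration only.
from geometric_layers import EdgeLayer

layer = EdgeLayer(
    nfeat_in_dim=8, efeat_in_dim=4, gfeat_in_dim=2,
    efeat_hid_dim=32, efeat_out_dim=16,
    num_hid_layers=2, dropout=0.1, normalize=True,
)
print(layer.lin_in)    # Linear(in_features=22, out_features=32, bias=True); 4 + 2*8 + 2 = 22
print(layer.lins_hid)  # ModuleList of 2 x Linear(in_features=32, out_features=32)
print(layer.lin_out)   # Linear(in_features=32, out_features=16, bias=False); bias dropped since normalize=True
print(layer.norm)      # BatchNorm1d(16, ...)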

Member Function Documentation

◆ forward()

def forward(self, src, dest, edge_attr, u, batch)
Called internally by PyTorch to propagate the input through the network.
 - src, dest: [E, F_x], where E is the number of edges.
 - edge_attr: [E, F_e]
 - u: [B, F_u], where B is the number of graphs.
 - batch: [E] with max entry B - 1.

Definition at line 103 of file geometric_layers.py.
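
A minimal sketch of calling the layer directly with the shapes documented above (random data and illustrative dimensions; the EdgeLayer import path is an assumption):

import torch
# Assumed import path for illustration only.
from geometric_layers import EdgeLayer

layer = EdgeLayer(
    nfeat_in_dim=8, efeat_in_dim=4, gfeat_in_dim=2,
    efeat_hid_dim=32, efeat_out_dim=16,
    num_hid_layers=2, dropout=0.1, normalize=True,
)

E, B = 10, 1                              # E edges, B graphs
src = torch.randn(E, 8)                   # [E, F_x] source-node features
dest = torch.randn(E, 8)                  # [E, F_x] destination-node features
edge_attr = torch.randn(E, 4)             # [E, F_e] current edge features
u = torch.randn(B, 2)                     # [B, F_u] global features
batch = torch.zeros(E, dtype=torch.long)  # [E], graph index per edge, max entry B - 1

out = layer(src, dest, edge_attr, u, batch)
print(out.shape)  # torch.Size([10, 16]) -> [E, efeat_out_dim]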


The documentation for this class was generated from the following file: geometric_layers.py