Belle II Software development
EdgeLayer Class Reference
Inheritance diagram for EdgeLayer:

Public Member Functions

def __init__ (self, nfeat_in_dim, efeat_in_dim, gfeat_in_dim, efeat_hid_dim, efeat_out_dim, num_hid_layers, dropout, normalize=True)
 
def forward (self, src, dest, edge_attr, u, batch)
 

Public Attributes

 nonlin_function
 Non-linear activation.
 
 num_hid_layers
 Number of hidden layers.
 
 dropout_prob
 Dropout probability.
 
 normalize
 Normalization.
 
 lin_in
 Linear input layer.
 
 lins_hid
 Intermediate linear layers.
 
 lin_out
 Output linear layer.
 
 norm
 Batch normalization.
 

Detailed Description

Updates edge features in MetaLayer:

.. math::
    e_{ij}^{'} = \phi^{e}(e_{ij}, v_{i}, v_{j}, u),

where :math:`\phi^{e}` is a neural network of the form

.. figure:: figs/MLP_structure.png
    :width: 42em
    :align: center

Args:
    nfeat_in_dim (int): Node features input dimension (number of node features in input).
    efeat_in_dim (int): Edge features input dimension (number of edge features in input).
    gfeat_in_dim (int): Global features input dimension (number of global features in input).
    efeat_hid_dim (int): Edge features dimension in hidden layers.
    efeat_out_dim (int): Edge features output dimension.
    num_hid_layers (int): Number of hidden layers.
    dropout (float): Dropout rate :math:`r \in [0,1]`.
    normalize (bool): Whether to apply batch normalization to the output features (default: True).


:return: Updated edge features tensor.
:rtype: `Tensor <https://pytorch.org/docs/stable/tensors.html#torch.Tensor>`_

Definition at line 30 of file geometric_layers.py.
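
For orientation, here is a shape-check sketch of the input that :math:`\phi^{e}` receives (the feature dimensions below are made up for illustration only): the edge features, the two incident node feature vectors and the broadcast global features are concatenated per edge, which fixes the input width of the lin_in layer defined in the constructor.

    import torch

    # Hypothetical feature dimensions, chosen only for this example.
    nfeat_in_dim, efeat_in_dim, gfeat_in_dim = 6, 4, 2
    E, B = 10, 3                              # 10 edges spread over 3 graphs

    src = torch.randn(E, nfeat_in_dim)        # v_i: source-node features per edge
    dest = torch.randn(E, nfeat_in_dim)       # v_j: destination-node features per edge
    edge_attr = torch.randn(E, efeat_in_dim)  # e_ij: current edge features
    u = torch.randn(B, gfeat_in_dim)          # u: one global feature vector per graph
    batch = torch.randint(0, B, (E,))         # graph index of each edge

    # phi^e sees (e_ij, v_i, v_j, u) for every edge, so its input width is
    # efeat_in_dim + 2 * nfeat_in_dim + gfeat_in_dim (cf. lin_in in the constructor).
    x = torch.cat([edge_attr, src, dest, u[batch]], dim=1)
    assert x.shape == (E, efeat_in_dim + 2 * nfeat_in_dim + gfeat_in_dim)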

Constructor & Destructor Documentation

◆ __init__()

def __init__(self,
             nfeat_in_dim,
             efeat_in_dim,
             gfeat_in_dim,
             efeat_hid_dim,
             efeat_out_dim,
             num_hid_layers,
             dropout,
             normalize=True)
Initialization.

Definition at line 58 of file geometric_layers.py.

 68     ):
 69         """
 70         Initialization.
 71         """
 72         super(EdgeLayer, self).__init__()
 73
 74
 75         self.nonlin_function = F.elu
 76
 77         self.num_hid_layers = num_hid_layers
 78
 79         self.dropout_prob = dropout
 80
 81         self.normalize = normalize
 82
 83
 84         self.lin_in = nn.Linear(
 85             efeat_in_dim + 2 * nfeat_in_dim + gfeat_in_dim, efeat_hid_dim
 86         )
 87
 88         self.lins_hid = nn.ModuleList(
 89             [
 90                 nn.Linear(efeat_hid_dim, efeat_hid_dim)
 91                 for _ in range(self.num_hid_layers)
 92             ]
 93         )
 94
 95         self.lin_out = nn.Linear(efeat_hid_dim, efeat_out_dim, bias=not normalize)
 96
 97         if normalize:
 98
 99             self.norm = nn.BatchNorm1d(efeat_out_dim)
100
101         _init_weights(self, normalize)
102
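
As a usage sketch, assuming EdgeLayer is importable from the geometric_layers module referenced above; all dimensions are illustrative, not taken from any Belle II configuration:

    # from geometric_layers import EdgeLayer   # hypothetical import path

    layer = EdgeLayer(
        nfeat_in_dim=6,    # node features per node in the input
        efeat_in_dim=4,    # edge features per edge in the input
        gfeat_in_dim=2,    # global features per graph
        efeat_hid_dim=32,  # width of the hidden linear layers
        efeat_out_dim=8,   # edge features per edge in the output
        num_hid_layers=2,  # number of hidden layers
        dropout=0.1,       # dropout rate
        normalize=True,    # apply batch normalization to the output
    )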

Member Function Documentation

◆ forward()

def forward(self,
            src,
            dest,
            edge_attr,
            u,
            batch)
Called internally by PyTorch to propagate the input through the network.
 - src, dest: [E, F_x], where E is the number of edges.
 - edge_attr: [E, F_e]
 - u: [B, F_u], where B is the number of graphs.
 - batch: [E] with max entry B - 1.

Definition at line 103 of file geometric_layers.py.

103     def forward(self, src, dest, edge_attr, u, batch):
104         """
105         Called internally by PyTorch to propagate the input through the network.
106         - src, dest: [E, F_x], where E is the number of edges.
107         - edge_attr: [E, F_e]
108         - u: [B, F_u], where B is the number of graphs.
109         - batch: [E] with max entry B - 1.
110         """
111         out = (
112             torch.cat([edge_attr, src, dest, u[batch]], dim=1)
113             if u.shape != torch.Size([0])
114             else torch.cat([edge_attr, src, dest], dim=1)
115         )
116
117         out = self.nonlin_function(self.lin_in(out))
118         out = F.dropout(out, self.dropout_prob, training=self.training)
119
120         out_skip = out
121
122         for lin_hid in self.lins_hid:
123             out = self.nonlin_function(lin_hid(out))
124             out = F.dropout(out, self.dropout_prob, training=self.training)
125
126         if self.num_hid_layers > 1:
127             out += out_skip
128
129         if self.normalize:
130             out = self.nonlin_function(self.norm(self.lin_out(out)))
131         else:
132             out = self.nonlin_function(self.lin_out(out))
133
134         return out
135
136
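
A forward-pass sketch reusing the layer from the construction example and the shapes documented above (all numbers are illustrative). In torch_geometric, a module like this is typically passed as the edge_model of torch_geometric.nn.MetaLayer, which calls it with exactly this (src, dest, edge_attr, u, batch) signature.

    import torch

    E, B = 10, 3                        # edges and graphs in the batch
    src = torch.randn(E, 6)             # [E, F_x] source-node features
    dest = torch.randn(E, 6)            # [E, F_x] destination-node features
    edge_attr = torch.randn(E, 4)       # [E, F_e] edge features
    u = torch.randn(B, 2)               # [B, F_u] global features
    batch = torch.randint(0, B, (E,))   # [E] graph index of each edge

    layer.eval()                        # disable dropout for a deterministic shape check
    out = layer(src, dest, edge_attr, u, batch)
    assert out.shape == (E, 8)          # efeat_out_dim chosen in the construction sketch

Note that the residual connection (out += out_skip) is only added when num_hid_layers > 1, and the final linear layer is followed by batch normalization (and then the ELU non-linearity) when normalize=True.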

Member Data Documentation

◆ dropout_prob

dropout_prob

Dropout probability.

Definition at line 79 of file geometric_layers.py.

◆ lin_in

lin_in

Linear input layer.

Definition at line 84 of file geometric_layers.py.

◆ lin_out

lin_out

Output linear layer.

Definition at line 95 of file geometric_layers.py.

◆ lins_hid

lins_hid

Intermediate linear layers.

Definition at line 88 of file geometric_layers.py.

◆ nonlin_function

nonlin_function

Non-linear activation.

Definition at line 75 of file geometric_layers.py.

◆ norm

norm

Batch normalization.

Definition at line 99 of file geometric_layers.py.

◆ normalize

normalize

Normalization.

Definition at line 81 of file geometric_layers.py.

◆ num_hid_layers

num_hid_layers

Number of hidden layers.

Definition at line 77 of file geometric_layers.py.


The documentation for this class was generated from the following file:
geometric_layers.py