HypergraphConv attention gradient = 0 #8351
wuyiyang66 started this conversation in General
I used PyG's `HypergraphConv` (from `torch_geometric.nn`) to build a GCN model, with hypergraph attention enabled in the first layer. When checking gradients after backpropagation, I found that the attention parameter `gcn1.att` has a zero gradient. I wonder why this happens.
Here is my code:
```python
import torch.nn as nn
import torch_geometric.nn as gnn


class HypGraph(nn.Module):
    def __init__(self, dim, gcn_hidden, gcn_out):
        super(HypGraph, self).__init__()
        self.dim = dim
        self.gcn_hidden = gcn_hidden
        self.gcn_out = gcn_out
        # self.embedding = nn.Embedding(29, self.dim, max_norm=True)
        # First layer: hypergraph attention with 4 concatenated heads,
        # so its output dimension is 4 * gcn_hidden.
        self.gcn1 = gnn.HypergraphConv(
            self.dim,
            self.gcn_hidden,
            use_attention=True,
            heads=4,
            concat=True,
            dropout=0.1,
        )
        # Second layer: plain hypergraph convolution without attention.
        self.gcn2 = gnn.HypergraphConv(
            4 * self.gcn_hidden,
            self.gcn_out,
            use_attention=False,
            heads=1,
            concat=False,
            dropout=0.1,
        )
```
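For reference, a minimal sketch of the gradient check described above. The node features, hyperedge structure, and scalar loss are hypothetical stand-ins, since the original post does not include the forward pass or training code; note that `HypergraphConv` with `use_attention=True` requires hyperedge features to be passed via the `hyperedge_attr` argument.

```python
# Hypothetical gradient check: random data and a dummy sum() loss,
# only meant to show how to inspect gcn1.att.grad after backward().
import torch

model = HypGraph(dim=16, gcn_hidden=32, gcn_out=8)

x = torch.randn(5, 16)                               # 5 nodes, 16 features each
hyperedge_index = torch.tensor([[0, 1, 2, 1, 3, 4],  # node indices
                                [0, 0, 0, 1, 1, 1]]) # hyperedge indices (2 hyperedges)
hyperedge_attr = torch.randn(2, 16)                  # one feature row per hyperedge,
                                                     # required when use_attention=True

out = model.gcn1(x, hyperedge_index, hyperedge_attr=hyperedge_attr)
out = model.gcn2(out, hyperedge_index)
out.sum().backward()                                 # dummy scalar loss

print(model.gcn1.att.grad)                           # should be non-None; check for all-zeros
```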