difference between add_self_loops: bool = True and root_weight: bool = True #7408
Replies: 4 comments 8 replies
-
There is a slight difference between the two. `add_self_loops=True` inserts self-loops into `edge_index`, so the central node's own features are aggregated together with its neighbors' features using the same weight matrix. `root_weight=True` does not touch `edge_index` at all; instead, the layer adds a separate learnable transformation of the central node's own features on top of the aggregated neighborhood output. Layers such as `GCNConv` expose `add_self_loops`, while layers such as `SAGEConv` and `RGCNConv` expose `root_weight`.
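If it helps, the difference can be sketched in plain Python. This is a toy mean-aggregation, not PyG's actual implementation; the weights and the helper `mean_neighbors` are made up for illustration:

```python
# Toy illustration (plain Python, not PyG) of the two mechanisms.

def mean_neighbors(x, neighbors, v):
    """Mean of the neighbors' features (1-D features for simplicity)."""
    return sum(x[u] for u in neighbors[v]) / len(neighbors[v])

x = {0: 1.0, 1: 3.0, 2: 5.0}   # node features
neighbors = {0: [1, 2]}        # node 0's neighbors

w = 0.5                        # shared weight for aggregated messages
w_root = 2.0                   # separate weight for the central node

# add_self_loops=True: node 0 is treated as its own neighbor and is
# transformed with the SAME weight as the real neighbors.
h_self_loop = w * (x[0] + x[1] + x[2]) / 3

# root_weight=True: neighbors are aggregated WITHOUT node 0, and node 0's
# own features get a SEPARATE weight added on top.
h_root = w * mean_neighbors(x, neighbors, 0) + w_root * x[0]

print(h_self_loop)  # 1.5
print(h_root)       # 4.0
```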
-
Thank you @rusty1s for your fast and precise answer. I understand the difference between the two better now, and I can see the different applications when each is set to `True`. In my specific case, I try to predict the central node's class using only the features from its neighbors, so I want to be sure that I don't make any mistake when setting `root_weight=False`.
-
Hi @rusty1s. Thank you again for your fast and precise answer last time. I re-open this topic because I have additional questions on the matter. I spent some time on the GitHub discussions and found a problem similar to what I am trying to do: I want to predict the class of a node using ONLY the features from its neighbors, not its own features, because a lot of features are missing from my test set. So I use `root_weight=False` (and make sure there are no self-loops in `edge_index`), and it works perfectly fine when I use just one graph convolutional layer. But I am not sure that this still works when I have at least 2 graph convolutional layers.
I will denote by h_l(v) the embedding of node v at layer l. At layer 0, the node embeddings h_0(v) are just the initial node features x_v. At layer 1, with `root_weight=False` and no self-loops, h_1(v) is aggregated only from h_0(u) over the neighbors u of v, so the central node's own features are not used.
At layer 2, however, h_2(v) is aggregated from the h_1(u) of its neighbors, and each h_1(u) was itself computed from the h_0 of u's neighbors, which include v.
So the central node will still "see" its own initial features, at least partially or in transformed form, right? Because each node is its own 2-hop neighbor. Could I correct this by using only directed edges pointing towards the nodes in the test set? That way, the (missing) initial features of a test node would never flow back to it during message passing. Imagine that I train my model on the initial graph containing only the black bidirectional edges. Then I slightly update my network with the blue node that is on the receiving end of two unidirectional blue edges coming from D and E, for example (see graph below). I know that my model will be able to compute a node embedding and make a prediction, but will it be relevant? Sorry for the long post, and I hope I'm clear. Thanks in advance :-) 
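The 2-hop argument can be checked with a tiny plain-Python simulation. This is a mean aggregator with `root_weight=False` semantics, not PyG code; the node ids and feature values are made up:

```python
# Plain-Python sketch (not PyG) of the 2-hop "leakage" described above,
# and of the proposed fix with unidirectional edges into the test node.

def layer(h, in_neighbors):
    """One message-passing step with root_weight=False semantics:
    h_new[v] = mean of h[u] over incoming neighbors u; h[v] itself unused."""
    return {v: sum(h[u] for u in ins) / len(ins) if ins else 0.0
            for v, ins in in_neighbors.items()}

def h2_of_node0(x0, in_neighbors):
    h0 = {0: x0, 1: 7.0, 2: 9.0}   # node 0 = test node; 7.0/9.0 = known features
    h1 = layer(h0, in_neighbors)
    return layer(h1, in_neighbors)[0]

# Bidirectional edge 0<->1 (plus 2->1): node 0's input leaks back after 2 hops.
bidir = {0: [1], 1: [0, 2], 2: []}
# Unidirectional 1->0 only: node 0 never sends its (missing) features out.
unidir = {0: [1], 1: [2], 2: []}

print(h2_of_node0(0.0, bidir), h2_of_node0(100.0, bidir))    # the two differ
print(h2_of_node0(0.0, unidir), h2_of_node0(100.0, unidir))  # both the same
```

With bidirectional edges, changing the test node's own input changes its 2-layer embedding; with only incoming edges at the test node, the embedding is independent of that input.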
-
Hi again @rusty1s, thank you again for your answers last time, but I have another question. I was able to experiment with the directed-edges approach described above, and it works perfectly fine for the validation and testing sets once my model has been trained. But then I tried to use this approach during training time as well, and the model doesn't train at all.
When I train the model, the output shows the loss and the validation metrics staying constant at every epoch. I do not understand why. I am afraid I might have misunderstood what the argument does. I thought that maybe I had to set it to a different value, but then I get an error telling me that pyg-lib is not installed.
This is a bit strange, because the package pyg-lib (0.3.1+pt21cu118) is properly installed. Looking at the source code of the NeighborSampler, which is used by the NeighborLoader, it seems to me that the check triggering this message is not actually about pyg-lib being installed.
It's a small detail, but I guess the wrong error is raised? So, what do you think? Would this approach make sense during training? Sorry for the long post and thanks in advance. Best regards,
-
Hello everyone, I have a question regarding the documentation of some convolutional layers. I am quite new to pytorch_geometric, so sorry if my question is obvious.
Is the `root_weight` parameter of some convolutional layers (e.g. `RGCNConv`) similar to the `add_self_loops` parameter of some other convolutional layers (e.g. `GCNConv`)?
Or are there some minor differences justifying the use of different words?
It seems to me that `add_self_loops` is restricted to homogeneous graphs, whereas `root_weight` is maybe for heterogeneous graphs? It also seems that no convolutional layer has both arguments, so that would make sense.
I want to compute node embeddings for each node of a heterogeneous graph WITHOUT using the node features of the node I am trying to embed, using only the node features of all its neighbors across all edge types. So for now, I have only been using layers with `add_self_loops=False`. Can I use other layers with `root_weight=False` instead? If yes, then I have more convolutions to try.
And obviously, I need to be sure that there are no self-loops in the edge_index.
That's what I understood from a comment in issue #561, but I just want to be sure not to make any mistakes.
Thanks in advance :-)
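On that last point, making sure `edge_index` contains no self-loops is a small filter. Here is a plain-Python sketch on parallel source/target lists (in PyG itself, `torch_geometric.utils.remove_self_loops` does this on tensors):

```python
# Plain-Python equivalent of filtering self-loops out of an edge list.

def remove_self_loops(edge_index):
    """edge_index is a pair of parallel lists: [sources, targets]."""
    src, dst = edge_index
    kept = [(s, d) for s, d in zip(src, dst) if s != d]
    return [list(t) for t in zip(*kept)] if kept else [[], []]

edge_index = [[0, 1, 1, 2], [0, 2, 1, 0]]    # (0,0) and (1,1) are self-loops
print(remove_self_loops(edge_index))         # [[1, 2], [2, 0]]
```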