Hi, thanks for your wonderful work! It has enlightened me a lot, but there are a few questions I am confused about:
- From your source code, is the node representation "structure-aware" because `quantized_edge` goes through `graph_layer_2` and the result is used to compute the classification loss? As in the following code in model.py:
```python
h = self.graph_layer_2(g, quantized_edge)
h_list.append(h)
h = self.linear(h)
```
And why is `quantized_edge` used here as the attribute input to GraphConv? Up to this point, `quantized_edge` was used for structure reconstruction.
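For context, the pattern being asked about (quantized codes fed as input features to a second graph convolution, whose output feeds a linear classification head) can be sketched without DGL. This is a hypothetical NumPy illustration of my reading of the code, not the authors' implementation; the names `codebook`, `adj_norm`, `W_conv`, and `W_cls` are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
num_nodes, dim, num_codes, num_classes = 4, 8, 16, 3

# Hypothetical codebook learned by a graph VQ-VAE (rows are code vectors).
codebook = rng.normal(size=(num_codes, dim))

# Continuous node embeddings from an earlier encoder layer.
z = rng.normal(size=(num_nodes, dim))

# Vector quantization: snap each embedding to its nearest code
# (the result plays the role of `quantized_edge`).
dists = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
quantized = codebook[dists.argmin(axis=1)]

# Symmetrically normalized adjacency with self-loops (standard GCN propagation).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float) + np.eye(num_nodes)
d = A.sum(1)
adj_norm = A / np.sqrt(np.outer(d, d))

# Second graph conv over the *quantized* features -- analogous to
# `h = self.graph_layer_2(g, quantized_edge)`: the classification head
# only ever sees structure-dependent, discretized inputs.
W_conv = rng.normal(size=(dim, dim))
h = adj_norm @ quantized @ W_conv

# Linear classification head, as in `h = self.linear(h)`.
W_cls = rng.normal(size=(dim, num_classes))
logits = h @ W_cls

print(logits.shape)  # (4, 3)
```

Under this reading, "structure-aware" would mean the classifier's input is both discretized by the codebook and smoothed over the graph neighborhood.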
- In your supplementary material, you mention: "To better illustrate the connection between local graph structure and the codebook learned by our graph VQ-VAE, we conduct node-centered subgraph retrieval in the learned MLP representation spaces of NOSMOG and our VQGRAPH. Specifically, we extract the representation with distilled MLP model for a query node in Citeseer."
So the "structure-aware" embedding lives in the learned MLP representation space, but what about the representation space of the teacher GNN? Is it also "structure-aware" after the graph tokenizer and a linear classification head?
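The node-centered retrieval described in the quoted passage amounts to nearest-neighbor search in the representation space, which is the experiment one could repeat on the teacher GNN's embeddings to answer this question. A minimal sketch, assuming cosine similarity and made-up representations (`reps`, `query_id`, and `top_k` are illustrative names, not from the repo):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical node representations (one row per node), e.g. from the
# distilled MLP or from the teacher GNN -- swap in either to compare.
reps = rng.normal(size=(100, 32))
reps /= np.linalg.norm(reps, axis=1, keepdims=True)  # cosine geometry

query_id = 7
sims = reps @ reps[query_id]       # cosine similarity to the query node
top_k = np.argsort(-sims)[:5]      # the query itself plus its 4 nearest nodes

print(top_k[0])  # 7 -- the query node is its own nearest neighbor
```

If the teacher GNN's space is also structure-aware, the retrieved neighbors should again share similar local subgraphs with the query node.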