
question about "structrue-aware" #13

@ChloeKiwi

Description


Hi, thanks for your wonderful work! It has enlightened me a lot, but I have a few questions:

  1. From your source code, is the node representation "structure-aware" because quantized_edge goes through graph_layer_2 and is then used to compute the classification loss? As in the code below from model.py:

    h = self.graph_layer_2(g, quantized_edge)
    h_list.append(h)
    h = self.linear(h)

Also, why is quantized_edge used here as the attribute input to GraphConv? Before this point, quantized_edge is only used for structure reconstruction.
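To make my reading of this forward path concrete, here is a minimal NumPy sketch of what I think these three lines do (a plain normalized-adjacency GCN layer standing in for DGL's GraphConv; all names, shapes, and weights here are illustrative, not the actual VQGRAPH code):

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, code_dim, hidden, n_classes = 5, 8, 16, 3

# Toy symmetric adjacency with self-loops, GCN-style normalized
# (stand-in for the graph g passed to graph_layer_2).
A = np.eye(n_nodes)
A[0, 1] = A[1, 0] = 1.0
A[1, 2] = A[2, 1] = 1.0
deg = A.sum(axis=1)
A_norm = A / np.sqrt(np.outer(deg, deg))

# quantized_edge: per-node codebook embeddings from the graph tokenizer
quantized_edge = rng.normal(size=(n_nodes, code_dim))

W_graph = rng.normal(size=(code_dim, hidden))    # graph_layer_2 weights
W_linear = rng.normal(size=(hidden, n_classes))  # final linear classifier

# h = self.graph_layer_2(g, quantized_edge): message passing over the
# quantized embeddings, so h aggregates each node's neighborhood codes.
h = np.maximum(A_norm @ quantized_edge @ W_graph, 0.0)

# h = self.linear(h): logits feeding the classification loss
logits = h @ W_linear
print(logits.shape)  # (5, 3)
```

If this sketch is right, the "structure-awareness" would come from the message passing mixing each node's quantized code with its neighbors' codes before classification.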

  2. In your supplementary material, you mention: "To better illustrate the connection between local graph structure and the codebook learned by our graph VQ-VAE, we conduct node-centered subgraph retrieval in the learned MLP representation spaces of NOSMOG and our VQGRAPH. Specifically, we extract the representation with distilled MLP model for a query node in Citeseer."

So the "structure-aware" embedding lives in the learned MLP representation space. But what about the representation space of the teacher GNN? Is it also "structure-aware" after the graph tokenizer and a linear classifier?
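For context, the node-centered retrieval I understand from the quoted passage could be sketched as nearest-neighbor search over the distilled MLP embeddings (a hypothetical setup; the embedding matrix, query index, and top-k choice below are my assumptions, not the paper's protocol):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical distilled-MLP embeddings for all nodes (e.g. Citeseer);
# 100 nodes x 32 dims is illustrative only.
emb = rng.normal(size=(100, 32))
emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)

query = 0                            # query node index
scores = emb @ emb[query]            # cosine similarity to the query node
scores[query] = -np.inf              # exclude the query itself
top_k = np.argsort(scores)[::-1][:5] # 5 nearest nodes in MLP space
print(top_k)
```

My question is whether running this same retrieval in the teacher GNN's representation space would show the same structure-aware behavior.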
