AttentiveFP supernode missing #226

@alexbui91

Description

Hello, thank you for your work.

I'm interested in the AttentiveFP implementation from the paper "Pushing the Boundaries of Molecular Representation for Drug Discovery with the Graph Attention Mechanism"; a Google search led me to this GitHub repo. According to the paper, after the first phase of GNN message passing on the atom-level graph, the model adds a super node connected to all atoms and treats this augmented graph as a new graph for the second, molecule-level phase. However, your implementation appears to use the same graph for both phases. Please correct me if I am wrong.

FYI, the PyTorch Geometric implementation seems to handle the supernode correctly:
https://pytorch-geometric.readthedocs.io/en/latest/_modules/torch_geometric/nn/models/attentive_fp.html#AttentiveFP
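To make the difference concrete, here is a rough sketch (plain PyTorch; the module and names below are illustrative, not code from this repo or from PyG) of what the paper's molecule-level phase does: a supernode connected to every atom is initialized from the atom embeddings, then updated for several timesteps with attention over the atoms followed by a GRU, while the atom embeddings themselves stay fixed:

```python
# Illustrative sketch of the supernode readout phase, assuming PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SupernodeReadout(nn.Module):
    def __init__(self, hidden_dim: int, num_timesteps: int):
        super().__init__()
        self.num_timesteps = num_timesteps
        self.attn = nn.Linear(2 * hidden_dim, 1)  # supernode-atom attention
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)

    def forward(self, atom_h: torch.Tensor) -> torch.Tensor:
        # atom_h: (num_atoms, hidden_dim) embeddings from the atom-level
        # phase of one molecule; the supernode state starts as their sum.
        mol_h = atom_h.sum(dim=0, keepdim=True)  # (1, hidden_dim)
        for _ in range(self.num_timesteps):
            # Attention scores of the supernode over every atom.
            pair = torch.cat([mol_h.expand_as(atom_h), atom_h], dim=-1)
            alpha = F.softmax(F.leaky_relu(self.attn(pair)), dim=0)
            context = (alpha * F.elu(atom_h)).sum(dim=0, keepdim=True)
            # GRU update of the supernode state only; atoms are not updated.
            mol_h = self.gru(context, mol_h)
        return mol_h  # molecule embedding read off the supernode
```

The point is that this second phase runs on the star graph centered at the supernode, not on the original molecular graph.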

Thank you for your consideration!
