
Spherical Embeddings for Atomic Relation Projection Reaching Complex Logical Query Answering

This is the implementation of the paper Spherical Embeddings for Atomic Relation Projection Reaching Complex Logical Query Answering, published at WWW '25.

Acknowledgements

We acknowledge the KGReasoning and QTO codebases for their contributions to this implementation.

Getting started

Step 1: Data preparation

  • Download the datasets here, then move KG_data.zip to the ./sphere/ directory.

  • Unzip KG_data.zip to ./sphere/data/:

    cd sphere/
    unzip -d data KG_data.zip
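
  • To verify the extraction (the exact subfolder names depend on the archive, but one folder per dataset listed in Step 3 is expected):

    ls data/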

Step 2: Dependencies installation

  • Using conda:

    conda env create -f requirements.yml
    conda activate sphere
  • Using pip:

    python -m venv venv
    source ./venv/bin/activate
    pip install -r requirements.txt
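
  • Optionally, a quick sanity check that the environment is active. This assumes PyTorch, the framework used by KGReasoning and QTO, is among the installed dependencies; it prints the version and whether a GPU is visible:

    python -c "import torch; print(torch.__version__, torch.cuda.is_available())"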

Step 3: Model

  • There are three datasets: FB15k-237, FB15k, NELL995
  • We recommend starting with FB15k-237 and FB15k: they train faster and need less GPU memory than NELL995.

Step 3.1: Training query embeddings models

  1. Run the scripts scripts/*.sh from the repository root sphere/ to train the query embedding models (GQE, Query2Box, SpherE) on the default dataset, FB15k-237.
  2. To train on the other datasets (FB15k/NELL995), uncomment the corresponding lines in the *.sh scripts; the inspection command after this list shows which blocks are commented out. For example, run one of the following commands to train a specific model:
  • SpherE (trained using 1p queries only)

    scripts/sphere_1p.sh
  • GQE

    scripts/gqe.sh
  • GQE (trained using 1p queries only)

    scripts/gqe_1p.sh
  • Query2Box

    scripts/query2box.sh
  • Query2Box (trained using 1p queries only)

    scripts/query2box_1p.sh
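
  • To see which dataset blocks are commented out before editing a script, a quick inspection is enough (sphere_1p.sh is used here only as an example):

    # list every line mentioning a dataset, including the commented-out ones
    grep -nE "FB15k|NELL" scripts/sphere_1p.sh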

Step 3.2: Move the checkpoint

  • After training a model, copy or move its checkpoint file checkpoint from the logs/ directory to clqa/pre_trained_clqa/ to prepare for the next step (a minimal sketch follows this list). For example, the final path of the checkpoint file is:

    clqa/pre_trained_clqa/checkpoint

    or, after renaming it:

    clqa/pre_trained_clqa/FB15k-237_sphere_256
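
  • A minimal sketch of this step in shell; <run_dir> is a placeholder for the directory that your training run created under logs/, so adjust it to your run:

    mkdir -p clqa/pre_trained_clqa
    # <run_dir> is a placeholder; use the directory your training run created under logs/
    cp logs/<run_dir>/checkpoint clqa/pre_trained_clqa/FB15k-237_sphere_256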

Step 3.3: Generalizing query embeddings to answer complex logical queries using fuzzy logic

  1. Run the scripts scripts/*.sh from the repository root sphere/ to generate the atomic query matrix (neural adjacency matrix) and to generalize the query embedding models (GQE, Query2Box, SpherE) to complex logical queries on the default dataset, FB15k-237.
  2. Ensure that the --clqa_path option (for example, --clqa_path clqa/pre_trained_clqa/FB15k-237_sphere_256) matches the name of the pre-trained model's checkpoint, e.g. FB15k-237_sphere_256. You can rename checkpoint to FB15k-237_sphere_256 to distinguish it from other checkpoints; a quick consistency check is sketched after this list.
  3. Ensure that the -d option (for example, -d 256) matches the dimension option -d 256 used to train the pre-trained model.
  4. To use the other datasets (FB15k/NELL995), uncomment the corresponding lines in the *.sh scripts.
  • SpherE (trained using 1p queries only)

    scripts/sphere_qto.sh
  • GQE

    scripts/gqe_qto.sh
  • GQE (1p version)

    scripts/gqe_qto_1p.sh
  • Query2Box

    scripts/query2box_qto.sh
  • Query2Box (1p version)

    scripts/query2box_qto_1p.sh
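
  • Before running, you can confirm that the checkpoint name and embedding dimension set in a script match the pre-trained model from Step 3.2. A minimal check, assuming sphere_qto.sh is the script you are about to run:

    # --clqa_path should point at the renamed checkpoint, and -d should match its dimension
    grep -nE "clqa_path| -d " scripts/sphere_qto.sh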

Citation

If you find this code useful for your research, please consider citing the following paper:

@inproceedings{nguyen2025spherical,
    author = {Nguyen, Chau D. M. and French, Tim and Stewart, Michael and Hodkiewicz, Melinda and Liu, Wei},
    title = {Spherical Embeddings for Atomic Relation Projection Reaching Complex Logical Query Answering},
    year = {2025},
    isbn = {9798400712746},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    url = {https://doi.org/10.1145/3696410.3714747},
    booktitle = {Proceedings of the ACM on Web Conference 2025},
    pages = {35–46},
    numpages = {12},
    location = {Sydney, NSW, Australia},
    series = {WWW '25},
}
