
ICoN: Integration Using Co-attention across Biological Networks

We propose ICoN, a novel unsupervised graph neural network model that takes multiple protein–protein association networks as inputs and generates a feature representation for each protein that integrates the topological information from all the networks. A key contribution of ICoN is exploiting a mechanism called “co-attention” that enables cross-network communication during training. The model also incorporates a denoising training technique, introducing perturbations to each input network and training the model to reconstruct the original network from its corrupted version. Our experimental results demonstrate that ICoN surpasses individual networks across three downstream tasks: gene module detection, gene coannotation prediction, and protein function prediction. Compared to existing unsupervised network integration models, ICoN exhibits superior performance across the majority of downstream tasks and shows enhanced robustness against noise. This work introduces a promising approach for effectively integrating diverse protein–protein association networks, aiming to achieve a biologically meaningful representation of proteins.


Generate protein embeddings

  1. To set up the environment, run the commands in the icon_env.txt file.

  2. To run ICoN with the best hyperparameters on the yeast networks, run the following from the project folder:

    python code/main_icon.py code/config/icon_best_yeast.json
    
  3. You can change the hyperparameters or the networks being integrated by editing an existing config file in code/config or by creating a new one (see the sketch after this list).

    i. To run the co-attention ablation study of ICoN, use the config file icon_no-coattn_yeast.json.

    ii. To run the noise-induction-module ablation study of ICoN, use the config file icon_no-noise_yeast.json.

    iii. To run ICoN on yeast networks, each corrupted with 30% false positive and false negative edges, use icon_best_noisy03_yeast.json.

    iv. To run ICoN on yeast networks, each corrupted with 50% false positive and false negative edges, use icon_best_noisy05_yeast.json.
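
For reference, one way to customize a run is to copy one of the provided config files and edit it. The sketch below is only an illustration: my_experiment.json is a hypothetical file name, and the fields you would edit are those defined in the provided configs and read by code/main_icon.py.

    # Start from the provided best-hyperparameter yeast config (illustrative copy).
    cp code/config/icon_best_yeast.json code/config/my_experiment.json
    # Edit code/config/my_experiment.json to change the input networks or hyperparameters,
    # then train ICoN with the customized config:
    python code/main_icon.py code/config/my_experiment.json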

Evaluate generated embedding

We utilized BIONIC-evals [1] to evaluate the embeddings generated by ICoN on three downstream tasks, i. gene module detection, ii. gene coannotation prediction, and iii. gene function prediction, and to compare ICoN with other unsupervised biological network integration models. The details for reproducing the figures in the paper are outlined below.

We have provided the datasets (embeddings generated from one run of each model), standards, config files, and scripts used to create the figures in the ICoN manuscript. To reproduce these figures (for one run), follow the instructions below:

  1. First, install BIONIC-evals by following the instructions in the BIONIC-evals repository [1].

  2. Place our provided <script> folder inside <BIONIC-evals/bioniceval>.

  3. Now replace the following folders in <BIONIC-evals/bioniceval> with our provided folders: i. datasets, ii. config, and iii. standards.

    Note: We have provided some files in .zip format. Please extract them before proceeding.
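
A minimal sketch of steps 2 and 3, assuming the provided script, datasets, config, and standards folders sit at the top level of this repository and that the ICoN and BIONIC-evals checkouts sit side by side (adjust the paths to your layout):

    # Run from the directory containing both checkouts; paths are illustrative.
    # Place the provided script folder inside BIONIC-evals.
    cp -r ICoN/script BIONIC-evals/bioniceval/
    # Replace the datasets, config, and standards folders with the provided versions
    # (extract any provided .zip archives before this step).
    for d in datasets config standards; do
        rm -rf "BIONIC-evals/bioniceval/$d"
        cp -r "ICoN/$d" BIONIC-evals/bioniceval/
    done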

i. Comparative analysis between ICoN and other network integration models (and input networks):

  1. Run BIONIC-evals with <config/single_runs/yeast.json>
  2. Then run:
    python paper_plots.py <bionic_eval_results_folder>
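
The same run-then-plot pattern is used for the ablation, co-attention-interpretation, and noise-robustness analyses below. As a rough sketch (the exact BIONIC-evals invocation is described in its own documentation, and results_folder is only an illustrative placeholder for wherever BIONIC-evals writes its results):

    # Step 1: run BIONIC-evals with config/single_runs/yeast.json
    # (see the BIONIC-evals documentation for its command-line invocation).
    # Step 2: point the plotting script at the BIONIC-evals results folder.
    python paper_plots.py results_folder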
    

ii. Ablation study of ICoN:

Co-attention

  1. Run BIONIC-evals with <config/single_runs/ablation_nocoattn.json>
  2. Then run:
    python ablation_study_coattn.py <bionic_eval_results_folder>
    

Noise induction module

  1. Run BIONIC-evals with <config/single_runs/ablation_nonoise.json>
  2. Then run:
    python ablation_study_noise.py <bionic_eval_results_folder>
    

iii. Interpretation of co-attention coefficients:

Run:

python co_attention_weights-lineplot.py <bionic_eval_datasets_folder>

iv. Robustness to noise:

  1. Run BIONIC-evals with <config/single_runs/noisyinput_icon_bionic_union.json>
  2. Then run:
    python noise_robustness.py <bionic_eval_results_folder>
    

Publication:

Nure Tasnina, T M Murali, ICoN: integration using co-attention across biological networks, Bioinformatics Advances, Volume 5, Issue 1, 2025, vbae182, https://doi.org/10.1093/bioadv/vbae182

References

[1] Duncan Forster and congyoua (2022). duncster94/BIONIC-evals: v0.2.0. Zenodo. https://doi.org/10.5281/zenodo.6964943
