We propose ICoN, a novel unsupervised graph neural network model that takes multiple protein–protein association networks as inputs and generates a feature representation for each protein that integrates the topological information from all the networks. A key contribution of ICoN is exploiting a mechanism called “co-attention” that enables cross-network communication during training. The model also incorporates a denoising training technique, introducing perturbations to each input network and training the model to reconstruct the original network from its corrupted version. Our experimental results demonstrate that ICoN surpasses individual networks across three downstream tasks: gene module detection, gene coannotation prediction, and protein function prediction. Compared to existing unsupervised network integration models, ICoN exhibits superior performance across the majority of downstream tasks and shows enhanced robustness against noise. This work introduces a promising approach for effectively integrating diverse protein–protein association networks, aiming to achieve a biologically meaningful representation of proteins.
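The co-attention mechanism mentioned above lets the networks exchange information during training. As a rough, self-contained illustration only (this is our own simplification, not the paper's actual architecture), cross-network communication can be pictured as scoring each network's node features against the other networks' features and combining them with per-node softmax weights:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def coattend(feats):
    """Toy cross-network attention (illustrative sketch, not ICoN's code).

    feats: array of shape (num_networks, num_nodes, dim) holding one
    feature vector per protein per network. Each network's features are
    scored against the mean of the other networks' features; the scores
    are softmax-normalized across networks and used to mix the inputs.
    Returns an integrated (num_nodes, dim) representation.
    """
    K, n, d = feats.shape
    scores = np.empty((K, n))
    for k in range(K):
        # Mean features of all *other* networks (assumes K >= 2).
        others = feats[[i for i in range(K) if i != k]].mean(axis=0)
        # Scaled dot-product score per node.
        scores[k] = (feats[k] * others).sum(axis=1) / np.sqrt(d)
    w = softmax(scores, axis=0)                  # weights over networks, per node
    return (w[:, :, None] * feats).sum(axis=0)   # (num_nodes, dim)
```

Because the weights are a convex combination per node, each integrated feature stays within the range spanned by the input networks.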
-
To set up the environment, run the commands in the icon_env.txt file.
-
To run ICoN with the best hyperparameters on the yeast networks, run the following from the project folder:
python code/main_icon.py code/config/icon_best_yeast.json
-
You can change the hyperparameters or the networks to be integrated by editing an existing config file in code/config or creating a new one.
i. To run the co-attention ablation study of ICoN, use the config file icon_no-coattn_yeast.json.
ii. To run the noise-induction-module ablation study of ICoN, use the config file icon_no-noise_yeast.json.
iii. To run ICoN on yeast networks, each corrupted with 30% false-positive and false-negative edges, use icon_best_noisy03_yeast.json.
iv. To run ICoN on yeast networks, each corrupted with 50% false-positive and false-negative edges, use icon_best_noisy05_yeast.json.
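The noisy-input experiments above perturb each network with false-positive and false-negative edges at a given rate. A minimal sketch of this style of corruption on a binary adjacency matrix (illustrative only; `corrupt_network` and its exact behavior are our assumptions, not ICoN's actual noise-induction module):

```python
import numpy as np

def corrupt_network(adj, rate, seed=0):
    """Corrupt a symmetric 0/1 adjacency matrix (zero diagonal).

    Removes `rate` fraction of the true edges (false negatives) and adds
    the same number of spurious edges (false positives). Illustrative
    sketch only; not ICoN's implementation.
    """
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    iu = np.triu_indices(n, k=1)               # work on the upper triangle
    flat = adj[iu].copy()
    edges = np.flatnonzero(flat)               # positions of existing edges
    non_edges = np.flatnonzero(flat == 0)      # positions of absent pairs
    k = int(round(rate * edges.size))
    drop = rng.choice(edges, size=k, replace=False)
    add = rng.choice(non_edges, size=min(k, non_edges.size), replace=False)
    flat[drop] = 0                             # false negatives
    flat[add] = 1                              # false positives
    out = np.zeros_like(adj)
    out[iu] = flat
    return out + out.T                         # restore symmetry
```

With `rate=0.3` this mirrors the 30% setting above: 30% of true edges are deleted and an equal number of false edges inserted, so the total edge count is preserved whenever enough non-edges exist.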
We used BIONIC-evals [1] to evaluate the embeddings generated by ICoN and to compare ICoN with other unsupervised biological network integration models on three downstream tasks: i. gene module detection, ii. gene coannotation prediction, and iii. gene function prediction. The steps for reproducing the figures in the paper are outlined below.
We have provided the datasets (embeddings generated from one run of each model), standards, config files, and scripts used to create the figures in the ICoN manuscript. To reproduce these figures (for a single run), follow the instructions below:
-
First, install BIONIC-evals following the instructions in the BIONIC-evals repository.
-
Place our provided <script> folder inside <BIONIC-evals/bioniceval>.
-
Now replace the following folders in <BIONIC-evals/bioniceval> with our provided folders: i. datasets, ii. config, and iii. standards.
Note: We have provided some files in .zip format. Please extract them before proceeding.
- Run BIONIC-evals with <config/single_runs/yeast.json>
- Then run:
python paper_plots.py <bionic_eval_results_folder>
- Run BIONIC-evals with <config/single_runs/ablation_nocoattn.json>
- Then run:
python ablation_study_coattn.py <bionic_eval_results_folder>
- Run BIONIC-evals with <config/single_runs/ablation_nonoise.json>
- Then run:
python ablation_study_noise.py <bionic_eval_results_folder>
- Run:
python co_attention_weights-lineplot.py <bionic_eval_datasets_folder>
- Run BIONIC-evals with <config/single_runs/noisyinput_icon_bionic_union.json>
- Then run:
python noise_robustness.py <bionic_eval_results_folder>
Nure Tasnina, T M Murali, ICoN: integration using co-attention across biological networks, Bioinformatics Advances, Volume 5, Issue 1, 2025, vbae182, https://doi.org/10.1093/bioadv/vbae182
[1] Duncan Forster and congyoua (2022). "duncster94/BIONIC-evals: v0.2.0". Zenodo. doi: 10.5281/zenodo.6964943.