
Interpretable Control Competition

Repository for the GECCO'25 interpretable control competition.

Install

Clone the repository and create the conda virtual environment with all needed packages.

git clone https://github.com/giorgia-nadizar/interpretable-control-competition.git
cd interpretable-control-competition
conda env create -f environment.yml
conda activate ic39

Alternatively, if you do not want to use conda, you can create a virtual environment directly from the Python version installed on your system (which must be at least Python 3.9.21):

git clone https://github.com/giorgia-nadizar/interpretable-control-competition.git
cd interpretable-control-competition
python3 -m venv ic39
source ic39/bin/activate
pip3 install -r requirements.txt

Control Task: ATARI Pong

Task details

For more details on 'Pong', refer to the official documentation on the ALE website: https://ale.farama.org/environments/pong/.
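
As a quick orientation, you can inspect the default Pong-v4 observation and action spaces directly. This is a sketch, not authoritative: verify the spaces against your installed gymnasium and ale-py versions.

import gymnasium as gym

# Depending on your ale-py version, you may first need:
# import ale_py; gym.register_envs(ale_py)
env = gym.make("Pong-v4")
print(env.observation_space)  # typically Box(0, 255, (210, 160, 3), uint8): raw RGB frames
print(env.action_space)       # typically Discrete(6): NOOP, FIRE, RIGHT, LEFT, RIGHTFIRE, LEFTFIRE
env.close()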

Repository content

The pong package contains two files:

  • controller.py contains a general controller class (which you can extend with your own implementation) and a random controller for testing purposes
  • example.py shows the basic evaluation loop for the chosen environment, Pong-v4; a minimal sketch of such a loop follows below

The competition's final evaluation will be performed with the same environment (Pong-v4).
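
The following is a minimal sketch of the evaluation loop, assuming a controller interface like the one described for controller.py. The class and method names here (MyController, control) are illustrative; match them to the actual base class in the repository.

import gymnasium as gym
import numpy as np

# Hypothetical controller following the pattern described above; extend the
# actual base class from pong/controller.py in your own submission.
class MyController:
    def control(self, observation: np.ndarray) -> int:
        return 0  # always NOOP: replace with your interpretable policy

env = gym.make("Pong-v4")  # same environment used for the final evaluation
controller = MyController()
observation, info = env.reset(seed=42)
episode_score, terminated, truncated = 0.0, False, False
while not (terminated or truncated):
    action = controller.control(observation)
    observation, reward, terminated, truncated, info = env.step(action)
    episode_score += reward
env.close()
print(f"Episode score: {episode_score}")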

Note: in some cases, running example.py with render_mode='human' might result in a libGL error. One potential fix is to run the following command in your conda environment:

conda install -c conda-forge libstdcxx-ng

Competition rules

Submissions

The goal is to provide an interpretable control policy that solves the task.

Each submission will have to include:

  • Control policy score, explanation, and pipeline description: a document
    • containing the score obtained by the policy, an interpretability analysis of the policy (covering all relevant information deducible from it), and the pipeline used to obtain it
    • of up to 2 pages in the GECCO format, excluding references
  • Control policy and code: for reproducibility and assessment purposes, we require
    • updated environment file or additional requirements needed to make the code work
    • run file, i.e., a Python script, from which the submitted policy can be assessed on the environment
    • optimization file, i.e., a Python script, from which the optimization process can be reproduced
  • optimization log reporting the progression of the policies' scores during optimization

Evaluation

Each submission will be evaluated according to two criteria:

  • Performance rank, which will be evaluated by simulating the submitted policy
  • Interpretability rank, which will be appraised by a panel of judges, who will consider:
    • Clarity of the pipeline
    • Readability of the model
    • Clarity of the explanation provided
    • Amount of processing required to derive the explanation from the raw policy

These two ranks will be combined via their geometric mean to compute the overall rank for each competition entry.
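
As a sketch of how the combination works (the exact tie-breaking procedure is up to the organizers), the overall rank is the geometric mean of the two per-criterion ranks:

import math

def overall_rank(performance_rank: int, interpretability_rank: int) -> float:
    # Geometric mean of the two per-criterion ranks: lower is better.
    return math.sqrt(performance_rank * interpretability_rank)

# A submission ranked 2nd on performance and 8th on interpretability
# ties with one ranked 4th on both: sqrt(2 * 8) = sqrt(4 * 4) = 4.
print(overall_rank(2, 8))  # 4.0
print(overall_rank(4, 4))  # 4.0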

Prize

Winners will be awarded a certificate.
