CoLLMLight: Cooperative Large Language Model Agents for Network-Wide Traffic Signal Control

1 Introduction

Official code for the paper "CoLLMLight: Cooperative Large Language Model Agents for Network-Wide Traffic Signal Control".

Traffic Signal Control (TSC) plays a critical role in urban traffic management by optimizing traffic flow and mitigating congestion. While Large Language Models (LLMs) have recently emerged as promising tools for TSC due to their exceptional problem-solving and generalization capabilities, existing approaches fail to address the essential need for inter-agent coordination, limiting their effectiveness in achieving network-wide optimization. To bridge this gap, we propose CoLLMLight, a cooperative LLM agent framework for TSC. Specifically, we first construct a structured spatiotemporal graph to capture real-time traffic dynamics and spatial relationships among neighboring intersections, enabling the LLM to reason about complex traffic interactions. Moreover, we introduce a complexity-aware reasoning mechanism that dynamically adapts reasoning depth based on real-time traffic conditions, ensuring optimal computational efficiency without sacrificing decision quality. Finally, we propose a fine-tuning strategy that leverages iterative simulation-driven data collection and environmental feedback to build a lightweight LLM tailored for cooperative TSC. Extensive experiments on both synthetic and real-world datasets demonstrate that CoLLMLight outperforms state-of-the-art methods in diverse traffic scenarios, showcasing its effectiveness, scalability, and robustness.

Figure: Overview of the CoLLMLight framework.
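The actual graph construction and prompt formatting live in the repository code. Purely as an illustration (all class and field names below are assumptions, not the repository's schema), a structured spatiotemporal observation for one intersection and its neighbors might be assembled like this before being serialized into the LLM prompt:

# Illustrative sketch only: hypothetical spatiotemporal observation for one
# intersection and its neighbors; names are assumptions, not the repo's schema.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class LaneState:
    queue_length: int      # vehicles waiting at the stop line
    approaching: int       # vehicles approaching within sensing range

@dataclass
class IntersectionObs:
    intersection_id: str
    current_phase: str
    lanes: Dict[str, LaneState]                                    # lane id -> state
    history: List[Dict[str, int]] = field(default_factory=list)    # recent queue snapshots

@dataclass
class SpatioTemporalGraph:
    center: IntersectionObs
    neighbors: Dict[str, IntersectionObs]    # direction ("N", "E", "S", "W") -> neighbor obs

    def to_prompt(self) -> str:
        # Serialize the graph into text the LLM can reason over.
        lines = [f"Target intersection {self.center.intersection_id}, phase {self.center.current_phase}:"]
        for lane_id, s in self.center.lanes.items():
            lines.append(f"  {lane_id}: queue={s.queue_length}, approaching={s.approaching}")
        for direction, nb in self.neighbors.items():
            total_queue = sum(s.queue_length for s in nb.lanes.values())
            lines.append(f"Neighbor to the {direction} ({nb.intersection_id}): total queue={total_queue}")
        return "\n".join(lines)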

2 Requirements

python>=3.9, tensorflow-cpu=2.8.0, cityflow, pandas=1.5.0, numpy=1.26.2, wandb, transformers=4.48.2, vllm, lmdeploy

CityFlow requires a Linux environment; we run the code on Ubuntu.
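To confirm the environment is set up, you can import the key dependencies and step a CityFlow engine for a few ticks. This is only a sanity-check sketch; the config path is a placeholder and must point to a real CityFlow config file before running.

# Sanity-check the Python environment (the config path is a placeholder).
import pandas, numpy, transformers
print("pandas", pandas.__version__, "| numpy", numpy.__version__,
      "| transformers", transformers.__version__)
import cityflow
# Replace with an actual CityFlow config file before running.
eng = cityflow.Engine("path/to/cityflow_config.json", thread_num=1)
for _ in range(10):
    eng.next_step()    # advance the simulation a few steps
print("lane vehicle counts:", eng.get_lane_vehicle_count())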

3 Quick Start

First, deploy an LLM server with lmdeploy:

lmdeploy serve api_server YOUR_LLM_PATH --tp=YOUR_GPU_NUM
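Once the server is up, you can confirm that it answers OpenAI-compatible requests before launching CoLLMLight. The sketch below assumes lmdeploy's default port 23333 and no API key; adjust it if you started the server with different settings.

# Minimal check that the lmdeploy OpenAI-compatible server is reachable.
# Port 23333 is lmdeploy's default; change it if you passed --server-port.
import requests
base_url = "http://localhost:23333/v1"
models = requests.get(f"{base_url}/models").json()
print("served models:", [m["id"] for m in models["data"]])
resp = requests.post(
    f"{base_url}/chat/completions",
    json={
        "model": models["data"][0]["id"],
        "messages": [{"role": "user", "content": "Reply with OK."}],
        "max_tokens": 8,
    },
)
print(resp.json()["choices"][0]["message"]["content"])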

Then, run CoLLMLight against the deployed model:

python run_CoLLMlight.py --model_path=YOUR_LLM_PATH --dataset='newyork_28x7' --traffic_file='anon_28_7_newyork_real_double.json'

4 CoLLMLight Training

Step 1: Simulation Data Sampling

First, sample the simulation data from the CityFlow environment. Run the following command:

python run_fts.py

The sampled data will be saved in ./data/FinetuneData/SynTrain_sample.json.
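The sampled file is plain JSON, so you can spot-check it before moving on. The snippet below is a small inspection sketch and makes no assumption about the record schema beyond the file being a JSON list or object.

# Peek at the sampled simulation data.
import json
with open("./data/FinetuneData/SynTrain_sample.json") as f:
    samples = json.load(f)
print("number of sampled records:", len(samples))
first = samples[0] if isinstance(samples, list) else next(iter(samples.values()))
print("first record:", list(first.keys()) if isinstance(first, dict) else type(first))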

Step 2: Reasoning Chain Optimization

Set your API key in ./utils/LLMs.py first. Then use GPT-4o and the simulation data to generate synthetic reasoning chains by executing the command below:

python reasoning_tuning_data_synth.py

The generated synthetic reasoning chains will be saved in ./data/FinetuneData/syntrain_reasoning_tuning.json, which you can use to fine-tune your LLM.
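Before fine-tuning, it can help to spot-check a few generated chains. The snippet below assumes each record is a JSON object (e.g., a prompt paired with a reasoning chain); adapt the key handling if the actual schema differs.

# Spot-check the synthetic reasoning-tuning data (record layout is an assumption).
import json, random
with open("./data/FinetuneData/syntrain_reasoning_tuning.json") as f:
    records = json.load(f)
print("records:", len(records))
example = random.choice(records) if isinstance(records, list) else records[next(iter(records))]
for key, value in example.items():
    text = str(value)
    print(f"{key}: {text[:200]}{'...' if len(text) > 200 else ''}")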

Step 3: Refinement

After that, use the fine-tuned LLM to interact with the simulation environment and collect refined fine-tuning data. Run the following commands, replacing YOUR_LLM_PATH with the actual path to your LLM:

lmdeploy serve api_server YOUR_LLM_PATH --tp=YOUR_GPU_NUM
python reasoning_refinement.py --model_path=YOUR_LLM_PATH

Citation

@misc{yuan2025collmlightcooperativelargelanguage,
      title={CoLLMLight: Cooperative Large Language Model Agents for Network-Wide Traffic Signal Control}, 
      author={Zirui Yuan and Siqi Lai and Hao Liu},
      year={2025},
      eprint={2503.11739},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2503.11739}, 
}
