
Robust Multi-Task Gradient Boosting (R-MTGB)

A robust and scalable multi-task learning (MTL) framework that integrates outlier task detection into a structured gradient boosting process. Built with Python and scikit-learn, R-MTGB is designed to generalize well across heterogeneous task sets and is resilient to task-level noise.


📘 About

R-MTGB (Robust Multi-Task Gradient Boosting) is a novel ensemble-based learning framework developed to handle task heterogeneity and task-level noise in multi-task learning settings. The model introduces a three-stage boosting architecture:

  1. Shared Representation Learning: Learns features common across all tasks.
  2. Outlier Task Detection & Weighting: Optimizes regularized, task-specific parameters to dynamically down-weight noisy or outlier tasks.
  3. Task-Specific Fine-Tuning: Refines models individually to capture task-specific nuances.
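The three stages above can be sketched with plain scikit-learn pieces. This is an illustrative toy, not the package's actual implementation: the pooled shared model, the inverse-residual task weighting, and the residual fine-tuning below are simplifications chosen to make the idea concrete.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic multi-task regression: 3 tasks share a linear signal;
# task 2 is an "outlier" task with flipped labels.
n, d, T = 100, 5, 3
X = [rng.normal(size=(n, d)) for _ in range(T)]
w = rng.normal(size=d)
y = [Xt @ w + 0.1 * rng.normal(size=n) for Xt in X]
y[2] = -y[2]  # corrupt task 2

# Stage 1: shared model trained on the pooled data (common structure).
shared = GradientBoostingRegressor(n_estimators=50, random_state=0)
shared.fit(np.vstack(X), np.concatenate(y))

# Stage 2: weight tasks by inverse residual error; noisy/outlier
# tasks keep large residuals under the shared model and get down-weighted.
resid = [np.mean((yt - shared.predict(Xt)) ** 2) for Xt, yt in zip(X, y)]
weights = 1.0 / (1.0 + np.array(resid))
weights /= weights.sum()

# Stage 3: per-task boosting on the shared model's residuals.
specific = []
for Xt, yt in zip(X, y):
    m = GradientBoostingRegressor(n_estimators=50, random_state=0)
    m.fit(Xt, yt - shared.predict(Xt))
    specific.append(m)

def predict(t, Xnew):
    """Combine the shared model with task t's fine-tuned correction."""
    return shared.predict(Xnew) + specific[t].predict(Xnew)
```

On this toy data the corrupted task ends up with the smallest weight, while the task-specific stage still recovers a usable model for it from the shared model's residuals.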

✨ Features

  • Multi-task learning with task-specific and shared components.
  • Automatic outlier task detection.
  • Gradient boosting-based architecture with interpretability.
  • Compatible with various loss functions (regression/classification).
  • Performance analysis with per-task metrics.
  • Synthetic data generator for benchmarking.
  • Scikit-learn compatible design.
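To show what a synthetic multi-task benchmark of this kind might look like, here is a minimal generator in the spirit of the one mentioned in the feature list. The function name, parameters, and corruption scheme (label flipping for outlier tasks) are assumptions for illustration, not the package's actual API:

```python
import numpy as np

def make_multitask_regression(n_tasks=5, n_samples=100, n_features=10,
                              n_outlier_tasks=1, noise=0.1, seed=0):
    """Generate related regression tasks sharing a weight vector; the
    first n_outlier_tasks tasks get flipped labels to act as outliers."""
    rng = np.random.default_rng(seed)
    w_shared = rng.normal(size=n_features)
    Xs, ys = [], []
    for t in range(n_tasks):
        X = rng.normal(size=(n_samples, n_features))
        # each task deviates slightly from the shared weights
        w_task = w_shared + 0.1 * rng.normal(size=n_features)
        y = X @ w_task + noise * rng.normal(size=n_samples)
        if t < n_outlier_tasks:
            y = -y  # corrupt the outlier task's labels
        Xs.append(X)
        ys.append(y)
    return Xs, ys
```

A generator like this makes it easy to benchmark how strongly the outlier tasks are down-weighted as their number or corruption level grows.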

💻 Installation

Clone the repository and install the dependencies listed in requirements.txt:

```shell
git clone https://github.com/GAA-UAM/R-MTGB.git
cd R-MTGB
pip install -r requirements.txt
```

🔑 License

The package is licensed under the GNU Lesser General Public License v2.1.

📚 Citations

If you use R-MTGB in your research or work, please cite the arXiv preprint below. The manuscript is currently under review at the journal Neurocomputing:

@misc{emami2025robustmultitaskgradientboosting,
      title={Robust-Multi-Task Gradient Boosting}, 
      author={Seyedsaman Emami and Gonzalo Martínez-Muñoz and Daniel Hernández-Lobato},
      year={2025},
      eprint={2507.11411},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2507.11411}, 
}

👨‍💻 Authors


Documentation

To get started with this project, please refer to the Wiki.

🤝 Contributing

Contributions are welcome! Please open an issue or submit a pull request.


💾 Release Information

Version: 0.0.1

Updated: 05 June 2025

Date released: 26 Jan 2024