R-MTGB (Robust Multi-Task Gradient Boosting) is an ensemble-based learning framework for handling task heterogeneity and task-level noise in multi-task learning (MTL). Built with Python and scikit-learn, it integrates outlier task detection into a structured gradient boosting process, generalizes across heterogeneous task sets, and is resilient to task-level noise. The model introduces a three-stage boosting architecture:
- Shared Representation Learning: Learns features common across all tasks.
- Outlier Task Detection & Weighting: Optimizes regularized, task-specific parameters to dynamically down-weight noisy or outlier tasks.
- Task-Specific Fine-Tuning: Refines models individually to capture task-specific nuances.
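A minimal sketch of the three stages on synthetic data. This is an illustration of the concept only, not the actual R-MTGB implementation: the inverse-error task weighting and the residual-based fine-tuning below are assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic multi-task data: 3 tasks, one of them noisy (an "outlier" task)
tasks = []
for t in range(3):
    X = rng.normal(size=(100, 5))
    noise = 5.0 if t == 2 else 0.1          # task 2 is the outlier
    y = X @ rng.normal(size=5) + rng.normal(scale=noise, size=100)
    tasks.append((X, y))

# Stage 1 (shared representation): one model fit on the pooled data
X_all = np.vstack([X for X, _ in tasks])
y_all = np.concatenate([y for _, y in tasks])
shared = GradientBoostingRegressor(n_estimators=50).fit(X_all, y_all)

# Stage 2 (outlier detection & weighting): down-weight tasks whose
# residuals under the shared model are large (toy inverse-error scheme)
resid = np.array([np.mean((y - shared.predict(X)) ** 2) for X, y in tasks])
weights = 1.0 / (1.0 + resid)
weights /= weights.sum()

# Stage 3 (task-specific fine-tuning): one residual model per task
specific = [
    GradientBoostingRegressor(n_estimators=50).fit(X, y - shared.predict(X))
    for X, y in tasks
]

print(np.round(weights, 3))  # the noisy task receives the smallest weight
```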
- Multi-task learning with task-specific and shared components.
- Automatic outlier task detection.
- Gradient boosting-based architecture with interpretability.
- Compatible with various loss functions (regression/classification).
- Performance analysis with per-task metrics.
- Synthetic data generator for benchmarking.
- Scikit-learn compatible design.
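To hint at what a scikit-learn compatible design means in practice, here is a toy stand-in estimator (the class name and the task-id-column convention are assumptions for illustration, not the package's real interface) that plugs into standard scikit-learn tooling such as `clone` and `cross_val_score`:

```python
import numpy as np
from sklearn.base import BaseEstimator, RegressorMixin, clone
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

class ToyMultiTaskBooster(BaseEstimator, RegressorMixin):
    """Stand-in estimator: the last column of X is assumed to be the task id."""

    def __init__(self, n_estimators=30):
        self.n_estimators = n_estimators

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
        feats, tasks = X[:, :-1], X[:, -1].astype(int)
        # Shared component fit on the pooled data ...
        self.shared_ = GradientBoostingRegressor(
            n_estimators=self.n_estimators).fit(feats, y)
        # ... plus one residual model per task
        resid = y - self.shared_.predict(feats)
        self.task_models_ = {
            t: GradientBoostingRegressor(n_estimators=self.n_estimators)
                .fit(feats[tasks == t], resid[tasks == t])
            for t in np.unique(tasks)
        }
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        feats, tasks = X[:, :-1], X[:, -1].astype(int)
        pred = self.shared_.predict(feats)
        for t, model in self.task_models_.items():
            mask = tasks == t
            if mask.any():
                pred[mask] += model.predict(feats[mask])
        return pred

# Because the estimator follows the scikit-learn contract,
# standard utilities work out of the box:
rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(size=(120, 4)), rng.integers(0, 3, size=120)])
y = X[:, :4].sum(axis=1) + 0.1 * rng.normal(size=120)
est = clone(ToyMultiTaskBooster(n_estimators=20))
scores = cross_val_score(est, X, y, cv=3)
print(scores.shape)  # (3,)
```

Following the `BaseEstimator`/`Mixin` contract (all constructor arguments stored as attributes, fitted state in trailing-underscore attributes) is what lets an estimator participate in pipelines, grid search, and cross-validation without extra glue code.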
Clone the repository and install the dependencies from requirements.txt:
git clone https://github.com/GAA-UAM/R-MTGB.git
cd R-MTGB
pip install -r requirements.txt
The package is licensed under the GNU Lesser General Public License v2.1.
If you use R-MTGB in your research or work, please cite this project using the following format. The citation refers to the arXiv preprint; the manuscript is currently under review at the journal Neurocomputing:
@misc{emami2025robustmultitaskgradientboosting,
  title={Robust-Multi-Task Gradient Boosting},
  author={Seyedsaman Emami and Gonzalo Martínez-Muñoz and Daniel Hernández-Lobato},
  year={2025},
  eprint={2507.11411},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2507.11411},
}
To get started with this project, please refer to the Wiki.
Contributions are welcome! Please open an issue or submit a pull request.
Version: 0.0.1
05 June 2025
26 Jan 2024