
An Empirical Study of Pre-trained Model Selection for Out-of-Distribution Generalization and Calibration (TMLR 2025)

Abstract

In out-of-distribution (OOD) generalization tasks, fine-tuning pre-trained models has become a prevalent strategy. Unlike most prior work, which has focused on advancing learning algorithms, we systematically examined how pre-trained model size, pre-training dataset size, and training strategies affect generalization and uncertainty calibration on downstream tasks. We evaluated 100 models spanning diverse pre-trained model sizes, five pre-training datasets, and five data augmentations through extensive experiments on four distribution-shift datasets, totaling over 120,000 GPU hours. Our results demonstrate the significant impact of pre-trained model selection, with optimal choices substantially improving OOD accuracy over algorithmic improvements alone. Additionally, we find that larger models and larger pre-training datasets not only enhance OOD performance but also improve calibration, helping to mitigate overconfidence; this contrasts with some prior studies that found modern deep networks to be worse calibrated than classical shallow models. Our work underscores the overlooked importance of pre-trained model selection for out-of-distribution generalization and calibration.
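Calibration in this setting is typically quantified with the expected calibration error (ECE), which compares a model's confidence to its accuracy within confidence bins. The snippet below is a minimal PyTorch sketch of an equal-width-bin ECE on max-softmax confidences; the function name and bin count are illustrative choices, not this repository's exact evaluation code.

```python
import torch

def expected_calibration_error(confidences, predictions, labels, n_bins=15):
    """Equal-width-bin ECE: weighted average |confidence - accuracy| gap per bin.

    confidences: (N,) max softmax probability for each sample
    predictions: (N,) predicted class indices
    labels:      (N,) ground-truth class indices
    """
    accuracies = predictions.eq(labels).float()
    bin_edges = torch.linspace(0.0, 1.0, n_bins + 1)
    ece = torch.zeros(1)
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        prop_in_bin = in_bin.float().mean()           # fraction of samples in this bin
        if prop_in_bin > 0:
            acc_in_bin = accuracies[in_bin].mean()    # empirical accuracy in the bin
            conf_in_bin = confidences[in_bin].mean()  # mean confidence in the bin
            ece += (conf_in_bin - acc_in_bin).abs() * prop_in_bin
    return ece.item()
```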

Prerequisites

  • Python >= 3.6.5
  • PyTorch >= 1.6.0
  • cuDNN >= 7.6.2
  • CUDA >= 10.0
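
A quick sanity check of the installed versions (a minimal sketch using standard PyTorch introspection; the expected values mirror the list above):

```python
import torch

print("PyTorch:", torch.__version__)             # expect >= 1.6.0
print("CUDA runtime:", torch.version.cuda)       # expect >= 10.0
print("cuDNN:", torch.backends.cudnn.version())  # e.g. 7602 for cuDNN 7.6.2
print("CUDA available:", torch.cuda.is_available())
```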

Downloads

Datasets for the Domain Generalization Task

  1. VLCS
  2. PACS
  3. OfficeHome
  4. DomainNet
  5. WILDS Camelyon17

Pre-Trained Models from Timm
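
Pre-trained backbones are obtained through the timm library. The snippet below is a minimal sketch of loading one such model together with its matching input transform; the architecture name and class count are placeholders for illustration, not the full sweep of 100 models used in the paper.

```python
import timm
import torch
from timm.data import resolve_data_config, create_transform

# Load an ImageNet-pretrained backbone and replace the classifier head
# ("resnet50" and num_classes=7 are placeholders, e.g. the 7 PACS classes).
model = timm.create_model("resnet50", pretrained=True, num_classes=7)
model.eval()

# Build the preprocessing pipeline that matches the model's pre-training config.
config = resolve_data_config({}, model=model)
transform = create_transform(**config)

# Forward a dummy batch with the expected input size to check the setup.
x = torch.randn(1, *config["input_size"])
with torch.no_grad():
    logits = model(x)
print(logits.shape)  # torch.Size([1, 7])
```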

Implementation

For DomainBed, we follow the official implementation linked below.

Citation

Paper authors

* denotes equal contribution
