Working paper
Tackling Domain Shift in Bird Audio Classification via Transfer Learning and Semi-Supervised Distillation: A Case Study on BirdCLEF+ 2025
Year: 2025
Published in: CLEF Working Notes
Keywords: Birdcall classification, Transfer learning, Semi-supervised learning, Domain adaptation
We present our solution from team volodymyr vialactea to the BirdCLEF+ 2025 challenge, which achieved state-of-the-art performance, placing 2nd on the Private Leaderboard with a ROC AUC of 0.928 on the Private test set and 0.925 on the Public test set. Our system is built on five key components: a strong baseline model, in-domain transfer learning, semi-supervised learning implemented via model distillation to mitigate domain shift, post-processing, and model ensembling. We conduct an ablation study to evaluate the contribution of each component and analyze the effects of different augmentations and data setups. Furthermore, we investigate the domain shift between the training and test distributions and explore strategies for its mitigation.
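To make the "semi-supervised learning via model distillation" component concrete, the sketch below shows one generic way such a scheme can be set up: a frozen teacher produces soft pseudo-labels on unlabeled test-domain soundscape clips, and a student is trained to match them with a multi-label BCE loss. This is a minimal illustration under assumed shapes, model sizes, and hyperparameters; it is not the authors' actual architecture or training configuration.

```python
# Hypothetical sketch of teacher-student distillation on unlabeled soundscapes.
# All sizes and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

NUM_CLASSES = 206                 # assumed label-space size, not from the paper
BATCH, MELS, FRAMES = 8, 128, 256  # assumed mel-spectrogram chunk shape

def make_backbone() -> nn.Module:
    # Stand-in CNN over 1 x MELS x FRAMES mel-spectrogram inputs.
    return nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(16, NUM_CLASSES),
    )

teacher = make_backbone().eval()   # e.g. a stronger checkpoint or ensemble
student = make_backbone().train()
for p in teacher.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.AdamW(student.parameters(), lr=1e-3)
criterion = nn.BCEWithLogitsLoss()  # soft targets in a multi-label setting

# Unlabeled test-domain audio chunks (random tensors as placeholders).
unlabeled = torch.randn(BATCH, 1, MELS, FRAMES)

with torch.no_grad():
    pseudo_labels = torch.sigmoid(teacher(unlabeled))  # soft pseudo-labels

logits = student(unlabeled)
loss = criterion(logits, pseudo_labels)                # distillation loss
loss.backward()
optimizer.step()
print(f"distillation loss: {loss.item():.4f}")
```

Training the student on soft teacher outputs over unlabeled, test-like audio is one common way to reduce the gap between the labeled training distribution and the soundscape test distribution; the paper's ablation study evaluates how much this component contributes in practice.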