Update Your Transformer to the Latest Release: Re-Basin of Task Vectors
Proceedings of the Forty-Second International Conference on Machine Learning
July 2025
Model Compositionality · Task Vectors · Re-Basin · Transformers
This work tackles a fundamental problem in modern machine learning: when a foundation model is updated with a new checkpoint, every model fine-tuned from the previous release becomes incompatible with it and must be retrained from scratch. We introduce a data-free re-basin method that transfers task-specific knowledge from fine-tuned models to the updated pretrained backbone without any retraining. Notably, the approach applies to any Transformer-based architecture, including models with novel attention mechanisms and backbones that have not previously been fine-tuned.
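To make the core idea concrete, here is a minimal conceptual sketch in PyTorch. It is our own illustration, not the paper's implementation: the function name, the state-dict interface, and the `permutations` argument (a hypothetical mapping from parameter names to permutation index tensors, standing in for the paper's data-free alignment step) are all assumptions. A task vector is the weight difference between a fine-tuned model and the backbone it was trained from; re-basin permutes that difference so it can be grafted onto the updated backbone.

```python
import torch


def transfer_task_vector(old_base: dict, finetuned: dict,
                         new_base: dict, permutations: dict) -> dict:
    """Sketch: carry a task vector from an old backbone to a new release.

    All arguments are parameter state dicts assumed to share the same keys
    and shapes. `permutations` is a hypothetical output of a weight-only
    alignment step mapping parameter names to index tensors; the paper's
    actual alignment procedure is not spelled out in this abstract.
    """
    new_state = {}
    for name, w_new in new_base.items():
        # Task vector: the delta the fine-tuning run added to the old backbone.
        tau = finetuned[name] - old_base[name]

        # Re-basin: permute the delta into the new backbone's basin so that
        # functionally matching units line up (illustrated here as a simple
        # row permutation along the output dimension).
        if name in permutations:
            tau = tau[permutations[name]]

        # Graft the aligned task vector onto the updated weights.
        new_state[name] = w_new + tau
    return new_state
```

Under these assumptions the update is a single pass over the weights, with no gradients and no training data, which is what makes the transfer data-free.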