Title | Top-tuning: A study on transfer learning for an efficient alternative to fine tuning for image classification with fast kernel methods |
Publication Type | Journal Article |
Year of Publication | 2024 |
Authors | Alfano, P. D., Pastore, V. P., Rosasco, L., Odone, F.
Journal | Image and Vision Computing |
Volume | 142 |
Pagination | 104894 |
Date Published | 02/2024 |
ISSN | 0262-8856
Abstract | The impressive performance of deep learning architectures comes with a massive increase in model complexity. Millions of parameters need to be tuned, with training and inference time scaling accordingly, together with energy consumption. But is massive fine-tuning always necessary? In this paper, focusing on image classification, we consider a simple transfer learning approach that exploits pre-trained convolutional features as input for a fast-to-train kernel method. We refer to this approach as top-tuning, since only the kernel classifier is trained on the target dataset. In our study, we perform more than 3000 training processes on 32 small- to medium-sized target datasets, a typical situation where transfer learning is necessary. We show that the top-tuning approach provides accuracy comparable to fine-tuning, with training times between one and two orders of magnitude smaller. These results suggest that top-tuning is an effective alternative to fine-tuning on small/medium datasets, and is especially useful when training-time efficiency and computational resource savings are crucial.
URL | https://linkinghub.elsevier.com/retrieve/pii/S0262885623002688
DOI | 10.1016/j.imavis.2023.104894 |
Short Title | Image and Vision Computing |
CBMM Relationship:
- CBMM Related
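
Note: the abstract above describes a two-stage pipeline in which a frozen, pre-trained convolutional backbone extracts features and only a kernel classifier on top is trained on the target dataset. The code below is a minimal illustrative sketch of that idea, not the authors' exact pipeline: the backbone (ResNet-18), the target dataset (CIFAR-10), the hyperparameters, and the use of scikit-learn's kernel SVM in place of the paper's fast-to-train kernel method are all assumptions made for illustration.

# Sketch of "top-tuning": frozen pre-trained CNN features + a trained kernel classifier.
# The backbone weights are never updated; only the classifier on top is fit.
import torch
import torchvision
from torchvision import transforms
from sklearn.svm import SVC

# Frozen pre-trained backbone used purely as a feature extractor (assumed choice).
backbone = torchvision.models.resnet18(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Identity()   # drop the ImageNet classification head
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def extract_features(dataset, batch_size=64):
    """Run the frozen backbone over a dataset and collect feature vectors."""
    loader = torch.utils.data.DataLoader(dataset, batch_size=batch_size)
    feats, labels = [], []
    with torch.no_grad():
        for images, targets in loader:
            feats.append(backbone(images))
            labels.append(targets)
    return torch.cat(feats).numpy(), torch.cat(labels).numpy()

# Example small/medium target dataset (an assumption; any image classification set works).
train_set = torchvision.datasets.CIFAR10(root="data", train=True,
                                         download=True, transform=preprocess)
test_set = torchvision.datasets.CIFAR10(root="data", train=False,
                                        download=True, transform=preprocess)

X_train, y_train = extract_features(train_set)
X_test, y_test = extract_features(test_set)

# Only this kernel classifier is trained on the target data; here an RBF SVM
# stands in for the fast-to-train kernel method used in the paper.
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))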