Ablation studies validated the design choices made for model distillation when constructing the MegaDescriptor for animal re-identification. The studies compared the performance of ArcFace and Triplet loss paired with either a Swin-B transformer or an EfficientNet-B3 backbone; the Swin-B with ArcFace combination remained competitive with, or outperformed, every other variant. The results also suggest that Triplet loss underperforms ArcFace even with well-tuned hyperparameters.
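The ArcFace objective referenced above penalizes the angle between an embedding and its class center before softmax. A minimal NumPy sketch of that margin step (the function name and the default scale `s=64.0` and margin `m=0.5` are illustrative assumptions, not the study's exact settings):

```python
import numpy as np

def arcface_logits(embeddings, class_weights, labels, s=64.0, m=0.5):
    """Return scaled, margin-penalized logits (hypothetical helper)."""
    # L2-normalize embeddings and class weights so dot products are cosines
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    w = class_weights / np.linalg.norm(class_weights, axis=1, keepdims=True)
    cos = e @ w.T  # (batch, n_classes) cosine similarities
    theta = np.arccos(np.clip(cos, -1.0, 1.0))
    # add the angular margin m only to each sample's target-class angle
    margin = np.zeros_like(cos)
    margin[np.arange(len(labels)), labels] = m
    return s * np.cos(theta + margin)
```

The margin makes the target class harder to satisfy during training, which tightens intra-class clusters in the embedding space; Triplet loss pursues a similar goal but via sampled anchor-positive-negative triples rather than class centers.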
Hyperparameter tuning used a comprehensive grid search to select the optimal configuration. The best ArcFace setting achieved a median performance of 87.3%, with quantiles of 49.2% and 96.4%, a marked improvement over the alternatives.
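A grid search of this kind exhaustively evaluates the Cartesian product of candidate values. A short sketch, where the grid contents and the `evaluate` stand-in are hypothetical (the source does not list the actual search space):

```python
from itertools import product

# hypothetical search space; the study's actual grid is not given here
grid = {
    "lr": [1e-4, 1e-3, 1e-2],
    "margin": [0.3, 0.5],
    "scale": [32, 64],
}

def evaluate(params):
    # stand-in for a full train-and-validate run returning a score;
    # a real search would train the model and measure re-id accuracy
    return -abs(params["lr"] - 1e-3) - abs(params["margin"] - 0.5)

# try every combination and keep the highest-scoring configuration
best = max(
    (dict(zip(grid, values)) for values in product(*grid.values())),
    key=evaluate,
)
```

Exhaustive search is tractable here because the grid is small; larger spaces usually call for random or Bayesian search instead.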
#ablation-studies #model-distillation #metric-learning #hyperparameter-tuning #performance-evaluation