ResearchTrend.AI
  3. 2310.02048

Exploring Generalisability of Self-Distillation with No Labels for SAR-Based Vegetation Prediction

3 October 2023
Laura Martínez-Ferrer
Anna Jungbluth
Joseph A. Gallego-Mejia
Matt Allen
Francisco Dorr
F. Kalaitzis
Raúl Ramos-Pollán
Abstract

In this work, we pre-train a DINO-ViT based model using two Synthetic Aperture Radar datasets (S1GRD or GSSIC) across three regions (China, CONUS, Europe). We fine-tune the models on smaller labeled datasets to predict vegetation percentage, and empirically study the connection between the models' embedding spaces and their ability to generalize across diverse geographic regions and to unseen data. For S1GRD, the embedding spaces of different regions are clearly separated, while those of GSSIC overlap. Positional patterns persist during fine-tuning, and greater distances in the embedding space often correspond to higher errors on unfamiliar regions. This work thus improves our understanding of the generalizability of self-supervised models applied to remote sensing.
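The abstract's core observation, that greater embedding-space distances between regions often coincide with higher errors on unfamiliar regions, can be probed with a simple centroid-distance measure. The sketch below is illustrative only, not the authors' code: the embeddings are synthetic placeholders standing in for pooled DINO-ViT features, and the function name and data are assumptions.

```python
import numpy as np


def region_centroid_distances(embeddings, labels):
    """Euclidean distance between region centroids in embedding space.

    embeddings: (N, D) array of model embeddings (e.g. pooled ViT features)
    labels: length-N sequence of region names for each embedding
    Returns a dict mapping each region pair to its centroid distance.
    """
    labels = np.asarray(labels)
    regions = sorted(set(labels.tolist()))
    centroids = {r: embeddings[labels == r].mean(axis=0) for r in regions}
    dists = {}
    for i, a in enumerate(regions):
        for b in regions[i + 1:]:
            dists[(a, b)] = float(np.linalg.norm(centroids[a] - centroids[b]))
    return dists


# Toy example: three synthetic "regions" drawn around different means,
# mimicking well-separated (S1GRD-like) vs. overlapping (GSSIC-like) clusters.
rng = np.random.default_rng(0)
emb = np.concatenate(
    [rng.normal(loc=m, scale=1.0, size=(10, 64)) for m in (0.0, 1.0, 5.0)]
)
lab = ["China"] * 10 + ["CONUS"] * 10 + ["Europe"] * 10
d = region_centroid_distances(emb, lab)
```

Under the paper's observation, pairs with larger centroid distances would be the ones on which a model fine-tuned in one region degrades most when evaluated on the other.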
