Radio Galaxy Zoo: towards building the first multipurpose foundation model for radio astronomy with self-supervised learning

Inigo V Slijepcevic, Anna M M Scaife, Mike Walmsley, Micah Bowles, O Ivy Wong, Stanislav S Shabala, Sarah V White

Research output: Contribution to journal › Article › peer-review

Abstract

In this work, we apply self-supervised learning with instance differentiation to learn a robust, multipurpose representation for image analysis of resolved extragalactic continuum images. We train a multi-use model which compresses our unlabelled data into a structured, low-dimensional representation which can be used for a variety of downstream tasks (e.g. classification, similarity search). We exceed baseline supervised Fanaroff–Riley classification performance by a statistically significant margin, with our model reducing the test set error by up to half. Our model is also able to maintain high classification accuracy with very few labels, reaching only 7.79 per cent error when using just 145 labels. We further demonstrate that by using our foundation model, users can efficiently trade off compute, human labelling cost, and test set accuracy according to their respective budgets, allowing for efficient classification in a wide variety of scenarios. We highlight the generalizability of our model by showing that it enables accurate classification in a label-scarce regime with data from the new MIGHTEE survey without any hyperparameter tuning, where it improves upon the baseline by ∼8 per cent. Visualizations of our labelled and unlabelled data show that our model’s representation space is structured with respect to physical properties of the sources, such as angular source extent. We show that the learned representation is scientifically useful even if no labels are available by performing a similarity search, finding hybrid sources in the RGZ DR1 data set without any labels. We show that good augmentation design and hyperparameter choice can help achieve peak performance, while emphasizing that optimal hyperparameters are not required to obtain benefits from self-supervised pre-training.
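The workflow the abstract describes reuses a frozen self-supervised representation for downstream tasks such as few-label Fanaroff–Riley classification and similarity search. The sketch below illustrates that idea on placeholder embeddings; the encoder output, embedding dimension, and logistic-regression probe are illustrative assumptions, not the authors' published implementation.

```python
# Hypothetical sketch: reusing a frozen self-supervised representation for
# similarity search and few-label classification, in the spirit of the
# abstract. The embeddings below are random stand-ins for encoder outputs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import normalize

rng = np.random.default_rng(0)

# Stand-ins for embeddings of radio galaxy cutouts; in practice these would
# come from the frozen pre-trained network (dimension chosen arbitrarily).
n_unlabelled, n_labelled, dim = 10_000, 145, 512
unlabelled_z = rng.normal(size=(n_unlabelled, dim))
labelled_z = rng.normal(size=(n_labelled, dim))
labels = rng.integers(0, 2, size=n_labelled)          # e.g. FR I vs FR II

# Similarity search: cosine-similarity nearest neighbours in representation space.
unlabelled_z = normalize(unlabelled_z)
query = normalize(labelled_z[:1])                      # one query source
scores = (unlabelled_z @ query.T).ravel()
top_k = np.argsort(scores)[::-1][:10]                  # ten most similar sources
print("Most similar unlabelled sources:", top_k)

# Few-label classification: a linear probe trained on the frozen representation.
probe = LogisticRegression(max_iter=1000)
probe.fit(labelled_z, labels)
print("Linear-probe training accuracy:", probe.score(labelled_z, labels))
```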
Original language: English
Pages (from-to): 19-32
Journal: RAS Techniques and Instruments
Early online date: 18 Dec 2023
DOIs
Publication status: Published - 5 Jan 2024
