Bootstrap Your Own Variance
In collaboration with University of Göttingen
Authors: Polina Turishcheva*, Jason Ramapuram*, Sinead Williamson*, Dan Busbridge, Eeshan Dhekane, Russ Webb
This paper was accepted at the workshop "Self-Supervised Learning - Theory and Practice" at NeurIPS 2023.
*=Equal Contributors
Understanding model uncertainty is important for many applications. We propose Bootstrap Your Own Variance (BYOV), which combines Bootstrap Your Own Latent (BYOL), a negative-free Self-Supervised Learning (SSL) algorithm, with Bayes by Backprop (BBB), a Bayesian method for estimating model posteriors. We find that the predictive standard deviation learned by BYOV, compared against a supervised BBB model, is well captured by a Gaussian distribution, providing preliminary evidence that the learned parameter posterior is useful for label-free uncertainty estimation. BYOV improves upon the deterministic BYOL baseline (+2.83% test ECE, +1.03% test Brier) and shows better calibration and reliability under various augmentations (e.g., +2.4% test ECE, +1.2% test Brier for Salt & Pepper noise).
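To make the two ingredients concrete, the sketch below shows a minimal mean-field Bayes-by-Backprop linear layer together with a BYOL-style exponential-moving-average target update, written in PyTorch. This is an illustrative reconstruction, not the paper's implementation: the names (BayesLinear, ema_update), the initialization scales, the standard-normal prior, and the momentum tau=0.996 are all assumptions made here for clarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesLinear(nn.Module):
    """Mean-field variational linear layer in the spirit of Bayes by Backprop.

    Each weight gets a Gaussian posterior q(w) = N(mu, sigma^2); a forward
    pass samples weights via the reparameterization trick, so every call is
    stochastic. Initialization scales here are illustrative assumptions.
    """

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.w_mu = nn.Parameter(torch.empty(out_features, in_features).normal_(0.0, 0.05))
        # rho parameterizes sigma = softplus(rho), keeping sigma positive.
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_rho = nn.Parameter(torch.full((out_features,), -5.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w_sigma = F.softplus(self.w_rho)
        b_sigma = F.softplus(self.b_rho)
        # Reparameterization: w = mu + sigma * eps, with eps ~ N(0, I).
        w = self.w_mu + w_sigma * torch.randn_like(w_sigma)
        b = self.b_mu + b_sigma * torch.randn_like(b_sigma)
        return F.linear(x, w, b)

    def kl_to_standard_normal(self) -> torch.Tensor:
        """KL(q || N(0, I)), the complexity term added to the BBB loss."""
        kl = torch.zeros((), device=self.w_mu.device)
        for mu, rho in ((self.w_mu, self.w_rho), (self.b_mu, self.b_rho)):
            sigma = F.softplus(rho)
            kl = kl + 0.5 * (sigma.pow(2) + mu.pow(2) - 1.0 - 2.0 * sigma.log()).sum()
        return kl


@torch.no_grad()
def ema_update(target: nn.Module, online: nn.Module, tau: float = 0.996) -> None:
    """BYOL-style target update: slow exponential moving average of the online net."""
    for p_t, p_o in zip(target.parameters(), online.parameters()):
        p_t.mul_(tau).add_(p_o, alpha=1.0 - tau)


# Illustrative check: several stochastic passes give a predictive spread.
layer = BayesLinear(128, 10)
x = torch.randn(4, 128)
samples = torch.stack([layer(x) for _ in range(32)])  # shape (32, 4, 10)
predictive_std = samples.std(dim=0)  # per-example, per-output uncertainty
```

Repeating stochastic forward passes through such layers yields a spread of predictions whose standard deviation can be read off as a label-free uncertainty estimate, which is the quantity the abstract compares against a supervised BBB model.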