HMSN: Hyperbolic Self-Supervised Learning by Clustering with Ideal Prototypes

Aiden Durrant*, Georgios Leontidis

*Corresponding author for this work

Research output: Working paper › Preprint


Abstract

Hyperbolic manifolds for visual representation learning allow for effective learning of semantic class hierarchies by naturally embedding tree-like structures with low distortion in a low-dimensional representation space. The highly separable semantic class hierarchies produced by hyperbolic learning have been shown to be powerful in low-shot tasks; however, their application in self-supervised learning has yet to be fully explored. In this work, we explore the use of hyperbolic representation space for self-supervised representation learning with prototype-based clustering approaches. First, we extend Masked Siamese Networks to operate on the Poincaré ball model of hyperbolic space; second, we place prototypes on the ideal boundary of the Poincaré ball. Unlike previous methods, we project to the hyperbolic space at the output of the encoder network and utilise a hyperbolic projection head to ensure that the representations used for downstream tasks remain hyperbolic. Empirically, we demonstrate that these methods perform comparably to Euclidean methods in lower dimensions on linear evaluation tasks, while showing improvements on extreme few-shot learning tasks.
Original language: English
Publisher: ArXiv
Pages: 1-12
Number of pages: 12
DOIs
Publication status: Published - 18 May 2023

Keywords

  • Machine Learning
  • Deep Learning
  • Self-Supervised Learning
  • Hyperbolic Learning
