eScholarship
Open Access Publications from the University of California

Superconducting Hyperdimensional Associative Memory Circuit for Scalable Machine Learning

Abstract

We propose a generalized architecture for the first rapid single-flux-quantum (RSFQ) associative memory circuit. The circuit employs hyperdimensional computing (HDC), a machine learning (ML) paradigm that represents information with vectors of dimensionality in the thousands. Compared to superconducting neural network accelerators (SNNAs), HDC designs have small memory footprints, simple computations, and simple training algorithms, making them a better option for scalable SFQ ML solutions. The proposed superconducting HDC (SHDC) circuit uses entirely on-chip RSFQ memory tightly integrated with logic, operates at 33.3 GHz, is applicable to general ML tasks, and is manufacturable at practically useful scales given current SFQ fabrication limits. Tailored to a language recognition task, SHDC consists of 2-20 M Josephson junctions (JJs) and consumes as little as one-third the power of an analogous CMOS HDC circuit while achieving 78-84% higher throughput. SHDC outperforms the state-of-the-art RSFQ SNNA, SuperNPU, by 48-99% on all benchmark NN architectures tested while occupying up to 90% less area and consuming as little as one-ninth the power. To the best of the authors' knowledge, SHDC is currently the only superconducting ML approach that is feasible at practically useful scales for real-world ML tasks and capable of online learning.
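To make the HDC paradigm concrete, the following is a minimal software sketch of hyperdimensional language recognition with character n-grams, a task of the kind the abstract mentions. It is an illustration of the general HDC style (random bipolar hypervectors, binding by element-wise multiplication with permutation, bundling by addition, and nearest-prototype classification by cosine similarity), not the paper's circuit; the dimensionality, alphabet, and example sentences are assumptions.

```python
import numpy as np

D = 10000  # hypervector dimensionality; HDC typically uses thousands of dims
rng = np.random.default_rng(0)

# One random bipolar item hypervector per character (assumed alphabet)
alphabet = "abcdefghijklmnopqrstuvwxyz "
item = {ch: rng.choice([-1, 1], size=D) for ch in alphabet}

def encode(text, n=3):
    """Bundle (sum) the bound hypervectors of all character n-grams in text."""
    acc = np.zeros(D)
    for i in range(len(text) - n + 1):
        gram = np.ones(D, dtype=int)
        for j, ch in enumerate(text[i:i + n]):
            # np.roll permutes to encode position; element-wise * binds them
            gram *= np.roll(item[ch], j)
        acc += gram
    return acc

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "Training" is a single pass: bundle examples of a class into one prototype,
# which is why HDC training is simple compared to gradient-based NN training.
proto_en = encode("the quick brown fox jumps over the lazy dog")
proto_es = encode("el rapido zorro marron salta sobre el perro perezoso")

# Classify a query by comparing it against the class prototypes; the query
# shares many trigrams with the English prototype, so it should score higher.
query = encode("a lazy dog sleeps")
print(cosine(query, proto_en), cosine(query, proto_es))
```

Because the per-dimension operations are just sign flips, additions, and comparisons, an associative memory of such prototypes maps naturally onto simple, fast logic, which is the property the SHDC circuit exploits.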
