Extropic’s ‘Lite’ Paper Unveils Vision for Next-Generation AI Tech, Superconducting Chips
Insider Brief
- Extropic, a semi-stealthy startup, released a paper detailing its technological approach.
- According to the paper, the company is building technology that is underpinned by the development of parameterized stochastic analog circuits.
- The company’s “accelerators” promise vast improvements in both runtime and energy efficiency for sampling-heavy algorithms.
For a lite paper, it’s some heavy reading.
That being said, Extropic, a somewhat stealthy startup that recently announced a $14.1 million seed round, released the paper to better flesh out what the company is up to.
According to the paper, Extropic is building technology that is underpinned by the development of parameterized stochastic analog circuits. Put another way, these are adjustable electronic circuits that handle a wide range of tasks by mimicking the randomness found in nature, making them especially useful for complex computing tasks that involve uncertainty or prediction.
If scalable, this represents a significant departure from conventional digital computing.
The Extropic “accelerators” promise vast improvements in both runtime and energy efficiency for algorithms that require sampling from complex energy landscapes. Inspired by the principles of Brownian motion, this new class of accelerators leverages programmable randomness, positioning Extropic at the forefront of generative AI innovation.
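The lite paper does not publish reference code, but the sampling principle it invokes, Brownian motion exploring an energy landscape, can be sketched in a few lines of software. Below is a minimal, illustrative Langevin-dynamics sampler in Python; the double-well energy function and step parameters are our own assumptions, not Extropic’s, and the company’s accelerators would perform this kind of exploration physically rather than in simulation.

```python
import numpy as np

def langevin_sample(energy_grad, x0, steps=10_000, step_size=0.01, rng=None):
    """Overdamped Langevin dynamics: drift down the energy gradient plus
    Brownian noise, so the trajectory samples from exp(-E(x))."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((steps,) + x.shape)
    for t in range(steps):
        noise = rng.standard_normal(x.shape)
        # x_{t+1} = x_t - eta * grad E(x_t) + sqrt(2 * eta) * N(0, 1)
        x = x - step_size * energy_grad(x) + np.sqrt(2.0 * step_size) * noise
        samples[t] = x
    return samples

# Illustrative double-well energy E(x) = (x^2 - 1)^2 (our assumption,
# standing in for the "complex energy landscapes" the paper mentions).
grad_E = lambda x: 4.0 * x * (x**2 - 1.0)
samples = langevin_sample(grad_E, x0=np.zeros(1))
print(samples.mean(), samples.std())  # density concentrates near x = +/- 1
```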
There’s a real timeliness to the company’s revelation. The team writes that the tech industry is now grappling with the insatiable demand for computing power, spurred by the rapid advancement of artificial intelligence (AI).
The need for power is running up against the physical limits of current technology. Historically, gains in computing efficiency have kept pace with the growing demand for computation, thanks in large part to the miniaturization of CMOS transistors, as predicted by Moore’s Law. However, the paper points out that we are edging closer to the physical limits of this technology, with transistors nearing atomic sizes and thermal noise challenging digital operation, meaning a bottleneck looms over the future of computing. The team calls this “Moore’s Wall.”
This shadow looms just as the energy requirements of AI are surging, leading to proposals for extreme solutions, such as nuclear-powered data centers. Extropic reports that the quest for sustainability in scaling computational power and AI requires an unprecedented infrastructural and engineering overhaul.
Extropic’s paper offers a glimpse into their alternative, inspired by the efficiency of biological systems. In nature, computing is neither rigid nor exclusively digital; rather, it thrives on the intrinsic randomness and discrete interactions within cellular chemical reaction networks, the paper states. This biological efficiency suggests a potential path beyond the limitations of traditional digital logic.
Central to Extropic’s approach are Energy-Based Models (EBMs), which sit at the intersection of thermodynamics and probabilistic machine learning. These models, particularly exponential families, present a method for efficiently parameterizing probability distributions with minimal data. This is crucial for modeling rare but impactful events, enabling a high degree of entropy and accurate reflection of target distributions.
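The paper’s framing of EBMs as exponential families can be stated compactly. In the standard textbook formulation (this notation is not taken from the lite paper), a parameterized energy function defines a Boltzmann-style distribution, and when the energy is linear in the parameters the model is exactly an exponential family, which is also the maximum-entropy distribution consistent with its expected statistics:

```latex
% An energy function E_theta assigns each state x a scalar energy;
% lower energy means higher probability under the Boltzmann form.
p_\theta(x) = \frac{e^{-E_\theta(x)}}{Z(\theta)},
\qquad Z(\theta) = \int e^{-E_\theta(x)}\, dx .

% If the energy is linear in the parameters, E_theta(x) = -theta^T T(x),
% this is an exponential family with sufficient statistics T(x):
p_\theta(x) = \frac{e^{\theta^\top T(x)}}{Z(\theta)} .
```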
Extropic’s lite paper also delves into the technical prowess of its superconducting chips, designed to operate at low temperatures and harness the Josephson effect for accessing non-Gaussian probability distributions. This advance not only emphasizes the energy efficiency of these chips but also highlights Extropic’s commitment to extending its technological reach through semiconductor devices suited for room temperature operation.
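The link between the Josephson effect and non-Gaussian statistics follows from basic circuit physics (these are standard results, not equations from the lite paper): a linear element has a quadratic energy, so its thermal fluctuations are Gaussian, while a Josephson junction contributes a cosine potential whose thermal distribution is not:

```latex
% Thermal state of a circuit variable phi at temperature T: p ~ exp(-U(phi)/k_B T).
% Linear (harmonic) element: quadratic energy, hence Gaussian fluctuations.
U_{\text{linear}}(\varphi) = \tfrac{1}{2} E_L \varphi^2
\;\Rightarrow\; p(\varphi) \propto e^{-E_L \varphi^2 / 2 k_B T}
\quad (\text{Gaussian}).

% Josephson junction: cosine energy, hence non-Gaussian (von Mises) statistics.
U_J(\varphi) = -E_J \cos\varphi
\;\Rightarrow\; p(\varphi) \propto e^{E_J \cos\varphi / k_B T}
\quad (\text{non-Gaussian}).
```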
According to the paper: “Extropic’s superconducting chips are entirely passive, meaning we only expend energy when measuring or manipulating its state. This likely makes these neurons the most energy-efficient in the universe. These systems will be highly energy efficient at scale: Extropic targets low-volume, high-value customers like governments, banks, and private clouds with these systems.”
Beyond hardware, Extropic is crafting a software layer that bridges abstract EBM specifications with practical hardware controls. This effort aims to transcend the memory limitations inherent in deep learning, heralding a new age of AI acceleration that could redefine the boundaries of artificial intelligence.
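Extropic has not published this software layer, so any code can only be a guess at its shape. The sketch below is purely hypothetical: it imagines an abstract EBM specification (here, an Ising-style model with couplings and biases) being “compiled” into per-device control settings; every class, method, and parameter name is ours, not Extropic’s.

```python
from dataclasses import dataclass, field

@dataclass
class EBMSpec:
    """Hypothetical abstract spec for an Ising-style energy:
    E(s) = -sum_ij J_ij s_i s_j - sum_i h_i s_i."""
    couplings: dict = field(default_factory=dict)  # (i, j) -> J_ij
    biases: dict = field(default_factory=dict)     # i -> h_i

def compile_to_device(spec: EBMSpec, coupling_scale: float = 1.0) -> dict:
    """Imagined lowering pass: map abstract EBM parameters onto
    per-circuit control settings (names and units are invented)."""
    controls = {"junction_bias": {}, "coupler_strength": {}}
    for node, h in spec.biases.items():
        controls["junction_bias"][node] = coupling_scale * h
    for (i, j), J in spec.couplings.items():
        controls["coupler_strength"][(i, j)] = coupling_scale * J
    return controls

# Toy two-node model favoring aligned states.
spec = EBMSpec(couplings={(0, 1): 1.0}, biases={0: 0.5, 1: -0.25})
print(compile_to_device(spec))
```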
Practically, the paper states the technology is poised to offer the following benefits:
- Extends hardware scaling well beyond the constraints of digital computing
- Enables AI accelerators that are many orders of magnitude faster and more energy efficient than digital processors (CPUs/GPUs/TPUs/FPGAs)
- Unlocks powerful probabilistic AI algorithms that are not feasible on digital processors
Extropic was founded by Guillaume Verdon and Trevor McCourt. In December, Extropic announced it closed its series seed funding round, raising a total of $14.1 million. The round was led by Steve Jang of Kindred Ventures and saw participation from venture capital firms, including Buckley Ventures, HOF Capital, Julian Capital, Marque VC, OSS Capital, Valor Equity Partners and Weekend Fund, among others.