Guest Post: Photonics West 2025 — Enabling Researchers in Quantum Tech

By Jason Ball, Ph.D., Engineer, Liquid Instruments
At this year’s SPIE Photonics West Exhibition, one of the most prominent topics was quantum technology. Conference discussions showed how this emerging field continues to deliver promising results on the road toward commercialization, but the reality is that quantum research is often difficult, demanding more precise, scalable, and flexible tools and techniques to overcome the innate challenges of working at the boundaries of physics.
Quantum means more than just computing

Quantum computing, particularly error-corrected quantum computing, has made significant progress over the last year. While these results often steal the headlines, quantum information science encompasses many other technologies that are generating exciting results as well.
The research presented at Photonics West included advances in quantum sensing and metrology, such as magnetometers built on solid-state spins and squeezed light for higher-precision imaging. The field of quantum communications continues to make strides in the effective generation of entangled photon pairs for secure data transmission, along with the ongoing development of quantum repeaters and emitters for large-scale networks. Underpinning all of this progress, flexible, cost-effective benchtop equipment becomes ever more important, enabling researchers worldwide to build on their results with agility and speed.
Scaling from lab to market
Many of these technologies are approaching a market-ready state, primed to be commercialized if they haven’t been already. For optical quantum information science, implementing them requires many specialized components. The most obvious are lasers and associated components, including frequency combs, but the list also extends to cold-atom traps, vacuum chambers, single-photon detectors, and electronics.
Electronics in particular present a challenge in balancing scalability, performance, and cost. A given quantum optics experimental setup often consists of a wide variety of hardware, all of which must be manually integrated and synchronized. Condensing these instruments into as few devices as possible improves scalability, reduces costs, and cuts down the time needed for prototyping and calibration, letting researchers spend less time on setup and more on the work that matters. To this end, field-programmable gate arrays, or FPGAs, have become a popular tool for realizing these types of experiments, thanks to their parallel processing capabilities and reconfigurability.
Machine learning and quantum technology
There are certainly open questions around the intersection of AI and quantum tech. Whether quantum algorithms can reduce the training time and energy of deep learning models is still up for debate, but the converse is already happening: machine learning algorithms are working their way into quantum technology. These algorithms can help researchers process and classify signals, identify correlations in large data sets, infer quantum state information, and compensate for systematic errors in equipment.
When deployed on an FPGA, small-scale neural networks can perform a variety of signal processing functions while using a fraction of the resources required by a CPU or GPU. One of the most straightforward examples is an autoencoder, which learns to remove noise from an input signal. The ability to develop neural networks on an FPGA helps researchers quickly test these types of applications and determine whether machine learning can augment and accelerate their research.
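To make the autoencoder idea concrete, the sketch below trains a small denoising autoencoder in PyTorch on synthetic noisy sinusoids. The network size, training data, and hyperparameters are illustrative assumptions rather than any particular vendor workflow; deploying a network like this on an FPGA would additionally involve quantizing it and compiling it for the device fabric.

```python
# Minimal denoising-autoencoder sketch (PyTorch). A small network learns to map
# noisy 1D signal segments back to their clean form; the sizes and training data
# here are illustrative assumptions, not a specific instrument's workflow.
import torch
import torch.nn as nn

N = 128  # samples per signal segment


class DenoisingAutoencoder(nn.Module):
    def __init__(self, n_in=N, n_hidden=16):
        super().__init__()
        # Encoder compresses the segment; decoder reconstructs the clean signal.
        self.encoder = nn.Sequential(nn.Linear(n_in, n_hidden), nn.ReLU())
        self.decoder = nn.Linear(n_hidden, n_in)

    def forward(self, x):
        return self.decoder(self.encoder(x))


def make_batch(batch_size=256, noise_std=0.3):
    # Synthetic training data: clean sinusoids with random phase plus Gaussian noise.
    t = torch.linspace(0, 1, N)
    phase = 2 * torch.pi * torch.rand(batch_size, 1)
    clean = torch.sin(2 * torch.pi * 5 * t + phase)
    noisy = clean + noise_std * torch.randn_like(clean)
    return noisy, clean


model = DenoisingAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train the network to reconstruct the clean waveform from its noisy version.
for step in range(2000):
    noisy, clean = make_batch()
    optimizer.zero_grad()
    loss = loss_fn(model(noisy), clean)
    loss.backward()
    optimizer.step()

# After training, model(noisy_segment) returns a denoised estimate of the signal.
```

In an FPGA context, the appeal of a network this small is that its handful of matrix multiplications can run in parallel at line rate alongside the rest of the signal chain, rather than being shipped off to a CPU or GPU for post-processing.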
The critical role of instrumentation in enabling tomorrow’s researchers
As the quantum sector grows increasingly commercial, the reliability and scalability of control electronics take on greater importance. The nature of quantum optics research means that scope and methods can change rapidly, highlighting the growing need for flexible, future-proof test and measurement solutions.
Modern instrumentation platforms are addressing these needs with reconfigurable hardware architectures built on FPGAs. These systems combine multiple test and measurement instruments into a single device, significantly reducing equipment footprint in the lab while increasing versatility. The most sophisticated solutions offer both ready-to-use instrument configurations and the ability to implement custom FPGA functions or even deploy neural networks directly on the hardware. This combination of performance, customization, and integration gives quantum researchers new ways to overcome fundamental physics challenges and practical engineering hurdles, ultimately accelerating the path to commercial viability for quantum technologies.
###
Jason Ball is an engineer at Liquid Instruments, where he focuses on applications in quantum physics, particularly quantum optics, sensing, and computing. He holds a Ph.D. in physics from the Okinawa Institute of Science and Technology and has a comprehensive background in both research and industry, with hands-on experience in quantum computing, spin resonance, microwave/RF experimental techniques, and low-temperature systems.