Researchers: With Quantum Power Comes Quantum Responsibility
Insider Brief
- A new article in Nature warns that the rapid development of quantum technologies, especially in defense, demands robust ethical governance to prevent risks like privacy violations and cyberattacks.
- The researchers emphasize that ongoing defense projects, such as those involving quantum sensors and quantum encryption, carry ethical concerns, including the breaking of current encryption standards and mass surveillance.
- The article proposes six guiding principles, including risk categorization, prevention of malicious use, and fostering international cooperation, to ensure responsible development of quantum technologies.
Quantum technologies are emerging rapidly, offering new capabilities in health, communications, transportation and finance, among other fields. However, a team of scientists writing in Nature warns that the absence of robust ethical frameworks for these technologies could lead to significant risks, including privacy violations, cyberattacks and the development of new forms of chemical or biological warfare.
The article makes the case for ethical governance of quantum technology, especially as countries race to integrate quantum advancements into national defense.
The potential of quantum technology is evident in ongoing defense projects across the globe. For example, the U.S. Department of Defense has allocated $45 million toward integrating quantum components into weapons systems to improve targeting precision, the researchers write. Similarly, the UK Ministry of Defence is investing in quantum sensors for secure navigation, and India is pursuing quantum encryption to protect sensitive military communications. According to the article, China is also advancing its quantum defense capabilities, including the development of a quantum radar system designed to detect objects that evade conventional radar.
While quantum technologies promise transformative benefits, they also present unique ethical risks. One primary concern is that quantum computers could break current encryption standards, potentially exposing sensitive government data and undermining national security. Furthermore, the researchers write that quantum sensors could enhance mass surveillance efforts, infringing on privacy and civil liberties. The article points out that quantum algorithms may be so complex that they become difficult to audit or reverse-engineer, leading to what the authors term a “responsibility gap,” where accountability for harmful outcomes becomes unclear.
Quantum technology shares some ethical risks with artificial intelligence (AI), another fast-evolving field in defense. Both technologies have the potential to disrupt industries and society, but the article stresses that quantum technology’s risks are distinct and require tailored governance frameworks. These risks include the creation of quantum-powered AI systems that could increase bias or lack transparency.
According to the article, ignoring these risks until quantum systems are fully operational would result in higher costs and more severe consequences later on.
Risks of Emerging Technologies
One challenge for defense organizations is that quantum technologies are at varying stages of maturity. While quantum sensors are already being deployed, quantum computers are still largely in the experimental phase. The authors note that technologies like IBM's Goldeneye cryostat, a large cooling system required for quantum computers, are still too bulky and complex for widespread defense use.
Despite these uncertainties, the article argues that the ethical implications of quantum technologies should be considered early on. It rejects the “neutrality thesis,” which holds that technology is ethically neutral until applied. The authors instead assert that the design and development phases are inherently ethical decisions.
The article highlights several concrete risks that defense organizations must address. Quantum sensors, for instance, could undermine the secrecy of submarines, weakening nuclear deterrence strategies. Additionally, quantum computers could be used to design new chemical or biological weapons, raising questions about just war principles and international regulations.
Proactive, Anticipatory Ethical Governance
To address these concerns, the researchers call for a proactive, anticipatory ethical governance approach that integrates ethical considerations into the design and development stages of quantum technologies. They suggest a list of six guiding principles for responsible innovation in defense, emphasizing that governance must evolve alongside the technology.
The six principles are:
- Categorize Risks: Defense organizations should develop models to categorize risks posed by quantum technologies.
- Counter Malicious Uses: Efforts must be made to prevent authoritarian regimes and malicious actors from misusing quantum technologies.
- Ensure Justified Securitization: As quantum technologies become national security priorities, defense organizations must ensure that their securitization is balanced with global benefits.
- Foster Multilateral Collaboration and Oversight: International cooperation is critical in establishing regulatory frameworks for quantum technologies.
- Prioritize Information Security: Defense organizations should focus on reducing the risks of information leaks surrounding sensitive quantum technologies.
- Promote Dual-Use Development: The defense sector should support civilian applications of quantum technology to address global challenges in areas like healthcare, agriculture, and climate change.
The team explains how these principles could address these concerns, adding protections without stifling innovation that could produce societal benefits.
First, to mitigate risks, the researchers propose that defense organizations develop models for categorizing risks into “known knowns,” “known unknowns,” and “unknown unknowns.” This framework would allow organizations to prioritize which risks to address first and ensure that the ethical governance of quantum technology is proportionate to its stage of maturity.
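To make the framework concrete, the following is a minimal, hypothetical sketch (not taken from the Nature article) of how a defense organization might encode such a risk register in code. The category labels follow the authors' taxonomy; the example risks, maturity labels and severity scores are illustrative assumptions only.

```python
from dataclasses import dataclass
from enum import Enum


class RiskCategory(Enum):
    # Taxonomy named in the article; descriptions are interpretive
    KNOWN_KNOWN = "known known"          # identified risk, understood impact
    KNOWN_UNKNOWN = "known unknown"      # identified risk, uncertain impact
    UNKNOWN_UNKNOWN = "unknown unknown"  # placeholder for unanticipated risks


@dataclass
class QuantumRisk:
    name: str
    category: RiskCategory
    maturity: str   # e.g. "deployed", "prototype", "experimental" (assumed labels)
    severity: int   # hypothetical 1-5 scale


# Hypothetical entries, for illustration only
register = [
    QuantumRisk("Quantum sensors enabling mass surveillance",
                RiskCategory.KNOWN_KNOWN, "deployed", 4),
    QuantumRisk("Quantum computers breaking current encryption",
                RiskCategory.KNOWN_UNKNOWN, "experimental", 5),
]

# Prioritize nearer-term (more mature) and more severe risks first
maturity_order = {"deployed": 0, "prototype": 1, "experimental": 2}
for risk in sorted(register, key=lambda r: (maturity_order[r.maturity], -r.severity)):
    print(f"{risk.category.value}: {risk.name} (severity {risk.severity})")
```

Sorting by maturity and severity mirrors the article's point that governance effort should be proportionate to how mature, and how consequential, each technology is.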
Collaboration will also be essential, the researchers argue. Quantum technologies will require expertise from multiple disciplines, including physics, ethics, international law, and risk assessment. The article emphasizes the need for multilateral cooperation, suggesting that an independent oversight body, similar to the International Atomic Energy Agency, could govern the use of quantum technologies in defense.
Another major risk identified in the article is the potential for authoritarian states to misuse quantum technologies. Quantum computers, for instance, could be used to break encryption standards and increase state surveillance. The researchers stress the need for export controls and regulations to prevent the misuse of quantum technologies by repressive regimes.
As quantum technologies are increasingly “securitized” and prioritized as national defense tools, the article warns that countries should avoid the mistakes made during the AI arms race. Protectionist policies that limit collaboration could hinder the development of quantum technology. Instead, the article advocates for shared standards and interoperability between countries, especially among allies.
The researchers also call for quantum technologies to be developed for societal benefit. They propose that defense organizations should support civilian applications of quantum technology, including advances in healthcare, agriculture and climate change mitigation. Drawing lessons from AI, the researchers suggest that quantum technology could follow a “fusion” strategy, where civilian and defense sectors collaborate to pool resources and accelerate innovation.
The article was written by a team of leading experts in the fields of quantum ethics and defense technologies: Mariarosaria Taddeo, a professor of digital ethics and defense technologies at the Oxford Internet Institute, University of Oxford, UK, who is also a DSTL ethics fellow at the Alan Turing Institute in London and serves on the AI Ethics Advisory Panel for the UK Ministry of Defence; Kate Pundyk, a research assistant at the Oxford Internet Institute; and Alexander Blanchard, a senior researcher at the Stockholm International Peace Research Institute, Sweden.