Quantum technologies are steadily becoming more present in the media, at tech conventions, and in the future plans of a growing number of businesses and organisations.
The increasing level of interest in the field has been brought about by what is often referred to as the second quantum revolution. At the heart of this revolution lie quantum-mechanical phenomena such as interference, superposition, and entanglement, which are the building blocks of a new paradigm for information processing. Technologies empowered by these effects promise to disrupt the way we compute, communicate, and measure. Nevertheless, numerous quantum-enabled technologies have existed for several decades, and are the basis of the devices on which we perform classical computing. In this article we outline the two key quantum revolutions, which we shall refer to as Quantum 1.0 and Quantum 2.0*.
The beginning of the 20th century witnessed the birth of quantum mechanics. For a decade prior, physicists knew that classical mechanics was incomplete. Experimental observations about the nature of radiation — such as how it is absorbed and emitted by material systems, e.g. atoms or metals — disagreed starkly with the predictions of the prevailing classical theories. In classical physics, light is assumed to be a wave. Giants such as Max Planck, Niels Bohr, and Albert Einstein overturned this notion, assuming that light instead comes in minuscule packets or quanta of energy, known as photons. In doing so, they were able to reconcile the inconsistencies that had plagued classical theories. An entirely new chapter in the history of human knowledge had been opened.
Key mathematical foundations were developed in the following years, and the new quantum theory was immediately applied to understanding the properties of matter. Puzzling open questions suddenly began to make sense. Why do the elements in the periodic table fall into the pattern they do? Why does each element only absorb and emit light of certain frequencies? What allows some materials to conduct electricity, but not others?
Following World War II, an increasingly important scientific and economic objective became the development of computers. It would not be long before the newly developed quantum theory would dramatically disrupt this endeavour. The backbone of the computing or information revolution is the transistor. Transistors are tiny electronic switches that can be either on or off. When packed together by the billions, they form integrated circuits, which are the brain and muscles of today's computers.
Quantum mechanics gave scientists a means of understanding semiconductors, and ultimately led to the creation of the transistor. Put simply, no quantum, no internet. (And also no mobile phones, no videogames, no Instagram… you get the idea!). Thanks to Quantum 1.0, we were able to build devices whose functionalities extended beyond the capabilities of classical physics: lasers, digital cameras, modern medical instruments, and even nuclear power plants.
Arguably, however, the most mind-boggling effects of quantum mechanics — such as entanglement — played little to no part in the first revolution. What sparked Quantum 2.0 was the marriage between quantum physics and information theory. Information theory, fathered by Claude Shannon after the Second World War, is the mathematical theory that describes how information is processed, stored, and communicated. Quantum information theory explores the same concepts within the context of quantum systems. While the underlying goal of the theories is analogous, quantum information theory has a richer set of resources at its disposal.
In 1994, Peter Shor — then working at Bell Labs — discovered that the widely used public-key encryption schemes underpinning modern cryptography, such as RSA, can be broken efficiently by a machine capable of processing quantum information. This dramatic illustration of the power of quantum computers elevated quantum information theory to new heights. Quantum information theory had found a potent real-world application, albeit one that could bring havoc to the world's digital infrastructure!
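The essence of Shor's insight can be sketched classically. RSA's security rests on the difficulty of factoring a large number N; Shor showed that factoring reduces to finding the period of modular exponentiation, a task a quantum computer performs exponentially faster than any known classical method. The sketch below (a toy illustration, not Shor's quantum circuit) finds the period by brute force for a tiny N, then extracts the factors exactly as the classical post-processing step of Shor's algorithm does:

```python
from math import gcd

def period(a, N):
    # Brute-force the multiplicative order of a modulo N, i.e. the
    # smallest r with a^r mod N == 1. This is the step a quantum
    # computer speeds up exponentially via the quantum Fourier transform.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_period(N, a):
    # Classical post-processing from Shor's algorithm: an even period r
    # of a mod N yields factors of N via greatest common divisors.
    r = period(a, N)
    if r % 2 != 0:
        raise ValueError("odd period; retry with a different base a")
    y = pow(a, r // 2, N)
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_via_period(15, 7))  # → (3, 5): the prime factors of 15
```

For N = 15 and base a = 7, the powers of 7 modulo 15 cycle with period 4, and the two gcd computations recover the factors 3 and 5. Brute-forcing the period this way scales exponentially with the size of N, which is precisely the wall a quantum computer tears down.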
Since Shor’s discovery, the second quantum revolution has been steadily approaching. The door to Quantum 2.0 has been gradually prised open by improved precision in the control of isolated quantum systems such as atoms and photons. This is crucial, because quantum information is fragile. Nature has a strong preference for quantum systems to lose their quantum identities, a phenomenon known as decoherence. Novel quantum technologies allow us to fight back against decoherence, and use individual quantum systems for information processing purposes.
The three strands of Quantum 2.0
Quantum 2.0 is expected to be highly disruptive in three major domains: communication, measurement and sensing, and computation.
Communication is one of the defining aspects of modern life. Every minute, hundreds of terabytes of information are exchanged across the globe. Privacy, in this scenario, is a big issue: we would like our love messages and our bank account details to remain private. Quantum communication is best known as the science of sharing information in a completely secure way. In this sense, quantum technologies promise the creation of a quantum network where malicious users cannot access private information. In a recent pivotal experiment, Chinese and Austrian scientists used a satellite to exchange quantum-protected information between Beijing and Vienna, paving the way towards a large-scale quantum internet.
Precision measurements are crucial in every aspect of society, giving us the ability to effectively navigate, coordinate, and medicate, to point to just a few key applications. Quantum sensing and metrology exploit effects unique to quantum systems, such as entanglement, to realise measurements with greater precision than is possible with classical measuring devices. Applications include quantum magnetometers to detect tiny variations of magnetic fields (think of fancy compasses), atomic clocks (clocks that do not drift, and which tick many billions of times a second), and quantum-enhanced lidar (a widely used surveying and navigation method, with applications in fields as diverse as archaeology, agriculture, and autonomous vehicle guidance).
Last but not least — and arguably the sexiest application of the lot in the eyes of the popular press — is quantum computing. Tech giants such as Google, IBM, Intel, and Microsoft, as well as startups such as Rigetti and Xanadu, are building the world’s first commercial quantum computers. These machines, first theorised by Yuri Manin and Richard Feynman in the early 80s, promise to outperform the best classical supercomputers for many important tasks. While the near-term utility of prototypical quantum processors is unclear, in the long run, fully-fledged universal quantum computers will be used to solve currently intractable problems. These include simulation of quantum systems (e.g. for drug and material design), studying chemical reactions (e.g. better catalysts for fertilisers), and quantum machine learning and AI (e.g. improved pattern recognition).
We will dig deeper into each of these three key strands of quantum technologies over the coming weeks, in a fortnightly series of articles. The first will focus on quantum communication, and its interplay with cryptography. This will be followed by an overview of quantum computing, with a particular focus on the emerging industry landscape. Our introductory series will conclude with a look at quantum sensing and metrology, exploring how quantum systems can push the boundaries of measurements.
Quantum mechanics changed the way we think of the world. It also changed the way we interact with Nature, opening possibilities that are unthinkable in an entirely classical world. The next great challenge is to facilitate the adoption of quantum technologies by society at large. QWA is helping to bridge the gap between quantum and business. For suggestions, feedback, and questions, reach out to us at firstname.lastname@example.org.
* Terminology first suggested by the UK’s DSTL.