5 breakthroughs made possible by quantum technologies
- Research into quantum physics is leading to advances in many areas of research and development.
- For example, quantum computers represent a very promising avenue with many potential applications.
- However, technical and theoretical obstacles, such as maintaining fragile quantum phenomena like entanglement, still stand in the way of the commercialisation and practical use of these advances.
- In the medium term, quantum physics research could be used for astronomical imaging, health and semiconductors.
#1 Towards large-scale quantum computers thanks to improvements in quantum error correction
Quantum computers use quantum bits (qubits). Unlike standard computer bits, which are either 0 or 1, a qubit can exist in a superposition of 0 and 1. Machines computing with many qubits could be much faster than the fastest computers available today, because the size of the quantum state space grows exponentially with the number of qubits. Qubits can be made from different physical systems, such as superconducting circuits or trapped ions; other future approaches include photonic quantum processors that use light.
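A minimal sketch of why the state space grows exponentially: describing an n-qubit register classically requires 2^n complex amplitudes (the function name below is illustrative, not from the source).

```python
import numpy as np

# A single qubit state is a unit vector in C^2: a|0> + b|1>.
# An n-qubit register lives in a 2**n-dimensional space, which is why
# adding qubits grows the description size exponentially.

def n_amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

# An equal superposition of |0> and |1> for a single qubit:
plus = np.array([1.0, 1.0]) / np.sqrt(2)
assert np.isclose(np.linalg.norm(plus), 1.0)   # states are unit vectors

print(n_amplitudes(1), n_amplitudes(10), n_amplitudes(50))
# 50 qubits already require about 10**15 amplitudes, beyond classical memory.
```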
A true quantum computer will require the integration of many qubits into a single device. This will not be an easy task as they are very delicate and the quantum information they contain can easily be destroyed, leading to errors in quantum calculations.
To correct these errors, a quantum error correction (QEC) system will be essential. This generally involves encoding one bit of quantum information onto a set of physical qubits that act together as a single "logical qubit". One such technique is the surface code, in which a bit of quantum information is encoded onto a two-dimensional array of qubits. A problem with this approach, however, is that each extra qubit added to the system is itself a new source of error.
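The core idea of encoding one logical bit redundantly can be illustrated with the simplest classical analogue, a 3-bit repetition code. This is not the surface code itself (which is far more elaborate and also handles phase errors), just a toy sketch of the redundancy principle:

```python
# Toy 3-bit repetition code: one logical bit is encoded redundantly so
# that a single bit-flip error can be detected and corrected by
# majority vote. Real surface codes work on 2D qubit arrays and
# correct both bit-flip and phase-flip errors.

def encode(bit: int) -> list[int]:
    return [bit, bit, bit]           # logical 0 -> 000, logical 1 -> 111

def apply_bit_flip(codeword: list[int], position: int) -> list[int]:
    flipped = codeword.copy()
    flipped[position] ^= 1           # a single physical error
    return flipped

def decode(codeword: list[int]) -> int:
    return 1 if sum(codeword) >= 2 else 0   # majority vote

noisy = apply_bit_flip(encode(1), position=0)   # [0, 1, 1]
print(decode(noisy))   # prints 1: the single error is corrected
```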
According to physicists, true large-scale quantum computing will require an error rate of around one in a million, but the best current error correction technologies can only achieve rates of around one in a thousand. So, there is still a long way to go.
Researchers at Google Quantum AI have recently demonstrated a surface code scheme that scales towards the error rates required, on a quantum processor made up of superconducting qubits serving either as data qubits (which carry the computation) or as measurement qubits. The latter sit adjacent to the data qubits and can detect a bit flip or a phase flip, the two types of error that affect qubits.
The researchers found that a "distance-5 qubit array" comprising a total of 49 physical qubits had an error rate of 2.914%, compared with 3.028% for a "distance-3 array" comprising 17 qubits. This reduction shows that increasing the number of qubits is a viable route to "fault-tolerant quantum computing" and that an error rate of better than one in a million could be possible in a distance-17 qubit array comprising 577 qubits.
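The gap between these figures and the one-in-a-million target can be made concrete with the standard exponential-suppression model for surface codes, in which the logical error rate shrinks by a constant factor each time the code distance grows by 2. The required suppression factor below is an illustrative estimate from the article's numbers, not a value reported in the source:

```python
# Exponential-suppression model: eps(d) = eps(3) / Lambda**((d - 3) / 2),
# where Lambda is the error-suppression factor per distance step of 2.

eps_d3 = 0.03028   # distance-3 logical error rate from the experiment
eps_d5 = 0.02914   # distance-5 logical error rate

# Suppression factor actually achieved (barely above 1):
lam_measured = eps_d3 / eps_d5
print(f"measured Lambda ~ {lam_measured:.3f}")

# Suppression factor that WOULD be needed to reach one-in-a-million
# logical errors at distance 17 (seven distance steps from d = 3):
steps = (17 - 3) / 2
lam_required = (eps_d3 / 1e-6) ** (1 / steps)
print(f"required Lambda ~ {lam_required:.2f}")
```

The contrast between the two printed factors is the "long way to go" the article mentions: hardware error rates must improve enough that each distance step suppresses errors several-fold, not by a few percent.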
#2 A converter for different quantum devices
The platforms currently being developed for quantum computers are based on different quantum systems such as photons (particles of light), neutral atoms, ions, superconductors and semiconductors. In future quantum networks, these systems will need to communicate with each other, but as they rely on different types of coding, this could prove difficult.
Researchers at Laboratoire Kastler Brossel (LKB) in France have created a converter that enables quantum devices based on different systems to communicate. "We have designed a kind of black box that allows you to switch from one type of quantum information coding to another thanks to the phenomenon of entanglement," explains physicist Julien Laurat, one of the members of the LKB team. Entanglement, the subject of the 2022 Nobel Prize in Physics, is a purely quantum phenomenon whereby two or more particles can have a closer relationship than that permitted by classical physics. This means that if we determine the quantum state of one of the particles, we instantly determine the quantum state of the other, regardless of the distance separating them. Once considered a quirk of the quantum world, this "spooky action at a distance", as Albert Einstein called it, is now exploited in quantum cryptography and communication systems, as well as in the sensors used to detect gravitational waves (a deformation of the fabric of space-time that propagates at the speed of light).
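The perfect correlation described above can be sketched numerically. A minimal simulation, assuming the Bell state (|00> + |11>)/sqrt(2):

```python
import numpy as np

# Minimal sketch of entanglement correlations for the Bell state
# (|00> + |11>) / sqrt(2): once one qubit is measured, the other's
# outcome is fixed, however far apart the particles are.

rng = np.random.default_rng(0)

bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)   # amplitudes for |00> and |11>

probs = bell ** 2                    # Born rule: measurement probabilities
outcomes = rng.choice(4, size=1000, p=probs)

# Decode two-qubit outcomes: index 0 is (0,0), index 3 is (1,1).
first, second = outcomes // 2, outcomes % 2
print(np.all(first == second))       # prints True: perfectly correlated
```

Note that the correlation is visible only when the two measurement records are compared; each record on its own is a fair coin flip, which is why entanglement cannot be used to send signals faster than light.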
Thanks to entanglement, the LKB researchers have been able to preserve the fragile quantum information signal while changing the encoding basis in which it is written.
“The success of this process is an important milestone for quantum technology infrastructures,” points out Beate Asenbeck, a doctoral student at the LKB. “Once we can interconnect quantum devices, more complex and efficient networks can be built.”
The researchers have filed a patent to protect their technology, which is now being used by Welinq, a start-up founded by Julien Laurat and his colleague Tom Darras.
#3 Quantum error correction could improve astronomical imaging
High-resolution, long-baseline optical interferometers could revolutionise astronomical imaging: here, light from two or more telescopes, placed at a certain distance from each other, is combined to create an image of a celestial object, such as a star. The images obtained in this way have much finer detail than those obtained with each individual telescope. In this way, the multiple telescopes act as a gigantic “virtual” telescope whose diameter is much larger than that of any real telescope.
In theory, the further apart the telescopes are, the higher the image resolution. In practice, however, environmental noise and light losses between the two instruments degrade the quality of the light signals, limiting the possible distance between them.
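The resolution gain from a long baseline follows from the diffraction limit, which scales as wavelength divided by aperture (or baseline). A quick illustration with made-up but plausible numbers, not values from the source:

```python
# Angular resolution scales as wavelength / baseline for an
# interferometer, versus wavelength / aperture for a single telescope.
# All numbers below are illustrative.

wavelength = 550e-9    # visible light, metres
aperture = 8.0         # a single large telescope mirror, metres
baseline = 300.0       # separation between the two telescopes, metres

res_single = wavelength / aperture    # diffraction limit, radians
res_interf = wavelength / baseline    # "virtual telescope" limit, radians
print(f"resolution gain: {res_single / res_interf:.1f}x")
# prints "resolution gain: 37.5x"
```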
Quantum technologies can help circumvent these “light transmission losses” by using quantum memories and entanglement to replace direct optical links between the telescopes, thereby increasing the distances between them. In the most direct approach, the signal could be stored in atomic states or qubits. However, one problem remains: these states are fragile and can be easily destroyed.
Researchers at Macquarie University in Australia and the National University of Singapore (NUS) have now found a way to protect the quantum information contained in the light coming from a celestial object.
In their new technique, the researchers manipulate the state of a star’s light from the two telescopes so that it is in a form that is protected from environmental noise. By then carrying out specific measurements, any errors in the qubits can be detected and corrected by the QEC codes before recovering the information contained in the starlight. This information is then used to construct an image of the star.
#4 Using quantum vacuum fluctuations to fabricate a high-speed random number generator
Modern cryptography relies on the generation of random numbers that are then used as keys to encrypt the huge amounts of data produced by governments and large corporations, for example. Although algorithms are commonly used to generate seemingly random numbers, a hacker could in principle work out the predetermined steps of an algorithm and thus predict its output.
An improved system would rather be based on a truly random process, like the probabilistic nature of phenomena that occur at the quantum level.
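The weakness of algorithmic generators can be demonstrated directly: a pseudorandom stream is entirely determined by its seed. In the sketch below, `os.urandom` stands in for a physical entropy source; it draws on the operating system's entropy pool rather than quantum vacuum fluctuations:

```python
import os
import random

# A seeded pseudorandom generator is fully predictable: anyone who
# recovers the seed can reproduce the entire "random" key stream.

victim = random.Random(1234)
attacker = random.Random(1234)         # attacker has guessed the seed

key_victim = [victim.getrandbits(8) for _ in range(16)]
key_attacker = [attacker.getrandbits(8) for _ in range(16)]
print(key_victim == key_attacker)      # prints True: the key is reproduced

# By contrast, a physical entropy source is not reproducible from any
# seed. os.urandom is used here only as a stand-in for such a source.
print(os.urandom(16).hex())
```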
The vacuum of space is not really a vacuum but teems with random quantum fluctuations: pairs of particles and antiparticles are spontaneously created and then annihilate when they collide. These processes occur on extremely short time scales and can be used to produce random numbers. The problem is that such systems are subject to parasitic noise from their own components, which slows down the process.
To solve this problem, researchers at Ghent University in Belgium built a computer chip (measuring just 5 mm in length) and then mapped all the imperfections in the chip as well as the sources of noise within it. This enabled them to identify the origin of the interference and measure the quantum fluctuations with much greater sensitivity. The result: a chip capable of generating random numbers 200 times faster than existing commercial devices.
#5 Quantum advantage without error correction
IBM researchers have shown that it is possible to achieve quantum advantage (or "supremacy") without error correction. To do this, they used a 127-qubit quantum processor to calculate the magnetisation of a material using a 2D Ising model. This model represents the magnetic properties of a 2D material as a network of quantum spins that interact with their nearest neighbours. Though apparently simple, this model is notoriously difficult to solve.
The researchers used a technique called "noisy intermediate-scale quantum (NISQ) computation", in which the calculation is performed rapidly to avoid the accumulation of errors. This type of calculation should enable more useful quantum algorithms in the short term, before truly fault-tolerant quantum computers become available.
The calculation was carried out using a superconducting quantum chip comprising 127 qubits executing quantum circuits 60 layers deep with a total of around 2800 two-qubit gates. These gates are the quantum analogues of conventional logic gates.
The quantum circuit generates highly entangled quantum states that the researchers then used to program the 2D Ising model by performing a sequence of operations on the qubits and pairs of qubits. Although this method eliminates much of the noise, errors remained. The researchers therefore applied a quantum error mitigation process using conventional computer software. The technique works thanks to the 127-qubit processor’s ability to encode many configurations of the Ising model. Conventional computers would not have sufficient memory to achieve such a feat.
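To make the memory argument concrete, here is a minimal sketch of the model on a toy 2x2 lattice, solved by brute-force diagonalisation. This is feasible only because 4 spins give a 16-dimensional space; the same exponential blow-up is what exhausts classical memory at 127 qubits. The couplings `J` and `h` are illustrative parameters, not values from the IBM experiment:

```python
import numpy as np

# Transverse-field Ising model on a 2x2 lattice of quantum spins.
I2 = np.eye(2)
Z = np.diag([1.0, -1.0])                 # Pauli Z (spin up/down)
X = np.array([[0.0, 1.0], [1.0, 0.0]])   # Pauli X (transverse field)

def op(single, site, n=4):
    """Embed a single-site operator at `site` in an n-spin system."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, single if i == site else I2)
    return out

# Nearest-neighbour bonds of a 2x2 grid (sites laid out as 0 1 / 2 3):
bonds = [(0, 1), (2, 3), (0, 2), (1, 3)]
J, h = 1.0, 0.5   # illustrative ferromagnetic coupling and field

H = -J * sum(op(Z, i) @ op(Z, j) for i, j in bonds)
H = H - h * sum(op(X, i) for i in range(4))

energies, states = np.linalg.eigh(H)
ground = states[:, 0]                    # lowest-energy eigenstate
correlation = ground @ op(Z, 0) @ op(Z, 1) @ ground
print(f"ground energy {energies[0]:.3f}, <Z0 Z1> = {correlation:.3f}")
```

Each added spin doubles the size of the matrix `H`, so this brute-force approach fails long before 127 spins, which is exactly the regime where the quantum processor has the edge.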