
5 breakthroughs made possible by quantum technologies

Key takeaways
  • Research into quantum physics is leading to advances in many areas of research and development.
  • For example, quantum computers represent a very promising avenue with many potential applications.
  • However, technical and theoretical obstacles, such as the fragility of quantum states, still stand in the way of the commercialisation and practical use of these advances.
  • In the medium term, quantum physics research could be used for astronomical imaging, health and semiconductors.

#1 Towards large-scale quantum computers thanks to improvements in quantum error correction

Quantum computers use quantum bits (qubits). Unlike standard computer bits, which are either 0 or 1, qubits can exist in a superposition of both 0 and 1. Machines that compute with many qubits could be much faster than the fastest computers available today, as their computing power increases exponentially with the number of qubits. Qubits can be made from different physical systems, such as superconductors or trapped ions; other future approaches include photonic quantum processors that use light.
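A qubit's state can be pictured as a pair of complex amplitudes. The minimal sketch below (illustrative, not from the article) shows a single qubit put into an equal superposition of 0 and 1 by a Hadamard gate:

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a normalised vector of two
# complex amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)   # |0>

# The Hadamard gate puts |0> into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measuring the qubit gives 0 or 1 with equal probability.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]
```

With n qubits the state vector has 2^n amplitudes, which is the source of the exponential growth mentioned above.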

A true quantum computer will require the integration of many qubits into a single device. This will not be an easy task, as qubits are very delicate and the quantum information they contain can easily be destroyed, leading to errors in quantum calculations.

To correct these errors, a quantum error correction (QEC) system will be essential. This generally involves encoding one bit of quantum information onto a set of physical qubits that act together as a single “logical qubit”. One such technique is the surface code, in which a bit of quantum information is encoded onto a two-dimensional array of qubits. A problem with this approach, however, is that adding extra qubits to the system also adds extra sources of error.
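The redundancy idea behind QEC can be illustrated with the simplest classical analogue, a three-bit repetition code with majority-vote decoding. This is only a toy model (real surface codes protect quantum amplitudes, not just classical bits), but it shows how spreading one logical bit over several physical bits suppresses errors:

```python
import random

# Toy illustration of the QEC principle: encode one logical bit
# redundantly across three physical bits and correct single bit-flips
# by majority vote.
def encode(bit):
    return [bit, bit, bit]

def apply_noise(bits, p):
    # Each physical bit flips independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
p = 0.05  # physical error rate
trials = 100_000
errors = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(errors / trials)  # logical error rate, roughly 3*p**2 ≈ 0.007
```

The logical error rate (~0.7%) is well below the physical rate (5%), because a logical error now requires at least two simultaneous flips.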

According to physicists, true large-scale quantum computing will require an error rate of around one in a million, but the best current error correction technologies only achieve rates of around one in a thousand. So there is still a long way to go.

Researchers at Google Quantum AI have recently created a surface code scheme designed to scale towards the required error rate in a quantum processor made up of superconducting qubits, which serve either as data qubits (used for computation) or as measurement qubits. The latter sit adjacent to the data qubits and can detect a bit flip or a phase flip, the two types of error that affect qubits.

The researchers found that a “distance-5” qubit array comprising a total of 49 physical qubits had an error rate of 2.914%, compared with 3.028% for a “distance-3” array comprising 17 qubits. This reduction shows that increasing the number of qubits is a viable route to “fault-tolerant quantum computing”, and that an error rate better than one in a million could be possible in a distance-17 array comprising 577 qubits.
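A back-of-envelope calculation shows what these two measured rates imply. In the textbook scaling, each increase of the code distance by 2 divides the logical error rate by a fixed suppression factor Λ, so the rate falls exponentially with distance; the numbers below use only the two rates quoted above, and the target of one in a million is the figure from the text:

```python
# Textbook surface-code scaling: every +2 in code distance divides the
# logical error rate by a suppression factor Lambda. The Lambda implied
# by the two measured rates is barely above 1, which is why reaching
# ~1e-6 will also require better physical qubits.
eps_d3 = 0.03028   # measured logical error rate, distance-3 (17 qubits)
eps_d5 = 0.02914   # measured logical error rate, distance-5 (49 qubits)

lam = eps_d3 / eps_d5            # suppression per distance step of 2
print(f"current Lambda ≈ {lam:.3f}")

# Lambda needed for a distance-17 code to reach a 1e-6 error rate,
# extrapolating from the distance-5 value over (17 - 5) / 2 = 6 steps:
target = 1e-6
lam_needed = (eps_d5 / target) ** (1 / 6)
print(f"Lambda needed ≈ {lam_needed:.2f}")
```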

#2 A converter for different quantum devices

The platforms currently being developed for quantum computers are based on different quantum systems such as photons (particles of light), neutral atoms, ions, superconductors and semiconductors. In future quantum networks, these systems will need to communicate with each other, but as they rely on different types of coding, this could prove difficult.

Researchers at Laboratoire Kastler Brossel (LKB) in France have created a converter that enables quantum devices based on different systems to communicate. “We have designed a kind of black box that allows you to switch from one type of quantum information coding to another thanks to the phenomenon of entanglement,” explains physicist Julien Laurat, one of the members of the LKB team. Entanglement, the subject of the 2022 Nobel Prize in Physics, is a purely quantum phenomenon whereby two or more particles can share a closer relationship than classical physics permits. This means that if we determine the quantum state of one of the particles, we instantly know the quantum state of the other, regardless of the distance separating them. Once considered a quirk of the quantum world, this “spooky action at a distance”, as Albert Einstein called it, is now exploited in quantum cryptography and communication systems, as well as in the sensors used to detect gravitational waves (deformations of the fabric of space-time that propagate at the speed of light).
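The perfect correlation that entanglement produces can be seen in a minimal simulation (illustrative, unrelated to the LKB device): preparing a two-qubit Bell state and sampling measurement outcomes always yields identical results for the two qubits.

```python
import numpy as np

# A two-qubit Bell state (|00> + |11>) / sqrt(2): only the outcomes
# 00 and 11 are possible, each with probability 1/2.
bell = np.zeros(4, dtype=complex)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)

probs = np.abs(bell) ** 2  # probabilities over outcomes 00, 01, 10, 11
rng = np.random.default_rng(0)
outcomes = rng.choice(4, size=10_000, p=probs)

# The two measured bits agree on every single shot.
agree = all((o >> 1) == (o & 1) for o in outcomes)
print(agree)  # True
```

Each individual outcome is random, but the outcomes of the two qubits are perfectly correlated, which is what the converter exploits to transfer information between encodings.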

Thanks to entanglement, the LKB researchers were able to preserve the fragile quantum information signal while changing the basis on which it is encoded.

“The success of this process is an important milestone for quantum technology infrastructures,” points out Beate Asenbeck, a doctoral student at the LKB. “Once we can interconnect quantum devices, more complex and efficient networks can be built.”

The researchers have filed a patent to protect their technology, which is now being used by Welinq, a start-up founded by Julien Laurat and his colleague Tom Darras.

#3 Quantum error correction could improve astronomical imaging

High-resolution, long-baseline optical interferometers could revolutionise astronomical imaging: light from two or more telescopes, placed at a certain distance from each other, is combined to create an image of a celestial object, such as a star. The images obtained in this way have much finer detail than those obtained with each individual telescope. In effect, the multiple telescopes act as a gigantic “virtual” telescope whose diameter is much larger than that of any real telescope.

In theory, the further apart the telescopes are, the higher the image resolution. In practice, however, environmental noise and light losses between the two instruments degrade the quality of the light signals, limiting the possible distance between them.
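The resolution gain is easy to quantify: a telescope's angular resolution scales as wavelength over aperture, so an interferometer's "aperture" is its baseline. The numbers below are illustrative, not from the article:

```python
import math

# Angular resolution scales as lambda / D for a single mirror of
# diameter D, and as lambda / B for an interferometer of baseline B.
wavelength = 550e-9          # visible light, metres
diameter = 8.0               # a single large mirror, metres (illustrative)
baseline = 300.0             # distance between two telescopes, metres

res_single = wavelength / diameter        # radians
res_interf = wavelength / baseline        # radians

mas = math.degrees(1) * 3600e3            # radians -> milliarcseconds
print(f"single telescope: {res_single * mas:.1f} mas")
print(f"interferometer:   {res_interf * mas:.2f} mas")
print(f"gain: {baseline / diameter:.1f}x finer detail")
```

This is why pushing the baseline further, which quantum links would allow, translates directly into finer images.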

Quantum technologies can help circumvent these “light transmission losses” by using quantum memories and entanglement to replace direct optical links between the telescopes, thereby increasing the distances between them. In the most direct approach, the signal could be stored in atomic states or qubits. However, one problem remains: these states are fragile and can easily be destroyed.

Researchers at Macquarie University in Australia and the National University of Singapore (NUS) have now found a way to protect the quantum information contained in the light coming from a celestial object.

In their new technique, the researchers manipulate the state of a star's light arriving at the two telescopes so that it takes a form protected from environmental noise. By then carrying out specific measurements, any errors in the qubits can be detected and corrected with QEC codes before the information contained in the starlight is recovered. This information is then used to construct an image of the star.

#4 Using quantum vacuum fluctuations to fabricate a high-speed random number generator

Modern cryptography relies on the generation of random numbers that are then used as keys to encrypt the huge amounts of data produced by governments and large corporations, for example. Although algorithms are commonly used to generate seemingly random numbers, a hacker could in principle work out the predetermined steps of an algorithm and thus predict its output.

An improved system would instead be based on a truly random process, such as the probabilistic nature of phenomena that occur at the quantum level.

The vacuum of space is not really a vacuum but is teeming with random quantum fluctuations, as pairs of particles and antiparticles are spontaneously created and then annihilated when they collide with each other. These processes occur on extremely short time scales and can be used to produce random numbers. The problem is that these systems are subject to parasitic noise from their own components, which slows down the process.
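A toy model shows the basic idea (this is not the Ghent design): measuring a quadrature of the vacuum yields Gaussian-distributed values, and the sign of each sample gives one random bit. A simulated Gaussian source stands in for the quantum noise here:

```python
import numpy as np

# Toy model of a vacuum-fluctuation random number generator: the
# quadrature values of the vacuum follow a Gaussian distribution, and
# the sign of each sample yields one random bit. Real devices must
# also characterise and subtract the classical noise of their own
# components, as described in the text.
rng = np.random.default_rng(42)        # stand-in for the quantum noise
samples = rng.normal(0.0, 1.0, 100_000)
bits = (samples > 0).astype(np.uint8)

# A healthy generator produces roughly half ones and half zeros.
print(bits.mean())
```

In a real device the measured signal is quantum noise plus classical component noise; mapping out the latter, as the Ghent team did, is what lets the quantum part be extracted quickly.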

To solve this problem, researchers at Ghent University in Belgium built a computer chip (measuring just 5 mm in length) and then mapped all the imperfections in the chip as well as the sources of noise within it. This enabled them to identify the origin of the interference and measure the quantum fluctuations with much greater sensitivity. The result: a chip capable of generating random numbers 200 times faster than existing commercial devices.

#5 Quantum advantage without error correction

IBM researchers have shown that it is possible to achieve quantum advantage (or “supremacy”) without error correction. To do this, they used a 127-qubit quantum processor to calculate the magnetisation of a material using a 2D Ising model. This model represents the magnetic properties of a 2D material as a network of quantum spins that interact with their nearest neighbours. Though apparently simple, this model is known to be extremely difficult to solve.

The researchers used a technique called “noisy intermediate-scale quantum (NISQ) computation”, in which the calculation is performed rapidly to avoid the accumulation of errors¹. This type of calculation should enable more general quantum algorithms in the short term, before truly fault-tolerant quantum computers become available.

The calculation was carried out using a superconducting quantum chip comprising 127 qubits executing quantum circuits 60 layers deep, with a total of around 2,800 two-qubit gates. These gates are the quantum analogues of conventional logic gates.

The quantum circuit generates highly entangled quantum states, which the researchers then used to program the 2D Ising model by performing a sequence of operations on the qubits and pairs of qubits. Although this method eliminates much of the noise, errors remained. The researchers therefore applied a quantum error mitigation process using conventional computer software. The technique works thanks to the 127-qubit processor's ability to encode many configurations of the Ising model; conventional computers would not have sufficient memory to achieve such a feat.
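The scale problem can be made concrete with a classical sketch of the Ising model (IBM's experiment used a quantum, transverse-field version; the coupling and temperature values below are illustrative). Brute-force enumeration works for a tiny grid but requires visiting every one of the 2^N spin configurations:

```python
import itertools
import math

# Classical 2D Ising model on a tiny 3x3 periodic grid: spins +1/-1
# interacting with nearest neighbours. Exact enumeration over all
# 2^9 = 512 configurations computes the mean magnetisation; for the
# 127 spins on IBM's chip, 2^127 configurations are far beyond any
# classical computer's memory.
L = 3
J, beta = 1.0, 0.5  # coupling and inverse temperature (illustrative)

def energy(spins):
    e = 0.0
    for i in range(L):
        for j in range(L):
            s = spins[i * L + j]
            e -= J * s * spins[((i + 1) % L) * L + j]   # down neighbour
            e -= J * s * spins[i * L + (j + 1) % L]     # right neighbour
    return e

Z = 0.0      # partition function
m_avg = 0.0  # thermal average of |magnetisation|
for config in itertools.product((-1, 1), repeat=L * L):
    w = math.exp(-beta * energy(config))
    Z += w
    m_avg += w * abs(sum(config)) / (L * L)
m_avg /= Z
print(f"<|m|> = {m_avg:.3f} over {2 ** (L * L)} configurations")
```

Error mitigation, unlike error correction, does not fix errors on the chip; it runs the noisy circuit and uses classical post-processing to estimate what the noiseless result would have been.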

Isabelle Dumé

References:

Suppressing quantum errors by scaling a surface code logical qubit. Nature 614, 676–681

A quantum-bit encoding converter. Nature Photonics 17, 165–170

Imaging Stars with Quantum Error Correction. Phys. Rev. Lett. 129, 210502

100-Gbit/s Integrated Quantum Random Number Generator Based on Vacuum Fluctuations. PRX Quantum 4, 010330

Evidence for the utility of quantum computing before fault tolerance. Nature 618, 500–505
