Science and technology

Quantum computing and AI: less compatible than expected?

Filippo Vicentini
Assistant Professor of AI and Quantum Physics at École Polytechnique (IP Paris)
Key takeaways
  • There is a belief that quantum computing could revolutionise artificial intelligence and in particular deep learning.
  • However, quantum computing will not necessarily advance AI, because it struggles to process the large volumes of data that neural networks require.
  • In particular, quantum computers are very slow at reading and writing data, and only very short calculations can be carried out before errors accumulate.
  • Machine learning, on the other hand, is becoming an essential tool for learning how to design and operate today's quantum computers.

With a number of tech firms promising to solve some small real-world problems within the next few years, it would seem that the world is on the cusp of a quantum computing breakthrough. There has accordingly been much hope that access to such quantum computing would transform artificial intelligence too. But a growing consensus suggests this may not yet be within reach.

What can be said about the origins behind the belief that quantum computing could revolutionise AI?

Filippo Vicentini. AI is a very wide-ranging term. So, I’ll focus on “deep learning”, which is behind the new technologies like text, audio and video generative models that we are seeing explode today. The idea that quantum computing could boost AI development became more prominent around 2018–19. Companies were coming out with early quantum computers with 1, 2, 3, or 4 noisy qubits. Because of their limitations, these machines could not be used to do larger real-world calculations, which is where we expect quantum computing to really shine. Instead, they were tasked with doing many short “quantum” subroutines (commonly known as quantum circuits), feeding back into a classical optimisation algorithm. This approach is strikingly similar to how neural networks are trained in deep learning.
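That hybrid loop can be sketched in a few lines. Below is a deliberately minimal toy: a one-parameter "circuit" is simulated exactly on a classical machine (the expectation value of Z after an RY(θ) rotation is cos θ), and a classical gradient-descent loop tunes the parameter, just as a deep-learning optimiser tunes network weights. All names are invented for illustration; a real device would estimate expectations from many noisy measurement shots rather than an exact formula.

```python
import math

def circuit_expectation(theta):
    # Exact <Z> for the state RY(theta)|0>; a real quantum computer
    # would estimate this value from repeated shot measurements.
    return math.cos(theta)

def parameter_shift_gradient(theta):
    # Parameter-shift rule: the exact gradient obtained from
    # two extra circuit evaluations.
    return 0.5 * (circuit_expectation(theta + math.pi / 2)
                  - circuit_expectation(theta - math.pi / 2))

theta, lr = 0.1, 0.4
for _ in range(200):                        # classical optimisation loop
    theta -= lr * parameter_shift_gradient(theta)

print(round(circuit_expectation(theta), 3))  # → -1.0 (the minimum of <Z>)
```

The classical computer proposes parameters, the (here, simulated) quantum circuit returns a cost value, and the loop repeats — the same outer structure as training a neural network.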

The hope, back around that time, was that a reasonably sized “quantum circuit” would be more expressive — meaning it could represent more complex solutions to a problem with fewer resources — than a neural network, thanks to quantum phenomena like interference and superposition. In short, quantum circuits might enable algorithms that learn to find correlations within data more effectively. Hence, the field of quantum machine learning was born, and several researchers started to try to bring ideas from one side to the other. There was a lot of excitement at the time.

Several companies have touted the advent of more powerful quantum computers in the next few years. Does that mean we should expect a leap forward in AI as well?

The brief answer would be no, I don’t think quantum computing will push AI forward. It’s becoming increasingly clear that quantum computers will be very useful for applications that require limited input and output but a huge amount of processing power — for instance, solving complex physics problems related to superconductivity, or simulating chemical molecules. However, for anything related to big data and neural networks, the consensus is growing that it may ultimately not be worth the effort.

This position was recently laid out in a paper1 by the Swiss National Supercomputing Centre’s Torsten Hoefler, Amazon’s Thomas Häner, and Microsoft’s Matthias Troyer. I just finished reviewing submissions for the QTML24 (quantum techniques in machine learning) conference, and the tone within the quantum machine learning community was trending downward.

Why is that?

More and more experts are recognising that quantum computers will likely remain very slow when it comes to input and output of data. To give you an idea, we expect that a quantum computer that could exist maybe five years from now — if we are being optimistic — will have the same read and write speed as an average computer from 1999 or 2000.

If we try to run quantum computers faster to increase the amount of data we can inject, we will start to introduce more errors into the calculation, and the result will deteriorate. There seems to be a speed limit for the operation of these machines above which the noise and errors are too strong to be corrected, even when we look about 20 years into the future.

Both classical and quantum computers are noisy. For example, a bit or a qubit can, at some point, randomly flip its value. While we can correct for this effectively in classical computers, we don’t yet have that technology in quantum computers. We estimate it will take at least another 15 years to develop fully fault-tolerant quantum computers. That means we can only do very ‘short’ calculations.

Moreover, the output of a quantum computer is probabilistic, which creates additional challenges. Classical computers give you a deterministic result – run the same simulation twice and you’ll get the same answer. But every time you run a quantum algorithm the output will be different. The result must be extracted from the distribution of the outputs (how many times you see 0s and 1s). To reconstruct the distribution accurately you must repeat the calculation many, many times, which increases the overhead. This is another reason why some algorithms seemed very powerful a few years ago but were eventually shown not to yield a systematic advantage over the classical ones we can already run on normal computers.

Does that mean that AI and quantum computing will be distant cousins, with little overlap?

Not at all. In fact, my colleagues and I have recently launched a petition2 to ask for funding at the European Union level for machine learning and quantum sciences. Machine learning is quickly becoming an essential tool for learning how to design and operate quantum computers. For example, every device is slightly different. Reinforcement learning techniques can analyse your machine and its particular patterns to help fit algorithms specifically to that device. There’s one company called Q-CTRL3, for instance, that has been doing pioneering work in that field. Google’s Quantum AI4 and Amazon’s Braket5 are two other leaders leveraging these ideas.

AI could also be very complementary to quantum computing. Take Microsoft’s Azure Quantum Elements, which used a combination of Microsoft Azure HPC (high-performance computing) and AI property-prediction filters to whittle down a selection of 32 million candidates for a more efficient rechargeable battery material to just 18. These survivors were then run through powerful, established, processing-intensive algorithms, which are quite limited because they consume a lot of energy and can’t work with very complicated molecules. That is exactly where quantum computing could step in, in the near future.
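The funnel pattern described here — a cheap AI predictor screens a huge pool so that only a handful of candidates reach the expensive, accurate stage — can be sketched very simply. Everything below is invented for illustration (a scaled-down pool and a random score standing in for a real property-prediction model); it shows only the shape of the workflow, not Microsoft's actual pipeline.

```python
import random

random.seed(42)

# Scaled-down candidate pool (the real screen covered 32 million).
candidates = [f"material-{i}" for i in range(32_000)]

def predicted_stability(name):
    # Stand-in for a fast ML property-prediction model: in reality
    # this would be a trained surrogate, cheap compared to accurate
    # physics simulation.
    return random.random()

# Stage 1: cheap AI filter scores everything.
scores = {c: predicted_stability(c) for c in candidates}

# Stage 2: only the top few survivors go to the expensive solver
# (classical HPC today; potentially a quantum computer tomorrow).
shortlist = sorted(candidates, key=scores.get, reverse=True)[:18]

print(len(candidates), "->", len(shortlist))  # 32000 -> 18
```

The design point is that the accurate-but-costly stage never sees the full pool; the AI stage buys down its workload by orders of magnitude.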

I believe AI and quantum computing will be different components in a stack of tools — complementary but not compatible. We want to keep pushing in those directions and many more by creating a joint team called “PhiQus” between École Polytechnique (IP Paris) and Inria, together with Marc-Olivier Renou and Titouan Carette.

Interview by Marianne Guenot
1 https://cacm.acm.org/research/disentangling-hype-from-practicality-on-realistically-achieving-quantum-advantage/
2 https://www.openpetition.eu/petition/online/support-the-machine-learning-in-quantum-science-manifesto-2
3 https://q-ctrl.com
4 https://quantumai.google
5 https://aws.amazon.com/fr/braket/
