The deployment of quantum technologies1 offers promising prospects for solving problems whose complexity exceeds the processing capacity of conventional systems. Financial institutions run up against the limitations of classical computing architectures when it comes to optimising portfolios with large numbers of assets and complex constraints (caps, exclusions, etc.), modelling sophisticated derivative products, or projecting extreme risk scenarios2. These operations demand both precision in execution and significant computational power, driving research toward alternative paradigms. By leveraging superposition to process multiple configurations simultaneously, interference between superposed states to steer probabilities toward the most relevant outcomes, and entanglement to synchronise interdependencies between variables, quantum computing could reshape current decision-making methods3.
This shift nonetheless requires a careful assessment of the tangible benefits, measured against the technical and methodological constraints inherent to these new tools. Speaking with Polytechnique Insights, Lionel Martellini, Professor of Finance at EDHEC Business School and founder and director of the EDHEC Quantum Institute, shares his expertise on the integration of these algorithms into security selection, portfolio construction, and risk management. His research focuses in particular on measuring the added value of these innovations within financial processes and on the conditions for their viability within market structures4.
The question of the maturity of these technologies within the financial ecosystem remains a critical point. It is important to distinguish the applications likely to deliver measurable progress, to identify the factors slowing their deployment, and to define the scientific milestones required before everyday use becomes feasible. The gap between current research capabilities and the reliability requirements of real-world operations thus marks the boundary between laboratory hypotheses and the practical demands of the industry5.
Quantum computing: what machines are we actually talking about?
Today, the term “quantum computing” covers very different realities.
Machines known as NISQ (Noisy Intermediate-Scale Quantum) are those currently available. They use a limited number of qubits6, which are still sensitive to noise and errors. They enable experimental demonstrations but remain constrained by the size and duration of computations.
Fault-tolerant quantum computing refers to architectures capable of correcting errors in a systematic way. This stage is a prerequisite for the large-scale use of advanced quantum algorithms, particularly for optimisation or financial simulation.
Alongside these two horizons, a portion of current applications rely on quantum-inspired methods, run on classical computers. These draw on principles derived from quantum computing to enhance certain calculations, without requiring physical qubits7.
Dynamics of high-performance computing: toward resolving financial complexities
Financial markets rely on operations whose computational demands are continuously growing. Whether structuring multi-asset portfolios, pricing derivatives with non-linear profiles, or modelling stress scenarios, classical architectures reach a saturation threshold as the scale of problems expands8. In Quantum Speedup of Monte Carlo Methods, Ashley Montanaro establishes that the quantum amplitude estimation algorithm offers a theoretically significant acceleration of Monte Carlo methods, which are essential to derivative asset pricing and risk management. His work demonstrates that, for a given target level of precision, the number of simulations required is reduced quadratically compared to conventional approaches. This efficiency gain opens the door to reducing the computational cost of complex financial calculations, provided that fault-tolerant quantum computers and models compatible with the algorithm’s requirements are available9.
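To give a concrete sense of the order of magnitude involved, the sketch below, which is purely illustrative and not taken from Montanaro’s paper, compares the number of samples a classical Monte Carlo estimator needs to reach a target pricing error (error shrinking roughly as 1/√N) with the number of oracle calls an idealised amplitude estimation routine would need (error shrinking roughly as 1/N); the target precisions are hypothetical, and constant factors as well as hardware overheads are ignored.

```python
import math

def classical_mc_samples(target_error: float) -> int:
    # Classical Monte Carlo: error shrinks as 1/sqrt(N), so N grows as 1/error^2.
    return math.ceil(1.0 / target_error ** 2)

def qae_oracle_calls(target_error: float) -> int:
    # Idealised quantum amplitude estimation: error shrinks as 1/N, so N grows as 1/error.
    return math.ceil(1.0 / target_error)

for eps in (1e-2, 1e-3, 1e-4):  # hypothetical target pricing errors
    print(f"error {eps:.0e}: ~{classical_mc_samples(eps):,} classical samples "
          f"vs ~{qae_oracle_calls(eps):,} QAE oracle calls")
```

Halving the target error thus roughly quadruples the classical sample count but only doubles the quantum one, which is the quadratic reduction discussed above.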
For Lionel Martellini, “a genuine quantum advantage consists of improving the return of a portfolio, or reducing its risk, in a way that generates an economic gain greater than the additional costs induced by the quantum solution.” He stresses the need to measure the real contribution of these technologies beyond simple speed gains, specifying that “it is essential to consider costs before concluding that a quantum advantage exists.” Indeed, the investments required for these systems remain substantial.
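One way to make this criterion explicit, stated here informally rather than quoted from the interview, is as a simple net-benefit inequality, where ΔR denotes the performance improvement attributable to the quantum solution, AUM the assets under management, and the cost terms are our own illustrative decomposition:

```latex
\underbrace{\Delta R \times \mathrm{AUM}}_{\text{economic gain from improved return or reduced risk}}
\;>\;
\underbrace{C_{\text{access}} + C_{\text{energy}} + C_{\text{integration}}}_{\text{additional costs induced by the quantum solution}}
```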
In portfolio optimisation, risk management models struggle with the exponential growth in asset combinations10. Techniques such as the quantum approximate optimisation algorithm (QAOA) or QUBO (quadratic unconstrained binary optimisation) formulations offer prospects for exploring these data spaces more intelligently, particularly for combinatorial problems where the objective is to identify optimal configurations under constraints, since these methods were developed specifically to navigate high-dimensional optimisation landscapes efficiently11. On the valuation side, Martellini notes that “the central problem consists of computing the expected value of a payoff under a risk-adjusted probability, often via Monte Carlo simulations,” and that “the quantum amplitude estimation algorithm (QAE) offers a quadratic gain: the pricing error decreases more rapidly, which reduces the number of trajectories required.” However, financial viability remains the ultimate arbiter: “On a technological level, the gain is clear, but the economic advantage remains to be assessed based on the costs of accessing quantum computers, including their energy consumption.”
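To illustrate what such a QUBO formulation looks like in practice, here is a deliberately tiny sketch with made-up numbers: selecting exactly two of four assets by minimising a risk-minus-return objective encoded as a binary quadratic form, with the cardinality constraint folded in as a penalty. The data and parameters are hypothetical, and the problem is small enough to be solved here by classical brute force.

```python
import itertools
import numpy as np

# Illustrative inputs: expected returns and covariances for four hypothetical assets.
mu = np.array([0.08, 0.12, 0.10, 0.07])
cov = np.array([[0.10, 0.02, 0.01, 0.00],
                [0.02, 0.12, 0.03, 0.01],
                [0.01, 0.03, 0.09, 0.02],
                [0.00, 0.01, 0.02, 0.08]])
risk_aversion, k, penalty = 1.0, 2, 10.0   # select exactly k = 2 assets

n = len(mu)
# QUBO objective x^T Q x over x in {0,1}^n: risk minus return on the diagonal...
Q = risk_aversion * cov - np.diag(mu)
# ...plus a penalty encoding (sum_i x_i - k)^2, dropping the constant k^2 term.
Q += penalty * (np.ones((n, n)) - 2 * k * np.eye(n))

# Brute-force search over all 2^n binary vectors, feasible only at this toy scale.
best = min(itertools.product([0, 1], repeat=n),
           key=lambda x: np.array(x) @ Q @ np.array(x))
print("selected assets:", [i for i, bit in enumerate(best) if bit == 1])
```

The interest of the QUBO form is precisely that the same matrix Q could be handed, unchanged, to a QAOA or annealing routine once the problem grows beyond brute-force reach.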
Epistemology of use cases: model relevance and algorithmic drift
The appeal of quantum processors for portfolio optimisation is fully realised in environments characterised by high dimensionality and shifting data structures. “When the parameters are few and stationary, classical methods suffice. Small problems can be solved with simple, low-cost tools,” the expert notes. The utility of quantum approaches emerges when the dynamic complexity of data flows overwhelms the capacity of conventional methods12.
Nevertheless, research must avoid use cases disconnected from real-world needs. Martellini cautions against “combining security selection and portfolio risk-return optimisation into a single problem. This creates combinatorial complexity that makes a quantum advantage appear artificially.” Asset selection must serve clear financial objectives: “It can be motivated by performance, by comparing the market price to fair value, or by risk, for example by seeking a portfolio with low correlation.”
“Quantum washing” refers to a methodological drift that consists of attributing a quantum advantage to problems that have been artificially over-complicated. This tendency undermines the credibility of solutions and therefore their adoption. Indeed, according to Lionel Martellini, “there is a tendency to artificially construct use cases in order to highlight the advantages of quantum computing. This gives somewhat the impression of a solution desperately in search of a problem, or of an oversized hammer looking for nails to strike.” This bias can lead to promising but inapplicable conclusions. “Even when a problem is real,” he continues, “there is a risk of overstating the supposed advantages. The benefits presented often depend on unstated assumptions regarding the maturity of the technology or its actual cost.”
Material realities and pathways toward system hybridisation
The integration of quantum computing into finance runs up against constraints that go beyond raw processing power, notably technological maturity and cybersecurity. Contemporary machines (NISQ) operate with components that remain unstable. “Current quantum computers are still too limited for large-scale use,” Martellini notes. Real-world benefits depend on reaching a more stable technological horizon, known as fault-tolerant quantum computing13.
The financial factor is equally significant: “Some machines will cost tens of millions, others several hundreds of millions, or even billions,” Martellini points out, adding that “the return on investment of these machines and their energy consumption — whether in shared or dedicated cloud usage — will be decisive factors.” Furthermore, the handling of sensitive data requires stringent encryption protocols and meticulous oversight of cloud infrastructures.
As such, the technology is best understood as a complementary layer to existing systems. “Today, hybrid approaches are the most realistic path forward,” the finance professor states. He also mentions “quantum-inspired simulators, such as digital annealers or tensor network-based methods.” These tools allow for experimentation without the constraints of physical qubits. Moreover, “while the autonomous quantum computer remains a medium-term objective, in the short and medium term it is hybrid classical-quantum architectures that represent the most realistic approach for obtaining actionable results.” This transition makes it possible to capture targeted gains and explore use cases while keeping operational risks under control.
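As a rough illustration of what “quantum-inspired, run on classical hardware” can mean in practice, the sketch below implements a plain simulated-annealing heuristic over the same kind of binary quadratic objective as the earlier QUBO example; the function name and parameters are ours, and commercial digital annealers rely on considerably more sophisticated machinery.

```python
import random
import numpy as np

def annealing_qubo(Q: np.ndarray, n_steps: int = 5000, t0: float = 1.0) -> np.ndarray:
    """Classical annealing heuristic for minimising x^T Q x over x in {0, 1}^n."""
    n = Q.shape[0]
    x = np.random.randint(0, 2, size=n)           # random starting portfolio
    energy = x @ Q @ x
    for step in range(n_steps):
        temperature = t0 * (1.0 - step / n_steps) + 1e-9
        candidate = x.copy()
        candidate[random.randrange(n)] ^= 1       # propose flipping one asset in or out
        delta = candidate @ Q @ candidate - energy
        # Accept improvements always, and worse moves with a temperature-dependent probability.
        if delta < 0 or random.random() < np.exp(-delta / temperature):
            x, energy = candidate, energy + delta
    return x
```

Benchmarking such a routine against brute force on small instances, before any quantum hardware enters the picture, is one concrete way of pursuing the hybrid, experiment-first approach described above.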