
Insider Brief

  • IBM and Vanguard researchers demonstrated a quantum-classical workflow for portfolio construction using 109 qubits on IBM’s Heron processors, showing potential advantages for large-scale financial optimization.
  • The team applied a Conditional Value at Risk-based Variational Quantum Algorithm (CVaR-VQA), combining quantum sampling and classical optimization to balance asset selection under risk and constraint conditions.
  • While current hardware limits prevent tackling thousand-asset portfolios where quantum advantage would emerge, the study establishes feasibility and identifies complex, noise-tolerant circuits as promising for future financial applications.

A team of IBM-Vanguard researchers used a quantum computer to tackle portfolio construction, a financial problem that becomes intractable for classical machines at scale. The team ran experiments with 109 qubits on IBM’s Heron processors and reported, on the pre-print server arXiv, that a quantum-classical workflow can outperform traditional methods for certain types of investment optimization.

“This work highlights the growing potential of quantum optimization workflows,” Roberto Lo Nardo and Gabriele Agliardi, scientists from the IBM team, wrote in a blog post on the work. “By combining quantum circuits that explore high-dimensional solution spaces with classical algorithms that refine and validate results, researchers can tackle problems that are too large or too complex for either quantum or classical methods alone.”

Portfolio construction — essentially choosing a mix of assets to maximize returns under risk and compliance constraints — has long challenged financiers. The textbook Markowitz model, which has been around since financial pioneer Harry Markowitz introduced it in the mid-20th century, provides a simplified version of the problem, but real-world applications like exchange-traded funds involve far more complexity.
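The Markowitz mean-variance trade-off the article refers to can be sketched in a few lines. The weights, expected returns, and covariance matrix below are illustrative toy numbers, not figures from the study:

```python
import numpy as np

# Illustrative Markowitz objective: expected portfolio return minus a
# risk-aversion penalty on portfolio variance (higher score is better).
def markowitz_objective(weights, exp_returns, cov, risk_aversion=1.0):
    ret = weights @ exp_returns    # expected portfolio return
    var = weights @ cov @ weights  # portfolio variance
    return ret - risk_aversion * var

mu = np.array([0.05, 0.07, 0.04])               # toy expected returns
cov = np.array([[0.010, 0.002, 0.001],          # toy covariance matrix
                [0.002, 0.015, 0.003],
                [0.001, 0.003, 0.008]])
w = np.array([0.4, 0.3, 0.3])                   # toy portfolio weights
score = markowitz_objective(w, mu, cov, risk_aversion=2.0)
```

Real-world versions add constraints (budgets, risk buckets, integer holdings) on top of this objective, which is what makes the problem hard at scale.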

With thousands of bonds or equities to balance, constraints on budgets and risk categories, and shifting metrics such as duration or expected return, the optimization problem grows rapidly. Even top-tier classical solvers can take up to 10 minutes to find approximate solutions when portfolios involve 1,000 or more instruments.

For ETF designers and fund managers who need rapid scenario testing, this lag creates bottlenecks.

Sampling-Based Variational Scheme

The IBM study explores whether quantum algorithms can accelerate such optimization. The researchers focus on the Conditional Value at Risk-based Variational Quantum Algorithm (CVaR-VQA), a hybrid approach that combines quantum sampling with classical optimization. Rather than relying on formulations that map directly to quadratic unconstrained binary optimization, the method uses bit-strings sampled from a quantum circuit to evaluate a customized cost function. Making this a little simpler for those unfamiliar with the mathematics: Instead of turning the financial problem into a strict mathematical form that fits the computer’s standard input (which is what “quadratic unconstrained binary optimization” means), the researchers let the quantum computer generate many possible answers — called bit-strings — and then score those answers using a custom rule to see which one works best.
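As a rough illustration of that scoring step (the cost function, penalty, and α value below are placeholders, not the paper's actual model), a CVaR-style aggregation keeps only the best fraction of sampled bit-strings when evaluating a circuit:

```python
# Sketch: CVaR aggregation over sampled bit-strings. Each bit selects
# one asset; cost() is a stand-in for the paper's custom objective.
def cost(bitstring, values, budget):
    picked = [v for b, v in zip(bitstring, values) if b == 1]
    # penalize selections that break the budget constraint
    penalty = 100.0 * max(0, len(picked) - budget)
    return -sum(picked) + penalty

def cvar(samples, values, budget, alpha=0.25):
    # average the best alpha-fraction of sample costs (lower is better)
    costs = sorted(cost(s, values, budget) for s in samples)
    k = max(1, int(alpha * len(costs)))
    return sum(costs[:k]) / k

values = [0.05, 0.07, 0.04, 0.06]  # toy per-asset scores
samples = [(1, 0, 1, 0), (1, 1, 0, 0), (0, 1, 0, 1), (1, 1, 1, 1)]
best_fraction_cost = cvar(samples, values, budget=2)
```

Because only the best samples count toward the score, the optimizer is pushed toward circuits that occasionally produce excellent portfolios rather than merely good ones on average.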

Ultimately, this gives the system the flexibility to avoid the overhead of encoding constraints directly into qubits. Each qubit corresponds to one asset, and the algorithm iteratively samples, evaluates cost, updates parameters, and repeats. Classical local search is then applied to refine the quantum output.

A key question in variational methods is which quantum circuits — or ansatzes — work best. According to the study, the team compared a standard TwoLocal circuit with a more advanced design called bias-field counterdiabatic optimization, or BFCD. Early simulations suggested that the harder-to-simulate BFCD circuits produced better convergence.

This result hints at a possible sweet spot: quantum circuits that are too complex for efficient classical emulation but still trainable on hardware may deliver the most useful outcomes. The experiments also tested different entanglement structures, including bilinear chains and “colored” maps tailored to IBM’s hexagon-based design, or heavy-hex topology.
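To make the entanglement-structure comparison concrete, here is a hypothetical sketch (not the study's code) of two connectivity maps: a bilinear chain that links neighboring qubits, and a simple two-coloring of that chain that groups non-overlapping pairs so their gates can run in parallel layers, loosely mimicking how colored maps are scheduled on limited-connectivity hardware:

```python
# Sketch: two entanglement maps over n qubits.
def linear_chain(n):
    # neighboring-qubit pairs: (0,1), (1,2), ...
    return [(i, i + 1) for i in range(n - 1)]

def two_coloring(n):
    # split the chain into two layers of non-overlapping edges
    edges = linear_chain(n)
    layer_a = [e for e in edges if e[0] % 2 == 0]
    layer_b = [e for e in edges if e[0] % 2 == 1]
    return layer_a, layer_b

chain = two_coloring(5)  # ([(0, 1), (2, 3)], [(1, 2), (3, 4)])
```

On real devices like IBM's heavy-hex chips, the coloring would follow the hardware graph rather than a simple line, but the scheduling idea is the same.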

Hardware Runs on 109 Qubits

The group carried out tests on both simulators and real hardware. In simulations using 31 qubits, the researchers found that focusing more on worst-case scenarios (low CVaR) produced the most accurate results, and repeating the quantum circuit runs multiple times helped the algorithm reach stable, consistent answers.

On the 109-qubit Heron processors, the team executed circuits with up to 4,200 gates. Despite noise, raw quantum samples improved with each iteration and moved closer to optimal values.

The researchers used a technique called “local search post-processing” to refine the quantum computer’s results. In this approach, the quantum system explores the big picture and identifies promising areas of the landscape, while a classical computer zooms in to fine-tune those results and find the best possible solution nearby. After local search post-processing, the best gap to the proven optimum was 0.49%, outperforming classical local search alone on the same tests.
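A minimal sketch of that refinement step, assuming a greedy single-bit-flip local search (the paper's exact procedure may differ):

```python
# Sketch: greedy bit-flip local search. Starting from the best quantum
# sample, flip one bit at a time and keep any flip that lowers the
# cost, repeating until no single flip helps.
def local_search(bitstring, cost):
    current = list(bitstring)
    best = cost(current)
    improved = True
    while improved:
        improved = False
        for i in range(len(current)):
            current[i] ^= 1                 # try flipping bit i
            trial = cost(current)
            if trial < best:
                best, improved = trial, True
            else:
                current[i] ^= 1             # revert the flip
    return tuple(current), best

# Toy cost: Hamming distance to a hidden target selection.
target = (1, 0, 1, 1, 0)
toy_cost = lambda s: sum(a != b for a, b in zip(s, target))
solution, value = local_search((0, 0, 0, 0, 0), toy_cost)
```

The catch, as the paper notes, is that this only helps if the quantum samples already land near good solutions; a greedy search cannot climb out of a poor starting region.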

The study argues that a quantum-classical workflow provides benefits beyond raw accuracy. Because the sampling-based method does not require rewriting the portfolio problem into strict mathematical forms like QUBOs, it preserves more realism in financial modeling. The approach also yields multiple candidate solutions along the way, offering investors richer data for decision-making. At the same time, the hardware results demonstrate that convergence continues even under noise, showing robustness of the method.

Limits And Future Work

The researchers report limits that point toward future work. The problems tackled involved 109 bonds — a scale solvable by classical methods in seconds. True financial advantage would only appear at the thousands-of-assets scale, where classical solvers bog down. Current hardware cannot yet handle those sizes.

Variational algorithms also require repeated circuit runs for training, which becomes costly as the number of parameters grows. And if the raw samples are too far from optimal, local search may not improve them, underscoring the importance of well-designed circuits.

For finance, the experiments show a path to exploring bond or ETF construction with greater flexibility and possibly faster turnaround in the future. For quantum computing, they provide evidence that harder-to-simulate circuits may be the most promising candidates for practical advantage. The results also suggest new benchmarking possibilities: using realistic financial optimization tasks rather than abstract problems as yardsticks for quantum progress.

Scaling remains the central hurdle, the study suggests. IBM’s team notes that only when problems reach beyond 1,000 assets does classical performance falter enough to create an opening for quantum methods. Achieving advantage will require further algorithmic refinements, larger and more reliable hardware, and strategies to cut down the training overhead. Ideas include parameter transfer — training circuits classically, then running them only once on hardware to gather samples — or adopting hybrid cost functions that reduce depth.

The study concludes that while the present runs fall short of showing outright advantage, they prove feasibility. By combining quantum sampling with classical post-processing, researchers demonstrated portfolio optimization on real hardware at a scale that points toward practical applications once machines mature.

For investors, the general thrust of the work is that designing ETFs with the aid of quantum processors may still be distant, but for the first time, the path is becoming clearer.

The team writes: “As quantum hardware continues to scale and algorithms continue to mature, hybrid workflows hold promise to outperform classical methods for solving complex, constrained problems. It is entirely possible that we will soon see quantum tools integrated into the daily workflows of asset managers, traders, and risk analysts.”

For a deeper, more technical dive, please review the paper on arXiv. It’s important to note that arXiv is a pre-print server, which allows researchers to receive quick feedback on their work. However, neither an arXiv posting nor this article is an official peer-reviewed publication. Peer review is an important step in the scientific process to verify results.

The study was conducted by researchers from IBM Research and Vanguard’s Centre for Analytics & Insight. The IBM team included Gabriele Agliardi from IBM Research–Italy; Dimitris Alevras, Vaibhaw Kumar, and Sumit Kumar from IBM Research–US; Roberto Lo Nardo from IBM Research–UK; Gabriele Compostella from IBM Research–Germany; and Manuel Proissl from IBM Research–Switzerland. They collaborated with Bimal Mehta from Vanguard’s Centre for Analytics & Insight.

