Quantum computing opening up new business horizons

Release date: 10 Mar 2021

For anyone scouting for future innovations in the financial industry, quantum computing is moving to centre stage as one of the hottest technologies to watch. Slowly but steadily gaining momentum on its way from a theoretical concept to tomorrow’s next disruptive technology, quantum computing is already showing considerable potential for solving problems that are far beyond the reach of conventional computing. The race for “quantum supremacy” has started, and more and more scientists are seeking to unlock business value that is simply out of reach for traditional IT equipment.

As part of its strategic focus on new technologies, Deutsche Börse Group closely follows emerging trends and explores what will drive our industry now and in the future. We therefore initiated an early pilot project to apply quantum technology to a real-world problem and to better understand the technology and its applicability to our business. Deutsche Börse Group engaged JoS QUANTUM, a Frankfurt-based fintech company, to develop a quantum algorithm tackling existing challenges in computing our business risk models.

What is the problem to solve? Our risk models are used to forecast the financial impact of adverse external developments such as macroeconomic events, changes in competition, or new regulation. Today, computation is done via traditional Monte Carlo simulation on existing off-the-shelf hardware. Depending on the complexity of the model and the number of simulation parameters, computation time for risk models ranges from a few minutes to a few hours or even several days. For a full sensitivity analysis of today’s model, the number of scenarios to analyse grows disproportionately and computation time grows beyond any realistic scope. We focussed on the speedup for up to 1,000 inputs, which would require up to 10 years of Monte Carlo simulation. In practice, calculations requiring more than a few days of computation time have no business benefit.
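To get a feeling for the orders of magnitude involved, the short sketch below estimates the total wall-clock time of a full classical sensitivity analysis from an assumed per-run time and an assumed number of scenarios per input. The figures are purely illustrative assumptions and are not the parameters of the actual risk model.

    # Back-of-the-envelope scaling of a full classical sensitivity analysis.
    # RUN_TIME_HOURS and SCENARIOS_PER_INPUT are illustrative assumptions,
    # not parameters of the actual Deutsche Börse Group risk model.

    RUN_TIME_HOURS = 1.0        # assumed duration of one Monte Carlo run
    SCENARIOS_PER_INPUT = 100   # assumed number of variations per input

    for n_inputs in (10, 100, 1_000):
        total_runs = n_inputs * SCENARIOS_PER_INPUT
        total_hours = total_runs * RUN_TIME_HOURS
        print(f"{n_inputs:>5} inputs -> {total_runs:>7,} runs "
              f"= {total_hours / 24:7.1f} days "
              f"= {total_hours / (24 * 365):5.1f} years")

Under these assumed figures, 1,000 inputs already correspond to more than a decade of computation, in line with the order of magnitude quoted above.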

What is the approach? The aim of the project was to analyse whether introducing quantum computing could significantly reduce the total computing time for a full sensitivity analysis, possibly from multiple years to below 24 hours. Furthermore, we wanted to gain insights into what kind of quantum computing power would be required to run such a full sensitivity analysis on a day-to-day basis, and to better understand the effort needed to code and run a real-world quantum use case.

Speeding up calculations through quantum algorithms

A first step in calculating the sensitivity of the models’ simulated results was to vary the input parameters of the business risk scenario. The expected speed-up was simulated for a simplified model by applying Amplitude Estimation (the quantum version of Monte Carlo simulation) combined with Grover’s algorithm (a quantum search algorithm), which enables a quadratic speedup of calculations on quantum machines. This was then compared to the performance of the traditional Monte Carlo simulation running on off-the-shelf hardware. In a second step, the results of the simplified model were extrapolated to a model size of practical use. These results demonstrated that the application of quantum computing would drastically reduce the required computational effort and thus the total calculation time. For the chosen benchmark of 1,000 inputs, the “warp factor” is about 200,000, reducing the off-the-shelf Monte Carlo computation time of about 10 years to less than 30 minutes of quantum computing time.
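The quadratic speedup can be illustrated with a simple query-count comparison: classical Monte Carlo needs on the order of 1/ε² samples to reach an estimation error ε, while quantum Amplitude Estimation needs on the order of 1/ε oracle calls. The sketch below (constants omitted, error targets chosen arbitrarily) only illustrates this relationship; the factor of about 200,000 quoted above comes from the project’s own benchmark, not from this calculation.

    import math

    # Classical Monte Carlo: error shrinks as 1/sqrt(N)  ->  N ~ 1/eps^2
    # Quantum Amplitude Estimation: error shrinks as 1/N ->  N ~ 1/eps
    for eps in (1e-2, 1e-3, 1e-4):
        classical_samples = math.ceil(1 / eps**2)
        quantum_queries = math.ceil(1 / eps)
        print(f"target error {eps:.0e}: "
              f"classical ~{classical_samples:>12,} samples, "
              f"QAE ~{quantum_queries:>8,} queries, "
              f"ratio ~{classical_samples // quantum_queries:,}x")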

The third step was to execute the model successfully on IBM’s quantum machine (“vigo”) in Poughkeepsie, NY, USA, accessed via the IBM cloud. This machine currently offers only a small number of qubits, which limits the model size, so a reduced version of the model was run. The execution on the quantum machine allowed us to verify the successful implementation of the risk model in quantum code. In addition, the required number and quality of qubits were evaluated in order to extrapolate which generation of quantum computing hardware will be needed to run a full sensitivity analysis in production; such hardware seems likely to become available within a few years.
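For readers curious what “quantum code” looks like in practice, the minimal sketch below builds a two-qubit Grover search with Qiskit and runs it on a local simulator standing in for the IBM cloud backend. It is not the project’s risk-model circuit, and it assumes the qiskit and qiskit-aer packages are installed.

    from qiskit import QuantumCircuit, transpile
    from qiskit_aer import AerSimulator

    # Two-qubit Grover search for the marked state |11>:
    # uniform superposition, phase oracle, diffusion operator, measurement.
    qc = QuantumCircuit(2)
    qc.h([0, 1])        # uniform superposition over all four basis states
    qc.cz(0, 1)         # oracle: flip the phase of |11>
    qc.h([0, 1])        # diffusion operator: H, X, CZ, X, H
    qc.x([0, 1])
    qc.cz(0, 1)
    qc.x([0, 1])
    qc.h([0, 1])
    qc.measure_all()

    backend = AerSimulator()   # local simulator in place of the cloud device
    counts = backend.run(transpile(qc, backend), shots=1024).result().get_counts()
    print(counts)              # ideally '11' for (almost) all shots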

The results of the comparison between classical off-the-shelf computing, the quantum simulation, and the measurements on quantum hardware are shown below.


Quantum computing demonstrating promising results

The results of the pilot project demonstrated that the expected speed-ups using quantum computing can be achieved. In addition, we showed the successful execution of a subset of the risk model on an IBM Quantum machine. We also estimated the earliest availability of the minimum quantum hardware required for a full model calculation: it is only a few years out and comparatively modest, and this is what makes the project so special. Quantum hardware providers may well meet these requirements in the second half of this decade, meaning that a real-life application of quantum computing in risk management could be only a few years away.

This could open new horizons with significant benefits: a much more sophisticated, detailed, and fast, yet equally robust risk management model enabling comprehensive sensitivity analysis. Quantum computing would deliver results on a rich model fast enough to be of practical use. The same quantum-based approach could also be applied to other areas of risk management: credit risk and operational risk seem to be natural areas where more complex models would allow more precise predictions.

Find the full scientific paper here: “A Quantum Algorithm for the Sensitivity Analysis of Business Risks”.

Quantum – 100 trillion times as fast as a conventional computer

What exactly is quantum computing all about? As its name suggests, quantum computing is based on the laws of quantum physics, applying them at the deepest level of the things our world is made of. Here, the laws of Newtonian physics are no longer valid, and things behave like waves and particles at the same time. Electrons, for example, may adopt not only two alternative states but also anything in between; these in-between states are called “superposition”. Applied to computing, this has a far-reaching consequence: while traditional computers use bits and rely on binary logic, the so-called “qubits” that make up quantum computers can assume a third state – the “superposition” – from which it is still unclear whether they will emerge with a value of 0 or 1. This enables them to calculate faster – in fact, a lot faster: in a recent experiment conducted at the University of Science and Technology of China in Hefei, 100 trillion times as fast as a conventional computer.
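A minimal numerical sketch of this idea, using plain NumPy rather than quantum hardware: a Hadamard gate puts a qubit prepared in the state |0⟩ into an equal superposition, and repeated measurement then yields 0 and 1 with roughly equal frequency.

    import numpy as np

    ket0 = np.array([1.0, 0.0])            # qubit state |0>

    # Hadamard gate: maps |0> to the equal superposition (|0> + |1>) / sqrt(2)
    H = np.array([[1, 1],
                  [1, -1]]) / np.sqrt(2)

    state = H @ ket0
    probs = np.abs(state) ** 2             # Born rule: measurement probabilities
    print("amplitudes:", state)            # [0.707..., 0.707...]
    print("P(0), P(1):", probs)            # [0.5, 0.5]

    # Simulated measurements: the superposition collapses to 0 or 1 at random.
    samples = np.random.choice([0, 1], size=1000, p=probs)
    print("observed frequencies:", np.bincount(samples) / len(samples))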

The technological challenges are still immense, though. To name just a few: superposition can be maintained only for fractions of a second, and it is extremely sensitive to external “noise”, so it needs to be kept in a highly protected environment. For example, “superconducting qubits”, one of the technologies used to realise quantum computers, need to operate at extremely low temperatures and in a vacuum. Still, the promises of quantum computing are so enormous that these challenges seem worth tackling.

Quantum computing began in the early 1980s, when physicist Paul Benioff proposed a quantum mechanical model of the Turing machine. Richard Feynman and Yuri Manin later suggested that a quantum computer had the potential to simulate things that a classical computer could not.
In 1994, US mathematician Peter Shor formulated a quantum algorithm for factoring large integers, a problem too difficult to solve efficiently by conventional means. This was followed by various other quantum algorithms with similar objectives. One of them is Grover’s algorithm, a quantum search algorithm devised by the Indian-American computer scientist Lov Kumar Grover that enables a quadratic speedup for search problems. The model developed by JoS QUANTUM is based on Grover’s algorithm and uses “imperfect Grover oracles”.
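To see where the quadratic speedup of Grover’s algorithm comes from, the short sketch below (illustrative only, unrelated to the JoS QUANTUM model) computes how many Grover iterations are needed to find one marked item among N, compared with the roughly N/2 guesses an unstructured classical search needs on average.

    import math

    # Searching one marked item among N: after k Grover iterations the success
    # probability is sin^2((2k + 1) * theta) with sin(theta) = 1/sqrt(N),
    # so about (pi/4) * sqrt(N) iterations suffice.
    for N in (16, 1_024, 1_048_576):
        theta = math.asin(1 / math.sqrt(N))
        k_opt = math.floor(math.pi / (4 * theta))
        p_success = math.sin((2 * k_opt + 1) * theta) ** 2
        print(f"N = {N:>9,}: ~{k_opt:>4} Grover iterations "
              f"(classical ~{N // 2:>7,} guesses), success prob = {p_success:.3f}")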