Quantum Computing: The Next Revolution in Finance

Quantum computing is coming of age. It is breaking out of the research lab and is set to revolutionise the world we live in by driving breakthroughs in drug discovery, chemistry, materials science, high-energy physics and even climate science. Quantum computers have the potential to solve problems that even the most powerful supercomputers cannot. By accelerating the discovery of solutions to big global challenges, quantum computing could prove more disruptive than the technology waves of past decades.

A 2021 estimate put total investment, a mix of public and private funding spread across more than 200 start-ups, at $30 billion[1]. Recognising the importance of this new technology, governments are announcing new levels of funding to secure future national economic benefits; western governments also recognise the security threat posed by aggressor nations using quantum computing to break encryption.

What Makes Quantum Computers Special?

Quantum computers solve problems in a very different way to the computers we use today (referred to as ‘classical computers’ in the quantum computing world). Their advantage lies in their ability to be in a huge number of different states at the same time, whereas a classical computer can only be in one state at a time. Whether supercomputer, desktop PC or mobile phone, the building block of a classical computer is the ‘bit’: a switch storing binary information whose state is either 0 or 1. A quantum computer uses a fundamentally different building block, the qubit, which can represent many states at once, exploiting the quantum-mechanical principles of superposition, entanglement and interference. When a qubit is in ‘superposition’ its state is a combination of 0 and 1, each with a given probability. When you measure a qubit the output is still a 0 or a 1, but which one you get depends on those probabilities.
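To make the idea concrete, here is a minimal numerical sketch (plain NumPy, deliberately not tied to any quantum SDK) of a qubit as a pair of amplitudes, and of measurement as sampling from the probabilities those amplitudes define:

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# |a|^2 is the probability of measuring 0, |b|^2 the probability of measuring 1.
state = np.array([1, 1]) / np.sqrt(2)   # an equal superposition of 0 and 1

def measure(state, shots=1000):
    """Each 'shot' collapses the qubit to a definite 0 or 1."""
    probs = np.abs(state) ** 2
    return np.random.choice([0, 1], size=shots, p=probs)

outcomes = measure(state)
print("fraction of 0s:", np.mean(outcomes == 0))   # ~0.5 for this state
```

Run repeatedly, the individual outcomes are random, but the fractions converge on the probabilities encoded in the state, which is exactly the behaviour described above.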

Binary versus a qubit.
The state of a classical bit (on the left) can only be 0 or 1. A qubit’s state can be any superposition of 0 and 1 and is shown as a vector to a point on a sphere.

The picture above is a simple representation of a classical bit versus a qubit. A classical bit has exactly two states, 0 or 1, shown on the left. A qubit’s state is conceptually pictured on the right as a vector to a point on a sphere called the Bloch sphere. When the qubit is measured, the vector ends up pointing at either 0 or 1, with the odds set by the original state’s probabilities.

In a classical computer, the bits are independent of each other. In a quantum computer, the qubits can be entangled together to make one large quantum state, which gives us a probability distribution over all of the possible states of the combined qubits. The combination of superposition and entanglement allows the number of states representable on a quantum computer to far exceed what is possible classically: storing the four complex amplitudes of an entangled 2-qubit state at double precision already takes 512 classical bits, and simulating a couple of hundred qubits would require more amplitudes than there are atoms on planet Earth.

This is the core difference: a classical computer can be in any one of its states, but only one at a time, whereas a quantum computer can be in a superposition of ALL of those states at the same time. Probabilities are a convenient way to describe a state, but strictly the state is represented as a quantum wavefunction, and the wavefunction (and hence the probability) of the correct solution is found using ‘interference’. Interference allows a quantum computer to amplify the state of the correct solution and cancel out the states of unwanted solutions; the sequence of steps of constructive and destructive interference is what forms a quantum algorithm. The ability to perform computations on vast numbers of states at once is what lets quantum computers perform tasks beyond the capability of classical computers.
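A small state-vector simulation illustrates both points: the vector for n qubits has 2^n amplitudes, and entangling two qubits produces correlated outcomes that no pair of independent bits can reproduce. This is a toy sketch in plain NumPy (the gate matrices are standard; the rest is illustrative):

```python
import numpy as np

# An n-qubit state vector has 2**n complex amplitudes; for 2 qubits, 4.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)        # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]])       # controlled-NOT

state = np.zeros(4)
state[0] = 1.0                                      # start in |00>
state = CNOT @ np.kron(H, I) @ state                # entangle into a Bell state

probs = np.abs(state) ** 2
print(dict(zip(["00", "01", "10", "11"], probs.round(3))))
# {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5} -- the qubits always agree
```

The measured qubits always agree (00 or 11, never 01 or 10), a correlation that lives only in the joint state, and every extra qubit doubles the number of amplitudes a classical simulator must store.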

Cooling tubes on a quantum computer

The Noise in the Machine

Despite the advances, there are many technical problems to solve before quantum computing becomes more widely available. Quantum computers are extremely difficult to engineer, build and program. Qubit states are fragile, and results can easily be compromised by high error rates arising from environmental noise or manufacturing faults. The external environment can interact with a qubit and cause its state to leak away, a process called decoherence. Decoherence can occur within fractions of a second, severely limiting how long a quantum algorithm can run. Temperature fluctuations, radiation, electromagnetic waves and cosmic rays are all forms of interference that can cause a qubit to lose its state; various kinds of shielding reduce this, which is why quantum computers are housed in dilution refrigerators chilled to near absolute zero. Error correction is another approach being researched to increase qubit reliability: a single, more reliable logical qubit is built from hundreds or even thousands of physical qubits, though this is not yet possible at scale on current hardware.

Despite the challenges, development timelines have been shattered recently, with IBM targeting a 1,000-qubit processor by the end of 2023 and practical quantum computing with widespread adoption by 2030. PsiQuantum, a start-up whose backers include Microsoft’s venture fund, is working on building a million-qubit machine in the next few years. Rather than being at the beginning of an experimental research phase, the industry is at a tipping point from the lab to real-world applications.
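The logical-qubit idea mentioned above has a simple classical analogue: the repetition code, where one logical bit is stored in several noisy physical bits and recovered by majority vote. Real quantum error correction (surface codes, for example) is far more involved because qubit states cannot simply be copied, but this illustrative sketch shows why redundancy buys reliability:

```python
import random

def encode(bit, copies=3):
    """Repetition code: one logical bit stored across several physical bits."""
    return [bit] * copies

def apply_noise(bits, flip_prob=0.05):
    """Each physical bit flips independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    """Majority vote recovers the logical bit unless most copies flipped."""
    return int(sum(bits) > len(bits) / 2)

trials = 100_000
errors = sum(decode(apply_noise(encode(1))) != 1 for _ in range(trials))
print(f"logical error rate: {errors / trials:.4f} vs physical rate 0.05")
```

With a 5% physical error rate, three copies already push the logical error rate below 1%; more copies push it lower still, which is the intuition behind building one logical qubit from hundreds of physical ones.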

High-Frequency Risk Management

The applications of quantum computing are wide: the chemical and automotive industries are already exploring uses in new-materials discovery and supply-chain optimisation, and financial services are exploring the technology too. Some of the largest banks are investing in research to understand it, including JP Morgan, Goldman Sachs, Barclays, Standard Chartered, BBVA, CaixaBank and Scotiabank, to name but a few. Financial institutions are likely to be some of the earliest success stories for quantum computing, particularly in what’s called the NISQ[2] era, in which today’s noisy, error-prone machines may already produce meaningful results on hard tasks such as optimisation and simulation.

Recent global events, war and pandemics among them, exacerbate market uncertainty and increase volatility, and financial institutions are constantly seeking to develop more sophisticated quantitative models. The challenge in finance is finding the best solution to calculations over large volumes of data in a time frame that is useful and to an accuracy sufficient for actionable decision-making. Although hard to estimate accurately, the total value of financial market transactions in 2021 was put at $2.27 quadrillion[3]. Transaction volumes are increasing at such a pace that one of the greatest challenges in finance is risk management: the worry is that risk management will not keep up with rapidly increasing volumes, and warnings of a future financial crash like 2008 will come far too late. Even risk management tools that work now may fail to scale as transaction volumes increase.

Financial institutions look at the risk in their portfolios by creating simulations. Monte Carlo simulation is used to estimate the probabilities of various risk scenarios, but for the simulation to be sufficiently accurate it needs to be run many times. As the number of parameters increases, simulation times grow to the point where the time taken to compute makes the result less useful. In a world where markets move fast, any lag in your risk calculations increases the likelihood of risk management data being out of date. Risk management has to be high frequency.
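The root of the problem is that Monte Carlo error shrinks only as 1/√N: ten times the accuracy costs a hundred times the samples. A toy VaR-style sketch in Python shows the effect (the normal-returns model and all parameters are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_var(num_paths, confidence=0.99):
    """Toy one-day VaR: simulate returns, take the loss at the given quantile.
    Assumes normally distributed returns purely for illustration."""
    returns = rng.normal(loc=0.0, scale=0.02, size=num_paths)
    return -np.quantile(returns, 1 - confidence)

# Monte Carlo error shrinks as 1/sqrt(N): each extra digit of accuracy
# costs 100x more paths, which is where the compute time goes.
for n in [1_000, 100_000, 1_000_000]:
    estimates = [mc_var(n) for _ in range(20)]
    print(f"N={n:>9,}  VaR~{np.mean(estimates):.4f}  spread {np.std(estimates):.5f}")
```

The estimate’s spread falls by roughly 10x for every 100x increase in paths, exactly the scaling that makes high-frequency risk recalculation so expensive.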

Faster Simulations

Financial institutions need possible solutions to speed up optimisation and simulation, hence the interest in quantum computing. For Monte Carlo simulation there is a quantum algorithm called Quantum Amplitude Estimation, which promises to estimate risk or instrument prices using fewer samples than a standard Monte Carlo simulation. This can translate to faster execution of Monte Carlo-based pricing engines, or higher accuracy for a given time budget. Using this approach, Monte Carlo simulations can potentially achieve a quadratic speed-up[4]. The term ‘quadratic speed-up’ describes how much faster the algorithm could be if implemented on a quantum computer; in computer science terms it is O(√n) versus O(n). In other words, something that takes the equivalent of 100 steps classically would take 10; 10,000 steps become 100; 1,000,000 become 1,000, and so on. It’s a polynomial speed-up rather than an exponential one, but it can turn computation at scale from intractable to tractable within reasonable time frames. Quantum Amplitude Estimation could make large-scale Monte Carlo simulations practical on NISQ-era machines.
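The arithmetic of the speed-up is easy to state. For a target error ε, classical Monte Carlo needs on the order of 1/ε² samples, while amplitude estimation needs on the order of 1/ε quantum oracle calls[4]. A few lines make the gap concrete (constants are dropped; this is a scaling sketch, not a performance claim):

```python
# Classical Monte Carlo: error ~ 1/sqrt(N), so N ~ 1/eps^2 samples.
# Quantum Amplitude Estimation: error ~ 1/M, so M ~ 1/eps oracle calls.
for eps in [1e-2, 1e-3, 1e-4]:
    classical_samples = int(1 / eps ** 2)
    quantum_calls = int(1 / eps)
    print(f"target error {eps:.0e}: classical ~{classical_samples:>11,} samples,"
          f" quantum ~{quantum_calls:,} oracle calls")
```

At four-digit accuracy the classical engine needs around 100 million samples where the quantum routine needs around ten thousand calls, the quadratic relationship described above.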

Derivative Pricing

Other areas of research for finance in quantum computing include derivative pricing, machine learning, portfolio optimisation and even settlement. At the heart of many of these problems are large-scale combinatorial simulations that scale exponentially with the number of inputs, which is highly suited to quantum computing. The problems that lend themselves best are those that start with a small amount of initial data, expand into an exponentially large number of states, and end with a very small solution. Non-linear option pricing is a good example: the initial pricing parameters are few, the number of possible paths in the simulation is vast, and the output is tiny, i.e. a price. A recent paper[5] co-authored by Will Zeng at Goldman Sachs estimates the resources needed to price a TARF option in under a second at around 7,500 logical qubits. Although that estimate is beyond current machines, a key breakthrough in the paper was a new method of derivative pricing for quantum computing that dramatically reduces the resource requirements. Algorithm development in quantum computing has lagged hardware development, and advances on the algorithm side will further speed up results. Quantum algorithms are very different to classical algorithms and require a different style of thinking; it is the combination of innovative algorithm development and hardware advances that is likely to make solutions available sooner.
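The shape of the problem, a few inputs, an enormous sampled path space, one number out, is easy to see in even the simplest classical pricer. This sketch prices a vanilla European call under geometric Brownian motion (all parameters are illustrative, and it is deliberately much simpler than the TARF pricing in the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Small input: a handful of pricing parameters...
spot, strike, rate, vol, maturity = 100.0, 105.0, 0.02, 0.2, 1.0

# ...an exponentially large path space, sampled here by Monte Carlo...
n_paths = 1_000_000
z = rng.standard_normal(n_paths)
terminal = spot * np.exp((rate - 0.5 * vol**2) * maturity
                         + vol * np.sqrt(maturity) * z)
payoff = np.maximum(terminal - strike, 0.0)

# ...and a small output: one discounted price.
price = np.exp(-rate * maturity) * payoff.mean()
print(f"European call price ~ {price:.4f}")
```

Path-dependent products such as TARFs multiply the path space further while the output stays a single price, which is why this ‘big in the middle, small at the ends’ structure maps so naturally onto quantum state spaces.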

Regulatory Focus on Risk Management

For all technology deployments in the finance sector, regulation is a core consideration. Basel III lays down strict rules on how much capital an institution must hold to cover risk, raising minimum requirements from a 2% common equity ratio under Basel II to an effective 10.5% total ratio, including buffers, under Basel III[6]. With Basel IV around the corner, this focus makes risk management critically important: better risk management can reduce the required size of capital buffers for market risk.

For example, to assess the risk of a portfolio you might use Value at Risk (VaR) models. Calculating VaR becomes computationally intensive as the number of asset classes increases, and VaR is an imperfect measure of risk, especially in times of changing volatility. When VaR models are back-tested against portfolio returns, if the actual risk is greater than the model predicts (a breach), allowances have to be made for the inaccuracy of the model. These allowances typically mean extra capital buffers or reduced position sizes, and each is a poor use of capital. In unpredictable markets, attempts to create suitable models are stymied by inadequate computing power, by models oversimplified to make them computable, and by models not run often enough to be relevant. Some institutions have used machine learning to help optimise model parameters; however, models used for risk-weighting assets must be approved by the regulator, and although quantum algorithms are complex they are still mathematically provable, which machine learning models may not be.
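A sketch of the back-testing loop shows why breaches are so costly. Here a fixed 99% VaR prediction is tested against 250 days of simulated returns; under Basel-style back-testing, materially more breaches than expected triggers capital add-ons (all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

# A 99% one-day VaR model predicts a loss threshold; a "breach" is a day
# whose realised loss exceeds it. For N(0, 2%) returns that threshold is
# roughly 2.326 * 2% ~ 4.65%.
days = 250
model_var = 0.0465
realised = rng.normal(0.0, 0.02, days)          # realised daily returns
breaches = int(np.sum(-realised > model_var))

print(f"breaches in {days} days: {breaches} (expected ~{0.01 * days:.1f})")
```

At 99% confidence you expect about 2.5 breaches per 250 trading days; if actual volatility is higher than the model assumes, the breach count climbs quickly, and with it the capital penalty.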

A hacker

The Risk of Quantum Computing to Encryption

When financial institutions consider risk it’s not only market risk that matters, but also operational and external risks, and one such risk is cyber security. Quantum computing might never have attracted its current level of interest and funding were it not for one man: Peter Shor. In 1994 Shor published a paper[7] showing how to find the prime factors of an integer in polynomial time. This matters because the security of the internet relies on RSA encryption. RSA uses a key generated from two prime numbers and rests on the assumption that factoring long keys would take so long as to be practically impossible; it would take the fastest classical supercomputer millions of years to find the key to an RSA-encrypted message. The difficulty is asymmetric: it’s easy to multiply two large prime numbers but very hard to go the other way, i.e. to recover the two original primes from their product. Shor’s paper overturned the assumption that RSA was uncrackable: given a large enough quantum computer, it could take as little as 10 seconds to crack a bitcoin key[8].
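The asymmetry is easy to demonstrate. Multiplying two primes is instant; recovering them by naive trial division already takes visible effort for small numbers, and the work grows so quickly with key length that 2048-bit RSA moduli are hopeless for classical brute force (the primes below are tiny and purely illustrative):

```python
import time

p, q = 1_000_003, 1_000_033   # two small primes (real RSA uses ~1024-bit primes)
n = p * q                     # multiplying: effectively instant

def factor(n):
    """Naive trial division of an odd n -- hopeless for a real RSA modulus."""
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    return None

start = time.perf_counter()
print(factor(n), f"found in {time.perf_counter() - start:.3f}s")
# Each extra digit in p and q multiplies the trial-division work; Shor's
# algorithm instead scales polynomially in the number of bits.
```

Shor’s result does not make multiplication harder, it makes the reverse direction tractable, which is precisely what RSA’s security assumption cannot survive.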

As well as breaking the security of online transactions and communications, there is the fear that quantum computers could crack Bitcoin and other cryptocurrencies, since the elliptic-curve cryptography they rely on is also vulnerable to Shor’s algorithm. In practice this is extremely unlikely to be a risk to either cryptocurrencies or the web in the short term: a team at Sussex University estimated that it would take a machine of roughly 300 million qubits an hour to break Bitcoin’s encryption[9]. Some researchers have quoted lower numbers[10], but with the highest qubit count today standing at 127, on IBM’s machine, we are not there yet.

However, the risk is very real. The pace of quantum hardware development is increasing, and hardware capable of breaking encryption may be ready sooner than we think; the real danger is complacency and being caught unprepared. The American standards body NIST is currently evaluating quantum-resistant encryption algorithms, called Post-Quantum Cryptography[11], a recognition that we have to act now. Even once Post-Quantum Cryptography is standardised, the sheer undertaking of rolling it out across the world’s internet infrastructure is of an enormous scale, which is the main reason for starting now. The risks associated with breaking RSA encryption are far-reaching. It’s unlikely cybercriminals would have the resources to deploy quantum computing; the biggest threat is to national security. Most government secrets are meant to stay secure for 30-50 years, and the roadmap for capable quantum computing sits well within that timescale. Large aggressive nation-states may elect to deploy their resources for espionage and even cyber terrorism; the UK’s GCHQ director has recently sounded warnings over China’s threat to cyber security[12]. Digital assets and cryptocurrencies could be destroyed by a large enough nation-state should it choose to do so. And just because the hardware is not here now doesn’t mean an attacker has to wait: “hack now, decrypt later” refers to the practice of exfiltrating and storing encrypted information from sensitive targets such as defence, infrastructure, pharma and finance until such time as it can be decrypted, and it is believed such attacks are happening now. The US Department of Homeland Security recently issued guidelines for organisations to mitigate their security risks ahead of the advancement of quantum computing[13].

In the finance sector we quantify risk in terms of likelihood and impact. A quantum computing-based security breach may be low likelihood, but its impact could be catastrophic, and the combination of ‘low’ and ‘catastrophic’ makes it a very real risk: organisations must put mitigations in place now through training and upskilling in quantum computing.

Where Next for Quantum Computing in Finance?

It’s still early days for both quantum computing and its use in finance. The very largest institutions exploring the technology are asking the following questions: how could quantum computing supercharge our existing computational workloads? How do I build it into my development workflow? How do I train my teams to work with it? What research and experimentation frameworks can I use to make informed decisions and design quantum algorithms? What do I need to know about Post-Quantum Cryptography? Being able to answer these questions forms part of what is called “Quantum Readiness”: understanding the technology now so as to be ready when it’s mainstream.

Quantum computing may not be quite ready for large-scale deployment, but the speed of development means it’s likely to be ready a lot sooner than people expect. Hartmut Neven[14], Director of the Google Quantum AI Lab, summarised it very nicely: “It looks like nothing is happening, nothing is happening, and then whoops suddenly you’re in a different world.” Neven’s law is the observation that quantum computers are gaining power at a double exponential rate. Unlike Moore’s law, where processing power doubles roughly every two years, double exponential growth means early improvements seem slight, but later ones are dramatic.

We should not wait for that growth. Machines are ready now for experimentation and for developing techniques, algorithms, IP and an understanding of application use cases. It’s possible to experiment with how quantum meshes into other technologies and into the organisation. Quantum computing will not operate in a vacuum; it will be integrated with services operating together in the cloud, part of the future evolution of a compute platform with a mesh of quantum computers, classical computers and GPUs, more powerful together than separately.
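The difference between Moore-style exponential growth and the double exponential growth of Neven’s law is stark even over a handful of steps, as a quick calculation shows (illustrative arithmetic only, not a forecast):

```python
# Exponential (Moore-style) vs double-exponential (Neven-style) growth:
for k in range(1, 6):
    print(f"step {k}: 2^{k} = {2**k:>3}   vs   2^(2^{k}) = {2**(2**k):,}")
```

After five steps the exponential curve has reached 32 while the double exponential has passed four billion, which is why progress that “looks like nothing is happening” can tip into a different world so abruptly.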

The Challenges of Making it a Reality

Putting quantum solutions into production alongside existing systems, and managing them over the long term, are still unsolved problems that organisations typically have no experience of. It’s useful to learn from how machine learning went from hype to mainstream: ML applications were hampered by the problems of keeping apps running, retraining, deployment, versioning and monitoring, and DevOps for machine learning (MLOps) is a new field born of the need to productionise them. The same needs to emerge for quantum computing (QCOps?) as we gain experience in incorporating it into our workflows. How do you develop, test, version-control and deploy quantum algorithms? How do you even do a ‘diff’ on two versions of a quantum circuit?
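One naive answer to the circuit-diff question is to serialise circuits to an ordered textual gate list and reuse standard diff tooling. The sketch below does exactly that (the gate-list format is invented for illustration and belongs to no particular SDK); it also shows why this is only a partial answer:

```python
import difflib

# Represent each circuit version as an ordered list of gate instructions
# (an illustrative format, not any particular SDK's serialisation).
circuit_v1 = ["h q0", "cx q0 q1", "rz(0.50) q1", "measure q0 q1"]
circuit_v2 = ["h q0", "cx q0 q1", "rz(0.75) q1", "measure q0 q1"]

for line in difflib.unified_diff(circuit_v1, circuit_v2,
                                 fromfile="v1", tofile="v2", lineterm=""):
    print(line)
# Output flags the changed rotation: -rz(0.50) q1 / +rz(0.75) q1
```

A textual diff catches a changed rotation angle, but two circuits that are semantically identical with reordered commuting gates would still show as different, which is where quantum-specific tooling will have to go beyond classical DevOps practice.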

Quantum algorithm development is still in its infancy and lags the hardware innovations. Most algorithm development remains very low level, almost the equivalent of programming classical computers in binary. The ability to do more comes with greater levels of abstraction: classical algorithms and applications sit on layers of services and abstractions built over decades, and there are no equivalents yet in the quantum world. Even simple debugging is significantly harder when you can’t step through an algorithm and the results are probabilities. Improving this development stack is critical to building more sophisticated quantum algorithms.

Looking Forward

Despite the challenges, these are exciting times. Early adopters can see the potential of quantum computing: they are building IP, training, learning the development stack, designing algorithms and understanding the likely effect on their business. These firms see the change coming and don’t want to be left behind, which is why companies like JP Morgan, Goldman Sachs, Rolls-Royce, BMW and VW are investing now. They are visionaries with a long-term view. There is no doubt the field is evolving at a rapid pace and breakthroughs will come sooner than we expect, which is why organisations have to start looking at quantum computing now. Quantum computing could revolutionise the search for solutions to some of the world’s most difficult problems; it truly is the space race of our generation.

To find out more about how we can help start your journey in exploring quantum computing in finance then contact us now.


[1] https://www.mckinsey.com/featured-insights/the-rise-of-quantum-computing

[2] https://en.wikipedia.org/wiki/Noisy_intermediate-scale_quantum_era

[3] https://www.dtcc.com/annuals/2021/performance/dashboard

[4] https://www.nature.com/articles/s41534-019-0130-6

[5] https://arxiv.org/pdf/2012.03819.pdf

[6] https://www.investopedia.com/ask/answers/062515/what-minimum-capital-adequacy-ratio-must-be-attained-under-basel-iii.asp

[7] https://ieeexplore.ieee.org/document/365700

[8] https://www.quintessencelabs.com/blog/breaking-rsa-encryption-update-state-art/

[9] https://physicsworld.com/a/bitcoin-encryption-is-safe-from-quantum-computers-for-now/

[10] https://arxiv.org/abs/1905.09749

[11] https://csrc.nist.gov/Projects/post-quantum-cryptography

[12] https://www.standard.co.uk/news/uk/gchq-china-cyber-attack-threat-warning-jeremy-fleming-b931342.html

[13] https://www.dhs.gov/quantum

[14] https://research.google/people/HartmutNeven/


