How a quantum-inspired model is motivating new methods in finance and machine learning.

Quantum theory involves counter-intuitive concepts, such as superposition and entanglement. But things get even harder when one has to describe systems of many particles. Over the last 30 years, quantum physicists have developed methods to deal with the complexity of the quantum world. That theoretical machinery is now finding exciting new practical applications.

A quantum system is described by a mathematical object known as the wavefunction, which lets us calculate the probability of measuring each particle in a given state. Imagine that you have a quantum computer with 300 qubits; this is the size that some quantum hardware companies are planning to deliver in 2022. Each qubit can be in a ‘0’ or ‘1’ state. In quantum theory, you need to describe the probability of measuring every possible combination of 0’s and 1’s. It turns out that, with 300 qubits, the number of combinations (2^300) is larger than the estimated number of particles in the known Universe!
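To get a feel for these numbers, here is a quick back-of-the-envelope check (the figure of roughly 10^80 particles in the observable Universe is a common order-of-magnitude estimate, not a precise value):

```python
# Number of basis states of n qubits grows as 2**n.
n_qubits = 300
n_states = 2 ** n_qubits

# Common order-of-magnitude estimate of the number of particles
# in the observable Universe.
particles_in_universe = 10 ** 80

print(n_states > particles_in_universe)  # True
print(len(str(n_states)))                # 2**300 has 91 decimal digits
```

Python's arbitrary-precision integers make this comparison exact, with no overflow.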

In essence, the description of even relatively modest quantum systems is numerically challenging: it is subject to the infamous curse of dimensionality. This is a real pain in research fields like Material Physics and Quantum Computation. No wonder physicists have worked hard on methods to handle complex probability distributions, even in an approximate manner. One popular approach is to assume that particles do not talk to each other; in other words, that they are described by independent probability distributions. This simplifies the problem, but it has obvious limitations. For example, it cannot account for the effect of quantum gates acting between qubits.
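As a sketch of what the independence assumption buys us, the toy snippet below (names and numbers are our own illustration) stores one probability per bit instead of a full joint table, reducing storage from O(2^n) to O(n):

```python
import numpy as np

rng = np.random.default_rng(0)

# Independence assumption: each of n binary variables gets its own
# probability p1[i] of being '1'. Storage is O(n) instead of the
# O(2**n) needed for the full joint distribution.
n = 10
p1 = rng.random(n)

def prob(bits, p1):
    """Probability of a bit string under the independence assumption."""
    p = 1.0
    for b, q in zip(bits, p1):
        p *= q if b == 1 else (1.0 - q)
    return p

# Sanity check: probabilities over all 2**n bit strings sum to one.
total = sum(prob([(x >> i) & 1 for i in range(n)], p1)
            for x in range(2 ** n))
print(round(total, 10))  # 1.0
```

The factorized form is cheap, but by construction it cannot encode any correlation between the variables, which is exactly the limitation mentioned above.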

Let us instead allow for some interactions, but in a controlled way. This can be done by replacing the product of independent probability distributions with a product of matrices.

This product of matrices is known as a Matrix Product State. The immense complexity of the quantum state is now reduced to the knowledge of a few matrices. The trick works as long as particles, say qubits in a quantum computer, are only slightly correlated.
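A minimal numerical sketch of this idea (our own toy code, with an arbitrarily chosen bond dimension D) assigns each site two small matrices, one per bit value, and obtains a probability amplitude as a plain matrix product:

```python
import numpy as np

rng = np.random.default_rng(42)

# Matrix Product State over n binary sites: for each site i and each
# physical value s in {0, 1} we store a small matrix A[i][s]. Boundary
# sites use row/column vectors so the product contracts to a scalar.
n, D = 6, 3
A = [[rng.normal(size=(1 if i == 0 else D,
                       1 if i == n - 1 else D)) for s in (0, 1)]
     for i in range(n)]

def amplitude(bits):
    """Unnormalized amplitude of a bit string: a product of matrices."""
    M = A[0][bits[0]]
    for i in range(1, n):
        M = M @ A[i][bits[i]]
    return M[0, 0]

# Born rule: probabilities are squared amplitudes, normalized over
# all 2**n configurations (feasible here because n is tiny).
Z = sum(amplitude([(x >> i) & 1 for i in range(n)]) ** 2
        for x in range(2 ** n))
p_all_zeros = amplitude([0] * n) ** 2 / Z
print(0.0 <= p_all_zeros <= 1.0)  # True
```

The state is now specified by roughly n · 2 · D² numbers instead of 2^n, and correlations between neighbouring sites are captured through the shared matrix indices.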

The basic idea behind Matrix Product States was introduced in the 1990s and has been intensively developed in the field of Quantum Information Theory since the 2000s, leading to efficient methods for numerically describing exotic quantum materials and quantum computers.

But what about practical applications in the classical world? The approximation of complex probability distributions lies at the heart of many machine learning methods. Consider, for example, a collection of black-and-white (‘0’ and ‘1’) pictures defined on an N × N grid. Describing that collection involves a probability distribution over many binary variables, and the complexity of that distribution also grows exponentially with the number of pixels, just as in the quantum case. Recent works have shown that the matrix product approach can be used here too, with promising results ranging from anomaly detection to generative models.
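A quick parameter count makes the appeal concrete (the bond dimension D = 20 below is an illustrative choice, not a value from the works cited):

```python
# Distribution over N x N binary images: the full joint table has
# 2**(N*N) entries, while a Matrix Product State with bond dimension D
# needs only about N*N * 2 * D**2 parameters.
N, D = 8, 20
n_pixels = N * N
full_table_entries = 2 ** n_pixels       # 2**64: astronomically large
mps_parameters = n_pixels * 2 * D ** 2   # 51,200: perfectly manageable

print(full_table_entries > mps_parameters)   # True
print(full_table_entries // mps_parameters)  # compression factor
```

Even for a tiny 8 × 8 image, the exact table is already out of reach, while the matrix product representation fits comfortably in memory.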

Complex probability distributions are also very relevant in finance, for example in risk analysis and in the valuation of financial products. Here too, Matrix Product State methods can be applied to approximate those probability distributions, as shown by one of Inspiration-Q’s founders: https://arxiv.org/pdf/1909.06619.pdf

In the coming years we will witness more applications of Matrix Product States to real-life problems. This story of cross-fertilization between fundamental and applied fields follows a path similar to that of neural networks and Monte Carlo methods. It is part of our mission at Inspiration-Q to further develop these ideas and transform them into business solutions.