Innovation · Updated: 20 Jan 2020

How might quantum computing affect artificial intelligence?

The use of quantum algorithms in artificial intelligence techniques will boost machines’ learning abilities. This will lead to improvements in the development of, among others, prediction systems, including those used in the financial industry. However, we will have to wait before these improvements start being rolled out.


The processing power required to extract value from the unmanageable swaths of data currently being collected, and especially to apply artificial intelligence techniques such as machine learning, keeps increasing. Researchers have been trying to expedite these processes by applying quantum computing algorithms to artificial intelligence techniques, giving rise to a new discipline dubbed Quantum Machine Learning (QML).

“Quantum machine learning can be more efficient than classical machine learning, at least for certain models that are intrinsically hard to learn using conventional computers,” says Samuel Fernández Lorenzo, a quantum algorithm researcher who collaborates with BBVA’s New Digital Businesses area. “We still have to find out to what extent these models appear in practical applications.”

Machine learning and artificial intelligence are the two key areas of research in the application of quantum computing algorithms. One of the particularities of this computing paradigm is that it can represent several states at the same time, which is particularly convenient when applying AI techniques. For example, as Intel has noted, voice assistants could benefit greatly from this: quantum computing could exponentially improve their accuracy, boosting both their processing power and the amount of data they can handle. Quantum computing increases the number of calculation variables machines can juggle, and therefore allows them to provide faster answers, much as a person would.
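The idea of "representing several states at the same time" can be made concrete with a toy state-vector sketch: n qubits carry amplitudes for all 2ⁿ basis states simultaneously, whereas n classical bits hold just one. This is an illustrative NumPy sketch (the function name is our own, not from any quantum SDK):

```python
import numpy as np

def equal_superposition(n_qubits: int) -> np.ndarray:
    """State vector of n qubits after putting each into equal superposition
    (the effect of a Hadamard gate on every qubit)."""
    dim = 2 ** n_qubits
    return np.full(dim, 1.0 / np.sqrt(dim))

state = equal_superposition(3)       # 8 amplitudes tracked at once
probs = np.abs(state) ** 2           # measurement probabilities
assert np.isclose(probs.sum(), 1.0)  # a valid quantum state is normalized
```

Doubling the qubit count squares the number of tracked amplitudes, which is the source of the exponential capacity the article alludes to.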

More accurate algorithms

The ability to represent and handle so many states makes quantum computing extremely well suited to solving problems in a variety of fields. Intel has opened several lines of research on quantum algorithms. The first applications are expected in fields such as materials science, where modeling small molecules is a computationally intensive task. Going forward, larger machines will make it possible to design medicines or optimize logistics, for example by finding the most efficient route among any number of alternatives.

Currently, most industrial applications of artificial intelligence come from so-called supervised learning, used in tasks such as image recognition or consumption forecasting. “In this area, based on the different QML proposals that have already been set forth, it is likely that we’ll start seeing acceleration – which, in some cases, could be exponential – in some of the most popular algorithms in the field, such as ‘support vector machines’ and certain types of neural networks,” explains Fernández Lorenzo.
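For readers unfamiliar with the classical baseline, this is the kind of support vector machine workflow QML proposals aim to accelerate, sketched with scikit-learn on a synthetic dataset. Quantum variants typically replace the kernel evaluation, not the overall fit/predict workflow:

```python
# Classical SVM baseline on toy data (synthetic, for illustration only).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A "quantum kernel" proposal would slot in where the RBF kernel sits here.
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

The quantum speed-up claims concern computing kernel entries for data that is hard to separate classically; the surrounding training loop stays the same.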


A less-trodden path, but one that shows great promise, is the field of unsupervised learning. “Dimensionality reduction algorithms are a particular case. These algorithms are used to represent our original data in a more limited space, while preserving most of the properties of the original dataset.” On this point, the researcher notes that quantum computing will come in particularly handy when pinpointing certain global properties of a dataset, rather than specific details.

Finally, there is still a lot of work to be done in the area of reinforcement learning before it can be applied to solve specific practical issues in industry. Its potential to handle complex situations has been proven by its applications in video games. The most demanding task here, in terms of computing workload and time, is training the algorithm. “In this context,” says Fernández Lorenzo, “some theoretical proposals have already been laid out to accelerate this training using quantum computers, which may contribute to developing an extremely powerful artificial intelligence in the future.”
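To see why training is the expensive step, here is a minimal tabular Q-learning loop on a five-state corridor (the agent must learn to walk right to reach the goal). The environment and all names are illustrative; the point is the many episodes of trial and error that quantum proposals aim to speed up:

```python
import random

N_STATES, GOAL = 5, 4
ACTIONS = [-1, 1]                              # step left or step right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1          # learning rate, discount, exploration
random.seed(0)

for _ in range(500):                           # episodes of trial and error
    s = 0
    while s != GOAL:
        # epsilon-greedy action choice
        a = random.choice(ACTIONS) if random.random() < epsilon \
            else max(ACTIONS, key=lambda x: Q[(s, x)])
        s2 = min(max(s + a, 0), N_STATES - 1)  # move, clipped to the corridor
        r = 1.0 if s2 == GOAL else 0.0         # reward only at the goal
        # standard Q-learning update
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, x)] for x in ACTIONS) - Q[(s, a)])
        s = s2

# After training, the greedy policy walks right from every non-goal state.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(GOAL)}
```

Even this tiny problem needs hundreds of episodes; real video-game agents need millions, which is the training cost the quoted quantum proposals address.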

Applications in the banking sector

In the financial sector, the combination of AI with quantum computing may help improve fraud detection. On the one hand, models trained on a quantum computer could be capable of detecting patterns that are hard to spot using conventional equipment. On the other, the acceleration of these algorithms would yield great advantages in the volume of information the machines would be able to handle for this purpose.
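Fraud detection in this sense is often framed as anomaly detection: flag transactions whose pattern deviates from the bulk of the data. A classical scikit-learn sketch on made-up transaction features (amount, hour) shows the setup; the quantum angle in the article is training such models faster and on richer patterns:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=50, scale=10, size=(500, 2))  # typical transactions
fraud = np.array([[500.0, 3.0]])                      # one extreme outlier

model = IsolationForest(random_state=0).fit(normal)
flags = model.predict(np.vstack([normal[:5], fraud])) # -1 marks anomalies
```

The detector learns the shape of normal activity only; anything far outside it is scored as anomalous without ever having seen labeled fraud.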

Work is also being conducted on models that combine numerical calculations with expert advice to make final financial decisions. One of their main advantages is that these models “are easier to interpret than neural network algorithms, and therefore more likely to earn regulatory approval,” says BBVA’s New Digital Businesses researcher.

Also, one of the hottest trends in banking right now is offering customers tailored products and services through advanced recommendation systems. Several quantum models have already been proposed to enhance these systems’ performance. “It doesn’t seem far-fetched to think that the sector will be able to suggest, in the near future, investment strategies based on quantum-inspired algorithms,” says Fernández.
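The recommendation systems in question commonly rest on low-rank matrix factorization: compress a user-product rating matrix to a few "taste factors", then use the reconstruction to score products a user has not rated. A classical NumPy sketch with a made-up 4x4 rating matrix (0 means unrated); quantum recommendation proposals target exactly this kind of low-rank structure:

```python
import numpy as np

ratings = np.array([[5, 4, 0, 1],      # rows: users, columns: products
                    [4, 5, 1, 0],
                    [1, 0, 5, 4],
                    [0, 1, 4, 5]], dtype=float)

# Truncated SVD keeps only the k strongest taste factors.
U, s, Vt = np.linalg.svd(ratings, full_matrices=False)
k = 2
approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The low-rank reconstruction assigns a score even to unrated (0) entries:
predicted_user0_product2 = approx[0, 2]
```

Here the two factors roughly separate the two taste groups visible in the data, and the reconstruction interpolates scores for the missing entries.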

To get to this point, researchers are concentrating on how to leverage the capabilities of current quantum processors, exploring the connections between the recently announced quantum supremacy and machine learning. “Specifically, the quantum advantage here could lie in the possibility of building models that would be very hard to implement using conventional computers. The applicability of this type of model in real-life industry contexts has yet to be studied,” concludes the researcher.