DeepMind AI discovers new calculation method to speed up computers

An artificial intelligence (AI) has succeeded in creating a new matrix multiplication algorithm, more efficient than the ones currently in use. The discovery may seem a little obscure to those who have never grappled with the mysteries of matrix calculation. However, it could have a real impact, particularly in the field of computing: some calculations could see their speed increase by as much as 20%, according to DeepMind researchers.

The “researcher” behind this discovery is none other than an artificial intelligence; in other words, a machine learning system developed by the company DeepMind. Its nickname: AlphaTensor. For two years now, the team of scientists leading the project has had it working on a very specific task.

The AI has indeed attempted to outclass human intelligence in the field of matrix calculation. The objective: find a way to perform this type of multiplication using as few operations as possible. “We focus on the fundamental task of matrix multiplication and use deep reinforcement learning (DRL) to search for matrix multiplication algorithms of proven correctness and efficiency,” the scientists explain in an article published in Nature.

Matrix multiplication consists of multiplying “matrices” together; in other words, tables of numbers. Roughly speaking, it is a matter of multiplying two grids of numbers with each other. Why on earth would anyone put themselves through this? Quite simply because it is very useful in a large number of areas.
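To make this concrete, here is a minimal Python sketch (ours, not from the paper) of the ordinary “schoolbook” method applied to two 2×2 grids. Counting the products shows why the operation is costly: eight multiplications, even for matrices this small.

```python
# Schoolbook multiplication of two 2x2 matrices: each entry of the result
# pairs a row of A with a column of B, multiplied term by term.
A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

C = [[0, 0], [0, 0]]
for i in range(2):          # row of A
    for j in range(2):      # column of B
        for k in range(2):  # walk along the row/column pair
            C[i][j] += A[i][k] * B[k][j]

print(C)  # [[19, 22], [43, 50]] -- eight multiplications in total
```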

Indeed, matrix calculation is a fundamental computing task, used to some extent by almost all software. In certain fields, such as graphics, artificial intelligence (neural networks) and scientific simulation, it is even used on a very large scale. In these areas, even a slight improvement can translate into a significant performance gain. It can also make machines consume less energy, since less computing power is needed for the same result.

The first breakthrough in 50 years

According to the scientists, calculation speed could be increased by some 20% on certain devices thanks to this new method. The gain would not necessarily apply directly to all hardware, such as computers or smartphones. But if this advance is making so much noise, it is also because it is the first in almost 50 years in the field of matrix multiplication. For a very long time, the number of operations needed to multiply matrices was simply dictated by the number of elements being multiplied.

In 1969, the mathematician Volker Strassen brought a breakthrough in this area. He proved that multiplying one 2×2 matrix (two rows of two numbers) by another of the same size did not necessarily involve eight multiplications: he managed to reduce the operation to seven. This approach is called Strassen’s algorithm. Other advances have been made since, but they were not applicable in practical settings.
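For the curious, Strassen’s seven products can be written out in a few lines. The following Python sketch is a standard textbook rendering of his identities, not code from DeepMind’s paper:

```python
def strassen_2x2(A, B):
    """Strassen's 1969 trick: multiply two 2x2 matrices with 7
    multiplications instead of 8, at the cost of extra additions."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4,           m1 - m2 + m3 + m6]]

# Same inputs as the schoolbook example above, same result, one product fewer.
print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```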

The success of AlphaTensor is therefore causing a stir in the field. To achieve it, the AI was put to work without any prior knowledge of the solutions currently in use. It was asked to create an algorithm that would complete this multiplication task in a minimum of steps. AlphaTensor ended up finding an algorithm that can multiply two 4×4 matrices using 47 multiplications.

A better performance, therefore, than the 49 multiplications that the Strassen method had allowed until now. Other techniques were found for matrices of different sizes: 70 in total. A small number compared with all the algorithms found by the AI: AlphaTensor came up with some 14,000 different calculation methods for 4×4 matrices alone, but few of them surpassed the current methods.
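Where does the figure of 49 come from? Strassen’s 2×2 trick can be applied recursively: a 4×4 product is treated as a 2×2 product of 2×2 blocks, so each of the seven block products itself costs seven multiplications, for 7 × 7 = 49 in total. A toy helper (hypothetical, for illustration only) makes the tally explicit:

```python
# Toy tally of scalar multiplications used by recursive Strassen on an
# n x n matrix (n a power of two): each level turns 8 block products into 7.
def strassen_mult_count(n: int) -> int:
    return 7 ** (n.bit_length() - 1)  # 7 ** log2(n)

print(strassen_mult_count(2))  # 7  -> Strassen's original 2x2 result
print(strassen_mult_count(4))  # 49 -> recursive Strassen on 4x4
# AlphaTensor's new algorithm reportedly needs only 47 for the 4x4 case.
```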

The problem? Scientists don’t quite understand how it all works. “We don’t really know why the system came up with this,” explains Hussein Fawzi, of DeepMind, in an article in New Scientist. “Why is this the best way of multiplying matrices? It is unclear. Somehow, the neural networks get an intuition of what looks good and what looks bad. Honestly, I can’t tell you exactly how that works. I think there is some theoretical work to be done there, on exactly how deep learning manages to do these kinds of things,” he concludes.

For Oded Lachish, a researcher at Birkbeck, University of London, also interviewed by the magazine, the AI’s discovery is in any case promising, and could herald other similar advances. “I think we’ll see AI-generated results for other problems of a similar nature, though rarely something as central as matrix multiplication. There is significant motivation for such technology, because fewer operations in an algorithm does not just mean faster results, it also means less energy expended,” he points out.

Source: Nature
