This work could finally take analog neural networks to the next level.
The human brain is a true marvel of evolution and natural selection whose immense complexity still largely escapes us. But this does not prevent researchers from working on techniques based on comparable principles; for example, everyone has heard of neural networks.
When we think of these systems, which are now central to artificial intelligence, we think above all of algorithms. Yet the definition of a neural network is strictly functional: whatever form it takes, what matters is how it processes information.
An analog neural network
Very briefly, this term designates a whole host of systems whose functioning is directly inspired by the cerebral architecture, with its logical sub-units — the neurons — connected by synapses.
This means that these neural networks can absolutely exist physically. Without going into detail, they then operate on an analog basis, as opposed to a digital one; we then speak of an analog neural network (ANN). This approach is still in its infancy, but many studies have already suggested that it could be a very interesting avenue for AI research.
On paper, these ANNs have considerable advantages. Since they are based on a physical structure, not an exclusively virtual one, they are vastly more energy efficient. They can also be trained and run at speeds far beyond those of digital networks. The information is processed there in parallel; this means the number of logical sub-units can be increased without slowing down the calculations, since the information does not have to shuttle constantly between memory and processor.
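To see why analog hardware computes in parallel, consider the crossbar array often used to illustrate analog AI accelerators (an illustrative sketch on my part, not the specific device described in the study). Each cross-point resistor stores a weight as a conductance; by Ohm's law and Kirchhoff's current law, applying input voltages makes every row wire sum its currents at once, so an entire matrix-vector product happens in a single physical step rather than through repeated memory fetches:

```python
import numpy as np

# Toy model of an analog crossbar array (illustrative assumption).
# G[i][j] is the conductance of the resistor at row i, column j: it
# plays the role of one synaptic weight.
G = np.array([[0.1, 0.2],
              [0.3, 0.4]])  # conductances, in siemens

# Input voltages applied simultaneously to the columns.
V = np.array([1.0, 0.5])

# Each row wire physically sums the currents I_i = sum_j G[i][j] * V[j]
# (Ohm's law per resistor, Kirchhoff's law per wire). In the physical
# device this is one parallel step; here we emulate it with a matmul.
I = G @ V
print(I)  # → [0.2 0.5]
```

The point of the sketch is that the "computation" is just physics: adding more rows or columns adds more wires, not more sequential instructions, which is why analog networks avoid the memory-processor bottleneck mentioned above.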
This research paper also explains that this concept is much more practical when it comes to designing an interface between AI and the real world, which is very important for the concrete applications of this technology.
The problem is that these advances still depend on the exploration of new materials and production techniques; to compete with the algorithms, it is indeed necessary to design an extremely fast system.
And just recently, the team of Murat Onen, a researcher at the prestigious MIT, presented work that promises to take this concept to the next level – literally, since they have designed an artificial synapse a million times faster than those of the human brain.
To achieve this, they relied on an inorganic material with very interesting properties: phosphosilicate glass (PSG). It is a silicon-based material studded with tiny pores, and it is the same kind of material found in the desiccant bags packed with new furniture to absorb moisture. But the researchers exploited another of its properties, namely its ability to withstand very strong electric fields without flinching.
The researchers were therefore able to use it to produce an ultra-efficient artificial synapse in which protons can travel at astonishing speed. "Normally, we couldn't apply such large fields to a device or we would burn it to ashes," says Onen. "But instead, the protons ended up traveling at immense speeds, without damaging anything," and at room temperature. "It's almost teleportation," he marvels.
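The behavior of such a programmable resistor can be caricatured in a few lines of code. This is a hedged toy model of my own, not the paper's physics: it simply assumes that each voltage pulse pushes protons into (or pulls them out of) the channel, nudging the conductance, i.e. the stored weight, up or down within fixed bounds:

```python
# Toy model (illustrative assumption, not the device's actual physics):
# a programmable protonic synapse whose conductance shifts by a small
# step with each programming pulse.
class ProtonicSynapse:
    def __init__(self, g_min=0.0, g_max=1.0, step=0.05):
        self.g_min, self.g_max, self.step = g_min, g_max, step
        self.g = g_min  # conductance = the stored weight

    def pulse(self, polarity):
        """Apply one programming pulse: +1 inserts protons, -1 removes them."""
        self.g = min(self.g_max, max(self.g_min, self.g + polarity * self.step))
        return self.g

syn = ProtonicSynapse()
for _ in range(5):
    syn.pulse(+1)   # five potentiating pulses: 0.0 -> 0.25
syn.pulse(-1)       # one depressing pulse: 0.25 -> 0.20
print(round(syn.g, 2))  # → 0.2
```

The speed claim in the article corresponds, in this caricature, to how quickly one `pulse` can be applied: the faster protons move through the PSG, the faster the weight can be updated during training.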
A decisive proof of concept for the AI of the future?
This progress allowed them to build an extremely fast analog neural network, capable of operating "at reasonable voltages". This is the very first time that a team has managed to demonstrate the practical feasibility of such a high-performance ANN. And this study could therefore prove to be important for the future of AI; in any case, it has the potential to open the door to a whole new generation of physical devices specifically designed for these applications.
"This work has brought these devices to a point where they are starting to show great promise for future applications," says Jesús A. del Alamo, one of the authors of the study. William Chueh, a professor of materials science at Stanford University, shares this view, even though he did not take part in the work: "This lays the foundation for a new class of devices to power deep learning algorithms," he says.
"With an analog processor, we will train neural networks of unprecedented complexity that no one can afford to train today," Onen concludes. A real potential paradigm shift, then. He considers this an absolutely major step forward, a bit like going from a car to a spaceship.
All that remains now is to refine the concept and move toward industrial-scale applications, which could in turn contribute to the emergence of a whole new generation of analog computers.
The text of the study is available here.