Learning by light: the dynamics of a light wave used inside a physical self-learning machine. What matters is that its later evolution runs in reverse compared with its earlier, fully developed form (red). Credit: Florian Marquardt, MPL
Artificial intelligence not only delivers impressive performance but also creates significant energy demand. The more demanding the training task, the more energy it consumes.
Víctor López-Pastor and Florian Marquardt, two scientists at the Max Planck Institute for the Science of Light in Erlangen, Germany, present a method by which artificial intelligence could be trained much more efficiently. Their approach relies on physical processes instead of the digital artificial neural networks currently in use. The work was published in the journal Physical Review X.
Open AI, the company behind GPT-3, the artificial intelligence (AI) that makes ChatGPT an eloquent and apparently well-informed chatbot, has not revealed how much energy its training required. According to the German statistics company Statista, it would take 1,000 megawatt hours, roughly what 200 German households of three or more people consume in a year. While this energy expenditure allowed GPT-3 to learn whether the word "deep" is more likely to be followed by "sea" or "learning" in its datasets, by all accounts it has not understood the underlying meaning of such phrases.
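The "deep" followed by "sea" or "learning" example describes next-word prediction. A toy sketch of the idea (my own illustration; GPT-3 learns such probabilities with a neural network trained on vast data, not by simple counting) could look like this:

```python
from collections import Counter

# Tiny toy corpus; a real training set contains hundreds of billions of tokens.
corpus = ("the deep sea is vast . deep learning is popular . "
          "deep learning works . the deep sea").split()

# Count which word follows "deep", then turn counts into probabilities.
followers = Counter(b for a, b in zip(corpus, corpus[1:]) if a == "deep")
total = sum(followers.values())
probs = {word: count / total for word, count in followers.items()}
print(probs)  # -> {'sea': 0.5, 'learning': 0.5}
```

In this toy corpus, "deep" is followed by "sea" and "learning" equally often, so both continuations get probability 0.5; the counts capture word statistics without any notion of meaning, which mirrors the article's point.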
Neural networks on neuromorphic computers
To reduce the energy consumption of computers, and of AI applications in particular, several research groups have in recent years been working on an entirely new concept of how computers could process data in the future. This concept is called neuromorphic computing. Although the name may sound similar to artificial neural networks, it actually has little to do with them, because artificial neural networks run on conventional digital computers.
This means that the software, or more precisely the algorithm, is modeled on the brain's way of working, but digital computers serve as the hardware. They perform the computational steps of the neural network in sequence, one after the other, with processor and memory kept separate.
"The data transfer between these two components alone consumes huge amounts of energy when a neural network with hundreds of billions of parameters, i.e. synapses, is trained with up to one terabyte of data," says Florian Marquardt, director of the Max Planck Institute for the Science of Light and professor at the University of Erlangen.
The human brain is entirely different and would probably never have been evolutionarily competitive had it worked with an energy efficiency similar to that of computers with silicon transistors. It would most likely have failed due to overheating.
The brain is characterized by performing the many steps of a thought process in parallel rather than sequentially. Nerve cells, or more precisely the synapses, are both processor and memory. Various systems around the world are being considered as possible neuromorphic counterparts to our nerve cells, including photonic circuits that use light instead of electrons to perform calculations. Their components serve simultaneously as switches and memory cells.
Artificial intelligence as a combination of a pinball machine and an abacus: in this thought experiment, the blue, positively charged ball represents a piece of training data. The ball is launched from one side of the plate to the other. Credit: Florian Marquardt, MPL
A self-learning physical machine optimizes its synapses independently
Together with López-Pastor, a doctoral student at the Max Planck Institute for the Science of Light, Marquardt has now devised an efficient training method for neuromorphic computers. "We have developed the concept of a self-learning physical machine," explains Florian Marquardt. "The core idea is to carry out the training as a physical process, in which the machine's parameters are optimized by the process itself."
When training conventional artificial neural networks, external feedback is necessary to adjust the strengths of the many billions of synaptic connections. "Not requiring this feedback makes the training much more efficient," says Marquardt. Implementing and training artificial intelligence on a self-learning physical machine would save not only energy but also computing time.
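The external feedback the article refers to can be made concrete with a minimal sketch of conventional gradient-descent training (my own illustration, not the authors' method): a separate optimizer computes an update for each parameter and writes it back, which is exactly the step a self-learning physical machine would avoid.

```python
# Minimal external-feedback training loop (illustrative only):
# fit y = w * x by gradient descent. Each step, an "external"
# optimizer computes the gradient and writes the update back
# into the parameter w.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # targets follow y = 2x
w = 0.0      # single trainable parameter ("synapse")
lr = 0.05    # learning rate

for _ in range(200):
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # external feedback: update written into the parameter

print(round(w, 3))  # -> 2.0
```

A real network repeats this write-back for hundreds of billions of parameters every training step, which is why the data traffic between processor and memory dominates the energy cost.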
“Our method works regardless of what physical process takes place in the self-learning machine, and we don’t even need to know the exact process,” Marquardt explains. “However, the process must meet several conditions. Most importantly, it must be reversible, meaning it must be able to run forward or backward with minimal energy loss.”
"In addition, the physical process must be non-linear, that is, sufficiently complex," says Marquardt. Only non-linear processes can perform the complicated transformations between input data and results. A pinball rolling across a plate without colliding with other balls is a linear process; if it is deflected by another ball, the situation becomes non-linear.
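Both conditions, reversibility and non-linearity, can be illustrated with a toy simulation (my own sketch, unrelated to the optical experiment described here): an anharmonic oscillator, whose cubic restoring force is non-linear, evolved with a leapfrog integrator, which is time-reversible. Running the dynamics forward, flipping the momentum, and running the same dynamics again brings the system back to its starting state, like an echo retracing its path.

```python
# Toy reversible non-linear dynamics: an anharmonic oscillator
# (force -x**3 is non-linear) integrated with leapfrog, a scheme
# that is time-reversible up to floating-point rounding.
def leapfrog(x, p, steps, dt=0.01):
    for _ in range(steps):
        p += -x**3 * (dt / 2)   # half kick (non-linear force)
        x += p * dt             # drift
        p += -x**3 * (dt / 2)   # half kick
    return x, p

x0, p0 = 1.0, 0.0
x1, p1 = leapfrog(x0, p0, steps=1000)    # run the process forward
x2, p2 = leapfrog(x1, -p1, steps=1000)   # flip momentum, run again
print(abs(x2 - x0), abs(p2 + p0))        # both near zero: initial state recovered
```

The same dynamics that scrambles the input non-linearly on the way out unscrambles it exactly on the way back, with essentially no information, and in an idealized physical system no energy, lost in the round trip.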
Practical test on an optical neuromorphic computer
Examples of reversible non-linear processes can be found in optics. Indeed, López-Pastor and Marquardt are collaborating with an experimental group developing an optical neuromorphic computer. This machine processes information in the form of superimposed light waves, with suitable components regulating the type and strength of the interaction. The researchers' aim is to put the concept of the self-learning physical machine into practice.
"We hope to be able to present the first self-learning physical machine in three years," says Florian Marquardt. By then, there will be neural networks with many more synapses, trained on significantly larger amounts of data than today's.
As a result, there will be an even greater desire to deploy neural networks outside of conventional digital computers and replace them with efficiently trained neuromorphic computers. “We therefore believe that self-learning physical machines have a great chance of being used to further develop artificial intelligence,” the physicist said.
More information:
Víctor López-Pastor et al., Self-Learning Machines Based on Hamiltonian Echo Backpropagation, Physical Review X (2023). DOI: 10.1103/PhysRevX.13.031020
Journal information:
Physical Review X
Provided by the Max Planck Institute for the Physics of Light