An artist's rendering of a neural network.

Energy-Saving, Physics-Driven AI Set to Outpace Traditional Neural Networks

A new age of computing is dawning, one in which physics itself plays a vital role in creating more efficient AI.

Artificial intelligence (AI) has transformed our digital world, but it comes at a hefty energy cost. The search for more energy-efficient AI has led scientists at the Max Planck Institute to a concept that could reshape the future of the field.

AI systems like GPT-3, while revolutionary, consume vast amounts of energy. OpenAI, the company behind ChatGPT, hasn’t shared exactly how much energy went into training GPT-3, but estimates put it on the order of the yearly consumption of 200 sizeable German households. And while GPT-3 can predict whether “deep” is more likely to be followed by “sea” or “learning,” it struggles to grasp the deeper meaning behind the words.
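
To make that “sea or learning” example concrete, here is a minimal sketch of next-token prediction, the task models like GPT-3 are trained on. The logits below are invented purely for illustration; a real model derives them from billions of parameters, but the final step, a softmax over candidate tokens, works like this:

```python
import math

# Toy next-token prediction after the prompt "deep".
# These raw scores (logits) are invented for illustration; a real
# model like GPT-3 computes them from billions of parameters.
logits = {"learning": 3.4, "sea": 2.1, "dish": 0.2}

# Softmax: turn raw scores into a probability distribution.
total = sum(math.exp(v) for v in logits.values())
probs = {token: math.exp(v) / total for token, v in logits.items()}

for token, p in sorted(probs.items(), key=lambda kv: -kv[1]):
    print(f"P('{token}' | 'deep') = {p:.2f}")
```

The model only learns which continuation is statistically likely; nothing in this calculation touches what “deep” actually means, which is the limitation the article alludes to.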

Neuromorphic Computing: The New Wave

While current AIs run on digital computers, researchers are exploring neuromorphic computing as a way to lower energy use. Despite the name, it is not simply another artificial neural network: today’s AI systems imitate brain processes in software but run on conventional digital computers, which keep memory and processor separate. As Florian Marquardt, director at the Max Planck Institute for the Science of Light, points out, shuttling data back and forth between those two components consumes a significant portion of the energy.
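
Marquardt’s point can be put into rough numbers. In the sketch below, the per-operation energies are order-of-magnitude assumptions of the kind often quoted for conventional chips (an off-chip memory access can cost hundreds of times more energy than an arithmetic operation); the exact values are illustrative, but the conclusion that data movement dominates is the one he is making:

```python
# Assumed order-of-magnitude energies in picojoules (pJ) for a
# conventional digital chip; illustrative values only.
ENERGY_MAC_PJ = 1.0          # one multiply-accumulate in the processor
ENERGY_DRAM_READ_PJ = 500.0  # fetching one weight from off-chip memory

def pass_energy_joules(num_weights: int) -> tuple[float, float]:
    """Energy of one pass over a model whose weights live in DRAM."""
    compute = num_weights * ENERGY_MAC_PJ / 1e12        # pJ -> J
    movement = num_weights * ENERGY_DRAM_READ_PJ / 1e12
    return compute, movement

# GPT-3 has roughly 175 billion parameters.
compute, movement = pass_energy_joules(175_000_000_000)
print(f"arithmetic:    {compute:7.1f} J per pass")
print(f"data movement: {movement:7.1f} J per pass")
```

Whatever the precise figures, the ratio is the story: most of the energy goes into moving data, which is exactly the cost a neuromorphic design tries to eliminate by merging memory and computation.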

Our brains process information in parallel, avoiding the step-by-step sequential processing of modern computers, and instead of separate memory and processors, our synapses combine the two functions. Hardware as energy-hungry and heat-prone as today’s computers would not have survived evolution. Hence the interest in neuromorphic systems, which can use light-based components that act as both switch and memory.

The Self-Learning Physical Machine

Marquardt, in collaboration with Víctor López-Pastor, proposes a concept that could change the game: a self-learning physical machine. Here the training itself runs as a physical process inside the machine, so the system optimizes its own parameters without external feedback, saving time as well as energy. Marquardt emphasizes that the exact physical process need not be known in detail, but to work effectively it must be reversible, able to run backward as well as forward with minimal losses, and it must be non-linear.
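
The article does not spell out the mathematics, so the following is only a toy illustration of what “reversible and non-linear” can mean together. An additive coupling map (a standard construction from invertible neural networks, not the authors’ scheme) is non-linear yet can be run exactly backward:

```python
import numpy as np

def forward(x1, x2, theta):
    """A non-linear but exactly invertible (reversible) map."""
    y1 = x1
    y2 = x2 + np.tanh(theta * x1)  # non-linear coupling of the two halves
    return y1, y2

def backward(y1, y2, theta):
    """The same map run in reverse, recovering the inputs exactly."""
    x1 = y1
    x2 = y2 - np.tanh(theta * y1)
    return x1, x2

x1, x2, theta = np.array([0.3, -1.2]), np.array([0.7, 0.5]), 1.5
y1, y2 = forward(x1, x2, theta)
rx1, rx2 = backward(y1, y2, theta)
print(np.allclose(x1, rx1) and np.allclose(x2, rx2))  # True: fully reversible
```

Roughly speaking, that kind of reversibility is what would let a training signal retrace the forward computation through the same physical hardware, rather than being computed on a separate digital machine.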

Only non-linear processes are suitable, because only they can carry out the complex input-to-output transformations learning requires. A simple analogy: a solitary pinball rolling across a plate follows linear dynamics, but once it can collide with another pinball, the dynamics turn non-linear.
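
The pinball picture corresponds to a precise mathematical test: a process f is linear exactly when it obeys superposition, f(x + y) = f(x) + f(y). A minimal sketch with invented maps makes the distinction concrete:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))

linear = lambda x: A @ x             # lone pinball: straight-line dynamics
nonlinear = lambda x: A @ x + x**2   # a "collision" term breaks linearity

x, y = rng.normal(size=3), rng.normal(size=3)

# Superposition holds for the linear map...
print(np.allclose(linear(x + y), linear(x) + linear(y)))            # True
# ...and fails once the non-linear term is present.
print(np.allclose(nonlinear(x + y), nonlinear(x) + nonlinear(y)))   # False
```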

López-Pastor and Marquardt are now working on an optical neuromorphic computer that processes information with superimposed light waves. They aim to have the first physical self-learning machine running within three years; it could handle more data and offer more synapses than today’s neural networks.

With the ever-growing demand for more powerful neural networks, the need for energy-efficient alternatives will intensify. Marquardt concludes, “Self-learning physical machines offer a promising avenue in advancing AI.”

Written by Ivan Petricevic

I've been writing passionately about ancient civilizations, history, alien life, and various other subjects for more than eight years. You may have seen me appear on Discovery Channel's What On Earth series, History Channel's Ancient Aliens, and Gaia's Ancient Civilizations among others.
