
Sambandamurthy Ganapathy of the University at Buffalo Department of Physics leads a team developing neuromorphic computer chips that aim to mimic the complex structure of the human brain for the sake of energy efficiency. Credit: Douglas Levere / University at Buffalo
AI models are estimated to use upwards of 6,000 joules of energy to generate a single text response. By comparison, your brain needs only about 20 joules per second to keep you alive.
That’s why researchers at the University at Buffalo are taking inspiration from the human brain to develop computing architectures that can sustain artificial intelligence’s growing energy demands.
“There’s nothing in the world as efficient as our brains. We’ve evolved to maximize the storage and processing of information and minimize energy usage,” says Dr. Sambandamurthy Ganapathy, a professor in the UB Department of Physics and associate dean for research in the UB College of Arts and Sciences.
“The brain is too complex to actually replicate, but we can mimic how it stores and processes information to create more energy-efficient computers and, therefore, more energy-efficient AI.”
This brain-inspired approach is known as neuromorphic computing. Its origins date back to the 1980s, but it has become more relevant in recent years as computing tasks, especially those driven by AI, grow more complex and energy-intensive.
Neuromorphic computing encompasses both brain-inspired hardware and software; Ganapathy’s team focuses on the hardware. Their research, a blend of quantum science and engineering, examines the unique electrical properties of materials that could be used to build neuromorphic computer chips.
The team’s ultimate goal is to develop chips and devices that are not only energy efficient but also better at completing complex tasks.
“Today’s computers were built for simple, repetitive tasks, but with the rise of AI, we no longer want to solve only simple problems,” says Ganapathy. “We want computers to solve complex problems, the way humans do every day. Neuromorphic computing may provide the architecture that lets computers do this.”
Computers already share similarities with the brain
The idea of computers mimicking the human brain is not as far-fetched as you might expect.
Computers encode all information in binary (ones and zeros) using billions of transistors, tiny switches that either conduct electricity (one) or block it (zero). Our brains encode information in a surprisingly similar way: instead of transistors, billions of neurons either fire an electrical signal or stay silent.
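As a loose, purely illustrative analogy (not anything from the researchers), the parallel can be sketched in a few lines of Python: a stored number is a pattern of transistor states, while a toy “neuron” carries information as the presence or absence of spikes over time.

```python
# Toy illustration: a byte and a spike train are both just on/off patterns.
value = 42

# A computer stores the value as transistor states: 1 = conducting, 0 = blocking.
bits = [int(b) for b in format(value, "08b")]

# A crude "neuron" view: time steps in which a cell either fires (1) or stays silent (0).
# This firing pattern is hypothetical, not a real neural code.
spike_train = [0, 1, 0, 1, 0, 1, 0, 1]

print("bits:       ", bits)
print("spike train:", spike_train)
```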
“Neuromorphic computing aims to move beyond this merely binary framework and approach the much more complex coding systems that nature has essentially given us,” said Nitin Kumar, a graduate student in Ganapathy’s lab.
Memory and processing in the same location
One way the brain is both more complex and more energy efficient than a computer is that it stores and processes information in the same location.
“The left side of the brain does not hold all memories, and the right is not where all learning occurs,” Ganapathy says. “It’s intertwined.”
Traditional computers separate information storage from processing, so a great deal of their energy is spent simply shuttling data along tiny circuits between the memory unit and the processing unit. That cost grows even larger when the computing architecture has to support AI models.
“So the question becomes how closely you can entangle memory and processing within a computer chip,” Ganapathy says. “This is known as in-memory computing, and it’s a major advantage of neuromorphic computing.”
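As a rough sketch of the concept (not a description of the team’s hardware), in-memory computing is often illustrated with a resistive crossbar: the data live in the device conductances themselves, and applying input voltages performs a matrix-vector multiplication in place through Ohm’s and Kirchhoff’s laws, so nothing is shuttled to a separate processor. The numbers below are made up for illustration.

```python
import numpy as np

# Hypothetical 3x4 crossbar: each stored weight is a device conductance (siemens).
# The "memory" IS this array of conductances; there is no separate memory unit.
G = np.array([
    [1.0e-6, 2.0e-6, 0.5e-6, 1.5e-6],
    [0.8e-6, 1.2e-6, 2.2e-6, 0.3e-6],
    [1.7e-6, 0.4e-6, 1.1e-6, 2.5e-6],
])

# Input vector encoded as voltages applied to the rows (volts).
v_in = np.array([0.2, 0.5, 0.1])

# Ohm's law per device plus Kirchhoff's current law per column: the column
# currents are the matrix-vector product, computed right where the data sit.
i_out = v_in @ G   # amperes

print(i_out)
```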
Artificial neurons and synapses
Memory and processing are intertwined in the brain thanks to complex systems of neurons.
Neurons send electrical signals to each other through the synapses that connect them, effectively carrying information across a vast network. In computer terminology, synapses store memory, while neurons do the processing.
Ganapathy’s team is therefore developing artificial neurons and synapses that mimic the electrical signaling of their biological counterparts.
“Essentially, we want to replicate the rhythmic, synchronized electrical oscillations you might see in a brain scan,” Kumar says. “To do that, you need to create artificial neurons and synapses from sophisticated materials whose electrical conductivity can be switched on and off accurately and controllably.”
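One generic, textbook way to turn such a switchable material into a spiking “neuron” (a common construction, not necessarily the team’s circuit) is a relaxation oscillator: a capacitor charges through a resistor until the device crosses a threshold and snaps into its conductive phase, the capacitor discharges, the device relaxes back to its resistive phase, and the cycle repeats as a train of rhythmic spikes. The sketch below simulates that behavior with made-up component values.

```python
# Minimal sketch of a relaxation-oscillator "neuron" built around a two-state
# threshold-switching device. All parameter values are illustrative, not measured.
V_SUPPLY = 1.0               # supply voltage (V)
R_SERIES = 1.0e4             # series resistor (ohm)
C = 1.0e-9                   # capacitor (F)
R_OFF, R_ON = 1.0e6, 1.0e3   # device resistance in insulating / conducting phase
V_TH, V_HOLD = 0.6, 0.2      # switch-on threshold and switch-off (hold) voltage
DT = 1.0e-8                  # simulation time step (s)

v_c = 0.0                    # voltage across the capacitor (and device)
device_on = False
spike_times = []

for step in range(20000):
    r_dev = R_ON if device_on else R_OFF
    # The capacitor charges from the supply and discharges through the device.
    i_in = (V_SUPPLY - v_c) / R_SERIES
    i_dev = v_c / r_dev
    v_c += (i_in - i_dev) * DT / C

    # Threshold switching: conductive above V_TH, back to resistive below V_HOLD.
    if not device_on and v_c >= V_TH:
        device_on = True
        spike_times.append(step * DT)   # record a "spike" each time the device fires
    elif device_on and v_c <= V_HOLD:
        device_on = False

print(f"{len(spike_times)} spikes in {20000 * DT * 1e6:.0f} microseconds")
```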
Finding the right ingredients
One class of advanced materials that fits this bill is known as phase-change materials (PCMs).
PCMs can switch back and forth between conductive and resistive phases when hit with controlled electrical pulses, allowing scientists to generate and synchronize electrical oscillations.
PCMs can also remain in their conductive or resistive phase even after the applied electrical pulse ends. In other words, they essentially retain a memory of their previous phase.
“This allows for a gradual change in the level of conductivity in response to repeated electrical pulses, similar to how biological synapses are reinforced by repeated activation,” says Ganapathy.
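A back-of-the-envelope way to picture that synapse-like behavior (with purely illustrative numbers, not the team’s measured devices) is a conductance that creeps upward a little with each “set” pulse, saturates toward a maximum, and keeps its value between pulses, much as a synapse is strengthened by repeated activation.

```python
# Toy model of an artificial synapse: each electrical pulse nudges the device
# conductance a bit closer to its maximum (potentiation) or minimum (depression),
# and the value persists between pulses. All numbers are illustrative.
G_MIN, G_MAX = 1.0e-6, 1.0e-4    # conductance bounds (siemens)
ALPHA = 0.15                     # fraction of the remaining range moved per pulse

def apply_pulse(g, potentiate=True):
    """Return the new conductance after one set (potentiate) or reset pulse."""
    if potentiate:
        return g + ALPHA * (G_MAX - g)   # saturating increase, like synaptic strengthening
    return g - ALPHA * (g - G_MIN)       # saturating decrease

g = G_MIN
history = []
for _ in range(10):                      # ten identical set pulses
    g = apply_pulse(g, potentiate=True)
    history.append(g)

# The same pulse produces progressively smaller increments as the device saturates.
print([f"{x:.2e}" for x in history])
```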
Some of the PCMs on which the team has recently published research include copper vanadium oxide, niobium dioxide, and compounds known as metal-organic frameworks. The work has been featured in the Journal of the American Chemical Society, Advanced Electronic Materials and on the arXiv preprint server, respectively.
“Our experiments apply voltage and temperature to switch a material’s conductivity, and then we trace this effect back to what is happening with the electrons in the material,” says Kumar.
“To incorporate these materials into neuromorphic chips as artificial neurons and synapses, we need to understand them at the atomic scale. So we are currently working with collaborators to gain atomic-level control over the materials’ structure and make precise adjustments to their electrical switching properties.”
The team’s next goal is to “build an oscillatory neural network that can synchronize the oscillations of multiple devices to emulate complex brain functions such as pattern recognition, motor control and other rhythmic behaviors.”
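A standard stand-in for such an oscillatory network (a generic textbook model, not the group’s device physics) is the Kuramoto model: each device is reduced to a phase, coupling pulls the phases toward one another, and the degree to which they lock together can serve as a simple readout. The sketch below uses that generic model with arbitrary parameters.

```python
import numpy as np

# Kuramoto-style sketch of an oscillatory neural network: N coupled oscillators,
# each standing in for one oscillating device. Parameters are arbitrary.
rng = np.random.default_rng(0)
N = 8
K = 2.0                                # coupling strength between devices
omega = rng.normal(1.0, 0.05, N)       # natural frequencies (rad per time unit)
theta = rng.uniform(0, 2 * np.pi, N)   # initial phases
dt = 0.01

def order_parameter(phases):
    """Degree of synchronization: ~0 = incoherent, 1 = fully phase-locked."""
    return abs(np.mean(np.exp(1j * phases)))

print("synchrony before:", round(order_parameter(theta), 2))
for _ in range(5000):
    # Each oscillator is pulled toward the phases of all the others.
    coupling = (K / N) * np.sum(np.sin(theta[None, :] - theta[:, None]), axis=1)
    theta = theta + (omega + coupling) * dt
print("synchrony after: ", round(order_parameter(theta), 2))
```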
A more human-like computer?
Ganapathy stresses that neuromorphic computers mimic the brain on a purely phenomenological level. Neuromorphic computing is intended to replicate the brain’s functional behaviors and advantages, not consciousness.
Still, neuromorphic computers could eventually solve problems less like today’s computers and more like humans do.
Today’s computers follow linear logic: the same input always leads to the same output. The human brain is highly nonlinear; present a person with the same situation 10 times and they may respond in 10 different ways.
Today’s computers also struggle with limited or ambiguous data. Give an AI model a vague prompt, for instance, and it is unlikely to return the output you’re looking for. Humans, by contrast, often cope well with limited or confusing information.
“So by providing a more complex architecture, like that of the human brain, computers could potentially process information in a more nonlinear way and adapt to limited data,” Ganapathy said.
Researchers believe this could be particularly useful in applications like self-driving cars, which handle most road conditions well but struggle in more complex scenarios that have no easy solution. Think of a deer jumping in front of your car while another vehicle is tailgating right behind you.
In fact, self-driving cars could be one of the best applications for neuromorphic chips, given that real-time decisions need to be made on the vehicle itself rather than on remote servers thousands of miles away.
“Neuromorphic chips may not be in your smartphone anytime soon, but I think you’ll see them in very specific applications like self-driving cars. Perhaps a chip that responds to road conditions and a chip that finds the best possible route,” Ganapathy says. “It won’t be one large neuromorphic computer that solves all the problems. Instead, you’ll see many different neuromorphic chips, each solving its own problem.”
More information: John Ponis et al., Atomic Origin of Conductance Switching in an ε-Cu0.9V2O5 Neuromorphic Single-Crystal Oscillator, Journal of the American Chemical Society (2024). DOI: 10.1021/jacs.4c11968
Nitin Kumar et al., Noise Spectroscopy and Electrical Transport in NbO2 Memristors with Dual Resistive Switching, Advanced Electronic Materials (2025). DOI: 10.1002/aelm.202400877
Divya Kaushik et al., Thermally Stable Zeolitic Imidazolate Framework (ZIF-8) Resistive Switching Device, arXiv (2025). DOI: 10.48550/arxiv.2501.01822
Provided by University at Buffalo
Citation: How can AI be more energy efficient? Researchers look to the human brain for inspiration (2025, July 1), retrieved July 1, 2025 from https://techxplore.com/news/2025-07-ai-energy-efficient-human-brain.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
