A resistor that works in a similar way to nerve cells in the body could be used to build neural networks for machine learning.
Many large machine learning models rely on ever-increasing amounts of processing power to achieve their results, but this comes at a vast energy cost and produces large amounts of heat.
One proposed solution is analogue machine learning, which works like a brain by using electronic devices similar to neurons to act as the parts of the model. However, these devices have so far not been fast, small or efficient enough to provide advantages over digital machine learning.
Murat Onen at the Massachusetts Institute of Technology and his colleagues have created a nanoscale resistor that transmits protons from one terminal to another. This functions a bit like a synapse, a connection between two neurons, where ions flow in one direction to transmit information. But these “artificial synapses” are 1000 times smaller and 10,000 times faster than their biological counterparts.
Just as a human brain learns by remodelling the connections between millions of interconnected neurons, so too could machine learning models run on networks of these nanoresistors.
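To see why an array of programmable resistors maps naturally onto a neural network, consider a conceptual sketch (not the team's actual hardware or software): in an analogue crossbar, each resistor's conductance stores one weight, and Ohm's and Kirchhoff's laws perform the multiply-and-sum of a network layer in a single physical step. The array size and values below are illustrative assumptions.

```python
import numpy as np

# Conceptual sketch of an analogue crossbar: each resistor's conductance
# G[i, j] (in siemens) stores one neural-network weight. Values are illustrative.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # 4 input rows x 3 output columns

# Input activations are applied as voltages on the rows (volts, illustrative).
V = np.array([0.2, 0.5, 0.1, 0.8])

# Ohm's law gives each resistor's current (I = V * G); Kirchhoff's current law
# sums the currents flowing into each output column, so the matrix-vector
# multiply of a whole network layer happens in one analogue step.
I_out = V @ G   # column currents, in amperes

# "Training" such a network means adjusting the conductances, which is what a
# fast, durable programmable resistor makes practical.
print(I_out)
```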
“We are doing somewhat similar things [to biology], like ion transport, but we are now doing it so fast, whereas biology couldn’t,” says Onen, whose device is a million times faster than previous proton-transporting devices.
The resistor uses powerful electric fields to transport protons at very high speeds without damaging or breaking the resistor itself, a problem that plagued previous solid-state proton resistors.
For practical analogue machine learning, systems containing many millions of resistors will be required. Onen concedes that this is an engineering challenge, but says that because the materials are all compatible with silicon processing, the devices should be easier to integrate with existing computing architectures.
“For what they achieve in terms of technology – very high speed, low-energy and efficient – this looks really impressive,” says Sergey Saveliev at Loughborough University, UK. However, the fact that the device uses three terminals, rather than two as a human neuron does, might make it more difficult to run certain neural networks, he adds.
Pavel Borisov, also at Loughborough University, agrees that it is an impressive technology, but he points out that the protons come from hydrogen gas, which could prove tricky to keep safely in the device when scaling up the technology.