(Credit: Manel Torralba via flickr)
When we learn, electric signals are sent around the brain via synapses, creating neural pathways. Less energy is required each time a path is travelled, as the route becomes more defined. This essentially makes learning and memory part of the same process. Reported in the journal Nature Materials, the Stanford device attempts to mimic this behaviour, co-locating processing and memory, and using much less energy than traditional computing while doing so.
The artificial synapse consists of two thin, flexible films with three terminals, connected by a salt-water electrolyte. The device works as a transistor, with one terminal controlling the flow of electricity between the other two. By repeatedly discharging and recharging it, the researchers were able to ‘train’ the synapse, predicting to within about one per cent the voltage required to bring it to a specific electrical state and hold it there.
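The idea of driving a device to a target electrical state and holding it there can be illustrated with a simple write-and-verify loop. The sketch below is purely illustrative, not the researchers' method: the toy device model, the step size, and the function names are all assumptions, with the one-per-cent tolerance echoing the figure reported for the real synapse.

```python
# Illustrative write-and-verify loop: pulse a device until its measured
# state is within a tolerance of the target. Device model is a toy.

def program_to_state(read_state, apply_pulse, target, tolerance=0.01, max_pulses=1000):
    """Apply charge/discharge pulses until the device state is within
    `tolerance` (relative) of `target`."""
    for _ in range(max_pulses):
        state = read_state()
        error = target - state
        if abs(error) <= tolerance * abs(target):
            return state  # within ~1 per cent, as reported for the real device
        apply_pulse(error)  # sign of the error selects charge vs discharge
    return read_state()

class ToyDevice:
    """Hypothetical device: each pulse moves the state a fraction of the error."""
    def __init__(self):
        self.state = 0.0
    def read(self):
        return self.state
    def pulse(self, error):
        self.state += 0.3 * error  # imperfect, gradual update

dev = ToyDevice()
final = program_to_state(dev.read, dev.pulse, target=1.0)
```

Because each pulse closes only part of the remaining gap, the loop converges geometrically on the target state, much as repeated charge/discharge cycles refine the real device's response.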
“It works like a real synapse but it’s an organic electronic device that can be engineered,” said Alberto Salleo, associate professor of materials science and engineering at Stanford and senior author of the paper.
“It’s an entirely new family of devices because this type of architecture has not been shown before. For many key metrics, it also performs better than anything that’s been done before with inorganics.”
Prof Alberto Salleo and graduate student Scott Keene (Credit: L.A. Cicero)
Salleo and his team have built only one artificial synapse so far, but researchers at Sandia National Laboratories used 15,000 measurements from experiments on it to simulate how an array of such devices would behave in a neural network. The simulated array identified handwritten digits from 0 to 9 with between 93 and 97 per cent accuracy, a type of visual task that traditional computing generally struggles with.
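The general idea behind such a simulation can be sketched in miniature: train a classifier whose weights are restricted to the discrete states a physical device can hold, then measure its accuracy. This is a toy stand-in, not the Sandia simulation; the data, the 101-level weight grid, and the perceptron-style training rule are all assumptions for illustration.

```python
# Toy sketch: a perceptron whose effective weights are snapped to a grid
# of allowed 'device states', classifying synthetic 2D points.
import random

random.seed(0)

LEVELS = [i / 50 - 1.0 for i in range(101)]  # 101 allowed weight values in [-1, 1]

def snap(w):
    """Quantise a continuous weight to the nearest allowed device state."""
    return min(LEVELS, key=lambda s: abs(s - w))

# Synthetic data: points above the line y = x are class 1, below are class 0.
data = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
labels = [1 if y > x else 0 for x, y in data]

w1, w2, b = 0.0, 0.0, 0.0
for _ in range(20):  # perceptron-style training using the quantised weights
    for (x, y), t in zip(data, labels):
        pred = 1 if snap(w1) * x + snap(w2) * y + b > 0 else 0
        err = t - pred
        w1 += 0.05 * err * x
        w2 += 0.05 * err * y
        b += 0.05 * err

correct = sum(
    1 for (x, y), t in zip(data, labels)
    if (1 if snap(w1) * x + snap(w2) * y + b > 0 else 0) == t
)
accuracy = correct / len(data)
```

The point of the exercise, as in the Sandia work, is that a network built from imperfect, discretely programmable devices can still classify accurately.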
“More and more, the kinds of tasks that we expect our computing devices to do require computing that mimics the brain because using traditional computing to perform these tasks is becoming really power hungry,” said Sandia’s A Alec Talin, the paper's other senior author. “We’ve demonstrated a device that’s ideal for running these type of algorithms and that consumes a lot less power.”
Where digital transistors can be in only two states, such as 0 and 1, the researchers successfully programmed 500 states into the artificial synapse. Switching from one state to another used about one-tenth as much energy as a state-of-the-art computing system needs to move data from the processing unit to the memory.
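The information-density advantage of 500 states over binary is simple arithmetic: a device with N distinguishable states stores log2(N) bits, so one 500-state synapse holds what would otherwise take roughly nine binary transistors.

```python
import math

# A binary transistor stores 1 bit; a device with 500 distinguishable
# states stores log2(500) bits.
bits_per_device = math.log2(500)               # ~8.97 bits per device

# Devices needed to hold one byte (8 bits):
devices_for_a_byte = math.ceil(8 / bits_per_device)  # 1 device, vs 8 binary transistors
```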
According to the researchers, the organic nature of the device also means it could be compatible with our own neurons, opening up the possibility of brain-machine interfaces.