Two-Transistor Neuro-Synaptic RAM Could Dramatically Drop the Size, Complexity of AI Hardware
At 12 times smaller than today's electronic neuron-synapse pairs, the researchers' NSRAM device promises big gains in efficiency.
Researchers from King Abdullah University of Science and Technology (KAUST) and the National University of Singapore have come up with a new way to make brain-inspired hardware for artificial intelligence — and it centers around the humble silicon transistor.
"Traditionally, the race for supremacy in semiconductors and artificial intelligence has been a matter of brute force, seeing who could manufacture smaller transistors and bear the production costs that come with it," explains first author Sebastián Pazos. "Our work proposes a radically different approach based on exploiting a computing paradigm using highly efficient electronic neurons and synapses. This discovery is a way to democratise nanoelectronics and enable everyone to contribute to the development of advanced computing systems, even without access to cutting-edge transistor fabrication processes."
The team's work builds on earlier efforts to use transistor-based electronics to build devices whose operation is inspired by the human brain, split into electronic "neurons" and "synapses". When built with traditional silicon transistors, these devices quickly become bulky: each neuron in the circuit requires at least 18 transistors, and each synapse requires six.
The focus of the team's work is on bringing those figures down — to the point where a single transistor, built on a standard complementary metal-oxide semiconductor (CMOS) process, can be used as either a neuron or a synapse, dropping the complexity of the hardware by an order of magnitude. The secret is "impact ionisation," a physical phenomenon governed by the resistance at the transistor's bulk terminal: depending on how that resistance is tuned, the transistor produces either current spikes like those of a multi-transistor electronic neuron or long-lasting charge storage like that of a multi-transistor electronic synapse.
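The headline "12 times smaller" figure follows directly from those transistor counts. A quick back-of-the-envelope check, assuming the two-transistor NSRAM cell stands in for one conventional neuron-synapse pair:

```python
# Transistor counts cited in the article: a conventional CMOS electronic
# neuron needs at least 18 transistors, and each synapse needs six.
conventional_pair = 18 + 6   # neuron + synapse = 24 transistors

# The NSRAM cell uses just two transistors and can act as either one.
nsram_cell = 2

reduction = conventional_pair / nsram_cell
print(f"Conventional neuron-synapse pair: {conventional_pair} transistors")
print(f"NSRAM cell: {nsram_cell} transistors")
print(f"Size reduction: {reduction:.0f}x")  # matches the 12x headline figure
```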
By tweaking the resistance values to obtain specific impact ionisation effects — traditionally seen by semiconductor manufacturers as the symptom of a flaw, rather than something to be desired — the team were able to build a two-transistor device dubbed a Neuro-Synaptic Random Access Memory (NSRAM) cell.
This, the researchers say, can be switched between neuron or synapse operating modes on-demand, and could form the basis of considerably smaller and more efficient neural processors for on-device machine learning and artificial intelligence workloads. To prove it, the team manufactured prototypes on an older 180nm process node with a claimed 100 percent yield — though it remains to be seen if the approach scales down to cutting-edge single-digit-nanometer nodes as well.
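The dual-mode behaviour described above can be pictured with a toy model. The sketch below is purely illustrative — the class name, parameters, and update rules are invented for this example and do not model the actual device physics — but it captures the idea of one cell that spikes like a neuron in one mode and stores a weight like a synapse in the other:

```python
# Toy behavioural model of a dual-mode cell (illustrative only; the
# names and rules here are assumptions, not the NSRAM device physics).
class DualModeCell:
    def __init__(self, threshold=1.0, leak=0.9):
        self.mode = "neuron"       # switchable on demand
        self.potential = 0.0       # neuron-mode state (integrated charge)
        self.weight = 0.0          # synapse-mode state (stored charge)
        self.threshold = threshold
        self.leak = leak           # leaky-integration factor

    def set_mode(self, mode):
        assert mode in ("neuron", "synapse")
        self.mode = mode

    def step(self, x):
        if self.mode == "neuron":
            # Leaky integrate-and-fire: spike when potential crosses threshold.
            self.potential = self.potential * self.leak + x
            if self.potential >= self.threshold:
                self.potential = 0.0   # reset after the spike
                return 1               # spike
            return 0
        # Synapse mode: stored charge weights the incoming signal.
        self.weight += 0.1 * x         # crude potentiation rule
        return self.weight * x

cell = DualModeCell()
spikes = [cell.step(0.4) for _ in range(6)]
print("neuron-mode spikes:", spikes)        # periodic firing under constant input
cell.set_mode("synapse")
print("synapse-mode output:", round(cell.step(1.0), 2))
```

The point of the sketch is the `set_mode` call: one physical cell, reconfigured on demand, is what lets the NSRAM approach replace separate multi-transistor neuron and synapse circuits.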
The team's work has been published in the journal Nature under open-access terms.