"Incipient Ferroelectricity" Turns Field-Effect Transistors Into Efficient Neuron-Like Devices

"These devices [create] a low-cost, efficient computing system that uses a lot less energy," claims co-author Mayukh Das.

Researchers from Penn State University and the University of Minnesota are hopeful that they've made a breakthrough toward faster yet more efficient machine learning and artificial intelligence (ML and AI) systems — by exploiting the "incipient ferroelectric" property of specially designed field-effect transistors (FETs).

"AI accelerators are notoriously energy-hungry," co-author Harikrishnan Ravichandran explains of the areas in which the team hopes their new transistors will deliver major efficiency gains. "Our devices switch rapidly and consume far less energy, paving the way for faster, greener computing technologies."

"The main goal of the project was to explore whether incipient ferroelectricity, usually seen as a disadvantage because it leads to short memory retention, could actually be useful," adds corresponding author Saptarshi Das. "In cryogenic conditions, this material exhibited traditional ferroelectric-like behavior suitable for memory applications. But at room temperature, this property behaved differently. It had this relaxor nature."

"'Incipient ferroelectricity' means there’s no stable ferroelectric order at room temperature," lead author Dipanjan Sen explains of the property that the team investigated. "Instead, there are small, scattered clusters of polar domains. It's a more flexible structure compared to traditional ferroelectric materials."

Typically, the "relaxor" behavior of incipient ferroelectric materials at room temperature is a drawback, making their operation less predictable and more fluid. The team's breakthrough was to treat this as an advantage instead, showing how it could be put to use in devices like neuromorphic processors, which boost machine learning and artificial intelligence performance by processing information much as the neurons in the human brain do.

"To test this," co-author Mayukh Das says, "we performed a classification task using a grid of three-by-three pixel images fed into three artificial neurons. The devices were able to classify each image into different categories. This learning method could eventually be used for image identification and classification or pattern recognition. Importantly, it works at room temperature, reducing energy costs. These devices function similarly to the nervous system, acting like neurons and creating a low-cost, efficient computing system that uses a lot less energy."
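The classification task Das describes — three artificial neurons sorting 3×3 pixel images into categories — can be illustrated in software with a minimal winner-take-all sketch. This is purely illustrative: the template weights below are hand-set for the example, not the paper's trained device values, and the real system implements the weighted sums in the transistor hardware itself.

```python
import numpy as np

# Hypothetical sketch: three artificial neurons classifying 3x3 pixel
# images into three categories. Each neuron's weights are a flattened
# 3x3 template; these example patterns are assumptions, not taken from
# the published work.
templates = np.array([
    [0, 1, 0,
     0, 1, 0,
     0, 1, 0],   # neuron 0: vertical bar
    [0, 0, 0,
     1, 1, 1,
     0, 0, 0],   # neuron 1: horizontal bar
    [1, 0, 0,
     0, 1, 0,
     0, 0, 1],   # neuron 2: diagonal
], dtype=float)

def classify(image):
    """Each neuron computes a weighted sum of the nine pixels;
    the most strongly activated neuron determines the class."""
    activations = templates @ np.asarray(image, dtype=float).ravel()
    return int(np.argmax(activations))

# Usage: a vertical-bar image activates neuron 0 most strongly.
img = [[0, 1, 0],
       [0, 1, 0],
       [0, 1, 0]]
print(classify(img))  # 0
```

In hardware, the role played here by the weight matrix would be taken by the conductance states of the ferroelectric transistors, with the relaxor behavior providing the tunable, analog-like responses that make neuron-style operation possible at room temperature.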

"Right now," Sen admits, "this is at the research and development stage. Perfecting these materials and integrating them into everyday devices like smartphones or laptops will take time, so there's so much more to explore. In addition, we're examining other materials, like barium titanate, to uncover their potential. The opportunities for growth are immense, both in materials and device applications."

The team's work has been published in the journal Nature Communications under open-access terms.

Main article image courtesy of Jennifer M. McCann/Penn State University.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.