
The Future of AI Is Looking Less Cloudy

A nanoelectronic device is capable of performing real-time AI classifications, while consuming 100 times less energy than existing methods.

Nick Bild
1 year ago · Machine Learning & AI
A novel nanoelectronic device runs AI algorithms very efficiently (📷: Northwestern University)

Large machine learning algorithms consume a lot of energy during operation, making them unsuitable for portable devices and posing a significant environmental challenge. These energy-intensive algorithms, which are often used for complex tasks such as natural language processing, image recognition, and autonomous driving, rely on data centers packed with high-performance hardware. The electricity required to run these centers, as well as the cooling systems to prevent overheating, results in a significant carbon footprint. The negative environmental consequences of such energy consumption have raised concerns and highlighted the need for more sustainable AI solutions.

To meet the demands of complex, modern AI algorithms, the processing is frequently offloaded to cloud computing resources. However, sending sensitive data to the cloud can raise significant privacy issues, as the data might be exposed to third parties or potential security breaches. Moreover, this offloading introduces latency, causing performance bottlenecks in real-time or interactive applications. This may not be acceptable for certain applications, like autonomous vehicles or augmented reality.

To overcome these challenges, efforts are being made to optimize machine learning models and reduce their size. Optimization techniques focus on creating more efficient, smaller models that can run directly on smaller hardware platforms. This approach helps to lower energy consumption and reduce the dependence on resource-intensive data centers. However, there are limits to these techniques. Shrinking models too much can result in unacceptable levels of performance degradation.

Innovations in this area are sorely needed to power the intelligent machines of tomorrow. Recent work published by a team led by researchers at Northwestern University may offer a new path forward for running certain types of machine learning algorithms. They have developed a novel nanoelectronic device that consumes one hundred times less energy than existing technologies, yet is capable of performing real-time computations. This technology could one day serve as an AI coprocessor in low-power devices ranging from smartwatches and smartphones to wearable medical devices.

Rather than relying on traditional, silicon-based technologies, the researchers developed a new type of transistor that is made from two-dimensional molybdenum disulfide and one-dimensional carbon nanotubes. This combination of materials gives rise to some unique properties that allow the current flow through the transistor to be strongly modulated. This, in turn, allows for dynamic reconfigurability of the chip. A calculation that might require one hundred silicon-based transistors could be performed with as few as two of the new design.

Using their new technology, the team implemented a support vector machine classifier. It was trained on electrocardiogram data to identify not only the presence of an irregular heartbeat, but also the specific type of arrhythmia present. To assess its accuracy, the device was tested on a public electrocardiogram dataset containing 10,000 samples. It correctly recognized five specific types of irregular heartbeats, distinguishing them from a normal heartbeat, in 95% of cases on average.
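For readers unfamiliar with the approach, the classification step can be sketched in software. The following is a minimal illustration, not the team's actual pipeline: it trains a kernel support vector machine (via scikit-learn) on synthetic stand-ins for extracted ECG feature vectors, with six classes standing in for a normal rhythm plus five arrhythmia types. All feature dimensions, class counts, and data here are illustrative assumptions.

```python
# Sketch: multi-class SVM classification, analogous in spirit to the
# ECG arrhythmia classifier described above. The data is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Six synthetic "heartbeat classes", each a Gaussian cluster in a
# 4-dimensional feature space (a stand-in for real ECG features).
n_classes, n_per_class, n_features = 6, 200, 4
centers = rng.normal(scale=4.0, size=(n_classes, n_features))
X = np.vstack([c + rng.normal(scale=0.5, size=(n_per_class, n_features))
               for c in centers])
y = np.repeat(np.arange(n_classes), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = SVC(kernel="rbf")  # RBF-kernel support vector machine
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2f}")
```

The appeal of SVMs for this kind of edge hardware is that, once trained, inference reduces to a fixed set of inner products and a threshold, which maps well onto compact analog or reconfigurable circuitry.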

The principal investigator in this study noted that “artificial intelligence tools are consuming an increasing fraction of the power grid. It is an unsustainable path if we continue relying on conventional computer hardware.” This fact is becoming more apparent by the day as new AI tools come online. Perhaps one day this technology will help to alleviate this problem and set us on a more sustainable path, while simultaneously tackling the privacy- and latency-related issues that we face today.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.