
Sipeed's TinyMaix Puts MNIST Digit Recognition on a Modest Microchip ATmega328 Microcontroller

Open source project, written during a hackathon weekend, adds INT8 and FP32 machine learning model support to low-end microcontrollers.

Sipeed is aiming to bring machine learning inference to even low-end microcontrollers with TinyMaix, a library written over a single weekend, and has proven its capabilities by running an MNIST handwritten digit recognition model on a modest Microchip ATmega328 chip.

"TinyMaix is designed for running AI neural network models on resources limited MCUs [Microcontroller Units], which [is] usually called tinyML," the company explains of the software. "TinyMaix is a weekend hackathon's project, so it is simple enough to read though in 30 minutes and it will help tinyML newbies to understand how is it running."

Having been written over the course of a weekend, TinyMaix is as compact as you might expect: The project has just 400 lines of core source code, which helps it to consume as little RAM as possible — the secret behind the impressive demonstration in which it runs a handwritten digit recognition model on an ATmega328 microcontroller with just 2kB of static RAM (SRAM) and 32kB of flash storage.
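
To give a sense of scale, the sketch below defines a deliberately tiny Keras MNIST classifier of the kind that could plausibly be squeezed onto such a part once quantized; it is purely illustrative and is not the model Sipeed used in its demonstration.

```python
# Illustrative only: a deliberately tiny Keras MNIST classifier, roughly the
# scale of network that can fit in a few kilobytes once quantized to INT8.
# This is not Sipeed's demo model, just a sketch of the size class involved.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(4, 3, strides=2, activation="relu",
                           input_shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(8, 3, strides=2, activation="relu"),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, validation_data=(x_test, y_test))

# Save in Keras H5 format, one of the formats TinyMaix's tooling converts from.
model.save("mnist.h5")
```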

"TinyMaix aims to be a simple tinyML infer[erence] library," Sipeed explains. "It abandons many new features and [doesn't] use libs like CMSIS-NN. Follow[ing] this design goal, now TinyMaix is as simple as five files to compile."

In its initial release, TinyMaix supports INT8 and FP32 precision modes, Arm's SIMD, NEON, and MVEI acceleration instructions, and RISC-V's RV32P and RV64V architectures. Sipeed has also published a list of potential future features, including INT16 mode, support for MobileNetV2 above the current MobileNetV1 maximum, and Winograd convolution optimization for boosted performance. The company has stated, however, that it won't add features including BF16 support or GPU acceleration. "TinyMaix is for MCUs," it explains. "Not for powerful PC/mobile phones."

Full instructions for usage, training, and model conversion from Keras H5 or TensorFlow Lite are available on the project's GitHub repository, along with the source code under the permissive Apache 2.0 license. Sipeed is also working on adding support for online model training via MaixHub, though this was not yet available at the time of writing.
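
The repository's own scripts handle the final conversion into TinyMaix's format; as a rough, hypothetical sketch of the upstream half of that pipeline, the snippet below uses standard TensorFlow post-training quantization to turn a trained Keras H5 model into an INT8 TensorFlow Lite file, the kind of input such a converter would take. The file names are assumptions for illustration.

```python
# Illustrative only: standard TensorFlow post-training INT8 quantization,
# producing the sort of .tflite file a TinyMaix-style converter takes as input.
# The final .tflite-to-TinyMaix step uses the project's own scripts (not shown).
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("mnist.h5")  # trained model from earlier

(x_train, _), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0

def representative_data():
    # A few hundred samples are enough to calibrate the quantization ranges.
    for img in x_train[:200]:
        yield [np.expand_dims(img, 0)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("mnist_int8.tflite", "wb") as f:
    f.write(converter.convert())
```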

Main article image courtesy of oomlout, Creative Commons Attribution-ShareAlike 2.0.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.