Creating Cross-Platform Small AI with picoLLM

You can run Gemma, Llama, Mistral, Mixtral, and Phi LLMs locally on a Raspberry Pi using picoLLM, without touching the cloud.

Over the last year we've seen the emergence of what I have started to call Small AI, a new generation of smaller generative AI models running at the edge. Back in December Microsoft released its lightweight Phi-2 model, and this week Picovoice unveiled its own picoLLM platform, a cross-platform inference engine intended to run Phi-2 and other lightweight models like it on-device.

Local LLM-Powered Voice Assistant on Raspberry Pi (CREDIT: Picovoice)

I first talked about Picovoice four or five years ago. But back then the machine learning landscape looked very different, with today's large language models still in their infancy.

Back then the company offered wake word detection, along with speech-to-intent and speech-to-text engines, all running offline without an Internet connection. More recently these have been joined by a text-to-speech engine.

This week's release of their picoLLM framework came with a really rather interesting demo, chaining their model zoo together. In the demo their wake word model triggers the streaming speech-to-text model, which feeds into the new picoLLM model, which in turn outputs to their streaming text-to-speech model. Essentially they're replicating OpenAI's GPT-4o launch demo. Except Picovoice's version runs locally, on a Raspberry Pi.
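Strung together in Python, that chain looks roughly like the sketch below. It assumes the pvporcupine (wake word), pvcheetah (streaming speech-to-text), picollm, and pvorca (text-to-speech) packages; the audio capture helper, the model file name, and some of the exact call signatures are assumptions on my part rather than Picovoice's own demo code.

```python
import pvporcupine
import pvcheetah
import picollm
import pvorca

ACCESS_KEY = "YOUR_PICOVOICE_ACCESS_KEY"  # issued by the Picovoice Console

porcupine = pvporcupine.create(access_key=ACCESS_KEY, keywords=["picovoice"])
cheetah = pvcheetah.create(access_key=ACCESS_KEY)
pllm = picollm.create(access_key=ACCESS_KEY, model_path="phi2.pllm")  # placeholder model file
orca = pvorca.create(access_key=ACCESS_KEY)


def next_audio_frame():
    """Placeholder: return one frame of 16 kHz, 16-bit PCM samples from the microphone."""
    raise NotImplementedError


# 1. Block until the wake word is heard.
while porcupine.process(next_audio_frame()) == -1:
    pass

# 2. Stream audio into the speech-to-text engine until it detects an endpoint.
transcript = ""
while True:
    partial, is_endpoint = cheetah.process(next_audio_frame())
    transcript += partial
    if is_endpoint:
        transcript += cheetah.flush()
        break

# 3. Hand the transcript to the on-device LLM.
response = pllm.generate(transcript)

# 4. Speak the completion with the text-to-speech engine.
pcm = orca.synthesize(response.completion)  # newer SDK versions may also return word alignments
# ... play `pcm` through the speaker ...
```

For clarity the sketch runs each stage strictly in sequence; the point of the streaming speech-to-text and text-to-speech engines is that the real demo can overlap the stages, which is what keeps the whole exchange feeling responsive.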

Local LLM on Raspberry Pi (CREDIT: Picovoice)

The picoLLM framework supports the Gemma, Llama, Mistral, Mixtral, and Phi families of models, and runs cross-platform on Windows, macOS, and Linux (including Raspberry Pi OS on the Raspberry Pi 4 and 5) as well as Android and iOS. Interestingly, it has also been designed to run as client-side code inside your web browser.

You'll find the picoLLM engine on GitHub, alongside SDKs and demos for various languages. The Picovoice engines are free for hobbyist and non-commercial use, with some usage limitations.
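To give a taste of what the SDKs look like, here is a minimal sketch of text generation with the Python package. It assumes `pip install picollm`, an access key from the Picovoice Console, and a `.pllm` model file downloaded from the console; the model file name below is just a placeholder.

```python
import picollm

# Create the on-device inference engine; access key and model file are placeholders.
pllm = picollm.create(
    access_key="YOUR_PICOVOICE_ACCESS_KEY",
    model_path="phi2.pllm",
)

# Run a single completion entirely locally, no cloud round trip.
res = pllm.generate("Name three uses for a Raspberry Pi.")
print(res.completion)
```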

Alasdair Allan
Scientist, author, hacker, maker, and journalist. Building, breaking, and writing. For hire. You can reach me at 📫 alasdair@babilim.co.uk.