Thinking Inside the Box
AI in a Box by Useful Sensors makes running cutting-edge algorithms like LLMs locally as easy as plugging in a power cord.
Recent advancements in artificial intelligence (AI) have been nothing short of revolutionary, impacting many aspects of our daily lives. Large language model chatbots, voice recognition, and translation technologies are just a few of the areas that have made significant progress in recent years. These innovations have the potential to transform how we communicate, access information, and bridge language barriers. However, their widespread adoption has been hampered by the significant computational resources required to operate them.
Large language model chatbots, in particular, have captivated the world with their ability to generate human-like text and engage in conversations. These models have been used in customer support, content creation, and even creative writing. However, achieving such natural language understanding and generation requires enormous neural networks with billions of parameters. This has typically necessitated access to high-performance cloud-based hardware, which raises concerns about data privacy and security when sensitive information is transmitted to and processed by external servers.
To address privacy and accessibility concerns, ongoing innovations are focused on making AI applications more efficient and accessible. Researchers are developing techniques to reduce the computational overhead of large models, allowing them to run on less powerful and more affordable hardware. Additionally, efforts are underway to create user-friendly interfaces and platforms that simplify the installation and use of AI frameworks, making these technologies more accessible to individuals who may not be technically inclined.
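As a rough illustration of why these efficiency techniques matter, consider quantization, one widely used approach for shrinking models (the article does not attribute this specific technique to any particular product). The sketch below is simple back-of-the-envelope arithmetic showing how lowering the numeric precision of a model's weights shrinks its memory footprint; the 7-billion-parameter figure is a hypothetical example.

```python
# Rough, illustrative arithmetic: how quantization shrinks the memory
# footprint of a large language model. Figures are approximations of
# weight storage only (activations and overhead are ignored).

def model_memory_gb(num_params: float, bits_per_param: float) -> float:
    """Approximate weight storage in gigabytes for a model with the
    given parameter count and numeric precision."""
    total_bytes = num_params * bits_per_param / 8
    return total_bytes / 1e9

if __name__ == "__main__":
    params = 7e9  # a hypothetical 7-billion-parameter model
    for label, bits in [("FP32", 32), ("FP16", 16), ("INT8", 8), ("4-bit", 4)]:
        print(f"{label}: ~{model_memory_gb(params, bits):.1f} GB of weights")
```

At 16-bit precision such a model needs roughly 14 GB just for its weights, while a 4-bit quantized version needs closer to 3.5 GB, small enough to fit in the memory of an affordable single-board device like the one described below.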
A new device from Useful Sensors called AI in a Box that just popped up on Crowd Supply is taking privacy and accessibility to a whole new level. AI in a Box is relatively inexpensive, ranging from $299 to $475, yet it can run cutting-edge algorithms like large language models entirely on-device — no Internet connection required. And it comes with a number of applications pre-installed, so there is no complex setup involved. In fact, plugging in the power cord is all it takes to get started.
The open source device can be leveraged by developers to host their own creations, but the primary target audience appears to be people who just want everything to work right out of the box. For them, AI in a Box comes with a conversational chatbot, real-time translation between many languages, live captioning of conversations, and a voice-powered tool that can emulate a USB keyboard.
Inside the custom enclosure, you will find a powerful Rockchip RK3588S system-on-chip with an eight-core 64-bit CPU, 8 GB of RAM, and a neural processing unit. An internal or external (depending on the option selected) display, speaker, and microphone handle all of the device's inputs and outputs. USB ports are included to interface with external devices, and for more advanced users, the system runs the Ubuntu 22.04 operating system, so it is endlessly hackable.
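The article does not say which software stack AI in a Box uses for its on-device chatbot, but to give a sense of what "endlessly hackable" can mean on an Ubuntu 22.04 board with 8 GB of RAM, the sketch below shows how a developer might experiment with a small quantized model using an off-the-shelf runtime such as llama-cpp-python. The model path and prompt are placeholders, and this is purely an illustration, not a description of the device's built-in applications.

```python
# Illustrative only: running a small quantized chat model locally with
# llama-cpp-python (pip install llama-cpp-python). The model file path
# is a placeholder; any GGUF-format model small enough to fit in 8 GB
# of RAM would work in principle.
from llama_cpp import Llama

llm = Llama(
    model_path="models/tiny-chat-q4.gguf",  # hypothetical quantized model file
    n_ctx=2048,    # context window size
    n_threads=8,   # match the SoC's eight CPU cores
)

response = llm(
    "Q: What is the capital of France? A:",
    max_tokens=32,
    stop=["Q:", "\n"],
)
print(response["choices"][0]["text"].strip())
```

Everything here runs offline once the model file is on the device, which is the same basic property that lets AI in a Box work without an Internet connection.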
AI in a Box is expected to be delivered to backers in January of 2024. There is a risk that the date could slip; however, Useful Sensors has produced commercial products in the past, like the Person Sensor and the Tiny Code Reader, so it is a pretty good bet that AI in a Box will be delivered on schedule.
The all-in-one AI in a Box is priced at $299 and was designed with ease of use in mind, with all of the components housed in a single casing. For a limited time, a $475 prototype kit is also available with an external display, speaker, and microphone to make it easier for advanced users to customize. Full details about these options are available on Crowd Supply.