I really enjoy cooking, but I am a bit lazy when it comes to following recipes, especially when it means reopening the ingredient list and instructions on my phone with smudgy fingers. Say I'm about to put X grams of butter in the bowl but forget how much; wouldn't it be great to just hold the butter package under a camera and have a voice or display remind me of the quantity? Or, if I improvise a recipe and it turns out super delicious, a smart food detector could have kept track of every ingredient I added and automatically generated a recipe.
First, flash Raspberry Pi OS to an SD card for your Raspberry Pi 4 with your favorite flashing tool, such as Raspberry Pi Imager or balenaEtcher. Then prepare the device for headless use by enabling SSH and setting your Wi-Fi credentials (or skip this if you'll use an Ethernet connection).
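A minimal sketch of the classic headless setup, assuming the SD card's boot partition is mounted at /media/$USER/boot (the mount path and the credentials below are placeholders; newer versions of Raspberry Pi Imager can also set SSH and Wi-Fi directly in its advanced options, which is the more reliable route on recent OS images):

```bash
# Enable SSH on first boot by creating an empty file named "ssh"
touch /media/$USER/boot/ssh

# Drop Wi-Fi credentials on the boot partition; Raspberry Pi OS reads
# this file on first boot. Replace country, ssid, and psk with your own.
cat > /media/$USER/boot/wpa_supplicant.conf <<'EOF'
country=US
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="YourNetworkName"
    psk="YourNetworkPassword"
}
EOF
```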
Next, install the dependencies for Edge Impulse. Please follow the instructions in the Edge Impulse documentation guide for the Raspberry Pi 4.
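At the time of writing, the guide boiled down to roughly the following commands; package and Node.js versions change over time, so treat this as a sketch and defer to the official documentation:

```bash
# Install Node.js plus the build and multimedia tools Edge Impulse needs
curl -sL https://deb.nodesource.com/setup_12.x | sudo bash -
sudo apt install -y gcc g++ make build-essential nodejs sox \
  gstreamer1.0-plugins-base gstreamer1.0-plugins-good \
  gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly v4l-utils

# Install the Edge Impulse CLI for Linux
npm config set user root
sudo npm install edge-impulse-linux -g --unsafe-perm
```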
For my proof-of-concept model, I collected data for three classes: bananas, dates, and oats. To collect the data, I first connected my smartphone as a device in Edge Impulse and captured a set of images. In the Edge Impulse Studio, the images enter a labeling queue where you draw a rectangle (a bounding box) around each object. Tip: take a sequence of images of the same objects; if an object has only moved slightly, the tool can find the new bounding box for you.
After mounting the Raspberry Pi to the kitchen cabinet, I connected it as a device to Edge Impulse.
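Connecting the Pi uses the CLI installed earlier; running it starts a wizard that logs in to your Edge Impulse account, lets you pick a project, and registers the device with its camera:

```bash
# Run on the Raspberry Pi; follow the prompts to log in,
# choose a project, and select the camera
edge-impulse-linux

# To re-run the wizard later, e.g. to switch to another project:
edge-impulse-linux --clean
```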
On the Impulse design tab, you set up the input data processing, the training block, and the output features. An impulse takes raw data, uses signal processing to extract features, and then uses a learning block to classify new data. Based on your project settings, suitable blocks are suggested.
A model is only as good as its training data, and I will of course add a lot more training data as I continue working on this model. While writing this article, I ran two training iterations, where the second had more data, and I noticed an immediate improvement in accuracy. When testing the setup, it was clear that oats were the hardest class for the model to recognize, while bananas and dates were easier for it to handle.
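To try the trained model live on the Pi, the Edge Impulse runner downloads the model from your project and prints detections to the console; it also serves a live camera view in the browser (check the runner's startup output for the exact URL):

```bash
# Downloads and builds the model for the current project, then
# runs continuous object detection on the camera feed
edge-impulse-linux-runner
```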
Demo
I'm amazed by how well this simple proof of concept already works.