Private?
There are already quite a few voice assistants that you can either use on your smartphone or on dedicated devices: while they already offer a wide range of uses, most of them collect a disturbing amount of data and seem to rely exclusively on the cloud. Snips, on the other hand, allows you to create your own voice assistant that runs exclusively on-device (well, except if you want it to query some data on the internet) and can be installed on small footprint devices such as the Raspberry Pi.
Gender neutral?
It's not news to anyone that the tech world has a serious inclusiveness problem, including but not limited to gender, with only a few companies taking steps to change the culture behind it. There is, however, a specific domain where women haven't been forgotten: voice assistants. You'll notice that most voice assistants are given women's names and voices: Cortana, Alexa or Siri (even though you can choose a man's voice for the latter). I wanted my assistant to be neither male nor female, opting instead for a neutral name and the most neutral voice I could find.
While Snips is a pretty neutral name, I couldn't resist calling my assistant Sudo, mainly because picturing the following interaction made me laugh:
- "Turn on the lights"
- ...
- "Sudo, turn on the lights"
- "Okay, turning on the lights"
Eco-friendly?
"Turn off the lights, this is not Versailles" is something that many kids have heard from their parents while growing up. The problem is, it's easy to forget to turn off a light or any other appliance, and there aren't always grown-ups around to remind you to do so - mostly because you might now be the grown-up in charge. Sudo will try to change your habits by being that grown-up, at least when it comes to saving energy.
Overview
Since this project involves different components, here's an overview of the pieces involved:
Setting up your house to be smarter
Setting up Snips
The first thing to do was to install Snips on the Raspberry Pi (I used a Raspberry Pi 3 model B). I first downloaded and flashed the latest version of Raspbian onto an SD card, then created an empty file called ssh and another called wpa_supplicant.conf at the root of the SD card, which I populated with my Wifi credentials before ejecting the card. Once the device has finished booting, you should see it on your local network and be able to ssh into it using the default username and password (pi / raspberry) and, if you're using Mac OS, the raspberrypi.local hostname.
After that, I followed the instructions given here: https://docs.snips.ai/getting-started/quick-start-raspberry-pi. In a nutshell, you install a tool called Sam (based on nodejs) on your development machine; Sam will be your main tool for interacting with your voice assistant, including the installation process, which you can launch using the init command once you've configured Sam to target your Raspberry Pi using its hostname or IP.
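For reference, here's a minimal sketch of what the wpa_supplicant.conf file placed at the root of the SD card can look like - the country code, SSID and passphrase below are placeholders you'll need to replace with your own:

```conf
country=FR
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="YourNetworkName"
    psk="YourWifiPassword"
}
```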
You then head to the Snips console https://console.snips.ai and create a new assistant. Mine uses the French language but you're free to use whichever suits you among the supported ones. I called my assistant Sue-Do and installed a simple skill from the Snips AppStore (Heure by Joseph which allows you to ask Snips for the time). Once you've added a skill to your assistant, you can then click the deploy button, which will provide you with a Sam installation command that you can copy/paste to your development machine.
Once the assistant is deployed, the sam status command allows you to see if all snips services are running on your device (except snips-analytics, which never seems to be running).
You're almost ready to try your assistant... except it can't hear you right now! For it to be able to do so, you need a microphone (and a speaker so that you can hear its answers). In terms of speaker, I did all my development using a powered external speaker that I hooked up to the audio jack of the Raspberry Pi and powered using one of the USB ports.
I used Seeed studio's 4-mic array, which you can just plug onto the Raspberry Pi (while it's still powered off): the getting started instructions should get you going.
Finally, after checking the sound output and input using sam setup audio and sam test microphone (or sam test speaker), you are ready to type sam watch and summon your assistant by saying "Hey Snips", followed by a request you know the installed skill can perform (in my case, asking for the time).
Setting up Home Assistant
Once again, I just followed the instructions found on the Home Assistant website to manually install Home Assistant on Raspbian, including the section on how to autostart Home Assistant at boot (in our case Raspbian uses systemd, so follow the instructions from that paragraph).
Note that you can use the raspi-config command line utility to change the hostname of your Raspberry Pi and therefore access Home Assistant via its local hostname (in my case, sudo.local).
Home Assistant is becoming more and more intuitive to set up and use: for example, I did not have to configure most of my components since the discovery component took care of that. My Ikea Tradfri lights were recognized as soon as I entered the gateway code.
Alternatively, this can be added to your configuration file (located in /home/homeassistant/.homeassistant/configuration.yaml) by adding the following lines:
tradfri:
  host: IP_ADDRESS
Connecting Snips and Home Assistant
Snips uses the hermes protocol to communicate between its components - as you'll see later in the project, these components can therefore be used individually or swapped, allowing for a modular experience. hermes relies on MQTT for transport, and the Snips installation therefore includes an MQTT broker. For Home Assistant to communicate with Snips, you can simply add the MQTT integration to your configuration, specifying Snips's MQTT broker as the broker to use. In my case, both services were running on the same Raspberry Pi, so I just added this to my configuration file:
mqtt:
  broker: 127.0.0.1
  port: 1883
Be careful: the instructions on this page do not give the right port for Snips' MQTT broker (I've submitted a pull request, so this might be corrected by the time anyone reads this).
Making sure Snips and Home assistant understand each other
The tutorial on the Snips website gives you a clear view of the steps to follow to create a Snips app by forking an existing app from the store. I followed the tutorial to the letter and created an intent (which I called ikealights). Once I reached the section called "Code actions", I changed things a bit. As the Snips documentation mentions, code snippets on the console are meant for simple and quick interactions. I found it easier to keep all my code on the Raspberry Pi itself (and of course I'd then be able to push it to Github). So when configuring the action for my app, rather than selecting the tradfri Home Assistant component in the list, I selected the python_script component (hence the empty code snippet in the screenshot below).
I then added these lines to my configuration.yaml file:
snips:
  feedback_sounds: true
python_script:
intent_script: !include intent_script.yaml
The first entry adds Snips to Home Assistant; the second allows me to add my own scripts (in the python_scripts directory that I created in the same folder as the configuration.yaml file). The last line includes a new file, intent_script.yaml, in my configuration (I could instead have added the contents of this file right after "intent_script:"). This file tells Home Assistant what action to take when receiving an intent from Snips - in our case, the only intent we will receive for now is the one called ikealights. Here's the content of intent_script.yaml so far:
ikealights:
  speech:
    type: plain
    text: "OK, j'allume la lumière"
  action:
    - service: python_script.ikealights
      data_template:
        lamp_name: "{{ lamp_name }}"
In a few words, we first declare which intent we're replying to, then what to do when we receive it: what Snips should answer, and what action to launch. In my case, I ask Home Assistant to launch a python script called ikealights, defined in the python_scripts/ikealights.py file as follows:
# Retrieve the entity name passed along by the intent_script
lamp_name = data.get('lamp_name')
if lamp_name is not None:
    logger.info("turning on lamp {}".format(lamp_name))
    service_data = {'entity_id': lamp_name, 'brightness': 255}
    hass.services.call('light', 'turn_on', service_data, False)
I'm using a python script because I had already written this one a few hours earlier while trying to understand how scripts work; I suspect this action code could have been included directly in intent_script.yaml.
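For instance, here's a hypothetical version of intent_script.yaml that calls the light service directly, with no python script involved - an untested sketch, but it should be equivalent:

```yaml
ikealights:
  speech:
    type: plain
    text: "OK, j'allume la lumière"
  action:
    - service: light.turn_on
      data_template:
        entity_id: "{{ lamp_name }}"
        brightness: 255
```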
The whole setup now works and you can ask Snips to turn your lights on and off - which in itself is pretty nice! And now, onto the power consumption skill!
The power consumption skill
The easiest way to track a device's usage inside Home Assistant is to use the history_stats component. In your configuration.yaml file, add the following lines, replacing light.streetlight with the lamp that you want to track:
sensor:
  - platform: history_stats
    name: streetlight on today
    entity_id: light.streetlight
    state: 'on'
    type: time
    start: '{{ now().replace(hour=0).replace(minute=0).replace(second=0) }}'
    end: '{{ now() }}'
This sensor will give you the number of hours this specific lamp has been switched on since midnight. Now we can add a new skill for Snips to answer: let's start from the end, adding these lines to our intent_script.yaml file to give Snips the correct answer:
consumption:
  speech:
    type: plain
    text: La lampe a passé {{ states('sensor.streetlight_on_today') }} heures allumée aujourd'hui
You can then head to the Snips console and add a new intent called consumption - or whatever you entered in the first line:
You can then create as many training examples as ways you'd ask Snips about your consumption: "How much time has this lamp been on today?", "What's the usage for this lamp today?", etc. Finally, you can create your app answering this intent, specifying that the Home Assistant component answering it is intent_script (since we haven't created a dedicated python script this time).
After deploying your assistant using sam (the logs should mention your new skill), you can restart Home Assistant and casually ask Snips how much time your lamp has spent switched on today.
Snips takes charge of Versailles
So far, your assistant has learned what to answer if you ask for your consumption. But since that's not how most of us work, we need it to actually go one step further and initiate the conversation. We will therefore create what Home Assistant calls an automation. Basically, we will set a rule so that if the light has been on for more than 3 hours, Snips will ask you if you want to turn it off. You can add your automation directly into your configuration.yaml file, but it's better to use a separate file called automations.yaml and include it in the conf file.
- alias: streetlight_overuse
  trigger:
    - platform: numeric_state
      entity_id: sensor.streetlight_on_today
      above: 3
  action:
    - service: snips.say_action
      data:
        text: 'Dis, tu veux pas éteindre ta lumière ?'
        intent_filter:
          - lightsTurnOff
Here, as soon as the sensor we created earlier rises above 3, the assistant will initiate a new dialog session, saying your witty comment out loud and expecting the user to ask it to turn off a light in return. It's up to the user to actually do that or decide to keep the light on anyway.
With this structure you have to create one sensor and one corresponding automation per light, which lets you choose a different threshold for each trigger depending on where the lamp is used, since some lamps don't need to be on more than a few hours a day.
Giving your assistant a body
So far, our assistant has worked on the temporary setup from the beginning of the article, namely a powered speaker connected directly to the bare Raspberry+ReSpeaker combo.
In order to create a fully fledged persona (and protect the PCBs from dust), I decided to create a housing for my assistant. I wanted to be able to unplug it without removing any kind of cover and also wanted to get rid of the speaker I had.
I used one of Adafruit's little mono amps to take the signal from the Raspberry's audio jack (I purchased a jack-to-wire adapter), amplify it and send it to a little speaker. I actually cut a micro USB cable that I had and used the USB part to connect the amp's power input on one end to one of the Raspberry's USB ports on the other (this amp only needs 0.4A and the USB port can supply up to 0.5A). I plugged the micro USB part into the Pi's power input, and the other half into a micro USB breakout board. This allowed me to relocate the Pi's power input to wherever I saw fit (instead of on the Pi itself) without having to power it through the GPIO (I hate the idea of soldering anything to the Pi).
Now that the electronics were out of the way, I was free to integrate the whole thing into the housing I had chosen: an old Nabaztag that I had taken apart. If you don't know what a Nabaztag is, it was one of the first consumer IoT devices: a rabbit that connected to your Wifi and checked your emails, told you about the weather, etc., in a whimsical voice, using some LEDs.
I kept the internal structure of the rabbit despite its bulkiness because it let me keep using the ears in their existing sockets (even though I removed the motors driving them).
I used some putty to fix my speaker where the Nabaztag's speaker used to be, and some hot glue to fix the micro USB breakout board at the back of the rabbit so you'd be able to plug and unplug it easily. I then lasercut a simple rectangle out of MDF with the right mounting holes for the Raspberry Pi, so that I could hot glue that panel to the black structure. I tried to fit the cables as well as I could without having to cut them.
Calling it Sudo
At this point, our robot slash voice assistant still answers to the default hotword "Snips". Like previously mentioned, Snips allows you to use any other hotword, and even provides you with the script to do so! The procedure is detailed in this tutorial; I had to do it twice since I think there was still too much noise around me when I first recorded the samples. Once you've done that, the tool will generate your very own model (it's actually contained in a folder), which you can then move to /etc/snips/ and reference in the configuration file - don't forget to also select "personal hotword" on the console before deploying your assistant again.
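As a rough illustration (the key names and path below are assumptions from memory, so double-check them against the tutorial for your Snips version), the reference in /etc/snips.toml looks something like this:

```conf
[snips-hotword]
# Hypothetical path: point this at the folder generated by the recording tool
model = "/etc/snips/hotword_sudo"
```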
Taking it a bit further
So right now your assistant is able to turn your lights on and off, track how much you're using them during the day and alert you if you've kept them on for too long. But how do you know if that made any kind of difference? How can you compare today's results to yesterday's?
In order to achieve this, we'll add a few more sensors to our configuration file:
  - platform: history_stats
    name: 'streetlight on today'
    entity_id: 'light.streetlight_1'
    state: 'on'
    type: time
    start: '{{ now().replace(hour=0).replace(minute=0).replace(second=0) }}'
    end: '{{ now() }}'
  - platform: history_stats
    name: 'streetlight on yesterday'
    entity_id: 'light.streetlight_1'
    state: 'on'
    type: time
    end: '{{ now().replace(hour=0).replace(minute=0).replace(second=0) }}'
    duration:
      hours: 24
  - platform: history_stats
    name: 'streetlight on this week'
    entity_id: 'light.streetlight_1'
    state: 'on'
    type: time
    start: '{{ as_timestamp( now().replace(hour=0).replace(minute=0).replace(second=0) ) - now().weekday() * 86400 }}'
    end: '{{ now() }}'
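The start template for the weekly sensor is the trickiest one: it subtracts the number of elapsed weekdays (times 86400 seconds) from today's midnight to land on Monday. Here's the equivalent logic in plain Python as a sanity check (the function name is mine):

```python
import datetime

def week_start(now):
    # Midnight today, then step back to Monday (weekday() is 0 for Monday),
    # mirroring the Jinja template used by the 'this week' sensor
    midnight = now.replace(hour=0, minute=0, second=0, microsecond=0)
    return midnight - datetime.timedelta(days=now.weekday())

# A Wednesday afternoon maps back to the Monday of that same week at midnight
print(week_start(datetime.datetime(2019, 2, 13, 15, 45)))  # 2019-02-11 00:00:00
```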
In order to check that those values did indeed work, I added them to my Home Assistant homepage using badges (you have to use the new Lovelace UI):
Now that Home Assistant offers us those new values, we can update our Snips skill so that we can ask Snips for the stats from these periods. Head back to the console and add a slot to your skill, choosing the integrated snips/datetime type.
You can then enter a few training examples in which you'll teach Snips which part of the sentence corresponds to the slot. As you can see on the screenshot, using the builtin type means that Snips is able to provide you with the exact date from what the user said.
On the Home Assistant end, we now need to understand the date that Snips sends us. This time we will again create a specific python script, called consumption.py, containing the lines below:
# Snips sends the date as an ISO-like string; slice out each component
queried_date_string = data.get('date')
qds_year = queried_date_string[0:4]
qds_month = queried_date_string[5:7]
qds_day = queried_date_string[8:10]
qds_hour = queried_date_string[11:13]
qds_min = queried_date_string[14:16]
qds_sec = queried_date_string[17:19]
queried_date = datetime.datetime(int(qds_year), int(qds_month), int(qds_day), int(qds_hour), int(qds_min), int(qds_sec))

# Day offset between the queried date and now: -1 means today, -2 yesterday
delta = queried_date - datetime.datetime.now()
if delta.days == -1:
    hours = hass.states.get('sensor.streetlight_on_today').state
    service_data = {'text': "La lampe a passé {} heures allumée aujourd'hui".format(hours)}
elif delta.days == -2:
    hours = hass.states.get('sensor.streetlight_on_yesterday').state
    service_data = {'text': "La lampe a passé {} heures allumée hier".format(hours)}
elif -9 < delta.days < -2:
    hours = hass.states.get('sensor.streetlight_on_this_week').state
    service_data = {'text': "La lampe a passé {} heures allumée cette semaine".format(hours)}
else:
    service_data = {'text': "Je ne comprends pas cette date"}
hass.services.call('snips', 'say', service_data, False)
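The day-offset comparison works because Python floors negative timedeltas: a date earlier today minus the current time gives days == -1, yesterday gives -2, and so on. A standalone illustration (the function name is mine):

```python
import datetime

def day_offset(queried, now):
    # timedelta.days floors toward minus infinity, so any moment earlier
    # today gives -1, yesterday gives -2, etc.
    return (queried - now).days

now = datetime.datetime(2019, 2, 12, 18, 30)
print(day_offset(datetime.datetime(2019, 2, 12), now))  # -1: today
print(day_offset(datetime.datetime(2019, 2, 11), now))  # -2: yesterday
print(day_offset(datetime.datetime(2019, 2, 7), now))   # -6: earlier this week
```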
(For some reason I wasn't able to use the datetime.strptime function inside the script, only time.strptime, hence all the parsing at the beginning.) We also need to alter the intent_script this way:
consumption:
  speech:
    type: plain
    text: "Je consulte mes archives, ne bouge pas"
  action:
    - service: python_script.consumption
      data_template:
        date: "{{ date }}"
And there you have it, your assistant can now tell you how much time your lamp has been on the previous day or the current week, so that you can track your progress.
Ending words
While I had fun doing this project, I can see how it could still easily be improved:
- having to create several sensors for each lamp is tedious; it would be easier to store the state history in a MySQL database and query every state change for the "light" domain, either with a specific lamp in mind or to compute the total uptime
- lamps are not the only energy-hungry devices, and LED lights consume less energy than, say, an AC unit left on: we could extend this to HVAC, etc. Furthermore, in the future I'll add the Linky sensor (for France) to be able to take non-connected devices into account
- finally, the gamified aspect could be improved, with Snips asking you to set an improvement objective for each week, using the builtin percentage slot type (e.g. 20% less uptime than the previous week) and warning you during the week if your daily ratio is close to exceeding its quota, so that you can meet your goal at the end of each week. The LEDs on the ReSpeaker could help visualize the quota consumption.
Bonus: a few more headshots and some behind-the-scenes from the photo shoot