Last month, we lost Sebastian's best friend and a loving part of our family, Phreddy.
What we didn't realize at the time was how much he had depended on her for his hearing. Without her around to "cue" off of, he has really been lost. As I watched him deal with the depression associated with loss, I became acutely aware that he had also lost a different part of himself.
This project is an attempt to help get Sebastian some ears...
I decided to go with the Adafruit WICED Feather for this project, as it natively supports TLS v1.2, which Amazon requires on their IoT service. This allows us to talk directly to AWS, rather than having to go through a Pi or some other type of gateway. (There are methods to use other chipsets to connect and authenticate with certs such as Mongoose OS, but that is beyond the current scope of this project.) Connecting this to the Cypress Pioneer Board is simply a matter of tying our two I2C wires together.
Well, let's break this down into the inputs we want Sebastian to know about:
- Doorbell/knock at the front of the house
- Knock at the back entryway
Those are the obvious ones, but why do we have to limit this to audio cues?...
Let's go ahead and add to this with:
- Motion sensors in the backyard
- Certain notifications just during daylight
- Override using AWS IoT Button
- When the mailman comes, even though it is a remote mailbox...?
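The rules above can be sketched as a small decision function. This is purely illustrative (the names, and the choice to treat the IoT Button as a global mute, are my assumptions, not part of the build):

```cpp
// Every alert source we want Sebastian to know about.
enum class AlertSource { Doorbell, FrontKnock, BackKnock, BackyardMotion, Mailbox };

// Decide whether an event should reach the collar.
// Assumption: backyard motion only matters during daylight, and a press of the
// AWS IoT Button mutes everything until pressed again.
bool shouldAlert(AlertSource src, bool isDaylight, bool overrideMuted) {
    if (overrideMuted) return false;
    if (src == AlertSource::BackyardMotion) return isDaylight;
    return true;
}
```

The exact gating (which sources are daylight-only, what the button actually overrides) is a policy choice you can tune once everything is wired up.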
So, the question now stands, "how do we present these different input scenarios to him in a manner that he can adapt to and easily distinguish?"...
Innotek makes a line of training collars, bark collars, and "invisible fences". Now, when most people hear the term "shock collar," they get a little freaked out. When used properly, though, these collars simply deliver a "touch sensation" that is not painful at all.
This gives us a binary alert mechanism, but how will Sebastian know whether he needs to go bark at the front door or the back door? We have a few options:
- Pulsed patterns
- Position of stimulus
- Supplemental lighting
We can use the rumble motors from this old PlayStation controller to send patterns that vary in both sequence and intensity.
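One way to represent those patterns is as a list of (intensity, duration) steps that the sketch later plays back over PWM. This is a sketch of the idea, not the project's actual code; all names and values are illustrative:

```cpp
#include <vector>
#include <cstdint>

// One vibration step: PWM intensity (0-255) and duration in milliseconds.
struct PulseStep {
    uint8_t intensity;
    uint16_t ms;
};

// Build an n-pulse pattern at a fixed intensity, alternating motor-on and
// motor-off steps so the pulses are distinct.
std::vector<PulseStep> makePattern(int pulses, uint8_t intensity,
                                   uint16_t onMs, uint16_t offMs) {
    std::vector<PulseStep> pattern;
    for (int i = 0; i < pulses; ++i) {
        pattern.push_back({intensity, onMs});
        pattern.push_back({0, offMs});  // motor off between pulses
    }
    return pattern;
}
```

For example, the front door could be two short, strong pulses and the back door three longer, softer ones; distinct enough that a dog can learn to tell them apart.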
I'm sure most folks have heard, at one point or another, the myth that dogs only see in black and white. The truth is, their color vision is very similar to that of a human with red-green color blindness.
Jay Neitz, Ph.D., a leader in the field of ophthalmology, released a paper entitled "Color vision in the dog" that roughly comes to the following conclusion:
There is even "an app for that".
So, with this being known, perhaps we can tailor our lighting feedback to accommodate this spectrum by using yellow and blue LEDs...
But we are getting ahead of ourselves... First, we need to get our AWS IoT conduit up and running. I am assuming you have your console set up (if not, head over to https://aws.amazon.com/console/).
Now go ahead and log in and head to IoT under Services.
Click
Now we can fill in our sketch for the WICED Feather and connect up to AWS IoT.
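Whatever the sketch ends up looking like, the messages it publishes to AWS IoT are just JSON strings on an MQTT topic. Here is a minimal, testable helper that builds such a payload; the topic name, field names, and the helper itself are assumptions for illustration, not the project's actual format:

```cpp
#include <string>

// Build the JSON payload we might publish to an assumed topic such as
// "sebastian/alerts" when a sensor fires.
std::string alertPayload(const std::string& source, bool daylight) {
    return std::string("{\"source\":\"") + source +
           "\",\"daylight\":" + (daylight ? "true" : "false") + "}";
}
```

In the real sketch this string would be handed to the MQTT client's publish call along with the topic, with TLS v1.2 handled natively by the WICED Feather.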