The Plan:
My home automation idea is to use a Raspberry Pi 2 running Windows 10 IoT Core to interface with the Avatar Remote WiFi Modules that I previously created. These modules publish sensor data or accept On/Off commands.
Specifically, I will develop a Windows 10 application that subscribes to and collects Avatar sensor data and can issue Avatar actuator commands. The application will display any discovered Avatar modules and allow the user to select one to display its sensor data or issue a command. The sensor data is raw voltage, frequency, or digital state. The application will decode the raw information into temperature, humidity, light intensity, and so on, depending on the data type.
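To give a feel for the decoding step, here is a minimal sketch in Java (the language most of the Framework application code is in today). The conversion formulas are assumptions for illustration: a TMP36-style temperature sensor (500 mV at 0 °C, 10 mV per degree) and a simple non-zero-means-On digital state; the actual Avatar data types and scaling are up to each module's configuration.

```java
// Hypothetical decoder mapping raw Avatar readings to engineering units.
// The TMP36-style formula and the class/method names are illustrative only.
public class AvatarDecoder {

    // Assumed TMP36-style conversion: 500 mV at 0 degrees C, 10 mV per degree.
    public static double millivoltsToCelsius(double mv) {
        return (mv - 500.0) / 10.0;
    }

    // Digital state: any non-zero raw value is treated as "On".
    public static String digitalToState(int raw) {
        return raw != 0 ? "On" : "Off";
    }

    public static void main(String[] args) {
        System.out.println(millivoltsToCelsius(750.0)); // 25.0
        System.out.println(digitalToState(1));          // On
    }
}
```

The point is that modules stay dumb and cheap: they publish raw counts or millivolts, and the application owns the per-data-type conversion.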
Additionally, I would like to gain experience writing code that interfaces with Cortana so she can alert users when sensor data is above or below a threshold. I'll add an option to the application to configure thresholds on data such as temperature or humidity; when a threshold is exceeded, the application will generate a notification to the user.
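The threshold logic itself is simple; a sketch of what the check might look like is below. All names and limits here are made up for illustration: in the real application the thresholds would come from the configuration UI, and the returned message would feed a toast or spoken notification rather than being printed.

```java
// Hypothetical high/low threshold monitor. In the real app the limits would
// be user-configured and the message would drive a notification, not stdout.
public class ThresholdMonitor {
    private final double low;
    private final double high;

    public ThresholdMonitor(double low, double high) {
        this.low = low;
        this.high = high;
    }

    // Returns an alert message when the value crosses a limit,
    // or null when the reading is in range.
    public String check(String sensor, double value) {
        if (value > high) return sensor + " is above " + high + " (" + value + ")";
        if (value < low)  return sensor + " is below " + low + " (" + value + ")";
        return null;
    }

    public static void main(String[] args) {
        ThresholdMonitor humidity = new ThresholdMonitor(30.0, 60.0);
        System.out.println(humidity.check("Humidity", 72.5)); // alert message
        System.out.println(humidity.check("Humidity", 45.0)); // null, in range
    }
}
```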
As a stretch goal, I would like to send selected sensor data to an Azure application so smart phones, tablets, or PCs can access the sensor data remotely, and also have Cortana alert remote users of threshold crossings. I would like this to have a Halo-esque feel, mimicking the game when Cortana appears to alert the Master Chief about impending danger.
Background Information:
As mentioned in the plan I submitted, I already had working prototypes of Wireless Avatar Remote Modules. In fact, I had just finished building a new breadboard prototype of an Avatar Module that uses an ESP8266 board for WiFi just before being selected for the Windows 10 IoT competition.
Avatar Modules can be made from any microcontroller or device that supports GPIO, ADC, and PWM functions and has the ability to implement an IP stack that supports UDP sockets. An Avatar Module becomes part of what I call the Avatar Framework. To join the Framework, a module must meet some basic requirements to be compatible with Avatar Applications: Discovery, Configuration, and Data Publishing.
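The Discovery requirement boils down to a UDP request/reply exchange. The sketch below runs one such exchange over loopback in Java; the probe and reply strings ("AVATAR?" and "AVATAR LightSensor1 ADC") are invented placeholders, since the actual Avatar wire format isn't documented here. The shape of the exchange is what matters: an application sends a probe, and each module answers with enough identity information to be listed and configured.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Sketch of a Discovery exchange over loopback UDP. The message format is
// a placeholder; the real Avatar protocol defines its own probe and reply.
public class DiscoveryDemo {

    // Runs one probe/reply exchange and returns the module's reply string.
    static String runExchange() {
        try {
            // "Module" side: wait for one probe, answer with identity, exit.
            DatagramSocket module = new DatagramSocket(0, InetAddress.getLoopbackAddress());
            Thread moduleThread = new Thread(() -> {
                try {
                    byte[] buf = new byte[64];
                    DatagramPacket probe = new DatagramPacket(buf, buf.length);
                    module.receive(probe);
                    byte[] reply = "AVATAR LightSensor1 ADC".getBytes(StandardCharsets.UTF_8);
                    module.send(new DatagramPacket(reply, reply.length,
                            probe.getAddress(), probe.getPort()));
                } catch (Exception ignored) {
                }
            });
            moduleThread.start();

            // "Application" side: send the probe and collect the reply.
            try (DatagramSocket app = new DatagramSocket(0, InetAddress.getLoopbackAddress())) {
                app.setSoTimeout(2000);
                byte[] probe = "AVATAR?".getBytes(StandardCharsets.UTF_8);
                app.send(new DatagramPacket(probe, probe.length,
                        InetAddress.getLoopbackAddress(), module.getLocalPort()));
                byte[] buf = new byte[64];
                DatagramPacket reply = new DatagramPacket(buf, buf.length);
                app.receive(reply);
                moduleThread.join();
                module.close();
                return new String(reply.getData(), 0, reply.getLength(), StandardCharsets.UTF_8);
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(runExchange()); // AVATAR LightSensor1 ADC
    }
}
```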
I call it a framework because it will eventually become more than just an API between control applications and the environment that Sensor Modules sense and Actuator Modules control. Eventually it will support a built-in system health monitoring service, data recording applications, simulator applications capable of playing back recorded data to aid in debugging control applications, mediator applications that process sensor data to generate and publish derived data types and events, and, last but not least, translation gateway applications to attach non-Avatar devices such as a WeMo light switch or receptacle.
Here is a top-level use case diagram to help visually show the Framework. I've also included three sub use case diagrams, one for each of the three requirements an Avatar Sensor must implement.
So where does Win10 IoT Core/Azure and Raspberry Pi2 fit?
To answer that question, let's talk about the Controller Applications that would use the Avatar Framework, and also about the Avatar Remote Actuators. I have purposely designed the Framework to work locally, tucked safely behind a firewall; essentially, the Avatar Framework is a Network of Things (NoT), not an Internet of Things. There are benefits to be gained by this. First, the sensors and actuators don't have to be bogged down implementing heavyweight security protocols; they can be simple and cheap. Second, I won't have to use TCP and its point-to-point connections to communicate between nodes, and point-to-point doesn't scale very well. If you send your data out on the Internet, you're going to need TCP's guaranteed, in-order packet delivery; if you keep most of your data behind the firewall on your own controlled network, UDP and multicasting will be sufficient. To help you visualize some of the ways the Framework can be used, I'll throw in a few existing diagrams I had, starting with a simple smart phone and single Avatar Sensor configuration, up to configurations with many sensors, multiple Controller Apps, and an external GUI interface on a smart phone or tablet.
Keep in mind that all those configurations are on a local network, tucked safely behind a firewall. I've mentioned on my blog before that I want any external cloud interface to export only the data that is needed for some beneficial feature of the system, not just because it can or because it sounds cool. After all, who wants Kim Jong-un controlling their lights or HVAC system once their cloud server is hacked. :)
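The scaling argument for multicast can be shown concretely: one published datagram reaches every subscriber that has joined the group, and the sensor never needs to know how many listeners exist. Below is a minimal Java sketch; the group address, port, and message format are arbitrary picks for the demo, not part of any defined Avatar protocol.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.MulticastSocket;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// One multicast publish, two independent subscribers. The group address,
// port, and payload are illustrative placeholders.
public class MulticastDemo {
    static final String GROUP_ADDR = "239.255.42.99"; // assumed group
    static final int PORT = 45999;                    // assumed port

    // Publishes one datagram and returns what each subscriber received.
    static List<String> publishOnce() {
        try {
            InetAddress group = InetAddress.getByName(GROUP_ADDR);
            MulticastSocket sub1 = new MulticastSocket(PORT);
            MulticastSocket sub2 = new MulticastSocket(PORT);
            sub1.setSoTimeout(2000);
            sub2.setSoTimeout(2000);
            sub1.joinGroup(group);
            sub2.joinGroup(group);

            // Publisher: a single send, no per-listener connections.
            try (DatagramSocket pub = new DatagramSocket()) {
                byte[] msg = "LightSensor1 mv=1234".getBytes(StandardCharsets.UTF_8);
                pub.send(new DatagramPacket(msg, msg.length, group, PORT));
            }

            List<String> received = new ArrayList<>();
            for (MulticastSocket sub : new MulticastSocket[] { sub1, sub2 }) {
                byte[] buf = new byte[128];
                DatagramPacket p = new DatagramPacket(buf, buf.length);
                sub.receive(p);
                received.add(new String(p.getData(), 0, p.getLength(),
                        StandardCharsets.UTF_8));
                sub.close();
            }
            return received;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(publishOnce());
    }
}
```

Adding a third Controller App is just one more `joinGroup` call; with TCP it would be one more connection for every sensor to maintain.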
The other item I pointed out as lacking from my older drawings is Avatar Remote Actuators. While I mention them as being part of the Framework, I have not documented many detailed requirements or details of how they should work. Part of the reason is that I've really focused on defining the Framework from the sensor point of view; I figure that once I have fleshed out a couple of control applications, I will get a feel for exactly how an app would want to control external devices.
Having said all that, I saw this Windows IoT Core competition, in which Microsoft asked, "If we gave you a Raspberry Pi 2, what would you do with it?" My first thought was, does it run Java? That is what I've used with the BeagleBone and BeagleBone Black. Microsoft also mentioned that adding an Azure interface would be worth extra bonus points.
So I took a look at Windows IoT Core, decided it has some promising possibilities, submitted my plan, and here I am, coding and typing away.
While most of the software for my Avatar Framework is microcontroller C and Java code that can be used on Windows, Linux, or Android, I can see value in porting the Java to C#, since the languages are not that different, and then I would be able to export status to Azure from Windows IoT Core. Additionally, I think it's worth at least trying to leverage the Cortana interface to see if there are some interesting things she can add to a home automation Controller App.
Additionally, I can see Windows IoT Core implementing an Avatar Remote Actuator. I still have a lot of work to do in defining Actuators beyond supporting Discovery and Configuration, but I can see the need for an actuator to be a little more robust than a simple microcontroller. The reason is that it may need to take some quick, decisive action on its own if it is controlling a device that could pose a safety risk if not shut down when a particular condition occurs.
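One way to frame that "decisive action" idea is a local safety interlock: the actuator drops its output on its own if a monitored value exceeds a hard limit, or if the controller goes silent, without waiting for a network command. Everything in the sketch below (class name, limits, heartbeat timeout) is hypothetical, since the Framework doesn't yet define actuator behavior.

```java
// Hypothetical local safety interlock for an Avatar actuator. The Framework
// does not yet define actuator requirements; names and limits are invented.
public class SafetyInterlock {
    private final long heartbeatTimeoutMs; // max silence from the controller
    private final double hardLimit;        // local shutdown threshold
    private long lastHeartbeatMs;
    private boolean outputOn = true;

    public SafetyInterlock(long heartbeatTimeoutMs, double hardLimit, long nowMs) {
        this.heartbeatTimeoutMs = heartbeatTimeoutMs;
        this.hardLimit = hardLimit;
        this.lastHeartbeatMs = nowMs;
    }

    // Called whenever a command or keep-alive arrives from the controller.
    public void heartbeat(long nowMs) {
        lastHeartbeatMs = nowMs;
    }

    // Called on every local sensor sample: fail safe (output off) if the
    // value is out of bounds or the controller has gone silent.
    public boolean evaluate(double sensedValue, long nowMs) {
        if (sensedValue > hardLimit || nowMs - lastHeartbeatMs > heartbeatTimeoutMs) {
            outputOn = false; // latch off; requires explicit reset to resume
        }
        return outputOn;
    }

    public static void main(String[] args) {
        SafetyInterlock heater = new SafetyInterlock(5000, 80.0, 0);
        System.out.println(heater.evaluate(45.0, 1000)); // true: all normal
        System.out.println(heater.evaluate(92.0, 2000)); // false: over limit
    }
}
```

Note the latch: once tripped, the output stays off until something deliberate resets it, which is the behavior you'd want from a device guarding against a hazard.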
The 30 Day Reality:
So now you know what my plan is, but what have I been able to accomplish in 30 days? As I mentioned in my video, I wasn't able to reuse any of the Java code I had, so I needed to port it over to C#; this took me over two and a half weeks of the competition time. However, once I got the Framework ported over and stuffed into a DLL, I was able to quickly whip up a demo application to visualize the data from the discovery process and from subscribing to data. I was also able to implement a rudimentary light intensity alarm and have the IoT Core speak when alarms were triggered. On IoT Core, however, the only voice is male, and on my Windows 10 PC there is a female voice, but it is not Cortana. It seems the only way to get an interaction with Cortana is if she starts your application; then you can keep a conversation going. I posted a project demo video and also a code walkthrough video. Enjoy!