This is a project made for the first assignment of the Internet of Things course at Sapienza University of Rome.
I have created a cloud-based IoT system that collects information from a set of virtual environmental sensors using the MQTT protocol, on top of the AWS IoT technology. I have also implemented a simple website to display the data collected from the sensors.
The system is composed of two Python scripts, which represent two Environmental Stations that generate data through the virtual sensors (Temperature, Humidity, WindDirection, WindIntensity and RainHeight). The resulting values are published on a topic over the MQTT channel as JSON documents. A third Python script (in the role of Subscriber) receives the data from the broker and inserts it into the table EnvironmentalStationDB in DynamoDB. From the database, the records are shown on the website.
I suggest cloning the GitHub repository so you can follow all the steps that I will mention below.
To use it you need to install the AWS IoT Device SDK for Python, which you can get with
> pip3 install AWSIoTPythonSDK
(note that we are using pip3 since the script is written using Python 3).
For further information about the SDK check out here.
The following sections are a hands-on tutorial on how to set up and run the IoT system.
AWS IoT Core
First, you need to create an AWS account; if you are a student, you should use your institutional e-mail address to sign up for AWS Educate. Once you have created a profile, go to AWS Starter Account and open the AWS Console. Following this guide you will be able to:
- Create a Thing
- Register a Device
- Configure the Device and activate it with certificates and policy
Then you have to create an IAM user, since this web service helps you securely control access to AWS resources. With this guide you can do it.
Next, install and configure the AWS CLI settings to test the program: all you need is a directory on your PC that keeps the settings and keys of the Thing created before. In a Linux terminal, create it with these commands:
> mkdir ~/.aws
> cd ~/.aws
> touch config credentials
Then fill in the two files you just created with the AWS Access Key ID, the AWS Secret Access Key, the region name and the output format.
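The two files follow the standard AWS CLI layout; here is a template with placeholder values (replace them with your own keys):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

# ~/.aws/config
[default]
region = us-east-1
output = json
```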
For security reasons I will not post the keys in this tutorial, but you can contact me if you want to test my system.
Now the platform is configured and we are ready to develop the environmental stations and the virtual sensors.
Virtual Sensors
Using Python I have created two stand-alone programs that represent virtual environmental stations; each one periodically generates a set of random values for 5 different sensors:
- temperature (-50 to 50 °C)
- humidity (0 to 100%)
- wind direction (0 to 360 degrees)
- wind intensity (0 to 100 m/s)
- rain height (0 to 50 mm/h)
Each program, therefore, computes values through its virtual sensors and publishes a message composed of those values on the MQTT channel, for as long as the station runs correctly.
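Sketched below is how one round of readings could be generated; the field names are illustrative and may differ from the exact ones used in the scripts:

```python
import json
import random
from datetime import datetime

def generate_readings(station_id):
    """One random sample per virtual sensor, plus station ID and timestamp."""
    return {
        "ID": station_id,  # illustrative field names
        "Datetime": datetime.now().strftime("%Y-%m-%d %H:%M:%S"),
        "Temperature": random.randint(-50, 50),    # Celsius
        "Humidity": random.randint(0, 100),        # %
        "WindDirection": random.randint(0, 360),   # degrees
        "WindIntensity": random.randint(0, 100),   # m/s
        "RainHeight": random.randint(0, 50),       # mm/h
    }

# The dictionary is serialized to JSON before being published.
payload = json.dumps(generate_readings("Station1"))
```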
Subscribe to a topic
At this point we want to listen for the messages travelling over the MQTT channel.
The script is in the folder VirtualEnvironmentalStations/MQTT, so from that directory we can start the program with this command:
> python3 SubscriberClient.py
In this phase a client is connected to the broker, waiting for incoming messages.
Besides subscribing to a topic, this script also sends the received records to DynamoDB, in particular to the table EnvironmentalStationDB. Since there are no native Python libraries for talking to the AWS services, I have used the Boto3 library, which allows you to write software that makes use of them. To install it on your machine:
> pip install boto3
With Boto3 we initialize the connection to DynamoDB and the access to our table. In the on_message callback, invoked for every message received on the subscribed topic, I convert the string to JSON (in the same way as in the next section) and put the document into the table with the function put_item(Item). Once at least one publisher is running, you can verify that the values have been inserted into the DynamoDB table.
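A minimal sketch of that callback, assuming the AWSIoTPythonSDK callback signature; the Boto3 calls are shown commented out because they require live AWS credentials:

```python
import json

def item_from_payload(payload):
    """Turn the JSON string received over MQTT into a plain dict for DynamoDB."""
    return json.loads(payload)

def on_message(client, userdata, message):
    item = item_from_payload(message.payload.decode("utf-8"))
    # With credentials configured in ~/.aws, the insert would be:
    # import boto3
    # table = boto3.resource("dynamodb").Table("EnvironmentalStationDB")
    # table.put_item(Item=item)
    print("Received record from", item.get("ID"))
```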
Other functions are explained in the next phase, which describes how to start the Publisher clients.
Generate and Publish values
The scripts are available in the folder VirtualEnvironmentalStations/MQTT. For them to work correctly, the certificates that I will pass to you must be placed in the folder VirtualEnvironmentalStations/certs, otherwise a security exception will be raised.
With the commands (on two different tabs):
> python3 Station1.py
and
> python3 Station2.py
you can start the environmental stations.
In each program I first declared two functions: one to open the connection between the MQTT client and the broker, and one to receive the broker's responses when a record has arrived correctly. The script then configures all its parameters, from the AWS endpoint to the clientID, including the definition of the certificates and the keys. As soon as everything is configured, it attempts the connection to the AWS server, through which it can exchange messages and resources.
Once connected to the server, it starts generating values that simulate the behaviour of the sensors: it computes random.randint for every sensor and records the date and time of the reading. I insert the sensors' values into a dictionary, turn the dictionary into JSON with the json library, and publish it on the topic sensor/data with Quality of Service 1.
I chose this QoS level (1 means at least once) because, even though it can cause duplicates, it gives higher reliability than QoS 0 (at most once).
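The publishing loop can be sketched as below. The endpoint and certificate paths in the commented part are placeholders; `client` only needs a publish(topic, payload, qos) method, which AWSIoTPythonSDK's AWSIoTMQTTClient provides:

```python
import time

TOPIC = "sensor/data"
QOS_AT_LEAST_ONCE = 1

def publish_forever(client, make_payload, period=5, max_messages=None):
    """Publish a fresh payload on TOPIC every `period` seconds with QoS 1."""
    sent = 0
    while max_messages is None or sent < max_messages:
        client.publish(TOPIC, make_payload(), QOS_AT_LEAST_ONCE)
        sent += 1
        if max_messages is not None and sent >= max_messages:
            break
        time.sleep(period)
    return sent

# Real usage, once the certificates are in VirtualEnvironmentalStations/certs
# (endpoint and file names below are placeholders):
# from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient
# client = AWSIoTMQTTClient("Station1")
# client.configureEndpoint("<your-endpoint>-ats.iot.us-east-1.amazonaws.com", 8883)
# client.configureCredentials("certs/root-CA.crt",
#                             "certs/private.pem.key",
#                             "certs/certificate.pem.crt")
# client.connect()
# publish_forever(client, lambda: '{"ID": "Station1"}')
```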
When the JSON document travels over the MQTT channel, several prints let you check that the records are sent correctly.
DynamoDB
When all the scripts are running, the result of the whole procedure can be viewed either in the terminal tabs or directly in the DynamoDB service. Opening the database window (from the AWS console) and choosing the EnvironmentalStationDB table, you will find all the records handled so far.
I also created a rule on AWS IoT Core, but since it did not work correctly, the only thing I could do (as I said before) was to insert the records from a script; otherwise this process would not work. If you want, you can follow this guide to create a rule that inserts the records into the table automatically as soon as the messages arrive at the MQTT broker.
Web Dashboard
Using HTML5, Javascript, jQuery, CSS and Bootstrap I created a website that provides the following functionality:
- Display the latest values received from all the sensors of a specified environmental station.
- Display the values received during the last hour from all environmental stations for a specified sensor.
As you can see in the image, the homepage contains the latest value of every sensor for a specific Environmental Station, plus two tables that track the last values received by the site.
On the second page of the website you can see the values received during the last hour from all environmental stations for a specified sensor, together with which station sent them and at what time.
Remember that this process is based on Publishers and Subscribers: the website takes the values directly from the database, so you may run into problems with the CORS policy. CORS is a mechanism that allows restricted resources on a web page to be requested from a domain other than the one that served the first resource. To fix this problem you have to follow two main steps:
- Install the extension ModHeader on your browser and insert into the field "Response Headers" the name Access-Control-Allow-Origin, with the value * to unlock permissions for all the resources
- Create an Identity Pool on AWS Cognito
After that, you are able to get values from resources on different web domains.
To try the website, go into the folder WebApp/web and open the file Dashboard.html with your favourite browser. The site interfaces directly with the DynamoDB table, because in the associated JS file script.js (which you can find in the folder WebApp/assets/js) I have already entered the Identity Pool ID and the region.
// Configuration with AWS Cognito Identity Credentials.
AWS.config.region = "us-east-1";
AWS.config.credentials = new AWS.CognitoIdentityCredentials({
IdentityPoolId: "us-east-1:f79208a2-a26f-4c2b-be6d-ec367432dded"
});
// Set up connection with DynamoDB.
var dynamodb = new AWS.DynamoDB();
var docClient = new AWS.DynamoDB.DocumentClient();
Inside the script there are all the functions that make the process work. For instance, one function keeps a list of all the environmental stations connected to the broker, scanning the records in the table from the latest backwards. It then queries with other parameters, taking only the tuples whose ID equals the current station's ID.
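The dashboard implements this in JavaScript through the DynamoDB DocumentClient, but the underlying logic is simple; here it is sketched in Python, with illustrative record fields:

```python
def station_ids(records):
    """Collect the distinct station IDs, scanning records newest-first."""
    ids = []
    for record in records:
        if record["ID"] not in ids:
            ids.append(record["ID"])
    return ids

def records_for_station(records, station_id):
    """Keep only the tuples whose ID matches the current station's ID."""
    return [r for r in records if r["ID"] == station_id]
```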
This is only an example, but in the script there are some other functions that satisfy the requests mentioned above.
How the system works
To summarize, the correct procedure to make the system work properly is:
- Run the scripts SubscriberClient.py, Station1.py and Station2.py with Python 3, in the order listed and in three different terminals
- Open the dashboard: inside the repository, go to the folder WebApp/web and open the file Dashboard.html with your favourite browser