Electricity is the backbone of modern social and economic development. The transition to a sustainable energy system challenges the operation and stability of electric power systems: power generation becomes increasingly uncertain, grid loads grow, and the grid's dynamical properties fundamentally change.
Most problems in power systems come down to optimization and prediction, and AI can provide unique solutions for energy production, power grid balancing, and energy consumption analysis; it has become an important part of the power industry. AI is a process of self-learning and computation that integrates human-like vision, perception, understanding, communication, and adaptability with a computer's powerful data-processing capabilities.
Renewable Energy Sources & Smart Economy

The rise of renewable energy sources provides the global community with a much-needed alternative to traditional, finite, and climate-unfriendly fossil fuels. However, their adoption introduces new paradigms, of which two interrelated aspects deserve particular attention:
- Prior to the rise of renewable energy sources, the traditional operating ecosystem involved a few production entities (sources) supplying energy to consumers over unidirectional flows. With the advent of renewables, end-users (households and enterprises) not only consume energy but can also produce and supply it. For a reliable ecosystem, energy grids must therefore become smarter and self-healing, with precise threat and fault detection.
- Despite the increased flexibility brought by renewable sources, managing supply and demand in a more complex generation/distribution/consumption environment, along with the related economic decisions (particularly whether to buy energy at a given price), has become even more challenging.
In a smart grid, consumer demand information is collected and centrally evaluated against current supply conditions, and the resulting proposed price is sent back to customers so they can decide about usage. As the whole process is time-dependent, dynamically estimating grid stability becomes not just a concern but a major requirement.
Put simply, the objective is to understand and plan for both energy production and/or consumption disturbances and fluctuations introduced by system participants in a dynamic way, taking into consideration not only technical aspects but also how participants respond to changes in the associated economic aspects (energy price).
The Challenges of Applying AI to Smart Electric Grids

- Insufficient data samples: Data samples that meet the requirements of diverse AI applications are not yet rich enough, so realizing AI applications from small samples remains an open problem.
- Reliability: Although AI applied to power systems has reached high identification rates for problems and faults, its reliability still falls short of the requirements of practical deployment.
- Infrastructure: AI applications depend on abundant data samples, advanced computing power, and distributed communication collaboration. However, the supporting infrastructure, such as production-ready AI algorithm pipelines and distributed collaboration platforms, still needs improvement.
- Lack of power industry-specific algorithms: Algorithms are the foundation of AI, yet for tasks such as perception, prediction, and security maintenance, the adaptability of existing algorithms to power systems is still weak.
In this project, we will explore how we can predict electric grid stability with Neuton AI, which has strong potential for industry-specific, reliable AI algorithms. We will also explore the communication infrastructure for such electric grid operations. With the combined knowledge of AI + IoT = AIoT, we will try to address some of the challenges mentioned above. The technologies used in our solution are listed below:
- Neuton TinyML: Neuton is a no-code platform based on a patented neural network framework. I selected this solution for my experiment since it is free to use and automatically creates tiny machine learning models deployable even on 8-bit MCUs. According to Neuton developers, you can create a compact model in one iteration without compression.
- Particle IoT: Particle provides an integrated IoT Platform-as-a-Service that helps businesses connect, manage, and deploy software applications to connected devices, from edge to cloud and back. Over 240k developers and 160+ Enterprise customers are building on Particle, from fast-growing startups to Fortune 100 companies.
The original dataset contains 10,000 observations, with 12 primary predictive features and two dependent variables.
Predictive features:
- 'tau1' to 'tau4': the reaction time of each network participant, a real value within the range 0.5 to 10 ('tau1' corresponds to the supplier node, 'tau2' to 'tau4' to the consumer nodes);
- 'p1' to 'p4': nominal power produced (positive) or consumed (negative) by each network participant, a real value within the range -2.0 to -0.5 for consumers ('p2' to 'p4'). As the total power consumed equals the total power generated, p1 (supplier node) = - (p2 + p3 + p4);
- 'g1' to 'g4': price elasticity coefficient for each network participant, a real value within the range 0.05 to 1.00 ('g1' corresponds to the supplier node, 'g2' to 'g4' to the consumer nodes; 'g' stands for 'gamma');
Dependent variables:
- 'stab': the maximum real part of the characteristic differential equation root (if positive, the system is linearly unstable; if negative, linearly stable);
- 'stabf': a categorical (binary) label ('stable' or 'unstable').
As there is a direct relationship between 'stab' and 'stabf', 'stab' will be dropped and 'stabf' will remain as the sole dependent variable. Here is the link to the dataset: https://archive.ics.uci.edu/ml/datasets/Electrical+Grid+Stability+Simulated+Data+
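As a quick sanity check on the dataset's conventions, the sketch below encodes the power-balance constraint p1 = -(p2 + p3 + p4) and the sign rule that maps 'stab' to the 'stabf' label. The helper names are hypothetical; this is not part of the dataset tooling or Neuton:

```c
#include <math.h>

/* Hypothetical helpers illustrating the dataset's conventions. */

/* Total generation equals total consumption, so the supplier's nominal
   power is the negated sum of the consumers' (negative) values. */
static float supplier_power(float p2, float p3, float p4)
{
    return -(p2 + p3 + p4);
}

/* 'stab' is the maximum real part of the characteristic equation root:
   a positive value means the system is linearly unstable. */
static const char* stabf_label(float stab)
{
    return (stab > 0.0f) ? "unstable" : "stable";
}
```

For example, consumers drawing p2 = -1.0, p3 = -1.5, and p4 = -0.5 imply a supplier output of p1 = 3.0.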
Procedure

Step 1: Importing the Dataset and Target Variable

On the Neuton model training platform, we will upload the dataset for our use case and select 'stabf' as the target variable.
In the training parameters, set the Input data type to FLOAT32 and Normalization type to "Unique scale for each feature". Then proceed to train.
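For intuition, "unique scale for each feature" corresponds to scaling every feature independently by its own range, roughly as sketched below. Neuton performs its own normalization internally; this `scale_feature` helper is only an illustrative assumption, not Neuton's implementation:

```c
/* Illustrative per-feature min-max scaling; the function name and exact
   formula are assumptions, not Neuton's internal implementation. */
static float scale_feature(float x, float feat_min, float feat_max)
{
    if (feat_max <= feat_min)
        return 0.0f; /* degenerate feature: no spread */
    return (x - feat_min) / (feat_max - feat_min);
}
```

With this scheme, a 'tau' value of 5.25 in the range [0.5, 10.0] maps to 0.5, regardless of how other features are distributed.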
Once the training has started, we can inspect the model's data analysis to understand the relationships between the original dependent and independent variables.
Correlation: It is important to verify the correlation between each numerical feature and the dependent variable, as well as the correlation among numerical features leading to potential undesired collinearity. The heatmap below provides an overview of the correlation between the dependent variable ('stabf' or 'target') and the 12 numerical features.
Once the training has finished, we can see the model metrics; the model accuracy is 0.921435.
We can inspect the classification performance using the generated confusion matrix.
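To read the confusion matrix, recall how accuracy falls out of its four cells. The counts in the example below are invented for illustration and are not Neuton's actual output:

```c
/* Hypothetical 2x2 confusion-matrix arithmetic (counts are invented). */
typedef struct
{
    unsigned tp; /* predicted stable,   actually stable   */
    unsigned fp; /* predicted stable,   actually unstable */
    unsigned fn; /* predicted unstable, actually stable   */
    unsigned tn; /* predicted unstable, actually unstable */
} ConfusionMatrix;

/* Accuracy = correct predictions / all predictions. */
static float cm_accuracy(const ConfusionMatrix* cm)
{
    unsigned total = cm->tp + cm->fp + cm->fn + cm->tn;
    return total ? (float)(cm->tp + cm->tn) / (float)total : 0.0f;
}
```

With 45 true stables, 47 true unstables, and 8 misclassifications out of 100 samples, this yields an accuracy of 0.92.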
Download the model for our IoT device.
We have selected a Particle Argon board for this project (although you can use any Particle board without reprogramming). The Particle Argon is a powerful Wi-Fi development kit.
Equipped with the Nordic nRF52840 and Espressif ESP32 processors, the Argon has built-in battery-charging circuitry, which makes it easy to connect a Li-Po battery, and 20 mixed-signal GPIOs for interfacing with sensors, actuators, and other electronics.
Particle IoT boards are secure and fully network-equipped, allowing smart grid infrastructure to deliver data and grid-failure updates faster and at lower cost.
Setting up Particle IDE and Workbench:
Inside the Particle Workbench project folder, add your downloaded Neuton model. Your folder structure should look like this (the checksum, parser, protocol, application, and StatFunctions files are required to make predictions on data received over serial communication using the CSV uploader tool):
The most important function here is:
static float* on_dataset_sample(float* inputs)
{
    if (neuton_model_set_inputs(inputs) == 0)
    {
        uint16_t index;
        float* outputs;
        uint64_t start = micros();

        if (neuton_model_run_inference(&index, &outputs) == 0)
        {
            // min_time, max_time, and avg_time are file-scope variables
            // from the project template.
            uint64_t stop = micros();
            uint64_t inference_time = stop - start;
            if (inference_time > max_time)
                max_time = inference_time;
            if (inference_time < min_time)
                min_time = inference_time;

            static uint64_t nInferences = 0;
            if (nInferences++ == 0)
            {
                avg_time = inference_time;
            }
            else
            {
                // Running mean: nInferences now holds the total count
                // (post-increment), so weight the previous average by
                // (nInferences - 1).
                avg_time = (avg_time * (nInferences - 1) + inference_time) / nInferences;
            }

            RGB.control(true);
            RGB.color(255, 255, 255); // white while deciding

            switch (index)
            {
            case 0:
                Particle.publish("Prediction: Stable Grid", String(index));
                RGB.color(0, 255, 0); // green
                break;
            case 1:
                Particle.publish("Prediction: Unstable Grid", String(index));
                RGB.color(255, 0, 0); // red
                break;
            default:
                break;
            }
            return outputs;
        }
    }
    return NULL;
}
Compile the application in the cloud or locally; once compiled, you are ready to flash it to your device. Make sure you have selected the correct Particle Device OS version for your device.
CSV data uploader utility:
We are going to test our predictions by sending the feature rows from our test dataset CSV file over USB serial. First, install the dependencies:
# For Ubuntu
$ sudo apt install libuv1-dev gengetopt
# For macOS
$ brew install libuv gengetopt
- Clone this repo:
$ git clone https://github.com/Neuton-tinyML/dataset-uploader.git
$ cd dataset-uploader
- Run make to build the binaries:
$ make
Once it's done, you can send the CSV file over USB:
$ ./uploader -s /dev/ttyACM0 -b 230400 -d /home/vil/Desktop/electric_grid_test.csv
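On the device side, each CSV row ultimately has to become an array of 12 floats for on_dataset_sample(). The bundled parser and protocol files already handle this (including checksums and framing); purely for illustration, a minimal, assumed version of the row-parsing step could look like:

```c
#include <stdlib.h>

/* Simplified, assumed CSV-row parser: splits a comma-separated line into
   at most max_values floats and returns the number parsed. The real
   uploader protocol (checksum, framing) is more involved. */
static int parse_csv_row(const char* line, float* out, int max_values)
{
    int count = 0;
    const char* p = line;
    char* end;
    while (count < max_values)
    {
        float v = strtof(p, &end);
        if (end == p)
            break;          /* no number found */
        out[count++] = v;
        if (*end != ',')
            break;          /* end of row */
        p = end + 1;
    }
    return count;
}
```

A full row would carry the 12 predictive features ('tau1'..'tau4', 'p1'..'p4', 'g1'..'g4') in dataset order.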
The prediction is also published to the Particle IoT cloud, which brings us to the end of this project. The best strategy for such an AIoT project is to make predictions while also collecting and reporting data to improve future models, with devices like Particle's making fleet-wide OTA updates easier.