Human-wildlife conflicts are interactions between humans and wildlife that result in negative impacts on human social, economic, or cultural life, wildlife populations, or the environment. They occur when humans and wildlife share the same resources, such as land, water, food, or habitat, and when humans perceive wildlife as a threat to their safety, livelihood, or well-being.
Human-wildlife conflicts are not a new phenomenon, but they have increased in frequency and intensity in recent decades due to factors such as human population growth, urbanization, habitat fragmentation, climate change, poaching, and illegal wildlife trade. These factors have reduced the natural habitats and resources available to wildlife, forcing animals to adapt to human-dominated landscapes and compete with humans for survival. In January 2023, a 64-year-old man in Tamil Nadu was attacked by an elephant, code-named PT 7, that had killed 21 people across Kerala and Tamil Nadu. A forest team tranquilized the elephant and relocated it to the Wayanad Elephant Camp. This is just one example that shows the importance of finding effective and sustainable ways to prevent human-wildlife conflicts.
Some innovative solutions that can reduce or deter human-wildlife conflicts include fencing, repellents, alarms, lights, and translocation.
Let’s Build Better with Blues

In this post, I will show you how I created my own solution that detects an animal and alerts people with LoRa and Blues Cellular IoT. Below is an overview illustration of the solution I built.
For this project, I used a Xiao ESP32S3 Sense board, a low-power controller with an onboard camera. This allowed me to classify wild animals using Edge Impulse. I then transferred the classification result to a Blues base station via LoRa. To complete this transfer, I designed a custom PCB that carries both the Xiao ESP32S3 Sense and the LoRa Wio E5 module.
The base station receives the LoRa signals and relays them to the cloud via Blues Cellular IoT.
Instead of wiring modules together by hand, you can create your own customized PCBs with Seeed Fusion PCBA services; they also offer free prototyping and functional testing on your first sample.
Discover more here! Invigorate your Inspiration for IoT with Wio-E5 and FREE Seeed Fusion PCBA Prototypes https://seeedstudio.com/blog/2021/10/21/invigorate-your-inspiration-for-iot-with-lora-e5-and-free-seeed-fusion-pcba-prototypes/
Unleash the Magic Power of IoT with the New Wio-WM1110 and Get FREE Seeed Fusion PCBA Services https://seeedstudio.com/blog/2023/06/0
Here you can find out more info on the Wio E5 and Wio WM1110.
Edge Impulse Model Creation

I used Edge Impulse to develop an image classification system. To do this, I used 10 GB of data from Kaggle.
Larger data sets will give better results.
Next, I uploaded all the necessary images to Edge Impulse.
Then, I labeled the images. For this step, you can label them manually, or you can generate predefined labels with YOLOv5.
Next, I prepared the impulse design, with the classification settings shown in the image below.
Then, I generated the model features. Below is my model output.
Once I generated the model output, I deployed the model as an Arduino library.
The next step was to import the EI library into my Arduino IDE. To do this, I opened the Arduino IDE, navigated to Sketch > Include Library > Add .ZIP Library, and selected the zip file downloaded from the Edge Impulse portal. This deploys the EI model to the Xiao ESP32S3 Sense.
Here is the Arduino sketch’s flow that helped me classify the wild animals and transfer the alert via LoRa.
#include <wild-animal-monitoring_inferencing.h>
#include "edge-impulse-sdk/dsp/image/image.hpp"
#include "esp_camera.h"
#include <Arduino.h>
#include <HardwareSerial.h>
HardwareSerial mySerial(0); //Create a new HardwareSerial class -- D6/D7
static char recv_buf[512];
static bool is_exist = false;
static int at_send_check_response(char *p_ack, int timeout_ms, char *p_cmd, ...)
{
int ch = 0;
int index = 0;
unsigned long startMillis = 0;
va_list args;
char cmd_buf[256];
memset(recv_buf, 0, sizeof(recv_buf));
va_start(args, p_cmd);
// Format the AT command once, then send the same bytes to the module and the debug port.
// (Passing a raw va_list straight to printf() is undefined behavior.)
vsnprintf(cmd_buf, sizeof(cmd_buf), p_cmd, args);
va_end(args);
mySerial.print(cmd_buf);
Serial.print(cmd_buf);
delay(200);
startMillis = millis();
if (p_ack == NULL)
{
return 0;
}
do
{
while (mySerial.available() > 0)
{
ch = mySerial.read();
if (index < (int)sizeof(recv_buf) - 1) {
recv_buf[index++] = ch; // guard against overflowing recv_buf
}
Serial.print((char)ch);
delay(2);
}
if (strstr(recv_buf, p_ack) != NULL)
{
return 1;
}
} while (millis() - startMillis < timeout_ms);
return 0;
}
static int node_send(uint32_t timeout)
{
static uint16_t count = 0;
int ret = 0;
char data[32];
char cmd[128];
int node = 1;
int alarm = 1;
memset(data, 0, sizeof(data));
sprintf(data, "%04X,%04X", node, alarm);
sprintf(cmd, "AT+TEST=TXLRPKT,\"5345454544%s\"\r\n", data);
ret = at_send_check_response("TX DONE", 2000, cmd);
if (ret == 1)
{
Serial.print("Sent successfully!\r\n");
}
else
{
Serial.print("Send failed!\r\n");
}
return ret;
}
#define CAMERA_MODEL_XIAO_ESP32S3 // Has PSRAM
#define PWDN_GPIO_NUM -1
#define RESET_GPIO_NUM -1
#define XCLK_GPIO_NUM 10
#define SIOD_GPIO_NUM 40
#define SIOC_GPIO_NUM 39
#define Y9_GPIO_NUM 48
#define Y8_GPIO_NUM 11
#define Y7_GPIO_NUM 12
#define Y6_GPIO_NUM 14
#define Y5_GPIO_NUM 16
#define Y4_GPIO_NUM 18
#define Y3_GPIO_NUM 17
#define Y2_GPIO_NUM 15
#define VSYNC_GPIO_NUM 38
#define HREF_GPIO_NUM 47
#define PCLK_GPIO_NUM 13
#define LED_GPIO_NUM 21
/* Constant defines -------------------------------------------------------- */
#define EI_CAMERA_RAW_FRAME_BUFFER_COLS 320
#define EI_CAMERA_RAW_FRAME_BUFFER_ROWS 240
#define EI_CAMERA_FRAME_BYTE_SIZE 3
/* Private variables ------------------------------------------------------- */
static bool debug_nn = false; // Set this to true to see e.g. features generated from the raw signal
static bool is_initialised = false;
uint8_t *snapshot_buf; //points to the output of the capture
static camera_config_t camera_config = {
.pin_pwdn = PWDN_GPIO_NUM,
.pin_reset = RESET_GPIO_NUM,
.pin_xclk = XCLK_GPIO_NUM,
.pin_sscb_sda = SIOD_GPIO_NUM,
.pin_sscb_scl = SIOC_GPIO_NUM,
.pin_d7 = Y9_GPIO_NUM,
.pin_d6 = Y8_GPIO_NUM,
.pin_d5 = Y7_GPIO_NUM,
.pin_d4 = Y6_GPIO_NUM,
.pin_d3 = Y5_GPIO_NUM,
.pin_d2 = Y4_GPIO_NUM,
.pin_d1 = Y3_GPIO_NUM,
.pin_d0 = Y2_GPIO_NUM,
.pin_vsync = VSYNC_GPIO_NUM,
.pin_href = HREF_GPIO_NUM,
.pin_pclk = PCLK_GPIO_NUM,
//XCLK 20MHz or 10MHz for OV2640 double FPS (Experimental)
.xclk_freq_hz = 20000000,
.ledc_timer = LEDC_TIMER_0,
.ledc_channel = LEDC_CHANNEL_0,
.pixel_format = PIXFORMAT_JPEG, //YUV422,GRAYSCALE,RGB565,JPEG
.frame_size = FRAMESIZE_QVGA, //QQVGA-UXGA Do not use sizes above QVGA when not JPEG
.jpeg_quality = 12, //0-63 lower number means higher quality
.fb_count = 1, //if more than one, i2s runs in continuous mode. Use only with JPEG
.fb_location = CAMERA_FB_IN_PSRAM,
.grab_mode = CAMERA_GRAB_WHEN_EMPTY,
};
/* Function definitions ------------------------------------------------------- */
bool ei_camera_init(void);
void ei_camera_deinit(void);
bool ei_camera_capture(uint32_t img_width, uint32_t img_height, uint8_t *out_buf) ;
/**
@brief Arduino setup function
*/
void setup()
{
// put your setup code here, to run once:
Serial.begin(115200);
mySerial.begin(9600);
pinMode(LED_BUILTIN, OUTPUT);
// comment out the line below to start inference immediately after upload
while (!Serial);
Serial.println("Edge Impulse Inferencing Demo");
if (ei_camera_init() == false) {
ei_printf("Failed to initialize Camera!\r\n");
}
else {
ei_printf("Camera initialized\r\n");
}
ei_printf("\nStarting continuous inference in 2 seconds...\n");
ei_sleep(2000);
delay(200);
if (at_send_check_response("+AT: OK", 100, "AT\r\n"))
{
is_exist = true;
at_send_check_response("+MODE: TEST", 1000, "AT+MODE=TEST\r\n");
at_send_check_response("+TEST: RFCFG", 1000, "AT+TEST=RFCFG,868,SF12,125,12,15,14,ON,OFF,OFF\r\n");
delay(200);
}
else
{
is_exist = false;
Serial.print("No mySerial module found.\r\n");
}
}
/**
@brief Get data and run inferencing
@param[in] debug Get debug info if true
*/
void loop()
{
// instead of wait_ms, we'll wait on the signal, this allows threads to cancel us...
if (ei_sleep(5) != EI_IMPULSE_OK) {
return;
}
snapshot_buf = (uint8_t*)malloc(EI_CAMERA_RAW_FRAME_BUFFER_COLS * EI_CAMERA_RAW_FRAME_BUFFER_ROWS * EI_CAMERA_FRAME_BYTE_SIZE);
// check if allocation was successful
if (snapshot_buf == nullptr) {
ei_printf("ERR: Failed to allocate snapshot buffer!\n");
return;
}
ei::signal_t signal;
signal.total_length = EI_CLASSIFIER_INPUT_WIDTH * EI_CLASSIFIER_INPUT_HEIGHT;
signal.get_data = &ei_camera_get_data;
if (ei_camera_capture((size_t)EI_CLASSIFIER_INPUT_WIDTH, (size_t)EI_CLASSIFIER_INPUT_HEIGHT, snapshot_buf) == false) {
ei_printf("Failed to capture image\r\n");
free(snapshot_buf);
return;
}
// Run the classifier
ei_impulse_result_t result = { 0 };
EI_IMPULSE_ERROR err = run_classifier(&signal, &result, debug_nn);
if (err != EI_IMPULSE_OK) {
ei_printf("ERR: Failed to run classifier (%d)\n", err);
return;
}
// print the predictions
ei_printf("Predictions (DSP: %d ms., Classification: %d ms., Anomaly: %d ms.): \n",
result.timing.dsp, result.timing.classification, result.timing.anomaly);
#if EI_CLASSIFIER_OBJECT_DETECTION == 1
bool bb_found = result.bounding_boxes[0].value > 0;
for (size_t ix = 0; ix < result.bounding_boxes_count; ix++) {
auto bb = result.bounding_boxes[ix];
if (bb.value == 0) {
continue;
}
ei_printf(" %s (%f) [ x: %u, y: %u, width: %u, height: %u ]\n", bb.label, bb.value, bb.x, bb.y, bb.width, bb.height);
if (strcmp(bb.label, "elephant") == 0) // compare C strings by content, not pointer
{
Serial.println("Elephant Detected");
node_send(1000);
digitalWrite(LED_BUILTIN, HIGH); // turn the LED on (HIGH is the voltage level)
delay(1000); // wait for a second
digitalWrite(LED_BUILTIN, LOW); // turn the LED off by making the voltage LOW
delay(1000);
}
}
if (!bb_found) {
ei_printf(" No objects found\n");
}
#else
for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
ei_printf(" %s: %.5f\n", result.classification[ix].label,
result.classification[ix].value);
}
#endif
#if EI_CLASSIFIER_HAS_ANOMALY == 1
ei_printf(" anomaly score: %.3f\n", result.anomaly);
#endif
free(snapshot_buf);
}
/**
@brief Setup image sensor & start streaming
@retval false if initialisation failed
*/
bool ei_camera_init(void) {
if (is_initialised) return true;
#if defined(CAMERA_MODEL_ESP_EYE)
pinMode(13, INPUT_PULLUP);
pinMode(14, INPUT_PULLUP);
#endif
//initialize the camera
esp_err_t err = esp_camera_init(&camera_config);
if (err != ESP_OK) {
Serial.printf("Camera init failed with error 0x%x\n", err);
return false;
}
sensor_t * s = esp_camera_sensor_get();
// initial sensors are flipped vertically and colors are a bit saturated
if (s->id.PID == OV3660_PID) {
s->set_vflip(s, 1); // flip it back
s->set_brightness(s, 1); // up the brightness just a bit
s->set_saturation(s, 0); // lower the saturation
}
#if defined(CAMERA_MODEL_M5STACK_WIDE)
s->set_vflip(s, 1);
s->set_hmirror(s, 1);
#elif defined(CAMERA_MODEL_ESP_EYE)
s->set_vflip(s, 1);
s->set_hmirror(s, 1);
s->set_awb_gain(s, 1);
#endif
is_initialised = true;
return true;
}
/**
@brief Stop streaming of sensor data
*/
void ei_camera_deinit(void) {
//deinitialize the camera
esp_err_t err = esp_camera_deinit();
if (err != ESP_OK)
{
ei_printf("Camera deinit failed\n");
return;
}
is_initialised = false;
return;
}
/**
@brief Capture, rescale and crop image
@param[in] img_width width of output image
@param[in] img_height height of output image
@param[in] out_buf pointer to store output image, NULL may be used
if ei_camera_frame_buffer is to be used for capture and resize/cropping.
@retval false if not initialised, image captured, rescaled or cropped failed
*/
bool ei_camera_capture(uint32_t img_width, uint32_t img_height, uint8_t *out_buf) {
bool do_resize = false;
if (!is_initialised) {
ei_printf("ERR: Camera is not initialized\r\n");
return false;
}
camera_fb_t *fb = esp_camera_fb_get();
if (!fb) {
ei_printf("Camera capture failed\n");
return false;
}
bool converted = fmt2rgb888(fb->buf, fb->len, PIXFORMAT_JPEG, snapshot_buf);
esp_camera_fb_return(fb);
if (!converted) {
ei_printf("Conversion failed\n");
return false;
}
if ((img_width != EI_CAMERA_RAW_FRAME_BUFFER_COLS)
|| (img_height != EI_CAMERA_RAW_FRAME_BUFFER_ROWS)) {
do_resize = true;
}
if (do_resize) {
ei::image::processing::crop_and_interpolate_rgb888(
out_buf,
EI_CAMERA_RAW_FRAME_BUFFER_COLS,
EI_CAMERA_RAW_FRAME_BUFFER_ROWS,
out_buf,
img_width,
img_height);
}
return true;
}
static int ei_camera_get_data(size_t offset, size_t length, float *out_ptr)
{
// we already have a RGB888 buffer, so recalculate offset into pixel index
size_t pixel_ix = offset * 3;
size_t pixels_left = length;
size_t out_ptr_ix = 0;
while (pixels_left != 0) {
out_ptr[out_ptr_ix] = (snapshot_buf[pixel_ix] << 16) + (snapshot_buf[pixel_ix + 1] << 8) + snapshot_buf[pixel_ix + 2];
// go to the next pixel
out_ptr_ix++;
pixel_ix += 3;
pixels_left--;
}
// and done!
return 0;
}
#if !defined(EI_CLASSIFIER_SENSOR) || EI_CLASSIFIER_SENSOR != EI_CLASSIFIER_SENSOR_CAMERA
#error "Invalid model for current sensor"
#endif
This part of the Arduino sketch sends an alert message via LoRa when it detects a wild animal.
#if EI_CLASSIFIER_OBJECT_DETECTION == 1
bool bb_found = result.bounding_boxes[0].value > 0;
for (size_t ix = 0; ix < result.bounding_boxes_count; ix++) {
auto bb = result.bounding_boxes[ix];
if (bb.value == 0) {
continue;
}
ei_printf(" %s (%f) [ x: %u, y: %u, width: %u, height: %u ]\n", bb.label, bb.value, bb.x, bb.y, bb.width, bb.height);
if (strcmp(bb.label, "elephant") == 0) // compare C strings by content, not pointer
{
Serial.println("Elephant Detected");
node_send(1000);
digitalWrite(LED_BUILTIN, HIGH); // turn the LED on (HIGH is the voltage level)
delay(1000); // wait for a second
digitalWrite(LED_BUILTIN, LOW); // turn the LED off by making the voltage LOW
delay(1000);
}
}
if (!bb_found) {
ei_printf(" No objects found\n");
}
Once the target animal is detected, the code above triggers the node_send() function. Here is the node_send() function definition.
static int node_send(uint32_t timeout)
{
static uint16_t count = 0;
int ret = 0;
char data[32];
char cmd[128];
int node = 1;
int alarm = 1;
memset(data, 0, sizeof(data));
sprintf(data, "%04X,%04X", node, alarm);
sprintf(cmd, "AT+TEST=TXLRPKT,\"5345454544%s\"\r\n", data);
ret = at_send_check_response("TX DONE", 2000, cmd);
if (ret == 1)
{
Serial.print("Sent successfully!\r\n");
}
else
{
Serial.print("Send failed!\r\n");
}
return ret;
}
When triggered, the function above sends the node number and an alert flag to the receiver.
Below, you can see the results of the Arduino serial monitor.
Next, I worked on the base station side. I set up the base station to receive the LoRa signals and forward the data to Blues Notehub via the Notecard, and then to alert all remote users via Qubitro.
In this image, both the Xiao + LoRa E5 board and the Blues Notecard are connected to the Xiao ESP32S3 board. I used the following Arduino sketch to receive the data from the slave node and transfer it to Blues Notehub.
#include <Arduino.h>
#include <Notecard.h>
#include <Wire.h>
#define PRODUCT_UID "xxxxxxxxxxxxxxxxxxxxxxxxxxx"
#define myProductID PRODUCT_UID
Notecard notecard;
#include <Adafruit_NeoPixel.h>
int Power = 11;
int PIN = 12;
#define NUMPIXELS 1
Adafruit_NeoPixel pixels(NUMPIXELS, PIN, NEO_GRB + NEO_KHZ800);
static char recv_buf[512];
static bool is_exist = false;
static int at_send_check_response(char *p_ack, int timeout_ms, char *p_cmd, ...)
{
int ch = 0;
int index = 0;
unsigned long startMillis = 0;
va_list args;
char cmd_buf[256];
memset(recv_buf, 0, sizeof(recv_buf));
va_start(args, p_cmd);
// Format the AT command once, then send the same bytes to the module and the debug port.
// (Passing a raw va_list straight to printf() is undefined behavior.)
vsnprintf(cmd_buf, sizeof(cmd_buf), p_cmd, args);
va_end(args);
Serial1.print(cmd_buf);
Serial.print(cmd_buf);
delay(200);
startMillis = millis();
if (p_ack == NULL)
{
return 0;
}
do
{
while (Serial1.available() > 0)
{
ch = Serial1.read();
if (index < (int)sizeof(recv_buf) - 1) {
recv_buf[index++] = ch; // guard against overflowing recv_buf
}
Serial.print((char)ch);
delay(2);
}
if (strstr(recv_buf, p_ack) != NULL)
{
return 1;
}
} while (millis() - startMillis < timeout_ms);
return 0;
}
static int recv_prase(void)
{
char ch;
int index = 0;
memset(recv_buf, 0, sizeof(recv_buf));
while (Serial1.available() > 0)
{
ch = Serial1.read();
if (index < (int)sizeof(recv_buf) - 1) {
recv_buf[index++] = ch; // guard against overflowing recv_buf
}
Serial.print((char)ch);
delay(2);
}
if (index)
{
char *p_start = NULL;
char data[32] = {0};
int rssi = 0;
int snr = 0;
p_start = strstr(recv_buf, "+TEST: RX \"5345454544");
if (p_start)
{
p_start = strstr(recv_buf, "5345454544");
if (p_start && (1 == sscanf(p_start, "5345454544%s,", data)))
{
data[16] = 0;
int node;
int alert;
char *endptr;
// The payload after the "5345454544" signature is "NNNN,AAAA": four hex
// digits of node number, a comma, then four hex digits of alert flag,
// matching the "%04X,%04X" format used by the sender.
char datanode[5] = {data[0], data[1], data[2], data[3]};
char dataalert[5] = {data[5], data[6], data[7], data[8]};
node = strtol(datanode, &endptr, 16);
alert = strtol(dataalert, &endptr, 16);
double temperature = 0;
J *rsp = notecard.requestAndResponse(notecard.newRequest("card.temp"));
if (rsp != NULL) {
temperature = JGetNumber(rsp, "value");
notecard.deleteResponse(rsp);
}
double voltage = 0;
rsp = notecard.requestAndResponse(notecard.newRequest("card.voltage"));
if (rsp != NULL) {
voltage = JGetNumber(rsp, "value");
notecard.deleteResponse(rsp);
}
J *req = notecard.newRequest("note.add");
if (req != NULL) {
JAddBoolToObject(req, "sync", true);
J *body = JCreateObject();
if (body != NULL) {
JAddNumberToObject(body, "Node", node);
JAddNumberToObject(body, "Alert", alert);
JAddItemToObject(req, "body", body); // attach the body, or the note is sent empty
}
notecard.sendRequest(req);
Serial.println("NoteCard Data Sent");
}
}
p_start = strstr(recv_buf, "RSSI:");
if (p_start && (1 == sscanf(p_start, "RSSI:%d,", &rssi)))
{
Serial.print(rssi);
Serial.print("\r\n");
}
p_start = strstr(recv_buf, "SNR:");
if (p_start && (1 == sscanf(p_start, "SNR:%d", &snr)))
{
Serial.print(snr);
Serial.print("\r\n");
}
return 1;
}
}
return 0;
}
static int node_recv(uint32_t timeout_ms)
{
at_send_check_response("+TEST: RXLRPKT", 1000, "AT+TEST=RXLRPKT\r\n");
unsigned long startMillis = millis();
do
{
if (recv_prase())
{
return 1;
}
} while (millis() - startMillis < timeout_ms);
return 0;
}
void setup()
{
Wire.begin();
notecard.begin();
J *req = notecard.newRequest("hub.set");
if (myProductID[0]) {
JAddStringToObject(req, "product", myProductID);
}
JAddStringToObject(req, "mode", "continuous");
notecard.sendRequest(req);
Serial.begin(115200);
Serial1.begin(9600);
Serial.print("Receiver\r\n");
if (at_send_check_response("+AT: OK", 100, "AT\r\n"))
{
is_exist = true;
at_send_check_response("+MODE: TEST", 1000, "AT+MODE=TEST\r\n");
at_send_check_response("+TEST: RFCFG", 1000, "AT+TEST=RFCFG,868,SF12,125,12,15,14,ON,OFF,OFF\r\n");
delay(200);
}
else
{
is_exist = false;
Serial.print("No Serial1 module found.\r\n");
}
pixels.begin();
pinMode(Power, OUTPUT);
pinMode(LED_BUILTIN, OUTPUT);
digitalWrite(Power, HIGH);
delay(200);
}
void loop()
{
if (is_exist)
{
digitalWrite(LED_BUILTIN, HIGH); // turn the LED on (HIGH is the voltage level)
pixels.clear();
pixels.setPixelColor(0, pixels.Color(0, 5, 0));
pixels.show();
delay(500);
node_recv(2000);
digitalWrite(LED_BUILTIN, LOW); // turn the LED off by making the voltage LOW
pixels.clear();
pixels.setPixelColor(0, pixels.Color(5, 0, 0));
pixels.show();
delay(500);
}
}
Here you need to modify the product UID.
#define PRODUCT_UID "your project UID"
I then navigated to the Notehub | Create Project and created a new project.
I entered my project name, copied the project UID, and replaced it in the Arduino Sketch. After this, I was able to see the Blues Notehub starting to receive data.
Once the data reached the Notehub, I used Qubitro to visualize and make an alert function based on the received data.
Qubitro simplifies the IoT development process by providing tools, services, and infrastructure that enable users to connect their devices, store and visualize their data, and build robust applications without coding. Qubitro supports various IoT protocols and platforms, such as LoRaWAN, MQTT, and The Things Stack.
I navigated to the Qubitro Portal and created a new project.
Then, I selected MQTT as the data source.
Next, I entered all the input details as needed.
Then, I opened the created MQTT source and selected the connection details. This shows all the credentials we need in order to transfer our data to Qubitro.
Let’s move on to the Blues Notehub’s route page. Here, I created a new MQTT route and changed the username and password according to my Qubitro credentials.
At the bottom, I defined which data should move to Qubitro. I selected body only, so just the note's payload is transferred to Qubitro.
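With body-only routing, the route forwards just the JSON body built by the receiver's note.add request. Assuming node 1 with an active alert, the payload arriving at Qubitro should look roughly like this:

```json
{
  "Node": 1,
  "Alert": 1
}
```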
After my payload was successfully transferred to Qubitro, I opened the Qubitro portal to monitor the incoming data.
We can visualize the data via the Qubitro dashboard. First, I navigated to the Dashboard section and created a new dashboard.
Then, I clicked on edit and added the widgets I needed.
Next, I added an image widget with the data points overlaid, making the data levels easy to read at a glance.
Then, I added a standard gauge to check and visualize the alert level.
Finally, I used a state widget to visualize the slave node number and a map widget for GPS mapping.
This was my final dashboard, which shows the location of my base station and the slave node’s status.
In this section, I added a rule function to alert users when an anomaly is detected. To do this, I navigated to the function in the Qubitro portal and then selected the rule section.
Then I created a new rule.
Here I used Webhooks. I entered all the basic details and moved to the next page.
Then, I opened the webhook site and copied the webhook URL.
Next, I pasted the URL to the Qubitro rule page.
Then I selected the conditions. For this condition, my rule will trigger when my Alert level is greater than 0.
Below is the webhook response.
At this point, you could use Twilio or Mailgun to send a customized SMS or email to alert you.
Wrap up

Human-wildlife conflict can have negative impacts on biodiversity conservation, human development, and social harmony. It can also lead to retaliation against wildlife, which can further endanger their survival. With the help of Blues and machine learning, we can develop creative solutions to help overcome these issues.