There is no doubt that animal species are being endangered at an alarming rate, and the elephant is one of them. Elephant numbers are threatened not only by poaching but also by human-elephant encounters, a prevalent problem where human settlements lie on the fringes of forests with large elephant populations. The clearing of forests for agricultural land has intensified this human-elephant conflict.
Smart Parks has long been working in this field of conservation, using technology to reduce unnecessary human-animal conflict and to detect and prevent animal deaths from poaching, for elephants among many other species.
The Smart Parks tracking collar already has many features, such as geotagging and perimeter detection. Artemis brings some smart upgrades to the existing collars, with minimal design changes, to make elephant tracking, monitoring, and protection a little easier.
The proposed project builds tinyML models with Edge Impulse Studio. The two major features are:

i) Elephant Behaviour Detection
Artemis uses the Arduino Nano 33 BLE Sense for its useful array of onboard sensors. The built-in microphone can capture elephant sounds and vocalizations and run them through a tinyML model trained in Edge Impulse Studio to classify the various types of elephant vocalizations.
A comprehensive dataset was created using resources from elephantvoices.org and categorized broadly into the following fourteen categories:
- Alert Rumbles
- Bonding and Calling
- Calf in Distress
- Comforted Calf
- Conflict and Aggression
- Defense and Mobbing
- Estrous
- Grumbling and Protest
- Mating
- Musth
- Play
- Social and bonding
- Suckling
- Unknown (covering non-elephant vocalizations)
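If the clips are sorted into one folder per category (a hypothetical layout; the folder names below are assumptions, not the project's actual file structure), the label for each clip can be derived from its parent folder before ingesting it with the Edge Impulse uploader:

```python
from pathlib import Path

# Assumed layout: one folder per category, e.g.
#   dataset/alert_rumbles/clip_001.wav
CATEGORIES = [
    "alert_rumbles", "bonding_and_calling", "calf_in_distress",
    "comforted_calf", "conflict_and_aggression", "defense_and_mobbing",
    "estrous", "grumbling_and_protest", "mating", "musth", "play",
    "social_and_bonding", "suckling", "unknown",
]

def label_for(path: str) -> str:
    """Derive the Edge Impulse label from a clip's parent folder name."""
    label = Path(path).parent.name
    if label not in CATEGORIES:
        raise ValueError(f"unexpected category folder: {label}")
    return label

# Each folder can then be uploaded with an explicit label, e.g.:
#   edge-impulse-uploader --label alert_rumbles dataset/alert_rumbles/*.wav
```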
The reason for this comprehensive categorization was two-fold:
i) To study and better understand elephant behaviour.

ii) To identify individual elephants in musth or estrous and monitor them closely, to reduce any harm they may cause to themselves or to nearby settlements.
Considering the vast range of frequencies elephants emit, with different activities producing very similar vocalizations that differ only minutely, this amount of data was sufficient to test the model and reach an acceptable accuracy. Field testing can follow with more, higher-quality recordings of vocalizations.
An impulse was created in Edge Impulse Studio for this data. An MFE (Mel-filterbank energy) processing block was chosen to suit the nature of elephant vocalizations.
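Edge Impulse computes the MFE features internally, but the idea can be sketched in a few lines of numpy: a bank of triangular filters spaced on the mel scale converts each FFT power spectrum into log filterbank energies. The parameters below (40 filters, 512-point FFT, 16 kHz) are illustrative assumptions, not the project's actual settings.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_filters=40, n_fft=512, sr=16000, fmin=0.0, fmax=None):
    """Triangular mel filters mapping an FFT power spectrum to mel bands."""
    fmax = fmax or sr / 2.0
    # Filter edges equally spaced on the mel scale, then mapped to FFT bins.
    mels = np.linspace(hz_to_mel(fmin), hz_to_mel(fmax), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(n_filters):
        left, center, right = bins[i], bins[i + 1], bins[i + 2]
        for j in range(left, center):
            fb[i, j] = (j - left) / max(center - left, 1)
        for j in range(center, right):
            fb[i, j] = (right - j) / max(right - center, 1)
    return fb

def mfe(power_spectrum, fb):
    """Log mel filterbank energies: the MFE feature used in the impulse."""
    return np.log(fb @ power_spectrum + 1e-10)
```

Because elephant rumbles extend into very low frequencies, the low end of the filterbank (`fmin` and the first few filters) matters more here than it would for human speech.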
Since our data had a very diverse bandwidth, a neural network (NN) classifier with a 2D convolutional architecture was used. We tried to improve the model's accuracy further by editing the NN architecture (in Keras mode).
Since the dataset obtained was not equally balanced, we took random samples from our data for anomaly detection. The selected test data yielded favorable results.
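The project's exact sampling procedure is not specified; one simple way to draw a fair random subset from an imbalanced dataset is to downsample every class to the size of the smallest one, as in this sketch:

```python
import random
from collections import Counter

def balanced_subset(samples, labels, seed=42):
    """Randomly downsample every class to the size of the smallest class.

    Returns a list of (sample, label) pairs with equal counts per label.
    """
    rng = random.Random(seed)  # fixed seed for reproducible test splits
    by_label = {}
    for s, lab in zip(samples, labels):
        by_label.setdefault(lab, []).append(s)
    n = min(len(group) for group in by_label.values())
    subset = []
    for lab, group in by_label.items():
        subset += [(s, lab) for s in rng.sample(group, n)]
    return subset
```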
ii) Poaching Detection

Poaching for ivory is a major threat to elephant populations. To tackle this problem, Artemis aims to equip the collars with two features:
a) Human and human-produced sound identification: We trained an Edge Impulse model to identify vehicle noises and gunshots as well as human whispers. We used an MFCC (Mel Frequency Cepstral Coefficient) processing block for human vocalizations and MFE blocks for the non-human sounds. The human vocalization samples were recorded with a smartphone microphone (as our Arduino had not shipped to us in time for the project).
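The split between the two feature types makes sense: MFCCs take one further step beyond the MFE features, applying a DCT to the log mel energies to compactly describe the spectral envelope of the vocal tract, which suits speech; MFE keeps the raw band energies, which works better for broadband, impulsive sounds like gunshots. A minimal sketch of that extra DCT step (13 coefficients is a common but here assumed choice):

```python
import numpy as np

def mfcc_from_log_mel(log_mel, n_coeffs=13):
    """MFCCs: the first DCT-II coefficients of the log mel energies."""
    n = len(log_mel)
    k = np.arange(n_coeffs)[:, None]   # coefficient index
    j = np.arange(n)[None, :]          # mel band index
    basis = np.cos(np.pi * k * (2 * j + 1) / (2 * n))  # DCT-II basis
    return basis @ log_mel
```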
Detecting human voices this way not only reveals human presence near elephants, but can also help detect an elephant that has left the perimeter and approached a settlement.
Our model was trained on a limited number of samples, so the accuracy it yielded is unrealistically high. The model to be deployed on the collars will be trained on a much wider variety of data to obtain realistic values.
b) Image classification using the OpenMV H7 camera: We use the OpenMV H7 camera to capture live images and classify them. The FLIR Lepton Adapter Module is used to capture thermal images, for easier classification as well as identification at night.
Elephant thermal images were taken from this GitHub database by Arribada.
An image dataset containing thermal images of humans and elephants was used to train the model, using an Impulse as shown:
Interfacing the camera with Edge Impulse is straightforward, and the Edge Impulse documentation for the OpenMV H7 can easily guide you through testing and deployment.

Upon testing the trained model, the results obtained were reasonably accurate.
A larger dataset can be used for better results.
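Part of why thermal frames classify more easily, especially at night, is that a warm body stands out sharply against a cooler background. The toy helper below (the 30 °C threshold is an illustrative assumption, not a project parameter) shows how even simple thresholding separates a warm subject from the scene:

```python
import numpy as np

def warm_fraction(frame_c, threshold_c=30.0):
    """Fraction of pixels above a temperature threshold in a thermal frame.

    frame_c: 2D array of per-pixel temperatures in degrees Celsius.
    """
    frame_c = np.asarray(frame_c, dtype=float)
    return float((frame_c > threshold_c).mean())
```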
iii) Additionally, the onboard temperature sensor of the Arduino can be used to detect temperature variations on an elephant's body, to look for patterns associated with musth and estrous.
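If musth or estrous episodes show up as sustained body-temperature shifts, as the project intends to explore, a simple rolling z-score can flag them. The window size, threshold, and z-score approach below are illustrative assumptions, not part of the project:

```python
import statistics

def temperature_flags(readings_c, window=24, z_thresh=2.0):
    """Flag readings that deviate strongly from the recent rolling mean.

    readings_c: sequence of periodic body-temperature readings in Celsius.
    Returns one boolean per reading (True = anomalous against recent history).
    """
    flags = []
    for i, r in enumerate(readings_c):
        past = readings_c[max(0, i - window):i]
        if len(past) < 3:
            flags.append(False)  # not enough history yet
            continue
        mu = statistics.fmean(past)
        sd = statistics.stdev(past) or 1e-9  # avoid division by zero
        flags.append(abs(r - mu) / sd > z_thresh)
    return flags
```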
(Note: The dashboard was initially planned to be built on Avnet's IoTConnect platform with the Arduino Nano 33 BLE Sense, but due to the shipping delay the dashboard simulation is not yet available. One can, though, easily follow the instructions here to interface Avnet IoTConnect with any hardware.)