As smart homes become increasingly popular, it is important to remember that DIY home improvement is always an option. While something like a smart lock or a doorbell camera seemed impossible for an embedded hobbyist 10 years ago, today’s technology makes it as simple as taking candy from a baby (not that anyone should ever take candy from a baby; that’s a mean thing to do). In this project, we will use face detection and a deep neural network to create a smart lock that locks and unlocks using facial recognition. Even without any embedded experience, you should be able to recreate this project and have it up and running within a day.
DISCLAIMER: I would not recommend using this as your actual front door lock; it carries many potential security risks, and some of its features make it impractical as a day-to-day lock. With enough expansion it could certainly be used to secure a house, but that’s a more daunting task. Its main purpose is to show that DIY IoT projects are not out of reach for the average person.
A Rough Overview
1. The camera continuously polls and stores video as frames (no audio) until it detects a face.
- Up to 6 hours of video is stored; anything older than 6 hours is automatically deleted.
2. If a face is detected, the image goes through a Convolutional Neural Network to generate a face fingerprint.
- If it’s a non-user (i.e. doesn’t match any fingerprints in the KnownUser directory), their face is stored in a NonUser directory (deleted every 24 hours) and the deadbolt locks.
- If it’s a known user, the camera keeps polling until the same user is recognized 5 times, then the deadbolt unlocks and the user’s face is stored in the Entrants directory.
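The unlock/lock decision described above can be sketched as a small piece of logic. This is a simplified illustration, not the project’s actual code: the threshold of 5 consecutive recognitions comes from the overview, while `update_state` and the user IDs are hypothetical names.

```python
# Minimal sketch of the recognition decision logic described above.
UNLOCK_THRESHOLD = 5  # same user must be recognized this many times in a row

def update_state(match_history, user_id):
    """Track consecutive recognitions of the same user.

    match_history is the list of consecutive matches so far; user_id is the
    result of the CNN fingerprint comparison (None for a non-user).
    Returns (new_history, action) where action is 'unlock', 'lock', or None.
    """
    if user_id is None:                     # face matches no known fingerprint
        return [], 'lock'
    if match_history and match_history[-1] != user_id:
        match_history = []                  # a different user resets the count
    match_history = match_history + [user_id]
    if len(match_history) >= UNLOCK_THRESHOLD:
        return [], 'unlock'                 # enough consecutive matches
    return match_history, None              # keep polling
```

Keeping this as a pure function (state in, state out) makes the camera loop itself trivial: read a frame, run detection, call `update_state`, and act on the returned action.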
Hardware Setup
First, we must set up PYNQ on our Ultra96-V2. To do so, start by visiting http://www.pynq.io/board.html and clicking the “Avnet Ultra96-V2: v2.4 PYNQ image” option, which will prompt you to download the boot image that the board requires in order to run the PYNQ software. Insert your 16 GB MicroSD card into your PC using either a USB to SD converter or an SD to MicroSD converter (if your computer has an SD card slot) and write the image onto it using Win32 Disk Imager. In the image below, my SD card is called [E:/] and the path to my PYNQ boot image is C:/VM_Shared/ultra96v2_v2.4.img, but yours will most likely be different.
After Win32 has finished writing, eject your SD card by right clicking the drive in your file explorer and selecting eject. Now you can remove it from your PC and place it into the Ultra96 V2 MicroSD card slot. Connect the board to power using the DC barrel jack and to your PC using the Micro USB cable. Then, making sure the boot switch (SW3) is set to SD mode as in the GIF below, power it on using switch four (SW4) near the USB ports. You should see a red LED (INIT) light up near the MicroSD card slot, then, after a few seconds, it should turn off and a blue LED (DONE) near it should light up.
At this point, you can open a new Google Chrome window and type 192.168.3.1 (or 192.168.3.1:9090/lab if you want a better coding environment) into the address bar, which will take you to the PYNQ Jupyter Notebooks sign-in page. The password is xilinx. If the page doesn’t load, give it a minute and hit refresh; sometimes it takes a while for the board to completely finish booting.
Once in the PYNQ home page, navigate to common/wifi.ipynb and follow the steps to connect your Ultra96 to WiFi. At this point, you can follow the optional steps below to program your board wirelessly, but it will most likely be slower than using the Micro USB cable.
Optional Wireless Programming
1. Open a new terminal in Jupyter Notebooks by selecting New → Terminal (or + → Terminal in JupyterLab).
2. In the terminal, type ifconfig and copy down the IP address listed under the wlan section.
3. Open a new tab in Google Chrome and type the IP address into the address bar (again, followed by :9090/lab if you want to use the lab environment), making sure your PC is on the same WiFi network as the board.
4. Now, you can disconnect the Micro USB cable and program your board wirelessly.
Software Setup
Open a new terminal by selecting New → Terminal (or + → Terminal in JupyterLab), then follow these steps to install the necessary software onto the Ultra96 (NOTE: you can copy each command (without the $) from here and paste it into the terminal using Ctrl+Shift+V):
1. Upgrade pip:
$ pip3 install --upgrade --user pip
2. Download OpenCV 3.4.3, the library we will use for face detection. The PYNQ boot image already ships with an older OpenCV, but we need a newer version for this project:
$ cd /home/xilinx
$ wget -O opencv.zip https://github.com/opencv/opencv/archive/3.4.3.zip
3. Unzip OpenCV and move some directories around:
$ unzip opencv.zip
$ mv opencv-3.4.3 opencv
$ cd opencv
$ mkdir build
$ cd build
4. Insert your USB flash drive into the Ultra96 USB port then run the following commands to use it as swap memory:
- First, find the device name of the flash drive with:
$ sudo blkid
Mine was called sdb1.
- Then, run these commands to set up the swap space, replacing sdX with the device name you found (e.g. sdb1):
$ sudo mkswap /dev/sdX
$ sudo swapon /dev/sdX
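After running swapon, it’s worth a quick sanity check that the board actually sees the extra memory (output will vary with your drive size):

```shell
# SwapTotal in /proc/meminfo should now reflect the flash drive's capacity
grep -i swap /proc/meminfo
# swapon can also list the active swap devices, e.g. /dev/sdb1
swapon --show
```

If SwapTotal still reads 0, double-check that you ran mkswap and swapon against the right device name from blkid.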
5. Use the following command to configure OpenCV:
$ cmake -DCMAKE_BUILD_TYPE=RELEASE \
-DCMAKE_INSTALL_PREFIX=/usr/local \
-DINSTALL_PYTHON_EXAMPLES=ON \
-DINSTALL_C_EXAMPLES=OFF \
-DPYTHON_EXECUTABLE=/usr/bin/python3 \
-DBUILD_EXAMPLES=ON ..
6. Now compile OpenCV using the following command (NOTE: This may take around 3 hours):
$ make -j2
7. After OpenCV is done compiling, install it using:
$ sudo make install
$ sudo ldconfig
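Before moving on, you can confirm that Python now picks up the freshly built library (this check assumes the 3.4.3 build above succeeded; the fallback message is just for safety):

```shell
# Print the OpenCV version Python imports; it should say 3.4.3
python3 -c "import cv2; print(cv2.__version__)" 2>/dev/null || echo "cv2 not found"
```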
8. Change directory to /home/xilinx:
$ cd /home/xilinx
9. Install PYNQ computer vision overlays:
$ sudo pip install --upgrade --user git+https://github.com/Xilinx/PYNQ-ComputerVision.git
10. You might need to update pandas for the overlays to work:
$ pip install --upgrade pandas
11. Install smbus:
$ sudo apt-get install python3-smbus
12. Create a new directory in Jupyter Notebooks (name it whatever your heart desires). Then cd into the new directory in the terminal and clone the project files from GitHub:
$ cd /home/xilinx/jupyter_notebooks/<YOUR PROJECT DIRECTORY>
$ git clone https://github.com/julianbartolone/doorbellcam
Assembly
Now it’s time to put everything together. Here’s where some user creativity kicks in, because I didn’t necessarily find the best way to assemble the locking mechanism. How you fasten the servo onto the deadbolt is up to you (mine is pictured below), but we will now go over how to wire it all up.
First, save your Jupyter Notebooks. Then, power off the Ultra96 by pressing the power button (SW4) or typing sudo shutdown now into the terminal. After the board shuts down, disconnect the DC power cable. Now, place the Click Mezzanine board on top of the Ultra96 so that its leads seat into the Ultra96’s pin sockets (it should fit perfectly). Next, plug the Servo Click into mikroBUS 1 on the Click Mezzanine (it should also fit perfectly). You can now plug the power back in and turn the Ultra96 on using the PWR button on the Click Mezzanine. If you were programming the board over WiFi before, you’ll need to set that up again using the Micro USB cable.
On the Servo Click, you’ll see 16 slots of 3 header pins each; each slot can accept one servo. In this project, we’ll use slot #1. First, separate the servo’s three wires by cutting the black connector housing at the end into three pieces (IMPORTANT: keep the metal sockets at the ends of the wires intact, or you will have to solder them back on. The black plastic housing is not so important, but the metal sockets it shields are what you need to keep.) Plug the brown (ground) wire into the top pin of slot #1, the red (power) wire into the +5V pin of mikroBUS 2, and the yellow (signal) wire into the bottom pin of slot #1. The red wire goes to the Click Mezzanine because, for some reason, the VCC of the Servo Click was not working for me, so I had to supply power from the Click Mezzanine instead.
Now for the most confusing and rickety part. For the Servo Click to work, one pin needs to be tied to ground: the OE/CS pin (output enable on the Servo Click, chip select on the Click Mezzanine). It is active-low, so rather than driving it high to enable the output, we must drive it low. I could not figure out how to do this in software, so I physically connected it to the GND pin of mikroBUS 2 on the Click Mezzanine.
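For the curious: the Servo Click is built around a PCA9685 16-channel PWM controller driven over I2C (which is why we installed smbus earlier). The chip describes each pulse as 12-bit “on”/“off” counts within the PWM period. The sketch below shows only that conversion; the actual smbus register writes are omitted since they depend on your wiring, and `pulse_to_counts` is an illustrative helper, not part of any Servo Click API.

```python
# Convert a hobby-servo pulse width into PCA9685 on/off counts.
# Assumes the common 50 Hz servo refresh rate (20 ms period).
PCA9685_RESOLUTION = 4096  # the PCA9685 divides each period into 4096 counts

def pulse_to_counts(pulse_ms, freq_hz=50):
    """Return (on_count, off_count) for a pulse of pulse_ms milliseconds."""
    period_ms = 1000.0 / freq_hz                       # 20 ms at 50 Hz
    off_count = round(pulse_ms / period_ms * PCA9685_RESOLUTION)
    return 0, off_count                                # start the pulse at count 0

# Typical hobby servos: ~1.0 ms = one end of travel, ~1.5 ms = center,
# ~2.0 ms = the other end.
print(pulse_to_counts(1.5))  # (0, 307)
```

Sweeping the deadbolt between locked and unlocked then amounts to writing two such count pairs to channel 0’s registers over I2C.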
Voila! You're done. The Jupyter Notebook has instructions on how to use the project. If you're confused about anything or have any questions feel free to message me!