Home security is popular today, with technologies like Amazon's Ring and Google's Nest that stream live video of a house's exterior and detect motion.
However, one crucial piece is missing: checking whether the door is actually locked. The best current solution is a contact sensor, which can only tell whether the door is open, not whether it is locked. A door can be closed while its deadbolt and handle are still unlocked, and the only way to find out is when the door is opened, which is often too late.
isDoorLocked() is as simple as the name sounds: it uses OpenCV and ultrasonic sensors to actually determine if both the handle and deadbolt on a door are locked.
How doors work

Most exterior doors have two parts: a lockable handle (push down to open the door) and a deadbolt (extra security).
The handle's lock runs top to bottom when it is locked and left to right (as shown in the image) when it is unlocked.
Most American deadbolts lock by turning clockwise if the handle and deadbolt are on the right side of the door, or counter-clockwise if they are on the left.
Essentially, the solution keeps an image of an unlocked (open) deadbolt and handle as well as a locked (closed) deadbolt and handle. If the similarity to one of them passes a certain threshold, we know what state the deadbolt and handle are in. This approach adapts to a large majority of doors and handles, since most doors' unlocked and locked states look visibly different.
Okay... so how do I do that?

This is called template matching. We have a smaller image of something (a handle or deadbolt) that we need to find in a larger image, and we need to know the coordinates of both.
The code for that looks like this:
import cv2
tmplt_handle = cv2.imread('templates/template_open2.png', cv2.IMREAD_UNCHANGED)
tmplt_deadbolt = cv2.imread('templates/template_deadbolt.png', cv2.IMREAD_UNCHANGED)
def getMatch(img, tmplt):
    hh, ww = tmplt.shape[:2]
    # Use the template's alpha channel as a mask so transparent pixels are ignored
    tmplt_mask = tmplt[:, :, 3]
    tmplt_mask = cv2.merge([tmplt_mask, tmplt_mask, tmplt_mask])
    tmplt2 = tmplt[:, :, 0:3]
    # Correlate the template against the frame and take the best match location
    corrimg = cv2.matchTemplate(img, tmplt2, cv2.TM_CCORR_NORMED, mask=tmplt_mask)
    min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(corrimg)
    xx, yy = max_loc
    pt1 = (xx, yy)
    pt2 = (xx + ww, yy + hh)
    # Crop the matched region, then keep only its center third
    newimg = img[pt1[1]:pt2[1], pt1[0]:pt2[0]]
    newimg = newimg[len(newimg)//3:2*len(newimg)//3, len(newimg[0])//3:2*len(newimg[0])//3]
    return newimg
cam = cv2.VideoCapture(0)
# Within while True loop
ret, img = cam.read()
handle, deadbolt = getMatch(img, tmplt_handle), getMatch(img, tmplt_deadbolt)
I have a template for the handle (template_open2.png) and the deadbolt (template_deadbolt.png) which look like this:
All this does is search the door image for the handle and deadbolt using the TM_CCORR_NORMED algorithm. We give OpenCV a sample image of each, and it does its best to find them in the frame. From the template we get the height and width, and from the best match we get the top-left corner's x and y coordinates. With those, we can compute the two corners of the matched region, crop it to a reasonable size, and return it.
After this code runs, we have two cropped images: one is the handle state and one is the deadbolt state. Once we have them, we can do image similarity between the template of closed or open handle/deadbolt and the one that was scanned in the door picture.
If they pass a certain threshold, then we know what state the deadbolt or handle is in; otherwise, the result is inconclusive. For this project, I found a threshold of 30% similarity worked best. The readings can be somewhat noisy, so I only accept a measurement once it has been the same 15 times in a row.
So, that code would look something like this:
compoh = cv2.imread('templates/oh.png')
compod = cv2.imread('templates/od.png')
compch = cv2.imread('templates/ch.png')
compcd = cv2.imread('templates/cd.png')
from skimage.metrics import structural_similarity

def imgsim(first, second):
    # Compare two crops in grayscale using SSIM, scaled to a 0-100 score
    first_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
    second_gray = cv2.cvtColor(second, cv2.COLOR_BGR2GRAY)
    score, diff = structural_similarity(first_gray, second_gray, full=True)
    return score * 100
# Within while True loop
ohsim, chsim = imgsim(handle, compoh), imgsim(handle, compch)
odsim, cdsim = imgsim(deadbolt, compod), imgsim(deadbolt, compcd)
hstatus = 'inconclusive' if ohsim<30 and chsim<30 else 'locked' if chsim>ohsim else 'unlocked'
dstatus = 'inconclusive' if odsim<30 and cdsim<30 else 'locked' if cdsim>odsim else 'unlocked'
# The b1*/b2* status variables start as 'inconclusive' and the counters as 0 before the loop
if b1hstatus != hstatus:  # reading changed; restart the count
    b1hstatus = hstatus
    b1hcounter = 0
elif b1hstatus == hstatus and b2hstatus != b1hstatus:  # waiting to confirm a change
    b1hcounter += 1
    if b1hcounter > 15:  # confirmed; report the new handle state
        uploadqueue.append(f'handle {b1hstatus}')
        b1hcounter = 0
        b2hstatus = b1hstatus
if b1dstatus != dstatus:  # reading changed; restart the count
    b1dstatus = dstatus
    b1dcounter = 0
elif b1dstatus == dstatus and b2dstatus != b1dstatus:  # waiting to confirm a change
    b1dcounter += 1
    if b1dcounter > 15:  # confirmed; report the new deadbolt state
        uploadqueue.append(f'deadbolt {b1dstatus}')
        b1dcounter = 0
        b2dstatus = b1dstatus
This defines four comparison images: an open (unlocked) handle, a closed handle, an open deadbolt, and a closed deadbolt.
If the current reading matches the previous one, we increment the counter for the handle or deadbolt by 1; if it doesn't, we reset the counter to zero. This ensures we get 15 identical measurements in a row before declaring that the handle or deadbolt has changed state.
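The same confirm-after-N-readings logic can be factored into a small reusable helper. This is just a sketch of the idea; the class name and interface are my own invention rather than part of the project's code:

```python
class Debouncer:
    """Reports a state change only after it has been seen more than `n` times in a row."""

    def __init__(self, n=15, initial='inconclusive'):
        self.n = n
        self.confirmed = initial   # last state we reported
        self.candidate = initial   # state we are currently counting
        self.count = 0

    def update(self, reading):
        """Feed one reading; return the newly confirmed state, or None."""
        if reading != self.candidate:
            # Reading changed; restart the count
            self.candidate = reading
            self.count = 0
        elif self.candidate != self.confirmed:
            # Same reading again, waiting to confirm a change
            self.count += 1
            if self.count > self.n:
                self.confirmed = self.candidate
                self.count = 0
                return self.confirmed
        return None
```

With n=15 this mirrors the behavior above: a new state is only reported once it has been stable for enough consecutive frames, so one noisy frame resets the count.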
The data should be uploaded to an Adafruit IO database, and the instructions for creating a feed are here. I made three feeds called handle, deadbolt, and ultrasonic sensor, uploading data at most once per second.
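Draining the uploadqueue to those feeds could look roughly like this. The REST endpoint and X-AIO-Key header are the standard Adafruit IO HTTP API, but the credentials are placeholders and the one-second pacing is an assumption based on the rate limit above; the full code may do this differently:

```python
import json
import time
import urllib.request

AIO_USER = 'YOUR_AIO_USERNAME'   # placeholder; use your Adafruit IO username
AIO_KEY = 'YOUR_AIO_KEY'         # placeholder; use your Adafruit IO key

def aio_send(feed, value):
    """POST one value to an Adafruit IO feed over the REST API."""
    url = f'https://io.adafruit.com/api/v2/{AIO_USER}/feeds/{feed}/data'
    req = urllib.request.Request(
        url,
        data=json.dumps({'value': value}).encode(),
        headers={'X-AIO-Key': AIO_KEY, 'Content-Type': 'application/json'},
    )
    urllib.request.urlopen(req)

def drain_queue(queue, send=aio_send, min_interval=1.0):
    """Send every queued 'feed value' entry, pacing sends to the rate limit."""
    while queue:
        feed, value = queue.pop(0).split(' ', 1)   # e.g. 'handle locked'
        send(feed, value)
        if queue:
            time.sleep(min_interval)

# Within the while True loop, after the status checks:
# drain_queue(uploadqueue)
```

Passing the sender as a parameter keeps the pacing logic separate from the network call, which also makes it easy to test without hitting the API.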
The full code has this, so it can be copy-pasted from there.
We'll need to create a link between Adafruit IO and IFTTT to actually send emails.
Our "this" will be the Adafruit IO feed reporting "unlocked" or "door open", and our "that" will be sending an email. It will look something like this if you did it correctly.
After a long editing process, it is finally here.