Many students at this university are feeling the impacts of rising global temperatures. Several UMD dorms, particularly those housing freshmen, do not have built-in air conditioning, so the warmer months can be miserable for students trying to study or sleep. While traditional oscillating fans can alleviate some of the heat, they are designed to cool the ambient environment rather than the person in the room. We seek to revolutionize the fan market by developing a lightweight, portable, and scalable alternative to traditional oscillating fans. The uFan is an AI-powered, battery-operated fan with two-axis tilt and vision-based speed control.
The proposed solution is equipped with a Raspberry Pi Camera, which detects the user's face. Based on the camera output, the fan dynamically rotates horizontally and vertically to stay centered on the user's face, keeping them cool. Horizontal rotation is achieved with a turntable, and the cordless design allows full 360-degree rotation, making the fan suitable for any environment. Its compact, tabletop design makes it easy to place on a desk. Operation is hands-free: the user adjusts the fan's speed simply by raising an index finger to the camera.
The physical design is also inexpensive and tidy. The prototype requires only two batteries: a portable charger for the Raspberry Pi 4 controller and a 12-volt battery for the motors. Most electrical components are packaged in custom laser-cut wooden boxes to minimize exposed cables.
Demo

Related Work

A startup previously attempted to build a person-tracking smart fan called the Following Fan with a concept similar to ours; the main difference is that it used a large 20-inch oscillating fan. However, the startup appears to have shut down before bringing the product to market.
https://thegadgetflow.com/product/person-tracking-smart-fan/
Hardware Build

This section explains how the fan works using a bottom-up approach, detailing the components and their functions from the bottom of the fan to the top.
The base of the fan is an 8-inch diameter wooden circle placed below a Lazy Susan. There are a few more circular layers stacked on top. All layers were laser cut.
The Lazy Susan, pictured below, allows smooth 360-degree rotation of the fan. The turntable is rotated by direct drive using the motor pictured below.
Above the turntable motor is another wooden circle with four holes for screws and a hole for the turntable motor wires. This layer serves as the base for the top layer. The screws hold all the layers together, and nuts are added to preserve the spacing between layers.
The top and final layer of the turntable contains an opening for the base of the fan, an opening for the portable charger used to power the Raspberry Pi, and a hole for the turntable motor cables. It also carries the four screws from the layer below and holes used to mount the enclosures for the 12-volt battery, the Raspberry Pi itself, and the stepper motor controllers. The underside of the top layer, showing all the holes and screws, is pictured below.
There is also a piece that secures another motor, which drives the tilt of the fan and allows vertical motion. This motor is shown in the image below.
The bulk of the wiring is located on the top layer, and it connects six main components:
- The Raspberry Pi 4 Model B
- The Pi camera
- The 12-volt battery
- The portable charger
- The stepper motor controllers
- The servo arm
The Raspberry Pi is the main controller of the project and has several connections. Power comes from the portable charger over a USB-C cable (cable not shown in the image). The Raspberry Pi camera is connected to the Pi's camera port with the white ribbon cable. The GPIO pins provide power and control signals for the servo arm, and control signals for the stepper motor controllers (the controllers are not powered through the Pi). For presentation purposes, a laser-cut box was added around the Pi. The top face of the box is only half the size of the bottom of the Pi because making it wider would impede the rotation of the fan.
The Pi camera, pictured on the left of the fan, takes photos of the environment. The photos are fed into machine learning models to detect the user's face and index finger, as described later. The camera is secured to a 3D-printed ring that encapsulates the fan. The ring has a hole that allows the axial-tilt stepper motor to engage and tilt it, thereby tilting the fan (shown at the top of the image).
The 12-volt battery is used to power the stepper motor controllers (data is received from the Pi). It is surrounded entirely by a laser-cut box, only exposing the positive and ground cables. The cables are split so the stepper motor controllers are wired in parallel.
The portable charger is cylindrical and fits directly into the circular top layer. It connects to the Pi's power port and is pictured below (cable not shown).
The stepper motor controllers shown below drive the stepper motors. The controller on the left is connected to the axial-tilt stepper, while the other controller connects to the turntable motor; its cable is threaded through the wood to reach the motor.
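As an illustration of how the Pi commands these controllers, the sketch below pulses a STEP/DIR style driver from the GPIO pins. The driver model, pin numbers, and step timing are assumptions for the example, not values from the build.

```python
# A minimal sketch of pulsing one stepper controller from the Pi's GPIO pins,
# assuming a STEP/DIR style driver. The pin numbers and step timing below are
# placeholders for illustration, not the values used in the build.
import time
import RPi.GPIO as GPIO

STEP_PIN = 20   # placeholder BCM pin wired to the controller's STEP input
DIR_PIN = 21    # placeholder BCM pin wired to the controller's DIR input

GPIO.setmode(GPIO.BCM)
GPIO.setup(STEP_PIN, GPIO.OUT)
GPIO.setup(DIR_PIN, GPIO.OUT)

def step(n_steps, clockwise=True, delay=0.002):
    """Send n_steps pulses on the STEP line; DIR selects the rotation direction."""
    GPIO.output(DIR_PIN, GPIO.HIGH if clockwise else GPIO.LOW)
    for _ in range(n_steps):
        GPIO.output(STEP_PIN, GPIO.HIGH)
        time.sleep(delay)
        GPIO.output(STEP_PIN, GPIO.LOW)
        time.sleep(delay)

if __name__ == "__main__":
    step(50, clockwise=True)   # nudge the turntable a few degrees
    GPIO.cleanup()
```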
The servo arm is a laser-cut contraption made from three layers of wood that clips to the ring of the fan. The pictured servo motor uses its blade to press the speed button on the fan. The Raspberry Pi drives the motor to push the button, allowing programmatic control of the fan speed.
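A minimal sketch of that button-press motion is shown below, assuming a standard hobby servo driven by a 50 Hz PWM signal from the Pi; the GPIO pin and the two duty-cycle positions are placeholders rather than values from the build.

```python
# A minimal sketch of the button-press motion, assuming a standard hobby servo
# driven by a 50 Hz PWM signal from the Pi. The GPIO pin and the two duty-cycle
# positions are placeholders, not values taken from the build.
import time
import RPi.GPIO as GPIO

SERVO_PIN = 18      # placeholder BCM pin carrying the servo's signal wire
REST_DUTY = 5.0     # duty cycle (%) for the resting angle
PRESS_DUTY = 8.0    # duty cycle (%) that swings the blade into the speed button

GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)
pwm = GPIO.PWM(SERVO_PIN, 50)   # standard 50 Hz hobby-servo signal
pwm.start(REST_DUTY)

def press_speed_button():
    """Tap the fan's speed button once, then return to the resting angle."""
    pwm.ChangeDutyCycle(PRESS_DUTY)
    time.sleep(0.4)              # give the arm time to reach the button
    pwm.ChangeDutyCycle(REST_DUTY)
    time.sleep(0.4)

if __name__ == "__main__":
    press_speed_button()
    pwm.stop()
    GPIO.cleanup()
```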
Now that the hardware is specified, the software is straightforward. We created a few Python scripts, linked in the repository in the Attachments section. The scripts use OpenCV and Mediapipe to detect whether a face and an index finger are visible, respectively. If a face is detected, the software rotates the turntable and tilts the fan until the face is roughly in the center of the image, at which point the fan stops rotating. If a raised index finger is detected, the fan speed is changed by triggering the servo arm. The index finger detection uses Mediapipe to distinguish the index finger from the other fingers, preventing false positives.
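The face-centering step can be sketched as follows. A Haar cascade is used here as one possible OpenCV detector, and the deadband size, capture setup, and print statements standing in for motor moves are assumptions for the example rather than details pulled from the actual scripts.

```python
# A sketch of the face-centering decision: detect a face with OpenCV and decide
# which way the turntable (pan) and tilt motors should move to bring it toward
# the middle of the frame. A Haar cascade is used here as one possible detector;
# the deadband size and the print statements standing in for motor moves are
# assumptions for the example.
import cv2

DEADBAND = 40  # pixels of tolerance around the frame center before moving

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def centering_commands(frame):
    """Return (pan, tilt) directions such as 'left'/'right'/'up'/'down' or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None, None
    x, y, w, h = faces[0]
    face_cx, face_cy = x + w // 2, y + h // 2
    frame_cy, frame_cx = frame.shape[0] // 2, frame.shape[1] // 2

    pan = tilt = None
    if face_cx < frame_cx - DEADBAND:
        pan = "left"
    elif face_cx > frame_cx + DEADBAND:
        pan = "right"
    if face_cy < frame_cy - DEADBAND:
        tilt = "up"      # face is near the top of the image
    elif face_cy > frame_cy + DEADBAND:
        tilt = "down"
    return pan, tilt

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)    # the Pi camera exposed as a V4L2 device
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            pan, tilt = centering_commands(frame)
            print("pan:", pan, "tilt:", tilt)   # replace prints with stepper moves
    except KeyboardInterrupt:
        pass
    cap.release()
```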
Milestone 1 Summary

At milestone 1, we debated several different approaches to solving some of our key design problems. First, we discussed whether to use a Pi camera and traditional machine learning models for detecting user location, rather than an infrared device that is worn by the user. We ultimately decided to use the Pi camera and computer vision approach because it did not require the user to wear a device and it was less sensitive to changes in the ambient environment.
During this process, we also finalized our high-level project design. We decided to use a two-axis tilt, which would use a contraption to tilt the fan vertically and a rotating turntable that would turn the fan. An option that was also discussed in this step was using a Roomba to move the fan, but this was ultimately not feasible because the lab did not have one available to use.
We also finalized our initial shopping list and the materials used for the fan. We planned to use plywood for the turntable, while using 3D-printed parts for the y-axis tilt.
Milestone 2 Summary

By milestone 2, we had completed three tasks for our fan.
First, we implemented a basic face-tracking algorithm. This used the Raspberry Pi camera and OpenCV to draw a bounding box around a face in the frame. Given the face's position relative to the frame, a message "tilt up" or "tilt down" would be printed for anticipated fan movement.
We also implemented a y-axis rotation mechanism to allow for fan rotation. In this stage, we used a servo motor. While we knew that it had limitations in terms of torque, the approach sufficed for this milestone. Combined with the bounding box from our face-tracking algorithm, we were able to tilt the fan up and down to reposition the user's face in the vertical center of the frame.
In addition, we built the mechanism that presses the button on the back of the fan to change the fan speed. Initially, we planned to use a solenoid, but its lack of force led us to use a servo motor instead. We tested triggering this programmatically by pushing the button whenever two faces were detected in the frame.
Design Changes Since Milestones

Since our first two milestones, our primary focus has been on designing the turntable for horizontal rotation. This is where we implemented our multi-layer Lazy Susan contraption. For horizontal rotation, we decided to use a stepper motor for the extra torque.
We also made changes to our vertical rotation model. First, we shaved parts of the fan's central axis to allow for a smoother rotation. We also replaced our servo motor with a stepper motor to increase the total torque.
On the software side, we used the Mediapipe package to connect our Pi camera to the speed-controlling servo motor. We designed the algorithm so that when the user raises their index finger, the speed button is pressed.
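A sketch of one way to make that gesture check is shown below: the index fingertip must sit above its middle joint while the other fingertips stay curled. The landmark rule and confidence threshold are illustrative; the exact rule in the scripts may differ.

```python
# A sketch of one way to check for a raised index finger with Mediapipe Hands:
# the index fingertip must sit above its middle joint while the other fingertips
# stay curled. The landmark rule and confidence threshold are illustrative; the
# exact rule in the project's scripts may differ.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def index_finger_raised(hand_landmarks):
    """True when only the index finger is extended upward."""
    lm = hand_landmarks.landmark
    index_up = lm[8].y < lm[6].y   # index tip above index PIP joint
    # Middle, ring, and pinky tips should be below their PIP joints (curled).
    others_curled = all(lm[tip].y > lm[pip].y for tip, pip in [(12, 10), (16, 14), (20, 18)])
    return index_up and others_curled

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.6) as hands:
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
                if results.multi_hand_landmarks and index_finger_raised(
                    results.multi_hand_landmarks[0]
                ):
                    print("index finger up -> press the speed button")
        except KeyboardInterrupt:
            pass
    cap.release()
```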
A major challenge we faced was the possibility of wire wrap-around: if the contraption were plugged into a wall outlet, or if the turntable motor were mounted elsewhere, the rotation of the turntable could cause the wires to wrap around it. Because the turntable is driven directly from above the bottom layer and everything runs on batteries, no cables cross the rotating joint and the issue is avoided. We also added boxes to contain the wires and clean up the design.