In making videos for my projects I have often wanted a smooth shot of a part, such as a pan across an Arduino Nano. I have seen about three projects where people built devices to pan a camera, but they were either not internet-connected or very expensive. So I set out to improve on them.
The Plan

I started this journey by imagining what I wanted in a device that moved a camera. What if my hands were full? Then I would want to use my voice. What if I only had my phone next to me? Then a local webpage would do. This led me to the Particle Photon, since the Particle Cloud API is extremely versatile: there is a JavaScript SDK that gives access to many of the cloud functions, and IFTTT has a Particle service, which let me tie in the Google Assistant.
Building the Device

To make the rails I cut a 5ft piece of 1/2" PVC pipe in half, giving two 2.5ft sections. Then I took two pieces of wood and drilled two holes in each of them; these would hold and guide the rails.
Next I connected the two wood pieces with the PVC sections, and then attached the whole assembly to a 3ft board.
I also attached a stepper motor bracket to the lumber and fitted a NEMA 17 motor into it, along with a pulley for a timing belt.
Having two rails and a motor is fine, but how does the camera move? That's where Fusion 360 came in: I designed a simple camera mount that slides along the two rails and attaches to the T2 timing belt.
Initially I thought that calling a Particle Cloud function on the Photon would be the best option, but a function call targets one specific device, which would limit how easily boards could be swapped, so I settled on cloud events. The firmware for the Photon sets up the stepper motor and listens for the cloud events "pan" and "set_accel". When it receives "pan", it moves the stepper at the speed given in the event data to the end of the rail and back; "set_accel" changes the acceleration used when the camera moves.
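As a rough sketch of what that firmware looks like (the pin choices, travel distance, default acceleration, and use of the community AccelStepper library are my assumptions for illustration, not necessarily the original code):

```cpp
// Minimal Photon firmware sketch, assuming an A4988-style step/dir driver on D2/D3.
#include "AccelStepper.h"

SYSTEM_THREAD(ENABLED);                    // keep the cloud connection serviced during long moves

AccelStepper stepper(AccelStepper::DRIVER, D2, D3);   // step pin, dir pin

const long TRAVEL_STEPS = 16000;           // assumed steps from one end of the rail to the other
volatile long requestedSpeed = 0;          // set by the "pan" handler, consumed in loop()

void panHandler(const char *event, const char *data) {
    requestedSpeed = atol(data);           // event data carries the speed as a plain number
}

void accelHandler(const char *event, const char *data) {
    long accel = atol(data);
    if (accel > 0) stepper.setAcceleration(accel);
}

void setup() {
    stepper.setAcceleration(500);          // default until a "set_accel" event arrives
    Particle.subscribe("pan", panHandler, MY_DEVICES);
    Particle.subscribe("set_accel", accelHandler, MY_DEVICES);
}

void loop() {
    if (requestedSpeed > 0) {
        stepper.setMaxSpeed(requestedSpeed);
        requestedSpeed = 0;
        stepper.moveTo(TRAVEL_STEPS);      // slide to the far end of the rail...
        stepper.runToPosition();           // (blocking until the move completes)
        stepper.moveTo(0);                 // ...and back to the start
        stepper.runToPosition();
    }
}
```

The motion runs from loop() rather than inside the event handlers so the handlers return quickly and the cloud connection stays responsive.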
To use the Google Assistant as a controller I had to connect it to the Particle Cloud somehow, so I set up an IFTTT applet with the Assistant service: when I say "Ok Google, move my camera at ____ speed," it publishes the "pan" event for the Photon, with the speed I spoke as the event data. I set up a second applet the same way for the "set_accel" event.
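Under the hood, that IFTTT action (and the webpage described next) just publishes an event through the Particle Cloud REST API. Roughly, the request looks like the following; the access token and the example speed of 300 are placeholders:

```
POST https://api.particle.io/v1/devices/events
Authorization: Bearer <access token>
Content-Type: application/x-www-form-urlencoded

name=pan&data=300&private=true
```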
The Webpage

So what if I didn't want to use my voice, or wasn't near my Assistant? For that case I also created a simple webpage that publishes the same events. The user first logs into their Particle account and is then greeted by the interface: a value entered in the textbox can be submitted as either a "pan" command or a "set_accel" command.