What’s ‘D’ Point of This?
This custom digital stylus uses inexpensive, off-the-shelf parts and simple algorithms to turn any surface into an input for your devices.
Digital styluses designed to operate on uninstrumented surfaces offer users a very natural way to interact with their electronic devices. They let users draw, write, and navigate on a wide range of surfaces without the need for special digitizer tablets or screens. That versatility makes them ideal for graphic design, digital art, and note-taking, where precision and freedom of movement are paramount. They also offer a seamless transition between traditional and digital mediums, making them an excellent choice for artists and creatives who value the tactile feel of pen and paper but want the digital advantages for editing and sharing their work.
GitHub user and college student Jcparkyn was inspired by the creative possibilities of digital styluses and decided to build one as a project for an undergraduate electrical engineering course. While very clever, the build also looks to be fairly simple (at least when you have the instructions and source code provided to you!) and it does not require any expensive hardware. Jcparkyn notes that the stylus build is not meant to be a "plug and play" DIY project, but for a hobbyist with a bit of experience, it looks like it would be reasonable to duplicate the effort with the provided information.
Called D-POINT, this digital pen leverages both a camera and an inertial measurement unit (IMU) to track its position in three-dimensional space so that it can be used as an input device when writing on any flat, uninstrumented surface. The body of the pen is 3D-printed and houses a force sensor, battery, and a Seeed XIAO nRF52840 Sense development board that provides processing power, Bluetooth connectivity, and an IMU. A set of eight ArUco markers is affixed to the top of the stylus, and a low-cost webcam captures images of those markers.
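For a rough idea of how markers like these can be produced, the sketch below uses OpenCV's aruco module (version 4.7 or later) to generate a few marker images for printing. The dictionary choice and pixel size here are assumptions for illustration, not necessarily the values used in D-POINT.

```python
import cv2

# Generate eight ArUco marker images for printing (assumed dictionary and size).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

for marker_id in range(8):  # D-POINT places eight markers around the top of the pen
    img = cv2.aruco.generateImageMarker(dictionary, marker_id, 400)
    cv2.imwrite(f"marker_{marker_id}.png", img)
```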
Determining the pen’s position involves using OpenCV to first locate the ArUco markers in each frame, after which a simple algorithm performs a rolling shutter correction (alternatively, one could presumably use a global shutter camera). Once the corner positions of the markers have been identified, a Perspective-n-Point (PnP) algorithm determines the three-dimensional pose of the pen. Finally, that pose is converted into coordinates relative to the drawing surface.
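To give a feel for the vision side of the pipeline, here is a hedged sketch of how marker detection and pose estimation might be wired together with OpenCV's aruco module and `cv2.solvePnP`. The camera calibration values and the marker geometry are placeholders (the real geometry comes from the D-POINT design), and the rolling shutter correction step is omitted for brevity.

```python
import cv2
import numpy as np

# Placeholder camera calibration; in practice, calibrate your own webcam.
camera_matrix = np.array([[800.0, 0.0, 640.0],
                          [0.0, 800.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# Hypothetical lookup: 3D corner positions (pen coordinates, metres) for each
# marker ID printed on the stylus. Only one marker is shown here.
MARKER_CORNERS_3D = {
    0: np.array([[0.00, 0.01, 0.12],
                 [0.01, 0.01, 0.12],
                 [0.01, 0.00, 0.12],
                 [0.00, 0.00, 0.12]], dtype=np.float32),
}

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def estimate_pen_pose(frame):
    """Return (rvec, tvec) of the pen in camera coordinates, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None

    # Pair each detected marker's image corners with its known 3D corners.
    object_points, image_points = [], []
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        if int(marker_id) in MARKER_CORNERS_3D:
            object_points.append(MARKER_CORNERS_3D[int(marker_id)])
            image_points.append(marker_corners.reshape(4, 2))
    if not object_points:
        return None

    ok, rvec, tvec = cv2.solvePnP(
        np.concatenate(object_points).astype(np.float32),
        np.concatenate(image_points).astype(np.float32),
        camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_ITERATIVE)
    return (rvec, tvec) if ok else None
```

Converting the resulting camera-frame pose into surface-relative coordinates is then a matter of applying the known transform between the camera and the drawing surface, which is established during setup.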
So where does the IMU come into play, you ask? There is a bit of latency involved in processing the image frames, so an Extended Kalman Filter that fuses the accelerometer and gyroscope measurements was included both to refine the visual estimates and to provide real-time updates between image frames. This step becomes especially important when interpreting fast movements made by the user of the stylus.
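Conceptually, the fusion step looks something like the sketch below: a deliberately simplified, linear Kalman filter tracking position and velocity, which predicts on every IMU sample using a (already gravity-compensated, world-frame) acceleration and corrects whenever a camera-based position estimate arrives. The actual D-POINT filter is an Extended Kalman Filter that also deals with orientation and sensor characteristics; the state layout, noise values, and names here are assumptions for illustration.

```python
import numpy as np

class SimplePenFilter:
    """Toy position/velocity Kalman filter fusing fast IMU predictions with
    slower camera position measurements (all noise values are assumed)."""

    def __init__(self):
        self.x = np.zeros(6)              # state: [px, py, pz, vx, vy, vz]
        self.P = np.eye(6) * 1e-2         # state covariance
        self.Q = np.eye(6) * 1e-4         # process noise (assumed)
        self.R = np.eye(3) * 1e-4         # camera measurement noise (assumed)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # camera measures position only

    def predict(self, accel_world, dt):
        """Propagate the state with a world-frame, gravity-compensated acceleration."""
        F = np.eye(6)
        F[:3, 3:] = np.eye(3) * dt
        B = np.vstack([np.eye(3) * 0.5 * dt**2, np.eye(3) * dt])
        self.x = F @ self.x + B @ accel_world
        self.P = F @ self.P @ F.T + self.Q

    def update_camera(self, position_meas):
        """Correct the state with a camera-derived pen position."""
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (position_meas - self.H @ self.x)
        self.P = (np.eye(6) - K @ self.H) @ self.P
```

Because the camera estimates arrive with some delay, the real filter also has to account for that measurement latency (for example, by replaying IMU predictions after a delayed correction); that detail is left out of this sketch.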
The source code has been released on GitHub, and there is also a setup guide to help you get started if you would like to build your own D-POINT.