Imagine playing a classic game like Battleships without the ability to see the board. For visually impaired players, the flat, paper-based version of this game poses significant challenges, lacking the tactile and auditory cues that make it accessible. My project, Battlesight AI, transforms this experience by integrating advanced computer vision and real-time feedback into an embedded device. This solution enables visually impaired players to locate ships, track shots, and engage fully with the game, making Battleships accessible and enjoyable for everyone. Battlesight AI is not just adapting a game; it's redefining inclusive play.
Development Process

The problem initially identified was the difficulty that individuals with visual impairments face when playing Battleship, particularly in distinguishing the game's grid, ships, and markers. The primary underlying causes include visual acuity loss, contrast sensitivity impairment, color vision deficiency, and visual field loss. These challenges make it hard to engage with a game that relies heavily on visual cues and spatial awareness.
The feedback from the contest master prompted a critical shift in the project's focus: away from the relatively accessible board-game version of Battleships and toward the flat paper version, which poses far greater challenges for visually impaired players. Unlike the board game, with its physical pieces and a structured grid that can be felt and manipulated, the paper version is entirely two-dimensional and offers no tactile features. Without tactile, auditory, or high-contrast elements, visually impaired players cannot easily locate ships, track their positions, or mark shots on the grid. This insight was crucial in refining the problem definition: the paper version's lack of sensory markers is the primary obstacle to inclusion. Recognizing this, the project's focus shifted to solutions that specifically address these challenges, which require very different adaptations from those suited to the board game.
This pivot brought with it a considerable increase in the technical demands of the solution. To make the paper version accessible, the solution now requires the development of advanced computer vision algorithms capable of accurately detecting the playing field on a flat piece of paper. These algorithms must be able to interpret the grid layout, ship positions, and player inputs in real-time, all while operating on an embedded device, which imposes limitations on processing power and speed.
This updated solution is designed to address the unique challenges posed by the paper version. By leveraging computer vision, the aim is to create a system that enables visually impaired players to engage with the game independently. The solution ensures that all players, regardless of their level of visual impairment, can enjoy a fully accessible and inclusive gaming experience. This approach represents a significant advancement in accessibility, making the traditionally visual game of Battleship playable for everyone.
Keeping Track of Time

Time is crucial in any game, and Battlesight AI is no exception. I needed a way to keep track of time that was accessible and intuitive. Enter the M5Dial: a versatile development board with a 1.28-inch round TFT touchscreen, rotary encoder, buzzer, and under-screen buttons. This tiny but powerful device became the heart of the timer system, offering both precision and accessibility.
Key Features of the M5Dial Timer:
- Haptic Feedback: A gentle vibration signals the setting of the timer and alerts players when the countdown reaches zero. This tactile cue ensures that players with visual impairments can easily track the time without needing to rely on sight.
- Auditory Signals: Beeps accompany every button press, providing immediate feedback. When the timer expires, a clear alarm sounds, ensuring that no one misses their turn.
- Modular Design: I wanted players to have control over their experience. The timer's code is editable, allowing users to customize the timing system to suit their needs, whether that's adjusting the countdown duration or changing the alarm sounds.
Components used:
- M5Dial: A development board with a 1.28-inch round TFT touchscreen, rotary encoder, buzzer, and under-screen buttons.
- Dual Buttons Module: Provides physical control for starting, stopping, and resetting the timer.
- Digital Clock Module: Keeps track of time and displays the current timer status.
To guide the user experience, I developed a Finite State Machine (FSM) for the timer. This FSM is the brain behind how the timer operates, moving smoothly between states like "SET_TIMER," "WAIT," and "COUNTDOWN." Here’s a snapshot of the FSM I used:
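The state flow can be sketched in plain Python. The state names SET_TIMER, WAIT, and COUNTDOWN come from the description above; the specific events and transitions below are an illustrative assumption, not the exact firmware logic:

```python
# Minimal sketch of the timer FSM. States SET_TIMER, WAIT, and COUNTDOWN
# are from the article; the triggers and return values are assumptions.

class TimerFSM:
    def __init__(self):
        self.state = "SET_TIMER"
        self.remaining = 0  # seconds

    def set_duration(self, seconds):
        # Rotary encoder sets the duration, then we wait for a button press.
        if self.state == "SET_TIMER":
            self.remaining = seconds
            self.state = "WAIT"

    def start(self):
        # Button press moves WAIT -> COUNTDOWN.
        if self.state == "WAIT":
            self.state = "COUNTDOWN"

    def tick(self):
        # Called once per second while counting down; signals the alarm
        # (beep + haptic pulse on the real device) when time runs out.
        if self.state == "COUNTDOWN":
            self.remaining -= 1
            if self.remaining <= 0:
                self.state = "SET_TIMER"  # reset for the next turn
                return "ALARM"
        return None
```

On the device, the `tick()` call would be driven by a hardware timer interrupt, and the `"ALARM"` result would trigger the buzzer and vibration motor.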
For those who want to dig a bit deeper, the following diagram shows the different states of the FSM with their internal workings on display.
For even more information about this module, visit the GitHub link where all code has been well documented to allow you to try it out for yourself!
Capturing the Playing Field

Next, I needed a way to bring the physical game board into the digital realm. The Xiao ESP32 Sense was the perfect candidate for this job, acting as a camera server to provide a live stream of the playing field. However, as with all things technical, it wasn't without its challenges.
Initially, the high processing demands caused the board to overheat, leading to discolored camera images, a clear sign that I was pushing the hardware to its limits. To solve this, I had to make several optimizations:
- Resolution Adjustment: I reduced the camera resolution, striking a balance between image clarity and processing load.
- Data Transfer Optimization: By fine-tuning how data was transferred from the camera to the processing unit, I was able to minimize latency, ensuring the live feed was smooth and reliable.
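On the receiving side, one way to keep the transfer lean is to pull complete JPEG frames out of the raw MJPEG byte stream as network reads arrive. The sketch below is a hedged, pure-Python illustration of that frame extraction; the marker constants are the standard JPEG SOI/EOI bytes, and the actual networking and decoding code is omitted:

```python
# Extract complete JPEG frames from a raw MJPEG byte buffer.
# JPEG frames start with the SOI marker 0xFFD8 and end with EOI 0xFFD9;
# multipart HTTP headers between frames are treated as filler here.
SOI = b"\xff\xd8"
EOI = b"\xff\xd9"

def extract_frames(buffer: bytes):
    """Return (frames, leftover): every complete JPEG in the buffer,
    plus the unconsumed tail to prepend to the next network read."""
    frames = []
    while True:
        start = buffer.find(SOI)
        if start == -1:
            return frames, b""  # no frame started; drop filler bytes
        end = buffer.find(EOI, start + 2)
        if end == -1:
            return frames, buffer[start:]  # partial frame, keep the tail
        frames.append(buffer[start:end + 2])
        buffer = buffer[end + 2:]
```

Keeping only the most recent complete frame, and discarding older ones, is a simple way to trade a few dropped frames for consistently low latency.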
Despite these challenges, the Xiao ESP32 Sense proved to be a powerful tool, enabling me to capture the playing field in real-time and set the stage for the next part of my system.
Set up your own camera live stream using the code found on my GitHub repo and read through the documentation to gain a better understanding of how the project is set up!
Processing the Game

With the camera feed in hand, the next step was processing it to detect the ships on the board. For this task, I turned to the UNIHIKER board, leveraging its processing power to run my image recognition algorithms using OpenCV.
My first version of the algorithm was designed to work with a hand-drawn grid. This approach was incredibly flexible—players could draw a grid anywhere and start playing. However, it quickly became clear that this flexibility came at a cost. The algorithm struggled with accuracy, especially when larger hand movements distorted the grid lines.
Here’s the initial algorithm pipeline I used:
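One reason hand-drawn grids are fragile is that a single wobbly stroke is often detected as several nearby lines, so the detections have to be clustered back into clean grid coordinates. The following is a pure-Python sketch of that merging step; in the real pipeline the line positions would come from an OpenCV line detector, and the pixel tolerance here is my assumption:

```python
def merge_line_positions(positions, tolerance=10):
    """Collapse nearby line detections (pixel coordinates) into single
    grid lines. A hand-drawn stroke often yields several close-together
    detections; averaging each cluster gives one coordinate per line."""
    merged = []
    cluster = []
    for p in sorted(positions):
        if cluster and p - cluster[-1] > tolerance:
            merged.append(sum(cluster) // len(cluster))
            cluster = []
        cluster.append(p)
    if cluster:
        merged.append(sum(cluster) // len(cluster))
    return merged
```

Running this separately on the horizontal and vertical detections yields the row and column boundaries from which the grid cells are built. The tolerance is exactly where large hand movements break things: strokes that drift more than the tolerance get split into phantom extra grid lines.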
To improve accuracy, I introduced a pre-defined printed grid. While this reduced the game's flexibility slightly, it significantly boosted the robustness of my algorithm. The printed grid provided a consistent reference, allowing my algorithm to detect ship positions more reliably, even in less-than-ideal conditions.
This is the updated algorithm pipeline:
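With a printed grid, the cell boundaries are known in advance, so the remaining work reduces to classifying each cell's contents. The sketch below is a simplified stand-in for that step: using the mean darkness of each cell and a fixed threshold is my assumption, not the exact OpenCV logic:

```python
def classify_cells(cell_darkness, threshold=0.5):
    """Map a grid of per-cell darkness values (0.0 = blank paper,
    1.0 = fully inked) to Battleship coordinates such as 'B3'.
    Cells darker than the threshold are treated as occupied."""
    occupied = []
    for row_index, row in enumerate(cell_darkness):
        for col_index, value in enumerate(row):
            if value > threshold:
                # Columns become letters, rows become 1-based numbers.
                occupied.append(f"{chr(ord('A') + col_index)}{row_index + 1}")
    return occupied
```

The resulting coordinate list is what the rest of the system can read out to the player, turning a purely visual board state into spoken, game-ready information.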
Throughout this journey, my goal has been to make Battleship accessible to everyone. By combining the M5Dial’s intuitive timer with the Xiao ESP32 Sense’s camera capabilities and the UNIHIKER board’s processing power, I’ve created a system that’s not just functional, but truly inclusive.
Battlesight AI transforms the traditional Battleship game, ensuring that all players—regardless of visual ability—can enjoy the thrill of sinking ships and strategizing victory. This project has been a challenging but rewarding experience, pushing me to innovate at every turn.