The project aims to bridge the gap between digital chess platforms and tactile, inclusive gameplay for individuals with visual disabilities. Leveraging computer vision technology, our system will interface with popular chess platforms like chess.com, capturing moves from the screen (whether on a desktop or mobile device) and translating them to a physical, tactile chess set. This innovation goes beyond existing solutions by enabling real-time, interactive play on a physical board, creating a seamless and inclusive gaming experience. Our project aligns with the growing interest in accessibility within the chess community, as seen in chess.com's recent initiatives. Additionally, our technology has the potential for broader applications, enhancing the gaming experience for all and promoting inclusivity in digital entertainment. Through integration with platforms like Feelif Chess, we aspire to create a universal solution that empowers individuals with visual impairments to enjoy chess across various digital environments.
What we completed:
We investigated the need to make chess more inclusive. This forum post shows the level of interest and how much work remains to be done:
https://www.chess.com/blog/chess_dot_tom/lets-make-chess-com-the-most-accessible-site-to-play-chess
We identified a commercial vendor that produces a tactile chess board that integrates with other boards of the same kind but not with chess.com. Our idea goes beyond integration through an API such as the one chess.com provides: we want to integrate with any screen – a YouTube video demonstrating a game of chess, a movie with a scene that includes chess, or chess.com itself as displayed on a computer screen.
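As a proof of concept for capturing a board from an arbitrary screen, here is a minimal desktop sketch using the Python mss library. The region coordinates are placeholders; the eventual smartphone app would use the mobile platform's own screen-capture framework instead.

    import mss
    import mss.tools

    # Grab the screen region where the chess board is displayed.
    # These coordinates are placeholders; a real app would let the
    # user (or a board detector) locate the board first.
    region = {"top": 100, "left": 100, "width": 600, "height": 600}

    with mss.mss() as sct:
        shot = sct.grab(region)
        # Save the frame so a vision model can analyze it.
        mss.tools.to_png(shot.rgb, shot.size, output="board.png")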
A hardware board that could be part of the solution:
https://www.feelif.com/tactile-books-games-tools/games/713/chess-8153/
We contacted the CEO of Feelif and pitched the idea to him; he will take it to his team.
Some startups are also trying to automate the physical moves on the board, which, along with the production of the board and pieces, is the most expensive part of the project.
The computer vision and event-pushing activity is a software problem. It can be done on a smartphone and does not require custom hardware. A smartphone already has a camera and a development framework to capture the screen image and then push the moves via API to a Feelif board or a similar one. Even without any custom hardware, the smartphone app can provide audio feedback using standard chess algebraic notation.
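To make the move-detection step concrete, here is a minimal sketch, assuming a vision layer that can report a piece map for each frame. It uses the python-chess library to find which legal move produced the newly observed position; the call that would push the move to the tactile board is hypothetical, since no public Feelif API exists yet.

    import chess

    def detect_move(board, observed_piece_map):
        # Try every legal move and keep the one whose resulting
        # position matches what the vision layer observed.
        for move in board.legal_moves:
            board.push(move)
            if board.piece_map() == observed_piece_map:
                board.pop()
                return move
            board.pop()
        return None

    board = chess.Board()

    # Suppose the vision layer observed the position after 1. e4:
    observed = chess.Board()
    observed.push_san("e4")

    move = detect_move(board, observed.piece_map())
    if move is not None:
        san = board.san(move)  # "e4" in standard algebraic notation
        board.push(move)
        # push_to_tactile_board(san)  # hypothetical call to the board's API
        print(san)

Tracking the full game state this way also lets the app reject frames that do not correspond to a legal move, such as captures taken mid-animation.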
To turn this into a software-only project, the smartphone, in the absence of any available automated board, could read the moves aloud to the blind user, much like auto-captions, and the user would move the pieces on the board themselves.
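A minimal sketch of that audio feedback, assuming the off-the-shelf pyttsx3 text-to-speech library and a simple expansion of piece letters into spoken words:

    import pyttsx3

    PIECE_NAMES = {"K": "king", "Q": "queen", "R": "rook",
                   "B": "bishop", "N": "knight"}

    def speak_move(san):
        # Expand algebraic notation into speakable text,
        # e.g. "Nxf3+" becomes "knight takes f3 check".
        spoken = san
        for letter, name in PIECE_NAMES.items():
            spoken = spoken.replace(letter, name + " ")
        spoken = spoken.replace("x", " takes ")
        spoken = spoken.replace("+", " check").replace("#", " checkmate")
        engine = pyttsx3.init()
        engine.say(spoken)
        engine.runAndWait()

    speak_move("Nf3")  # reads "knight f3" aloud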
Finally, we located existing Python code that enables computer vision for chess, which would be the first step toward the smartphone screen-recognition capability:
https://blog.roboflow.com/chess-boards/
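For example, a hosted model trained along the lines of that post could be queried from Python with Roboflow's client library. The API key and project slug below are placeholders, not the author's actual model:

    from roboflow import Roboflow

    # Placeholder credentials and project slug -- substitute the
    # values for a model trained as described in the blog post.
    rf = Roboflow(api_key="YOUR_API_KEY")
    model = rf.workspace().project("chess-pieces").version(1).model

    # Detect pieces on a captured frame of the on-screen board.
    result = model.predict("board.png", confidence=40, overlap=30).json()
    for pred in result["predictions"]:
        print(pred["class"], pred["x"], pred["y"])  # label and pixel position

Mapping each detection's pixel coordinates onto the 8x8 grid would then yield the piece map that the move-detection step above consumes.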
We’ve had a conversation with the creator of this Python code about collaborating on the project.
What needs to be done:
We were not able to complete the mobile app with computer vision or finalize the tactile board. However, reframing the project as a software-only task has simplified the remaining work considerably.