What is your project about?
Introducing 'ARTimekeeper': A revolutionary proof of concept AR watch powered by OpenCV. Embark on a journey beyond traditional timekeeping as we reimagine data visualisation in the third dimension. With OpenCV's cutting-edge computer vision, 'ARTimekeeper' transforms your wristwatch into a dynamic augmented reality canvas. Immerse yourself in a world where data takes shape and substance, rendered seamlessly in 3D. Our project showcases the potential of OpenCV's capabilities, pushing boundaries and providing a glimpse into the future of wearable technology.
OpenCV's machine learning and feature-matching capabilities are unparalleled in the open-source domain, making it the perfect library to power this application. And as data science generates more and more meaningful insights, new and novel ways to visualise these data points are needed now more than ever.
Why did you decide to make it?
First, it's a cool idea and it will be a fun challenge. Secondly, the motivation stems from a desire to anchor the immersive AR experience within a familiar and accessible realm: a watch, a trusted source of information, seamlessly integrating cutting-edge technology with everyday convenience.
Provided the code is modular and well architected, other developers and engineers can use this project to develop their own OpenCV AR experiences that co-opt auxiliary display devices.
Our project's ambition extends beyond the realm of technology and innovation; it seeks to make a meaningful impact on various domains, aligning with several of the indicative areas outlined by the OpenCV AI Competition. By showcasing the potential of our AR watch, we aspire to contribute to the advancement of the following areas:
Education: Our AR watch has the potential to revolutionise how educational content is presented, creating an immersive learning experience that enhances understanding and engagement across subjects.
Visually Impaired Assistance: Through innovative audio-visual interfaces, our AR watch could serve as a valuable tool for the visually impaired, providing intuitive access to information and their surroundings, depending on the degree of impairment. Accommodating individual visual needs with size and contrast adjustments within the display could be of great assistance.
Health and Medical: The AR watch's 3D data visualisation capabilities could be applied to medical imaging, offering healthcare professionals a novel way to interact with and analyse diagnostic data.
Entertainment: Our project envisions a new era of interactive entertainment, where users can engage with content in unprecedented ways, turning entertainment into an immersive and participatory experience.
By demonstrating the transformative potential of our AR watch, we will highlight how these domains could benefit from the versatility and applicability of computer vision and wearable technology, while also fostering tangible benefits for a wide range of users and industries.
How does it work?
Our AR watch concept seamlessly merges reality and virtual content. Using OpenCV, it detects markers and triggers immersive 3D visuals overlaid on a smartwatch, and it can utilise other smartwatch features (e.g. the accelerometer or physical controls). This prepares for the coming AR glasses revolution by allowing developers to preview and experiment with the potential of wearable technology and interactive data visualisation. Open-source collaboration empowers developers to shape the next wave of user experiences.
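To make the pipeline concrete, here is a minimal sketch of the detect-pose-render loop. As a stand-in for the Hiro/natural-feature markers discussed later, it uses OpenCV's ArUco module (assuming the OpenCV 4.7+ ArucoDetector API), placeholder camera intrinsics, and drawn coordinate axes where the animated 3D scene would eventually be rendered.

```python
import cv2
import numpy as np

# Placeholder intrinsics -- real values should come from camera calibration.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion for this sketch

MARKER_SIZE = 0.03  # marker edge length in metres (roughly watch-face scale)

# 3D corners of the marker in its own coordinate frame (z = 0 plane).
obj_points = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # the webcam acting as the "observer"
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _rejected = detector.detectMarkers(frame)
    if ids is not None:
        for marker_corners in corners:
            img_points = marker_corners.reshape(4, 2).astype(np.float32)
            # Recover the watch's pose relative to the camera...
            _, rvec, tvec = cv2.solvePnP(obj_points, img_points, K, dist)
            # ...and stand in for the animated 3D scene with drawn axes.
            cv2.drawFrameAxes(frame, K, dist, rvec, tvec, MARKER_SIZE)
    cv2.imshow("ARTimekeeper observer", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```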
Let's define what success for this project looks like.
There are two deliverables. First, an iOS watch app that displays AR detection markers on its screen and relays user interactions with the physical buttons to observers via a WebSocket. Secondly, an OpenCV application that computes the watch's pose and displays the video feed together with a projected, animated 3D scene. Any input on the watch's physical buttons should have a visible impact on the 3D scene. Together, these deliverables should form a functioning proof of concept that is well engineered to foster easy collaboration using Clean Architecture principles.
The smartwatch displays the AR marker (Hiro example here, but natural features are probably more suitable) via a smartwatch app. This app can be networked with related devices, be it a phone or a web service. Networking allows data to be passed bi-directionally between the watch app and the observer, which lets the observer change the state of the display. Additionally, the user can use the physical controls on the watch to control the visual display and modify the output; a possible message format is sketched below.
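As an illustration of that bi-directional messaging, here is a hypothetical JSON schema for the two message directions. The field names and helper functions are our own invention for this sketch, not a fixed protocol.

```python
import json
import time

# Hypothetical schema -- field names are illustrative, not a fixed protocol.

def button_event(button: str, action: str) -> str:
    """Watch -> observer: a physical control was used."""
    return json.dumps({
        "type": "button_event",
        "button": button,        # e.g. "crown" or "side_button"
        "action": action,        # e.g. "press" or "rotate_up"
        "timestamp": time.time(),
    })

def display_command(marker_id: int, brightness: float) -> str:
    """Observer -> watch: change what the watch screen shows."""
    return json.dumps({
        "type": "display_command",
        "marker_id": marker_id,  # which marker image the watch should display
        "brightness": brightness,
    })
```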
For the scope of this project, the observer can be a webcam driven by a Python OpenCV script. Python is perfect for the proof of concept as it is quick to develop with, and the Python OpenCV wrapper has excellent coverage of the OpenCV API.
The communication layer depends on the hardware of the watch and the observer. For simplicity, a simple WebSocket server will be constructed to which the watch app and observer can both subscribe. Presently, Apple Watches only allow Bluetooth connectivity between the iPhone and the Watch, so to facilitate two-way communication the WebSocket protocol is used to connect to the observer via a local server.
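A minimal relay server sketch follows, assuming the third-party Python websockets package (v11+ handler signature): both the watch app and the OpenCV observer connect as clients, and every message is forwarded to all other subscribers. Keeping the relay as a standalone process leaves the transport layer swappable, in line with the Clean Architecture goal above.

```python
import asyncio
import websockets

clients = set()

async def relay(websocket):
    """Forward every incoming message to all other connected clients."""
    clients.add(websocket)
    try:
        async for message in websocket:
            for peer in clients:
                if peer is not websocket:
                    await peer.send(message)
    finally:
        clients.discard(websocket)

async def main():
    # The watch app and the OpenCV observer both connect to ws://<host>:8765.
    async with websockets.serve(relay, "0.0.0.0", 8765):
        await asyncio.Future()  # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```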
Since current AR headsets are financially and functionally prohibitive, the final output will be displayed on a typical monitor or written to a video file using OpenCV (FFmpeg under the hood), as sketched below.
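Writing the composited frames to disk is straightforward with cv2.VideoWriter; a minimal sketch follows. The FourCC and file name are placeholder choices, and codec availability depends on the local FFmpeg build.

```python
import cv2

cap = cv2.VideoCapture(0)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

# mp4v is a safe default FourCC; availability depends on the FFmpeg build.
fourcc = cv2.VideoWriter_fourcc(*"mp4v")
writer = cv2.VideoWriter("artimekeeper_demo.mp4", fourcc, 30.0, (width, height))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # ... marker detection and 3D overlay would happen here ...
    writer.write(frame)
    cv2.imshow("output", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

writer.release()
cap.release()
cv2.destroyAllWindows()
```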
The main development priorities are as follows:
Real-Time Processing Demands: Achieving seamless real-time augmented reality necessitates optimising our pipeline for rapid image processing. Balancing the computational demands of feature detection, tracking, and rendering while maintaining a smooth user experience is a challenge we aim to conquer.
Feature Detection and Tracking: While OpenCV's FREAK/AKAZE features offer robust marker detection, ensuring reliable tracking across diverse lighting conditions, perspectives, and scenarios poses a significant challenge. We are committed to refining our feature-based tracking algorithms to provide accurate and consistent results (see the AKAZE sketch after this list).
Data Visualisation Complexity: Effectively visualising a wide range of data types and formats in three dimensions requires meticulous design and implementation. Striking the right balance between information density, clarity, and user experience poses a creative challenge we eagerly embrace.
User Interaction Paradigm: Designing intuitive and natural user interactions through a smartwatch interface presents a unique challenge. Creating a seamless connection between the AR watch and user actions requires careful consideration of input selection, responsiveness, and user feedback.
Wearable and Webcam Integration: Integrating the smartwatch and webcam into a cohesive user experience involves hardware and software synchronisation challenges. Ensuring accurate synchronisation and minimal latency between the visual input and output devices demands meticulous calibration.
Marker Diversity and Recognition: Curating a diverse database of QR/AR markers that offer reliable recognition across varying physical contexts requires thorough testing and optimisation. Addressing potential occlusions, rotations, and partial marker visibility poses an ongoing challenge.
Optimal Virtual Content Rendering: Balancing the dynamic rendering of virtual content within the real world while maintaining visual consistency and performance presents a computational challenge. Ensuring optimal frame rates and rendering quality is vital for an engaging and immersive experience.
Iterative Refinement: As with any ambitious project, refining our AR watch concept through iterative testing, user feedback, and enhancement cycles is a challenge we look forward to overcoming. The journey of optimising our solution to ensure reliability, usability, and innovation requires ongoing dedication.
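As a concrete starting point for the Feature Detection and Tracking priority above, here is a minimal AKAZE sketch: it matches a reference image of the watch-face marker against a camera frame and estimates a homography with RANSAC. The file name and thresholds are illustrative only.

```python
import cv2
import numpy as np

# Reference image of the on-watch marker (file name is illustrative).
reference = cv2.imread("marker_reference.png", cv2.IMREAD_GRAYSCALE)

akaze = cv2.AKAZE_create()
ref_kp, ref_desc = akaze.detectAndCompute(reference, None)

# AKAZE produces binary descriptors, so Hamming distance is the right metric.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)

def locate_marker(frame_gray):
    """Return a homography mapping the reference marker into the frame, or None."""
    kp, desc = akaze.detectAndCompute(frame_gray, None)
    if desc is None:
        return None
    # Lowe's ratio test to discard ambiguous matches.
    good = []
    for pair in matcher.knnMatch(ref_desc, desc, k=2):
        if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
            good.append(pair[0])
    if len(good) < 8:  # too few matches for a trustworthy estimate
        return None
    src = np.float32([ref_kp[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    homography, _mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return homography
```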
Despite these challenges, we approach our project with a spirit of collaboration and a commitment to pushing the boundaries of what is possible. Each challenge is an opportunity for innovation, learning, and growth as we strive to deliver a compelling AR watch concept that demonstrates the potential of OpenCV and the future of wearable technology.
Future development beyond the scope of the project.
- Port Python OpenCV code to C/C++ or Rust for performance.
- Add GPU acceleration to any CPU intensive processes, where possible.
- Port OpenCV/Observer to iOS app for testing on iPhone or Apple headset.
- Develop apps for other smartwatch platforms (e.g. Android).
- Generalise the code to work for any remote smart display.
- Further explore UI/UX options using physical watch controls.
- Develop multi-modal networking solution for cross platform compatibility.
- Test in iOS with an Apple Headset.
- Test in Android with a Headset.
- Connect with IOT services/asset managers.
- Integrate on-device accelerometer data to augment the pose estimation.