Many industries are limited today by a lack of mobile remote viewing. People often need to place themselves in dangerous situations to understand and solve problems, at great potential cost, both personal and financial. Emergency services such as police, fire, and urban search and rescue are among the many industries burdened by the requirement for direct human involvement in resolving conflicts, assessing risks, and locating individuals in dire need. Orient removes this limitation by utilizing advanced virtual reality hardware to render and display remote content efficiently and effectively.
The primary market for Orient is health and life services such as urban search and rescue. When augmented with a long-range remote vehicle, Orient reduces both the cost and the human risk of low-flying search and rescue operations by allowing rescue teams to scan search sites and manipulate their view of the situation without ever stepping into a helicopter. Removing those extra steps saves significant financial cost (maintaining and refueling vehicles) and eliminates the risk reconnaissance teams face when flying into hazardous terrain. These initial search teams consume the bulk of resources because locating people in urban areas after disasters is a difficult and time-consuming process.
Urban search and rescue teams will be the main beneficiaries of Orient. Internet access is crucial to the system, which streams video in real time from the camera to the user's headset for a successful mission. Because urban search and rescue responds to disasters such as earthquakes, fires, and floods in built-up areas, the system can typically operate in close proximity to Internet access while still retaining its utility.
The system’s versatility extends to other markets as well, providing modular value through other applications. While the primary focus is on urban search and rescue, the technology behind this project allows for potential applications in a variety of markets, including tourism, entertainment, security and surveillance, military and reconnaissance missions, surgery and medicine, research and scientific exploration, robotics, Unmanned Autonomous Vehicles, and more.
While virtual reality technology is poised to have a profound impact on how people live and work, current uses of this technology focus on consumer entertainment (e.g., VR games), simulation, and education. What differentiates this product is that, instead of giving the user a purely virtual, pre-rendered environment to traverse, Orient opens the possibility of emulating human behavior with electronics in the real world.
Most VR-enabling cameras today record a video or environment for viewing at a later time; Orient streams the camera feed as close to real time as possible. Camera input is analyzed and processed using digital signal processing, and the result is a high-resolution 1080p video stream sent to a remote virtual reality headset whose movements determine the orientation of the camera. Current VR technologies are controlled indirectly through software or mechanical components such as joysticks or phone interfaces; Orient is controlled directly by the user's head movements, which manipulate what the camera (and therefore the user) sees in real time. Because the camera is stereoscopic, the image the wearer sees strongly resembles a "real" view of the environment. This is crucial for the applications Orient is attempting to tackle, where the remote display must match reality and allow for better-informed decision-making.
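To illustrate the head-driven control described above, here is a minimal sketch of how headset orientation could be mapped to a pan/tilt camera mount. The function name, servo limits, and degree-based interface are illustrative assumptions, not Orient's actual implementation.

```python
# Map headset yaw/pitch (degrees) to pan/tilt angles the camera
# mount can physically reach. Limits are hypothetical.

def head_pose_to_servo(yaw_deg: float, pitch_deg: float,
                       yaw_limit: float = 170.0,
                       pitch_limit: float = 80.0) -> tuple:
    """Clamp headset yaw/pitch to the mount's supported range."""
    def clamp(value, limit):
        return max(-limit, min(limit, value))
    return clamp(yaw_deg, yaw_limit), clamp(pitch_deg, pitch_limit)

# A head turned 200 degrees right is limited to the mount's range:
print(head_pose_to_servo(200.0, -30.0))  # (170.0, -30.0)
```

In practice the headset's orientation sensor would feed this mapping continuously, and the clamped angles would be sent to the camera unit over the network.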
Flow & Block Diagram
The diagram showcases the full flow, from the BeagleBone and camera attached together as a unit through to the mobile component.
Regarding equipment and tools, we used open-source tools and took advantage of design suites. The PCB files were easily obtained and refined to fit our requirements. By deriving a custom PCB from the BeagleBone, we were able to modify, adjust, reduce, and expand the peripherals we needed.
On the Android front, we had multiple examples allowing us to learn and teach each other app development, which was foreign to most of us. Having strong foundational roots in hardware, we were fully aware of the limitations and possibilities, and careful not to exceed thresholds that would otherwise compromise our system. We were able to connect as a client from the mobile side as well as host a server that sends positioning (accelerometer) and angular-velocity (gyroscope) data back to the unit.
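The sensor link back to the unit could be framed as a simple fixed-size binary message. The 6-float layout below (three accelerometer axes followed by three gyroscope axes) and the format string are assumptions for illustration; Orient's real wire format may differ.

```python
# Sketch of a sensor frame: 6 floats in network byte order,
# ax, ay, az (accelerometer) then gx, gy, gz (gyroscope).
# Layout is a hypothetical example, not the actual protocol.
import struct

SENSOR_FMT = "!6f"  # 24-byte frame, big-endian 32-bit floats

def pack_sensors(accel, gyro) -> bytes:
    return struct.pack(SENSOR_FMT, *accel, *gyro)

def unpack_sensors(payload: bytes):
    vals = struct.unpack(SENSOR_FMT, payload)
    return vals[:3], vals[3:]

# Round-trip a reading (values chosen to be exact in float32):
frame = pack_sensors((0.0, 9.75, 0.125), (0.5, -0.25, 0.0))
accel, gyro = unpack_sensors(frame)
```

A fixed-size frame like this is easy to read on the receiving side: the unit pulls exactly 24 bytes per message off the TCP stream with no delimiter parsing.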
GStreamer's documentation was difficult to decipher, but we persevered by relentlessly trying combinations of pipeline elements. Once we had a communication proof of concept (POC) that allowed us to stream, the next step was to export the stream to a mobile device by compressing and decompressing the encoded video feed. Along the way we learned TCP/IP communication, hosting servers and joining clients, across native C++, JavaScript, and Python.
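The kind of pipeline involved can be sketched as a GStreamer launch string of the sort the GStreamer documentation describes for H.264 streaming over TCP. The specific element choices, device path, host, and port below are assumptions for illustration, not Orient's exact pipeline.

```python
# Build a hypothetical sender pipeline string: capture, H.264-encode
# with low-latency tuning, mux, and serve over TCP.

def sender_pipeline(device: str = "/dev/video0",
                    host: str = "0.0.0.0", port: int = 5000) -> str:
    return (
        f"v4l2src device={device} ! videoconvert ! "
        "x264enc tune=zerolatency ! h264parse ! "
        "matroskamux streamable=true ! "
        f"tcpserversink host={host} port={port}"
    )

# The resulting string can be handed to gst-launch-1.0 on the
# command line or to gst_parse_launch() from code:
print(sender_pipeline())
```

The receiving side would run a mirror-image pipeline (a TCP source, demuxer, and H.264 decoder) to decompress the feed before display on the headset.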
Acknowledgments
This could not have been possible without the endurance, resilience, and passion of every member of the team, and the dedication and effort we all invested. The challenges we encountered on the technical front also existed on the social and day-to-day front. COVID dampened our last couple of months of school; however, it allowed us to connect on a different frequency that we will all remember, a bond that will never fade!
I love all my teammates and hope one day we can reconnect and reunite to laugh and reflect on one of the hardest and most rewarding projects we have created to date! Miss you all, and I hope everyone is doing well.
Link to video here -