Visually impaired individuals encounter various physical obstacles and unexpected situations in their daily lives. During outdoor activities, they are particularly vulnerable to risks such as collisions with obstacles or falls while using a white cane. In many cases, it is difficult for them to request immediate assistance, which can severely impact their safety and quality of life.
Additionally, tactile paving, designed to support the safe mobility of visually impaired individuals, serves as their "eyes." However, surveys reveal that many tactile pavings are damaged or improperly installed, becoming potential causes of accidents instead. In fact, 57% of surveyed locations lacked tactile paving, and over half of the installed pavings failed to meet proper standards, preventing them from fulfilling their intended purpose.
To address this issue, a systematic approach is needed to ensure the safety and independence of visually impaired individuals.
1-2. How We Solve the Problem
The Green Cane, provided to individual visually impaired users, is a cane specially designed for their safety. The cane detects the impact when the user falls, emits an alert sound, and notifies the guardian in real time, enabling a quick response. Additionally, the Green Cane gives the user feedback about hazardous areas and helps the guardian track the user's real-time location.
This system leverages a range of oneM2M features, including Registration, Resource Creation, Geofencing, and Notification, to ensure seamless integration and communication within the IoT framework. Moreover, it is powered by Becane Software, a custom-developed solution that extends the oneM2M framework by implementing advanced sensor data processing and decision-making capabilities. While oneM2M provides the foundational architecture, Becane Software introduces unique functionalities tailored to the specific needs of the Green Cane.
The details of the oneM2M features and Becane Software's advanced capabilities will be further explained in subsequent sections. These features collectively strengthen the safety of the visually impaired and offer an effective solution to alleviate the guardian's concerns.
1-3. The Positive Impact of Our Solution
By providing various data about visually impaired individuals to their guardians and public institutions, dangerous situations can be proactively prevented. This strengthens the safety net available to the visually impaired and offers greater peace of mind to their guardians. The introduction of the Green Cane creates a foundation for the visually impaired and their guardians to live in a safer and more reliable environment. This system will contribute significantly to improving the quality of life of the visually impaired.
2. Model Diagram
2-1. Overall Solution Architecture
This architecture highlights the integration of sensors, a gateway, and the oneM2M Cloud Service platform to ensure seamless communication and real-time decision-making. Detailed explanations of the oneM2M features and their implementation will be provided in subsequent sections.
2-2. Green Cane Structure
1) Sensors
- Battery: Indicates the remaining battery capacity of the Green Cane to prevent it from running out of power.
- GPS: Identifies the real-time location of the visually impaired user, both to keep the caregiver informed and to allow public institutions to give the user location-based feedback.
- Impact: When the impact on the Green Cane exceeds a certain threshold, the visually impaired user is judged to have fallen. In that case, a warning popup appears in the caregiver's app, with the option to call the nearest public institution.
- Sound: If the visually impaired user is judged to have fallen, the Green Cane emits a sound to alert nearby people to the danger.
2) Becane Software
- Becane Software : Processes data obtained from sensors into resources that can be utilized by users and caregivers, then delivers it to the cloud platform for management.
1) Data Management
- Collection and Storage: Data generated by IoT devices or sensors is collected, stored, and processed by the TinyIoT server, ensuring seamless management and availability of the data for further use.
- Real-Time Data Processing: The TinyIoT server continuously processes and updates real-time data, such as collision status, received from IoT devices or sensors via the HTTP protocol, ensuring immediate response and accurate data synchronization.
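For illustration, storing a single sensor reading in the TinyIoT CSE as a oneM2M contentInstance over HTTP could look like the sketch below; the CSE address, originator, and container path are placeholders, not the project's actual values.

```python
# Minimal sketch: create a oneM2M contentInstance (ty=4) for one sensor reading.
# Host, originator ("CcaneAE") and container path ("cane/impact") are assumptions.
import json
import requests

CSE_URL = "http://127.0.0.1:8080/TinyIoT"   # assumed TinyIoT CSE base URL
HEADERS = {
    "X-M2M-Origin": "CcaneAE",              # assumed AE originator
    "X-M2M-RI": "req-0001",                 # request identifier
    "X-M2M-RVI": "3",                       # release version indicator
    "Content-Type": "application/json;ty=4" # ty=4 -> contentInstance
}

def store_reading(container_path: str, value: dict) -> int:
    """Create a contentInstance under the given container and return the HTTP status."""
    body = {"m2m:cin": {"con": json.dumps(value)}}
    resp = requests.post(f"{CSE_URL}/{container_path}", headers=HEADERS, json=body)
    return resp.status_code

if __name__ == "__main__":
    print(store_reading("cane/impact", {"impact": 42, "time": "2024-11-30T12:00:00"}))
```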
2) Subscription and Notification
- Subscription Management: Delivers alert messages based on user subscription settings for specific events, such as accidents or low battery levels.
- Event Alerts: Sends real-time HTTP-based notifications to users when sensor events such as shock detection occur (a subscription sketch is shown below).
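A minimal sketch of how such a subscription could be created so the caregiver application is notified whenever a new impact reading arrives; the notification URL and resource names are illustrative assumptions.

```python
# Sketch: create a oneM2M subscription (ty=23) under the impact container.
import requests

CSE_URL = "http://127.0.0.1:8080/TinyIoT"
HEADERS = {
    "X-M2M-Origin": "CinAE",
    "X-M2M-RI": "req-sub-0001",
    "X-M2M-RVI": "3",
    "Content-Type": "application/json;ty=23"   # ty=23 -> subscription
}

body = {
    "m2m:sub": {
        "rn": "impactSub",                       # subscription resource name
        "nu": ["http://127.0.0.1:9090/notify"],  # notification target (assumed IN-AE endpoint)
        "enc": {"net": [3]},                     # net=3 -> notify on child resource creation
        "nct": 1                                 # include the full resource in notifications
    }
}

resp = requests.post(f"{CSE_URL}/cane/impact", headers=HEADERS, json=body)
print(resp.status_code, resp.text)
```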
3) Grouping
- Device and Data Grouping: Manages related sensors, canes, users, and datasets by regional groups to enable efficient data processing.
- Subscription Grouping: Handles events such as accidents and facilitates response actions, like requesting assistance, at the group level to enable real-world interventions (see the grouping sketch below).
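As a rough illustration, related cane containers could be grouped by region with a oneM2M group resource so that a single request fans out to all members; the member resource IDs and group name below are placeholders.

```python
# Sketch: create a oneM2M group (ty=9) of impact containers for one region.
import requests

CSE_URL = "http://127.0.0.1:8080/TinyIoT"
HEADERS = {
    "X-M2M-Origin": "CadminAE",
    "X-M2M-RI": "req-grp-0001",
    "X-M2M-RVI": "3",
    "Content-Type": "application/json;ty=9"    # ty=9 -> group
}

body = {
    "m2m:grp": {
        "rn": "region-canes",                  # regional group name (assumed)
        "mt": 3,                               # memberType 3 -> container
        "mnm": 50,                             # maximum number of members
        "mid": ["TinyIoT/cane01/impact", "TinyIoT/cane02/impact"]  # assumed member IDs
    }
}

resp = requests.post(CSE_URL, headers=HEADERS, json=body)
print(resp.status_code)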
4) Location
- Geofencing: Enables preparation for areas prone to accidents through geofencing set by administrators and provides warnings or alerts when approaching such areas.
- Location-Based Processing: Analyzes location data to execute appropriate response processes and handle events.
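A simplified sketch of the geofencing idea: compare the cane's current position with administrator-defined hazard zones and raise a warning when the cane is within a zone's radius. The zone coordinates and radii below are made up for illustration.

```python
# Sketch: check whether a GPS fix falls inside any administrator-defined hazard zone.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    R = 6371000.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

HAZARD_ZONES = [            # (latitude, longitude, radius in meters) - example values only
    (37.5503, 127.0739, 30.0),
    (37.5512, 127.0750, 20.0),
]

def in_hazard_zone(lat, lon):
    for zlat, zlon, radius in HAZARD_ZONES:
        if haversine_m(lat, lon, zlat, zlon) <= radius:
            return True     # inside a hazard zone -> trigger a warning or alert
    return False
```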
5) Interoperability and Scalability
- RESTful API: Enables consistent communication among various devices, platforms, and applications.
- Standard Protocols: In the TinyIoT system, it facilitates integration with other devices and platforms, such as traffic management systems, providing users with data that considers various factors.
- Scalable Design: It adopts a flexible system architecture to accommodate new sensors, data types, and user requirements.
This diagram illustrates the implementation of the smart white cane system, including the integration of sensors (GPS, impact, battery, sound), a smart cane gateway, and applications for public institutions and caregivers.
3-2. Implementation of Sensors
1) GPS module
The GPS module we adopted in our scenario is the Seeed Grove Air530 model. The features and specifications of this model are as shown in the following image.
An example of how this module receives GPS data is as follows.
Since we will not use the actual sensor but will instead simulate its behavior in a Python environment, we decided to use only the latitude and longitude fields from the data it provides.
Here is our implemented GPS.py code.
Here is the code to generate latitude and longitude coordinates along a route from a user-specified starting location to a destination using the OSRM API.
This is the code to send latitude and longitude coordinates within the generated route to the MN-AE at specific intervals.
This is the code to generate a circular route starting from the main gate of Sejong University, reaching Seoul City Hall, and returning to the main gate of Sejong University. The coordinates along the route are sent to the MN-AE every second.
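As a rough Python sketch of the same behavior (the MN-AE endpoint, the exact gate and City Hall coordinates, and the use of the public OSRM demo server's driving profile are all assumptions, not the project's actual code):

```python
# GPS.py-style sketch: fetch a route from the OSRM demo server and stream each
# coordinate to the MN-AE once per second.
import time
import requests

MN_AE_URL = "http://127.0.0.1:5000/gps"        # assumed MN-AE ingestion endpoint
SEJONG_MAIN_GATE = (127.0737, 37.5509)         # (lon, lat), approximate
SEOUL_CITY_HALL  = (126.9779, 37.5663)         # (lon, lat), approximate

def osrm_route(src, dst):
    """Return a list of [lon, lat] points along the route from src to dst."""
    url = (f"http://router.project-osrm.org/route/v1/driving/"
           f"{src[0]},{src[1]};{dst[0]},{dst[1]}"
           f"?overview=full&geometries=geojson")
    route = requests.get(url, timeout=10).json()
    return route["routes"][0]["geometry"]["coordinates"]

def main():
    # Circular route: main gate -> City Hall -> main gate
    points = (osrm_route(SEJONG_MAIN_GATE, SEOUL_CITY_HALL)
              + osrm_route(SEOUL_CITY_HALL, SEJONG_MAIN_GATE))
    for lon, lat in points:
        requests.post(MN_AE_URL, json={"gps": [lon, lat], "time": time.time()})
        time.sleep(1)   # send one coordinate per second

if __name__ == "__main__":
    main()
```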
2) Impact Sensor
The impact detection sensor we adopted in our scenario is the LIS331HH, an accelerometer capable of detecting impact forces from acceleration measurements.
The LIS331HH supports measurement ranges of ±6g, ±12g, and ±24g. Since it is mounted on a cane, and we aim to only consider strong impacts when the cane hits the ground as collisions, the sensor should be set to ±24g to prevent saturation.
The sensor provides acceleration values for three axes: X, Y, and Z. The output data for each axis is delivered in a 16-bit two's complement format, and these values must be read from the sensor's internal registers.
ex)
The total acceleration is then calculated by combining the data from each axis, as shown in the diagram below.
With a measurement range of ±24g and a maximum digital output value of 2^15=32768, if the output value is 16384, the actual acceleration, according to the calculation formula, is 12g.
The impact force can be calculated as the magnitude of the acceleration vector, as shown in the diagram below.
After measuring the acceleration for each axis (X, Y, and Z), the magnitude of the vector can be computed to determine the impact force.
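As a sketch of how raw LIS331HH axis readings could be converted to g and combined into an impact magnitude, following the scaling described above (±24g full scale, 16-bit two's-complement output); the raw values used in the example are illustrative.

```python
# Convert raw 16-bit axis readings to g and compute the acceleration magnitude.
from math import sqrt

FULL_SCALE_G = 24.0      # configured measurement range (+/-24g)
MAX_COUNTS   = 32768.0   # 2^15, maximum positive digital output

def raw_to_g(raw: int) -> float:
    """Convert a signed 16-bit axis reading to acceleration in g."""
    return raw / MAX_COUNTS * FULL_SCALE_G

def impact_magnitude(raw_x: int, raw_y: int, raw_z: int) -> float:
    """Magnitude of the acceleration vector |a| = sqrt(ax^2 + ay^2 + az^2)."""
    ax, ay, az = raw_to_g(raw_x), raw_to_g(raw_y), raw_to_g(raw_z)
    return sqrt(ax * ax + ay * ay + az * az)

# Example from the text: a single-axis output of 16384 corresponds to 12g.
print(raw_to_g(16384))                # 12.0
print(impact_magnitude(16384, 0, 0))  # 12.0
```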
Typically, the impact force when a person falls while walking averages between 10g and 15g; in cases of a slow fall or slipping, the impact force is generally around 5g to 10g.
The average walking speed of a typical person is approximately 1.09 m/s, and visually impaired individuals generally walk at about 45% to 66% of this speed, depending on the surrounding environment. Given the sensor's acceleration and impact calculation methods, a slow fall or slip is therefore the kind of impact most likely to be detected while a visually impaired person is walking. Setting the maximum measurement range to 12g and calculating the impact value, a collision can be estimated at around 9 to 10g, slightly above the midpoint of the 5g to 10g range.
However, since we decided not to use an actual sensor and instead simulate the sensor's behavior in a virtual environment, the virtualized sensor generates random impact values between 1 and 100. Considering that an impact force of 9.5g is roughly 80% of the 12g maximum measurable range, we set the threshold at 80 out of 100.
ref: https://scienceon.kisti.re.kr/srch/selectPORSrchArticle.do?cn=JAKO201116549821131
Here is our Impact.py code
As previously explained, the virtual sensor generates impact forces as random numbers between 1 and 100, with the threshold check performed by the MN-AE. The generated data, together with a timestamp, is transmitted to the MN-AE every 5 seconds.
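A minimal sketch matching this description (the MN-AE endpoint is an assumption; the real Impact.py is shown above):

```python
# Impact.py-style sketch: emit a random impact value between 1 and 100 every 5 seconds
# and send it, with a timestamp, to the MN-AE; thresholding is left to the MN-AE.
import random
import time
from datetime import datetime
import requests

MN_AE_URL = "http://127.0.0.1:5000/impact"   # assumed MN-AE ingestion endpoint

while True:
    payload = {
        "impact": random.randint(1, 100),    # simulated impact force
        "time": datetime.now().isoformat(timespec="seconds")
    }
    requests.post(MN_AE_URL, json=payload)
    time.sleep(5)                            # one reading every 5 seconds
```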
3) Battery module
The battery module we adopted in our scenario is the Seeed Studio Polymer Ion 2200mAh battery pack. The specifications of this module are as follows.
Since we decided not to use an actual battery pack but to implement it virtually, we simulated a model where the battery level decreases over time and implemented it to send the battery level information to the MN-AE.
Here is the battery code, Power.py.
To start the battery level at 100%, the initial value was set to 102.5. The battery level decreases by 2.5% at a time, and the data, including a timestamp, is sent to the MN-AE every second.
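A minimal sketch matching this description (the MN-AE endpoint is an assumption; the real Power.py is shown above):

```python
# Power.py-style sketch: start at 102.5 so the first reported value is 100%, drop 2.5%
# per tick, and send the level with a timestamp to the MN-AE once per second.
import time
from datetime import datetime
import requests

MN_AE_URL = "http://127.0.0.1:5000/battery"   # assumed MN-AE ingestion endpoint

level = 102.5
while level > 0:
    level -= 2.5                              # 2.5% drop per simulated minute
    payload = {
        "battery": max(level, 0.0),
        "time": datetime.now().isoformat(timespec="seconds")
    }
    requests.post(MN_AE_URL, json=payload)
    time.sleep(1)                             # 1 second stands in for 1 minute of runtime
```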
The reason for setting these values is based on the calculated battery runtime. Since the battery's voltage is 4.2V and its capacity is 2200mAh, the battery energy is determined to be 9.24Wh, as shown below.
- Raspberry Pi Zero: 1.0W
- GPS Module: 0.3W
- Impact Sensor: 0.01W
- Speaker Module: 0.7W
Considering the power consumption of each component and calculating the runtime, the estimated battery runtime is 4 hours and 36 minutes. This translates to a 2.76% decrease per minute. Therefore, 1 minute was treated as 1 second, and the 2.76% decrease was approximated to 2.5% for the selected values.
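For reference, the runtime figure above can be reproduced from the listed components with a few lines of arithmetic (a simple check, not part of the project code):

```python
# Battery runtime check using the figures above.
energy_wh = 2.2 * 4.2                 # 2200 mAh x 4.2 V = 9.24 Wh
load_w = 1.0 + 0.3 + 0.01 + 0.7       # Pi Zero + GPS + impact sensor + speaker = 2.01 W
runtime_h = energy_wh / load_w        # ~4.6 h, i.e. about 4 hours and 36 minutes
print(round(energy_wh, 2), round(load_w, 2), round(runtime_h, 2))
```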
4) Speaker
The speaker module we adopted in our scenario is the DFRobot Gravity Digital Speaker Module.
However, since the speaker does not generate data continuously, we decided not to implement it as a virtualized sensor in the Python environment. Instead, the scenario simply indicates that a sound is triggered, in the form of a popup on the IN-AE, depending on the situation.
3-3. Implementation of MN-AE (BE-Cane Software)
Device data received from the cane must be parsed and processed into a usable form before being stored in the IN-CSE. The MN-AE is responsible for performing these functions.
1) GPS Data
The GPS data received from the device is in the form of [longitude, latitude].
While GPS data consisting of longitude and latitude is essential for indicating the pedestrian's location, it is not sufficient for analyzing the pedestrian's movements and gait. To address this, the Haversine formula, which calculates the distance between two points on a sphere, was utilized.
The calculated distance between two points is then used to derive the speed. Since the formula provides the distance in kilometers, there is a conversion process to adjust the speed into a unit suitable for pedestrian movement, specifically meters per second (m/s).
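A sketch of this Haversine-based speed calculation: the distance between two consecutive GPS fixes (in kilometers) is converted to m/s using the elapsed time between fixes. The function and variable names are ours, not taken from the project's MN-AE code.

```python
# Haversine distance between two GPS fixes and the resulting walking speed in m/s.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance between two points on the Earth's surface, in kilometers."""
    R = 6371.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def speed_mps(prev_fix, curr_fix, dt_seconds):
    """prev_fix/curr_fix are (lat, lon) tuples; returns walking speed in m/s."""
    dist_km = haversine_km(prev_fix[0], prev_fix[1], curr_fix[0], curr_fix[1])
    return dist_km * 1000.0 / dt_seconds    # km -> m, then divide by elapsed seconds
```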
The calculated current speed not only serves as a criterion for evaluating pedestrian movement but also becomes a key factor in analyzing the pedestrian's gait patterns and movements in subsequent processes.
2) Impact Data
As illustrated in the diagram, the MN-AE receives impact data from the device and checks whether it exceeds the configured threshold, in order to assess the effect on the cane user and to identify any potential problems. Once this threshold check is complete, the relevant values are transmitted to the IN-CSE.
The criteria for this determination are as follows: if the impact data does not exceed the preset threshold, it is considered irrelevant. When an impact exceeding the threshold is detected, the walking speed calculated earlier is used to determine whether there is any problem with the visually impaired user. The average walking speed of visually impaired individuals is reported to range between 2 and 4 km/h. If the impact exceeds the threshold and no footsteps are detected for a certain period, the situation is judged to be abnormal.
The criteria for determining impact are summarized as follows:
- When a strong impact occurred previously but no movement followed, and this is judged to be a problem: "ac"
- When only a weak impact was received: "not"
- When an impact significant enough to affect the user was received: "is"
Based on these criteria, it is possible to avoid misjudging cases where a visually impaired person, despite experiencing an impact, can move without difficulty either independently or with assistance from others. Since walking speeds and patterns vary for each individual, collecting personalized data and configuring the system accordingly can enhance accuracy.
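A simplified Python sketch of this decision logic follows; the threshold matches the virtual sensor's value, while the no-movement speed and the function and field names are our assumptions (the actual MN-AE also uses the GPS-derived walking speed and per-user data).

```python
# Simplified MN-AE impact classification into "ac" / "not" / "is".
IMPACT_THRESHOLD = 80          # threshold used for the virtual impact sensor
NO_MOVE_SPEED = 0.1            # m/s below which the user is treated as not moving (assumption)

def classify_impact(impact: int, walking_speed: float, strong_impact_recently: bool) -> str:
    if strong_impact_recently and walking_speed < NO_MOVE_SPEED:
        return "ac"            # previous strong impact and no movement -> accident
    if impact < IMPACT_THRESHOLD:
        return "not"           # weak impact -> irrelevant
    return "is"                # impact strong enough to affect the user
```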
3) Battery Data
Battery data, similar to the impact data mentioned earlier, will be received from the device. After completing the processing to evaluate the value of the data, the relevant information will be transmitted to the IN-CSE.
Based on the aforementioned criteria, if the remaining battery level falls below 10%, an "off" status is generated and saved as an onoff CIN.
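A minimal sketch of this evaluation step (the 10% threshold and the "off"/onoff naming come from the text; everything else is illustrative):

```python
# Map a battery level to the on/off status stored in the onoff container.
def battery_status(level: float) -> str:
    return "off" if level < 10.0 else "on"

print(battery_status(9.5))   # "off" -> would be saved as an onoff CIN
```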
3-4. Implementation of IN-AE
1) Implementation Objective
- To fetch data from the MN-AE and the Cane (AE) in the IN-AE and display it on the user interface (UI).
- To trigger specific notifications (popups) that alert users and caregivers under defined conditions.
2) Key Features and Data Flow
Data Collection
- Sensor data from MN-AE and Cane (AE) are fetched periodically using the fetchData and fetch_aeData functions.
- Examples: GPS location, battery status, and impact information.
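For illustration, the latest reading of each container could be retrieved from the CSE via oneM2M's "la" (latest) virtual resource, as sketched below; the paths and originator are assumptions, and the project's own fetchData and fetch_aeData functions run inside the web application.

```python
# Sketch: fetch the latest contentInstance of a container from the TinyIoT CSE.
import requests

CSE_URL = "http://127.0.0.1:8080/TinyIoT"
HEADERS = {"X-M2M-Origin": "CinAE", "X-M2M-RI": "req-get-0001",
           "X-M2M-RVI": "3", "Accept": "application/json"}

def fetch_latest(container_path: str) -> dict:
    """Retrieve the latest contentInstance under the given container."""
    resp = requests.get(f"{CSE_URL}/{container_path}/la", headers=HEADERS)
    return resp.json()["m2m:cin"]

# e.g. fetch_latest("cane/gps"), fetch_latest("cane/battery"), fetch_latest("cane/impact")
```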
Condition-Based Popup Notifications
- Low Battery: Battery level drops below 10%.
- Impact Detected: Impact sensor value exceeds a defined threshold (e.g., the shock value is 'is' or 'ac').
Data Visualization
- Display GPS data on a map.
- Render battery status and impact data on the dashboard UI.
3) Code Example: Popup Trigger Conditions
- popup for Low Battery Notification
- popup for Impact Notification
If the value of the shock sensor is "ac," it is judged as an accident, and an accident popup will be displayed.
If the value of the shock sensor is "is," a warning popup will be displayed. If the warning popup is shown more than twice within 10 seconds, the area will be considered a dangerous zone, and a popup notifying of the dangerous area will be displayed.
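The trigger conditions above can be summarized in a short sketch; it is written in Python for consistency with the sensor code, although the dashboard itself is a web application. The thresholds and the 10-second window come from the description above; everything else is illustrative.

```python
# Popup trigger conditions: low battery, accident ("ac"), warning ("is"),
# and danger zone when more than two warnings occur within 10 seconds.
import time

warning_times = []   # timestamps of recent "is" warnings

def popups(battery: float, shock: str):
    alerts = []
    if battery < 10.0:
        alerts.append("LOW_BATTERY")                 # low battery popup
    if shock == "ac":
        alerts.append("ACCIDENT")                    # accident popup
    elif shock == "is":
        now = time.time()
        warning_times.append(now)
        # keep only warnings from the last 10 seconds
        warning_times[:] = [t for t in warning_times if now - t <= 10]
        alerts.append("WARNING")                     # warning popup
        if len(warning_times) > 2:
            alerts.append("DANGER_ZONE")             # danger-zone popup
    return alerts
```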
3-5. Web application Implementation
- The white cane senses the impact when a visually impaired person falls. If the cane does not move within a certain amount of time, the guardian app sounds an alarm and the cane emits its own sound. At this point, the guardian can check the real-time location of the visually impaired person and report it to nearby public organizations. In addition, when a visually impaired person enters a dangerous area, the cane sounds its own alarm to warn them that it is a dangerous area. When the battery falls below 10 percent, an alarm goes off to indicate that the cane needs charging.
- Each municipality receives the data needed to care for the visually impaired in its region, such as the cane's location and impact data, analyzes users' walking patterns and risks, and provides personalized feedback to help them walk as safely as possible. The district office or the head of the local government delivers this feedback periodically through visits or mail. For example, they might warn that using certain routes at certain times can be dangerous.
The following is a description of the implemented website.
The right side of the page provides the profile of the cane user (visually impaired), an introduction to the Green Cane, and information about the current weather.
This screen shows the real-time location of the visually impaired individual on the map.
This section displays sensor data collected from the Smart White Cane, updating every 5 seconds.
If the impact sensor value exceeds 80, a warning popup is displayed. By pressing the 'Call' button, nearby public institutions and local community caregivers are immediately notified.
When the battery level of the cane drops below 10%, a low battery alert is displayed. This feature helps efficiently manage the cane's battery, ensuring uninterrupted functionality for the visually impaired user.
If the warning popup appears more than twice within 10 seconds in the application, it is determined that the visually impaired person is in a danger zone. A danger zone signal popup is then displayed to notify the caregiver for proper management.
If a significant impact is detected and there is no movement from the visually impaired person for 5 seconds, it is determined to be an accident. An alert like the one below is displayed, and notifications are sent to the police and rescue team simultaneously.
The following is a subscenario that gives feedback to the visually impaired person based on the information collected through the Green Cane.
Public institutions provide personalized walking services through mail, etc. The left shows the average walking speed, average impact frequency, usage patterns, and hazard area proportion, and the right shows a map of the areas where users have traveled the most.
4. Demo Video
5. Documents
Team Meeting Minutes
https://drive.google.com/drive/folders/1tWUwGmUlnhbPXaW0CFy6qjjMcKZg1pO5?usp=drive_link
PPT for detailed explanation
https://drive.google.com/file/d/13b54tO6HNUl8rlE1Ac8cDZAt3WMX94CQ/view?usp=sharing