Urban Edge Smart Campus is a system comprising mmWave sensors, IR blasters, UWB positioning systems, and AWS services working together with autonomous robots.
A human crowd management system has been implemented using AWS (Amazon Web Services) in response to the COVID-19 pandemic. The system aims to track and control the number of people in a given area to ensure adherence to social-distancing rules. Using AWS's analytics and real-time data-processing capabilities, it can precisely identify and assess crowd density. The system is also used to check the availability of classrooms and other educational spaces so that occupancy limits are not exceeded. This proactive approach provides a safer environment while reducing the risk of virus transmission.
Moreover, the human crowd management system deployed on AWS goes beyond merely monitoring the number of people in an area: it also analyses patterns of crowd behaviour to provide valuable insights to staff. By analysing historical and real-time data, the system can identify the peak times when crowd density is highest. This information is relayed to staff, enabling them to anticipate and prepare for busy periods, for example by allocating more staff or adjusting class schedules. This proactive approach helps optimise operations, minimise congestion, and ensure a smooth and organised flow of people within the area. By harnessing AWS's analytics and machine-learning capabilities, the system enables staff to make data-driven decisions and provide a better experience for everyone involved.
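The peak-time analysis described above can be sketched in a few lines. The sample data, hour granularity, and simple averaging here are illustrative assumptions, not the production AWS pipeline:

```python
from collections import defaultdict

# Hypothetical occupancy log: (hour_of_day, people_counted) samples,
# standing in for the historical data the cloud pipeline would aggregate.
samples = [
    (9, 12), (9, 18), (10, 35), (10, 41),
    (11, 38), (12, 55), (12, 61), (13, 44),
    (14, 20), (15, 15),
]

def peak_hours(samples, top_n=2):
    """Return the top_n hours of the day ranked by mean occupancy."""
    by_hour = defaultdict(list)
    for hour, count in samples:
        by_hour[hour].append(count)
    means = {h: sum(c) / len(c) for h, c in by_hour.items()}
    return sorted(means, key=means.get, reverse=True)[:top_n]

print(peak_hours(samples))  # → [12, 13]
```

A real deployment would run the same aggregation over days of data and feed the result to the staff dashboard.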
To enhance the crowd management system further, human sensors are employed to monitor the number of individuals present in a room. These sensors can accurately detect human presence and trigger the room's lighting system accordingly. By integrating with the AWS platform, the sensor data is transmitted in real-time to the cloud for processing and analysis. In addition to human sensors, voltage and current sensors are deployed to monitor the power consumption within the room. These sensors continuously measure the electrical current and voltage, providing valuable data on energy usage. The information collected by the voltage and current sensors is then displayed on the AWS platform, allowing staff members to monitor and analyze the power consumption patterns of the room. This enables them to identify any anomalies, optimize energy usage, and make informed decisions regarding power management. The integration of human sensors and voltage/current sensors with AWS enhances the overall crowd management system by providing valuable insights into both human presence and energy consumption, ensuring efficient resource allocation and sustainability.
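The real-time transmission of sensor data to the cloud might look like the following sketch. The topic scheme, field names, and room identifier are hypothetical; an actual node would hand the payload to an MQTT client connected to AWS IoT Core:

```python
import json
import time

def build_telemetry(room_id, occupied, voltage_v, current_a):
    """Assemble the JSON message a room node might publish to the cloud.
    Field names and the topic layout are illustrative assumptions."""
    return {
        "topic": f"campus/rooms/{room_id}/telemetry",
        "payload": json.dumps({
            "room": room_id,
            "occupied": occupied,            # from the human-presence sensor
            "voltage_v": voltage_v,          # from the voltage sensor
            "current_a": current_a,          # from the current sensor
            "power_w": round(voltage_v * current_a, 2),  # P = V * I
            "ts": int(time.time()),
        }),
    }

msg = build_telemetry("B2-101", True, 229.8, 1.74)
print(msg["topic"])  # campus/rooms/B2-101/telemetry
```

Computing derived values such as power on the node keeps the dashboard logic simple, at the cost of slightly larger messages.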
We also implemented emergency staircase monitoring, which addresses smoke inhalation and crowd-crush injuries, two of the most dangerous fire-related hazards. The emergency model displays real-time smoke, traffic, and fall data on a dashboard together with the optimal escape route. A Raspberry Pi detects smoke using an MQ4 gas sensor; the detection script is written in Python (developed in the Thonny IDE), and the smoke readings are transferred to the cloud for processing before being shown on the dashboard. A Raspberry Pi camera monitors exits during an evacuation: it counts people approaching the exit using a conventional machine-vision pipeline built on OpenCV, YOLO, and PyTorch, and these counts are likewise processed in the cloud and displayed on the dashboard. Millimeter-wave sensors detect falls and notify administrators so they can assist the injured.

The single-board computer (SBC) for this project is a Raspberry Pi 3B+, which handles both the machine vision and the sensor integration. The Raspberry Pi Camera Module v2 was chosen for its compatibility with the 3B+ and its higher resolution, and the MQ4 gas sensor was selected for its reliable smoke detection. Grayscale Correlogram Clustering (GCC), developed by Google, is used to count and recognise crowds; GCC classifies images using colour and texture features and has proven valuable in object identification, image retrieval, and medical imaging. The conceptual design of the emergency model connects the SBC, sensors, and camera to the cloud, which facilitates dashboard data analysis, and includes program code and assembly instructions for connecting the components. Experiments show that the MQ4 gas sensor can accurately identify smoke and that the camera can count people in real time.
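The smoke-alarm decision logic on the Raspberry Pi can be sketched as below. On real hardware the MQ4 is read through an ADC; here the readings, the threshold, and the debounce count are all illustrative assumptions that would need empirical calibration:

```python
def smoke_alarm(readings, threshold=300, consecutive=3):
    """Flag smoke when `consecutive` successive MQ4-style readings exceed
    `threshold`. Requiring a run of high readings debounces transient spikes.
    Threshold and run length are assumed values, not calibrated ones."""
    run = 0
    for value in readings:
        run = run + 1 if value > threshold else 0
        if run >= consecutive:
            return True
    return False

clean_air = [120, 135, 128, 140]
smoky = [150, 320, 340, 360, 310]
print(smoke_alarm(clean_air), smoke_alarm(smoky))  # False True
```

Once the alarm fires, the node would publish the event to the cloud for the dashboard and escape-route computation.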
The purpose of the energy monitoring is to give users insightful data about power usage. By precisely sensing a device's voltage and current, the system can determine its power consumption. To further increase energy efficiency, the energy monitoring system incorporates automatic lighting and air-conditioning features that depend on human-presence detection.
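Given periodic voltage and current samples, energy use follows from integrating instantaneous power over time. The one-minute sampling interval and the steady load below are illustrative assumptions:

```python
def energy_wh(samples, interval_s=60):
    """Accumulate energy in watt-hours from (voltage, current) samples
    taken every interval_s seconds: E = sum(V * I * dt) / 3600."""
    joules = sum(v * i * interval_s for v, i in samples)
    return joules / 3600.0

# One hour of minute-by-minute samples at a steady ~230 V, 2 A load:
samples = [(230.0, 2.0)] * 60
print(energy_wh(samples))  # 460.0 Wh
```

Accumulated this way on the node or in the cloud, the figure can be shown on the dashboard alongside the raw voltage and current traces.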
The automatic lighting and air-conditioning system uses sensors to detect human presence. When nobody is around, the system shuts off automatically. As soon as the sensors recognise a person entering the room, the lighting and air-conditioning respond by adjusting the light intensity and ambient temperature to the appropriate levels. By matching lighting and temperature settings to occupancy, these functions help minimise energy use.
The automated lighting application, represented inside the yellow dotted box, consists of an mmWave sensor to detect human presence, a XIAO nRF52840 microcontroller, and a relay. When the sensor detects people nearby, it transmits a signal to activate the relay, which then turns on the light. The energy-monitoring portion of the circuit is shown by the green dotted box: a voltage and current sensor measures the power consumption, and this data is compiled and displayed on the dashboard. A temperature/humidity sensor is also integrated into the circuit to display the temperature and humidity on the dashboard.
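The relay decision driven by the mmWave sensor can be sketched as a simple hold-off rule: keep the light on while presence was reported recently, so brief detection gaps do not flicker the light. The 120-second hold-off is an assumed debounce value, not part of the original design:

```python
def relay_state(presence_events, now, hold_s=120):
    """Decide whether the lighting relay should be energised: on if the
    mmWave sensor reported presence within the last hold_s seconds."""
    if not presence_events:
        return False
    return (now - max(presence_events)) <= hold_s

detections = [1000.0, 1055.0, 1110.0]        # timestamps of presence reports
print(relay_state(detections, now=1150.0))   # True: last seen 40 s ago
print(relay_state(detections, now=1300.0))   # False: 190 s since last report
```

On the XIAO nRF52840 the equivalent logic would run in firmware, toggling the relay pin directly.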
The project involves the implementation of a real-time location tracking system using UWB technology and the trilateration algorithm. The Two-Way Ranging (TWR) method measures the distance between a tag and an anchor by calculating the time taken for UWB signals to travel between them. The UWB transceivers, specifically the DWM1001C, are used as reference anchors and tags for high-precision indoor localization. The software tools employed include Tera Term for programming the UWB transceivers, DRTLS Manager for creating the UWB transceiver network, and QGroundControl for automating the drone's operation. Additionally, to provide surveillance capabilities, a lightweight Pixy Cam is installed on the drone, feeding live video to an onboard Raspberry Pi. This combination of technologies enables accurate positioning, live streaming, and autonomous flight control, making it suitable for applications in smart campuses, asset tracking, and robotics.
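The two steps above, converting TWR timing into range and then solving the tag position from three anchor ranges, can be worked through as follows. The anchor layout and the single-sided TWR timing model are simplified assumptions (DWM1001C firmware typically uses a multi-message TWR scheme with clock-drift compensation):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def twr_distance(round_trip_s, reply_delay_s):
    """Single-sided two-way ranging: time of flight is half the round-trip
    time minus the responder's known reply delay; range = c * tof."""
    return C * (round_trip_s - reply_delay_s) / 2.0

def trilaterate(a1, a2, a3, d1, d2, d3):
    """Solve the tag's 2-D position from three anchor positions and ranges
    by linearising the circle equations and applying Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = a1, a2, a3
    A = 2 * (x2 - x1); B = 2 * (y2 - y1)
    P = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    D = 2 * (x3 - x1); E = 2 * (y3 - y1)
    Q = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = A * E - B * D          # nonzero when anchors are not collinear
    return ((P * E - B * Q) / det, (A * Q - P * D) / det)

# Anchors at three corners of a 10 m x 10 m room; tag truly at (4, 3).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
truth = (4.0, 3.0)
ranges = [math.dist(truth, a) for a in anchors]
print(trilaterate(*anchors, *ranges))  # ≈ (4.0, 3.0)
```

With noisy real ranges, a least-squares fit over more than three anchors gives a more robust estimate than this exact three-anchor solve.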
To transform building management, the prediction model described in this work combines data from numerous sensors, including temperature, humidity, energy use, and mmWave readings. It offers accurate forecasting, smart analysis, and AWS-based optimisation for energy efficiency. The model improves crowd management by providing detailed crowd-density heatmaps, and it preserves comfort and indoor air quality by continuously monitoring humidity levels. By analysing energy-consumption data in relation to environmental conditions, it identifies opportunities for energy savings and cost reduction. The mmWave sensors allow precise occupancy analysis, which improves resource management and security. The study does, however, highlight challenges such as human error and the difficulty of predicting indoor sensor environments from continuously streaming data.