Introduction:
Each year, tens of thousands of people fall victim to natural disasters. People living in underprivileged nations are at much greater risk of dying in these disasters, in part because of a lack of immediately available resources. Rescue systems such as helicopters and search planes, while effective at covering a wide area, are expensive to purchase and operate, which limits the nations that can afford them. Our goal is to promote equity in disaster relief and increase the accessibility of disaster search-and-rescue systems, particularly those that use computer vision (CV) and machine learning.
Additionally, after a natural disaster, damage to civil infrastructure like power lines and roads needs to be assessed quickly so that repair crews can be allocated. However, flood water or other blockages can delay or prevent these inspections due to the danger to crews. Our system will be extended with CV capabilities to detect downed power lines so that crews can identify critical damage and safely plan repair operations.
Our solution:
As part of the Gemstone Honors Program at the University of Maryland, we are an undergraduate research group focused on designing, prototyping, and developing a low-cost, open-source module for real-time onboard computer vision. The module will attach to commercial unmanned aerial vehicles (UAVs) and detect humans and downed power lines while the UAV surveys an area, transmitting those detections to a ground crew to coordinate rescue and repair efforts. Our system will also be able to send commands to the UAV to autonomously steer it through a search area.
Progress so far:
So far, we have begun training and testing various CV models to improve the accuracy of detecting humans and downed power lines. The final system will include several CV models so that users can switch between them depending on their needs. We are achieving real-time onboard computer vision with two core components: a Raspberry Pi 4 single-board computer, which handles the camera and image transmission, and a Google Coral Edge TPU accelerator, which handles the actual computer vision computations. These two easily accessible products give our system roughly four times the processing speed of comparable standard boards while still allowing it to transmit images to a ground crew.
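To give a flavor of what the onboard CV pipeline does with a model's raw output, here is a minimal sketch of filtering detections by class and confidence before anything is transmitted to the ground crew. The class names, tuple format, and threshold below are illustrative placeholders, not our final model's actual output format.

```python
# Illustrative sketch: keep only confident detections of the class we
# care about (e.g. humans) before sending results to the ground crew.
# Detection format and threshold are placeholder assumptions.

def filter_detections(detections, target_class="person", min_score=0.5):
    """Keep detections of target_class whose confidence meets the threshold.

    Each detection is a (class_name, score, bounding_box) tuple, where
    bounding_box is (x_min, y_min, x_max, y_max) in pixels.
    """
    return [d for d in detections
            if d[0] == target_class and d[1] >= min_score]

raw = [
    ("person", 0.91, (120, 40, 180, 160)),    # confident human detection
    ("person", 0.32, (300, 200, 340, 260)),   # low confidence, likely noise
    ("power_line", 0.77, (0, 90, 640, 110)),  # different detection class
]
confident_people = filter_detections(raw)  # keeps only the first entry
```

In the real system, the detection tuples would come from the model running on the Edge TPU accelerator, and the surviving detections would be packaged for radio transmission.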
Additionally, we are designing our module to be compatible with a wide range of hobby UAV components, which increases the accessibility of our system and lets developers integrate it with parts they already own rather than buying newer or more expensive components. With the help of the MAVSDK collection of programming libraries, we can transmit autopilot instructions to the UAV so that it can search an area.
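As a sketch of how an area search might be planned, the function below generates a back-and-forth "lawnmower" grid of waypoints; each waypoint could then be sent to the flight controller, for example via MAVSDK-Python's `drone.action.goto_location(...)`. The grid dimensions, spacing, and coordinates are illustrative assumptions, not our flight-tested parameters.

```python
# Hypothetical sketch: plan a "lawnmower" (boustrophedon) search pattern
# as a list of (lat, lon) waypoints. Parameters here are placeholders.

def lawnmower_waypoints(lat0, lon0, rows, cols, spacing_deg):
    """Generate a back-and-forth grid of (lat, lon) waypoints covering a
    rectangular search area, starting at (lat0, lon0).

    spacing_deg is the distance between adjacent waypoints in decimal
    degrees (a small value like 0.0001 is roughly 11 m of latitude).
    """
    waypoints = []
    for r in range(rows):
        row = [(lat0 + r * spacing_deg, lon0 + c * spacing_deg)
               for c in range(cols)]
        if r % 2 == 1:
            row.reverse()  # alternate direction each row to avoid backtracking
        waypoints.extend(row)
    return waypoints

# Example: a small 2x3 grid near College Park, MD (illustrative only).
grid = lawnmower_waypoints(38.99, -76.94, rows=2, cols=3, spacing_deg=0.0001)
```

In flight, each `(lat, lon)` pair would be handed to the autopilot one at a time, moving to the next waypoint once the UAV reports it has arrived.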
What comes next?
Our next steps include integrating our module prototype into a drone, performing flight tests to validate our onboard real-time computer vision and autopilot navigation, and continuing to make our system more accessible and compatible with hobby UAV components. With your help, we can make natural disaster relief more accessible!
Gifts in support of the University of Maryland are accepted and managed by the University of Maryland College Park Foundation, Inc., an affiliated 501(c)(3) organization authorized by the Board of Regents. Contributions to the University of Maryland are tax deductible as allowed by law. Please see your tax advisor for details.
Every aspect of our project requires a variety of electronics, including wires, motors, batteries, and more. Your contribution will go towards purchasing these components so that our system can run!
Our quadcopter drone requires various sensors in order to operate safely. Your contribution will go towards purchasing these sensors so that we can fly!
Currently, we are working to create our own quadcopter drone in order to test our various computer vision and autopilot algorithms. Your contribution will support the costs of building and maintaining this drone!
We plan to use images from an onboard camera to identify humans and downed power lines, then send those detections to ground crews. Your contribution will go towards purchasing the cameras and radio transmission systems needed to accomplish this!
Our computer vision and autopilot algorithms require specialized hardware in order to run onboard the UAV. Your contribution will go towards purchasing microcontrollers, processors, and hardware accelerators in order to facilitate real-time human detection and autopilot!