Technology: Rescue drones need to learn how lost humans think
Frantic parents call 911 to report a child has wandered away from a state forest campsite. An unexpected storm strands hunters deep in a wilderness preserve. The US National Park Service documented almost 3,500 search and rescue missions in 2017 alone. Speed is essential when someone goes missing, so search coordinators tend to throw in every tool at their disposal: volunteers, scent-trained dogs, horses and vehicles of all kinds often pour into the area. Drones may seem like an obvious way to save precious time and resources, but the catch is that these unmanned aerial vehicles (UAVs) still need to work far more effectively with humans in order to make a real difference. Now, with the summer outdoor season fast approaching, researchers at Virginia Tech, supported by a $1.5 million grant from the National Science Foundation, are developing algorithms and machine learning tools to better utilise these eyes in the sky.
First, there is the matter of where a drone should start looking. To find ways to narrow this down, Nicole Abaid, an assistant professor in Virginia Tech’s Department of Biomedical Engineering and Mechanics, used algorithms to develop a mathematical model of what humans do in such situations. “Someone with dementia, when they're lost, will behave significantly differently than a child or a despondent person,” Abaid says.
For data to feed the algorithms she turned to search theory researcher Robert J. Koester, who says he has used information from more than 140,000 search and rescue incident reports for his 2008 book Lost Person Behaviour, which he regularly updates. Before becoming a consultant on the new project, he had already created a set of predictive tools to help coordinators narrow their search parameters. “I've been able to create models that potentially predict what the missing person is going to do,” Koester says. Abaid incorporated his historical data into her own mathematical model. “The model generates a trajectory, like a path that we think the lost person would take,” Abaid explains.
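The article does not describe the mathematics of Abaid’s model, but the general idea of generating a plausible lost-person trajectory can be sketched as a biased random walk, where the bias encodes a behaviour profile of the kind Koester’s incident data supports. The profile names and parameters below are hypothetical, purely for illustration:

```python
import math
import random

# Hypothetical behaviour profiles: mean step length (metres) and a
# directional bias toward a preferred heading. A real model would fit
# these from historical search and rescue incident data.
PROFILES = {
    "child": {"step": 20.0, "drift": 0.2},   # short steps, weak drift
    "hiker": {"step": 60.0, "drift": 0.6},   # long steps, strong drift
}

def simulate_trajectory(profile, start, n_steps, heading=0.0, seed=None):
    """Generate one plausible lost-person path as a biased random walk."""
    rng = random.Random(seed)
    p = PROFILES[profile]
    x, y = start
    path = [(x, y)]
    for _ in range(n_steps):
        # Blend a random direction with the preferred heading.
        angle = (1 - p["drift"]) * rng.uniform(0, 2 * math.pi) \
            + p["drift"] * heading
        x += p["step"] * math.cos(angle)
        y += p["step"] * math.sin(angle)
        path.append((x, y))
    return path

path = simulate_trajectory("child", start=(0.0, 0.0), n_steps=50, seed=1)
```

Running many such simulations from the point last seen yields a spread of candidate paths rather than a single guess.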
This model could help guide a search and rescue operation, says research team leader Ryan Williams, an assistant professor in the Bradley Department of Electrical and Computer Engineering within Virginia Tech’s College of Engineering. “If we have an idea of where this person started and when they were last seen, we can better predict where they may have gone and deploy drones and people to better cover those areas,” Williams says. “We can generate meaningful maps that human searchers can use for much better decisions at the beginning of the search.”
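One simple way to turn predicted trajectories into the kind of map Williams describes, which is a sketch rather than the team’s actual method, is to bin trajectory points into grid cells and normalise the counts into a search-priority heatmap:

```python
from collections import Counter

def probability_map(trajectories, cell=100.0):
    """Bin predicted-trajectory points into square grid cells and
    normalise, giving a rough 'where to look first' heatmap."""
    counts = Counter()
    total = 0
    for path in trajectories:
        for x, y in path:
            counts[(int(x // cell), int(y // cell))] += 1
            total += 1
    return {c: n / total for c, n in counts.items()}

# Two toy predicted paths; real input would come from a behaviour model.
paths = [[(0, 0), (120, 30), (150, 90)], [(10, 20), (130, 40)]]
heat = probability_map(paths, cell=100.0)
best_cell = max(heat, key=heat.get)  # the cell to cover first
```

Drones and ground teams could then be assigned to cells in descending order of probability.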
Drone operators can concentrate on a likely area, in theory scouting much more quickly than a ground-based team. But they can actually collect too much information; in one flight a drone can capture images using visible light, along with thermal imagery and remote-sensing lidar. Analysing it all could slow down a rescue team just when time is most critical. “We don’t want to drown searchers with data,” Williams says. “They want answers.”
So, Williams and his team are also developing machine-learning tools to quickly mine raw data for nuggets of helpful information. “For example, we can train a neural network [a collection of algorithms modelled on the human brain] to analyse thermal data and differentiate between a deer, bear or human profile and suggest places searchers should examine,” Williams says. The system could combine such input, along with any clues that searchers on the ground discover, with Abaid’s human behaviour model. This could determine where a missing person is most likely to be found, but if that fails, the aerial vehicle could try another location.
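The team’s approach uses a trained neural network, but the underlying classification idea can be illustrated with a far simpler stand-in: labelling a detected thermal blob by its nearest centroid in a small feature space. The centroids and feature values here are hypothetical:

```python
import math

# Hypothetical feature centroids: (apparent size in pixels, peak temp in
# degrees C). A real system would learn such boundaries from labelled
# thermal imagery; this nearest-centroid rule is only an illustration.
CENTROIDS = {
    "deer":  (40.0, 38.5),
    "bear":  (90.0, 37.5),
    "human": (60.0, 36.5),
}

def classify(size_px, temp_c):
    """Label a thermal detection by its nearest centroid."""
    return min(
        CENTROIDS,
        key=lambda k: math.dist((size_px, temp_c), CENTROIDS[k]),
    )

label = classify(58.0, 36.4)  # a blob close to the 'human' centroid
```

A neural network plays the same role as `classify` here, only with boundaries learned from thousands of labelled examples rather than hand-picked centroids.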
“An automated system could process high-priority data first, then reprocess the data later for less-likely scenarios after the first [scenario] fails,” says John Sohl, a member of the national Mountain Rescue Association and the Weber County Sheriff's search and rescue team in Ogden, Utah, who is not involved in this research.
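The prioritise-then-reprocess workflow Sohl describes maps naturally onto a priority queue: the most likely scenarios are analysed first, and less likely ones wait their turn. A minimal sketch, with made-up scenario names and a placeholder analysis step:

```python
import heapq

def process(scenario, data):
    """Placeholder for analysing one batch of drone data under a scenario."""
    return f"analysed {data} for: {scenario}"

# Lower number = higher priority (more likely scenario).
queue = []
heapq.heappush(queue, (1, "subject followed the trail downhill"))
heapq.heappush(queue, (3, "subject crossed the creek"))
heapq.heappush(queue, (2, "subject sheltered near the campsite"))

results = []
while queue:
    priority, scenario = heapq.heappop(queue)
    results.append(process(scenario, "thermal sweep #1"))
```

If the top-priority pass turns up nothing, the same captured data can be pushed back for analysis under the next scenario rather than flying the drone again.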
Of course, such an automated system would supplement humans in the field, not replace them. The Virginia Tech team is also developing backpacks for field crews to carry lightweight computers that can crunch the drone data, as well as equipment for communicating with the airborne UAVs. “As real searches occur over large areas,” Williams says, “the backpack allows in-field data analysis, as opposed to sending everything back to base.”
Data processing is not the only technological limitation preventing drones from fully replacing ground rescuers. Another is the requirement for “line of sight” control: Operators must keep drones in view at all times in order to steer them, which limits the size of the search area. And even today’s most energy-efficient UAVs generally have less than 60 minutes of battery life. Swapping in new ones saps valuable time from a potentially productive search.
To see how problems like these will impact drone use, the Virginia Tech team plans to try out their system during real search and rescue operations. But because their work, including hardware for the backpack, digital models, user interface and computing architecture, is still in development, they will not start field tests until next year.
If the autonomous system proves itself in the field, it could make drones more helpful for coordinators such as Mark Eggeman, who led 113 search and rescue operations at the Virginia Department of Emergency Management in 2017 (he is not involved in the Virginia Tech research). “Information influences how you plan searches. Anything we can do to try to speed access to information can make a difference in how we deploy our resources and assets,” he says.
Sohl agrees. “Drones have the potential of providing an excellent resource that can aid direct visual search as well as improving communications,” he says. “The more resources that are available to any rescue team, the better.”
Source: Scientific American