Drone Uses Optical Sectioning Algorithm For Search And Rescue In Forests, Making The Search Process Even Faster

An autonomous drone with a new type of technology has been developed by a team of researchers at Johannes Kepler University to improve search-and-rescue efforts. The group outlines its drone modifications in a study published in the journal Science Robotics. In the same journal issue, Andreas Birk of Jacobs University Bremen published a Focus piece detailing the team’s work in Austria.

In 17 field tests across diverse forest types and seasons, a new prototype search-and-rescue drone successfully located people in dense forests around 90% of the time. The design, published in Science Robotics on June 23, combines thermal imaging, machine learning, and a new optical method to enable the drone to see missing people through the foliage.

Tree cover makes it difficult to locate individuals who are lost in a forest. People in planes and helicopters have difficulty seeing through the canopy to the ground below, where people may be walking or even lying down. The same issue hampers thermal imaging: the canopy blocks warmth sensors from picking up clear readings. Drones have been tried in search-and-rescue missions, but they face the same challenges because they are controlled remotely by pilots who are searching the ground beneath. In this new effort, the researchers added equipment that enables the drone to see through the tree cover and highlight the people hidden beneath it.

The new solution is based on an airborne optical sectioning (AOS) algorithm, which uses computation to defocus occluding objects such as treetops. The second component of the new device uses thermal imaging to highlight the heat radiated by a warm body. A machine-learning algorithm then assesses whether the heat signals come from humans, animals, or other sources. The new hardware was mounted on a standard autonomous drone. To decide where to search, the drone’s computer combines positional data with cues from the AOS and thermal sensors.
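The core idea behind airborne optical sectioning is to register many images taken along the flight path to the ground plane and average them, so that occluders above the ground (treetops) end up misaligned and blur away while ground-level heat signals reinforce each other. The sketch below illustrates that principle only; it is not the authors' implementation, and the function names, the simple pixel-shift registration, and the fixed detection threshold are all illustrative assumptions.

```python
import numpy as np

def integrate_ground_plane(images, offsets_px):
    """Shift each frame by its ground-plane registration offset and average.

    images: list of 2D arrays (e.g. thermal frames along the flight path)
    offsets_px: per-frame (dy, dx) integer pixel shifts that align the
        ground plane across frames. Objects above the ground plane need
        different shifts, so they smear out in the average while
        ground-level signals stack up.
    """
    acc = np.zeros_like(images[0], dtype=float)
    for img, (dy, dx) in zip(images, offsets_px):
        acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / len(images)

def detect_hotspots(integral_image, threshold):
    """Return (row, col) pixels whose integrated heat exceeds the threshold."""
    ys, xs = np.nonzero(integral_image > threshold)
    return list(zip(ys.tolist(), xs.tolist()))
```

In a real system the registration would come from the drone's pose and camera geometry rather than integer pixel shifts, and the hotspot candidates would be passed to the classifier described above rather than simply thresholded.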

If a potential match is found, the drone moves closer to the target to get a better look. If its sensors confirm a match, it sends a message with the coordinates to the research team. The researchers trained their algorithm using three GoPro cameras attached to a headset while hiking in the Swiss Alps. One camera pointed forward, one to the left, and one to the right of the hiker. After hours on these trails, the team had taken more than 20,000 photographs. The photographs were then used to teach their algorithm how to identify the borders of a hiking trail.
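The three-camera rig effectively labels the photographs for free: an image from the left-facing camera shows a view in which the trail lies to the hiker's right, and vice versa, so no manual annotation is needed. A minimal sketch of that self-labeling scheme, with illustrative function and label names, might look like this:

```python
# Each photo's label is derived from the orientation of the camera that
# took it: if the left-facing camera captured this view, a drone seeing
# the same view should steer right to get back onto the trail.
CAMERA_TO_LABEL = {
    "left": "turn_right",
    "forward": "go_straight",
    "right": "turn_left",
}

def label_dataset(photos):
    """Turn (image, camera_orientation) pairs into (image, label) pairs."""
    return [(img, CAMERA_TO_LABEL[cam]) for img, cam in photos]
```

The labeled pairs would then be fed to a standard supervised image classifier; the classifier itself is omitted here since the source does not describe its architecture.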

The result is a deep-learning algorithm that allows a drone with a single forward-facing color camera to follow an unknown trail entirely on its own, with no human intervention. The system was even better than humans at determining the exact direction of the trails. The team cautions that these findings are still at an early stage. While there is a long way to go before autonomous drones can search for missing individuals in forests, the researchers believe their study shows how deep neural networks can help autonomous vehicles negotiate situations with complicated, high-dimensional inputs.
