DHS S&T outlines machine learning-based drone detection tech
Visual detection of drones has never been considered as effective as thermal, radio, or acoustic detection. Combined with machine learning, however, a camera can tell a different story. Today, this budding technology is helping the Department of Homeland Security (DHS) Science and Technology Directorate (S&T) and Sandia National Laboratories create more precise drone detection capability through visuals alone.
“If you have a video of something, you can kind of identify it based on certain characteristics,” explained Jeff Randorf, an S&T engineering advisor. “You can train neural networks to recognize patterns, and the algorithm can begin to pick up on certain features.”
Until now, drone video was limited to raw data analysis: capturing footage and learning from it directly. The novel temporal frequency analysis (TFA) being tested at Sandia dives deeper into the image. Instead of relying on heat signatures, acoustic signatures, or a video taken at face value, TFA analyzes the frequency at which pixels fluctuate in an image over time, eventually obtaining a “temporal frequency signature” for the drones it has observed. Pairing robust imaging systems with machine learning in this way makes it only a matter of time before discrimination is seamless.
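The article does not detail Sandia’s implementation, but the core idea of a temporal frequency signature can be sketched as a per-pixel Fourier transform along the time axis of a video clip. The following is a minimal illustration in Python, assuming a stack of grayscale frames and NumPy; the function name and the choice to average the spectrum across pixels are this sketch’s assumptions, not Sandia’s method.

```python
import numpy as np

def temporal_frequency_signature(frames, fps):
    """Sketch of a temporal frequency signature for a video clip.

    frames: array of shape (T, H, W), grayscale intensities over T frames.
    fps: capture rate in frames per second.
    Returns the frequency axis and the mean fluctuation spectrum over pixels.
    """
    frames = np.asarray(frames, dtype=np.float64)
    # Subtract each pixel's time-average so only its fluctuation remains.
    fluctuation = frames - frames.mean(axis=0, keepdims=True)
    # FFT along the time axis gives each pixel's fluctuation spectrum.
    spectrum = np.abs(np.fft.rfft(fluctuation, axis=0))
    freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fps)
    # Average across pixels to form a single signature for the clip.
    signature = spectrum.mean(axis=(1, 2))
    return freqs, signature
```

Under this reading, objects with distinctive periodic motion, such as spinning rotor blades, would concentrate energy at characteristic frequencies, giving the learning algorithm a feature that raw frames alone do not expose.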
“Current systems rely upon exploiting electronic signals emitted from drones,” said Bryana Woo of Sandia National Laboratories, “but they can’t detect drones that do not transmit and receive these signals.”
Previously, drones could be spotted by picking up the radio signal between a remote control and the drone itself, but as drones become autonomous, that capability may quickly vanish. TFA, by contrast, captures tens of thousands of frames in a video, so a machine can learn what an object is from how it moves through time. If mastered, TFA could be the most precise discrimination method to date.
In the Sandia tests, a streaming video camera captured three different multirotor drones. Each drone traveled forward and back, side to side, and up and down while the camera recorded its spatial and temporal location, and a machine learning algorithm was trained on the resulting frames. Ultimately, the analysis renders the target object’s full flight path in every direction.
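The article does not name the learning algorithm Sandia trained. As an illustration only, a conventional classifier could be fit to temporal frequency signatures like the one sketched above to separate drones from other objects; the file names, the random-forest choice, and the train/test split below are all assumptions of this sketch.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical dataset: one signature vector per clip, labeled drone or not.
# X has shape (num_clips, num_frequency_bins); y holds 1 for drone, 0 otherwise.
X = np.load("signatures.npy")  # placeholder path, not from the article
y = np.load("labels.npy")      # placeholder path, not from the article

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"Drone-vs-clutter accuracy: {clf.score(X_test, y_test):.2f}")
```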
To challenge the system, testers began with more complex data, filling the environment with clutter: birds, cars, and helicopters around the drone. Over time, Sandia observed considerable improvement in the system’s ability to discern whether an object was a drone or a bird.
The TFA work with Sandia is part of a larger S&T effort to stay abreast of the latest drone technologies. The number of commercial and personal drones in the sky is expected to nearly triple within the current decade, raising concerns about how their traffic will be managed, how nefarious drones can be identified, and even how to tell drones apart from their surroundings.
There could always be new barriers to detection, which is why S&T has taken on the nefarious drone issue from multiple angles: evaluating drones for law enforcement components through demonstrations at Camp Shelby, Mississippi, developing a DHS interface for the future Unmanned Aircraft Systems (UAS) Traffic Management system, and keeping up with state-of-the-art counter-UAS capabilities.
Source: DHS S&T