The next big wave of Unmanned Aerial Systems (UAS) innovation will come in the field of Computer Vision and, with it, Command and Control. At the recent Association for Unmanned Vehicle Systems International (AUVSI) Xponential conference in Denver, Colorado, the number of tech companies integrating Computer Vision (programming computers to recognize objects and gain high-level understanding from digital images or video) was clearly on the rise.
In fact, just prior to the conference, AUVSI partnered with companies like RoboNation, IBM, Verizon, and FlytBase to host Xbuild, an unmanned systems hackathon, held at Galvanize in Boulder, Colorado. Nearly all of the more than 20 teams participating had some form of machine-learning or Computer Vision solution handling part of the drone’s function. From recognizing people lying on the ground to synchronizing airspace activity, autonomous behavior was heavily influenced by this branch of Artificial Intelligence.
This adds another layer of complexity in the run-up to Beyond Line of Sight (BLOS) operations, yet it holds tremendous promise to make UASs safer and smarter without necessarily adding more sensors.
Computer Vision and (equally important) machine learning are finding their way into sensor operations. The need for lightweight, low-power sensors often limits performance and utility for industry; however, advanced computing techniques able to “tease out” more information are giving new life and new utility to otherwise one-dimensional sensor data. For example, though a multispectral sensor may not be able to differentiate between crops and weeds through direct signature matching with its limited spectral bands, machine learning can find other relationships between the bands that separate the two, in near real time.
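The idea of classifying on relationships between bands, rather than on raw band signatures, can be sketched in a few lines. The band values, class labels, and ratio features below are invented for illustration, and a real pipeline would use a trained model rather than this toy nearest-centroid rule:

```python
# Hypothetical sketch: separating "crop" from "weed" pixels using derived
# relationships between spectral bands (band ratios) instead of matching
# raw spectral signatures. All reflectance values here are synthetic.

def band_ratios(pixel):
    """Derive ratio features from a (green, red, nir) reflectance tuple."""
    green, red, nir = pixel
    ndvi = (nir - red) / (nir + red)  # crude vegetation-vigor index
    gr = green / red                  # greenness ratio
    return (ndvi, gr)

def centroid(samples):
    """Mean feature vector for a list of feature tuples."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

def classify(pixel, centroids):
    """Assign the label of the nearest class centroid in feature space."""
    feats = band_ratios(pixel)
    def dist(label):
        return sum((f - c) ** 2 for f, c in zip(feats, centroids[label]))
    return min(centroids, key=dist)

# Labeled training pixels (green, red, nir) -- synthetic values.
crops = [(0.10, 0.08, 0.55), (0.12, 0.07, 0.60), (0.11, 0.09, 0.52)]
weeds = [(0.14, 0.12, 0.35), (0.15, 0.11, 0.30), (0.13, 0.13, 0.33)]

centroids = {
    "crop": centroid([band_ratios(p) for p in crops]),
    "weed": centroid([band_ratios(p) for p in weeds]),
}

print(classify((0.11, 0.08, 0.58), centroids))  # prints "crop"
```

Even with only three bands, the derived features pull the two classes apart in a way the raw reflectance values alone do not, which is the essence of the technique described above.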
Data classification from machine learning is still a “dark art,” and the concern among many in the data space is that we don’t really know, and cannot easily understand, how these classifications are made behind the scenes, which makes verification and adjustment tedious.
For the UAS industry, machine learning has the potential to spot faults and anticipate concerns within complex computing and hardware systems. As we know, faults in the UAS world happen with spectacular results, and most can be traced back to smaller, more subtle changes or failures in the run-up to a catastrophic event. “Teaching” the flight control software to recognize these symptoms could allow the system to “self-heal” in time to prevent damage or injury.
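One simple way to catch those subtle precursor changes is to watch a telemetry stream for drift away from its recent baseline. The sketch below uses a rolling z-score on a hypothetical motor-current signal; the window size, threshold, and telemetry values are illustrative assumptions, not tuned flight-software parameters:

```python
from collections import deque
import math

# Hypothetical sketch of fault anticipation: flag telemetry samples that
# deviate sharply from the rolling baseline, before a fault cascades.

class DriftMonitor:
    def __init__(self, window=20, threshold=3.0):
        self.samples = deque(maxlen=window)  # recent baseline readings
        self.threshold = threshold           # z-score alert level

    def update(self, value):
        """Return True if `value` is anomalous versus the recent window."""
        if len(self.samples) == self.samples.maxlen:
            mean = sum(self.samples) / len(self.samples)
            var = sum((s - mean) ** 2 for s in self.samples) / len(self.samples)
            std = math.sqrt(var) or 1e-9     # guard against a flat signal
            if abs(value - mean) / std > self.threshold:
                return True  # caller could derate a motor, return to launch, etc.
        self.samples.append(value)
        return False

monitor = DriftMonitor()
nominal = [10.0 + 0.1 * (i % 3) for i in range(30)]  # steady current draw
alerts = [monitor.update(v) for v in nominal]
spike = monitor.update(14.0)                          # incipient fault
print(any(alerts), spike)  # prints "False True"
```

A real system would monitor many channels at once and feed them to a learned model rather than a single threshold, but the principle is the same: the symptom is visible in the data well before the failure is visible in the airframe.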
The “human in the loop” is going to be the weakest link as advanced data science builds better models, monitoring systems, and mission planning and operations software, hence the sponsorship of IBM, FlytBase, Facebook, and Verizon at Xbuild. Safer flight operations and getting more from sensors are key to driving investment dollars into the industry to solve power and airspace integration limitations.