Over the last two to three years, artificial intelligence has been a game changer for the drone industry. AI can be used to autonomously execute safe flight plans, predict drone maintenance needs, and protect drones from cybersecurity attacks.
During flight, AI can also be used to detect and track objects of interest in real time through computer vision. This powerful technology is opening the door to new drone use cases that were previously unimaginable. It can help improve emergency response, animal conservation, perimeter security, site inspections, and much more.
Our free SkyGrid Flight Control app is equipped with computer vision to detect people, vehicles, animals, and other key objects in real time as drone operators autonomously surveil a defined area. Read on for the details.
What is computer vision?
Computer vision is a field of artificial intelligence that trains computers to identify, interpret, and track objects in images and video. The technology is driven by pattern recognition: models are trained on thousands to millions of images with labeled objects, which allows the algorithms to build a profile (e.g., color, shape) for each object and then recognize those objects in unlabeled images.
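To make the idea concrete, here is a toy sketch of that training loop: it builds a color "profile" for each label from labeled examples, then classifies a new image by the nearest profile. The labels, pixel values, and mean-color feature are illustrative assumptions; real computer vision models learn far richer features, but the principle is the same.

```python
def mean_color(pixels):
    """Average (R, G, B) over a list of pixels."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def train(labeled_examples):
    """Build one color profile per label from its labeled example images."""
    return {
        label: mean_color([px for pixels in examples for px in pixels])
        for label, examples in labeled_examples.items()
    }

def classify(profiles, pixels):
    """Return the label whose profile is closest to this image's mean color."""
    target = mean_color(pixels)
    return min(
        profiles,
        key=lambda label: sum((a - b) ** 2 for a, b in zip(profiles[label], target)),
    )

# Tiny synthetic "dataset": fire trucks are reddish, fields are greenish.
examples = {
    "fire_truck": [[(200, 30, 30), (220, 40, 35)], [(210, 25, 20)]],
    "field":      [[(40, 180, 60), (30, 200, 50)], [(50, 190, 45)]],
}
profiles = train(examples)
print(classify(profiles, [(205, 35, 30)]))  # reddish image -> "fire_truck"
```

Feeding the model more labeled examples sharpens each profile, which is why the volume of available training data matters so much.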
Thanks to advances in machine learning and neural networks, computer vision has made great leaps in recent years and can often surpass the human eye in detecting and labeling certain objects. One of the driving factors behind this growth is the amount of data we generate that can be used to train computer vision models more accurately.
How does SkyGrid’s computer vision work?
Our computer vision is powered by a well-known neural network called YOLO, short for You Only Look Once. The YOLO object detection model is especially popular for real-time, on-device systems because it is both small and fast while still maintaining high accuracy. The models have been trained to recognize 80 categories of common objects, such as people, cars, trucks, animals, and electronics. As a result, the SkyGrid Flight Control app achieves near real-time object detection (about 10-20 frames per second on an iPad) through a drone's live video stream.
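To illustrate what happens on every frame, here is a minimal sketch (not SkyGrid's actual pipeline) of the post-processing step YOLO-style detectors apply to their raw output: keep confident detections, then suppress overlapping duplicates via non-max suppression. The box format and thresholds are illustrative assumptions.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def non_max_suppression(detections, conf_thresh=0.5, iou_thresh=0.45):
    """detections: list of (box, confidence, label). Returns the kept subset."""
    kept = []
    candidates = sorted(
        (d for d in detections if d[1] >= conf_thresh),
        key=lambda d: d[1],
        reverse=True,
    )
    for box, conf, label in candidates:
        # Keep a box only if it doesn't heavily overlap a stronger same-label box.
        if all(iou(box, k[0]) < iou_thresh for k in kept if k[2] == label):
            kept.append((box, conf, label))
    return kept

# Two overlapping "person" boxes: the weaker duplicate is suppressed,
# and the low-confidence detection is filtered out entirely.
dets = [
    ((10, 10, 50, 80), 0.9, "person"),
    ((12, 12, 52, 82), 0.7, "person"),
    ((100, 20, 140, 90), 0.8, "car"),
    ((0, 0, 5, 5), 0.3, "person"),
]
print(non_max_suppression(dets))  # one "person" box and one "car" box remain
```

Running this filter per frame is cheap, which is part of why single-pass detectors like YOLO can keep up with a live video stream on mobile hardware.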
SkyGrid Flight Control also enables users to select a detected object and track it through a drone's live video feed. The tracking algorithm itself is fast, running at more than 60 frames per second on an iPad.
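A minimal sketch of how a tracker can follow a selected object from frame to frame: match the tracked box from the previous frame to the current frame's detection with the greatest overlap. This is an illustrative assumption, not SkyGrid's implementation; production trackers typically also use appearance features and motion models.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def track(previous_box, detections, min_iou=0.3):
    """Return the detection that best continues the track, or None if lost."""
    best = max(detections, key=lambda d: iou(previous_box, d), default=None)
    if best is not None and iou(previous_box, best) >= min_iou:
        return best
    return None

# The selected object moved slightly between frames; a distant detection
# elsewhere in the frame is ignored.
prev = (10, 10, 50, 80)
frame_dets = [(14, 12, 54, 82), (200, 200, 240, 260)]
print(track(prev, frame_dets))  # -> (14, 12, 54, 82)
```

Because the per-frame matching is just a handful of arithmetic comparisons, tracking can run faster than detection itself, consistent with the frame rates quoted above.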
What kind of use cases can drone computer vision enable?
Our computer vision capabilities can support a wide variety of recreational and commercial drone use cases. They can help identify a missing person during a search and rescue operation or detect potential threats near critical infrastructure, such as an oil pipeline or high-security building. They can count cars in parking lots to predict retail earnings or monitor wildlife to detect potential poachers. They can even help monitor social distancing to prevent the spread of COVID-19.
For enterprise customers, SkyGrid can train models to detect and track custom objects based on the mission objectives. For example, models could be trained to detect hurricane debris to help identify the most damaged areas in need of assistance. They could be trained to detect defects in solar panels to help improve the power output from a solar farm. Or they could be trained to detect sharks at the surface of the water to prevent attacks at popular beaches.
How will your computer vision capabilities evolve?
We’re constantly improving our computer vision models to make our object detection and tracking features more performant, robust, and specialized. Today, drone operators will see greater detection accuracy with a head-on view, which often requires flying at a lower altitude. In the coming months, we’re working to improve accuracy at higher altitudes and make the capability easier to use. Stay tuned for more updates!
Download SkyGrid Flight Control for free in the iPad App Store or learn more about our advanced enterprise features.