
Computer vision brings autonomous flight closer

By James Wormald
UAVOS’ unmanned helicopter uses computer vision to test its GNSS-free autopilot system

With UAV manufacturers and flight system developers like UAVOS utilising computer vision to navigate GNSS-free environments, and NASA collecting huge data clouds to help them, are airborne transport systems about to take off?

With ground-level autonomous travel gaining trust, traction and most of the headlines, many developers of UAVs (unmanned aerial vehicles) and even manned autonomous flight systems (autopilots) are looking to the open space above to provide the future of transport and travel.

The majority of UAVs use GNSS (global navigation satellite system) alongside inertial sensors to track their speed, direction and position. But in locations where GNSS reception is weak – such as remote or mountainous areas, or near physical obstructions like trees and buildings – or in situations where the signals are in danger of being jammed, the use of GNSS can’t be guaranteed.

Computer Vision to provide GNSS-free autonomous flight system

UAVOS is a manufacturer and developer of unmanned drones and autopilot systems, and has recognised the problem. Leveraging computer vision and AI, UAVOS’ Engineering Service tested the developer’s advanced avionics autopilot in an unmanned helicopter.

An onboard navigation module with deep learning algorithms provided the autopilot with geospatial coordinates, bringing accurate navigation as well as safe take-off and landing. The project “will help our clients leap forward in drone technology,” said UAVOS CEO, Aliaksei Stratsilatau. With “truly autonomous flight in complex environments, we are empowering businesses and organisations to leverage drones in new ways.”
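UAVOS has not published the internals of its navigation module, but the behaviour the article describes implies a fallback: trust GNSS while it is available, and switch to vision-derived coordinates when it is not. Purely as an illustrative sketch – every name below is hypothetical, not UAVOS code – that selection logic might look like this:

```python
from dataclasses import dataclass

@dataclass
class Fix:
    """A position estimate with its originating sensor."""
    lat: float
    lon: float
    source: str

def select_fix(gnss_fix, vision_fix, gnss_signal_ok):
    """Prefer GNSS when its signal is trusted; otherwise fall back to the
    vision-derived estimate, as a GNSS-free autopilot must."""
    if gnss_signal_ok and gnss_fix is not None:
        return gnss_fix
    return vision_fix

# Hypothetical values: in the real system, the vision module's deep
# learning algorithms would produce the geospatial coordinates.
gnss = Fix(43.651, -79.347, "gnss")
vision = Fix(43.650, -79.348, "vision")

print(select_fix(gnss, vision, gnss_signal_ok=False).source)  # → vision
```

In practice the switch would be driven by continuous signal-quality metrics rather than a single boolean, but the structure – two independent position sources feeding one autopilot – is the point.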

NASA research aids computer vision-based flight testing

In unmanned flight technology, UAVOS is not out on its own. Many other manufacturers and system developers recognise the advantage computer vision offers. Data, however, is the key stumbling block many face. “Data is the fuel for machine learning,” said Nelson Brown, lead researcher on NASA’s AIRVUE project.

AIRVUE (Airborne Instrumentation for Real-world Video of Urban Environments) is a project that uses a recently developed camera pod fitted with sensors to collect large, diverse and, most importantly, accessible visual datasets of weather and other airborne obstacles. The data is compiled into a cloud repository that developers of autonomous drones and air transport systems can use to evaluate their aircraft’s performance during testing.
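NASA has not published AIRVUE’s data format, so as a purely illustrative sketch, a shared repository of labelled flight clips might be queried like this (the manifest schema, file names and labels are all assumptions):

```python
import json

# Hypothetical manifest for an open airborne-video dataset: each entry
# points at a clip plus weather/obstacle labels, letting developers pick
# realistic flight scenarios to evaluate their perception stack against.
manifest = json.loads("""
[
  {"clip": "flight_001.mp4", "weather": "fog",   "obstacles": ["bird"]},
  {"clip": "flight_002.mp4", "weather": "clear", "obstacles": []}
]
""")

def clips_with_conditions(entries, weather=None):
    """Filter evaluation clips by recorded weather condition."""
    return [e["clip"] for e in entries
            if weather is None or e["weather"] == weather]

print(clips_with_conditions(manifest, weather="fog"))  # → ['flight_001.mp4']
```

The value of an open dataset like this is exactly that every developer filters the same labelled scenarios, so results are comparable across companies.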

“Accessible datasets have been essential to advances in driver aids and self-driving cars,” said Brown, “but so far, we haven’t seen open datasets like this in aviation. We hope to inspire innovation by providing the computer vision community with realistic flight scenarios.”

“When a company conducts data collection on their own, it’s unlikely they share it with other manufacturers. NASA’s role facilitates this accessible dataset for all companies in the advanced air mobility industry,” said NASA. Once the design is refined through evaluation and additional testing, the team hopes to make more pods that ride along on various types of aircraft to collect more visuals and grow the digital repository of data.
