Drone engineers have already taken inspiration from bees, but now nocturnal dung beetles are being investigated to improve low-light navigation in drones, robots and autonomous vehicles.
The Australian team demonstrated that using the Milky Way as an orientation cue makes optical instruments more robust to the motion blur caused by rotational vibration and stabilisation artefacts.
AI sensor can accurately measure orientation of the Milky Way in low light
The dung beetle is the first known species shown to use the Milky Way to navigate at night, using the bright band of stars as a reference point to roll balls of dung in a straight line.
Swedish researchers made this discovery in 2013, and now engineers from the University of South Australia are modelling the same technique to develop an AI sensor that can accurately measure the orientation of the Milky Way in low light.
The team used computer vision to demonstrate that the broad stripe of light that forms the Milky Way is far less affected by motion blur than individual stars.
“Nocturnal dung beetles move their head and body extensively when rolling balls of manure across a field, needing a fixed orientation point in the night sky to help them steer in a straight line,” said remote sensing engineer Professor Javaan Chahl.
“Their tiny compound eyes make it difficult to distinguish individual stars, particularly while in motion, whereas the Milky Way is highly visible.”
Exposing the optical hardware to low- and high-frequency vibrations
Using a Raspberry Pi HD camera with a 6 mm wide-angle lens (CS mount) fixed onto the roof of a vehicle, the researchers carried out a series of experiments, capturing images of the Milky Way while the vehicle was both stationary and moving. Using information from those images, they developed a computer vision system that reliably measures the orientation of the Milky Way, the first step towards building a navigation system.
Initial imagery was captured while the vehicle was stationary but with the engine idling, exposing the optical hardware to engine vibration. The vehicle was then driven at approximately 40 km/h, subjecting the optical hardware to both higher-frequency engine vibration and lower-frequency road vibration.
The route taken by the vehicle contained a series of 90-degree bends and followed mostly coarse, unsealed roads. Additional images were captured with the vehicle stationary and the engine switched off, providing reference data free of the most significant sources of motion blur.
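The article does not report the camera settings used, but the capture setup can be reproduced in outline. Here is a minimal sketch using the picamera2 library on a Raspberry Pi, where the resolution, exposure time and gain are illustrative assumptions rather than the researchers' actual values:

```python
# Minimal night-sky capture sketch using the picamera2 library on a
# Raspberry Pi. Resolution, exposure time and gain are illustrative
# assumptions, not values reported by the researchers.
from picamera2 import Picamera2

picam2 = Picamera2()
config = picam2.create_still_configuration(
    main={"size": (4056, 3040)},  # full-resolution frame on the HQ camera sensor
    controls={
        "ExposureTime": 5_000_000,                      # 5 s exposure, in microseconds
        "AnalogueGain": 8.0,                            # high gain for a dark sky
        "FrameDurationLimits": (5_000_000, 5_000_000),  # permit the long frame time
    },
)
picam2.configure(config)
picam2.start()
picam2.capture_file("milky_way.jpg")
picam2.stop()
```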
The Milky Way is a more resilient target for position and orientation
The Milky Way orientation algorithm (MWOA) first takes an RGB image and applies a series of image processing techniques, including noise removal and thresholding.
Once the binary image is generated, the algorithm calculates the orientation of the extracted Milky Way region; the centroid coordinates are computed as the mean of the region's pixel coordinates in the x and y directions, respectively.
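The full MWOA source is not reproduced in the article, but the steps it describes (noise removal, thresholding, centroid, angle) map onto standard image operations. The following Python/OpenCV sketch assumes the orientation is derived from second-order central image moments, a common technique that the published algorithm may refine:

```python
# Hedged sketch of a Milky Way orientation estimator built from the steps
# described in the article. The moment-based angle formula is a standard
# technique and an assumption here; the published MWOA may differ in detail.
import math
import cv2

def milky_way_orientation(path: str) -> tuple[float, float, float]:
    """Return (centroid_x, centroid_y, angle_deg) of the bright band."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)

    # Noise removal: a wide Gaussian blur suppresses individual stars
    # while preserving the broad stripe of the Milky Way.
    smooth = cv2.GaussianBlur(gray, (51, 51), 0)

    # Thresholding: Otsu's method picks a cutoff separating the band
    # from the background sky glow, yielding a binary image.
    _, binary = cv2.threshold(smooth, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Centroid = mean pixel coordinates of the region; orientation from
    # the second-order central moments of the binary region.
    m = cv2.moments(binary, binaryImage=True)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    angle = 0.5 * math.atan2(2 * m["mu11"], m["mu20"] - m["mu02"])
    return cx, cy, math.degrees(angle)

print(milky_way_orientation("milky_way.jpg"))
```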
The researchers found that as the magnitude of blur increases, individual star intensities drop until the stars are barely visible on screen or in print, highlighting how sensors struggle to detect the signal of individual stars under motion blur.
By contrast, the Milky Way is a more resilient target for position and orientation. “The results show that the motion blur from real images undergoing real motion on a vehicle, as well as synthetic images, has minimal effect on the calculation of angles,” the researchers said in a recent paper published in Biomimetics.
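This robustness is straightforward to probe on synthetic data of the kind the paper mentions: applying a linear motion-blur kernel to a night-sky frame and re-running an estimator like the sketch above should leave the recovered band angle nearly unchanged while point stars wash out. A hedged sketch, with the blur length and direction as arbitrary test values:

```python
# Sketch: apply synthetic linear motion blur and compare the recovered
# band angle before and after. Blur length and direction are arbitrary
# test values, not those used in the paper.
import numpy as np
import cv2

def motion_blur(img, length=31, angle_deg=0.0):
    """Blur with a one-pixel-wide line kernel of the given length and angle."""
    kernel = np.zeros((length, length), dtype=np.float32)
    kernel[length // 2, :] = 1.0          # horizontal line through the centre
    rot = cv2.getRotationMatrix2D((length / 2, length / 2), angle_deg, 1.0)
    kernel = cv2.warpAffine(kernel, rot, (length, length))
    kernel /= kernel.sum()                # normalise to keep brightness constant
    return cv2.filter2D(img, -1, kernel)

sharp = cv2.imread("milky_way.jpg", cv2.IMREAD_GRAYSCALE)
blurred = motion_blur(sharp, length=31, angle_deg=20.0)
cv2.imwrite("milky_way_blurred.jpg", blurred)
# Feeding both frames to milky_way_orientation() should yield nearly
# identical angles, while the individual stars in the blurred frame fade.
```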
Lead author Yiting Tao said the orientation sensor could serve as a backup method for stabilising satellites and could help drones and robots navigate in low light, even when movement and vibration cause substantial blur.
“For the next step I want to put the algorithm on a drone and allow it to control the aircraft in flight during the night,” said Tao.