
Feline-inspired vision system developed to enhance object detection in drones

Cat's-eye-inspired vision system

Image: DeviantArt - SilentEmotionn

A new computer vision technology, inspired by the shape of a cat’s pupils, has been found to improve clarity and focus for autonomous systems in variable lighting conditions.

Self-driving vehicles, drones and other autonomous systems can see far further than the human eye, identifying and tracking objects hundreds of metres away and reacting accordingly. But in less-than-ideal weather and lighting conditions, such as rain and fog, their cameras and sensors can be compromised.

Nature-inspired vision for autonomous systems

Researchers have repeatedly taken inspiration from the natural world: bees show how seek-and-locate tasks can be completed with little neural hardware and energy use, and nocturnal dung beetles have informed low-light navigation in drones. Now a new computer vision system inspired by cats' eyes is being developed to help drones and autonomous vehicles see the world around them more clearly.

How the cat-inspired vision system works

Using advanced lasers and sensors modelled on the structure of a feline eye, the new vision system, developed by scientists at Gwangju Institute of Science and Technology (GIST), South Korea, could offer enhanced object detection and recognition in both light and dark environments.

During daylight hours, a cat's pupil narrows to a vertical slit that filters out light and reduces glare. In darkness, the pupil widens to let in more light, while a reflective layer called the tapetum lucidum bounces visible light back through the retina, increasing the amount reaching the photoreceptors.

The new system works similarly, using a slit-like aperture in bright conditions to filter out unnecessary light and focus on individual objects, and a reflective layer to improve visibility in low light.
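The optical principle can be illustrated with a toy simulation. This is a minimal sketch, not the GIST design: the slit aperture is modelled as a binary mask over an incoming light field, and the reflective layer as a simple gain factor (`tapetum_gain` is a hypothetical parameter for illustration).

```python
import numpy as np

def slit_aperture_mask(height, width, slit_width):
    """Binary mask: 1 inside a centred vertical slit, 0 elsewhere."""
    mask = np.zeros((height, width))
    start = (width - slit_width) // 2
    mask[:, start:start + slit_width] = 1.0
    return mask

def apply_aperture(light_field, mask, tapetum_gain=1.0):
    """Pass light through the aperture mask; tapetum_gain > 1 mimics
    a reflective layer boosting light capture in dark conditions."""
    return light_field * mask * tapetum_gain

# Bright scene: a narrow slit filters out most of the incoming light.
bright = np.full((8, 8), 10.0)
day = apply_aperture(bright, slit_aperture_mask(8, 8, 2))

# Dark scene: a fully open aperture plus reflective gain admits more light.
dim = np.full((8, 8), 1.0)
night = apply_aperture(dim, slit_aperture_mask(8, 8, 8), tapetum_gain=2.0)

print(day.sum(), night.sum())
```

The toy numbers show the trade-off: the slit discards most light when the scene is bright, while the open aperture and reflective gain recover signal when it is dim.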

“Robotic cameras often struggle to spot objects in busy or camouflaged backgrounds, especially when lighting conditions change,” says Young Min Song, Professor of Electronic Engineering at GIST and lead author of the study. “Our design solves this by letting robots blur out unnecessary details and focus on important objects.”

Machine learning neural network boosts object perception

While the feline-inspired vision system successfully blurred background objects to maintain focus on its targets, it was the addition of a machine-learning neural network, modelled on how the human brain processes information, that enabled the system to gauge an object's level of importance.
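The study's actual network is not described in the article, but the idea of ranking detected objects by importance can be sketched with a toy scorer. Everything here is an assumption for illustration: per-object feature scores (e.g. size, contrast, motion) are invented inputs, and a softmax turns them into normalised importance weights so the system could focus on the top-ranked object and blur the rest.

```python
import numpy as np

def importance_weights(scores):
    """Numerically stable softmax: converts raw per-object scores
    into normalised importance weights that sum to 1."""
    exp = np.exp(scores - np.max(scores))
    return exp / exp.sum()

# Hypothetical scores for three detected objects.
scores = np.array([0.2, 2.5, 0.7])
weights = importance_weights(scores)
target = int(np.argmax(weights))  # index of the object to focus on

print(target, weights.round(3))
```

In a real pipeline, the scores would come from a learned network rather than hand-picked numbers; the softmax step simply makes the "blur everything except the most important object" decision explicit.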

Practical applications and future potential

Before the system can be deployed in practical and commercial settings, the pixel resolution of its field of view still needs improvement, but the researchers hope it could eventually be integrated into drones, robots, other autonomous vehicles and surveillance applications, especially in constantly changing environments.
 
