
Development of ‘soft-touch’ colour-detecting robots helps to fill labour shortages

MSU's robotic apple picker. Image: MSU

The US National Science Foundation is backing a $366,000 study alongside top US universities to develop advanced robotics using stress-interpreting optical systems for delicate tasks.

As part of a three-year, $366,000 study, the US National Science Foundation is working with Michigan State, Stony Brook and Purdue Universities to develop the “next generation of robots,” said Wei Lin, an Assistant Professor of Civil Engineering at Stony Brook.

A stress-interpreting optical system is being developed to teach robots to distinguish and manipulate soft, fragile objects, using colour to perceive, visualise and interpret physical interactions.

Stress-interpreting optical system

By integrating photoelastic gel into a stress-interpreting optical system, the vision-based tactile gel-robots will be able to manipulate objects with an ultra-gentle touch, making them suitable for a range of tasks in the medical, surgical, marine research and domestic sectors, such as assistive robots that ‘handle’ and serve food.

“Our long-term goal is to fill the fundamental gap that exists between robotic tactile perception and human haptic sensing,” said Lin. “Specifically, this project will leverage the molecular design of fatigue-resistant photoelastic gels, the mechanical design of a stress-interpreting photometry system and the algorithm design of physics-informed machine learning.”
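
The article does not give implementation details, but the core idea of stress-interpreting photometry can be illustrated with a brief sketch: the colour fringes that appear in a photoelastic gel under load are mapped to an approximate stress field using the classical stress-optic law. The calibration constants, the hue-to-fringe-order mapping and the file name below are hypothetical placeholders, not the project’s actual design.

```python
# Minimal sketch (assumptions only, not the project's pipeline): estimate the
# principal-stress-difference map of a photoelastic gel layer from a colour image.
import numpy as np
import cv2  # OpenCV, assumed available

FRINGE_VALUE = 0.2   # hypothetical material fringe value f_sigma (N/mm per fringe)
GEL_THICKNESS = 3.0  # hypothetical gel thickness t in mm

def stress_map_from_image(bgr_image: np.ndarray) -> np.ndarray:
    """Convert a camera frame of the gel into a rough stress-difference map."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    hue = hsv[:, :, 0].astype(np.float32)   # OpenCV hue range is 0-179
    # Toy calibration: treat hue as proportional to fringe order N (up to 3 fringes).
    fringe_order = hue / 179.0 * 3.0
    # Stress-optic law: sigma1 - sigma2 = N * f_sigma / t
    return fringe_order * FRINGE_VALUE / GEL_THICKNESS

if __name__ == "__main__":
    frame = cv2.imread("gel_contact.png")    # hypothetical test image
    if frame is not None:
        stress = stress_map_from_image(frame)
        print("Peak stress difference (N/mm^2):", float(stress.max()))
```

In the actual project this mapping would be learned and corrected by the physics-informed machine learning Lin describes, rather than fixed by a hand-tuned calibration as in this toy example.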

Apple picking robots

The development of soft-touch robots is not unique to this project, however. Researchers have been using computer vision to design robots with softer ‘hands’ for years, helping to improve efficiency and reduce reliance on manual labour in a range of industrial applications. Michigan State University (MSU), for example, has been collaborating with the US Department of Agriculture (USDA) to develop a robotic apple picker that uses AI to harvest perfectly ripe apples and engineering to prevent them from being bruised in the process.

The collaboration developed its first robotic apple picker in 2021, but recent advances in AI have enabled a new, improved prototype. “Everything keeps getting better,” said Zhaojian Li, an Associate Professor at MSU’s College of Engineering. “Steady progress has been made over the years and we can pick more high-quality apples faster.”

Vision technology

The apple-picking robot uses an RGB-D camera, which captures both the colour of the apples (the RGB part) and their depth, or location (the D), and combines these images with AI to identify perfectly ripe fruit. When one of the robotic arms approaches a selected apple, its soft silicone gripper gently grasps it before suction pulls it from the tree.
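
As a rough illustration of how the colour and depth channels work together (the article does not describe MSU’s actual detection model, which relies on AI rather than fixed rules), the sketch below colour-thresholds a frame for red fruit and reads off each candidate’s depth so an arm could be aimed at it. All thresholds and function names are hypothetical.

```python
# Minimal sketch (assumptions only, not MSU's system): find candidate ripe apples
# in an RGB-D frame by colour, and report their pixel centre plus depth.
import numpy as np
import cv2  # OpenCV 4.x assumed

def find_ripe_apples(rgb: np.ndarray, depth_m: np.ndarray, min_area: int = 200):
    """Return (x, y, depth) targets for red regions; thresholds are illustrative."""
    hsv = cv2.cvtColor(rgb, cv2.COLOR_RGB2HSV)
    # Toy 'ripeness' rule: strongly red, reasonably saturated pixels (two hue bands).
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    targets = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue  # ignore specks of colour noise
        x, y, w, h = cv2.boundingRect(c)
        cx, cy = x + w // 2, y + h // 2
        targets.append((cx, cy, float(depth_m[cy, cx])))  # pixel centre + range in metres
    return targets
```

The depth value is what lets the robot translate a detection in the image into a physical reach target for the arm and gripper.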

Currently, the robot can pick approximately one ton of apples a day, compared with the six tons a skilled human picker can manage, but “this is just the beginning”, assures Jon Affholter, who leads the MSU Innovation Center’s commercialisation program. The technology “has great potential locally, state-wide and even nationally to make an impact on the long-term labour shortage threatening the speciality crop industry,” said Affholter.

With the support of a $3.5 million USDA project, the robotic system is expected to continue improving in speed and precision over the next four years, just in time for high-volume farm operations contending with labour shortages.
 
