Crop farming is not for everyone. Long hours, weeds, pests and the vagaries of the weather are just some of the challenging and often random conditions that conspire to make reaping a decent financial reward difficult.
But cutting-edge automation has already started to change this picture for tech-savvy farmers. For instance, satellite imagery and unmanned aerial vehicles (UAVs) equipped with image sensors are providing accurate crop height and health data. Fed through machine learning models, this data helps farmers estimate biomass and yield and make important farm management decisions, such as when to rotate crops. Meanwhile, vertical farms and hi-tech greenhouses that use various sensor technologies, automated environmental controls and even robotic pollination and picking are revealing a path to dealing with the combined challenges of labour shortages and increased food demand.
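To make the idea concrete, the sketch below shows one common approach: computing a vegetation index (NDVI) from multispectral imagery and mapping it to a biomass estimate. The band arrays and calibration coefficients here are purely illustrative assumptions, not any vendor's actual pipeline.

```python
# Hypothetical sketch: estimating relative biomass from multispectral imagery
# via NDVI, a standard vegetation index. All values here are illustrative.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalised difference vegetation index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

# Toy reflectance rasters standing in for a UAV or satellite scene
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.2, size=(100, 100))
nir = rng.uniform(0.4, 0.8, size=(100, 100))

mean_ndvi = ndvi(nir, red).mean()

# A hypothetical calibration: biomass in t/ha as a linear function of mean
# NDVI, fitted elsewhere against ground-truth field samples.
slope, intercept = 12.0, -2.5
print(f"Estimated biomass: {slope * mean_ndvi + intercept:.2f} t/ha")
```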
One company working in this agribot space is Arugga, based in Israel. Its Polly robot pollinates tomato plants in high-tech greenhouses. Tomato plants are self-pollinating, meaning each flower contains both male and female parts, so it only needs to be shaken to induce pollination. “Greenhouses traditionally use industrial bumblebees or get workers to shake the supporting trellises or apply vibrating wands to the plants for pollination,” says Arugga CTO Ariel Elbaz. “Neither method is ideal.”
Polly, instead, uses four standard webcams attached to gimbals that move on two axes to image tomato plants over multiple frames and provide depth information. The robot uses a deep neural network, trained on thousands of flower examples, to detect flowers ripe for pollination and measure their approximate distance. It then sends pulses of compressed air from about 20-40cm away to pollinate the flower.
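In outline, the control loop is simple: detect a ripe flower, check it sits within the working range, fire. The sketch below captures that logic under stated assumptions; the Detection structure, the distance estimate and the fire_pulse interface are hypothetical placeholders, not Arugga's actual software.

```python
# Illustrative control loop for a Polly-style pollinator. The detector
# output format and air-valve interface are assumed, not Arugga's own.
from dataclasses import dataclass

@dataclass
class Detection:
    x: float           # image coordinates of the flower centre
    y: float
    distance_m: float  # approximate range from multi-frame depth
    ripe: bool         # classified as ready for pollination

PULSE_MIN_M, PULSE_MAX_M = 0.20, 0.40  # working range quoted in the article

def process_frame(detections: list[Detection], fire_pulse) -> int:
    """Fire an air pulse at every ripe flower within the working range."""
    fired = 0
    for det in detections:
        if det.ripe and PULSE_MIN_M <= det.distance_m <= PULSE_MAX_M:
            fire_pulse(det.x, det.y)
            fired += 1
    return fired
```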
Polly has been pollinating tomatoes at Costa Group’s multi-million-dollar greenhouse facility in Australia for two years, and the company has since expanded operations to other greenhouses in Australia, the US and Finland. Results so far have been encouraging, says Arugga, with Polly providing up to 15% higher yield versus manual pollination and up to 7% higher yield than bumblebees.
The company is working to make the robot fully autonomous (it currently has to be moved by an operator from row to row), as well as faster and more accurate. “We want to get the accuracy to 100% pollination of reachable flowers and have a single robot pollinate about a hectare,” says Elbaz. “Even small improvements will increase the farmer’s yields and margins by much more than that.” The Arugga team also aims to give growers further functions through additional modules mounted alongside the pollinator, such as a plant-lowering arm due for release later this year, and plant monitoring tools for pest and disease detection, yield prediction and crop management.
Bringing farming robots outdoors
The controlled conditions in greenhouses are conducive to vision-based robotic solutions such as Polly and others, including UK-based Fieldwork Robotics’ raspberry-picking robot – recently deployed commercially in two locations in Portugal – and US-based Four Growers’ robots that can harvest tomatoes, peppers and cucumbers.
Traditional outdoor, open-field farming is a different ball game in which unpredictable and uncontrollable factors come into play. However, the alignment of multiple technological advances, labour shortages, and public and private investment has brought robotic outdoor farming to the point where commercial solutions are starting to enter the market. These solutions are generally powered by sophisticated deep neural networks and advanced cameras and sensors, which combine with improved battery technology to provide farmers with hi-tech automated alternatives for traditional labour-intensive tasks.
For instance, Farming Revolution – a German company founded in 2020 by former Bosch roboticists and computer vision experts – focuses on weeding. “Advances in deep learning made it possible to recognise plants accurately and thus made it clear that a sustainable alternative to the use of chemicals for weeding was within reach,” says co-founder and co-CEO Maurice Gohlke. The company’s Farming GT robot uses a multispectral camera with an active light to take high-resolution images of plants, then applies deep convolutional neural nets to discern between crops and weeds. Guided by this classification, the robot’s precise mechanical arms, each tipped with a rotating chopper or blade, cut weeds off at the stem.
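A minimal sketch of the crop-versus-weed step might look like the following, assuming a trained per-pixel classifier; the model file, class index and camera interface are hypothetical, not Farming Revolution's actual stack.

```python
# Sketch: per-pixel crop/weed classification, then blob centroids for the
# chopper to target. Model path and label index are assumed placeholders.
import torch
import numpy as np
from scipy import ndimage

WEED_CLASS = 1  # assumed label index for 'weed'

model = torch.jit.load("crop_weed_segmenter.pt")  # hypothetical model file
model.eval()

def weed_targets(image: np.ndarray) -> list[tuple[float, float]]:
    """Return pixel centroids of weed blobs in a (C, H, W) multispectral image."""
    x = torch.from_numpy(image).float().unsqueeze(0)  # add batch dimension
    with torch.no_grad():
        classes = model(x).argmax(dim=1).squeeze(0).numpy()
    mask = classes == WEED_CLASS
    blobs, n = ndimage.label(mask)  # group weed pixels into connected blobs
    return ndimage.center_of_mass(mask, blobs, range(1, n + 1))
```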
Farming Revolution’s Farming GT robot uses a multispectral camera to take high-res images of plants
“The setup is able to discern more than 80 types of plants in different light conditions, and it performs well when plants are overlapping, partially occluded by other plants or earth,” says Gohlke, who is currently looking to reduce component costs and simplify hardware in order to scale up the business. “Today, the robots are rented to farmers that operate them in different crops – we have 12 robots in operation, all in Europe.”
At a similar level of commercialisation is the Small Robot Company (SRC), founded by a group of UK farmers, engineers, scientists and service designers in 2017.
“We’ve got 50 farms signed up for this year and quite a large number of customers waiting in the wings,” says Chief Marketing Officer Sarra Mander. “So the barrier is not farmer appetite, it’s stabilising the technology and seeking additional funding in order to scale up.”
SRC recently launched the latest iteration of its farming robot: Tom V4. Providing what the team calls ‘per-plant intelligence’, Tom V4 can monitor a field in granular detail, down to individual water droplets on leaves. It achieves this with eight 6MP cameras mounted 95cm above the ground, capturing images at 3fps with a ground sample distance of 0.28mm per pixel – among the highest resolutions of any crop-scanning technology.
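That ground sample distance follows from the standard pinhole relation, GSD = pixel pitch × height / focal length. As a back-of-envelope check – the pixel pitch and focal length below are assumed values, not SRC's published specs – the quoted figure is consistent:

```python
# Sanity check of the quoted ground sample distance using the pinhole
# relation. Pixel pitch and focal length are assumptions for illustration.
pixel_pitch_mm = 0.0024   # 2.4 µm pixels (assumed)
focal_length_mm = 8.0     # assumed lens
height_mm = 950           # cameras mounted 95 cm above the ground

gsd_mm = pixel_pitch_mm * height_mm / focal_length_mm
print(f"GSD ≈ {gsd_mm:.3f} mm per pixel")  # ≈ 0.285 mm, matching the article
```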
These images and their metadata (as well as robot location and diagnostic information) are fed into the company’s AI engine Wilma to build an ultra-high-definition map of the field. “It’s not quite as simple as a computer vision system that just looks at a bounding box around the plant and says what it is,” explains SRC Robotics Engineer Dan Rowe. “You can also look at other trends to do with location and information you already know about the field – so it’s taking an agronomist’s intuition and really catapulting it to the next level.”
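One way to read Rowe's point is that a detector score can be fused with what is already known about that spot in the field. The toy example below illustrates the idea with a simple Bayes-style odds update; the prior map and fusion rule are hypothetical, not Wilma's actual method.

```python
# Toy illustration: fusing a per-image detection score with a field-level
# prior. The fusion rule and numbers are assumptions, not SRC's algorithm.
def fuse(detector_prob: float, location_prior: float) -> float:
    """Combine detector confidence with a prior that this part of the
    field historically harbours the weed in question (odds product)."""
    odds = (detector_prob / (1 - detector_prob)) * \
           (location_prior / (1 - location_prior))
    return odds / (1 + odds)

# A marginal detection (55%) in a known weed hotspot (80% prior)
print(f"{fuse(0.55, 0.80):.2f}")  # -> 0.83, enough to flag for treatment
```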
Currently, farmers are using data from Tom for weed management. SRC integrates with and optimises farmers’ existing sprayer equipment so that they only treat those areas of the field where there are weeds. “It’s not highly precise at this stage because the area sprayed is greater than a single plant,” says Mander. “But it’s very much more accurate than what a farmer would otherwise be able to do.”
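Because conventional sprayers switch in boom sections rather than per plant, turning a weed map into spray commands amounts to flagging any section whose cells contain weeds. The sketch below shows that reduction; the section width and grid format are assumptions, not SRC's integration spec.

```python
# Minimal sketch: convert a weed map into on/off boom-section commands.
# Section width and map representation are illustrative assumptions.
import numpy as np

SECTION_WIDTH_M = 3.0  # assumed boom-section width

def section_plan(weed_grid: np.ndarray, cell_m: float) -> np.ndarray:
    """weed_grid: boolean array (passes x cells across the boom).
    Returns one on/off flag per boom section per pass."""
    cells_per_section = int(SECTION_WIDTH_M / cell_m)
    n_sections = weed_grid.shape[1] // cells_per_section
    trimmed = weed_grid[:, :n_sections * cells_per_section]
    per_section = trimmed.reshape(weed_grid.shape[0], n_sections,
                                  cells_per_section)
    return per_section.any(axis=2)  # spray a section if any cell has weeds
```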
SRC’s Tom V4 uses eight 6MP cameras to monitor a field, down to individual water droplets on leaves
More exciting is SRC’s long-term plan for the technology. Tom V4 is a distributed and modular robot split into two halves, where each unit can function independently. “We can expand it and add an extra chassis, and just double the width of the robot,” says Rowe. “Or we can separate it slightly so that we can add depth and a different tool underneath.” This means the robot can handle different field sizes, different crops and can perform different farming tasks. Testing and trials are under way where the robot performs precision weeding, spraying and planting, and (with the addition of multispectral cameras) blackgrass and slug monitoring and treatment.
One problem facing young start-ups such as Farming Revolution, SRC and many of their competitors is that although their robots are autonomous, they cannot perform their roles unsupervised. This is because obstacle avoidance, particularly human obstacle avoidance, is a concern that is both hard to solve and a health & safety minefield.
“We have dual GPS on all of our robots and we’ve tested out a few different solutions for RTK [real-time kinematic] corrections, and that allows us to get down to a localisation accuracy, purely from GPS, of around 5mm,” Rowe says. “We also use a geofence when the robot is manoeuvring, giving it a low-definition map of the field so that when it’s navigating, it can never go outside of the field boundaries, and odometry information from the IMUs [inertial measurement units] and wheel encoders give the robot its reference location and heading – but for now we still have to have operators monitoring the robots.”
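The geofence Rowe describes reduces, at its core, to a point-in-polygon test against the field boundary. A minimal sketch, using the classic ray-casting method on illustrative local-grid coordinates:

```python
# Simple ray-casting geofence test: refuse any position outside the field
# boundary polygon. Coordinates are illustrative local-grid metres.
def inside_geofence(x: float, y: float,
                    boundary: list[tuple[float, float]]) -> bool:
    """Ray-casting point-in-polygon test against the field boundary."""
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

field = [(0, 0), (120, 0), (120, 80), (0, 80)]
assert inside_geofence(60, 40, field) and not inside_geofence(130, 40, field)
```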
Founded by two robotics engineers who had spoken with farmers at the Asparagus Festival in Pontonx-sur-l’Adour in 2011, French company Naïo Technologies has had more time to work on the obstacle avoidance problem. In 2022, a few years after their market debut, Naïo’s Dino and Ted robots became the first large-scale agricultural robots certified to work unsupervised in fields, joining the company’s smaller farming assistant Oz, which has been working unsupervised since 2016. Their localisation and obstacle avoidance systems are similar to those employed by Farming Revolution and SRC: RTK GPS signals autoguide the robots, geofencing keeps them within the bounds of the field, and sensors and mechanical bumpers back them up. The tool-carrying Orio robot also uses lidar to increase safety levels and work faster.
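Purely as an illustration of how a lidar safety envelope can work, the sketch below stops the robot if any return falls inside a corridor ahead of it. The thresholds and scan format are assumptions, not Naïo's published safety specification.

```python
# Hedged sketch of a lidar safety stop: halt if any return lies inside a
# corridor ahead of the robot. All thresholds are illustrative assumptions.
import math

STOP_DIST_M = 2.0      # assumed emergency-stop range
CORRIDOR_HALF_W = 0.9  # assumed half-width of the robot's path corridor

def must_stop(scan: list[tuple[float, float]]) -> bool:
    """scan: (bearing_rad, range_m) pairs, 0 rad = straight ahead."""
    for bearing, rng in scan:
        ahead = rng * math.cos(bearing)          # distance along travel axis
        lateral = abs(rng * math.sin(bearing))   # offset from centreline
        if 0 < ahead < STOP_DIST_M and lateral < CORRIDOR_HALF_W:
            return True
    return False
```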
Naïo Technologies’ tool-carrying Orio robot uses lidar to increase safety levels and work faster
“We stepped beyond testing because we focused on simple tasks where there are labour shortages, and because of solid and deep work on safety and easy-to-use specs,” Marketing Content Manager Flavien Roussel explains. With this strategy, Naïo has established a distribution network in more than 25 countries, and will have 300 robots in service by early 2023.
From field robots to autonomous tractors
Though the autonomous Monarch Tractor and tractors from Yanmar, AgXeed and others were already on the market in 2022, a watershed moment for farming robots came when John Deere revealed its first fully autonomous tractor, the first such machine ready for series production, to fanfare at tech show CES 2022. As one would expect from the largest agricultural machinery manufacturer in the world, significant R&D went into the autonomous 8RX, which has already started shipping to farmers – including a highly sophisticated perception system that, among other things, tackles obstacle avoidance.
“It uses two NVIDIA GPUs, where RGBD images (colour plus distance measurements) from six pairs of cameras are fed into a deep neural network for ranging and obstacle detection around the vehicle,” says Cristian Dima, Lead of Advanced Algorithms at John Deere. “It classifies each pixel about every 100 milliseconds at the moment, and then the decision is made on whether the vehicle can proceed or not.” GPS-based guidance for autonomous steering and a geofence offer an extra level of safety, the result being that farmers can leave the tractor to work unsupervised.
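Following Dima's description, the per-pass decision can be pictured as: classify every pixel, then refuse to proceed if any obstacle-class pixel sits within stopping range. The class labels, thresholds and array shapes below are assumptions, not John Deere's implementation.

```python
# Illustrative proceed/stop decision from per-pixel classification plus
# stereo depth. Label indices and the stop range are assumed values.
import numpy as np

OBSTACLE_CLASSES = {2, 3}   # e.g. person, vehicle (hypothetical labels)
STOP_RANGE_M = 12.0         # assumed stopping envelope

def can_proceed(class_map: np.ndarray, depth_map: np.ndarray) -> bool:
    """class_map, depth_map: (H, W) arrays from one ~100 ms inference pass."""
    is_obstacle = np.isin(class_map, list(OBSTACLE_CLASSES))
    return not np.any(is_obstacle & (depth_map < STOP_RANGE_M))
```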
“The first autonomous solution we have introduced for the autonomous 8RX is focused on tillage,” says Dima. “But tractors or combines execute some very complex operations and pull many different kinds of implements. That means that the space of problems is quite large, so it’s important to pick what applications are most important and focus on those first.”
John Deere’s 8RX feeds two NVIDIA GPUs from six pairs of cameras, inset, for obstacle detection
Like many of its competitors, John Deere has also targeted weed control as a key application. The company’s See & Spray Ultimate system is currently available on self-propelled sprayers. Powered by NVIDIA GPUs, the system uses 36 cameras mounted on a 40m carbon-fibre boom.
Trained on millions of images, it uses deep learning to classify images of the field that’s being sprayed, discerning between plants and weeds. “It’s automatically computing where the weed is, how quickly the sprayer is moving and exactly what is the right time at which the sprayer nozzles need to be turned on, so that chemicals end up on the weed,” says Dima. “These sprayers also go fast – the top speed at which this sprayer can work is up to 20 km/h.” Initial tests reveal that See & Spray saves about 66% on herbicides.

With all of these technologies from John Deere and many other agtech innovators on or coming to market soon, Dima feels computer vision and AI are on the cusp of revolutionising agriculture: “We’re in a very exciting time where we have the right sensors, we have the right computing devices, and we have quite a bit of knowledge about how to actually make deep learning-based solutions and very complex computer vision systems enable farmers to be more productive and sustainable, while also minimising environmental impacts.”
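As a back-of-envelope illustration of the timing arithmetic Dima describes above: a weed seen some distance ahead of the nozzle must trigger a turn-on command delayed by the travel time, minus whatever processing latency the system incurs. The lead distance and latency below are illustrative assumptions.

```python
# Worked sketch of the nozzle-timing arithmetic. Only the 20 km/h top speed
# comes from the article; lead distance and latency are assumed.
def nozzle_delay_s(lead_distance_m: float, speed_kmh: float,
                   latency_s: float) -> float:
    """Time to wait before opening the nozzle after a weed is detected."""
    speed_ms = speed_kmh / 3.6
    return lead_distance_m / speed_ms - latency_s

# At the quoted 20 km/h top speed (~5.6 m/s), a weed spotted 1.5 m ahead of
# the nozzle with 100 ms of processing latency leaves ~0.17 s to act.
print(f"{nozzle_delay_s(1.5, 20.0, 0.1):.2f} s")
```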