In outdoor large-scale agriculture, industrial machinery has long been used to tend very large crops with little manpower: a single operator with the appropriate tractor can plow an entire field in less than a day. However, this kind of machinery is not appropriate for all environments, such as greenhouses.
Since finding staff to tend greenhouses has become increasingly difficult over the years, people have been working to automate some greenhouse tasks using robots. Indeed, mobile platforms and lightweight robots can take advantage of the reasonably well organised structure of greenhouses to perform many helpful tasks autonomously.
In this article, we discuss five tasks that robots have been shown to perform in a greenhouse to support the existing workforce.
Source: Ali Shafiekhani.
#1 - Ground-Level Work
It is hard to argue that work at the ground level of a plantation is anything but unpleasant for human workers. It often requires people to crouch for prolonged periods, which in the long term may result in injury. In many greenhouses, crops are grown on multiple shelf levels and the ground level can even be out of reach.
The advantage of using robots for this kind of task is that they do not get uncomfortable and can be mounted on hardware that extends their reach without any falling hazard. Equipped with the appropriate tool, be it a fancy seed dispenser or a simple shovel, robots can adapt the pick-and-place routines regularly used in industry to sow seeds or spread fertilizer. The latest developments in AI have even shown that images can be used for plant phenotyping [1, 2]. Using this technology, it also becomes possible to identify and remove undesirable weeds from the plantation.
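As a rough illustration, a sowing routine can be reduced to a pick-and-place loop over a grid of target positions. The `robot` interface below (`move_to`, `dispense_seed`) is hypothetical, standing in for whatever motion API a given platform actually exposes:

```python
from itertools import product

def sowing_waypoints(rows, cols, spacing_m, origin=(0.0, 0.0)):
    """Generate (x, y) coordinates for a regular sowing grid.

    spacing_m is the seed spacing in metres and origin the corner of
    the bed in the robot's base frame; both are assumptions for this
    sketch, not values from any specific platform.
    """
    x0, y0 = origin
    return [(x0 + c * spacing_m, y0 + r * spacing_m)
            for r, c in product(range(rows), range(cols))]

def sow_bed(robot, waypoints, safe_z=0.20, sow_z=0.02):
    """Standard pick-and-place loop: approach, descend, actuate, retract."""
    for x, y in waypoints:
        robot.move_to(x, y, safe_z)   # approach from above
        robot.move_to(x, y, sow_z)    # descend to sowing height
        robot.dispense_seed()         # actuate the end-of-arm tool
        robot.move_to(x, y, safe_z)   # retract before the next point
```

The same loop serves for fertilizing by swapping the tool actuation call; the grid could equally be replaced by detections from a phenotyping camera.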
Source: Martin Leroux.
#2 - Picking / Harvesting
There is no denying that harvesting is the most satisfying part of agriculture. It is the fruit (pun intended) of all the effort put into production and ultimately what goes to clients and brings in revenue. Yet, it is also the most sensitive operation in a greenhouse. Even for human workers, picking can get surprisingly complex: the fruit may be hard to see on the plant, one has to determine whether it is ripe, and special care is often needed during harvesting to avoid damaging either the plant or the product.
Nowadays, image segmentation algorithms and object recognition AI systems can leverage the fact that a greenhouse is a relatively controlled environment, i.e. you know what you should be looking at and for. This means that robots can now be equipped to find the product on the plant. Once that step is accomplished, there are as many strategies as there are products. Whatever criterion manual workers use to determine whether a fruit is ripe, a robot can replicate it with the added benefit of quantitative results. Size measurement, color recognition, palpometry, spectral reflectometry, AI and more can be used to make sure that products are only picked at the optimal time [3, 4]. Once a product is identified as ready to be picked, the robot can execute a standardized picking strategy, including a picking point, applied pressure and exact motion, in a repeatable way [5].
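Color recognition is the simplest of these ripeness cues to sketch. The snippet below classifies sampled fruit pixels by hue using only the standard library; the hue window and thresholds are illustrative values for a red-when-ripe fruit such as a tomato, not calibrated ones (a real system would also handle the hue wrap-around near 360°):

```python
import colorsys

def pixel_is_ripe(r, g, b, ripe_hue_deg=(0, 20), min_sat=0.5):
    """Classify one fruit pixel's colour as ripe or not.

    RGB inputs are in [0, 1]. The hue window and saturation floor
    are placeholder thresholds, not calibrated values.
    """
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    lo, hi = ripe_hue_deg
    return lo <= h * 360.0 <= hi and s >= min_sat

def fruit_is_pickable(pixels, threshold=0.8):
    """Decide a fruit is pickable when most sampled pixels read ripe."""
    ripe = sum(pixel_is_ripe(*p) for p in pixels)
    return ripe / len(pixels) >= threshold
```

Sampling pixels from the segmented fruit region rather than the whole image is what makes this quantitative version of the human "is it red yet?" check workable.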
Source: Ruoshi Wen.
#3 - Inspection
Visual inspection has been a staple of industrial production for a very long time. In recent years, as computer vision capabilities have improved and high-definition camera prices have dropped, many of these tasks are being automated. In industrial conditions, products are compared to a gold standard, for example a CAD model. Deviations from the standard are then considered faults, which raises the challenge of proper tuning to avoid too many false positives. In a greenhouse, the challenge is the opposite: there is no gold standard to compare to, so there is a risk of leaving false negatives behind. This risk varies with the inspection technique, which in turn depends on the object being inspected. Contour detection on leaves to find holes left by insects is less likely to produce false negatives than a neural network trained on bad data. Nevertheless, specialized AI classifiers can now be trained very accurately on a small set of classes (e.g. acceptable and unacceptable strawberries) and even update themselves by requesting occasional human feedback on difficult classifications.
Although the robot itself is not the device processing the inspection, it is still essential to the process: it moves the camera around. Researchers have created algorithms to figure out the best way to orient a camera to look at plants [6, 7] and multiple-arm devices to delicately manipulate plants and inspect them beyond the surface [8]. These robots can inspect plants for broken branches, traces of visible sickness, holes left by insects, or mold. Additionally, robots with inspection capabilities can also be deployed at the end of the production cycle, to ensure only prime-quality products are packaged and sent to clients.
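As a toy stand-in for the contour-based hole detection described above, the sketch below counts enclosed background regions in a binary leaf mask. Background connected to the image border is ordinary background; any background component left over is a hole, such as one chewed out by an insect:

```python
from collections import deque

def count_holes(mask):
    """Count enclosed background regions ('holes') in a binary mask.

    mask is a list of equal-length strings: '#' for leaf tissue,
    '.' for background. A toy substitute for real contour analysis.
    """
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]

    def flood(sr, sc):
        # Breadth-first flood fill over 4-connected background pixels.
        q = deque([(sr, sc)])
        seen[sr][sc] = True
        while q:
            r, c = q.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < h and 0 <= nc < w
                        and not seen[nr][nc] and mask[nr][nc] == '.'):
                    seen[nr][nc] = True
                    q.append((nr, nc))

    # First, mark all background reachable from the image border.
    for r in range(h):
        for c in range(w):
            if ((r in (0, h - 1) or c in (0, w - 1))
                    and mask[r][c] == '.' and not seen[r][c]):
                flood(r, c)
    # Every unseen background pixel now belongs to an enclosed hole.
    holes = 0
    for r in range(h):
        for c in range(w):
            if mask[r][c] == '.' and not seen[r][c]:
                holes += 1
                flood(r, c)
    return holes
```

On a real system the mask would come from segmenting the leaf out of a camera image; the counting logic stays the same.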
#4 - Trimming
One of the main appeals of integrating robots into any kind of application has historically been the combination of repeatability, accuracy and speed, features that manual workers can rarely combine. These features can also be of use in agriculture when it comes to trimming plants. Industrial programs already exist to create very precise paths for robots to cut, weld or polish products with extreme precision. These programs can be adapted to take into account the uncertain shape of plants and to compensate for their flexibility when touched, and then implemented on a robot carrying, for example, a hedge trimmer to shape bushes [9, 10].
This application may feel like a consumer product at first, but trimming robots would find their use in greenhouses, helping to keep the occupied volume in check, as well as in outdoor settings such as orchards and parks. Instead of a hedge trimmer, the robot could use shears to remove branches and keep plants at an optimal density for growth or light exposure.
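The simplest trimming path is a back-and-forth (boustrophedon) sweep over one face of a hedge, which a real planner would then deform to the measured plant surface. A minimal sketch, assuming a flat face and arbitrary units:

```python
def boustrophedon_path(width, height, step):
    """Back-and-forth sweep covering a width x height face of a hedge.

    Returns (y, z) waypoints for the trimmer tip, sweeping side to
    side and climbing by `step` between passes. The flat face is an
    assumption of this sketch; a real planner projects these points
    onto the sensed plant surface.
    """
    path = []
    z = 0.0
    left_to_right = True
    while z <= height:
        if left_to_right:
            path += [(0.0, z), (width, z)]
        else:
            path += [(width, z), (0.0, z)]
        left_to_right = not left_to_right
        z += step
    return path
```

Alternating the sweep direction avoids a wasteful return stroke across the full width between passes, which is why coverage planners favour this pattern.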
Source: Trimbot, https://trimbot2020.shorthandstories.com/.
#5 - Pollination
Greenhouses, being closed spaces, are in general not suitable for the bees that are responsible for flower pollination. Manually pollinating all the flowers of a plantation expected to yield multiple tons of product is immensely tedious, and trying to pollinate everything with wide-range devices like sprays tends to be very wasteful. Turning to robots for this task proves beneficial: it is less wasteful (and thus more cost-effective in the long run), and it is a stable, reliable method that does not miss flowers because it tends to each of them individually.
Pollination robots [11] were developed by combining:
- Image processing to find the orientation of the flowers,
- Advanced path planning to reach them while avoiding obstacles like the surrounding plants,
- Visual servoing techniques to accurately position the robot on the flower,
- A specialized single-flower pollinating tool.
These robots have the advantage of being able to work autonomously around the clock. Their built-in image processing and path planning algorithms would also make them ideal for fruit picking after a simple tool swap once the season has passed.
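The visual servoing step can be sketched as a proportional controller on the pixel error between the detected flower centre and the image centre. The gain and the direct pixel-to-velocity mapping below are placeholders; a real system would use a calibrated interaction matrix:

```python
def servo_step(flower_px, image_center, gain=0.4):
    """One iteration of image-based visual servoing.

    Returns a camera velocity proportional to the pixel error between
    the detected flower centre and the image centre. The gain and the
    pixel-to-velocity mapping are placeholders for this sketch.
    """
    ex = flower_px[0] - image_center[0]
    ey = flower_px[1] - image_center[1]
    return (-gain * ex, -gain * ey)

def simulate_approach(flower_px, image_center, steps=20):
    """Toy simulation: each camera motion shifts the flower's pixel
    position by the commanded amount, so the error shrinks by a
    factor of (1 - gain) per step until the flower is centred."""
    x, y = flower_px
    for _ in range(steps):
        vx, vy = servo_step((x, y), image_center)
        x, y = x + vx, y + vy
    return x, y
```

Closing the loop on the camera image rather than on odometry is what lets the robot tolerate a flower that sways or was detected imprecisely: the error is re-measured every frame.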
Source: Yu Gu / WVUIRL.
1. Ali Shafiekhani, Suhas Kadam, Felix B Fritschi, and Guilherme N DeSouza. Vinobot and vinoculer: Two robotic platforms for high-throughput field phenotyping. Sensors, 17(1):214, 2017.
2. Ali Shafiekhani, Felix B Fritschi, and Guilherme N DeSouza. Vinobot and vinoculer: from real to simulated platforms. In Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping III, volume 10664, page 106640A. International Society for Optics and Photonics, 2018.
3. Tsunghan Han and Changying Li. Developing a high precision cotton boll counting system using active sensing. In 2019 ASABE Annual International Meeting, page 1. American Society of Agricultural and Biological Engineers, 2019.
4. Hanz Cuevas-Velasquez, Antonio-Javier Gallego, Radim Tylecek, Jochen Hemming, Bart van Tuijl, Angelo Mencarelli, and Robert B Fisher. Real-time stereo visual servoing for rose pruning with robotic arm.
5. Ruoshi Wen, Kai Yuan, Qiang Wang, Shuai Heng, and Zhibin Li. Force-guided high-precision grasping control of fragile and deformable objects using sEMG-based force prediction. IEEE Robotics and Automation Letters.
6. Pravakar Roy and Volkan Isler. Active view planning for counting apples in orchards. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 6027–6032. IEEE, 2017.
7. Jonathon Sather and Xiaozheng Jane Zhang. Viewpoint optimization for autonomous strawberry harvesting with deep reinforcement learning. arXiv preprint arXiv:1903.02074, 2019.
8. Eduardo Navas, Roemi Fernández, Delia Sepúlveda, Manuel Armada, and Pablo Gonzalez-de Santos. Modular dual-arm robot for precision harvesting. In Iberian Robotics Conference, pages 148–158. Springer, 2019.
9. Dejan Kaljaca, Nikolaus Mayer, Bastiaan A Vroegindeweij, Angelo Mencarelli, Eldert J van Henten, and Thomas Brox. Automated boxwood topiary trimming with a robotic arm and integrated stereo vision. In IROS, pages 5542–5549, 2019.
10. Dejan Kaljaca, Bastiaan Vroegindeweij, and Eldert van Henten. Coverage trajectory planning for a bush trimming robot arm. Journal of Field Robotics.
11. Jared Strader, Jennifer Nguyen, Christopher Tatsch, Yixin Du, Kyle Lassak, Benjamin Buzzo, Ryan Watson, Henry Cerbone, Nicholas Ohi, Chizhao Yang, et al. Flower interaction subsystem for a precision pollination robot. arXiv preprint arXiv:1906.09294, 2019.