Smart sensing for digital horticulture
Creating a virtual orchard requires advanced data collection and processing. By Anna Mouton.
Innovation in pome- and stone-fruit production often progresses one orchard or one season at a time. Researchers developing new tree-training systems or growers trying to understand mixed maturities may have to wait months or years to gather data, and their ability to compare alternative interventions is always constrained.
Digital twins — virtual recreations of actual orchards — promise to accelerate innovation by allowing researchers to rapidly and cost-effectively test multiple interventions on the same trees.
“The digital twin platform can allow us to holistically simulate, interrogate, design, prescribe, and control the structure and performance of perennial horticultural food systems,” said Richard Oliver, an engineer at the New Zealand Institute for Plant and Food Research.
Oliver co-leads the Smart Sensing and Imaging Programme within the larger Digital Horticulture Systems project. “We’re building the digital twin for scientific purposes,” he said. “We want to use this as a data capture tool and to define the holes in our knowledge of these systems.”
Digital trees grow from data
Smart Sensing and Imaging is one of seven programmes within the Digital Horticulture Systems project. Three other programmes aim to model apple planting systems, orchard ecosystems, and fruit quality from orchard to market.
All these models must be fed with data. Oliver describes his programme as the real-world sensing arm of the digital twin, but it’s less a limb and more the eyes — and even the skin and nose — gathering inputs for constructing and validating the models.
“This also provides capacity development within our interdisciplinary teams,” he said. “Engineers, data scientists, and a range of biologists are all increasingly learning to work together.”
The physical counterpart of the digital twin is a planar-cordon Royal Gala orchard on M.9 rootstock growing at Plant and Food’s Hawke’s Bay site. Oliver and Karmun Chooi, the other programme co-leader, started by digitally scrutinising one tree.
“Our first research aim is called the instrumented tree and orchard,” said Oliver. “This is where we’re doing much of our orchard imaging.” His team built a single-tree scanning platform that allows a 31.4-megapixel camera to take multiple images of the whole tree canopy — a visual field of roughly 12 m² per side.
“We’re using 12-bit or high dynamic-range imaging,” said Oliver. “Most of our cell phones do it out of the box.” As does the human eye. High dynamic-range imaging compensates for different light levels within a scene so that no part of the image is lost in shadow or washed out by bright light.
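For readers who want to see the principle in code, here is a minimal exposure-fusion sketch using OpenCV. It is an illustration only, not the research team's pipeline, and the file names are hypothetical.

```python
# Minimal exposure-fusion sketch (hypothetical file names; not the research pipeline).
import cv2

# Three photographs of the same canopy taken at different exposures.
paths = ["canopy_dark.jpg", "canopy_mid.jpg", "canopy_bright.jpg"]
exposures = [cv2.imread(p) for p in paths]

# Mertens exposure fusion keeps the well-exposed regions of each frame,
# so shaded and sunlit parts of the canopy both stay readable.
merge = cv2.createMergeMertens()
fused = merge.process(exposures)  # float output, roughly in the range 0-1

cv2.imwrite("canopy_fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```

Fusion of this kind keeps both the shaded interior of the canopy and the sunlit outer leaves visible in a single frame, which is what makes whole-tree imaging in variable orchard light practical.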
The scanning platform captures every detail of the tree in about 850 images. Postdoctoral researcher Dr Oliver Batchelor and colleagues at the University of Canterbury’s Computer Vision Laboratory process these into a three-dimensional reconstruction.
“We hope to use this to get the full organ-level metric,” said Oliver. “That includes each individual leaf’s size, bud fate, and fruit growth and development.” He shared a video of the three-dimensional tree rendering — visit the Hortgro YouTube channel to view it.
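As a rough illustration of how one organ-level metric might be read off such a reconstruction, the sketch below computes the surface area of a single reconstructed leaf by summing the areas of its mesh triangles. The vertex and face arrays are made-up examples, not output from the Canterbury pipeline.

```python
import numpy as np

def mesh_surface_area(vertices: np.ndarray, faces: np.ndarray) -> float:
    """Total area of a triangle mesh: vertices is (N, 3), faces is (M, 3) vertex indices."""
    a = vertices[faces[:, 0]]
    b = vertices[faces[:, 1]]
    c = vertices[faces[:, 2]]
    # Each triangle's area is half the magnitude of the cross product of two of its edges.
    cross = np.cross(b - a, c - a)
    return float(0.5 * np.linalg.norm(cross, axis=1).sum())

# Hypothetical example: a leaf approximated by two triangles of a 5 cm x 8 cm rectangle.
verts = np.array([[0, 0, 0], [0.05, 0, 0], [0.05, 0.08, 0], [0, 0.08, 0]])
tris = np.array([[0, 1, 2], [0, 2, 3]])
print(f"Leaf area: {mesh_surface_area(verts, tris) * 1e4:.1f} cm^2")  # ~40 cm^2
```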
Beyond imaging single trees
Oliver’s team is also working on a multiple-tree imaging platform. “We’re using a little robotic platform supplied by a New Zealand startup company,” he said. “We’ve developed a passive stabilisation system for a 4-metre mast to capture both sides of the canopy as the robot weaves up and down the interrows.”
The mast carries four 2.3-megapixel cameras. These are being upgraded to 12-megapixel cameras, similar to many cell-phone cameras.
“These image sets will be used for training machine-learning models for pest and disease detection, organ detection, and 3D reconstruction in the future,” said Oliver.
Robotic platforms can theoretically be configured to carry any sensor. Many growers are familiar with handheld near-infrared spectrometers that measure dry matter or °Brix in fruit. Mounting these on robotic platforms is difficult because the instrument must touch the fruit and exclude sunlight while taking a reading.
“We have a technique using modulated laser sources that we think will allow spectroscopy at a distance,” explained Oliver.
His group is also experimenting with environmental sensors that could expand the digital twin from a group of trees into an entire orchard ecosystem, supporting the simulation of weather, climate, nutrition, and irrigation.
“We’re building a Bluetooth mesh network sensor,” he said. “It measures photosynthetically active radiation, temperature, and humidity at multiple sites within the canopy and block. The idea is to get high-spatial-density measurements.”
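To give a sense of what high-spatial-density canopy data might look like once it reaches the digital twin, here is a small sketch that groups readings by canopy height and averages them. The field names and values are assumptions for illustration, not the actual mesh-network format.

```python
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

@dataclass
class CanopyReading:
    node_id: str      # which mesh node sent the reading
    row: int          # orchard row
    height_m: float   # position within the canopy
    par_umol: float   # photosynthetically active radiation, µmol/m²/s
    temp_c: float
    rh_pct: float

def summarise_by_height(readings: list[CanopyReading]) -> dict[float, dict[str, float]]:
    """Average PAR, temperature, and humidity at each canopy height."""
    groups: dict[float, list[CanopyReading]] = defaultdict(list)
    for r in readings:
        groups[r.height_m].append(r)
    return {
        h: {
            "par_umol": mean(r.par_umol for r in rs),
            "temp_c": mean(r.temp_c for r in rs),
            "rh_pct": mean(r.rh_pct for r in rs),
        }
        for h, rs in groups.items()
    }

readings = [
    CanopyReading("node-01", row=3, height_m=1.0, par_umol=420.0, temp_c=22.1, rh_pct=61.0),
    CanopyReading("node-02", row=3, height_m=2.5, par_umol=980.0, temp_c=24.3, rh_pct=55.0),
    CanopyReading("node-03", row=4, height_m=1.0, par_umol=390.0, temp_c=21.8, rh_pct=63.0),
]
print(summarise_by_height(readings))
```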
Digital supersensing
Data collection is not limited to sensors based on visible or infrared light. Oliver shared two examples of systems that could revolutionise crop protection and harvest management.
The first is LAMP CRISPR-Cas technology for disease detection. “In very simple terms, if the particular DNA that your test has been set up to detect is present, a fluorescent dye becomes available that lights up when you shine a blue light on it,” he said.
A highly sensitive test for bullseye rot has already been validated. Canker, black spot, and fire blight assays are in the pipeline.
The second new technology is volatile sensing. Nondispersive infrared sensors are simple spectrometers that can measure environmental carbon dioxide. Oliver’s team is exploring their value for tracking daily and seasonal variations.
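The daily and seasonal patterns Oliver mentions can be summarised with a few lines of standard time-series code. The sketch below uses made-up carbon-dioxide readings rather than real sensor data, and simply averages them into daily means with pandas.

```python
import numpy as np
import pandas as pd

# Hypothetical: one week of CO2 readings logged every 15 minutes by an NDIR sensor.
times = pd.date_range("2024-02-01", periods=7 * 96, freq="15min")
co2_ppm = (
    415
    + 20 * np.sin(np.linspace(0, 14 * np.pi, len(times)))  # rough diurnal cycle
    + np.random.normal(0, 2, len(times))                   # sensor noise
)
series = pd.Series(co2_ppm, index=times, name="co2_ppm")

# Daily means smooth out the diurnal cycle; monthly means would show seasonal drift.
daily = series.resample("D").mean()
print(daily.round(1))
```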
They are also investigating photoionisation sensors that detect very low levels of volatile compounds, such as those released by ripening fruit. “We’re hoping to create another sense for our mobile imaging rig that will pick up plumes of volatiles as it’s weaving its way up and down the orchard,” he said.
Measuring volatiles could improve our understanding of their relationship to fruit quality and maturity, informing quality management and potentially reducing postharvest losses.
The digital-twin concept was popularised by NASA around 2010 as a way to simulate spacecraft. Digital twins are now widely applied in fields as diverse as urban planning and automotive design. Plant and Food Research’s digital planar-cordon orchard is bringing this technology down to earth for the benefit of fruit growers everywhere.
This article is based on a presentation at the 2024 Hortgro Technical Symposium. Go to the Hortgro YouTube channel to watch Oliver and other speakers at this event.