Date: Mon Aug 1, 2016
Time: 4:20 PM - 6:00 PM
Phenotyping in the field is an essential step in the phenotyping chain. Phenotyping begins under the well-defined, controlled conditions of laboratories and greenhouses and extends to heterogeneous, fluctuating environments in the field. Field measurements represent a significant reference point for the relevance of laboratory and greenhouse approaches, and an important source of information on potential mechanisms and constraints for plant performance tested under controlled conditions. In this paper, we present a range of methods deployed within the German Plant Phenotyping Network (DPPN, www.dppn.de), focusing on plant architecture, photosynthesis, and water relations. Specialized field platforms (a) test innovative phenotyping technologies; (b) provide access to semi-controlled field installations to support breeding approaches for future CO2 concentrations (breed-FACE); and (c) study the translation of phenotypic properties from controlled environments to stands in the field. We report that stereo imaging allows the quantification of canopy structure; active thermography estimates leaf water content and provides information on transpiration conditions; and sun-induced fluorescence (SIF) and light-induced fluorescence transient (LIFT) techniques allow us to estimate photosynthesis remotely at the canopy and leaf-to-plant level, respectively. Because the Fluorescence Explorer mission was recently selected, SIF will be measured by the next European Space Agency Earth Explorer satellite. All methods will be tested further and incorporated into (semi-)automated systems of sensors positioned in the field, providing a promising portfolio for measuring plant traits in field phenotyping and for enhancing our understanding of relevant traits under natural conditions, now and in the future.
The use of remote sensing in plant breeding is challenging due to the large number of small parcels, which at least currently cannot be measured with conventional techniques such as air- or spaceborne sensors. On the one hand, crop monitoring needs to be performed frequently, which demands reliable data availability. On the other hand, hyperspectral remote sensing offers new methods for the detection of vegetation parameters in crop production, especially since safe and efficient detection of phenotypic differences is essential for developing adapted varieties by breeding.
To address both aspects, a ground-based hyperspectral system called “TriSpek” has been developed to exploit these new spectral opportunities and to overcome the problems of data availability and spatial resolution.
The TriSpek covers an effective spectral range from 400 to 825 nm with 1 nm bandwidth. Using multiple spectrometers allows the reflectance measurements to be corrected for incoming radiation on the fly in the field. This increases data availability, since the effects of changing illumination due to different sun angles and clouds can be compensated directly in the field.
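The principle of this on-the-fly correction can be sketched as follows. This is a minimal illustration, not the TriSpek software: a downwelling (irradiance) spectrometer and an upwelling (target) spectrometer are read simultaneously on a common wavelength grid, so a change in illumination cancels out of the ratio. The function name and the white-reference calibration factor are assumptions for illustration.

```python
import numpy as np

def reflectance(target_counts, irradiance_counts, calib_factor):
    """Illumination-corrected reflectance on a common wavelength grid.

    Dividing the simultaneously measured target signal by the incoming
    radiation removes the effect of sun angle and cloud cover.
    """
    target = np.asarray(target_counts, dtype=float)
    irradiance = np.asarray(irradiance_counts, dtype=float)
    # Guard against division by zero in dark spectral regions
    irradiance = np.where(irradiance > 0, irradiance, np.nan)
    return calib_factor * target / irradiance

# Example: halving the illumination leaves the reflectance unchanged.
wl_grid = np.arange(400, 826)              # 400-825 nm at 1 nm bandwidth
calib = np.ones_like(wl_grid, dtype=float) # assumed white-reference factor
r_sunny = reflectance(np.full(426, 500.0), np.full(426, 1000.0), calib)
r_cloudy = reflectance(np.full(426, 250.0), np.full(426, 500.0), calib)
assert np.allclose(r_sunny, r_cloudy)      # both 0.5 at every band
```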
In an extensive calibration process, partial least squares regression models for the determination of several vegetation parameters in rye have been developed. The results show high prediction quality, with coefficients of determination (R²) of 0.85 for fresh matter, 0.90 for dry matter, 0.90 for leaf area index, and 0.84 for chlorophyll a.
Over three growing seasons, performance tests with rye were conducted at two test sites in Germany with different candidate strains under drought stress and irrigation. Connecting the spectral/vegetation data to the digital field plans of the experiments allows views of the temporal and spatial dynamics. Applying this concept, heterogeneities within plant nurseries caused by elevation or soil differences can be identified indirectly by means of growth variations in the hyperspectral data.
Hybrid plants exhibit stronger vigor, increased yield, and better environmental adaptability than their parents, a phenomenon known as the heterosis effect. Heterosis in winter oilseed rape is not yet fully understood, and conclusions on hybrid performance can only be drawn from laborious test crossings. Large-scale field phenotyping may alleviate this process in plant breeding.
The aim of this study was to test a low-cost mobile ground-based hyperspectral system for breeding research that provides easy access to important information on crop status and development. Quantitative relationships between vegetation parameters (above-ground fresh and dry matter and leaf area index; FM, DM, LAI) and field reflectance measurements were set up using partial least squares regression. At present, our data set consists of 102 measurements acquired during two growing seasons between 2014 and 2016. Models were first set up using the full spectral range as a best-case scenario (400-2400 nm). Subsequently, performance was evaluated with the reduced range (400-800 nm) corresponding to the ground-based mobile system. Model validation was performed by means of leave-one-out cross-validation (cv).
The cross-validated R² (Rcv²) of the PLSR models for FM and DM based on the full spectral range was 0.82; for LAI, Rcv² was only 0.52. Confining the spectral range increased prediction errors by 15%, 9%, and 5%, respectively. The models were successfully applied to three data sets acquired in April 2015 by our mobile ground-based system.
Rapid methods for plant phenotyping are a growing need in agricultural research to help accelerate improvements in crop performance and to facilitate more efficient utilization of plant genome sequences and the corresponding advancements in associated methods of genetic improvement. Manual plant phenotyping is time-consuming, laborious, frequently subjective, and often destructive. There is a need for field-deployable systems with advanced sensors that offer both high speed and high performance for plant phenotype processing.
The authors are solely responsible for the content of this paper, which is not a refereed publication. Citation of this work should state that it is from the Proceedings of the 13th International Conference on Precision Agriculture. EXAMPLE: Lastname, A. B. & Coauthor, C. D. (2016). Title of paper. In Proceedings of the 13th International Conference on Precision Agriculture (unpaginated, online). Monticello, IL: International Society of Precision Agriculture.
This study reports on the design and performance of a new 3D computer vision-based plant phenotyping technology that utilizes 3D stereovision from many different viewing angles. The research presents new knowledge used to facilitate the determination of the best viewing angles for 3D reconstruction of plants. A full 3D reconstruction system for plants is introduced that utilizes 16 high-resolution color stereovision cameras mounted on an arc-shaped superstructure designed for in-field use. The system incorporates both unique hardware features (including multiple cameras per arc and structured illumination to enhance the visual texture of plant surfaces) and software algorithms (including 3D feature extraction of plant height, number of leaves, leaf area, and plant biomass). Results demonstrate the ability to reconstruct complete 3D models of plants growing in the natural outdoor environment of a farm. The system allows photo-realistic plant models to be created from an optimum (i.e. minimum) number of digital color cameras positioned at different viewing angles. Experimental comparisons between different sets of viewing angles reveal that top views are most advantageous for small plants, while side views provide greater information content for larger plants, for which top views are detrimental to estimating plant height because top leaves occlude the main stem.
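The geometric core of any such stereovision system is triangulation from a rectified camera pair. The toy sketch below illustrates the standard depth-from-disparity relation; the focal length and baseline are illustrative assumptions, not parameters of the actual 16-camera system.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth Z = f * B / d for a rectified stereo pair.

    disparity_px: pixel offset of a point between the two images
    focal_px:     focal length in pixels
    baseline_m:   distance between the camera centers in meters
    """
    d = np.asarray(disparity_px, dtype=float)
    # Zero disparity corresponds to a point at infinity
    return np.where(d > 0, focal_px * baseline_m / d, np.inf)

# A point on a plant 1.2 m from a pair with f = 1200 px and B = 0.1 m
# yields a disparity of f*B/Z = 100 px; inverting recovers the depth.
z = depth_from_disparity(100.0, 1200.0, 0.1)
print(round(float(z), 6))   # 1.2
```

Combining such depth maps from cameras at many viewing angles, as the arc-shaped superstructure does, fills in surfaces that any single viewpoint leaves occluded.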
Digital plant phenotyping produces huge amounts of 2D images, which is known as one part of the phenotyping bottleneck. Addressing this bottleneck has required well-trained plant analysts, extensive experience, and adapted analysis software, with automated tools covering only specific parts of the analysis pipeline. In recent years this has changed with the image processing toolbox of LemnaTec GmbH, an automated and intuitive tool for the analysis of huge amounts of 2D data. Various image processing modules such as edge detectors or background/foreground separators are available, as well as machine learning routines for more sophisticated problems. Segmentation of single plant parts is possible for plant images at different scales, from microtiter plates and petri dishes to single plants in the greenhouse or at field scale. Single modules can be chained to build an analysis pipeline adapted to a specific dataset and then reused for datasets of similar plants. This enables the extraction of parameters such as convex hull, height, diameter, or leaf area.
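The modular-pipeline idea can be sketched as follows. This is a hypothetical illustration, not LemnaTec's actual API: a simple background/foreground separator (green-dominance mask) is chained with a parameter-extraction module for projected leaf area and plant height.

```python
import numpy as np

def foreground_mask(rgb):
    """Simple background/foreground separator: green channel dominates."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (g > r) & (g > b)

def extract_parameters(mask, mm_per_px=1.0):
    """Projected area and height derived from a binary plant mask."""
    rows = np.flatnonzero(mask.any(axis=1))
    height_px = (rows.max() - rows.min() + 1) if rows.size else 0
    return {"area_mm2": float(mask.sum()) * mm_per_px ** 2,
            "height_mm": float(height_px) * mm_per_px}

# Chain the two modules on a toy 6x6 image containing a 3x2 green "plant"
img = np.zeros((6, 6, 3), dtype=np.uint8)
img[2:5, 1:3] = (30, 200, 40)           # greenish foreground pixels
params = extract_parameters(foreground_mask(img))
print(params)   # {'area_mm2': 6.0, 'height_mm': 3.0}
```

The same pair of modules can be reapplied unchanged to every image of a similar plant, which is the reuse property the pipeline approach relies on.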
Applications such as the geometric parameterization of the complete plant, the classification of ears in cereal field images using RGB cameras or 3D laser scans, or the segmentation of leaves using hyperspectral images are possible in high throughput. Once created, parameterization pipelines can easily be adapted to different plant species.
Two application scenarios using this software are described in detail in this publication. An automated analysis pipeline for the parameterization of geometric plant parameters from RGB photos is shown at greenhouse scale, based on automated acquisition using LemnaTec conveyor systems and an adapted measuring booth. Furthermore, we show the localization of plant organs using radiometric features in images from a crane-based measuring platform at field scale.
The image analysis software LemnaGrid (LemnaTec GmbH, Aachen) provides a professional tool that enables the intuitive connection of different image processing algorithms. It is adaptable to different plant types and different scales, and the processing can combine sensor data from RGB, 3D, hyperspectral, or fluorescence imaging.