Date: Mon Jun 25, 2018
Time: 3:30 PM - 5:00 PM
Moderator: Sun-ok Chung
Feeding a global population of 9.1 billion in 2050 will require food production to increase by approximately 60%. In this context, plant breeders demand more effective and efficient field-based phenotyping methods to accelerate the development of more productive cultivars under contrasting environmental constraints. The leaf area index (LAI) is a dimensionless biophysical parameter of great interest to maize breeders, since it is directly related to crop productivity. The LAI is defined as the one-sided photosynthetically active leaf area per unit ground area. Direct estimation of the LAI through leaf collection and subsequent leaf area determination in the laboratory is tedious and time-consuming. Hence, indirect methods based on gap fraction theory are frequently used for in situ LAI estimation. The LAI obtained from gap fraction analysis by most optical sensors available on the market is not the true LAI, but a quantity called the “effective LAI” that does not account for foliage clumping. Hemispherical images of the bottom-up view of crop canopies offer important advantages to maize breeders, such as low cost compared with other commercial sensors, and they may also provide LAI estimates corrected for foliage clumping (i.e., the true LAI). However, taking bottom-up hemispherical images in every single plot of a maize breeding program is laborious and time-consuming. The use of small unmanned aerial vehicles (UAVs) in agriculture has enabled crop information to be inferred at spatial and temporal resolutions that exceed those of other remote sensing platforms (e.g., airborne, satellite). We assessed the efficacy of using UAVs to collect hemispherical images for estimating the LAI. To do this, we investigated the suitability of nadir-view hemispherical images taken from a UAV flying at a low altitude (15 m) for accurately deriving LAI estimates based on gap fraction analysis in a maize breeding trial carried out near Seville, Spain.
Six maize cultivars grown in a split-plot design with three blocks and two irrigation treatments (well-watered and water-stressed) were used in the experiment. LAI estimates from top-down hemispherical imaging taken from the UAV were compared with LAI estimates from bottom-up hemispherical imaging and with direct LAI estimates obtained from an allometric relationship derived in the study. The results show that hemispherical images taken from a UAV flying at a low altitude can estimate the LAI of maize breeding plots as accurately as the classical bottom-up hemispherical imaging approach. CAN-EYE software, which includes automatic image classification and allows a series of hemispherical photographs to be processed together, was used in this experiment.
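The gap-fraction inversion underlying such effective-LAI estimates can be sketched as follows. This is a minimal illustration of the Beer-Lambert model, not the CAN-EYE implementation; the 57.5° view zenith angle and the extinction coefficient G ≈ 0.5 are standard textbook assumptions, not values reported in the study:

```python
import math

def effective_lai(gap_fraction, view_zenith_deg=57.5, g=0.5):
    """Invert the Beer-Lambert gap-fraction model:
        P(theta) = exp(-G * LAI / cos(theta))
        =>  LAI = -cos(theta) * ln(P(theta)) / G
    Near a view zenith of 57.5 deg the extinction coefficient G is close
    to 0.5 for almost any leaf angle distribution, which makes the
    inversion relatively insensitive to canopy architecture."""
    theta = math.radians(view_zenith_deg)
    return -math.cos(theta) * math.log(gap_fraction) / g

# e.g. a measured gap fraction of 20% at 57.5 deg
lai = effective_lai(0.20)
```

The value returned is the effective LAI; as the abstract notes, recovering the true LAI additionally requires a clumping-index correction.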
Targeted fertilization of grass-clover leys shows high financial and environmental potential, leading to higher yields of increased quality while reducing nitrate leaching. To realize these gains, an accurate fertilization map is required, which is closely related to the local composition of plant species in the biomass. In our setup, we utilize a top-down canopy view of the grass-clover ley to estimate the composition of the vegetation and predict the composition of the dry matter of the forage. Using a deep learning approach, the canopy image is automatically segmented and classified pixel-wise into white clover, red clover, grass, and weeds. Robust grass and clover segmentation has proven to be a difficult task to automate, and red and white clover discrimination in images is challenging even for human experts, owing to the many visual similarities between the two clover species. Using high-resolution color images with a ground sampling distance of 4 to 6 pixels per mm and data simulation of hierarchical labels, a cascaded convolutional neural network was trained for segmentation and classification. Clover, grass, and weeds were automatically segmented and classified with a pixel-wise accuracy of 87.3 percent, while red and white clover could be distinguished automatically with 89.6 percent accuracy. Applying the image analysis to 179 images of mixed-crop plots of ryegrass, white clover, and red clover demonstrated a linear correlation between the detected clover and clover-species fractions in the canopy and the corresponding compositions in the harvested dry matter.
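As an illustration of how a pixel-wise segmentation map translates into canopy composition fractions, the computation could look like the sketch below. The class codes and the exclusion of bare-soil pixels are hypothetical choices for this example, not details of the authors' pipeline:

```python
import numpy as np

# Hypothetical integer class codes for a per-pixel segmentation map
GRASS, WHITE_CLOVER, RED_CLOVER, WEEDS, SOIL = 0, 1, 2, 3, 4

def canopy_fractions(label_map):
    """Compute the vegetation-class fractions of the canopy from a
    pixel-wise segmentation map, ignoring bare-soil pixels so that the
    fractions describe the vegetated canopy only."""
    veg = label_map[label_map != SOIL]
    total = veg.size
    return {
        "grass": np.count_nonzero(veg == GRASS) / total,
        "white_clover": np.count_nonzero(veg == WHITE_CLOVER) / total,
        "red_clover": np.count_nonzero(veg == RED_CLOVER) / total,
        "weeds": np.count_nonzero(veg == WEEDS) / total,
    }

# Tiny toy segmentation map (2 x 4 pixels)
labels = np.array([[0, 0, 1, 2],
                   [0, 3, 4, 4]])
fr = canopy_fractions(labels)
```

In the study, fractions of this kind were then related linearly to the species composition of the harvested dry matter.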
From a precision agriculture perspective, it is important to detect field areas where soil variability is significant or where crop yield or biomass differs. Information describing the behavior of the crop at any specific point in the growing season typically leads to improvements in how local variability is addressed. The proper use of dense, in-season sensor data allows farm managers to optimize harvest plans and shipment schedules under variable plant growth dynamics, which may originate from soil spatial variability and management conditions. Sensing of crop canopy architecture has been used as a diagnostic tool in this context. Moving from the subjective visual estimation of farm workers to automated sensing technologies allows for improved repeatability and savings in cost, time, and labor. The goal of this paper is to report on the evaluation of a prototype sensor system embedded in a portable, low-cost instrument for green vegetable production. The prototype system is currently in its second iteration, featuring improvements for issues found in a previous experiment. The system involves circular scanning of crop canopies to identify crop biomass yield using laser triangulation. The results of these scans are height profiles along an angular position from 0° to 360°, which are the input for the biomass estimation. Two approaches for processing the laser-based height profiles are discussed: regression of profile-representative features and inference of a canopy density function. An experiment was conducted in a spinach field of a commercial farm in Sherrington, Quebec, Canada. The coefficients of determination (R2) for the regressions between measured and predicted biomass were 0.78 and 0.94 for the two approaches, respectively, with root mean square errors (RMSE) of 4.18 and 2.16 t/ha. The results indicate that the developed sensor system would be a suitable tool for rapid assessment of fresh biomass in the field.
Its application would be beneficial for optimizing crop management logistics, comparing the performance of different crop varieties, and detecting potential stresses in the field.
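The first of the two processing approaches (regression of profile-representative features) could be sketched as below. The feature set (mean, maximum, and standard deviation of the height profile) and the least-squares linear model are illustrative assumptions, not the authors' actual model:

```python
import numpy as np

def profile_features(heights_m):
    """Summarize a 0-360 deg laser-triangulation height profile into a
    few representative features for biomass regression."""
    h = np.asarray(heights_m, dtype=float)
    return np.array([h.mean(), h.max(), h.std()])

def fit_biomass_model(profiles, biomass_t_ha):
    """Least-squares fit of biomass ~ w . features + b (illustrative)."""
    X = np.array([profile_features(p) for p in profiles])
    X = np.hstack([X, np.ones((len(X), 1))])  # bias column
    w, *_ = np.linalg.lstsq(X, np.asarray(biomass_t_ha, float), rcond=None)
    return w

def predict_biomass(w, profile):
    """Predict biomass (t/ha) for a new height profile."""
    f = np.append(profile_features(profile), 1.0)
    return float(f @ w)

# Toy calibration: synthetic flat profiles whose biomass is 10 x mean height
profiles = [np.full(360, h) for h in (0.1, 0.2, 0.3)]
w = fit_biomass_model(profiles, [1.0, 2.0, 3.0])
```

In practice the model would be calibrated against destructively sampled fresh biomass, as done in the spinach experiment.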
Mapping field environments into point clouds using a 3D LiDAR has the potential to become a new approach for online estimation of crop biomass in the field. Crop biomass is expected to correlate closely with canopy height. The work presented in this paper contributes to the mapping and textural analysis of agricultural fields. Crop and environmental state information can be used to tailor treatments to the specific site. This paper presents current results with our ground-vehicle LiDAR mapping system for broad-acre crop fields. The proposed vehicle system and method facilitate LiDAR recordings in an experimental winter wheat field. LiDAR data are combined with data from Global Navigation Satellite System (GNSS) and Inertial Measurement Unit (IMU) sensors to map the environment as point clouds. The sensory data from the vehicle are recorded, mapped, and analysed using the functionalities of the Robot Operating System (ROS) and the Point Cloud Library (PCL). In this experiment, winter wheat (Triticum aestivum L.) in field plots was mapped using 3D point clouds with a point density at the centimeter level. The purpose of the experiment was to create 3D LiDAR point clouds of the field plots, enabling canopy volume and textural analysis to discriminate different crop treatments. Estimated crop volumes ranging from 3,500 to 6,200 m³ per hectare were correlated with manually collected samples of cut biomass extracted from the experimental field.
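A minimal sketch of grid-based canopy volume estimation from such a point cloud is shown below. The cell size, the flat-ground assumption, and the per-cell maximum-height rule are illustrative choices; the authors' PCL-based processing is not detailed in the abstract:

```python
import numpy as np

def canopy_volume_m3(points_xyz, cell_m=0.05, ground_z=0.0):
    """Estimate canopy volume from a LiDAR point cloud by rasterizing the
    points into a horizontal grid and summing, over all occupied cells,
    the per-cell maximum height above ground times the cell area."""
    pts = np.asarray(points_xyz, dtype=float)
    ix = np.floor(pts[:, 0] / cell_m).astype(int)
    iy = np.floor(pts[:, 1] / cell_m).astype(int)
    heights = {}
    for cx, cy, z in zip(ix, iy, pts[:, 2]):
        h = z - ground_z
        if h > heights.get((cx, cy), 0.0):
            heights[(cx, cy)] = h
    return sum(heights.values()) * cell_m * cell_m

# Toy cloud: three points spanning two 5 cm grid cells
pts = [[0.01, 0.01, 1.0], [0.02, 0.02, 0.5], [0.07, 0.01, 2.0]]
vol = canopy_volume_m3(pts)
```

Scaling a per-plot volume of this kind to a per-hectare figure is what yields the 3,500 to 6,200 m³/ha range reported above.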
High-resolution aerial images captured from unmanned aircraft systems (UASs) have recently been used to measure plant height over small test plots for phenotyping, but airborne images from manned aircraft have the potential for mapping plant height more practically over large fields. The objectives of this study were to evaluate the feasibility of measuring cotton plant height from digital surface models (DSMs) derived from overlapped airborne imagery and to compare the image-based estimates with data from a tractor-mounted ultrasonic distance sensor. An airborne imaging system consisting of a red-green-blue (RGB) camera and a modified near-infrared (NIR) camera mounted on a Cessna 206 aircraft was flown along six flight lines over a 27-ha field at peak cotton growth and again with tilled bare soil. Images were captured at 370 m above ground level to achieve a ground pixel size of 0.09 m and side/forward overlaps of about 85%. The ultrasonic distance sensor and a centimeter-grade GPS receiver were mounted on a high-clearance tractor to collect cotton plant height data from every 8th row at 1-s intervals. The images taken on the two dates were processed to create orthomosaics and DSMs. Plant height was estimated from the difference between the two DSMs. Results showed that a significant linear relation existed between image-based and ground-based plant height estimates, with an R2 value of 0.657 and a standard error of 0.11 m. The preliminary results from this study indicate that DSMs derived from overlapped airborne imagery have the potential to estimate and map plant height for monitoring crop growth conditions.
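The DSM-differencing step can be sketched as follows, assuming the two DSM rasters are co-registered pixel to pixel; the clipping range used to suppress reconstruction noise is an illustrative choice, not a value from the study:

```python
import numpy as np

def plant_height_map(dsm_canopy, dsm_bare, min_h=0.0, max_h=3.0):
    """Estimate per-pixel plant height as the canopy-date DSM minus the
    bare-soil DSM, clipped to a plausible height range to suppress
    photogrammetric reconstruction noise."""
    h = np.asarray(dsm_canopy, float) - np.asarray(dsm_bare, float)
    return np.clip(h, min_h, max_h)

# Toy 1 x 2 rasters of surface elevation in meters
heights = plant_height_map([[101.2, 100.0]], [[100.0, 100.1]])
```

The resulting height map can then be regressed against ground-based measurements, as done here with the ultrasonic sensor transects.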
Above-ground biomass, along with chlorophyll content and leaf area index (LAI), is a key biophysical parameter for crop monitoring. Being able to estimate biomass variations within a field is critical to the deployment of precision farming approaches such as variable nitrogen applications.
With unprecedented flexibility, Unmanned Aerial Vehicles (UAVs) allow image acquisition at very high spatial resolution and with short revisit times. Accordingly, there has been increasing interest in these platforms for crop monitoring and precision agriculture. Classic remote sensing techniques typically rely on a vegetation index – such as the popular Normalized Difference Vegetation Index (NDVI) – as a proxy for plant biophysical parameters. However, when applied to UAV imagery, such approaches do not fully exploit the greater detail provided by high resolution.
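For reference, the NDVI mentioned above is computed per pixel as (NIR − Red) / (NIR + Red); a minimal sketch:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index, computed per pixel.
    A small eps guards against division by zero over dark pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# e.g. a healthy-vegetation pixel with high NIR and low red reflectance
value = float(ndvi(0.5, 0.1))
```

Because every pixel collapses to a single index value, NDVI discards the fine spatial structure that very-high-resolution UAV imagery captures, which motivates the image-analysis approach described next.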
The purpose of this research is to develop a procedure for assessing above-ground biomass based on the analysis of very-high-resolution RGB imagery acquired with a UAV platform. A small consumer-grade UAV (the DJI Phantom 3 Professional) with a built-in RGB camera was flown over an experimental corn (Zea mays L.) field. A series of images was acquired in summer 2017 at very low altitudes, resulting in millimeter-resolution imagery (pixel size under 1 cm). Two modes of image acquisition were used: a grid pattern at an altitude of 10 m AGL (above ground level) for generating orthomosaics, and a stationary mode at a height of 2.9 m AGL. For stability reasons, the latter mode was simulated by a low-altitude platform hung on a zip-line.
Image acquisitions were repeated over time during the early stages of corn growth, covering phenological stages V2 to V8. Oblique imagery was also acquired in order to evaluate the effect of viewing angle. Field measurement campaigns were carried out to provide quantitative measurements of several biophysical parameters, including plant fresh biomass, plant dry biomass, plant height, leaf fresh biomass, and leaf dry biomass. The method proposed in this study is based on computer vision, which allowed projected leaf area to be extracted from the images for estimating biomass and detecting differences in corn growth. Using UAV-derived imagery to extract information on biomass proves to be a cost-effective means of monitoring crop biomass spatially and temporally.
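The abstract does not detail the computer-vision method used. One common way to extract projected leaf area from RGB imagery is thresholding the Excess Green index (ExG = 2G − R − B), sketched here as an illustrative stand-in; the threshold value is an assumption, not a parameter from the study:

```python
import numpy as np

def leaf_projected_area_fraction(rgb, threshold=20):
    """Segment vegetation in an 8-bit RGB image with the Excess Green
    index (ExG = 2G - R - B) and return the fraction of pixels classified
    as leaf, a proxy for projected leaf area per unit ground area."""
    img = np.asarray(rgb, dtype=float)
    exg = 2 * img[..., 1] - img[..., 0] - img[..., 2]
    mask = exg > threshold
    return mask.mean()

# Toy 2 x 2 image: two green pixels, one gray, one reddish
rgb = [[[10, 200, 10], [10, 200, 10]],
       [[100, 100, 100], [200, 50, 50]]]
frac = leaf_projected_area_fraction(rgb)
```

A per-plot leaf-area fraction of this kind can then be related to the destructively measured fresh and dry biomass listed above.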