
14th ICPA - Session

Session
Title: Applications of UAS 2
Date: Tue Jun 26, 2018
Time: 1:15 PM - 3:00 PM
Moderator: Yubin Lan
Rape Plant NDVI Spatial Distribution Model Based on 3D Reconstruction

Plant morphology changes throughout the growing process, and 3D reconstruction of plants is of great significance for studying the impacts of plant morphology on biomass estimation, disease and insect infestation, genetic expression, etc. At present, the 3D point clouds obtained through 3D reconstruction mainly capture the morphology, color, and other visible features of the plant, but cannot reflect changes in the spatial 3D distribution of organic matter caused by the plant’s nutritional status (e.g., chlorophyll content) or by disease and insect infestation. Multispectral imaging can reflect the distribution of organic content and other chemical properties, and has been extensively applied in fields such as near-ground remote sensing and non-destructive quality testing of agricultural products. In this paper, 31 multispectral images of rape plants at the four-leaf stage were collected for spatial 3D reconstruction using the structure-from-motion (SFM) method to obtain 3D point clouds of the rape plants, from which noise points were filtered. When the resulting model was evaluated against control points and known lengths, the maximum length deviation was found to be 0.1023 cm and the RMSE 0.052599, indicating that the method reconstructs the model with good spatial uniformity and accuracy. The spatial distribution of the NDVI index was then calculated. The results show that the model is of great significance for future studies of the spatial distribution of plant nutrition, disease, and insect infestation.
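
For illustration, a minimal sketch of the final step described above: attaching NDVI values to the reconstructed point cloud, assuming each point already carries red and near-infrared reflectance back-projected from the registered multispectral images. All names are illustrative assumptions; the SFM reconstruction and noise filtering themselves are not reproduced here.

    import numpy as np

    def ndvi_point_cloud(xyz, red, nir, eps=1e-9):
        # xyz: (N, 3) point coordinates from the SFM reconstruction.
        # red, nir: (N,) reflectance per point, assumed back-projected
        # from the registered multispectral images (hypothetical input).
        ndvi = (nir - red) / (nir + red + eps)   # standard NDVI definition
        return np.column_stack([xyz, ndvi])      # (N, 4): x, y, z, NDVI

    # Hypothetical usage on a noise-filtered cloud of N points:
    # cloud = ndvi_point_cloud(xyz, red, nir)
    # cloud[:, 3] then holds the spatial NDVI distribution over the plant.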

Yang Chen (speaker)
CN
Yong He
College of Biosystems Engineering and Food Science, Zhejiang University
CN
Length (approx): 15 min
 
Effectiveness of UAV-Based Remote Sensing Techniques in Determining Lettuce Nitrogen and Water Stresses

This paper presents the results of an investigation into the effectiveness of UAV-based remote sensing data in determining lettuce nitrogen and water stresses. Multispectral images of the experimental lettuce plot at Cal Poly Pomona’s Spadra Farm were collected from a UAV. Different rows of the lettuce plot were subject to different levels of water and nitrogen application. The UAV data were used to determine various vegetation indices. Proximal sensors used for ground-truthing included a handheld spectroradiometer, a chlorophyll meter, and a water potential meter. Relationships between the aerial and proximal sensor data are shown and discussed, as is the relationship between the sensor data and plant height and leaf numbers.
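
For illustration, a minimal sketch of the kind of aerial-versus-proximal comparison described, assuming per-row means of a UAV-derived index and chlorophyll-meter (SPAD) readings; the numbers are hypothetical placeholders, not the study’s data.

    import numpy as np

    # Hypothetical per-row means: UAV-derived NDVI vs. SPAD chlorophyll readings
    ndvi_per_row = np.array([0.62, 0.71, 0.55, 0.68, 0.74])
    spad_per_row = np.array([34.1, 40.2, 29.8, 37.5, 42.0])

    # Pearson correlation between the aerial index and the proximal measurement
    r = np.corrcoef(ndvi_per_row, spad_per_row)[0, 1]
    print(f"NDVI vs. SPAD: r = {r:.2f}")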

Subodh Bhandari (speaker)
Professor
Cal Poly Pomona
Pomona, CA 91768
US
Dr. Subodh Bhandari is a professor in the Department of Aerospace Engineering at Cal Poly Pomona and the Director of its Unmanned Aerial Vehicles (UAV) Lab. His current research emphasis is on increased autonomy of UAVs, UAV-UGV collaboration, robust and intelligent control, collision and obstacle avoidance systems for UASs, and developing capabilities for the widespread use of unmanned vehicles, including precision agriculture and 3-D mapping.
Amar Raheja
Mohammad Chaichi
Robert Green
Mehdi Ansari
Antonio Espinas
Frank Pham
Length (approx): 15 min
 
Calibrated UAV Image Data for Precision Agriculture

The success of precision agriculture requires data, analytics, and automation. Growth in all three areas has been rapid over the last few years, and this is particularly true in the realm of data, with many new sensors and sensor platforms now available to provide “big data.” Fixed-wing UAVs have been viewed as a new platform for data collection that can provide flexible, inexpensive, high-resolution image data over large fields in a reasonable amount of time. For high-resolution remote-sensing images from fixed-wing UAVs to provide actionable information for precision agriculture, the data derived from them must be readily accessible and reliable. Commercial systems and most research programs do not currently provide means for (1) fully automated mosaicking and calculation of spectral indices and plant height or (2) calibration of these data. However, recent research at Texas A&M University has developed a system of combined ground-control points (GCPs) with reflectance targets and height-calibration targets that enables rapid, automated radiometric calibration for calculation of accurate spectral indices, and height calibration for calculation of accurate plant-height data. Each GCP provides an RTK-GPS position reference, multiple reflectance targets spanning the expected dynamic range in the multispectral image data acquired, and multiple platforms at known heights above ground that span most of the expected dynamic range in the plant-height data. Software has been developed to radiometrically calibrate an image mosaic automatically as follows: (a) ingest an image mosaic created for a given field, (b) automatically find the GCPs in the image data, (c) extract digital numbers (DNs) from the reflectance targets, (d) create the DN-to-reflectance calibration function, and (e) calibrate the image mosaic for reflectance. The software can also calibrate plant-height data as follows: (f) ingest a surface model based on structure from motion across the mosaic, (g) extract vertical positions of the known-height platforms on the previously located GCPs, (h) create the estimated-to-actual height calibration function, and (i) calibrate the surface model across the mosaic for height. Calibrated reflectance data have been shown to be much more accurate than uncalibrated data, with an average error of less than 3%. The calibration of plant-height data has been shown to reduce the error in surface models by 20%. This combined system of physical GCPs with reflectance and height targets, along with software for automated processing, has the potential to provide accurate reflectance-based spectral indices and plant-height data across large fields rapidly after image acquisition.
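
For illustration, a minimal sketch of steps (c)-(e) above, assuming a linear empirical-line fit from the GCP reflectance targets; the numbers, function names, and single-band treatment are assumptions for illustration and do not reproduce the authors’ software.

    import numpy as np

    def fit_dn_to_reflectance(target_dn, target_refl):
        # Least-squares linear calibration: reflectance = gain * DN + offset,
        # fitted from the reflectance targets found on the GCPs.
        gain, offset = np.polyfit(target_dn, target_refl, deg=1)
        return gain, offset

    def calibrate_band(dn_band, gain, offset):
        # Apply the calibration function to one band of the image mosaic.
        return gain * dn_band + offset

    # Hypothetical usage for a single band, with three targets spanning
    # the expected dynamic range:
    # gain, offset = fit_dn_to_reflectance(np.array([52.0, 118.0, 201.0]),
    #                                      np.array([0.05, 0.30, 0.60]))
    # refl_band = calibrate_band(dn_band, gain, offset)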

J. Alex Thomasson (speaker)
Professor and Department Head
Mississippi State University
Mississippi State, MS 39762
US

Dr. Alex Thomasson is Professor, Department Head, and the Berry Endowed Chair in Agricultural and Biological Engineering at Mississippi State University (MSU). He also serves as the Founding Director of MSU’s Agricultural Autonomy Institute, formed in 2023 to focus on autonomous machinery systems for agricultural production. Dr. Thomasson is a registered professional engineer and previously held faculty positions at Texas A&M University (TAMU) and MSU, after starting his career as a research engineer with USDA-ARS. He currently serves as adjunct professor at both TAMU and the University of Southern Queensland (Australia). He earned his Ph.D. at the University of Kentucky, his M.S. at Louisiana State University, and his B.S. at Texas Tech University, all in agricultural engineering. His expertise includes agricultural ground-based robots and unmanned aerial vehicles, precision agriculture, high-throughput phenotyping, remote and proximal sensing, and optoelectronic sensor development. He has authored or co-authored over 100 refereed journal articles and four book chapters and holds a patent in sensors for determining crop yield. Dr. Thomasson is a Fellow of ASABE (American Society of Agricultural and Biological Engineers) and the current President of CAST (Council for Agricultural Science and Technology). He also serves on a working group of the FCC’s Precision-Agriculture Connectivity Task Force. Dr. Thomasson also belongs to ISPA (International Society for Precision Agriculture), SPIE (Society for Optics and Photonics), AUVSI (Association for Unmanned Vehicle Systems International), ASEE (American Society of Engineering Education), AAAS (American Association for the Advancement of Science), and Sigma Xi (The Scientific Research Honor Society).

Chenghai Yang
Research Agricultural Engineer
USDA-ARS
College Station, TX 77845
US

Dr. Chenghai Yang is a Research Agricultural Engineer with the USDA-ARS Aerial Application Technology Research Unit in College Station, TX. His research has focused on the development and application of remote sensing technologies for precision agriculture and pest management since 1995.

William Rooney
Length (approx): 15 min
 
Snap Bean Flowering Detection from UAS Imaging Spectroscopy

Sclerotinia sclerotiorum (white mold) is a fungus that infects the flowers of snap beans and causes a reduction in the number of pods, and subsequently in yield, due to premature pod abscission. Snap bean fields typically are treated with prophylactic fungicide applications to control white mold once 10% of the plants have at least one flower. The holistic goal of this research is to develop spatially explicit white mold risk models based on inputs from remote sensing systems aboard unmanned aerial systems (UAS). The objectives of this study are to i) identify spectral signatures for the onset of flowering, toward optimal timing of the fungicide application, and ii) investigate spectral characteristics of white mold onset in snap bean for eventual inclusion in the risk models. This paper concentrates on the first objective. The study area was located at Cornell University, Geneva, NY, USA. A DJI Matrice-600 UAS, equipped with a high-spatial-resolution color (RGB) camera, a Headwall Photonics Nano-imaging spectrometer (272 bands; 400-1000 nm), and a Velodyne VLP-16 light detection and ranging (LiDAR) system, was used to collect the data. High-frequency flights were flown around the days when various portions of the snap bean fields were beginning to flower. The imaging spectroscopy data were first ortho-rectified and then mosaicked using GPS/IMU (inertial measurement unit) information from the UAS. The imagery was calibrated to reflectance using the empirical line calibration method, based on in-field black/white calibration panels. Samples of flowering and non-flowering snap bean spectra were selected from the hyperspectral imagery, followed by single-feature linear discriminant analysis to determine which ratio indices, normalized difference indices, and wavelengths were critical for discriminating between flowering and non-flowering plants. Next, the features with the highest c-index were used to train linear discriminant, logistic regression, and support vector machine models. Results showed that the linear discriminant model had the highest test accuracy: 93%, 95%, and 92% for 20, 10, and 3 features, respectively. These results are promising for eventual implementation in disease risk models.
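
For illustration, a minimal sketch of the single-feature screening step described above, assuming normalized difference indices computed for all band pairs, each scored by a one-feature linear discriminant and ranked by c-index (equivalent to ROC AUC); array names and shapes are assumptions.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.metrics import roc_auc_score

    def rank_nd_indices(spectra, labels):
        # spectra: (n_samples, n_bands) reflectance; labels: 1 = flowering.
        n_bands = spectra.shape[1]
        ranking = []
        for i in range(n_bands):
            for j in range(i + 1, n_bands):
                # Normalized difference index for the band pair (i, j)
                nd = (spectra[:, i] - spectra[:, j]) / (spectra[:, i] + spectra[:, j] + 1e-9)
                lda = LinearDiscriminantAnalysis().fit(nd.reshape(-1, 1), labels)
                auc = roc_auc_score(labels, lda.decision_function(nd.reshape(-1, 1)))
                ranking.append((auc, i, j))
        return sorted(ranking, reverse=True)   # best c-index first

    # The top-ranked features would then feed the linear discriminant,
    # logistic regression, and SVM models compared in the abstract.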

Ethan Hughes (speaker)
Research Assistant
RIT
Rochester, NY 14623
US
Sarah Pethybridge
Carl Salvaggio
Julie Kikkert
Sr. Extension Associate
Cornell Vegetable Program, Cornell Cooperative Extension, Canandaigua, NY
Canandaigua, NY 14424
US
Currently works as a Vegetable Crops Specialist for the Cornell Vegetable Program, a regional agricultural team that serves commercial vegetable producers in western NY. Focuses on large-acreage vegetable crops for the canning and freezing industry. Ph.D. and M.S. from the University of Wisconsin-Madison in Horticulture and Botany; B.S. from the University of Maryland-College Park in Horticulture.
Length (approx): 15 min
 
Pest Detection on UAV Imagery Using a Deep Convolutional Neural Network

Presently, precision agriculture uses remote sensing for the mapping of crop biophysical parameters with vegetation indices in order to detect problematic areas and then send a human specialist for a targeted field investigation. The same principle applies to the use of UAVs in precision agriculture, but at finer spatial resolutions. Vegetation mapping with UAVs requires the mosaicking of several images, which results in significant geometric and radiometric problems. Furthermore, even at such resolutions, it is still not possible to precisely identify the nature of the detected stresses. The concept proposed here aims to use UAVs for precise and automated pest detection and identification with images acquired a few meters above the crop canopy, at millimetric resolution.

The image processing is based on artificial-intelligence (deep learning) computer vision methods, trained on images collected for different crops and symptoms. The UAV image acquisition calendar is optimized using a bioclimatic model that evaluates disease risk. The spatial acquisition plan prioritizes areas with persistent moisture, where the probability of pest presence is higher; these areas could be determined using optical or SAR satellite imagery.

This approach was applied to detect diseases in a vineyard (mildew), potato beetles, and weeds (in lettuce, carrot, and onion fields). All experimental fields were located in Quebec, Canada. Results show that applying the deep learning technique to crop-canopy UAV images can reach a success rate above 90%, which demonstrates the potential of this approach. The proposed concept is thus a major innovation in the application of UAVs in agriculture. It will allow the effective control of pests by optimizing pesticide use while reducing the waste of resources and the harmful effects of chemical products.
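
For illustration, a minimal sketch of a patch-level convolutional classifier of the kind named above, written in PyTorch; the architecture, class list, and patch size are assumptions and do not reproduce the authors’ model.

    import torch
    import torch.nn as nn

    class PestPatchCNN(nn.Module):
        # Tiny CNN labelling a canopy patch, e.g. healthy / diseased / weed.
        def __init__(self, n_classes=3):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, n_classes)

        def forward(self, x):
            # x: (batch, 3, H, W) RGB patches cropped from near-canopy UAV images
            return self.classifier(self.features(x).flatten(1))

    # Hypothetical usage on a batch of 64x64 patches:
    # model = PestPatchCNN()
    # logits = model(torch.randn(8, 3, 64, 64))   # (8, 3) class scores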

Yacine Bouroubi (speaker)
Professor
University of Sherbrooke
Montreal, Quebec H1J 2A5
CA
Carl Bélec
Agriculture and Agri-Food Canada
Philippe Vigneault
Research Professional
Agriculture and Agri-Food Canada
Saint-Jean-sur-Richelieu, Québec J3B 3E6
CA
Length (approx): 15 min
 
Unmanned Aerial Systems and Remote Sensing for Cranberry Production

Wisconsin is the largest producer of cranberries in the United States, with 5.6 million barrels produced in 2017. To date, precision agriculture technologies adapted to cranberry production have been limited. The objective of this research was to assess the feasibility of using commercial remote sensing devices and unmanned aerial systems in cranberry production. Two commercially available sensors were assessed for use in cranberry production: 1) the MicaSense Red Edge and 2) the Zenmuse XT. Initial investigation assessed the cranberry beds during the growing season. Multi-spectral remote sensing and vegetative index images have previously been used to identify regions within cranberry beds where fertilizer deficiencies exist and pest damage is present. Images were collected bi-weekly during the growing season, and variations in vegetative indices were successfully detected within the beds. These could be attributed to fertilizer deficiencies or other potential issues within the bed; further ground-truthing of the data is required. Continuation of this research is currently underway to utilize the combination of the above remote sensing technologies to detect regions within cranberry beds infested by cranberry insect pests. A replicated trial was conducted by introducing sparganothis fruitworm (Sparganothis sulfureana Clemens) and fall armyworm (Spodoptera frugiperda Smith) larvae onto cranberry plants in a greenhouse setting. Multi-spectral and thermal images of the damaged cranberry plants were collected weekly. Results showed that Normalized Difference Vegetation Index (NDVI) values decreased as insect damage increased. The vegetative index values were shown to increase again as the plants grew and more biomass was present. Larval density was not sufficiently high to cause noticeable increases in plant temperature. Field-scale assessment of these technologies will be conducted during the 2018 growing season.
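
For illustration, a minimal sketch of the reported NDVI trend check: tracking the mean NDVI of an infested plot across weekly multispectral images. The band arrays and weekly structure are illustrative assumptions, not the study’s data.

    import numpy as np

    def mean_ndvi(red, nir, eps=1e-9):
        # Mean NDVI over one plot; red/nir are reflectance arrays for the plot.
        return float(((nir - red) / (nir + red + eps)).mean())

    # Hypothetical weekly band pairs for one infested greenhouse plot:
    # weekly = [(red_w1, nir_w1), (red_w2, nir_w2), (red_w3, nir_w3)]
    # series = [mean_ndvi(r, n) for r, n in weekly]
    # A dip followed by a rise in `series` would match the reported pattern:
    # NDVI falls with feeding damage, then recovers as biomass regrows.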

Brian Luck (speaker)
University of Wisconsin-Madison
Madison, WI 53706
US
Jessica Drewry
Elissa Chassen
Length (approx): 15 min
 
Yield Assessment of a 270 000 Plant Perennial Ryegrass Field Trial Using a Multispectral Aerial Imaging Platform

Current non-destructive yield assessment in forage breeding programs relies largely on visual assessment by experts, who categorize biomass on a discrete scale. Visual assessment of biomass yield has inherent pitfalls, as it can generate bias between experimental repeats and between different experts. It is also time-consuming and would be impractical on large-scale field trials. A method has been established to allow for rapid, non-destructive assessment of the biomass yield of forages using aerial multispectral imaging technologies. This method uses aerial surveillance platforms, including a 3DR Solo with a Parrot Sequoia sensor and a DJI S1000+ with a Tetracam MCA-12 sensor, to take weekly images of a field trial consisting of a global perennial ryegrass reference population of 270 000 individual plants from 1300 varieties/breeding lines. Multispectral images are processed through Pix4D to create a georeferenced ortho-mosaic image with an average ground sampling distance (GSD) below 2 cm. Fifteen ground control points (GCPs) are located across the site and georeferenced with an RTK-GNSS receiver, allowing for accurate and repeatable georeferencing of the ortho-mosaic images. Plant vegetation indices are extracted from the ortho-mosaic image as a point *.shp file for further processing in QGIS software. Plant indices are then calculated and processed for single plants, rows, or plots based on user-defined georeferenced areas in QGIS, allowing for quantitative measurements of various vegetation indices. Initial assessment of the correlations between actual biomass yield and NDVI, in spaced-plant and sward ryegrass field trials, has shown significant positive correlations, with correlation coefficients up to 0.94. This screening technique has resulted in a 2000-fold reduction in staff hours for the non-destructive estimation of the biomass yield of single plants in this perennial ryegrass field trial.
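
For illustration, a minimal sketch of the extraction-and-correlation step described above, assuming an NDVI ortho-mosaic saved as a GeoTIFF and a set of georeferenced plant coordinates; rasterio point sampling stands in for the QGIS workflow, and all file and variable names are hypothetical.

    import numpy as np
    import rasterio
    from scipy.stats import pearsonr

    def ndvi_at_plants(mosaic_path, coords):
        # Sample an NDVI ortho-mosaic at georeferenced plant locations.
        # coords: list of (x, y) pairs in the mosaic's CRS, e.g. the
        # GCP-corrected plant positions from the *.shp point file.
        with rasterio.open(mosaic_path) as src:
            return np.array([v[0] for v in src.sample(coords)])

    # Hypothetical usage against harvested biomass for the same plants:
    # ndvi = ndvi_at_plants("ndvi_mosaic.tif", plant_coords)
    # r, p = pearsonr(ndvi, biomass_kg)
    # print(f"NDVI vs. biomass: r = {r:.2f} (p = {p:.3g})")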

Pieter Badenhorst (speaker)
Senior Research Scientist
Department of Economic Development, Jobs, Transport and Resources
Hamilton, Victoria 3300
AU
Length (approx): 15 min