Title: Unmanned Aerial Systems 2
Date: Tue Aug 2, 2016
Time: 1:00 PM - 3:00 PM
Moderator: N/A
Large-scale UAS Data Collection, Processing and Management for Field Crop Management

North Dakota State University research and Extension personnel are collaborating with Elbit Systems of America to compare the usefulness and economics of imagery collected from a large unmanned aircraft system (UAS), small UAS and satellite imagery. Project personnel are using a large UAS powered by an internal combustion engine to collect high-resolution imagery over 100,000 acres twice each month during the crop growing season. Four-band multispectral imagery is also being collected twice each month with the large UAS at 4,000’, 6,000’ and 8,000’ altitude over the 4x40 mile corridor. Researchers are using small UAS to collect imagery of selected fields within the flight corridor. Since current US Federal Aviation Administration regulations require UAS line-of-sight operation, project personnel are flying a manned chase plane, with a visual observer onboard, within visual line of sight of the UAS. Agricultural research objectives include using the various types and sources of imagery to detect selected crop diseases, nutrient deficiencies in corn, wheat and soybeans, and impacts of excess soil moisture on crop development. Collaborating farmers are sharing detailed soil analyses and field observations, in-field optical sensor data, and crop harvest yield data for selected fields. All project imagery is transferred to, and securely stored on, NDSU Center for Computationally Assisted Science and Technology (CCAST) computers. Image processing and analysis are conducted using desktop computers. All imagery collected on the project is being made available to each landowner and agricultural producer within the image collection corridor. The project is funded jointly by the North Dakota Department of Commerce and Elbit Systems of America.
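The four-band multispectral imagery described above typically supports standard vegetation indices such as NDVI for detecting crop stress. A minimal sketch of the NDVI computation (illustrative only; not the project's published workflow, and the reflectance values are invented):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near 1 indicate dense, healthy vegetation; values near 0
    indicate bare soil or a stressed, sparse canopy.
    """
    total = nir + red
    return (nir - red) / total if total else 0.0

# Illustrative reflectance pairs: healthy canopy vs. stressed canopy.
values = [ndvi(nir, red) for nir, red in [(0.50, 0.08), (0.40, 0.30)]]
```

In practice the same formula is applied per pixel across the red and near-infrared bands of each image tile.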

John Nowatzki (speaker)
Yuval Chaplin
Length (approx): 20 min
Weather Impacts on UAV Flight Availability for Agricultural Purposes in Oklahoma

This research project analyzed 21 years of historical weather data from the Oklahoma Mesonet system. The data were examined to assess the practicality of flying unmanned aircraft for various agricultural purposes in Oklahoma. Fixed-wing and rotary-wing (quadcopter, octocopter) flight parameters were determined, and their performance envelopes were verified as a function of weather conditions. The project explored Oklahoma’s Mesonet data to find days that are acceptable for flying unmanned aircraft and to determine specific time periods that meet criteria for varying degrees of accuracy. Since Oklahoma has great regional variability in weather and crops, the flight recommendations were assigned to specific Mesonet sites. The results of this work will help define optimum flight dates for specific crop issues and determine which Mesonet sites to utilize for those crops. Weather data studied included wind speed and direction, cloud cover, relative humidity, precipitation, and solar radiation. Varying levels of UAV/sensor performance based on data requirements (i.e., pretty pictures, relative NDVI, geo-rectified radiometric imagery, research-grade data) were considered when establishing flyable weather criteria.
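Screening historical records against platform-specific weather criteria, as described above, can be sketched as a simple filter. The thresholds below are illustrative assumptions, not the study's actual criteria, and the record fields are a hypothetical subset of the Mesonet variables listed:

```python
from dataclasses import dataclass

@dataclass
class WeatherRecord:
    wind_speed_ms: float   # sustained wind speed (m/s)
    precip_mm: float       # precipitation during the period (mm)
    solar_wm2: float       # incoming solar radiation (W/m^2)

def is_flyable(rec: WeatherRecord, platform: str = "quadcopter") -> bool:
    """Return True if conditions permit flight for the given platform.

    Rotary-wing platforms tolerate less wind than fixed-wing aircraft,
    so the wind limit depends on the platform type.
    """
    max_wind = 8.0 if platform == "quadcopter" else 12.0
    return (rec.wind_speed_ms <= max_wind
            and rec.precip_mm == 0.0          # no flight in precipitation
            and rec.solar_wm2 >= 400.0)       # enough light for usable imagery

# Three example observation periods: calm/clear, windy, rainy.
records = [
    WeatherRecord(5.2, 0.0, 650.0),
    WeatherRecord(11.0, 0.0, 700.0),
    WeatherRecord(4.0, 1.2, 300.0),
]
flyable = [is_flyable(r) for r in records]
```

Tightening or relaxing the thresholds per data requirement (e.g., stricter wind and solar limits for research-grade radiometric imagery) yields the varying criteria the abstract describes.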

Cooper Morris (speaker)
Paul Weckler
Key Coop
Length (approx): 20 min
Privacy Issues and the Use of UASs/Drones in Maryland

According to the Federal Aviation Administration (FAA), the lawful use of Unmanned Aerial Vehicles (UAVs), also known as Unmanned Aircraft Systems (UAS), or more commonly as drones, is currently limited to military, research, and recreational applications. Under the FAA’s view, commercial uses of drones are illegal unless approved by the Federal government. This will change in the future. Congress authorized the FAA to develop regulations for the use of drones by private parties in the U.S. by September 30, 2015 (FAA Modernization Act of 2012). The FAA missed this deadline but expects comprehensive regulations for drones to be completed by June 2016 (Jansen, 2015).

Paul Goeringer (speaker)
Ashley Ellixson
Length (approx): 20 min
Spatial-temporal Evaluation of Plant Phenotypic Traits Via Imagery Collected by Unmanned Aerial Systems (UAS)

Unmanned aerial systems (UAS) and a stereovision approach were implemented to generate a 3D reconstruction of the top of the canopy. The 3D reconstruction, or crop surface model (CSM), was utilized to evaluate biophysical parameters at both spatial and temporal scales. The main goal of the project was to evaluate small UAS technology for assisting plant height and biomass estimation. The main outcome of this process was to utilize CSMs to gain insight into the spatial-temporal dynamics of plants within the experiment. Four experiments were carried out during the 2015 corn growing season and utilized for ground-truth validation. The experiments were located at Ashland Bottoms Research Farm, Kansas State University (Manhattan, KS). Flight missions and ground-truthing were accomplished at two critical stages of biomass accumulation. At known locations, individual plant height and biomass were measured and correlated to plant height estimates at the same locations in the CSMs. The same sampled plants were then compared to a stem volume estimate obtained by combining estimated plant height from the CSM with ground-measured stalk diameter for the same plants and locations. Plant height correlation was stronger at the flowering stage than two weeks prior to flowering. Plant biomass estimation became stronger with the addition of ancillary field data. Per-plant results suggested that the CSMs could assist prediction of biophysical variables. Outcomes from this study confirmed that UAS could assist in predicting “key” traits (“on-farm rapid phenotyping”).
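The height and stem-volume estimation described above can be sketched in a few lines. This assumes height is taken as crop surface elevation minus bare-ground elevation and that the stem is approximated as a cylinder; the function names and example values are hypothetical, not the authors' implementation:

```python
import math

def plant_height(csm_elev: float, ground_elev: float) -> float:
    """Plant height (m) = crop surface elevation minus bare-ground elevation."""
    return max(csm_elev - ground_elev, 0.0)

def stem_volume(height_m: float, stalk_diam_m: float) -> float:
    """Cylinder approximation of stem volume (m^3) from CSM-derived
    height and a ground-measured stalk diameter."""
    radius = stalk_diam_m / 2.0
    return math.pi * radius ** 2 * height_m

h = plant_height(322.45, 320.15)   # e.g., a 2.30 m corn plant
v = stem_volume(h, 0.022)          # e.g., a 22 mm stalk diameter
```

Applied per plant at known sampling locations, such estimates are what get correlated against the field-measured heights and biomass.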

Sebastian Varela (speaker)
Guillermo Balboa
Vara Prasad
Ignacio Ciampitti
Length (approx): 20 min
In-season Diagnosis of Rice Nitrogen Status Using Crop Circle Active Canopy Sensor and UAV Remote Sensing

Active crop canopy sensors have been used to non-destructively estimate the nitrogen (N) nutrition index (NNI) for in-season site-specific N management. However, it is time-consuming and challenging to carry hand-held active crop sensors and walk across large paddy fields. Unmanned aerial vehicle (UAV)-based remote sensing is a promising approach to overcoming the limitations of proximal sensing. The objective of this study was to combine a UAV-based remote sensing system and the Crop Circle ACS-430 to estimate rice (Oryza sativa L.) N status for guiding topdressing N application in Northeast China. Two N rate experiments involving two different varieties were conducted in 2014 at the Jiansanjiang Experiment Station of China Agricultural University, Heilongjiang Province, Northeast China. An active canopy sensor, the Crop Circle ACS-430, with three spectral bands (red (R), red edge (RE) and near infrared (NIR)), and an octocopter UAV equipped with a Mini Multi-Camera Array (Mini-MCA) imaging system with five spectral bands (blue (B), green (G), R, RE and NIR) were used to collect reflectance data at the panicle initiation (PI) and stem elongation (SE) stages. The preliminary results indicated that Crop Circle ACS-430-based vegetation indices (VIs) explained 79-80% and 86-87% of the variability in aboveground biomass (AGB) and plant N uptake (PNU), respectively, but had a very poor relationship with plant N concentration (PNC) (R2 = 0.16-0.21) across all stages. The N sufficiency index (NSI) calculated with Crop Circle ACS-430 vegetation indices (NSI-VIs) had a better correlation with NNI than the original VIs, especially at the SE stage and across both stages, with the best R2 of 0.65 and 0.69. UAV-based remote sensing VIs could be used to estimate Crop Circle VIs and NSI-VIs very well at both growth stages. The NSI-VIs-NNI approach performed well for diagnosing rice N status.
Combining the UAV-based remote sensing system and the Crop Circle ACS-430 showed good potential for in-season diagnosis of rice N status at the PI stage, with the highest accuracy rate (90%) and kappa statistic (0.62), but did not perform well at the SE stage or across both stages. More studies are needed to further evaluate these different strategies.
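The N sufficiency index at the heart of this approach is conventionally the ratio of a sensor reading in the target plot to the same reading in a well-fertilized reference plot, with thresholds used to call N status. A minimal sketch, with illustrative thresholds and values that are assumptions rather than this study's calibration:

```python
def nsi(vi_plot: float, vi_reference: float) -> float:
    """N sufficiency index: vegetation index of the target plot relative
    to a well-fertilized (non-N-limited) reference plot."""
    return vi_plot / vi_reference

def diagnose(nsi_value: float, lo: float = 0.95, hi: float = 1.05) -> str:
    """Three-way N status call from NSI; thresholds are illustrative."""
    if nsi_value < lo:
        return "deficient"
    if nsi_value > hi:
        return "surplus"
    return "optimal"

# Example: red-edge-based VI readings for a test plot and a reference plot.
status = diagnose(nsi(0.62, 0.70))
```

The study's contribution is estimating such plot-level VIs and NSI values from UAV imagery instead of walking the paddy with the hand-held sensor.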

Junjun Lu (speaker)
Length (approx): 20 min
Retrieving Crops' Quantitative Biophysical Parameters Through a Newly Developed Multispectral Sensor for UAV Platforms

Today’s intensive agricultural production needs to increase its efficiency in order to remain profitable in a market of decreasing prices on one hand, and to reduce its environmental impact on the other. Crop growers are starting to adopt side-dress nitrogen fertilization as part of their fertilization programs, for which they need accurate information about biomass development and the nitrogen condition of the crop. This information is usually acquired through ground sampling, which misses the spatial variability and therefore forces an averaged, field-based management.

The Robin System has shown high capability for identifying spatial variability across a large range of crops and conditions. These results established the basis for developing algorithms for the retrieval of quantitative biophysical parameters. Synthetic data were used to establish empirical relationships between crops’ biophysical parameters and reflectance data. A large look-up table (LUT) was built, from which the most reliable and sensitive functions were selected for retrieving chlorophyll content and leaf area index (LAI) from the Robin Eye spectral bands.
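LUT-based retrieval generally works by matching a measured spectrum against simulated spectra and reading off the biophysical parameters of the best match. A toy sketch of the idea, with an invented three-entry table and three spectral bands (the real LUT and band set are far larger, and the selection of empirical functions described above is not reproduced here):

```python
# Hypothetical LUT rows: (LAI, chlorophyll content, simulated reflectance per band).
lut = [
    (0.5, 20.0, [0.08, 0.12, 0.30]),
    (2.0, 35.0, [0.05, 0.09, 0.42]),
    (4.0, 55.0, [0.03, 0.06, 0.55]),
]

def retrieve(measured: list) -> tuple:
    """Return the (LAI, chlorophyll) of the LUT entry whose simulated
    reflectance is closest (RMSE) to the measured spectrum."""
    def rmse(sim):
        return (sum((m - s) ** 2 for m, s in zip(measured, sim)) / len(sim)) ** 0.5
    lai, chl, _ = min(lut, key=lambda row: rmse(row[2]))
    return lai, chl

lai, chl = retrieve([0.04, 0.07, 0.52])   # spectrum of a dense canopy pixel
```

Run per pixel, this yields the parameter maps that are then validated against ground measurements.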

The main objective of this study was to validate the crop biophysical parameters retrieved, using the selected LUT functions, from two Robin Eye images acquired over a wheat crop in South Africa during the southern winter season of 2015. Between the two acquisition dates, ground sampling was performed for biomass and nitrogen content analysis. As the sampling was performed after the first image was acquired, the sampling points were defined from the first image so as to characterize the spatial variability of the field. A coefficient of determination of 0.96 was obtained for the LAI vs. biomass relationship, and 0.97 for the chlorophyll content vs. nitrogen concentration relationship. These results confirm that the combination of highly sensitive and accurate data with robust theoretical models can generate reliable and valuable information for the crop decision-making process.

Agustin Pimstein (speaker)
Length (approx): 20 min