From ground to air to space: Tillage estimates get tech boost

Hyperspectral sensors attached to a small airplane allowed researchers to more accurately scale tillage estimates from the ground to the satellite level. Image from Alan Wilson on Flickr.

URBANA, Ill. – According to national USDA statistics, no-till and conservation tillage are on the rise, with more than three-quarters of corn and soybean farmers opting for the practices to reduce soil erosion, maintain soil structure, and save on fuel. However, these estimates are based primarily on farmer self-reporting and are compiled only once every five years, potentially limiting their accuracy.

In a new study, University of Illinois scientists demonstrate a way to accurately map tilled land in real time by integrating ground, airborne, and satellite imagery.

“We’ve shown remote sensing can quantify regional-scale tillage information in a cost-effective manner. This field-level information can be used to support growers in their management practices, as well as to support agroecosystem modeling and provide tools to the USDA to verify their census data,” says the study’s lead author, Sheng Wang, a research assistant professor in U of I’s Department of Natural Resources and Environmental Sciences (NRES) in the College of Agricultural, Consumer and Environmental Sciences (ACES). He is also a research scientist in the Agroecosystem Sustainability Center (ASC) at U of I.

Wang and the research team took photos of the ground at participating field sites throughout central Illinois, generating 6,719 GPS-tagged images. Then they arranged for an airplane equipped with high-powered hyperspectral sensors to fly over the region. The airborne system scanned 40,000 acres per hour and captured rich spectral signatures of the ground at a resolution of about half a meter.
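To illustrate how GPS-tagged ground photos might be matched to airborne pixels, here is a minimal Python sketch. This is not the study's code: the CSV of photo labels, its column names, and the GeoTIFF mosaic filename are hypothetical, and it assumes the photo coordinates are already in the mosaic's coordinate system. It uses the rasterio library for reading geo-referenced imagery.

```python
# Minimal sketch (assumed workflow, not the authors' code): pair each
# GPS-tagged ground photo with the airborne hyperspectral pixel it falls in.
import csv
import rasterio

def extract_training_pairs(label_csv, hyperspectral_tif):
    """Yield (spectrum, label) pairs for each GPS-tagged ground photo.

    label_csv is a hypothetical file with columns x, y, residue_label;
    coordinates are assumed to be in the mosaic's CRS already.
    """
    pairs = []
    with rasterio.open(hyperspectral_tif) as src:
        cube = src.read()  # shape: (bands, rows, cols)
        with open(label_csv) as f:
            for rec in csv.DictReader(f):
                # Map the photo's coordinates to a pixel in the mosaic.
                row, col = src.index(float(rec["x"]), float(rec["y"]))
                if 0 <= row < src.height and 0 <= col < src.width:
                    spectrum = cube[:, row, col]  # per-band reflectance
                    pairs.append((spectrum, int(rec["residue_label"])))
    return pairs
```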

Wang fed the ground photos into a computer that learned to differentiate bare ground from crop residue, a hallmark of no-till and conservation tillage. After training on the labeled ground images, the computer could classify residue cover in the airborne hyperspectral imagery with about 82% accuracy. The team then used those dense airborne predictions as training data to scale up again, this time from the air to space, using satellite imagery.
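Conceptually, the two-stage upscaling resembles the following Python sketch. This is a hypothetical simplification, not the published pipeline: it stands in a generic random-forest classifier from scikit-learn for the study's models, uses synthetic arrays in place of real spectra, and compresses the air-to-space aggregation into a simple majority vote per satellite footprint.

```python
# Hypothetical two-stage upscaling sketch with synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins: 6,719 labeled airborne spectra (one per ground photo)
# and a larger unlabeled set covering whole fields (200 assumed bands).
X_air_labeled = rng.random((6719, 200))
y_ground = rng.integers(0, 2, 6719)      # 1 = residue cover, 0 = bare soil
X_air_all = rng.random((100_000, 200))

# Stage 1: ground -> air. Ground-photo labels supervise an airborne classifier.
air_model = RandomForestClassifier(n_estimators=200, random_state=0)
air_model.fit(X_air_labeled, y_ground)
y_air_pred = air_model.predict(X_air_all)  # dense "pseudo ground truth"

# Stage 2: air -> space. Each satellite pixel covers many airborne pixels;
# here every 100 airborne predictions are majority-voted into one coarse label.
y_sat = (y_air_pred.reshape(-1, 100).mean(axis=1) > 0.5).astype(int)
X_sat = rng.random((len(y_sat), 10))       # e.g., 10 satellite bands
sat_model = RandomForestClassifier(n_estimators=200, random_state=0)
sat_model.fit(X_sat, y_sat)
```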

Compared to upscaling directly from the ground to the satellite, which a separate analysis in the study found to be accurate only about 22% of the time, the airborne layer increased mapping accuracy to 67%.
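Accuracy figures like these follow the standard definition for a classification map: the fraction of validation samples whose predicted class matches the ground-truth label, as in this short sketch (illustrative numbers only, not the study's data).

```python
# Mapping accuracy = fraction of validation pixels classified correctly.
import numpy as np

def mapping_accuracy(predicted, truth):
    predicted, truth = np.asarray(predicted), np.asarray(truth)
    return (predicted == truth).mean()

print(mapping_accuracy([1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
```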

“In remote sensing, we're always trying to link ground-truth data with spectral signals from satellites, but that represents a big scale mismatch. The intermediate-scale hyperspectral data helps to augment ground-truth data because it can provide both high resolution and accuracy. It’s a major innovation; nobody has done this in the agricultural world. This cross-scale technology significantly advances our capability to create ground-truth information,” says Kaiyu Guan, associate professor in NRES, founding director of the ASC, and senior author on the study.

Although the method was tested in Champaign and surrounding Illinois counties, Guan says the team is working to scale the technology to the broader Midwest and the nation. Now that the models have been trained to detect evidence of tillage from ground images, it should be possible to forgo or minimize ground photos in the next iteration.

The article, “Cross-scale sensing of field-level crop residue cover: Integrating field photos, airborne hyperspectral imaging, and satellite data,” is published in Remote Sensing of Environment [DOI: 10.1016/j.rse.2022.113366]. The research was supported by the U.S. Department of Energy ARPA-E SMARTFARM projects and a Foundation for Food & Agriculture Research (FFAR) Seeding Solutions Award. Partial funding was also provided by the National Science Foundation, the USDA-NIFA AIFARMS project, and the C3.ai Digital Transformation Institute.

