Sweet potato quality analysis is enhanced with hyperspectral imaging and AI

Md Toukir Ahmed, a doctoral student in agricultural and biological engineering at the University of Illinois Urbana-Champaign, takes images of a sweet potato with a hyperspectral imaging camera.

Sweet potatoes are a popular food choice for consumers worldwide because of their delicious taste and nutritional value. The red, tuberous root vegetable can be processed into chips and fries, and it has a range of industrial applications, including textiles, biodegradable polymers, and biofuels.

Sweet potato quality assessment is crucial for producers and processors because quality attributes influence texture and taste, consumer preferences, and suitability for different end uses. A new study from the University of Illinois Urbana-Champaign explores the use of hyperspectral imaging and explainable artificial intelligence (AI) to assess sweet potato attributes.

“Traditionally, quality assessment is done using laboratory analytical methods. You need different instruments to measure different attributes in the lab, and you need to wait for the results. With hyperspectral imaging, you can measure several parameters simultaneously. You can assess every potato in a batch, not just a few samples. Spectral imaging is non-invasive, fast, accurate, and cost-effective,” said Mohammed Kamruzzaman, assistant professor in the Department of Agricultural and Biological Engineering (ABE), part of the College of Agricultural, Consumer and Environmental Sciences (ACES) and The Grainger College of Engineering at Illinois.  

The study is part of a multi-state collaboration funded by the U.S. Department of Agriculture that includes researchers from Mississippi, North Carolina, Michigan, Louisiana, and Illinois. Each university addresses different aspects of the project; Kamruzzaman’s team focuses on the assessment of three chemical attributes — dry matter, firmness, and soluble sugar content (degrees Brix) — which affect the market price and whether a potato is better suited for fresh consumption or for processing.

The researchers use a visible near-infrared hyperspectral imaging camera to take images of sweet potatoes from two different angles. Analyzing the images produces spectral data, which are used to identify key wavelengths and develop color maps that display the distribution of desired attributes.
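To make the kind of analysis described above more concrete, the sketch below shows one common way spectral data like these are modeled: average the reflectance spectrum over the pixels belonging to each potato, then fit a regression linking spectra to a lab-measured attribute such as dry matter. This is an illustrative sketch, not the authors' actual pipeline; the file names, mask, and choice of partial least squares regression are assumptions for demonstration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

def mean_spectrum(hypercube, mask):
    """Average the reflectance spectrum over all pixels belonging to the potato.

    hypercube: array of shape (rows, cols, bands) from the hyperspectral camera.
    mask: boolean array of shape (rows, cols) separating potato from background.
    """
    return hypercube[mask].mean(axis=0)

# Hypothetical pre-extracted data:
# X holds one mean spectrum per sample (n_samples, n_bands),
# y holds the lab-measured dry matter content (%) for each sample.
X = np.load("spectra.npy")
y = np.load("dry_matter.npy")

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Partial least squares regression is a standard chemometric baseline for
# relating full spectra to a chemical attribute.
pls = PLSRegression(n_components=10)
pls.fit(X_train, y_train)
print("R^2 on held-out samples:", pls.score(X_test, y_test))
```

A fitted model of this kind can also be applied pixel by pixel to generate the color maps mentioned above, showing how an attribute is distributed across each potato.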

Hyperspectral imaging has become an important tool in agricultural and food processing research. However, it generates vast amounts of data that must be processed with machine learning, and those models are complex and typically act as a black box, leaving users unable to see how the results are produced.

“We combine hyperspectral imaging with explainable AI, allowing us to understand the processes behind the results. It is a way to visualize how the machine learning algorithms work, how input data are processed, and how features are connected to predict the output,” said Md Toukir Ahmed, a doctoral student in ABE and lead author of the paper.
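As an illustration of the kind of explainability described in the quote, the sketch below attributes a model's predictions to individual wavelengths using SHAP values, so the most influential spectral bands can be identified. The model type, data files, and target attribute are assumptions for demonstration and are not details taken from the study.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

# Hypothetical inputs: X holds mean reflectance spectra (n_samples, n_bands),
# y holds a lab-measured attribute such as firmness.
X = np.load("spectra.npy")
y = np.load("firmness.npy")

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# TreeExplainer assigns each prediction a contribution from every wavelength,
# turning the "black box" into per-band attributions that can be visualized.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Rank wavelengths by mean absolute contribution to find the key bands.
importance = np.abs(shap_values).mean(axis=0)
top_bands = np.argsort(importance)[::-1][:10]
print("Most influential band indices:", top_bands)
```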

“We believe this is a novel application of this method for sweet potato assessment. This pioneering work has the potential to pave the way for usage in a wide range of other agricultural and biological research fields as well.”

The results can help industry professionals and researchers understand how much different features contribute to predicting quality attributes, supporting more informed decision-making and helping ensure that higher-quality products reach consumers.

Kamruzzaman said one goal of the multi-university project is to develop a tool that processors can use to quickly and easily scan batches of sweet potatoes and determine their quality attributes. Eventually, researchers could create a mobile app consumers can use in the grocery store to check the quality of sweet potatoes at the point of purchase.

The paper, “Advancing sweetpotato quality assessment with hyperspectral imaging and explainable artificial intelligence,” is published in Computers and Electronics in Agriculture [doi.org/10.1016/j.compag.2024.108855].

This work was funded by the U.S. Department of Agriculture Agricultural Marketing Service through the Specialty Crop Multistate Program grant AM21SCMPMS1010. The contents are solely the responsibility of the authors and do not necessarily represent the official views of the USDA.

 
