What if everyday cameras (the same ones in our phones) could reveal the hidden chemistry of crops? Researchers at the University of Illinois Urbana-Champaign are making that a reality, turning low-cost RGB cameras into powerful tools that can read plant nutrients, assess crop stress, and even predict growth.
Today, farmers and manufacturers rely on expensive multispectral cameras to understand what’s happening inside their crops. These tools can cost upward of $10,000, putting them far out of reach for many operations. By contrast, most people already own an RGB camera, but RGB cameras capture only visible colors.
That’s where the Illinois team steps in. Using advanced machine learning, they’re “translating” simple photos into multispectral and hyperspectral images rich with chemical information. The impact could be transformative: instant crop diagnostics using tools anyone can afford.
“An RGB camera captures only the visible range in three bands: red, green, and blue. The pictures cannot provide any chemical information, which you often need for crop analysis. We reconstructed images from these three bands to include information from the near-infrared range, which you can use to determine chemical composition,” said Mohammed Kamruzzaman, assistant professor in the Department of Agricultural and Biological Engineering (ABE) and corresponding author on both studies. “This work has many potential applications in the agricultural industry and can significantly lower costs. While a multispectral camera costs $10,000 or more, you can get an RGB camera for a few hundred dollars,” he added.
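In machine-learning terms, this “translation” is an image-to-image mapping from three input channels to many spectral bands. The PyTorch sketch below shows the shape of the problem only; the layer sizes and depth are illustrative assumptions, not the published architectures.

```python
import torch
import torch.nn as nn

class RGBToSpectral(nn.Module):
    """Toy RGB-to-spectral mapper: 3 input bands -> n_bands output bands.

    Illustrative only; the models discussed in these studies (e.g.,
    Restormer, MST++) are far deeper transformer-based designs.
    """
    def __init__(self, n_bands: int = 31):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, n_bands, kernel_size=3, padding=1),
        )

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        # rgb: (batch, 3, H, W) -> spectral cube: (batch, n_bands, H, W)
        return self.net(rgb)

model = RGBToSpectral(n_bands=31)
cube = model(torch.rand(1, 3, 128, 128))  # -> torch.Size([1, 31, 128, 128])
```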
The sweet potato breakthrough
In their first paper, the team built Agro-HSR, the largest agricultural hyperspectral reconstruction dataset to date. It features more than 1,300 paired RGB and hyperspectral images from 790 sweet potatoes — a crop important for food, fuel, and fiber.
“Most existing image reconstruction models focus on non-biological objects like tables and chairs, which are very different from biological objects. Our goal was to create an RGB-to-hyperspectral image dataset for a biological sample and make it publicly available,” said lead author Ocean Monjur.
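For readers who want to build on the dataset, a paired loader might look like the sketch below. The directory layout, file names, and .npy format are assumptions made for illustration; the released Agro-HSR packaging may differ.

```python
from pathlib import Path

import numpy as np
import torch
from torch.utils.data import Dataset

class PairedRGBHSIDataset(Dataset):
    """Sketch of a paired RGB/hyperspectral image dataset loader."""

    def __init__(self, root: str):
        # Assumed layout: root/rgb/*.npy and root/hsi/*.npy, sorted to pair up.
        self.rgb_paths = sorted(Path(root, "rgb").glob("*.npy"))
        self.hsi_paths = sorted(Path(root, "hsi").glob("*.npy"))

    def __len__(self) -> int:
        return len(self.rgb_paths)

    def __getitem__(self, i: int):
        rgb = torch.from_numpy(np.load(self.rgb_paths[i])).float()   # (3, H, W)
        cube = torch.from_numpy(np.load(self.hsi_paths[i])).float()  # (bands, H, W)
        return rgb, cube
```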
Sweet potato quality traits like Brix (a standard measure of sugar content), moisture, and firmness normally require destructive lab testing. The reconstructed hyperspectral images, however, correlated strongly with real measurements, showing that machine learning can “see” chemical traits without cutting into a single potato.
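Validating that claim typically means correlating trait values predicted from the reconstructed spectra against the destructive lab measurements. A minimal NumPy sketch, with made-up numbers purely for illustration:

```python
import numpy as np

# Hypothetical paired measurements for a handful of sweet potato samples:
# lab-measured Brix vs. Brix predicted from reconstructed spectra.
lab_brix = np.array([8.2, 9.1, 7.5, 10.3, 8.8, 9.6])
predicted_brix = np.array([8.0, 9.4, 7.2, 10.0, 9.1, 9.3])

# Pearson correlation coefficient between prediction and ground truth.
r = np.corrcoef(lab_brix, predicted_brix)[0, 1]
print(f"Pearson r = {r:.3f}")
```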
The team also benchmarked several models, finding that Restormer and MST++ delivered the most accurate reconstructions.
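Benchmarks in hyperspectral reconstruction are commonly scored with per-pixel error metrics such as mean relative absolute error (MRAE) and root-mean-square error (RMSE). Assuming metrics along those lines (the paper’s exact criteria aren’t spelled out here), a scoring sketch looks like this:

```python
import numpy as np

def mrae(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-8) -> float:
    """Mean relative absolute error, averaged over all pixels and bands."""
    return float(np.mean(np.abs(pred - truth) / (truth + eps)))

def rmse(pred: np.ndarray, truth: np.ndarray) -> float:
    """Root-mean-square error over the full spectral cube."""
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

# Cubes shaped (bands, H, W): one reconstructed, one from the hyperspectral camera.
truth = np.random.rand(31, 64, 64)
pred = truth + 0.01 * np.random.randn(31, 64, 64)
print(mrae(pred, truth), rmse(pred, truth))
```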
“To our knowledge, this is the largest dataset for hyperspectral image reconstruction, not just for agriculture but overall. We are providing this database so anyone can use it to train or develop their own models, including models for other agricultural products,” Kamruzzaman said.

A handheld field tool for maize
In their second study, the researchers moved from the lab to the field — and from sweet potatoes to maize, one of the world’s most important crops. They developed a new reconstruction model, the Window-Adaptive Spatial-Spectral Attention Transformer, designed to estimate chlorophyll levels, a key marker of plant health.
“Our target measure is chlorophyll content, which is an indicator of plant growth. With this device you can take a picture, get the chlorophyll content, and determine the crop’s growth status,” Kamruzzaman said.
Di Song, the study’s lead author, explained why their model performs so well:
“We combined spectral and spatial attention modes to establish an adaptive window that can discern crops from soil and other elements, capturing the complexity of a field environment. Then we reconstructed 10-band images to predict chlorophyll content, and we found our results performed better than other models,” he said.
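As a rough illustration of combining the two attention modes Song describes, the toy block below re-weights spectral bands and then spatial locations. It is not the team’s Window-Adaptive Spatial-Spectral Attention Transformer, only a simplified stand-in for the idea.

```python
import torch
import torch.nn as nn

class SpectralSpatialAttention(nn.Module):
    """Toy block combining spectral (per-band) and spatial (per-pixel) attention."""

    def __init__(self, channels: int):
        super().__init__()
        # Spectral attention: weight each band using a global descriptor.
        self.spectral = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: weight each pixel, e.g. to separate crop from soil.
        self.spatial = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.spectral(x)   # re-weight spectral bands
        x = x * self.spatial(x)    # re-weight spatial locations
        return x

block = SpectralSpatialAttention(channels=10)
out = block(torch.rand(1, 10, 64, 64))  # shape preserved: (1, 10, 64, 64)
```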
The team didn’t stop at algorithm development. They built a handheld device that converts RGB images into multispectral data on the spot, with no bulky equipment needed.
“We have developed a handheld device that incorporates the model. You can use it to take an RGB image, which will be converted to a multispectral image that provides much more information,” Song said. “Next, we plan to add a prediction model, so the farmer can simply take a picture and get the chlorophyll content without having to interpret the images.”
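Put together, the device’s workflow, including the planned prediction step, could be sketched as below. The band indices, the vegetation-index proxy, and the calibration constants are all hypothetical placeholders, not the authors’ published prediction model.

```python
import numpy as np

def estimate_chlorophyll(rgb_image: np.ndarray, reconstruct) -> float:
    """Hypothetical end-to-end pipeline for a handheld device.

    `reconstruct` stands in for the trained RGB-to-multispectral model.
    """
    cube = reconstruct(rgb_image)   # (10, H, W) reconstructed multispectral cube
    red, nir = cube[4], cube[9]     # assumed red and near-infrared band indices
    ndvi = (nir - red) / (nir + red + 1e-8)
    # Map the mean vegetation index to a chlorophyll value with a placeholder
    # linear calibration (in practice, slope and intercept would be fitted).
    return float(35.0 * ndvi.mean() + 10.0)

# Toy usage with a random stand-in for the reconstruction model:
fake_model = lambda img: np.random.rand(10, *img.shape[:2])
print(estimate_chlorophyll(np.random.rand(128, 128, 3), fake_model))
```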