Technology

For Farmers, Are AI Chatbots Worth the Risk?


By David Silverberg

Aug 9, 2024

Graphic by Adam Dixon

As ChatGPT and other LLMs enter the mainstream, ag practitioners should consider the potential dangers and rewards.

When Jim Wally has a question about his 10-acre chestnut farm near Alachua, Florida, he has no intention of firing up ChatGPT or any other chatbot to find the answer. “I’m horrified at the thought of using it as a resource for knowledge,” he said. “There is already so much strong research from universities and experts we can draw on, thanks to farming being one of the more data-driven fields.”

Wally is hardly an outlier. ChatGPT’s shaky reliability has made skeptics of many of those studying the relationship between large language models (LLMs) and agricultural queries.

Researchers from the U.S., UK, Kenya, Nigeria, and Colombia recently analyzed the accuracy of professional advice provided by ChatGPT (versions 3.5 and 4.0) to farmers in Africa. They found that the chatbot gave wrong answers to a range of questions, inaccuracies that could lead to agricultural missteps and crop losses. Though their work was geographically targeted, the conclusions could apply to farmers in the U.S. and beyond.

The researchers wrote, among other conclusions, “[T]he chatbot provided inaccurate information related to planting time, seed rate, and fertilizer application rate and timing.”

In their article for Nature Food, the emerging-technologies analysts warned against the unmediated use of generative AI models in agriculture, noting that farmers might implement flawed recommendations that could lead to poorly managed practices, crop losses, and potential food crises. Instead, the researchers call for an “optimal development process for AI models in agriculture that includes thorough monitoring and testing before these models are widely implemented.”

In a series of questions posed by cassava farmers in Nigeria — Africa’s most important cassava producer — the researchers examined the chatbot’s recommended methods for cultivating the plant. In one example, ChatGPT suggested the use of herbicides but erred on the timing of the chemical application, advice that could cause significant crop damage and food loss if farmers followed it.

“The problem with our findings extends beyond the errors of the algorithm itself,” one of the lead authors, Asaf Tzachor, a research affiliate at the University of Cambridge’s Centre for the Study of Existential Risk, said in a release. “Many had forewarned us about potential errors and inaccuracies. The fundamental problem is the absence of any safeguards against the widespread use of Large Language Models, and AI more broadly, in a system as sensitive as agriculture.”

(Ambrook Research made multiple attempts to contact Tzachor without success.)

“The chatbot provided inaccurate information related to planting time, seed rate, and fertilizer application rate and timing.”

This research comes at a time when some farmers are dipping their boots into the AI chatbot space, albeit tentatively. Jim Ladlee, state program leader for Emerging & Advanced Technology at Penn State, has attended countless presentations on agriculture; he estimates that up to 40 percent of the farmers he has talked to have experimented with chatbots. “But they are doing it cautiously,” he noted, “and they are realizing how the system is a tool and you still need the human element in there.” He added that LLMs shouldn’t replace the suggestions and advice of actual farmers with experience managing crops.

Ladlee recalls asking Google’s chatbot Bard, and later its successor Gemini, for guidance on how best to can tomatoes. “They told me I should place tomatoes in an ice bath, and used an example of showing a person in an ice bath, for some reason, and it was just a silly reply to a serious question,” he said.

LLM chatbots are typically trained on data from a slew of sources, most commonly billions of pages available online, from blog posts to Wikipedia articles to news stories. LLMs are governed by parameters — millions, billions, and even trillions of them. (Think of a parameter as something that helps an LLM decide between different answer choices, an attempt to weed out flawed answers.) OpenAI’s GPT-4 reportedly has 1 trillion parameters.

The training set for a particular chatbot gives it the expertise to deliver answers on a given topic, such as agriculture. But the LLM sector can also be plagued by the garbage in, garbage out principle: If the data an LLM has ingested is biased, incomplete, or otherwise flawed, then the responses it gives can be equally unreliable or bizarre.

Farmers have to be particularly wary about using chatbots for advice related to, say, pesticide recommendations, Ladlee cautioned. “If an LLM is searching for suggested pesticides and scrubs all these websites to pick the popular item, I’m not sure that is necessarily effective,” he said. “A lot of people are trying to see LLMs as a magic wand that are going to solve everything and that is not the case.”

“A lot of people are trying to see LLMs as a magic wand that are going to solve everything and that is not the case.”

Research tackling other areas of chatbot reliability contends that these systems should complement human work rather than replace it. A May 2024 report by Marquette University computer science researchers focusing on chatbot disease classification queries found that “AI chatbots demonstrate potential for disease classification from patient complaints, but their reliability varies significantly.” The report adds, “these scores, even at their best, do not reach a level that ensures absolute reliability. One reason for that is AI chatbots, like OpenAI’s ChatGPT, are trained on a broad dataset from the open internet, which makes them susceptible to reflecting web-based biases and association.”

That conclusion echoes what Ladlee noted about the importance of human intervention for questions related to farming practices.

That said, some agricultural specialists believe LLM systems can be leveraged for less technical tasks. Crop management might not be in ChatGPT’s wheelhouse, but farmers may use it for business-related applications, said Tim Hammerich, host of the podcast The Future of Agriculture. “Look at all the compliance work a farmer has to do,” he said, “and a chatbot can help with paperwork related to, say, commodity group compliance or government documentation. That’s where I see the most immediate value.”

Wally echoed Hammerich, saying, “I can see ChatGPT being great for something like writing copy for marketing materials.”

Meanwhile, Farmers Business Network continues to fine-tune Norm, an AI chatbot that declares itself an experiment and notes in the small copy below the question box, “Not intended for real agronomic guidance.”

It can answer questions on pest and disease strategies, livestock and animal health, and general agronomic advice. Questions about soil conditions and the right time to plant certain crops are also common.

Norm’s creator, Kit Barron, told Ambrook Research last year, “Over time it’s become more of a useful tool; we’re seeing a lot more seasonally relevant, directed questions. People are treating it as an ag advisor, or another trusted consultant on their farm. Now, they’re asking about post-harvest [tactics], fertilizer regimes, or herbicide applications, and that’s been really great to see.”

“I can see ChatGPT being great for something like writing copy for marketing materials.”

Yet basic questions, such as asking Norm how often to spray herbicides on corn fields, yielded answers that were vague and cautious, which some may contend is an improvement on blatant unreliability. After all, many LLMs hedge their answers when a full response would require more context. Norm replied: “The frequency of herbicide application on your corn farm can depend on several factors, including the type of herbicide you are using, the specific weed pressures you are facing, and the growth stage of your corn.”

When asked a more research-driven question, such as how the Inflation Reduction Act is relevant to farmers, Norm offered much more concrete advice on, for example, harnessing the Act’s conservation programs: “The IRA allocates significant funding to conservation programs, which can provide financial assistance for practices that improve soil health, water quality, and biodiversity.”

In India, Farmer.CHAT launched in 2023 to deliver “data-driven insights to improve policies that support farmers,” according to its website. Also last year, Helios launched Cersi, a chatbot that generates billions of climate, economic and political signals to forecast potential supply chain risk down to the farm level for American users. The virtual assistant is part of Helios’ larger risk management platform.

Another platform, agri1, works similarly to Norm and invites image submissions. Its website notes, “agri1 utilizes images to better understand and manage pest control, soil conditions, insects, diseases, crops, and other agricultural imagery. A beta version of image detection is currently running.”

This surge of agri-focused chatbots could augur a future of tailored solutions that go beyond the general advice available on ChatGPT. Still, significantly more research is needed to determine the reliability and accuracy of these models.

Embracing new forms of technology may not come easily to older generations of farmers, but many seem open, at the very least, to understanding innovations they recognize as relevant.

As Ladlee said, “More experienced farmers have come up to me and admitted that certain technologies might not be right for them but they want to know about this stuff. There is an understanding that technology can help with efficiency, sustainability, and a way to keep the farm alive for the future.”

Author



David Silverberg

David Silverberg is a freelance journalist in Toronto who writes for BBC News, Agriculture Dive, The Toronto Star, MIT Technology Review, and many other outlets. He is also a writing coach helping freelance journalists level up their careers. Find him at DavidSilverberg.ca.
