Iowa State University researchers dive into language that humanizes AI systems

Iowa State University researchers received unexpected answers to the question of how often people humanize artificial intelligence in news writing.


ISU English professor Jo Mackiewicz and Jeanine Aune, an English teaching professor and director of the university’s advanced communication program, recently published a study on how prevalent the use of anthropomorphizing, or humanizing, language is when it comes to artificial intelligence programs.


They were surprised to find that AI was not typically described in human terms. Even the use of humanizing verbs like “learns,” when considered in context, did not always treat the technology like a person.


“It’s the exact opposite of what I was expecting,” Aune said.


Aune said the pair got the idea for this research from a conference they both attended. The discussion there included the suggestion that educators emphasize that AI is a tool that cannot replace communication principles and practices, and that they avoid anthropomorphizing the technology when they talk about it.


Iowa State University alumni were also involved in the study, including current Brigham Young University associate professor of linguistics Matthew Baker and University of Northern Colorado assistant professor of English Jordan Smith.


Heading into their research, Aune and Mackiewicz said they both assumed they would find that humanizing the technology is widespread. Prior research has shown people anthropomorphize when working with robotics, and Aune said they have read articles about how people use and perceive AI that suggest humanization is happening.


The research team used the News on the Web corpus to study language relating to AI, Mackiewicz said, as it includes the most recent data aggregated from news articles originating in 20 different countries. The database has topped 20 billion words, ranging from some of the earliest news articles about AI to recent coverage of the technology.


“When we started to do an analysis of the individual pairings, that’s where the findings became most interesting, because the overall finding was that, oh, they don’t really pair as frequently as prior research or opinion pieces would have you think,” Mackiewicz said.


The words the team was on the lookout for were “verbs that reflect cognition,” Mackiewicz said, also known as “mental verbs” — needs, learns, means, and understands are just a few examples. They also narrowed the search to references to AI or ChatGPT, as the latter was one of the first AI tools to become publicly available and widely known.
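The kind of pairing search described above can be sketched with a short script. This is an illustrative toy example only, not the researchers' actual method or the NOW corpus tooling: it simply counts occurrences of “AI” or “ChatGPT” immediately followed by an assumed list of mental verbs in a sample text.

```python
import re
from collections import Counter

# Hypothetical verb list for illustration; the study's full list may differ.
MENTAL_VERBS = {"needs", "learns", "means", "understands", "knows"}

def count_pairings(text):
    """Count subject-verb pairings like 'AI learns' or 'ChatGPT knows'."""
    counts = Counter()
    # Match 'AI' or 'ChatGPT' followed by the very next word.
    for subject, verb in re.findall(r"\b(AI|ChatGPT)\s+(\w+)", text):
        if verb.lower() in MENTAL_VERBS:
            counts[(subject, verb.lower())] += 1
    return counts

sample = ("AI learns from data. ChatGPT knows many facts. "
          "AI means different things to different people. "
          "AI generates text.")  # 'generates' is not a mental verb here
print(count_pairings(sample))
```

As the researchers note, a count like this is only a starting point: whether “AI learns” is actually anthropomorphizing still requires a human reading each hit in context.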


What they found is that these mental verbs are not often paired with the identified terms, and there was nuance to the instances where they were used together.


“AI means, or AI needs, or ChatGPT knows — you put those together, just on the surface, they seem anthropomorphizing, but they’re not necessarily,” Mackiewicz said.


Anthropomorphization “exists on a spectrum,” Mackiewicz said, where a word like “means” can carry different senses depending on how it is used. Other words, such as “learns,” have different meanings in common usage and across disciplines. Mackiewicz said “AI” and forms of “learn” were paired together frequently in the data; to some readers that phrasing suggests the technology is learning the way a human can, while to others it means something completely different.


It’s this nuance that shows why strict guidelines about which words can and cannot be used in relation to AI are flawed, she said. Instead, the researchers advise people writing about AI to know their audience and to think about how their words could be interpreted.


Mackiewicz said that in a way, this work showed the value of human beings. It’s easy for a computer to count how many times words are paired together, but it’s harder to analyze those pairings in context and determine what the language actually means.


While it would be a bigger lift than this study, the researchers said they would be interested in studying what language is used to refer to AI in other areas of writing, like social media. This would require them to create their own database and scrape social media for the necessary information, Mackiewicz said, but it could yield interesting results.


While journalists follow style guides created by the Associated Press or other organizations when deciding how to write about certain topics, and their work is edited, Mackiewicz said, social media generally doesn’t follow such rules.


“We now even had some students refer to ChatGPT with a male pronoun, not talk about it as ‘it,’” Aune said. “So I personally really advocate being intentional with the language, so talking about output text and it uptakes the prompt, you could really kind of emphasize that it is a tool that’s really helpful, but it’s not replacing our brains.”
https://iowacapitaldispatch.com/2026/01/23/iowa-state-university-researchers-dive-into-language-that-humanizes-ai-systems/