Researchers at Temple University received $750,000 from the National Science Foundation (NSF) to develop a tool that warns journalists that they are about to publish polarizing content. The NSF is a federal government agency focused on supporting research and education in non-medical fields of engineering and science.
The initiative, part of NSF’s “Trust & Authenticity in Communication Systems” program, is called “America’s Fourth Estate at Risk: A System for Mapping the (local) Journalism Life Cycle to Rebuild the Nation’s News Trust.”
The focus of the project, according to a report on Campus Reform, is creating a system that alerts journalists that the content they are about to publish might have “negative unintended outcomes” such as “the triggering of uncivil, polarizing discourse, audience misinterpretation, the production of misinformation, and the perpetuation of false narratives.”
The researchers hope that the system will help journalists measure the long-term impact of their stories in ways that go beyond existing metrics such as likes, comments, and shares.
One of the researchers involved in the project, Temple University professor Eduard Dragut, said that the system will “use natural language processing algorithms along with social networking tools to mine the communities where [misinformation] may happen.”
“You can imagine that each news article is usually, or actually almost all the time, accompanied by user comments and reactions on Twitter. One goal of the project is to retrieve those and then use natural language processing tools or algorithms to mine and recommend to some users [that] this space of talking, this set of tweets, which may lead to a set of people, like a sub-community, where this article is used for wrong reasons,” he added.
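The project's actual system has not been published, but the workflow Dragut describes — collect the comments attached to an article, run a language classifier over them, and surface sub-communities of users whose reactions suggest the article is being misused — can be sketched in miniature. The following Python is purely illustrative: the keyword lexicon `UNCIVIL_TERMS`, the functions, and the threshold are all hypothetical stand-ins, not part of the Temple tool.

```python
from collections import defaultdict

# Hypothetical lexicon standing in for a real NLP incivility classifier.
UNCIVIL_TERMS = {"idiot", "liar", "traitor", "fake"}

def flag_uncivil(comment: str) -> bool:
    """Crude stand-in for an NLP model that scores a comment as uncivil."""
    return bool(set(comment.lower().split()) & UNCIVIL_TERMS)

def mine_subcommunities(comments, threshold=0.5):
    """Group comments by user and flag users whose share of uncivil
    comments exceeds `threshold` -- a toy analogue of mining the
    sub-communities where an article is "used for wrong reasons"."""
    by_user = defaultdict(list)
    for user, text in comments:
        by_user[user].append(flag_uncivil(text))
    return {user for user, flags in by_user.items()
            if sum(flags) / len(flags) > threshold}

# Example: two users reacting to the same article.
reactions = [
    ("alice", "great reporting on the budget"),
    ("bob", "this journalist is a liar"),
    ("bob", "more fake news from a traitor"),
    ("alice", "thanks for the follow-up"),
]
print(mine_subcommunities(reactions))  # -> {'bob'}
```

A production system would replace the keyword check with a trained model and the per-user grouping with social-graph clustering, but the pipeline shape — retrieve reactions, classify, aggregate into communities — is the one Dragut outlines.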
Journalists and other players in the news industry will be involved with the project, which already includes researchers from other universities including Boston University and the University of Illinois-Chicago.
“We want journalists to be part of the process, not just the mere users of the product itself,” Dragut said. “So you can imagine sort of an analytics tool that informs the journalists and editors and other people involved in this business how their products or how their creative act is used or misused in social media.”
He added that the project is attempting to “create a collaborative environment with both social media platform[s] and other organizations like Google” because of their expertise.
“We have some preliminary conversation with Bloomberg, for instance, and we will have to define exactly how they are going to help us. Google has an initiative to help local news, and we are working to create a relationship with them, and there are others,” Dragut told Campus Reform. “This product will not work unless we are successful in bringing some of these high tech companies into the game.”
Another researcher involved in the project, professor Lance Holbert, said that, for now, the project is focusing on misinformation spread in local media.
“Certainly some topics over time will become more versus less interesting, but also we’re focused here initially on local media as well, so each locality may have different topics or particular points of interest that come up in the news,” he said. “We’re trying to keep this generalizable across topics.”
Holbert noted that misinformation is not “happening in the political spectrum” alone.
“[It’s happening] in sports, it’s happening in economics,” he said. “Like a few years back, I know, an example from Starbucks where there was a sort of a campaign on Twitter [saying] that Starbucks is targeting, in the wrong way, African Americans, which was wrong.”
The NSF is expected to further fund the project if its first phase proves successful.