Strategic Communications and Marketing News Bureau

Evidence of humans, not ‘bots,’ key to uncovering disinformation campaigns

CHAMPAIGN, Ill. — Political disinformation campaigns on social media threaten to sway political outcomes, from U.S. elections to Hong Kong protests, yet are often hard to detect.

A new study, however, has pulled back the curtain on one type of campaign called “astroturfing,” which fakes the appearance of organic grassroots participation while being secretly orchestrated and funded.

The study suggests that the key to uncovering such accounts lies not in finding automated “bots” but in specific traces of human coordination and human behavior, says JungHwan Yang, a University of Illinois communication professor who is part of a global team of researchers on the project.

The starting point for their research was a court case in South Korea that identified more than 1,000 Twitter accounts used in an astroturf campaign to boost one candidate in the country’s 2012 presidential election. Running the campaign was South Korea’s National Intelligence Service, comparable to the U.S. Central Intelligence Agency.

This was possibly the first time such fake accounts had been firmly identified, and that was a gift to research on the topic, Yang said. “Many people are interested in this kind of research, but they don’t have the ground truth. They can’t tell which accounts are run by paid agents and which are not. In this case we had the data.”

By comparing the known astroturf accounts with those of average users, politically active Twitter users and social influencers, the researchers uncovered patterns that helped them identify other astroturf accounts, Yang said.

Their findings were published online by the journal Political Communication, and the researchers discuss their work in an article for The Washington Post’s Monkey Cage blog on political science.

In subsequent research, they are finding similar patterns in political disinformation campaigns in other countries, both past and present, including Russia’s efforts in the 2016 U.S. election and China’s in the recent Hong Kong protests.

Automated bots have gotten a lot of media coverage related to political disinformation campaigns, Yang said, but the researchers’ work in South Korea and elsewhere has found that only a small fraction of accounts appeared to be bots. That also matches with past reports from “troll farm” insiders, he said.

Bots are easy and cheap to use, can spread a lot of messages and don’t need sleep, but many people reading them can tell they are automated, similar to spam phone calls, Yang said. “If real people are behind the accounts and manage them manually, however, it’s really hard to detect whether the messages are genuine.”

By having an identified set of astroturf accounts, and analyzing data at the system level, the researchers could spot evidence of central coordination between accounts or messages. “When a person or group manages similar accounts in a very similar way for a certain goal, their behavioral activity leaves a trace,” Yang said.

Human nature leaves other traces too, he said. The organizer or “principal” in an activity usually wants things done a certain way, for instance, but the worker or “agent” seeks the path of least resistance while still meeting the organizer’s goals. Social scientists call it the “principal-agent problem.”

In an astroturf campaign on Twitter, that meant agents often would tweet or retweet the same message on multiple accounts they managed within a very short timeframe, or would frequently retweet messages from related accounts.
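
As a rough illustration of what such a trace could look like in data, the sketch below flags pairs of accounts that post identical text within a short window of one another. This is not the study’s actual method: the tweet fields, the 60-second window and the pair-counting logic are assumptions made purely for illustration.

```python
from collections import defaultdict
from datetime import timedelta

# Hypothetical illustration: flag "co-tweets" -- identical messages posted by
# different accounts within a short time window. The field names ("user",
# "text", "timestamp") and the 60-second window are assumptions, not the
# study's actual parameters.
CO_TWEET_WINDOW = timedelta(seconds=60)

def find_co_tweets(tweets):
    """Return pairs of distinct accounts that posted the same text
    within CO_TWEET_WINDOW of each other."""
    by_text = defaultdict(list)
    for t in tweets:
        by_text[t["text"]].append(t)

    suspicious_pairs = set()
    for text, group in by_text.items():
        group.sort(key=lambda t: t["timestamp"])
        for i, first in enumerate(group):
            for second in group[i + 1:]:
                if second["timestamp"] - first["timestamp"] > CO_TWEET_WINDOW:
                    break  # sorted by time, so later tweets are even farther apart
                if first["user"] != second["user"]:
                    pair = tuple(sorted((first["user"], second["user"])))
                    suspicious_pairs.add(pair)
    return suspicious_pairs
```

Pairs of accounts that turn up repeatedly in such a check could then be linked into a network, with densely connected clusters serving as candidates for coordinated accounts; the study’s own network construction is described in the paper itself.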

“The pattern we found in our data is that people try to reduce the amount of work to achieve their goal,” Yang said. “The fact is that it’s really hard to get a retweet (from an unrelated account), it’s really hard to become a social influencer on a social media platform,” so workers looked for shortcuts, and researchers could see the traces.

The study also found that despite the people and resources committed to the South Korean astroturf campaign, the results were limited. “What we found is that even though they managed more than 1,000 accounts and tried to coordinate among themselves, they failed to get the response from the public. The retweet counts and the number of mentions were more similar to average users than influencers,” Yang said.

The study’s results may be of limited value for individuals on social media trying to judge a single account’s authenticity, Yang said. One strategy is to look at the accounts that are following or retweeting a suspicious account, or examine whether those accounts are tweeting the exact same content at the same time, he said. But it’s still often difficult.
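
To make that strategy concrete, here is a small, hypothetical extension of the earlier sketch: gather recent posts from the accounts that retweet a suspicious account and check whether any of them publish identical text at nearly the same moment. The fetch_retweeters and fetch_recent_tweets helpers are placeholders for whatever data access a platform or researcher has; nothing here reflects the study’s implementation.

```python
def check_suspicious_account(account, fetch_retweeters, fetch_recent_tweets):
    """Hypothetical check: do accounts that retweet `account` also post
    identical messages at nearly the same time? Both fetch functions are
    placeholders supplied by the caller."""
    pool = []
    for user in fetch_retweeters(account):
        pool.extend(fetch_recent_tweets(user))
    # Reuse the co-tweet detector from the earlier sketch.
    return find_co_tweets(pool)
```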

The value of their research is mostly at the system level, Yang said. He and his colleagues hope that social media companies, capable of analyzing the behavior of numerous accounts, can implement algorithms that block suspected astroturf accounts and campaigns.

Editor’s notes:

To reach JungHwan Yang (pronounced JUHNG-hwahn Yang), call 217-300-7139; email junghwan@illinois.edu; Twitter @junghwanyang.

The paper on the South Korean research, “Political astroturfing on Twitter: How to coordinate a disinformation campaign,” is available online or from the News Bureau.

Other co-authors on the paper were Franziska B. Keller, Hong Kong University of Science and Technology; David Schoch, University of Manchester; and Sebastian Stier, GESIS-Leibniz Institute for the Social Sciences in Cologne, Germany.

DOI: 10.1080/10584609.2019.1661888

