CHAMPAIGN, Ill. — Say your friends on Facebook post a cute photo of their puppy or share their exasperation at missing a flight to a vacation spot. You click “like,” but instead an “angry” reaction is posted to the puppy photo, or a “haha” to the missed flight.
University of Illinois researcher Ben Grosser has created a web browser extension he calls Go Rando that randomly chooses one of Facebook’s six reactions whenever you click “like.” His intention is not to help you confuse or alienate your friends, but to obfuscate your recorded feelings to Facebook.
“Over time you will appear to have an even, emotionally balanced set of reactions to content on Facebook,” Grosser said. “You’ll be ‘angry’ as much as you’re ‘sad.’ You’ll be ‘haha’ as much as you’re ‘wow.’”
Go Rando users may still select a specific reaction to a post. But with the extension choosing most reactions at random, it is no longer clear whether any given reaction is genuine.
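For readers curious about the mechanics, here is a minimal TypeScript sketch of the idea behind such an extension. It is an illustration only, not Grosser’s actual source code: the “[data-like-button]” selector and the sendReaction helper are hypothetical stand-ins for Facebook’s real markup and reaction-posting mechanism.

```typescript
// Hypothetical sketch of Go Rando's core idea: whenever the user clicks
// "like," post one of Facebook's six reactions chosen uniformly at random.

const REACTIONS = ["like", "love", "haha", "wow", "sad", "angry"] as const;

type Reaction = (typeof REACTIONS)[number];

function randomReaction(): Reaction {
  // A uniform choice means that, over time, the profile shows an even,
  // emotionally balanced mix of all six reactions.
  return REACTIONS[Math.floor(Math.random() * REACTIONS.length)];
}

// "[data-like-button]" is an assumed selector for illustration; Facebook's
// real markup differs and changes frequently.
document.addEventListener(
  "click",
  (event) => {
    const target = event.target as HTMLElement;
    const likeButton = target.closest("[data-like-button]");
    if (!likeButton) return;

    // Intercept the plain "like" before Facebook's own handler runs.
    event.preventDefault();
    event.stopPropagation();
    sendReaction(likeButton, randomReaction());
  },
  true // capture phase, so this runs ahead of the page's own listeners
);

// Placeholder for whatever call actually posts the chosen reaction.
function sendReaction(button: Element, reaction: Reaction): void {
  console.log(`Go Rando: posting "${reaction}" instead of "like"`, button);
}
```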
“It’s more art than function,” Grosser said. “Letting it select a reaction for me and saying, ‘Wait, that reaction isn’t very appropriate for this situation and might be misread’ … I think that’s a very interesting moment, and it forces us to think about what our reaction means.”
A professor of new media in the School of Art and Design, Grosser works on an initiative in critical technology studies at the National Center for Supercomputing Applications. He focuses on the cultural, social and political effects of software.
In creating Go Rando, he asked why Facebook provides a set of reactions for its users and what the company gains from learning more about how we feel. Grosser said the information Facebook collects – including our reactions to various posts and our most commonly selected emotion – changes what we see in our news feed, leads to more targeted advertising and enables continued emotional manipulation.
“By learning a more fine-grained understanding of our emotional reaction to any piece of content, Facebook and its advertisers can adjust what we see to keep us engaged and clicking,” Grosser said.
Other potential uses of the information concern him as well. For example, some law enforcement agencies use software that compiles data about an individual – including social media data – in order to make an instant threat assessment when an officer is heading to a call.
“If it’s a person of color with a Muslim name who expresses anger often on Facebook, it’s not inconceivable that those parameters have some bearing on the threat assessment,” Grosser said. “The difference could be profound. It could be the difference between a police officer approaching and knocking on the door versus approaching with his hand on his holster.”
His criticism is not that information is being used to help police better understand a potential threat, but that we don’t know anything about the algorithms used to develop the threat assessment – part of a larger critique Grosser has of the technology industry.
“Increasingly (social media) is how everyone gets their information. It’s in the hands of a small set of corporations whose staff is not very diverse. They all went to one of the top five computer science schools, including Illinois. They all make about the same amount of money. They all live in the same few ZIP codes. They are mostly male. The homogeneity is the problem,” Grosser said. “There are unintended effects and ideological beliefs embedded in software all over the place.”
Go Rando will be part of an exhibition, “Blinding Pleasures,” opening Feb. 10 in London at arebyte Gallery. The theme of the exhibition is control and the ways in which a “false consensus effect” results when people are presented only with opinions similar to their own. The exhibition organizers use Great Britain’s Brexit vote and the outcome of the U.S. presidential election as examples of outcomes that shocked those who were opposed to them, and whose social media feeds were filled with others who felt the same way. The exhibition will look at ways individuals can be aware of their own biases and resist efforts of marketers or those controlling social media platforms to capitalize on them.
“Obscuring your emotional reactions is a small gesture, but it is a gesture that obfuscates your profile to big data,” Grosser said.
Another web browser extension project Grosser rolled out recently is Textbook, which allows a user to remove all images from Facebook – the photo from a linked article, the selfie a friend posts, a user profile picture and the reaction icons for posts. Blank boxes and white space are left behind in the news feed, showing where images were removed.
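A content script along the following lines could produce that effect. This is a sketch of the general technique, not the Textbook source: it assumes the extension simply hides <img> elements and CSS background images, using visibility rather than removal so the empty layout boxes stay behind.

```typescript
// Hypothetical sketch of Textbook's effect: hide every image in the feed
// while preserving the blank boxes and white space where each one was.

function blankImages(root: ParentNode): void {
  // Hide <img> elements but keep their layout boxes intact.
  root.querySelectorAll("img").forEach((img) => {
    img.style.visibility = "hidden";
  });
  // Photos set as CSS backgrounds need clearing separately.
  root
    .querySelectorAll<HTMLElement>("[style*='background-image']")
    .forEach((el) => {
      el.style.backgroundImage = "none";
    });
}

blankImages(document);

// Re-apply as the infinite-scroll feed loads new posts.
new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    mutation.addedNodes.forEach((node) => {
      if (node instanceof Element) blankImages(node);
    });
  }
}).observe(document.body, { childList: true, subtree: true });
```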
Grosser wants to explore the role of images in how we read Facebook.
“Images play a certain role in what we click on and what we look at and how we feel,” he said. “I’m interested in what that role is, and in creating an experiment for myself and anyone else who wants to test it out.
“On the most sinister side of the spectrum, people are making money by creating images that entice us to behave in certain ways. On the other end of the spectrum, there’s the way we post images to get likes and encouragement from our friends,” he said, adding that because Facebook tells us what’s popular through the number of likes we get, we are encouraged to post more images that might get more likes.
Grosser offered the Textbook extension to some of his students to try out. They told him it made Facebook less appealing.
“For me, I find I’m paying more attention to what’s written,” he said. “There’s a different critical analysis of text versus images. I feel like without the images, I stop and read more.”