In April, Facebook made changes to its privacy policy and settings that opened up more sharing of users' profile information - then announced revisions on May 26 after a storm of criticism that included promotion of a "Quit Facebook Day" on May 31. Christian Sandvig is a professor of communication and of media and cinema studies at the University of Illinois, as well as a fellow at the Berkman Center for Internet & Society at Harvard University. His research focuses on the development of new communication and wireless technologies - including the study of social media. Sandvig was interviewed by News Bureau social sciences editor Craig Chamberlain.
This isn't the first time Facebook has made changes promoting more sharing of information and then faced pushback from users. Why does this keep happening?
Facebook ultimately exists by selling you, the user. It sells you to your friends and lately it's trying to sell you to other businesses around the Web. So you can see where their incentives are. They want to make as much information about you available as possible - so that they can sell it. However, this isn't necessarily consistent with the kinds of information that you or I might want to share with everyone - with potential employers, with potential romantic interests. And so you could say that this isn't really a current scandal for Facebook, it's the core conflict of their existence, in that their job is to sell your data, and you might not like that.
One of the controversial changes in April was to make more of users' profile information public, with no option to hide it. But most users don't make use of the privacy settings that are available. Why?
Most people don't change most settings. It's not just Facebook. It's that we don't have the time or interest. It may be that there are many parts of our lives that we could more thoroughly customize, but there's some tradeoff between the amount of time we have and the amount that we care. Usually it's a much better calculation for us to live with whatever settings happen to be already set up for us. In addition, Facebook's privacy controls have been quite complex. We know that when you don't know what to choose among a set of options - for privacy or for anything else - the settings the manufacturer suggests carry a lot of weight. Through its defaults, Facebook usually recommends more sharing rather than more privacy, and it's time-consuming to change every option.
Individual customization like this is ultimately regressive (that is, favoring those with more resources). In any sort of situation like this, you'll find that when you set up a choice, especially if it's technical and if it's complicated, the people who are more likely to be able to take advantage of choosing are going to be the people rich in money, time or expertise. So on Facebook, in a sense, only the rich have privacy.
Are we fooling ourselves to think that we can have the benefits of sharing information, free of charge, through a for-profit company?
Yes, maybe we are. The problem isn't really that Facebook did the settings wrong in April; the problem is a fundamental tension of their existence - in that their job is to monetize your data. So it's kind of silly to be scandalized that they're doing it - because this is what they're set up to do. If there were some sort of alternative, not-for-profit social networking site, or at least a company with different incentives, then maybe we wouldn't have this fundamental tension.
Assuming that doesn't happen, how do we maintain the advantages of social networks like Facebook, while honoring concerns about privacy?
Individual choice always seems like a good idea, but it could be that the best way to think about privacy is to have some sort of norm that we all agree on by talking about it, agreeing that a certain thing is OK and a certain other thing is not OK, and then all of the social networking sites adhere to the same norm. This could be done through legislation or regulation or industry best practices. Leaving privacy as an individual decision made through privacy settings, I think, is to the detriment of all of us. Asking everyone to ratify a group of privacy settings for every online site or piece of software they use is ultimately ridiculous - it's a way to provide false assent while avoiding a larger discussion about privacy. It is not a good use of our time for every software user to learn the larger implications of every possible privacy checkmark.
This is also important because the more things have been computerized and put online, the more it's possible to reconstruct elements of your identity that you wanted hidden, even though you may have made the most restrictive choices possible on all of the privacy settings that you could. It's the sum of all of your data that matters, not necessarily the pieces that you're releasing from one application or another. Even if Facebook behaves very well - sticking to its policy and never changing it - and even if you pick the most restrictive settings, not having a public policy conversation about privacy is still going to lead to an erosion of privacy.
But isn't the erosion of privacy something we just have to live with in the digital age?
No, I don't think so. I don't see why that should be the case. It might be difficult to have a serious conversation about this complex area, but that doesn't mean we can't have one. In Europe, they've adopted much stricter privacy regulations and taken a much more proactive approach to this issue. There's no reason why we can't look to some of what they're doing for inspiration.