I probably spend about 15 minutes each morning on Facebook before I even get out of bed. I scroll through my newsfeed, check up on what my friends are up to, follow a few interesting links, etc. I might check it once more before I head out the door. Once I am at work, I usually try to stay away until lunchtime (after all, who hasn’t logged on to Facebook for a quick check-in and then found themselves still scrolling 30 minutes later?) and then might check once more before I leave work around 5:00. If I am staying in that evening, I will probably have Facebook open on my computer until I go to bed. Even if I am watching a movie or reading, I check what’s going on in social media land every 30-45 minutes. Just a quick scroll, unless I find an interesting link to follow.
All in all, I probably spend about 2 hours each day actively scrolling through Facebook. It is important to me. Since I recently moved across the country, it is how I find out how my friends and family are doing. I also use it to find out what events are going on near me: music shows, films at my local theater, parties, campus events at my institution. Honestly, I am more likely to find out what is going on at my institution through Facebook than through the newsletters I get emailed multiple times a week.
More importantly, Facebook is where I get most of my news. I follow The Washington Post, The Guardian, Ars Technica, Jezebel, Library Journal, and The Comic Book Legal Defense Fund, among others. Through the links posted by others, I may also get stories from Gawker, The New York Times, Politico, and any number of other sites. And I'm not alone in this. A 2015 Pew Research Center study found that 63% of Facebook and Twitter users get news from those platforms.
Which makes it troubling that one of the questions submitted to Mark Zuckerberg at a recent employee Q&A was "What responsibility does Facebook have to help prevent President Trump in 2017?"
Don’t get me wrong: I don’t like Donald Trump. He’s an easy man to hate (and such an attractive target for every major news corporation!) and would be a terrible president. I don’t think it is far-fetched to say that if Trump is elected president and allowed to take office, America will enter into a severe decline—economically, politically, ethically, take your pick. But it makes me uneasy that Facebook could influence the outcome of any election.
The company has already dabbled in politics. In the 2012 election, it actively encouraged voting by showing users pictures of their friends with the “I Voted!” logo. Furthermore, according to a Mother Jones report, “in the three months prior to Election Day in 2012, Facebook increased the amount of hard news stories at the top of the feeds of 1.9 million users. According to one Facebook data scientist, that change—which users were not alerted to—measurably increased civic engagement and voter turnout.”
On one hand, hurrah for increased civic engagement. On the other, Facebook is manipulating the information received by its users—without telling them.
And this is not the first, or even the most infamous, example of Facebook manipulating its newsfeeds. In early 2012, computer engineers at Facebook joined forces with a group of Cornell scientists to conduct an experiment with Facebook newsfeeds. Using a randomly selected group of users (who were not informed that they would be part of an experiment), they manipulated newsfeeds so that some of the group saw a feed with fewer positive stories and others saw a feed with fewer negative stories. The experiment ran for a week and sought to test whether social media users can experience "emotional contagion" (the transfer of emotional states between individuals) online. The results, published in the Proceedings of the National Academy of Sciences, showed that the emotional tone of users' own posts tracked the emotional content of the stories in their newsfeeds: users shown fewer positive stories produced fewer positive posts, and vice versa. In other words, their emotional states were affected by the posts they saw in their newsfeeds.
An "Editorial Expression of Concern and Correction" added to the article after publication noted that, because Facebook is a private company, the experiment was not held to the standards of Cornell's Institutional Review Board (IRB). Instead, the researchers' actions were allowable under Facebook's Data Use Policy, i.e., the policy most users never read before hitting "I Agree."
Facebook is a private company. As such, it is legally allowed to do what it wants with its product. And it is certainly not the only information source to demonstrate a bias on certain issues. Bulwarks of the newspaper industry like the New York Times have a long tradition of endorsing presidential candidates.* However, the NY Times still carries stories about the opposition, and its endorsements are clear and out in the open. If Facebook follows the precedent it set with the 2012 election and the emotional contagion experiment, the company will not let users know if it has chosen to combat Trump's candidacy.
*In the upcoming election, the NY Times has endorsed Hillary Clinton and John Kasich in their respective party primaries.
When a platform has this much sway over the news stories its users receive (not to mention their emotional state), it needs to be regulated. Given Facebook’s user base, I think its closest comparison shouldn’t be newspapers but rather television. The FCC already regulates political broadcasting on television; why not on social media?
This is something I want to talk more about in the future (particularly what will happen when candidates start trying to use money to influence how Facebook presents their news stories, because, of course, that will happen), but for now, a note about the current rules on broadcasting and politics. According to "Political Broadcasting: Questions and Answers on the FCC Rules and Policies for Candidate and Issue Advertising," "[television] stations must provide equal amounts of time for candidates for the same office, and otherwise treat candidates for the same office in the same way" (11), and "if you sell time to one candidate for an office, you must be willing to sell an equal amount of time to the opponent" (14).
To put it simply, Donald Trump is the worst, but we shouldn’t allow our information sources to be polluted. We have the right to make up our own minds by accessing all available information.
Agree? Disagree? Think my interpretation of broadcast policies is amateurish? Let me know in the comments!