Two days after the election, Mark Zuckerberg said the following at a meeting in California:
“Personally I think the idea that fake news on Facebook, of which it’s a very small amount of the content, influenced the election in any way is a pretty crazy idea.”
For those of us who’ve spent any time on Facebook in the last 18 months and who’ve tried to engage in conversations with people whose political arguments include conspiracy theories, Zuckerberg’s comment was a record-scratch moment.
To his point, people do gravitate toward information that supports their pre-existing viewpoints, and they find such information more compelling than information that challenges their beliefs. It is likely that even without Facebook, many voters would seek out information that validates them. However, Facebook has made it significantly easier to find that information and share it. It also allows people to use the endorsement of others in their social networks as a proxy for any real fact-checking. That may be useful when it comes to buying a new washing machine, but it’s counterproductive for evaluating news. Facebook’s own data shows that fake news outperformed real news on its network during the election cycle.
Yet it was only a few years ago that Facebook was touting its ability to influence users’ emotional states by subtly manipulating the words they see in their news feeds (my take on how that research was conducted is here). That study was only one of several internal experiments Facebook ran to understand how tweaks to the news feed could influence user moods and behaviors (a nice summary here). Clearly, at some point, Facebook believed its news feed could change user behavior. Are they now saying it can’t?
Facebook is also willing to claim it can drive political behavior when it’s attempting to sell ad space. These images are screenshots from their “Success Stories” section for potential ad buyers:
If I were a Facebook ad buyer, I would feel pretty annoyed right now that the founder is saying information on the network can’t influence behavior offline. What am I spending my money on, then?
Facebook’s algorithm is a big part of the problem. As I’ve written about before, the site is set up to continually switch users back from the time-based feed to the algorithmically ranked one. That algorithm was changed in June to prioritize items shared by friends, which means that having people in your network who click on fake news items raises your chances of seeing those items in your own feed. And as I noted above, many people understandably use friend endorsements as a proxy for fact-checking. That’s a problem when friends endorse false stories.
I’ll admit I’m personally guilty of crafting my Facebook feed into an echo chamber. I do it on purpose, because I use Facebook to connect with friends, not to get news. I want to see baby pictures and vacation photos and hear about your new job. Even with my careful cultivation of my feed to emphasize those kinds of items, I still regularly see “news” stories promoted by Facebook in my feed. Many of them are fake. And this happens without any behavior on my part that would lead Facebook’s algorithm to think I have an interest in these items. What must the feeds of people who do enjoy these stories look like?
What’s chilling is that, given the realities of the incoming administration, Facebook doesn’t have much of an incentive to change the way its algorithm operates. Beyond an ideological commitment to truth and reliability, what is there to compel a change? We can only hope that Facebook employees, like the rogue group exploring solutions to the fake news problem, prevail.
Zuckerberg also said, “there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw some fake news.” I completely disagree with this statement. In fact, I think there is a lot of empathy in believing that someone’s vote was influenced by information they believed to be true. I would like to believe that most voters have spent time forming their opinions. As a psychologist, I know emotions drive much of our behavior, and I know that people look for information that validates their gut reactions. That’s human nature. I do it too. The problem is that the information available to people when they did this was false, inflammatory, and far more divisive than the truth.
Mark Zuckerberg, you can’t have it both ways. Either you’ve created a powerful social tool that can sway behaviors and emotions, or you can’t influence those things and should not be making money off a claim otherwise. Pick one.
Where does fake news come from? For an interesting, and somewhat upsetting, account of where some of it originates, I recommend this Washington Post interview with one of the authors.
What sites shouldn’t I trust? Here is a list of websites known to peddle fake news stories. Some of them, like The Onion, are deliberate satire; others are deliberately misleading; still others are a mixed bag of truth and fiction. If you read a story from one of these sites, it’s a good idea to verify the information from another source before sharing. Snopes is often a good way to check whether a story is true.