Your Gullible Friends Keep Falling for Facebook Hoaxes. Here's How to Stop Them.

Dec. 3, 2014, 6:30 PM


The social network could be a force for truth—if it wanted to be.

People will believe just about anything on Facebook. But it doesn't have to be that way.

Photo illustration by Lisa Larson-Walker. Photos by Kimberly White/Getty Images and Thinkstock.

Macaulay Culkin is dead. Eating whole lemons will save you from cancer. And this shocking video proves that no planes actually hit the World Trade Center on 9/11.

Will Oremus

Will Oremus is Slate's senior technology writer.

That’s if you believe what you read on Facebook—which an awful lot of people clearly do. A recent Pew survey found that 30 percent of American adults turn to the social network for news, making it perhaps the most influential media platform in the country. Far more people read Facebook every day than watch CNN or Fox News, or read the Huffington Post or the New York Times.

And yet Facebook’s news feed remains a hotbed of hoaxes, lies, and conspiracy theories. (Yes, even more so than Fox News, or whatever shrill liberal publication you’d like to hold up as its lefty counterpart.) Some, like the periodic claims that today is the date shown on the time machine in Back to the Future Part II, are relatively benign. Others are more insidious, like the posts claiming that for every person who shares a photo of a cancer-stricken baby, Facebook will donate money to the child's family to help cover medical expenses. And many are surprisingly resilient. This week, Slate debunked a bogus copyright notice that was going viral on Facebook for the third time in the past two years.


From a business perspective, media outlets have little cause to complain: Each of Slate’s short posts debunking that copyright notice has been shared tens of thousands of times, bringing the site some very easy traffic. But for a journalist—not to mention, you know, a human being and a citizen—it’s disheartening to realize that you can shout the truth over and over and over without making a discernible dent in the spread of the falsehoods. And the design of Facebook’s news feed has a lot to do with that.

True, Facebook didn’t invent the viral hoax. It just happens to be the perfect 21st-century venue for scams and urban legends that would have spread by word of mouth, tabloid, or chain letter in earlier eras. But Facebook amplifies misinformation at a speed and on a scale that exceed what was possible before. Its news feed is the product of cutting-edge software that the company has finely calibrated to maximize user engagement—that is, to prioritize the posts that catch people’s eyes and compel them to click, like, or share. One problem: Hoaxes, scams, and conspiracy theories are specifically optimized to do just that. Truth may be stranger than fiction, but on Facebook, fiction is often more viral.
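
To see why, consider a deliberately simplified sketch of an engagement-maximizing score. (The weights and field names below are my inventions, not Facebook's actual ranking code, which is far more elaborate and not public.)

    # A toy illustration of engagement-based feed ranking.
    # All weights and fields are hypothetical, not Facebook's.

    from dataclasses import dataclass

    @dataclass
    class Post:
        clicks: int
        likes: int
        shares: int
        comments: int

    def engagement_score(post: Post) -> float:
        # Shares and comments are weighted most heavily because they
        # spread a post to new audiences; those are exactly the signals
        # a shocking hoax is built to generate.
        return (1.0 * post.clicks
                + 2.0 * post.likes
                + 4.0 * post.comments
                + 8.0 * post.shares)

    # A lurid hoax that provokes sharing outranks a sober debunking
    # that readers merely skim.
    hoax = Post(clicks=5000, likes=900, shares=1200, comments=400)
    debunk = Post(clicks=3000, likes=400, shares=150, comments=80)
    print(engagement_score(hoax) > engagement_score(debunk))  # True

Under any scoring along these lines, the fabrication wins: it generates more of every signal the feed is tuned to reward.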

Here’s the thing: It doesn’t have to be that way. I’ve written before about how machine-learning algorithms could help identify false rumors on social media. Already powered by some of the most advanced machine-learning software on the planet, Facebook’s news feed could easily be a potent force for truth if the company wanted it to be—far more so than rivals like Twitter, whose strictly chronological timeline leaves it little room to intervene in what users see at any given moment. So why isn’t it?
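
As a rough illustration of the idea, here is a toy text classifier that learns to flag hoax-like language. (The training examples and labels are invented for demonstration; a real system would train on enormous volumes of labeled posts, which Facebook is uniquely positioned to collect.)

    # A minimal sketch of a hoax-language classifier.
    # Training data below is invented, not a real dataset.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    posts = [
        "SHARE THIS before Facebook deletes it!!!",
        "Eating whole lemons cures cancer, doctors hate this",
        "Pew survey: 30 percent of U.S. adults get news on Facebook",
        "Slate debunks viral Facebook copyright notice",
    ]
    labels = [1, 1, 0, 0]  # 1 = likely hoax, 0 = likely legitimate

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(posts, labels)

    # Estimated probability that a new post is a hoax.
    print(model.predict_proba(["Share now! Macaulay Culkin found dead"])[0][1])

A toy like this proves nothing on its own, but it shows the shape of the tool: a score, not a verdict, that a ranking system could consume.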

Facebook will tell you it’s because distinguishing truth from lies is none of its business. The purpose of the news feed, the company explains, is not to sift right from wrong or good from bad according to some objective standard. It’s to sift what’s interesting to each Facebook user from what isn’t—that is, to give its users what they want. And what they want, Facebook has learned, is to see what their friends, family, and acquaintances are talking about. Whether that’s a cute baby photo, a serious current event, a clever lifehack, or a 9/11 conspiracy theory is not Facebook’s concern.

“Our goal is to connect people with the content they're most interested in and not to prioritize one point of view over another,” spokeswoman Jessie Baker told me.

There’s some logic to that. Facebook would be loath to appoint itself the arbiter of the veracity of everything its users post on the site, and I doubt its users would much appreciate that either. It’s a tech company, not PolitiFact.

That said, it’s a false dichotomy to imply that Facebook must either hire legions of censors to police users' posts or throw up its hands and absolve itself of all responsibility for the content its users share. As Adrian Chen detailed in a terrific Wired story, Facebook already employs teams of traumatized contractors in the Philippines to scrub the site of pornography and criminality. In theory, they could zap certain known scams and hoaxes while they're at it. That, of course, would be a difficult and controversial job, and I wouldn’t necessarily recommend that Facebook take it on.

Fortunately, it doesn’t have to. There’s a much better and easier way that Facebook could rebalance the scales to give less weight to viral bunkum—if it cared enough to do so. It’s something Facebook already does in service of other goals, like increasing engagement. As such, it doesn’t require a fundamental shift in Facebook’s philosophy. All it requires is an expansion of how Facebook defines “quality,” and a few corresponding tweaks to the news feed’s ranking code.
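
What might such a tweak look like? Here is one hypothetical version (the function and penalty factor are my own sketch, not anything Facebook has described): fold an estimated hoax probability, like the classifier's output above, into each post's rank, so flagged posts sink rather than spread.

    # A hypothetical "quality" adjustment to feed ranking.
    # Function name and penalty weight are assumptions, not Facebook's code.

    def feed_rank(engagement: float, hoax_probability: float) -> float:
        # Demote posts in proportion to their estimated hoax probability:
        # a post flagged at 90 percent keeps only a fraction of its
        # engagement-driven rank, while a clean post is barely touched.
        HOAX_PENALTY = 5.0
        return engagement / (1.0 + HOAX_PENALTY * hoax_probability)

    print(feed_rank(10_000, 0.9))   # ~1818: heavily demoted
    print(feed_rank(10_000, 0.05))  # 8000: barely affected

Nothing is censored under a scheme like this; a likely hoax simply stops getting the algorithmic megaphone.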