In the past few years, Facebook has ramped up its efforts to combat ‘fake news,’ the ever-buzzworthy term for information, typically false or taken out of context, that is meant to deliberately trick or mislead someone into holding a certain belief.
But the task of censoring the trolls and charlatans is neither simple nor easy, as there is a lot of grey area regarding what different groups consider to be the truth. Earlier this month, Mark Zuckerberg went on Kara Swisher’s Recode Decode podcast to discuss his approach to the issue.
“The principles that we have on what we remove from the service are, if it’s going to result in real harm, real physical harm, or if you’re attacking individuals, then that content shouldn’t be on the platform,” Zuckerberg said.
But then he dropped a pretty controversial statement. He went on to explain that he wouldn’t necessarily remove Holocaust denial posts from Facebook. “I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong,” he said.
Mark Zuckerberg says Holocaust deniers deserve "a voice" https://t.co/qEi0c6CC5m — New York Daily News (@NYDailyNews) July 18, 2018
Unsurprisingly, some people were offended by Zuckerberg’s comments, so he later clarified, “Our goal with fake news is not to prevent anyone from saying something untrue—but to stop fake news and misinformation spreading across our services. If something is spreading and is rated false by fact checkers, it would lose the vast majority of its distribution in News Feed. And of course, if a post crossed the line into advocating for violence or hate against a particular group, it would be removed. These issues are very challenging but I believe that often the best way to fight offensive bad speech is with good speech.”
Given that stance, a lot of people are confused as to why Facebook seems to practice minimal intervention when it comes to censoring posts, groups, and figureheads of the anti-vaccine, or “anti-vaxx,” movement. When you search Facebook for vaccine information, like simply querying the term “vaccines,” the top results show a massive amount of anti-vaxx ideology, featuring groups with names like “United Against Vaccines,” “Vaccines Injury Stories,” and “Vaccines Exposed.” The groups often claim that doctors receive negligible education about vaccines in medical school, and that we don’t know enough about vaccines’ potential negative effects.
Yet another reminder that Facebook's failure to draw a moral line will cost people lives. In this case, it's a legitimate public health issue. https://t.co/yPSaetNRIh — Melissa Ryan (@MelissaRyan) July 25, 2018
Facebook’s righteous anti-fake-news crusade is fine and dandy, but the effects of nonvaccination are well documented and deadly. Despite the hard evidence, Facebook has refused to comment on why it continues to allow anti-vaxx hoaxes to proliferate on its platform.
One thing that certainly doesn’t help the issue is popular celebrity endorsement of the anti-vaxx movement, from figures such as tattoo artist and entrepreneur Kat Von D. In an Instagram post back in June, she shared a baby bump pic with a long and emotional caption about how she knows what’s best for the health of her unborn child, and that she simply won’t listen to any critics.
4. Anti-vaccine pages thrive on Facebook https://t.co/hnNepqyXVL #axiosvitals — Nathan Ludvigson (@RxSponge) July 26, 2018
Facebook groups are particularly dangerous when it comes to spreading misinformation because members typically just fan the flames of each other’s mistrust and conspiracy thinking, blocking out dissenters and refusing to listen to any other viewpoints.
Last year, two Australian researchers published a paper on how anti-vaccination Facebook users assemble in small, clannish networks on the site. Because anti-vaccination sentiment is spread across many smaller Facebook groups, rather than clustered around one or two main hubs, it is difficult for Facebook to take much action against the groups; shut one down, and its members would just go somewhere else and set up shop again. These groups connect people who most likely wouldn’t have found such large crowds of support in the real world, which intensifies their false beliefs.
If huge, powerful companies like Facebook are unable to effectively use their resources to dissuade the anti-vaxxers, what’s the key to changing their minds? What do you think? Tell us in the comments below!