Facebook is cracking down on misinformation about the coronavirus and COVID-19 vaccines with sweeping new rules about what can and can't be posted on its social media platforms.
The company now has an exhaustive list of more than 50 specific false claims about the coronavirus that it does not allow, ranging from saying the virus is man-made to posting that it's safer to get the disease than the vaccine. The rules also extend to Instagram, which Facebook owns.
Facebook has come under scrutiny during the pandemic for allowing conspiracy theories and anti-vaccine rhetoric to spread.
"The original idea was that Facebook was a public square where you can come in and say anything you want,” says Bhaskar Chakravorti, an economist who studies digital technology use and dean of global business at the Fletcher School at Tufts University. “Now they're realizing if they're creating a health hazard, they need to put on some constraints."
Research shows that falsehoods spread significantly faster than the truth on social media, and those age 65 and older are particularly vulnerable to misinformation. A 2019 study published in Science Advances found that older adults are seven times as likely as younger people to share fake or misleading content on Facebook. The researchers hypothesized that some older adults may not have the digital media literacy and experience to recognize untruths.
Rules tighten during pandemic but are tough to enforce
Facebook has gradually stepped up its efforts to combat harmful content related to COVID-19. Early in the pandemic, it announced a policy to promote posts with accurate coronavirus information, to put warning labels on misinformation and to push it lower in people's feeds. At that point, the platform said it would remove false information “that could lead to imminent physical harm.” (Disclosure: News and information related to the pandemic published by AARP appears in Facebook's Coronavirus (COVID-19) Information Center.)
In October, Facebook banned ads discouraging vaccines. Two months later, it began removing posts with vaccine misinformation that had been debunked by public health experts. Then, in early February, the tech giant took its strongest stance yet, expanding the list of false claims it would not allow, and threatening to ban users, groups or pages that repeatedly spread misinformation.
In an email, Facebook declined to say how many posts, pages and groups it has taken down under its newest rules, but it noted that it removed over a million pieces of content with harmful COVID-19 misinformation from Facebook and Instagram in the fourth quarter of 2020.