Facebook, Twitter reverse changes meant to curb vote misinformation
Twitter had made it harder to retweet others’ posts, encouraging people to add commentary before posting something. The company said it will return to one-click retweets, after seeing a 20% decrease in sharing following the change.
Facebook Inc. and Twitter Inc. reversed changes to their content policies that were implemented to stem the viral spread of misinformation about November’s U.S. presidential election, saying the temporary changes are no longer needed.
After the election, Facebook boosted news sources it considered authoritative on its social network, to make sure users were getting high-quality information on the outcome; the company says that need is no longer as urgent. “This was a temporary change we made,” the company said in a statement.
Election conspiracies pose less of a threat now, with President Donald Trump’s campaign lawsuits failing and Joe Biden’s victory confirmed by the Electoral College. But the companies are going back to their old rules just as they may face a public-health misinformation problem around the Covid-19 vaccines, which have just started being administered in the U.S. For example, Alabama’s Department of Public Health on Wednesday warned of rumors circulating on social media, specifically about a nurse dying from the shot. “Rumors and misinformation can easily circulate within communities during a crisis,” the department said in a Facebook post, urging people to “look for information from official public health and safety authorities.”
Twitter said that although it’s reverting to the old rules on retweets, “We’ll continue to focus on encouraging more thoughtful amplification,” according to a post Wednesday. “This requires multiple solutions, some of which may be more effective than others. For example, we know that prompting you to read articles leads to more informed sharing.”
Both companies have policies in place to block or label misinformation about Covid-19 and treatments for the virus. Twitter on Wednesday expanded its rules to include statements that contain harmful or misleading information about Covid-19 vaccines.
Facebook has also said it will remove false information about the vaccines that has been debunked by experts, and will let users know when they have interacted with such information. “We’re still ensuring that people see authoritative and informative news on Facebook, especially during major news cycles and around important global topics like elections, Covid-19, and climate change,” a spokesman said in the statement. Facebook’s shift back was reported earlier by the New York Times.