A small number of users are driving vaccine skepticism on Facebook: Report
Facebook will have to build a mechanism to distinguish between content that expresses concerns and is outright misinformation.
Facebook research has revealed that a small group of users is responsible for driving conversations that cast doubt on vaccines, according to The Washington Post.
According to the report, Facebook's research found that 10 of 638 population segments contributed 50% of all vaccine-hesitancy content. In the segment with the most vaccine hesitancy, just 111 users accounted for half of that content.
The research also found an overlap between communities that have cast doubt on vaccines and those affiliated with the conspiracy theory group QAnon.
“Public health experts have made it clear that tackling vaccine hesitancy is a top priority in the COVID response, which is why we've launched a global campaign that has already connected 2 billion people to reliable information from health experts and removed false claims about COVID and vaccines,” Facebook spokeswoman Dani Lever said. “This ongoing work will help to inform our efforts.”
The research is likely to help the social networking company create new policies, or improve existing ones, to address misinformation related to Covid-19. The company has already taken several measures against misinformation. Just last month, Facebook expanded its efforts to take down false claims about Covid-19, extending them to cover misinformation about Covid-19 vaccines and vaccines in general during the pandemic.
Among the false claims Facebook said it will remove are “Covid-19 is man-made or manufactured”, “vaccines are not effective at preventing the disease they are meant to protect against”, and “it's safer to get the disease than to get the vaccine.” The complete list is available here.
The Washington Post report points out that Facebook will have to weigh several factors if it plans to police such content further. For instance, it will have to distinguish between content that expresses concern and content that is outright misinformation.
“Vaccine conversations are nuanced, so content can't always be clearly divided into helpful and harmful,” Kang-Xing Jin, Facebook's head of health, wrote in an op-ed in the San Francisco Chronicle last week. “It's hard to draw the line on posts that contain people's personal experiences with vaccines.”