Facebook has identified a group of 111 accounts that it says is responsible for sharing the majority of the anti-vaccine material on its platform, according to an internal report obtained and reported on by The Washington Post. The report doesn’t name any of the accounts in question, but the Post did go into detail on how Facebook identified them. The network grouped US users into different categories and assessed how receptive those groups were to anti-vaccine content.
The 111 accounts were responsible for the majority of the content consumed by the ten most receptive groups, which The Washington Post says account for over 50 percent of the vaccine-skeptical content on Facebook.
The social network doesn’t allow posts about vaccines that contain provably false claims, but there’s a huge gray area of posts that undermine vaccines without saying anything that can be disproven. The report also shows a significant overlap between pushing anti-vaccination content and supporting the QAnon conspiracy.
Facebook has faced criticism and pressure to take a harsher stance on anti-vaccine content shared across its platform in the wake of the coronavirus vaccine rollout. A report from Avaaz showed that health misinformation was viewed over 3.8 billion times in the past year, peaking as the coronavirus pandemic struck.
Facebook announced new measures on Monday to help people get information about the vaccine and how to obtain it. A spokesperson for the social network said the company could use the data to change its policies on vaccine content, but that it hadn’t made any official decisions yet.