Facebook is failing to keep people safe and informed during the pandemic.
Global health misinformation spreading networks spanning at least five countries generated an estimated 3.8 billion views on Facebook in the last year.
Health misinformation spreading websites at the heart of the networks peaked at an estimated 460 million views on Facebook in April 2020, just as the global pandemic was escalating around the world.
Content from the top 10 websites spreading health misinformation had almost four times as many estimated views on Facebook as equivalent content from the websites of 10 leading health institutions, such as the World Health Organisation (WHO) and the Centers for Disease Control and Prevention (CDC).
Only 16% of all health misinformation analysed had a warning label from Facebook. Despite their content being fact-checked, the other 84% of articles and posts sampled in this report remain online without warnings.
Report reveals top ‘superspreaders’ of health misinformation on Facebook.
From RealFarmacy, one of the biggest health misinformation spreading websites, to GreenMedInfo, a website that presents health misinformation as science.
On Facebook, public pages act as one of the main engines for sharing content from websites spreading health misinformation, accounting for 43% of the total estimated views.
We identified 42 Facebook pages as key drivers of engagement for these top health misinformation spreading websites. They are followed by more than 28 million people and generated an estimated 800 million views.
There is a two-step solution to quarantine this infodemic that could reduce belief in misinformation by almost 50% and cut its reach by up to 80%.
Step 1: Correct the Record by providing all users who have seen misinformation with independently fact-checked corrections. This could decrease belief in misinformation by an average of almost 50%.
Step 2: Detox the Algorithm by downgrading misinformation posts and systematic misinformation actors in users' News Feeds, decreasing their reach by up to 80%.
Facebook has yet to effectively apply these solutions at the scale and sophistication needed to defeat this infodemic, despite repeated calls from doctors and health experts to do so.
Health misinformation is a global public health threat.1 Studies have shown that anti-vaccination communities prosper on Facebook,2 that the social media platform acts as a ‘vector’ for conspiracy beliefs that are hindering people from protecting themselves during the COVID-19 outbreak,3 and that bogus health cures thrive on the social media platform.4
Facebook itself has promised to keep people “safe and informed” about the coronavirus,5 and well before the pandemic, acknowledged that “misleading health content is particularly bad for our community”.6
Until now, however, little has been published about the type of actors pushing health misinformation widely on Facebook and the scope of their reach. This investigation is one of the first to measure the extent to which Facebook’s efforts to combat vaccine and health misinformation on its platform have been successful, both before and during its biggest test yet: the coronavirus pandemic. It finds that even the most ambitious among Facebook’s strategies are falling short of what is needed to effectively protect society.
In this report, Avaaz uncovers global health misinformation spreading networks on Facebook that reached an estimated 3.8 billion views in the last year spanning at least five countries — the United States, the UK, France, Germany, and Italy.7 Many of these networks, made up of both websites and Facebook pages, have spread vaccination and health misinformation on the social media platform for years. However, some did not appear to have had any focus on health until February 2020, when they started covering the COVID-19 pandemic.
In Section 1, we take a closer look at the global health misinformation networks, and show how 82 websites spreading health misinformation racked up views during the COVID-19 pandemic to a peak of 460 million estimated views on Facebook in April 2020. These websites had all been flagged by NewsGuard for repeatedly sharing factually inaccurate information,8 many of them before the pandemic.
We compared this to content from leading health institutions and found that during the month of April, when Facebook was pushing reliable information through the COVID-19 information centre, content from the top 10 websites spreading health misinformation reached four times as many views on Facebook as equivalent content from the websites of 10 leading health institutions,9 such as the WHO and CDC.10
This section also uncovers one of the main engines spreading health misinformation on Facebook: public pages — they account for 43% of all views to the top websites we identified spreading health misinformation on the platform.11 The top 42 Facebook pages alone generated an estimated 800 million views.
The findings in this section bring to the forefront the question of whether Facebook’s algorithm amplifies misinformation content and the pages spreading it. Health misinformation spreading networks appear to have outpaced authoritative health websites despite the platform’s declared aggressive efforts to moderate and downgrade health misinformation and boost authoritative sources. This suggests that Facebook’s moderation tactics are not keeping up with the amplification its own algorithm provides to health misinformation content and those spreading it.
In order to assess Facebook’s response to misinformation content spreading on its platform, we analysed a sample set of 174 pieces of health misinformation published by the networks uncovered in this report, and found only 16% of articles and posts analysed contained a warning label from Facebook. And despite their content being fact-checked, the other 84% of articles and posts Avaaz analysed remain online without warnings. Facebook had promised to issue “strong warning labels” for misinformation flagged by fact-checkers and other third party entities.12
In Section 2, we look at the type of content spread by the global health misinformation networks. We also examine how many of these seemingly independent websites and Facebook pages act as networks, republishing each other's content and translating it across languages to make misinformation content reach the largest possible audience. In this way, they are often able to circumvent Facebook’s fact-checking process.
Some of the most egregious health fakes identified in this report include:
An article alleging that a global explosion in polio is predominantly caused by vaccines, and that a Bill Gates-backed polio vaccination programme led to the paralysis of almost half a million children in India. This article had an estimated 3.7 million views on Facebook and was labelled ‘false’ by Facebook. However, once websites in the networks republished the article, either entirely or partially, its clones and republications reached an estimated 4.7 million views. The subsequent articles all appear on the platform without false information labels.
An article that claimed that the American Medical Association was encouraging doctors and US hospitals to overcount COVID-19 deaths had an estimated 160.5 million views — the highest number of views recorded in this investigation.
Articles containing bogus cures, such as one wrongly implying that the past use of colloidal silver to treat syphilis, tuberculosis or ebola supports its use today as a safe alternative to antibiotics. This article reached an estimated 4.5 million views.
In Section 3 we take a deeper look at some of the most high profile serial health misinformers to better understand their tactics and motives. We cover five case studies:
RealFarmacy, which had 581 million views in a year, and is on track to become one of the largest health misinformation spreading networks.
The Truth About Cancer, a family business behind a massive wave of anti-vaccination content and COVID-19 conspiracies.
GreenMedInfo, a site that misrepresents health misinformation content as academic research.
Dr. Mercola, a well-known leading figure in the anti-vaccination movement.
Erin at Health Nut News, a lifestyle influencer and megaphone for the anti-vaccination movement and other conspiracy theories.
Finally, in Section 4 we present a two-step solution to quarantine this infodemic, which research shows could reduce the belief in misinformation by almost 50% and cut its reach by up to 80%:
Correct the Record: providing all users who have seen misinformation with independently fact-checked corrections, decreasing the belief in the misinformation by an average of almost 50%; and
Detox the Algorithm: downgrading misinformation posts and systematic misinformation actors in users' News Feeds, decreasing their reach by up to 80%.
Early in the COVID-19 crisis, the World Health Organisation (WHO) warned of an “infodemic” of misinformation.13 Frontline doctors and nurses also sounded the alarm about viral misinformation on social media and the threat it posed to lives around the world.14 This section uncovers global health misinformation spreading networks operating on Facebook, and for the first time, finds that their content generated an estimated 3.8 billion views in one year — peaking as the global pandemic spread around the world.
For this report we have created what is, to the best of our knowledge and given the tools available, a robust and conservative research methodology to identify health misinformation spreading networks:15
Firstly, we collected and reviewed all websites that, according to the independent news vetting organisation NewsGuard, are untrustworthy and have repeatedly shared false content, including on health or COVID-19 related issues, and that have a significant focus on these topics. We then independently reviewed each website from the list above, and only included websites with at least one clear example of health misinformation that:
was directly published by the website;
was independently fact-checked by a credible third party fact-checker;
was shared widely.16
Secondly, we identified the top Facebook pages that are helping to drive content to these websites. We searched through public Facebook pages, identifying those that generated a high amount of interactions (at least 100,000) on posts linking to these websites.17 We then filtered further, only including those pages for which we could document three or more pieces of misinformation that had been independently fact-checked by credible third party fact-checkers.
We ended up with 82 websites and 42 Facebook pages that make up our sample set of global health misinformation spreading networks.
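The selection logic described above can be sketched as a simple two-stage filter. Everything in the sketch below (record fields, example data) is an illustrative assumption, not the report's actual tooling; only the thresholds (NewsGuard flag plus at least one documented example per website, at least 100,000 interactions and three or more fact-checked pieces per page) come from the text:

```python
# Illustrative sketch of the two-stage selection methodology.
# Records are hypothetical; thresholds are taken from the report's text.

MIN_PAGE_INTERACTIONS = 100_000     # minimum interactions on posts linking to the sites
MIN_FACT_CHECKED_PIECES = 3         # minimum independently fact-checked misinformation pieces

websites = [
    # flagged: marked untrustworthy by NewsGuard for repeatedly sharing false content
    # examples: fact-checked, widely shared health misinformation pieces documented
    {"domain": "example-health-site.com", "flagged": True, "examples": 2},
    {"domain": "another-site.net", "flagged": True, "examples": 0},
    {"domain": "ordinary-blog.org", "flagged": False, "examples": 1},
]

pages = [
    {"name": "Page A", "interactions": 250_000, "fact_checked_pieces": 5},
    {"name": "Page B", "interactions": 90_000, "fact_checked_pieces": 4},
    {"name": "Page C", "interactions": 400_000, "fact_checked_pieces": 1},
]

# Stage 1: keep NewsGuard-flagged sites with at least one documented example.
selected_sites = [w for w in websites if w["flagged"] and w["examples"] >= 1]

# Stage 2: keep pages clearing both the interaction and fact-check thresholds.
selected_pages = [
    p for p in pages
    if p["interactions"] >= MIN_PAGE_INTERACTIONS
    and p["fact_checked_pieces"] >= MIN_FACT_CHECKED_PIECES
]

print([w["domain"] for w in selected_sites])  # ['example-health-site.com']
print([p["name"] for p in selected_pages])    # ['Page A']
```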
This report looks at the total estimated views on Facebook for content from health misinformation spreading networks. The websites and Facebook pages identified are not solely spreading verifiable health misinformation, but also other types of content, including misinformation relating to other topics, posts that fail to comply with basic journalistic standards and general non-misinformation content.
There are three main reasons that we’re focusing on the total estimated views of these actors:
Misinformation is spread by an ecosystem of actors. Factually inaccurate or misleading content doesn’t spread in isolation: it’s often shared by actors who are spreading other types of content, in a bid to build followers and help make misinformation go viral.18 A recent study published in Nature showed how clusters of anti-vaccination pages managed to become highly entangled with undecided users and share a broader set of narratives to draw users into their information ecosystem.19 Similarly, research by Avaaz conducted ahead of the European Union elections in 2019 observed that many Facebook pages will share generic content to build followers, while sharing misinformation content sporadically to those users.20 Hence, understanding the estimated views garnered by this ecosystem of health misinformation spreaders during a specific period can provide a preview of their general reach across the platform.
One click away from misinformation content. When people interact with such websites and pages, those who might have been drawn in by non-misinformation content can end up being exposed to health misinformation, either through a piece of false content being pushed into their News Feed or through misinformation content highlighted on the websites they land on.21 Our methodology for measuring views depends heavily on interactions (likes, comments, shares), but many users who visit these pages and websites may see the health misinformation content without engaging with it. Furthermore, many of these websites and pages share false and misleading content at a scale that cannot be quickly detected and fact-checked. By the time a fact-check of a false article is published, much of the damage is already done, and unless the correction is shown retroactively, it may go unseen by millions of affected viewers.
All these websites are known for spreading health misinformation. NewsGuard has flagged all 82 websites as untrustworthy because they have repeatedly published false content and specifically flagged them as publishing health misinformation. All of these Facebook pages have been found to be strongly amplifying and spreading the content of these websites, while also repeatedly sharing misinformation.
Consequently, understanding the impact these networks of websites and pages have on the overall information ecosystem for Facebook users is more accurately reflected by the overall amount of views they are capable of generating. Moreover, this measurement allows us to better compare the overall views of health misinformation spreaders to the views of authoritative sources such as the WHO and the CDC.
As detailed above, at the heart of the global misinformation networks investigated in this report are 82 known health misinformation spreading websites, whose articles are amplified on Facebook via pages, groups, and individual profiles.
The table below shows the top 10 of these websites, which alone reached almost 1.5 billion estimated views over the last year, accounting for almost 40% of all the networks’ 3.8 billion estimated views.
Table 1.1: Top 10 websites by estimated views on Facebook from May 28, 2019 to May 27, 2020 (see Appendix for details on calculations).
During this investigation it became clear that Facebook pages were one of the main engines of virality for the 82 health misinformation spreading websites.
In fact, the top 1000 public Facebook pages that shared content linking to one or more of these 82 websites generated 43% of their total views on Facebook.22 Closed groups and personal profiles generated another 48%.23 The top 1000 public groups generated a further 9% of views.24
Within public Facebook pages, we then identified 42 ‘superspreader’ Facebook pages — pages that generated at least 100,000 interactions on posts linking to the 82 health misinformation spreading websites and acted as a megaphone for much of their content by repeatedly sharing and directing users to this content.25 26 These pages generated 800 million estimated views alone and constituted more than three-quarters (76%) of all the websites’ estimated views coming from Facebook pages.27
On average, these Facebook pages were created more than seven years ago, confirming that many of these actors have been active on the social media platform for some time. The pages are not exclusively directed at people looking for health advice — almost half have political or alternative ‘news’ interests. Like the health misinformation websites they drive traffic to, these pages do not exclusively contain misinformation; they also share material that fails to comply with basic journalistic standards, viral content from other pages, and non-misinformation content such as online memes.
| Rank | Facebook page | Estimated views |
| --- | --- | --- |
| 1 | Dr. Joseph Mercola en Español | 101,899,218 |
| 8 | Grow Food, Not Lawns | 28,312,274 |
| 9 | Wake Up World | 23,121,071 |
| 10 | The Truth About Cancer | 22,559,261 |
Table 1.2: Top 10 Superspreading Facebook pages and their estimated views on the 82 websites spreading health misinformation content, May 28, 2019 to May 27, 2020.
The 82 health misinformation spreading websites, combined with the 42 superspreader Facebook pages, make up the sample set of global health misinformation networks identified in this report.28 It is important to note that this sample does not represent the full scale of health misinformation on Facebook. Yet despite being a sample focused mainly on five countries, the networks identified generated content reaching more than 130 million interactions, equivalent to 3.8 billion estimated views on Facebook, between May 28, 2019 and May 27, 2020.29 This is therefore a conservative estimate of the total reach of health misinformation on the social media platform. Civil society, policymakers and researchers have urged Facebook to provide more transparency on the reach of health misinformation on the platform. Although the company has taken small steps to cooperate with researchers associated with certain programmes around election integrity, it has yet to fully and openly comply with requests for more transparency.
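The headline figures above imply a views-to-interactions ratio of roughly 29 to 1 (3.8 billion estimated views from 130 million interactions). As a minimal sketch, assuming a single linear multiplier (the report's Appendix describes the actual calculation, which this simplifies):

```python
# Hypothetical sketch: estimating Facebook views from measurable interactions.
# The report equates ~130 million interactions with ~3.8 billion estimated
# views; a single linear multiplier is an assumption made for illustration.

TOTAL_INTERACTIONS = 130_000_000
TOTAL_ESTIMATED_VIEWS = 3_800_000_000

# Implied views-per-interaction ratio from the report's headline figures.
multiplier = TOTAL_ESTIMATED_VIEWS / TOTAL_INTERACTIONS

def estimate_views(interactions: int) -> int:
    """Estimate views for a post or page from its interaction count."""
    return round(interactions * multiplier)

print(f"{multiplier:.1f}")        # 29.2
print(estimate_views(1_000_000))  # 29230769, i.e. roughly 29.2 million views
```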
During the pandemic, Facebook tried to fix its health misinformation problem by curating a COVID-19 Information Center,30 providing free adverts to the WHO, starting to show general notifications to users who engaged with harmful COVID-19 misinformation and removing content that could lead to imminent physical harm.31 These efforts are commendable initial steps and show that Facebook has started to take this issue more seriously. However, our research suggests that Facebook’s algorithm was simultaneously helping to boost content from global health misinformation spreading networks, severely undermining the platform’s proactive efforts to moderate this problem.
Facebook has so far been unwilling to disclose enough information about its algorithm for researchers to determine exactly how and why misinformation content goes viral on its platform. However, it has shared publicly that it actively intervenes in the algorithm to downrank fact-checked misinformation in users’ News Feeds (when it is detected) and has claimed that on average this downgrading leads to an 80% reduction in future impressions — demonstrating what a powerful role the algorithm can play in determining what content its users see.32
This investigation tracked the views of the 82 health misinformation spreading websites on Facebook month by month between May 28, 2019 and May 27, 2020. It found that their total estimated views peaked on the platform in April 2020, just as COVID-19 cases were spreading globally and Facebook executives were promising to act aggressively against misinformation.33
Figure 1.2: Total estimated views of the 82 health misinformation spreading websites on Facebook over one year.
We then compared this with the views generated on Facebook for the websites of 10 leading health institutes in the UK, the United States, France, Italy, and Germany, as well as with the WHO and the European Centre for Disease Prevention and Control (ECDC), over the same period. We found that on Facebook, the content from the top 10 websites spreading health misinformation generated almost four times as many views as equivalent content from the top 10 websites of leading health institutions (See list in Table 1.3).34
| Institution | Website |
| --- | --- |
| UK - COVID | https://www.gov.uk/coronavirus |
| UK - NHS | https://www.nhs.uk/ |
| US - COVID Response | https://www.coronavirus.gov/ |
| US - HHS | https://www.hhs.gov/ |
| US - CDC | https://www.cdc.gov/ |
Table 1.3: 10 leading health institutions used as the basis of the comparison with health misinformation spreading websites.
This suggests that just when citizens needed credible health information the most, and while Facebook was trying to proactively raise the profile of authoritative health institutions on the platform, its algorithm was potentially undermining these efforts and helping to boost content from health misinformation spreading websites at a staggering rate.
Figure 1.4: Comparing the total estimated views to content from the top 10 health misinformation sharing websites with equivalent content from 10 leading health institutions by country/region.
Facebook’s algorithm decides the content users see in their News Feeds and where that content is placed based on a broad range of variables and calculations, such as the amount of reactions and comments a post receives, whether the user has shown interest in the content of the group and page posting, and a set of other signals.35
This placement in the News Feed, which Facebook refers to as “ranking”, is determined by the algorithm, and provides significant amplification to a piece of content. However, as Mark Zuckerberg has highlighted:
“One of the biggest issues social networks face is that, when left unchecked, people will engage disproportionately with more sensationalist and provocative content ... At scale it can undermine the quality of public discourse and lead to polarisation.”36
As the examples in this report show, health misinformation is often sensationalist and provocative and will therefore receive significant engagement. This engagement will, in turn, be interpreted by the algorithm as a reason to further boost this content in the News Feed, creating a vicious cycle where the algorithm is consistently and artificially giving health misinformation, for example, an upper hand over authoritative health content within the information ecosystem it presents to Facebook users.
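The vicious cycle described above can be illustrated with a toy simulation. The engagement rates and the proportional impression-allocation rule below are invented for illustration; Facebook's actual ranking algorithm is undisclosed:

```python
# Toy model of an engagement-driven ranking feedback loop.
# The engagement rates and the allocation rule are assumptions for
# illustration only; they do not describe Facebook's actual algorithm.

posts = {
    # Sensationalist misinformation draws more engagement per impression
    # than sober authoritative content (the rates here are invented).
    "sensational_misinfo": {"engagement_rate": 0.10, "score": 1.0},
    "authoritative_health": {"engagement_rate": 0.04, "score": 1.0},
}

IMPRESSIONS_PER_ROUND = 10_000

for _ in range(5):
    total_score = sum(p["score"] for p in posts.values())
    for p in posts.values():
        # Impressions are allocated in proportion to the current ranking score...
        impressions = IMPRESSIONS_PER_ROUND * p["score"] / total_score
        # ...and the engagement those impressions produce feeds back into the score,
        # so the higher-engagement post compounds its advantage each round.
        p["score"] += impressions * p["engagement_rate"]

share = posts["sensational_misinfo"]["score"] / sum(p["score"] for p in posts.values())
print(f"{share:.0%}")  # the higher-engagement post captures most of the feed
```

Even in this crude model, a modest per-impression engagement advantage compounds within a few rounds into a dominant share of impressions, which is the dynamic the report argues disadvantages authoritative health content.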
Moreover, the approximately 125 million fake accounts37 that Facebook admits are still active on the platform are continuously skewing the algorithm in ways that are not representative of real users.38
Furthermore, the increased engagement with health misinformation content generated by the algorithm further increases the visibility of the pages and websites sharing the misinformation, and the number of users who follow them.
Facebook publicly declared in 2018, after the Cambridge Analytica scandal and the disinformation crisis it faced in 2016, that it had re-designed its ranking measurements for what it includes in the News Feed “so people have more opportunities to interact with the people they care about”.39 As a step in that direction, and indirectly admitting to the role of the algorithm in amplifying misinformation, Mark Zuckerberg also announced that the platform would move to moderate the algorithm by ensuring that “posts that are rated as false are demoted and lose on average 80% of their future views”.40
These were welcome announcements, but the findings of this report indicate that Facebook is still failing to prevent the amplification of misinformation and the actors spreading it.
Specifically, the findings in this section of the report strongly suggest that Facebook’s current algorithmic ranking process is being weaponised by health misinformation actors coordinating at scale to reach millions of users, that the algorithm remains biased towards the amplification of misinformation, as it was in 2018, or both. The findings also suggest that Facebook’s moderation policies to counter this problem are still not being applied effectively enough.
In the recommendation section of this report, we highlight how Facebook can begin detoxing its algorithm to prevent a disproportionate amplification of health misinformation.
In order to assess Facebook’s response to health misinformation content spreading on its platform, we analysed the sample set of 174 pieces of health misinformation our team uncovered as being spread by one or more of these websites and Facebook pages.41 Each of these pieces of content had been found to contain claims identified as false, partly false, misleading, or unsubstantiated by reputable fact-checking organisations — many of them part of Facebook’s own fact-checking initiative. The fact-checks were then verified again for this report by Health Feedback, a reputable third party fact-checker that is a Facebook partner and also a member of the WHO-led project Vaccine Safety Net.
What we uncovered was worrying. Even when taking into consideration recent progress Facebook has made in fighting health misinformation, only 16% of the 174 pieces of fact-checked health misinformation content we analysed had a warning label from Facebook.42 The other 84% remained on the platform without fact-check labels.43 Given that the sample set analysed in this report is just a fraction of the health misinformation on the platform, we cannot extrapolate from our findings that 84% of all fact-checked content on the platform remains without a label. Nonetheless, the findings in this report and Avaaz’s previous report on COVID-19 misinformation both strongly suggest that Facebook is not applying its fact-checking and labelling of health misinformation content effectively across the platform.
One way in which we found that actors were able to circumvent Facebook's false information labelling process was that websites and Facebook pages republished their misinformation content, either in full or partially, and/or translated it across languages. In this way, content that was originally flagged as false by Facebook was able to subsequently avoid warning labels.44 These findings point to a gap in Facebook’s ability to detect clones and variations of fact-checked content — especially across multiple languages — and to apply warning labels to them.
Throughout the timeframe of our research, this content remained on the platform without a warning label, despite the company’s promise to issue “strong warning labels” for misinformation flagged by fact-checkers and other third party entities.45
This section highlights some of the most egregious examples of health misinformation we uncovered on Facebook, and the tactics used by the global health misinformation spreading networks to make it go viral on the platform during the COVID-19 pandemic.46 The examples below targeted Bill Gates, undermined health institutions just when citizens needed them the most, sparked fear and distrust in vaccination programmes and promoted bogus cures for deadly diseases. They were all viewed on Facebook at a time when the social media platform was promising to tackle health misinformation and keep people safe and informed.47
Conspiracy theories involving Bill Gates and COVID-19 were mentioned 1.2 million times on TV and social media from February to April, according to Zignal Labs.48 Below, we look at just one article, “Gates’ Globalist Vaccine Agenda: A Win-Win for Pharma and Mandatory Vaccination,”49 posted on Facebook on April 9, 2020 by the vaccine sceptic organisation Children’s Health Defense.50
This original article, which not only shares verified misinformation about Bill Gates but also spreads falsehoods about the polio vaccine program deployed throughout the world, reached over 3.7 million estimated views (125,284 interactions) on Facebook to date.51 But it was also republished (either in its entirety or partially), quoted from or linked to by 29 of our sample of 82 health misinformation spreading websites in the ten weeks after publication. Those actors who amplified the content by cloning and/or republishing it reached over 4.7 million views in six different languages. In addition, 15 out of the 42 ‘superspreader’ Facebook pages shared, linked or quoted a version of this specific health misinformation example.
The article contains nine false claims about Bill Gates that are debunked in a fact-check by the German organisation Correctiv.52
First Claim: In 2017, the WHO admitted that the global explosion in polio is predominantly caused by vaccines. False: there is no global polio explosion or pandemic due to vaccines. The oral polio vaccine rarely causes illness, and in fact, its implementation has led to polio case numbers falling by more than 99%, according to the WHO, from an estimated 350,000 cases in 1988 to 175 cases in 2019.53
Second Claim: Polio vaccines paralysed 490,000 children in India. False: the fact-check debunks the claim that almost half a million children were paralysed due to polio vaccinations.54
Third Claim: India asked “Gates and his vaccine policies” to leave the country over polio vaccine damage. False: the Indian Ministry of Health has made it clear in its press release that this is not true and media reports are misleading.55
Fourth Claim: Bill Gates vaccinated 23,000 girls in India against HPV. 1,200 had severe side effects, seven died. Unsupported: there is no evidence of these deaths being connected to any vaccinations.56
Fifth Claim: The case regarding HPV vaccinations will be heard by the Indian Supreme Court. Mainly true but missing context: there was a case, but not against the Gates Foundation, and the petition involved claims of inadequate informed consent and that the Indian Council of Medical Research guidelines for biomedical research were not adhered to.57
Sixth Claim: 151 children died in Africa from a malaria vaccine funded by Gates. Out of 5,049 children, 1,048 had severe side effects such as paralysis, seizures and febrile convulsions. Largely false: 105 children died, but not as a result of the vaccine. Thirteen children had side effects due to the vaccination.58
Seventh Claim: During the “MenAfriVac” campaign between 50 and 500 vaccinated against meningitis suffered from paralysis. Unsubstantiated: the claim is based on a local news report that 40 vaccinated children were paralysed, but an investigation found that this was unconnected to vaccinations and the children’s conditions were not serious.59
Eighth Claim: Bill Gates wants to reduce the world population through vaccinations, therefore a "sterility formula" is integrated into vaccines. False: there is no evidence for any of these claims.60
Ninth Claim: The DTP vaccine kills more children in Africa than the diseases it protects against. False: studies from the WHO have proven that there are reduced mortality rates in vaccinated children, and the studies showed no negative effect of the DTP vaccine.61
While the original article is currently labelled by Facebook as false information, none of the other articles sharing it, either partially or fully, were labelled as such at the time of publication of this report. It appears that other websites that reposted the article or shared parts of it were able to avoid Facebook’s fact-checking process. This demonstrates that Facebook needs more sophisticated systems that can comprehensively catch falsehoods republished across its platform.
Image 2.1: A sample of health misinformation spreading websites that shared “Gates’ Globalist Vaccine Agenda: A Win-Win for Pharma and Mandatory Vaccination” either fully or partially.
Apart from conspiracy theories, some of the most viral false or misleading posts documented in this report undermined health systems, hospitals, and protective guidelines during the COVID-19 pandemic.
This first example is an article with the headline: “U.S. Hospitals Getting Paid More to Label Cause of Death as ‘Coronavirus’”.62 The article features a video of Fox News host Laura Ingraham’s interview with Minnesota physician and Republican state senator Dr Scott Jensen, and mischaracterises his comments as claiming that “the American Medical Association is ‘encouraging’ doctors to overcount coronavirus deaths across the country.” Dr Jensen never mentions the AMA in the interview.63 According to a reporter from FactCheck.org, “Jensen said that he did not think that hospitals were intentionally misclassifying cases for financial reasons ... But that’s how his comments have been widely interpreted and paraded on social media”.64
The article was published by GlobalResearch.ca65, a site that NewsGuard says “has published false content, conspiracy theories, and pro-Russia propaganda”66 and posted on the Global Research Facebook page as well as others. The article has cumulatively received more than 160.5 million estimated views on Facebook — the highest number of views of any post in this investigation.
Lead Stories, which is part of Facebook’s third-party fact-checking programme67, fact-checked this article and concluded: “Are U.S. doctors encouraged by hospitals and the American Medical Association to list COVID-19 on a death certificate when the novel coronavirus was not the cause of death? No, that's not true: While hospitals may get additional Medicare or Medicaid reimbursements for COVID-19 patients, there is no evidence this has resulted in doctors being incentivised to lie on a death certificate.”68
It is notable that ‘kernel of truth’ posts like this are often used by misinformers to draw people to misinformation content, interspersing false claims with verified information, to make the post seem more credible.69 This type of false and misleading content represents a trend Avaaz researchers are observing during this pandemic, and requires more sophisticated efforts by fact-checkers and Facebook to ensure well-designed misinformation content is detected and effectively debunked.
The second example is a post published on Collective Evolution’s Facebook page.70 Collective Evolution is a site that promotes conspiracy theories, such as government cover-ups and reported extraterrestrial encounters.71 Notably, it has promoted theories that the 9/11 terrorist attacks may have been an inside job72 (widely debunked73) and that the CDC admitted it has no evidence vaccines don’t cause autism74 (also debunked75).
The post shares an article with the headline “Two Emergency Medicine Doctors On Why Quarantine ‘Just Doesn’t Make Sense,’”76 containing multiple false claims to support the argument that rather than protecting people during a pandemic, quarantine actually harms public health. The article has reached more than 2.4 million (2,468,396) estimated views on Facebook and was also posted by Humans Are Free77 and Vaccine Impact,78 two websites included in the 82 global health misinformation spreading networks on Facebook identified in this investigation.
The article reports that the doctors in the video claimed that sheltering in place does “not make any sense if one wants to protect themselves from COVID” and that it “may play a part in prolonging the disease instead of killing it”. The article also claims the doctors said the COVID-19 infection mortality rate is in “the same ballpark as seasonal influenza”. However, these claims have been debunked by Health Feedback, whose fact-check cites a study from Imperial College London estimating that lockdown policies averted more than 3 million deaths across 11 European countries.79 The video the article was based on has been removed from YouTube for violating its policies.80
Independent fact-checker Health Feedback, which is a Facebook fact-checking partner, also found that “[s]heltering in place would not limit microbial exposure to the extent of causing a weaker immune system as claimed,”81 and that the “infection fatality rate, which includes both confirmed and undetected cases, for COVID-19 has been estimated to be about 0.66%, based on a research study published in The Lancet Infectious Diseases, which is about 20 times Erickson’s estimate”. The fact-check adds that the doctors’ analysis “greatly underestimates the infection mortality rate of COVID-19”.82
The coronavirus pandemic spread throughout the world at the same time as the global rollout of 5G, amplifying conspiracy theories about the dangers of these wireless networks.83 Many health misinformation spreading networks identified in this report shared articles on Facebook falsely linking 5G to COVID-19, including an article in French that claims that 5G increases the pathogenicity of the virus,84 and an article in German claiming that a study shows a direct connection between 5G and coronavirus.85
One article with the headline “5G – The Global Human Experiment Without Consent”86 was shared by Waking Times, a website that publishes health misinformation, false or unsubstantiated claims related to vaccine safety and COVID-19, and conspiracy theories relating to government surveillance.87 It was viewed an estimated 13.4 million times (13,407,707) and was reposted or linked to from six websites in the health misinformation spreading networks. It includes a number of false claims, including that “thousands of independent studies indicate adverse health impacts from wireless radiation” ranging “from cancer and sterility to DNA damage”.
However, as a fact-check by Health Feedback found, studies have not produced any compelling evidence that cell phone radiation causes cancer.88 Health Feedback also found 5G is non-ionising radiation,89 and therefore, is not able to damage DNA as the original article claims, and that there is no evidence demonstrating a credible threat to human health by 5G.90
Avaaz found that several of the websites and Facebook pages within these global health misinformation spreading networks shared content about bogus cures and unsubstantiated health treatments for deadly diseases.
One such post shares an article by RealFarmacy.com (explored in more detail in Section 3 of this report) with the headline: “Colloidal Silver: Erased From Textbooks Because it Treated Illnesses from Syphilis to Tuberculosis to Ebola.”92 The article has reached more than 4.5 million estimated views on Facebook, despite Health Feedback labelling the claim that colloidal silver is effective for any infection as inaccurate, and concluding that "colloidal silver does not fight infection inside the body or boost the immune system".93 According to the U.S. National Institutes of Health, consuming colloidal silver or using it as a nasal spray can also cause health problems and serious side effects.94
Despite Facebook providing a false information overlay on the article, there is no available information on when this measure was applied, nor on how many of the viewers who had already interacted with this false content also saw the fact-check. We also found that RealFarmacy.com received compensation for recommended products and/or generated income through online advertising.95
An article from the German language website unsere-natur.net, which regularly promotes false medical claims,96 alleges that alkaline water can kill cancer cells.97 The article claims that Nobel Prize winner Dr Otto Warburg discovered that up to 95% of all cancers are caused by an acidic environment and that he proved that cancer cannot thrive in an alkaline environment. Subsequently, a recipe for alkaline water is provided.
This article reached more than 121,000 estimated views, despite fact-checkers debunking the claim that alkaline water can be used as a treatment to kill cancer cells. The fact-check finds that there is no proof that drinking alkaline water has any health benefits.98 The theory has also been debunked by an associate professor of clinical medicine at the University of Southern California (USC) Norris Comprehensive Cancer Center of Keck Medicine, whose article explains why scientific evidence does not suggest that drinking alkaline water will treat or prevent cancer.99
In this section, we examine five health misinformation spreaders that are behind some of the most prominent websites and superspreader Facebook pages uncovered in this report.
We detail evidence that suggests de facto connections between many of these actors, including shared content, content collaboration, and content promotion, that likely has increased views of health misinformation across Facebook, especially during the COVID-19 pandemic. In addition, we touch on some of the ways these superspreaders monetise their content and highlight the various “reader beware” disclaimer language posted on their sites.
While 61% of the health misinformation spreading websites show no clear political affiliation, according to NewsGuard,100 more than one quarter are labelled as being from the far-right of the political spectrum, making it the largest political orientation represented amongst the health misinformation spreading websites.
Figure 3.1: Political orientation for the 82 health misinformation spreading websites analysed in our report.
RealFarmacy.com and its three connected superspreader Facebook pages received over half a billion estimated views in one year, making it a candidate for the largest health misinformation spreading network identified in this report101. Until recently, a comparable scale was reached only by the well-known misinformation website Natural News,102 which has now been banned from Facebook.103
RealFarmacy.com was registered in 2013 through a Panama-based privacy service.104 The website shares conspiracy theories and false claims regarding the COVID-19 pandemic.105 It also shares content promoting bogus cures and unsubstantiated health treatments, such as the example described in Section 2.5 with the headline: “Colloidal Silver: Erased From Textbooks Because It Treated Illnesses From Syphilis to Tuberculosis to Ebola.”106
While RealFarmacy.com mostly focuses on health and alternative lifestyle content, it’s important to note that its superspreader Facebook page also shares non-health related misinformation, such as a video posted in the midst of the George Floyd protests falsely claiming that “Bins of bricks and broken concrete were discovered strategically placed throughout Brooklyn and removed by police”.107 The claims about bricks being placed for protesters, insinuating the protests and riots were an organised conspiracy, have been debunked by multiple fact-checkers.108 This is not unusual and is part of a wider trend of politicisation within some of the health misinformation networks that we’ve identified in the past.109 NewsGuard says RealFarmacy.com is a website that has "repeatedly shared false health information and unfounded conspiracy theories, including false claims about the 2020 coronavirus outbreak", and that it "often presents false information and unverified claims about the efficacy of vaccines and some cancer treatments”.110
The site reposts or credits articles from some of the other health misinformation spreading websites this investigation has identified, such as Collective Evolution,111 and Children's Health Defense.112
In particular, the website features Dr Mercola, a well-known name within the health misinformation field (and looked at in more detail below). For example, it frequently shares articles previously penned or published on Dr Mercola's website, such as “Bioweapons Expert Speaks Out About Coronavirus”.113 Some articles by Dr Mercola on the RealFarmacy site include an advert which links to a Coronavirus Resource Page on his website.114
Image 1.2: Dr Mercola adverts appearing on articles credited to Dr Mercola on realfarmacy.com
The RealFarmacy website generates income through online advertising. The website also declares that it “may receive small amounts of compensation for specific products that are recommended and linked to in text. Products that have been suggested are those that our team have personally used and approved of”.115
The Legal Disclaimer on the website states: "We can not and do not give you medical advice. The information contained in this online site and emails is presented in summary form only and intended to provide broad consumer understanding and knowledge ... We do not recommend the self-management of health problems ...You should never disregard medical advice or delay in seeking it because of something you have read here”.116
The website of osteopathic physician Dr Joseph Mercola,117 together with three superspreader Facebook pages (out of his 11 official pages),118 is one of the most prolific health misinformation spreading networks this investigation has identified, reaching more than 147 million estimated views on Facebook in the past year.
Dr Mercola’s website has dedicated sections for food, fitness, pets, health and COVID-19, and "special info sites" for a number of topics, such as cancer,119 fluoride,120 mercury,121 vaccines,122 and vitamin D.123 On articles posted on his site, there's a fact-checked information box noting that "[a]ll Mercola articles are fact-checked, vetted and verified using Associated Press and Society of Professional Journalists journalism standards" (see screenshot below). Yet a number of claims made in articles and posts published by Dr Mercola have been debunked by fact-checkers, including articles with the headlines: “‘Vaxxed’ — How Vaccine Safety Is Undermined and Suppressed,”124 and “Five Alarming Health Problems Linked to 5G”.125
NewsGuard states that "Mercola.com has repeatedly promoted false or unsubstantiated claims on topics that include the Wuhan coronavirus outbreak and the debunked theory that vaccines can cause autism".126
Figure 3.2: Example of a fact-checked information box that appears under articles on Dr Mercola's website.
Mercola has also received four warnings by the U.S. Food and Drug Administration since 2005, according to NewsGuard, "for making unapproved claims about drugs sold in the site’s store".127 Mercola has reportedly contributed more than $2.9 million to one of the United States' most vocal anti-vaccination groups, the National Vaccine Information Center,128 in the last decade — accounting for around 40% of the organisation’s funding.129 NewsGuard states that "(t)he website … has published false claims about standard medical practices such as vaccinations.”130
In July 2019, Mercola stopped updating his English Facebook page due to privacy issues.131 He asked his 1.8 million followers to subscribe to his daily newsletters in order to receive "health news and information" directly from him, "without threats to [user's] security and privacy".132 Mercola's Facebook pages in additional languages continue to be active on the platform at the time of the publication of this report.
Joseph Mercola and Erin Elizabeth, whose site Health Nut News is also examined later in this section, are in a relationship, according to Elizabeth’s Facebook page.133 The couple collaborates at times, as seen in an article by Dr Mercola on Health Nut News featuring his interview with Shiva Ayyadurai, an anti-vaccination activist.134 In Elizabeth’s ‘editor’s note’ to the article, she says, “I turned Joe (Dr Mercola) onto Dr Shiva over a year ago. And I’m so glad he finally listened and interviewed him.”135 Dr Mercola’s articles are also shared on the National Vaccine Information Center's Facebook page, an organisation that he has sponsored to the tune of $2.9 million in the past decade.136
Dietary supplements, essential oils, food and beverage items and other household products are sold online on shop.mercola.com.137 The shop is available in English and Spanish. According to The Washington Post, Mercola "has amassed a fortune selling natural health products, court records show, including vitamin supplements, some of which he claims are alternatives to vaccines".138
The disclaimer at the bottom of his website states: "The information on this website is not intended to replace a one-on-one relationship with a qualified health care professional and is not intended as medical advice. It is intended as a sharing of knowledge and information from the research and experience of Dr Mercola and his community”.139
The Truth About Cancer (TTAC) website,140 and its main Facebook page,141 share content on cancer treatments, vaccination safety and conspiracies surrounding the COVID-19 pandemic. The website is coordinated by the married couple Ty and Charlene Bollinger.142 According to an introduction on the website,143 Ty is an "author, medical researcher, talk radio host, health freedom advocate, former competitive bodybuilder and also a certified public accountant," while Charlene is introduced as a "health freedom advocate, and co-founder and CEO of The Truth About Cancer”.144
Their website and Facebook page reached an estimated 40 million views on Facebook in the last year, and yet the Bollingers claim to be “fighting” censorship.145
In April 2020, as COVID-19 cases continued to rise globally146 and people around the world were seeking credible health information, The Truth About Cancer launched its nine-part documentary series on Facebook, called “The Truth About Vaccines”. It promises to disclose "the never-before-revealed FACTS about vaccine safety".147 The series includes interviews with several actors identified in this report and with vocal anti-vaccination activists, such as Erin Elizabeth of Health Nut News148 and Mike Adams149 of Natural News,150 now banned from Facebook.151 A one-minute-and-40-second trailer titled "Didn't we learn from Nazi Germany?",152 showing an interview with discredited doctor and "founder of the modern anti-vaccine movement"153 Andrew Wakefield, has garnered 97,000 views on Facebook. Another teaser, featuring controversial anti-vaccination activist Dr Judy Mikovits,154 of “Plandemic” fame,155 is titled "How Vaccines are a 'Sacrament' of Big Pharma.”156 The site also posts fear-mongering content, including one post that asks if doctors are “committing crimes against children when they vaccinate”.157
NewsGuard reports that: "The Truth About Cancer has repeatedly promoted ineffective, unproven, and potentially dangerous cancer treatments."158
The Truth About Cancer Facebook page used Sayer Ji, the face behind GreenMedInfo,159 another known health misinformation spreader (see Section 3.4), to promote “The Truth About Vaccines” documentary series.160 Likewise, GreenMedInfo has used Charlene Bollinger in a post on its Facebook page,161 promoting the 5G Summit it helped co-host.162
The Truth About Cancer website generates income through its online shop,163 where it sells books, videos, and other household and health products. This includes supplements, body cleanse kits, and EMF (Electro Magnetic Fields) protection gadgets for $550.164 “The Truth About Vaccines” series is also available for sale,165 ranging between $199 and $499. The Bollingers also sell monthly and yearly memberships that include access to Facebook live sessions and a Facebook group,166 newsletters and video content, and Q&A sessions with them.
The TTAC website provides a disclaimer at the bottom of its main page which states: “If you purchase anything through this website, you should assume that we have an affiliate relationship with the company providing the product or service that you purchase, and that we will be paid in some way. We recommend that you do your own independent research before purchasing anything.” It adds: "These statements have not been evaluated by the Food and Drug Administration. The information on this website is not intended to diagnose, treat, cure or prevent any disease."167
The GreenMedInfo website,168 and its main Facebook page,169 publish health-related content. The site contains a database with 50,000 studies covering 10,000 distinct topics, including vaccinations, and it has an advocacy goal to “support informed consent”.170
According to Sayer Ji's website, he is the founder of GreenMedInfo.171 He is also a member of the advisory board of the National Health Federation, a U.S.-based non-profit "health freedom" organisation that promotes alternative medicine.172 Ji was one of two hosts of the online 5G Summit in June that also gathered a number of anti-vaccination activists examined in this investigation, such as Robert F. Kennedy Jr./Children's Health Defense, Ty and Charlene Bollinger/The Truth About Cancer, and Joe Martino/Collective Evolution.173 The main purpose of the event was to "show how 5G wireless is an invasive technological platform that can damage our health and privacy".174 Posts promoting the summit were shared on GreenMedInfo’s Facebook page ahead of the event.175
While GreenMedInfo says it is “dedicated to providing evidence-based natural medical information,"176 many of the claims in its articles have been debunked by reputable fact-checkers; examples include the articles “5G - The Global Human Experiment Without Consent and Most Censored Topic of Our Time,”177 and “Think combined doses of vaccines have been tested? They haven't not once ever”.178
GreenMedInfo appears to have been subject to de-platforming,179 and has been removed from platforms such as Pinterest.180 Sayer Ji has complained about these “censorship” efforts.181 The GreenMedInfo daily newsletter serves 500,000 readers, according to the website.182 GreenMedInfo’s website and main Facebook page reached more than 39 million estimated views on Facebook in the last year.
NewsGuard describes GreenMedInfo as a "website that has promoted unproven ‘natural’ cures for illnesses including cancer and the COVID-19 virus, and has published false claims that vaccines can cause autism".183
The founder of GreenMedInfo, Sayer Ji, was one of two co-hosts of the 2020 5G Summit, which included speakers such as Robert F. Kennedy Jr and the Bollingers.184 The GreenMedInfo site also shares other types of content from health misinformation spreading websites identified in this report, including articles by Dr Mercola, for instance, “Bioweapons Expert Speaks Out About Novel Coronavirus”185 and a number of articles on vaccine safety published by Robert F. Kennedy Jr. from the Children's Health Defense.186
A package of audio and video files, eBooks and other material from the 5G Summit were available for purchase on the summit's website for $119.187
GreenMedInfo does not run adverts. A banner at the bottom of its main page states that the site is 100% supported through memberships and donations, which make it "able to serve 500,000 visitors a month with free access to our carefully curated research".188 Membership fees range from $8/month up to $89/month.189 Membership grants access to various online courses, articles, videos, special "focus research PDFs" with studies on turmeric and breast cancer, as well as vaccines and autism.190 Visitors can also subscribe to a daily newsletter and "receive Nature's Evidence-Based Pharmacy".191 According to their website, there are over 300,000 subscribers.192
The disclaimer on the bottom of the GreenMedInfo website states that: "This website is for information purposes only. By providing the information contained herein we are not diagnosing, treating, curing, mitigating, or preventing any type of disease or medical condition. Before beginning any type of natural, integrative or conventional treatment regimen, it is advisable to seek the advice of a licensed healthcare professional."193
The website Health Nut News,194 and its main superspreader Facebook page,195 is an alternative health and lifestyle site that shares content on health, food/recipes, environment and politics. The site also acts as a megaphone for the anti-vaccination movement, often sharing or reposting content from Dr Mercola196 and other misinformers listed in this report, such as GreenMedInfo197 and Children's Health Defense.198
Health Nut News is run by Erin Elizabeth, who according to her Facebook page, is "an author, TV journalist, public speaker, and activist".199 She has appeared in documentaries produced by The Truth About Cancer.200 She is also in a relationship with Dr Mercola, according to her Facebook page ‘About’ section.201
Claims made in content published and shared by Health Nut News have been debunked various times by reputable fact-checkers. The website includes articles such as: “Japan Leads the Way: No Vaccine Mandates and No MMR Vaccine = Healthier Children,”202 and “5G – The Global Human Experiment Without Consent.”203
The Health Nut News website and its main superspreader Facebook page reached more than 95 million (95,919,289) estimated views on Facebook in the past year.204 NewsGuard describes the site as one "dedicated to alternative medicine that originated a false conspiracy theory about the deaths of holistic doctors" that “has repeatedly published stories promoting the widely debunked theory that vaccinations can lead to autism".205
As previously mentioned, Elizabeth has participated in the documentary series about vaccination, produced by the Bollingers (The Truth About Cancer).206 She has promoted Charlene Bollinger as well as the documentary in various posts on Facebook.207 She regularly posts or shares content from Dr Mercola208 and other misinformers listed in this report, including GreenMedInfo209 and Children's Health Defense.210
Elizabeth sells various products on her website,211 from red light therapy devices to electromagnetic frequency "neutralizers". She also has a "personally curated list of products all available on Amazon" that link to her page on the shopping platform,212 which sells products such as colloidal silver for "powerful immune support".213
A common misconception is that there are no effective policy solutions that could significantly limit the spread of misinformation and disinformation. In fact, in this section we present a two-step solution to quarantine this infodemic, which research shows could reduce the belief in misinformation by almost 50% and cut its reach by up to 80%. We hope social networks will implement this solution in order to protect our societies.
Facebook is not a neutral platform. Every time a user logs in, its algorithm decides which content to show them on their News Feed and in which order the content is ranked among the thousands of recent posts from their friends, pages they already like and groups they have joined.214 Facebook’s algorithm also determines which pages and groups are suggested to users.215 This algorithm has a number of goals, most notably: maximising user engagement216 with content and increasing the time users spend on the platform.217
The problem is that this algorithm often gives an advantage to the emotive, divisive content that characterises health misinformation. The Wall Street Journal reported that during a 2018 presentation, Facebook’s own team warned senior executives that the company’s algorithms weren’t bringing people together but instead driving them apart: “Our algorithms exploit the human brain’s attraction to divisiveness,” read a slide. “If left unchecked,” it warned, Facebook would feed users “more and more divisive content in an effort to gain user attention and increase time on the platform”.218
Mark Zuckerberg has also expressed concern about “sensationalism and polarisation,”219 and Facebook has tried to address its misinformation problem through a number of counter false news policies.220 In particular, during the COVID-19 pandemic, Facebook has curated a COVID-19 Information Center,221 given out free advertisements to the WHO, started to alert users who engaged with harmful COVID-19 misinformation,222 and directed them to fact-checked corrections on the WHO site.223 These efforts are commendable and show that Facebook is starting to take this issue more seriously, but as this report shows, Facebook’s efforts are not yet enough to fight this COVID-19 infodemic, and even less so beyond COVID-19 related misinformation.
This report indicates that Facebook’s engagement-based metrics and algorithm are undermining its own efforts. Content from our sample of health misinformation spreading websites and pages reaching an estimated 3.8 billion views in just a year stands in strong contrast to the company’s recent promise to keep its users “safe and informed” on COVID-19.224
It is now time for Facebook to start implementing straightforward systemic solutions, ones that are proven to be effective in minimising the reach and impact of misinformation across social media platforms, namely:
Correct the Record: Providing ALL users who have seen misinformation with independently fact-checked corrections thereby decreasing the belief in the misinformation by an average of almost 50%; and
Detox the Algorithm: Downgrading misinformation posts and systematic misinformation actors in users' News Feeds, decreasing their future views by up to 80%.
Here’s how these solutions would work:
Correct the Record would require platforms to retroactively distribute corrections from independent fact-checkers to every single person exposed to false or misleading information. Facebook currently only adds labels on fact-checked content, but does not retroactively go back and provide corrections to the hundreds of millions of people who have seen the initial falsehood in their News Feeds.
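The mechanism described above, retroactively notifying everyone who was exposed to a post once it is fact-checked, can be sketched in a few lines. The class, identifiers and notification format below are illustrative assumptions for the sake of the sketch, not a description of Facebook's actual systems:

```python
from collections import defaultdict

class CorrectionDistributor:
    """Toy model of Correct the Record: log which users saw which posts,
    then fan a fact-checked correction out to everyone exposed before
    the label landed."""

    def __init__(self):
        # post_id -> set of user_ids who saw that post
        self._exposures = defaultdict(set)

    def record_view(self, user_id, post_id):
        self._exposures[post_id].add(user_id)

    def correct_the_record(self, post_id, correction_url):
        """Return the notifications owed once a fact-check arrives."""
        return [
            {"user": user, "post": post_id, "correction": correction_url}
            for user in sorted(self._exposures[post_id])
        ]

# Hypothetical usage: two users saw the false post before it was checked.
dist = CorrectionDistributor()
dist.record_view("alice", "post-1")
dist.record_view("bob", "post-1")
dist.record_view("alice", "post-2")

notices = dist.correct_the_record("post-1", "https://factcheck.example/claim-1")
print(len(notices))  # 2 — both users who saw post-1 receive the correction
```

The key design point is that the exposure log, not the share graph, drives distribution: labelling only future views of the post leaves everyone who already saw it uncorrected.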
An example of how the corrections tested would look is presented below:
Users who happen to see a misinformation post like this:
will see a notification like this in their newsfeed and via their notification centre, when the post is eventually fact-checked by an independent fact-checker:
Correct the Record would cut the belief in the misinformation in almost HALF
A new academic study, commissioned by Avaaz and conducted by Dr Ethan Porter of George Washington University and Dr Tom Wood of Ohio State University, found that providing social media users who have seen false or misleading information with corrections from fact-checkers can decrease belief in disinformation by an average of almost 50% and by as much as 61%.225
In April, coinciding with the release of Avaaz’s previous report on COVID-19 misinformation on Facebook, the platform announced that it would be sharing a general alert telling users who had interacted with harmful COVID-19 misinformation to visit the WHO’s myth busting website.226 But this step is limited to harmful COVID-19 misinformation, and the alert the platform is providing is not a correction (see image below). Avaaz commended Facebook for engaging with our team and taking this step, but we believe that it is far from what Correcting the Record would entail in terms of providing a clear fact-check and a well-tailored notification.
Image: Facebook’s current notification design for users who see harmful COVID-19 misinformation.
Moreover, academic researchers that Facebook has quoted to justify its position against providing clear corrections have spoken out against the platform for misinterpreting their findings.227 The overall research on the topic is clear that providing well-designed and sophisticated corrections will reduce belief in misinformation.
Centrally, policy makers and Silicon Valley executives must be aware that the vast majority of people want Facebook to Correct the Record for users who are targeted with misinformation.
Polls in Germany, France, Spain, and Italy showed that 87% are supportive of this idea, agreeing that “social media platforms like Facebook and Twitter should work with independent and trusted fact checking organisations to provide all users who have been exposed to wide-spread false or misleading content with verified corrections”.228 Among UK citizens, 84% are in favour of the idea, as are 68% of Americans.229
Detox the Algorithm means transparently adjusting the platforms’ algorithms to ensure that they effectively downgrade known disinformation and misinformation, as well as the pages, groups and websites that systematically spread them.
Providing a detailed process for Detox the Algorithm requires significantly more transparency from social media platforms on how their recommendation and amplification systems work. However, with the information currently available, there are clear steps that the platforms can take today to begin the detox process.
For example, Facebook already claims that it downgrades content labelled as false by its fact-checking partners, but as observed in the Bill Gates conspiracy case study (Section 1.1), not all of the reposts of that initial false story were labelled or downgraded. These reposts reached even more views (4.7 million) than the original article (3.7 million) — even though, as this research shows, detecting the majority of the clones of that post is not a difficult process. Facebook needs to drastically escalate its efforts at identifying clones of known misinformation and apply its policies across the platform.
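The point that clones of a fact-checked article are not hard to detect can be illustrated with a standard near-duplicate technique: comparing word-shingle overlap between a known-false article and candidate reposts. This is an illustrative sketch (the function names, sample texts and the 0.5 similarity threshold are assumptions), not a description of Facebook's actual matching systems:

```python
def shingles(text, k=3):
    """Lowercased k-word shingles: a crude fingerprint of an article's text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (0.0 = disjoint, 1.0 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def is_clone(original, candidate, threshold=0.5):
    """Flag a candidate article as a likely full or partial repost."""
    return jaccard(shingles(original), shingles(candidate)) >= threshold

# Hypothetical examples: a repost with a new framing sentence still
# shares most of its shingles with the fact-checked original.
fact_checked = "bill gates wants to reduce the world population through vaccines"
repost = "BREAKING: bill gates wants to reduce the world population through vaccines say insiders"
unrelated = "drinking water is good for your health say doctors everywhere"

print(is_clone(fact_checked, repost))     # True
print(is_clone(fact_checked, unrelated))  # False
```

Production systems would use scalable variants of the same idea (for example, MinHash sketches over shingles), but even this naive version catches verbatim and lightly edited reposts, which is the category this report found slipping past Facebook's labels.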
Lastly, it is pivotal for Facebook to act against systematic misinformers, urgently working with fact-checkers and civil society to define clear policies and transparency tools that allow for the quick identification of those responsible for spreading health misinformation. This effort must lead to a robust policy of downgrading the reach of systematic misinformation spreaders, while also giving those who have been flagged for sharing misinformation a clear and quick way to appeal. This step will ensure that systematic misinformation actors do not continue to spread false and misleading content at a scale that cannot be fact-checked, weaponising the platform’s amplification algorithm and breaking through Facebook’s current moderation policies.
Detox the Algorithm (DTA) can be implemented in four steps:
Detect and downgrade known misinformation content: First, social media platforms must ensure that all content that has been identified as misinformation by independent fact-checkers is immediately downgraded to the bottom of the News Feed.
Detect and downgrade systematic misinformation actors: Secondly, platforms must work with independent fact-checkers to detect and downgrade pages, groups and websites that repeatedly and systematically spread misinformation or that are found to have repeatedly violated Facebook’s community standards in an effort to spread misinformation, deceive users, or manipulate the algorithm.231
In emergency situations, such as a pandemic or an election, Facebook should have focused teams working with civil society to monitor, detect and downgrade systematic misinformation actors, with defined thresholds for when a page/group becomes a serial misinformer.
Right to appeal: Downgrading misinformation actors is a serious penalty, and should be imposed only after warnings are issued to these pages, groups and websites. These actors must be given an opportunity to adjust their behavior or to exercise their right of appeal if they disagree with a downgrading decision.
Demonetize systematic misinformers: When an actor has been found to be systematically spreading misinformation, Facebook must ban these actors from advertising and monetising on the platform. Demonetisation is a serious penalty as well and should be imposed only after warnings are issued to these pages, groups and websites.
Inform users and keep them safe: Users viewing or wanting to interact with such pages, groups or websites should be informed, through clear labels or other means, that these actors were found to be repeatedly and systematically spreading misinformation, and should be provided with access to more information.
Image 5.2: Estimated views reduction of systematic misinformation actors through implementing Detox the Algorithm.
Facebook must also facilitate independent audits of the algorithm governing its content delivery and monetisation systems. Comprehensive frameworks for these audits must be developed in consultation with civil society to the extent possible.
Facebook has stated that it is already reducing the reach of pages, websites and groups in the News Feed232 when they repeatedly share content that has been rated false by independent fact-checkers.233
Avaaz calls on Facebook to cooperate in conducting a transparent assessment of whether the misinformation sharing pages identified in this report have hit the threshold of requiring a partial or full downgrade in the News Feed based on a number of indicators such as the scale and severity of misinformation shared by these pages.
What this report highlights is that Facebook’s policy to reduce the reach of actors repeatedly sharing content that has been rated false by independent fact-checkers does not seem to have yet been applied effectively to the health misinformation pages and websites identified in this report; only Facebook, however, can make this detailed assessment. We urge policy makers to demand that the platform be transparent about the process and its findings on this front.
The platform also states that after a story is labeled as false, Facebook ranks the stories significantly lower in the News Feed, which, on average, cuts future views by more than 80%.234 If this were applied transparently to all misinformation content, as well as to systematic misinformation actors, it could significantly reduce the negative impact of misinformation on the platform.
As Facebook moves to apply this solution to misinformation content at scale, the platform’s algorithm should learn to identify misinformation more quickly through patterns. In the long term, this will ensure that Facebook adopts (given the political will of its executives) an algorithm design that does not, at its core, automatically amplify misinformation content.
Under Detox the Algorithm, pages or groups spreading misinformation would not be deleted, nor would websites be banned from Facebook. Instead, their content would simply not be promoted or amplified by Facebook’s algorithm. Over time, this policy will help ensure that misinformation is less prominently promoted, as the algorithm learns to identify, and stop massively amplifying, content that is characterised as misinformation, thus marginalising serial misinformation actors instead of helping them grow their follower base.
Detox the Algorithm preserves freedom of speech rights and could also safeguard against the infringement of freedom of thought and opinion rights. Indeed, the manner in which the algorithm amplifies misinformation, unknowingly toxifying and polarising a user’s information ecosystem, may significantly interfere with users’ ability to form independent opinions.
In the implementation of Detox the Algorithm, Pages and groups found to be serial misinformers would be given an opportunity to issue corrections to their users or challenge the detox decision if they disagree with it. Facebook should also make sure that pages or groups are notified after the first and second strike, giving them time to change their behaviour or contest the strike.
As for transparency for users, on the next page we take a closer look at how user alerts could work in practice (this is just a suggestion). Any design that ensures users know they are interacting with a systematic misinformation actor, if done well, will help prevent users from being drawn into groups and pages that can become rabbit holes for misinformation content.
Facebook labels false content after it has been fact-checked,235 unless it's a post or an ad from a politician or elected official.236
Coinciding with the release of Avaaz's last report, Facebook started to alert users who have engaged with harmful COVID-19 misinformation, sending them to a page with general fact-checks from the WHO.237 This alert is applied only to harmful COVID-19 misinformation.
Here is an image of Facebook’s current alert to users who had engaged with harmful COVID-19 misinformation:
And yet, Facebook does not take the pivotal step of informing the millions of users who have actually seen or interacted with false content that they’ve been misled. Essentially, Facebook only shows fact-checks to a small sliver of its users who interact with misinformation on its platform.
Facebook must urgently take this further and:
(a) issue specific corrections for all fact-checked misinformation, (b) retroactively to all users who have viewed or interacted with the misinformation, (c) for all issues, not just harmful COVID-19 misinformation, (d) and for all clones and republications of the misinformation content in question.
Here is the correction design Avaaz tested that decreases the number of users who believe disinformation by, on average, almost 50%. We believe with further testing, Facebook could further improve that percentage.
Facebook has stated that they have put misinformation warning labels on about 50 million pieces of content related to COVID-19 during April 2020 based on 7,500 articles by its independent fact-checking partners.238 Previous research conducted by Avaaz shows it can take days and sometimes weeks before a piece of misinformation is fact-checked and registered in Facebook’s system. During this time period, the misinformation content could reach millions of users before any warning labels are applied. Consequently, the vast majority of the tens of millions of users who had actually viewed or interacted with the 50 million pieces of false COVID-19 content Facebook claims to have labeled will not be informed that they’ve been misled, nor be sent a specific correction.
But, as the estimated 3.8 billion views on content from health misinformation spreading networks show, Facebook is not:
downgrading this content sufficiently and at scale, with its misinformation policies focused mainly on content rather than on the repeat sources of such content, or on its clones across the platform;
acting transparently, with clearly defined thresholds that allow for the identification of systematic misinformers; and
applying a robust policy of downgrading these actors’ reach, while giving them clear and transparent ways to appeal the decision.
On July 15, Facebook said that they have connected over 2 billion people to resources from health authorities through their COVID-19 Information Centre and pop-ups, with over 600 million people clicking through to learn more.241 As laudable as that effort is, its impact may be limited if Facebook continues to allow its own algorithm to spread the content of known health misinformation spreading websites at a scale that outpaces the reach of authoritative health websites by a factor of almost four, reaching billions of estimated views in a year based on the extremely limited sample study this report highlights.
Facebook must be more transparent by not only sharing the amount of people it has directed to its COVID-19 Information Centre, but also allowing an independent audit that measures how many of its users around the world were directed to health misinformation by its algorithm.
Without Facebook Detoxing its Algorithm and Correcting the Record, efforts such as the COVID-19 Bulletin Board will only be a bandage on the gushing injuries misinformation is inflicting on our communities.
The necessity of having fact-based discussions on health issues while also defending freedom of expression
For years, Avaaz has campaigned to ensure that the international community acts effectively to provide equal access to healthcare and science-based solutions to people across the world.
Through this work, Avaaz is cognisant of the constantly evolving science around certain medical practices, and the research models used by the scientific community to assess the level of confidence associated with different cures and medical procedures.
It is without question that a central pillar in the improvement of medicine and healthcare policies is the freedom to question established expert beliefs and opinions, especially those not founded on hard science and evidence.
Science is an ever-evolving process, and established doctors and medical experts have sometimes made dangerous mistakes. As the New York Times reported in 2019, “nearly 400 routine practices were flatly contradicted by studies published in leading journals”.242
Consequently, creating the space for questioning, deliberation and debates that may challenge expert opinion or status quo practice is necessary, but such discourse should be based on facts, experience, rigorous scientific research and evidence.
Avaaz draws a clear distinction between health misinformation, which is false and misleading content that can cause public harm, and evidence-based medical opinions and content, including those outside the allopathic model that seek to improve medical practice and health care by challenging current expert opinions.
Hence, we believe that policies to fight health misinformation and downgrade health misinformation content must be designed to create a healthy and robust space to foster this dialogue, while preventing malicious actors from saturating normal citizens’ information ecosystems through the amplification of unsubstantiated hoaxes and conspiracy theories.
Currently, it is not only the voices of authoritative health institutions like the CDC and WHO, and local nurses and doctors, that are being squashed by the tsunami of misinformation. Rather, credible alternative health practitioners and well-meaning vanguard scientists are being eclipsed by malicious actors abusing social media platforms to spread false information.
This is why we believe that when there is an ongoing science-based debate on certain health procedures or medicines, pages sharing different sides of that view should not be labeled as false, potentially shutting down debate, but rather have a more nuanced label such as “Unsupported” or “Lacks Scientific Evidence”.
One of the key objectives of this report is to allow for fact-based deliberation, discussion and debate to flourish in an information ecosystem that is healthy and fair and that allows both citizens and policymakers to make decisions based on the best available data. This is also why our solutions do not call for the removal of content, but rather for corrections that provide users with facts, and for measures to downrank content and actors that have been found to systematically spread misinformation.
We see a clear boundary between freedom of speech and freedom of reach. The curation and recommendation model currently adopted by most social media platforms is designed to maximise human attention, not the fair and equal debate which is essential for humanity to rise to the great challenges of our time.
Finally, if we made any errors in this report — please let us know immediately and share your feedback. We are committed to the highest standards of accuracy in our work. For more details, read the Commitment to Accuracy section on our website here.
In this section we describe the three step methodology we used to uncover the global health misinformation spreading networks on Facebook and estimate the 3.8 billion views that they generated. In particular:
Step 1: We identified 82 health misinformation spreading websites.
Step 2: We identified 42 superspreader Facebook pages that most contributed to the networks’ views.
Step 3: We estimated the total number of views for the global health misinformation networks (i.e. the 82 websites + the 42 superspreader Facebook pages).
Avaaz identified a sample of 82 websites spreading health misinformation from a database of 5,080 website credibility reviews,243 which, according to NewsGuard,244 accounts for 95% of online engagement with news in the United States, the United Kingdom, Italy, Germany and France. These 82 websites were selected by Avaaz (from the 5,080-website database curated by NewsGuard) based on the following criteria:
Rated as ‘Red’ by NewsGuard, meaning that the site “fails to meet basic standards of credibility and transparency.”
Failed to meet NewsGuard’s two main criteria for evaluating journalistic credibility:
“Does not repeatedly publish false content” and
“Gathers and presents information responsibly”.
Included the following keywords in the NewsGuard database ‘topic’ category: ‘Health’ and/or ‘COVID’ and/or ‘Medical’.245
Its NewsGuard label mentioned specific health misinformation claims or articles shared by the website as per our definition of “health misinformation” as defined in box 1 below.
Published at least one example of fact-checked health misinformation content, which reached at least 30,000 estimated views on Facebook since Jan. 1, 2019.246
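The selection logic in Step 1 can be sketched as a simple filter. This is only an illustration: the field names (`rating`, `failed_criteria`, `topics`, and so on) are hypothetical stand-ins for the NewsGuard review data, whose actual schema is not reproduced in this report.

```python
# Minimal sketch of the Step 1 website-selection criteria, using
# illustrative (not real) field names for a NewsGuard review record.

SELECTION_CRITERIA_FAILED = {
    "Does not repeatedly publish false content",
    "Gathers and presents information responsibly",
}
HEALTH_TOPICS = {"Health", "COVID", "Medical"}

def is_candidate(site: dict) -> bool:
    """Apply the report's five selection criteria to one review record."""
    return (
        site["rating"] == "Red"                                        # criterion 1
        and SELECTION_CRITERIA_FAILED <= set(site["failed_criteria"])  # criterion 2
        and bool(HEALTH_TOPICS & set(site["topics"]))                  # criterion 3
        and site["label_mentions_health_misinfo"]                      # criterion 4
        and site["max_article_estimated_views"] >= 30_000              # criterion 5
    )

# Toy record that satisfies all five criteria
example = {
    "rating": "Red",
    "failed_criteria": [
        "Does not repeatedly publish false content",
        "Gathers and presents information responsibly",
    ],
    "topics": ["Health"],
    "label_mentions_health_misinfo": True,
    "max_article_estimated_views": 45_000,
}
print(is_candidate(example))  # True
```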
Using CrowdTangle, a public insights tool owned and operated by Facebook, Avaaz identified the Facebook pages and groups that had gathered at least 100,000 interactions on posts linking to the 82 health misinformation spreading websites between May 28, 2019 and May 27, 2020.
We removed all pages and groups for which we could not document three or more posts containing or linking to health misinformation as defined in Box 1 below.
This resulted in 42 Facebook pages, which we are calling ‘superspreaders’ because of the significant scale at which they are sharing and amplifying content from the health misinformation spreading websites sampled in this report.
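The two thresholds described in Step 2 (at least 100,000 interactions on posts linking to the 82 sites, and at least three documented misinformation posts) can likewise be sketched as a filter. The page records below are made up; real data would come from a CrowdTangle export.

```python
# Sketch of the Step 2 'superspreader' page selection; the page
# records are illustrative, not real CrowdTangle data.

INTERACTION_THRESHOLD = 100_000   # interactions on posts linking to the 82 sites
MIN_DOCUMENTED_POSTS = 3          # documented health misinformation posts

def is_superspreader(page: dict) -> bool:
    return (
        page["interactions_on_network_links"] >= INTERACTION_THRESHOLD
        and page["documented_misinfo_posts"] >= MIN_DOCUMENTED_POSTS
    )

pages = [
    {"name": "Page A", "interactions_on_network_links": 250_000, "documented_misinfo_posts": 5},
    {"name": "Page B", "interactions_on_network_links": 120_000, "documented_misinfo_posts": 2},
    {"name": "Page C", "interactions_on_network_links": 90_000,  "documented_misinfo_posts": 7},
]
superspreaders = [p["name"] for p in pages if is_superspreader(p)]
print(superspreaders)  # ['Page A']
```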
We calculated the estimated views on Facebook of all the content published on both the 82 health misinformation spreading websites and the ‘superspreader’ Facebook pages identified in Steps 1 and 2.247
We did this by:
1: Recording interactions with the 82 websites:
Using the social media monitoring tool Buzzsumo, we recorded all interactions on Facebook for all articles published by the 82 health misinformation spreading websites between May 28, 2019 and May 27, 2020.
Cumulatively, this amounted to 91,019,790 interactions.
2: Recording additional interactions for the ‘superspreader’ Facebook pages:
Using CrowdTangle, we were able to compute all interactions on all content by the 42 ‘superspreader’ Facebook pages between May 28, 2019 and May 27, 2020 for a total of 65,822,067 interactions.
From this total, to avoid double counting interactions, we removed all interactions these pages had gathered on links going to our 82 health misinformation spreading websites,248 which equalled 26,807,896 interactions.
After subtracting these, the total additional interactions (which includes both content uploaded directly on these Facebook pages e.g. text, pictures, videos, and links going to websites not included in our 82-site sample) from ‘superspreader’ Facebook pages came to 39,014,171 interactions.
3: Calculating the views / interactions ratio:
Facebook discloses the number of views for videos,249 but for posts containing an external domain weblink, the platform displays only the number of interactions (which are shares, reactions and comments).
Therefore, in order to estimate the number of views of the 82 health misinformation spreading websites, we designed a metric based on the publicly available statistics of the top 1000 Facebook pages sharing those 82 sites. We took into account the total number of views (11,302,720,000) for all videos uploaded or shared directly by the top 1000 Facebook pages spreading health misinformation links between May 28, 2019 and May 27, 2020 and then divided it by the total number of interactions (380,600,000) for the same set of videos, which gives us a views/interaction ratio of 29.70 (11,302,720,000/380,600,000).250
4: Total views calculation:251
All the links to the 82 websites shared on Facebook pages, profiles and groups between May 28, 2019 and May 27, 2020 received a total of 91,019,790 interactions on Facebook. The ‘superspreader’ Facebook pages generated an additional 39,014,171 interactions. Multiplying the sum of those interactions (130,033,961) by our views/interaction ratio, we obtain the final estimate of 3,861,632,821 views.
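The arithmetic in steps 2 through 4 can be reproduced directly. All figures below are taken from the text; only the calculation itself is illustrated (note the unrounded ratio is used for the final multiplication, which is why 29.70 × 130,033,961 does not reproduce the estimate exactly).

```python
# Reproducing the views estimate for the 82 websites + 42 superspreader
# pages, using only the figures quoted in the methodology.

# Views/interaction ratio from videos on the top 1,000 pages
video_views = 11_302_720_000
video_interactions = 380_600_000
ratio = video_views / video_interactions          # ~29.70

# Superspreader page interactions, net of double-counted links
superspreader_total = 65_822_067
double_counted_links = 26_807_896
superspreader_additional = superspreader_total - double_counted_links  # 39,014,171

# Total estimated views
website_interactions = 91_019_790
total_views = (website_interactions + superspreader_additional) * ratio
print(round(ratio, 2), int(total_views))  # ~29.7 and ~3.86 billion
```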
5: Total views calculation for 10 leading health institutions:
A similar three-step approach was followed to estimate viewership for the 10 leading health institutions in Section 1, Table 1.3:
Using the social media monitoring tool Buzzsumo, we recorded all interactions on Facebook for all articles published by the health institutions websites between May 28, 2019 and May 27, 2020.
We took into account the total number of views (2,619,840,000) for all videos uploaded or shared directly by the top 1000 Facebook pages spreading health institution links between May 28, 2019 and May 27, 2020 and then divided it by the total number of interactions (83,750,000) for the same set of videos, which gives us a views/interaction ratio of 31.28 (2,619,840,000/83,750,000).
Total views calculation: All the links to the 10 health institution websites shared on Facebook pages, profiles and groups between May 28, 2019 and May 27, 2020 received a total of 12,868,742 interactions on Facebook. Multiplying those interactions by our views/interaction ratio of 31.28, we obtain the final estimate of 402,555,762 views.
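As a check, the health-institution estimate is just the quoted figures multiplied through with the same method:

```python
# Reproducing the views estimate for the 10 health institutions,
# using only the figures quoted in the methodology.

inst_video_views = 2_619_840_000
inst_video_interactions = 83_750_000
inst_ratio = inst_video_views / inst_video_interactions   # ~31.28

inst_link_interactions = 12_868_742
inst_total_views = inst_link_interactions * inst_ratio
print(round(inst_ratio, 2), int(inst_total_views))  # ~31.28 and ~402.6 million
```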
‘Health misinformation’ definition - For the analysis in this report we consider only verifiably false or misleading information that has the potential to cause public harm, such as undermining democracy or public health. For this report, our investigative team documented 174 verifiably false or misleading posts, articles and videos shared on the health misinformation spreading pages sampled in this report, and that met the following criteria:
Were fact-checked by Facebook’s third-party fact-checking partners or other reputable fact-checking organisations, such as Politifact, HealthFeedback, Snopes and Reuters. To confirm the accuracy of all fact checks, Avaaz hired Health Feedback, a member of the WHO-led project Vaccine Safety Net (VSN), to review all samples chosen for this study.252
Could cause public harm by undermining public health in the areas of:
Preventing disease: e.g. false information on diseases, epidemics and pandemics, and anti-vaccination misinformation.
Prolonging life and promoting health: e.g. bogus cures and/or encouragement to discontinue recognised medical treatments.
Creating distrust in health institutions, health organisations, medical practices, and their recommendations: e.g. false information implying that clinicians or governments are creating or hiding health risks.
Fear mongering on health-related issues: health-related misinformation which can induce fear and panic, e.g. misinformation stating that the coronavirus is lab-created or a man-made bio-weapon.
Health misinformation with the potential of inducing discrimination against minorities: e.g. misinformation that migrants are spreading the virus.
NewsGuard - NewsGuard is an organisation that employs journalists and editors to rate news and information websites based on nine journalistic criteria.253 NewsGuard ratings are used by the tech industry,254 over 700 public libraries,255 and more recently hospitals256 to warn users about untrustworthy information and news sources. It has also already been used in several academic studies from top U.S. universities.257