As millions of people throughout the world hit the streets to protest police brutality and racism, disinformation seeking to polarize the debate, or to demonise and undermine the protests, continues to spread on social media platforms. A new analysis from Avaaz’s investigative team estimates that a small sample of posts connected to a dozen viral disinformation narratives about the anti-racism protests have been viewed millions of times on Facebook during the last two weeks.
Furthermore, the team was able to identify a network of nine Facebook pages amplifying some of these narratives in a coordinated manner to potentially monetise the virality of outrage.
To understand the scale of disinformation targeting protesters on the platform, Avaaz analysed a dozen of the most egregious false and misleading narratives about the protests, and their associated posts, between May 27, 2020 (the start of the U.S. demonstrations) and June 9, 2020. We found that:
These 12 narratives attained at least 26 million estimated views combined;
The disinformation promoted by these narratives includes:
Black Lives Matter activists are threatening to assassinate white families;
The anti-racism protests are using paid protesters who have been hired through the website ProtestJobs.com;
“Antifa” groups or government officials placed pallets of bricks at protest sites in US cities to stoke violence;
George Floyd is alive.
A coordinated network of 9 Facebook pages with an audience of over 1.5 million followers promoted three of the disinformation narratives we analysed: that Soros orchestrated the protests; that “Antifa” placed pallets of bricks at protest sites to stoke violence; and that “Black Lives Matter” had announced its intention to “assassinate white families.” This network consistently included links marketing a dietary supplement in its posts, in a possible attempt to monetise the spread of anti-protest and other disinformation narratives.
At the time of our investigation, many of the posts we reviewed did not carry a fact-checking label, even though the false or misleading narratives the posts were based on had previously been debunked by independent fact-checkers. Examples include this post about bricks placed in Dallas for rioters to throw, this post about Black Lives Matter activists attacking white families, and this post about Soros financing the anti-racism protests.
Below are screenshots of the most egregious narratives examined in this study:
[Screenshot 1: 7,584,905 estimated views]
[Screenshot 2: 539,661 estimated views]
[Screenshot 3: 78,131 estimated views]
A full list of the twelve disinformation narratives we detected and analysed, and their estimated views, can be found in the table below.
These disinformation narratives are spread through a mixture of memes, videos, doctored images, links to fake-news websites, and text posts: a mish-mash of tactics that, in some cases, appears designed to circumvent Facebook’s counter-disinformation tools.
Avaaz conducted this snapshot analysis as a case study on Facebook’s preparedness to stem the spread of false and misleading content ahead of the 2020 US elections. Facebook’s failure to keep the disinformation narratives above from going viral and being viewed millions of times, as well as the company’s failure to provide retroactive fact-checks to users who have engaged with false or misleading information, offers a strong warning about how disinformation narratives amplified on the platform may be used to inflame divisions and shape voters’ perceptions ahead of the elections.
Although Facebook adopted more robust policies after Russian and other election interference efforts during the 2016 vote, the platform’s policies are not keeping up with the scale and sophistication of this problem. For example, Facebook still does not alert the users who it detects have engaged with disinformation that the content they interacted with has been flagged and/or removed from the platform, nor does it act to ensure they see fact-checks, despite ample scientific evidence showing that doing so is effective in fighting belief in disinformation.
Importantly, the narratives identified here do not cover the broader range of disinformation narratives that have not yet been fact-checked by independent fact-checkers, such as highly local disinformation content focused on specific small communities or protests. Our findings are only the tip of the iceberg.
Last Friday, Mark Zuckerberg stated that: “I believe our platforms can play a positive role in helping to heal the divisions in our society, and I'm committed to making sure our work pulls in this direction.” With increasing demands from Facebook’s employees for better policy solutions to these difficult questions, and with divisions becoming more severe in some cases specifically due to the information ecosystem on Facebook, we urge Mr. Zuckerberg to act fast to ensure that the company’s anti-disinformation efforts are quickly scaled to deal with this problem.
Avaaz has shared these findings with Facebook and recommended that the company share specific data on the full reach and views of the flagged false or misleading content, and whether the platform’s fact-checks have been effective in reaching the users who engaged with this troubling activity.
| Narratives | Fact Check(s) | Estimated Views |
| --- | --- | --- |
| Anti-racism protests are relying on paid protesters from website ProtestJobs.com |  |  |
| “Antifa” groups or government officials placed pallets of bricks at protest sites to stoke violence |  |  |
| Derek Chauvin committed suicide in his prison cell |  |  |
| George Floyd is still alive |  |  |
| A man threw a gas bomb into a horse trailer during a Black Lives Matter protest | Truth or Fiction |  |
| George Soros is paying people to protest / orchestrating protests |  |  |
| “Antifa” threatened to “take what’s ours” from white residential areas |  |  |
| George Floyd’s death was staged |  |  |
| Protesters broke into the White House |  |  |
| Picture shows Derek Chauvin wearing a “Make Whites Great Again” hat |  |  |
| Black Lives Matter activists threatened in social media posts to assassinate white families |  |  |
| Protesters defaced the Vietnam Veterans Memorial in Washington D.C. |  |  |
As we analysed the viewership of the 12 protest-related disinformation narratives, we identified a coordinated network of 9 Facebook pages with an audience of over 1.5 million followers. We are calling this the “I Am A Texan” Network, because the ‘I Am A Texan’ page is the oldest page with the most page likes in the network. In addition to apparently sharing administrators and coordinating content posting, this network also appears to have tried to monetise its referral traffic by driving viewers to Resurge.com, a website promoting a dietary supplement.
Between May 26, 2020 and June 8, 2020, this network promoted three of the disinformation narratives we analysed: that George Soros orchestrated the protests; that “Antifa” placed pallets of bricks at protest sites to stoke violence; and that “Black Lives Matter” had announced its intention to “assassinate white families.” The network regularly posts identical content to its pages within seconds of each other.
The 9 pages in the “I Am A Texan” Network that we identified are shown below.
| Page Name | URL | Created | Admin Locations | Page Likes | Archive Link |
| --- | --- | --- | --- | --- | --- |
| Don't Tread on Me | https://www.facebook.com/letfreedomringyall/ | September 30, 2012 | United States (2), Germany (1) | 518,654 | http://archive.ph/ClPwy |
| I Am A Texan | https://www.facebook.com/BeautifulTexas/ | June 6, 2012 | United States (3), Germany (1) | 520,929 | http://archive.ph/4bchP |
| Cool Conservative | https://www.facebook.com/coolconservatives | December 6, 2018 | United States (3), Germany (1) | 17,788 | http://archive.ph/G8Exz |
| The Second Amendment | https://www.facebook.com/The-2nd-Amendment-452832981470891/ | May 3, 2013 | United States (3), Germany (1) | 156,066 | http://archive.ph/mRN41 |
| Start Draining America | https://www.facebook.com/startdrainingnow/ | May 26, 2013 | United States (3), Germany (1) | 170,528 | http://archive.ph/CltYH |
| Washington News | https://www.facebook.com/Washington-News-1583110238580443/ | April 4, 2015 | United States (2), Germany (1) | 31,161 | http://archive.ph/OPEDV |
| I Am Texan | https://www.facebook.com/iamtexan1/ | July 16, 2015 | United States (2), Germany (1) | 69,332 | http://archive.ph/Jr6jJ |
| I Love America | https://www.facebook.com/reclaimamericaforliberty | September 19, 2013 | United States (7), Germany (1) | 44,941 | http://archive.ph/8Csn0 |
| Texas is Amazing | https://www.facebook.com/amazingtx/ | November 16, 2014 | United States (3), Germany (1) | 23,806 | http://archive.ph/DiQz6 |
A number of factors indicate that these pages are maintained by the same administrators. As the table above shows, the pages appear to share a set of administrators based in the US and Germany: all of the pages have 1 admin in Germany, plus 2 to 3 in the US. The one exception is “I Love America,” which has 7 US-based admins and 1 in Germany.
Further, the same items are frequently posted to several of the network’s pages within very tight time bounds, with identical post text and an identical Buff.ly shortlink.
For example, on 8 of the pages in the “I Am A Texan” Network, Avaaz observed that the story of Black Lives Matter announcing its intention to “assassinate white families” was posted to each page within 11 seconds, with identical text and links. It appears that, in the last 24 hours, all but one of these posts have been taken down. The rest of the network appears to still be live.
This table shows the 8 posts in the network containing the false claim that Black Lives Matter activists intend to “assassinate white families”:
| Page Name | Created | Original URL (some posts have now been removed by Facebook) | Message |
| --- | --- | --- | --- |
| I Love America | 2020-05-31 21:18:32 EDT | https://www.facebook.com/reclaimamericaforliberty/posts/2939627482782088 | This was just posted in a Houston area Facebook Group. Watch this video right away or you'll hate yourself later https://buff.ly/2AoLTjO:=:https://resurge.com/welcome/?hop=iamatexan&s=dQxEaf40yI9SijRm6VAY%26&atid= |
| I Am Texan | 2020-05-31 21:18:31 EDT | https://www.facebook.com/iamtexan1/posts/2753113428250296 | This was just posted in a Houston area Facebook Group. Watch this video right away or you'll hate yourself later https://buff.ly/2AoLTjO:=:https://resurge.com/welcome/?hop=iamatexan&s=dQxEaf40yI9SijRm6VAY%26&atid= |
| Texas Is Amazing | 2020-05-31 21:18:31 EDT | https://www.facebook.com/amazingtx/posts/2746325695599335 | This was just posted in a Houston area Facebook Group. Watch this video right away or you'll hate yourself later https://buff.ly/2AoLTjO:=:https://resurge.com/welcome/?hop=iamatexan&s=dQxEaf40yI9SijRm6VAY%26&atid= |
| The 2nd Amendment | 2020-05-31 21:18:29 EDT | https://www.facebook.com/452832981470891/posts/3097927866961376 |  |
| Don't Tread On Me | 2020-05-31 21:18:27 EDT | https://www.facebook.com/letfreedomringyall/posts/2780724178717656 |  |
| Washington News | 2020-05-31 21:18:27 EDT | https://www.facebook.com/1583110238580443/posts/2721936764697779 |  |
| Start Draining America | 2020-05-31 21:18:25 EDT | https://www.facebook.com/startdrainingnow/posts/3050045618382036 |  |
| Cool Conservatives | 2020-05-31 21:18:21 EDT | https://www.facebook.com/coolconservatives/posts/570320157250365 |  |
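The tightness of the coordination window can be checked directly from the post timestamps in the table above; a minimal sketch in Python:

```python
from datetime import datetime

# Post timestamps taken from the table above (all 2020-05-31, EDT).
timestamps = [
    "2020-05-31 21:18:32",  # I Love America
    "2020-05-31 21:18:31",  # I Am Texan
    "2020-05-31 21:18:31",  # Texas Is Amazing
    "2020-05-31 21:18:29",  # The 2nd Amendment
    "2020-05-31 21:18:27",  # Don't Tread On Me
    "2020-05-31 21:18:27",  # Washington News
    "2020-05-31 21:18:25",  # Start Draining America
    "2020-05-31 21:18:21",  # Cool Conservatives
]
times = [datetime.strptime(t, "%Y-%m-%d %H:%M:%S") for t in timestamps]

# Spread between the first and last post across the 8 pages.
spread = (max(times) - min(times)).total_seconds()
print(f"{spread:.0f} seconds")  # 11 seconds
```

An 11-second spread across 8 independently administered pages would be effectively impossible to achieve by hand, which is why it points to shared tooling or administration.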
We also observed notable monetisation behaviour across all the network posts we reviewed, including the Black Lives Matter examples set out above. The network’s posts consistently include shortlinks that direct to Resurge.com, a site that sells a dietary supplement via an affiliate marketing service provided to sellers by a company called ClickBank.
Each of the posts we examined includes the text, “Watch this video right away or you'll hate yourself later,” followed by a shortlink that contains an identical tracking code, “hop=iamatexan”. This tracking code tells ClickBank who should be credited with the commission if a sale of a Resurge product is made through the referral link; so-called “affiliate” marketers of Resurge’s products earn a commission for every product sold. The tracking code matches the name of the “I Am A Texan” page, the network’s oldest and most popular page. It appears that one or more people in the “I Am A Texan” network have registered as ClickBank “affiliates” for the purpose of marketing Resurge.com products. According to the Resurge.com website: “CLICKBANK® is a registered trademark of Click Sales Inc.” The fact that each node of the network markets Resurge using an identical ClickBank shortlink points strongly to the nodes’ connection to each other.
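For illustration, the affiliate tracking code can be read straight out of the referral URL quoted in the posts above; a minimal sketch using Python's standard URL-parsing tools:

```python
from urllib.parse import urlparse, parse_qs

# Referral URL as it appears in the network's posts.
url = "https://resurge.com/welcome/?hop=iamatexan&s=dQxEaf40yI9SijRm6VAY%26&atid="

# Parse the query string; keep_blank_values retains the empty "atid" parameter.
params = parse_qs(urlparse(url).query, keep_blank_values=True)

# The "hop" parameter identifies which ClickBank affiliate gets the commission.
print(params["hop"][0])  # iamatexan
```

Because the `hop` value is identical in every post, every sale made through any of the 9 pages would credit the same affiliate account.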
Avaaz found no evidence to suggest that Resurge or Click Sales Inc. are aware that Resurge’s products are being marketed in connection with the spread of disinformation.
Facebook must notify and issue fact-checked corrections to every person who saw content that independent fact-checkers have determined to be false or misleading. Particularly at this tense moment, five months ahead of the elections, it is critical that American users be notified when they have viewed false or misleading content and be provided with corrections when they are available. Research commissioned by Avaaz and conducted by leading experts shows that providing corrections to social media users who have seen false or misleading information can decrease belief in disinformation by almost 50%. Multiple other peer-reviewed studies have demonstrated that effective corrections can reduce and even eliminate the effects of disinformation.
Facebook must detox the algorithms that decide what people see, so that known disinformation is downgraded in user feeds instead of being amplified. Disinformation, and the pages and channels of repeat offenders who spread it, should also be excluded from the algorithms that recommend content. Social media algorithms too often prioritise keeping users on the platform over keeping them safe and well-informed.
For further information on Correct the Record and Detox the Algorithm, please see our legislative principles for tackling disinformation.
Avaaz’s team of researchers used a combination of Facebook’s public data, analytics tools like CrowdTangle, independent US fact-checking organizations, and statistical modelling to arrive at what we believe is a credible, though conservative, picture of the total estimated views and impact of some of the most shared and fact-checked disinformation narratives in the US pertaining to the anti-racism protests.
Avaaz defines disinformation as “verifiably false or misleading information, as assessed by reputable independent fact-checking organisations, with the potential to cause public harm, for example by undermining democracy or public health, or encouraging discrimination or hate speech”.
To better understand the scale of protest-related disinformation on Facebook, Avaaz began collecting and analysing posts about the protests from May 26, 2020 to June 9, 2020.
For the purpose of this analysis, our investigative team then specifically reviewed a broad list of independently fact-checked disinformation content, targeting users spanning the political and ideological spectrum. The team sorted posts that shared the same theme into a specific narrative “box”. Our team then chose the set of 12 disinformation narratives to focus on for this quick snapshot based on the following criteria:
Were the relevant narratives fact-checked by reputable, independent U.S. fact-checking organisations?
Did the posts promote verifiably false or misleading content related to the protests that could cause public harm?
Did the posts detected for a given narrative attain an aggregate of at least 50,000 estimated views?
For each of these narratives, our team then used CrowdTangle software and direct observation to:
Find text, images, videos, or links to external web pages promoting these narratives that were shared by pages, groups, and profiles on Facebook;
Assess the engagement with each item and record the total number of interactions (likes, shares, comments) each received;
Arrive at the final estimate of the number of times each item was likely viewed on the platform.
Facebook discloses the number of views for videos, but for posts containing only text and images the platform displays only the number of shares, likes, and comments. Therefore, to estimate viewership for text and image content, Avaaz designed a metric based on the publicly available statistics of the Facebook pages creating or sharing the false or misleading content in our report. For each page, we took the total number of owned and shared video views between May 26 and June 9, 2020, and divided it by the total number of owned and shared video interactions.
Facebook reports a “video view” only after three seconds, while an image or text post can be “viewed”, and have an actual impact, in under three seconds. This approach also assumes viewership is distributed uniformly across content types. The estimate of total views in this study is therefore probably lower than the content’s actual total viewership; more data from Facebook would be needed to verify its accuracy.
Step 1: Views/interaction ratio calculation: We computed a global views/interaction ratio of 20.19 across all the pages that shared the selected disinformation narratives. The ratio was computed by dividing the total video views collected by these pages in one year by their total video interactions over the same timeframe.
Step 2: Total views calculation: For each piece of disinformation, we multiplied our views/interaction ratio of 20.19 by the number of interactions for that post or web link, as provided by CrowdTangle. The only exception was videos, for which we used the actual view counts provided by Facebook. Adding all these real and estimated views together, we obtained the final estimate of 26,543,800 views.
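The two steps above can be sketched as follows. The 20.19 ratio is the figure reported in this study; the post counts in the example are hypothetical:

```python
VIEWS_PER_INTERACTION = 20.19  # global views/interaction ratio from Step 1

def estimate_views(posts):
    """Estimate total views for a list of posts (Step 2).

    Each post is a dict with an 'interactions' count; video posts
    instead carry the actual 'video_views' figure Facebook discloses.
    """
    total = 0.0
    for post in posts:
        if "video_views" in post:
            # Videos: use the real view count reported by Facebook.
            total += post["video_views"]
        else:
            # Text/image posts: approximate views from interactions.
            total += post["interactions"] * VIEWS_PER_INTERACTION
    return round(total)

# Hypothetical example: one video post and two image posts.
sample = [
    {"video_views": 150_000},
    {"interactions": 4_000},
    {"interactions": 1_250},
]
print(estimate_views(sample))
```

Applying this calculation to the real interaction counts CrowdTangle reported for each of the 12 narratives, and summing, yields the 26,543,800 total cited above.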
Social media platforms have different means of measuring user engagement:
Views/impressions: Impressions are the number of times a particular piece of content was served to a user, whereas views are the number of times a user actually sees a specific piece of content. Someone who merely scrolls past a picture in their Facebook news feed, regardless of whether they actually looked at it, would be counted as a view on Facebook. Impressions and views for organic content are not publicly known.
Video views: The only view measure Facebook shares is the number of “video views”, where a video view is counted once a user has watched three seconds of a video. As described in our methodology above, we used that higher-bar definition of a “video view” as the baseline for our estimation of total views of all the content, even though content can have an actual impact in less than three seconds.
Reach: The number of distinct individual users who have engaged with specific content.
Interactions: The number of likes, shares, comments, etc. on a specific post.
Clicks: The number of users who actually clicked on the content, for example to visit a website it links to or to expand a picture.
Facebook does not make metrics on how many individuals have viewed particular disinformation content fully available to researchers outside specific research programs at the company, apart from videos and, in some cases, advertising content that a user has purchased. Therefore, Facebook alone has a full picture of the views, reach, interactions, and clicks for the 12 disinformation narratives we explored in this investigation.
As discussed above, in this report Avaaz approximates a “view” using only publicly available metrics: for each page, we divided the total number of owned and shared video views between May 26 and June 9, 2020 by the total number of owned and shared video interactions. Because Facebook reports a “video view” only after three seconds, while an image or text post can make an impression on a user in under three seconds, the estimate of total views in this study is probably lower than the content’s actual total viewership.
We urge Facebook to transparently share the number of views and reach of disinformation content on its platform, as well as the reach of relevant fact-checked content, and to work to ensure that the prevalence of fact-checked content supersedes that of disinformation content.