On July 31, Facebook announced the removal of around 32 pages and accounts on its platform for coordinated and inauthentic behavior.
Facebook shared eight pages with @DFRLab 24 hours before the takedown, and our initial findings were published within that timeframe.
The pattern of behavior by the accounts and on the pages in question makes one thing abundantly clear: they sought to promote divisions and set Americans against one another. Their approach, tactics, language, and content were, in some instances, very similar to accounts run by the Russian “troll farm,” the Internet Research Agency, between 2014 and 2017.
The malign influence operation showed increasing sophistication. Three aspects of our initial findings warrant follow-up: converting online engagement into real-world action, shifting tactics to cover tracks, and crossover posting of content from bad actors across different platforms or accounts.
@DFRLab intends to make every aspect of our research broadly available. The effort is part of our #ElectionWatch work and a broader initiative to provide independent and credible research about the role of social media in elections, as well as democracy more generally.
This post investigates how the set of pages Facebook took down on July 31 had a tailored focus on building a mostly static online audience and then translating it into kinetic political activity in the United States.
The Facebook accounts that promoted divisive issues in America and were shuttered by the platform on July 31, 2018, were most probably run by a successor to the Russian influence operation that targeted the United States from 2014 to 2017, @DFRLab has concluded.
One key question is whether the accounts were run by a successor to the Russia-based “Internet Research Agency” (IRA), which targeted the U.S. in 2014–17. Facebook itself did not draw a firm conclusion, saying that “we can’t say for sure whether this is the IRA with improved capabilities or a separate group.”