Facebook (FB.O) is taking a more aggressive approach to shut down coordinated groups of real-user accounts engaging in certain harmful activities on its platform, using the same strategy its security teams take against campaigns using fake accounts, the company told Reuters.
The new approach, reported here for the first time, uses the tactics usually taken by Facebook's security teams for wholesale shutdowns of networks engaged in influence operations that use false accounts to manipulate public debate, such as Russian troll farms.
It could have major implications for how the social media giant handles political and other coordinated movements breaking its rules, at a time when Facebook's approach to abuses on its platforms is under heavy scrutiny from global lawmakers and civil society groups.
Facebook said it now plans to take this same network-level approach with groups of coordinated real accounts that systemically break its rules, whether through mass reporting, where many users falsely report a target's content or account to get it shut down, or brigading, a type of online harassment where users might coordinate to target an individual through mass posts or comments.
In a related change, Facebook said on Thursday that it would be taking the same type of approach to campaigns of real users that cause "coordinated social harm" on and off its platforms, as it announced a takedown of the German anti-COVID restrictions Querdenken movement.
These expansions, which a spokeswoman said were in their early stages, mean Facebook's security teams could identify core movements driving such behavior and take more sweeping actions than removing posts or individual accounts, as the company otherwise might.
In April, BuzzFeed News published a leaked Facebook internal report about the company's role in the Jan. 6 riot at the US Capitol and its challenges in curbing the fast-growing "Stop the Steal" movement; one of the report's findings was that Facebook had "little policy around coordinated authentic harm."
Facebook's security experts, who are separate from the company's content moderators and handle threats from adversaries trying to evade its rules, started cracking down on influence operations using fake accounts in 2017, following the 2016 US election in which US intelligence officials concluded Russia had used social media platforms as part of a cyber-influence campaign - a claim Moscow has denied.
Facebook dubbed this banned activity by the groups of fake accounts "coordinated inauthentic behavior" (CIB), and its security teams started announcing sweeping takedowns in monthly reports. The security teams also handle some specific threats that may not use fake accounts, such as fraud or cyber-espionage networks or overt influence operations like some state media campaigns.
Sources said teams at the company had long debated how it should intervene at a network level for large movements of real user accounts systemically breaking its rules.
In July, Reuters reported on the Vietnamese army's online information warfare unit, whose members engaged in actions including mass reporting of accounts to Facebook, but also often used their real names. Facebook removed some accounts over these mass reporting attempts.
Facebook is under increasing pressure from global regulators, lawmakers and employees to combat wide-ranging abuses on its services. Others have criticized the company over allegations of censorship, anti-conservative bias or inconsistent enforcement.
An expansion of Facebook's network disruption models to affect authentic accounts raises further questions about how the changes might affect public debate, online movements and campaign tactics across the political spectrum.
"A lot of the time problematic behavior will look very close to social movements," said Evelyn Douek, a Harvard Law lecturer who studies platform governance. "It's going to hinge on this definition of harm ... but obviously people's definitions of harm can be quite subjective and nebulous."
High-profile instances of coordinated activity around last yearโs US election, from teens and K-pop fans claiming they used TikTok to sabotage a rally for former President Donald Trump in Tulsa, Oklahoma, to political campaigns paying online meme-makers, have also sparked debates on how platforms should define and approach coordinated campaigns.
REUTERS