Spam groups operating out of developing Asian countries such as Bangladesh, Cambodia and Myanmar are using Facebook to profit from events such as the Black Lives Matter protests, the upcoming US presidential election and ethnic tensions in Myanmar.
The bad actors have been using clickbait related to social issues and politics throughout 2020 to drive Facebook users to fraudulent websites that generate profit from ads or merchandise.
Facebook revealed the activity in a new report on "inauthentic behaviour" on its platforms. In the report, published Wednesday (October 21), Facebook said "inauthentic behaviour" (IB) is often financially motivated, as opposed to "coordinated inauthentic behaviour" (CIB) which is usually designed to manipulate public debate for a strategic goal.
While Facebook has focused a lot of its efforts on CIB as "the most egregious form of IB", it wants to publicise its efforts against other forms of IB to "advance the public's understanding" of "gray areas where harm and deception aren’t as clear cut".
This behaviour is often less sophisticated than operations run by networks with more nefarious intentions. Where CIB networks leverage a collection of fake accounts to spoof identity, IB activity primarily uses legitimate accounts to amplify and increase the distribution of content, Facebook said. The activity can sometimes involve fake accounts or other inauthentic assets, "but we typically see little attempt to obfuscate their identity from Facebook and only the most superficial attempts to construct a false identity", the social network wrote.
Much of the spam activity originates in countries with "cottage industries specialised in propagating deceptive schemes to exploit internet platforms", Facebook said. That includes Asian countries such as Bangladesh, Cambodia, Myanmar, the Philippines and Vietnam, as well as European markets such as Albania and Macedonia.
Bad actors behind IB are primarily driven by financial motivation, Facebook said, with much of the activity focused on driving people to off-platform websites filled with ads or merchandise. These websites may pretend to support a cause or be part of the same community as their target audience in order to convince users to part with their cash.
To facilitate this, the actors mislead people or Facebook about the popularity of content, the purpose of a community (i.e. Groups, Pages, Events), or the identity of the people behind it. One such method is called "abusive audience building", in which a Facebook Page switches identity and repeatedly changes its name to the latest trending topics and shares viral clickbait in order to build an audience.
Clickbait has historically been focused on celebrity gossip or animal-related memes, but in recent months spam actors have turned their attention to politics and social issues.
In May and June, Facebook took down 4 pages and 13 groups attempting to build audiences by posting viral content around the Black Lives Matter protests. In these particular cases, the spam actors leveraged topics including racial and social injustice and police brutality in the US to trick people into joining their groups and following their pages, and then directed them to ad farms or merchandise stores.
The pages and groups were created by several unconnected foreign spam groups from Botswana, Bangladesh, Cambodia and Vietnam, Facebook said.
Facebook said these activities can be mistaken for politically motivated influence operations at first glance, but they were in fact clickbait operations targeting people in the US to generate profit. Unfortunately, a worldwide crisis is a lucrative opportunity for fraud and spam.
"We continue to see deceptive actors try to exploit moments of crisis, tragedy and tensions around the world," Facebook said in its report. "IB actors often seek to leverage current events and hot-button issues like the Black Lives Matter movement or COVID-19, as well as celebrities and new TV shows to drive clicks to ad farms or merchandise sites."
"These actors are persistent and demonstrate adversarial intent which is why we keep evolving our policies and detection systems to take action against them," Facebook added.
Spam actors aren't just targeting US politics. In August and September, Facebook removed 655 pages and 12 groups tied to a number of separate spam networks in Myanmar. These networks misled people about the purpose of their pages and used fake accounts to evade Facebook's limits on the frequency of posting. They posted content ranging from celebrity gossip to local news, and some posts focused on politics in Myanmar, including support for the military and references to ethnic tensions.
Only in the most "egregious, harmful or adversarial scenarios" does Facebook remove accounts, pages and groups for IB violations. Usually a warning or a limit on platform usage suffices, it said.