How Facebook Groups Are Being Exploited To Spread Misinformation, Plan Harassment, And Radicalize People

Mark Zuckerberg wants to get a billion people in “meaningful” Facebook groups. But to get there he’ll have to battle the spammers, hackers, and trolls who exploit and hijack groups to make money or sow chaos.

One week after the mass shooting in Parkland, Florida, those searching on Facebook for information about the upcoming March for Our Lives were likely to be shown an active group with more than 50,000 members.

Called “March for Our Lives 2018 Official,” it appeared to be one of the best places to get details about the event and connect with others interested in gun control. But those who joined the group soon found themselves puzzled. The admins often posted pro-gun information and unrelated memes and mocked those who posted about gun control.

“I'm a retired federal law enforcement special agent. There is and never has been any reason for a civilian to have a high-capacity high velocity weapon,” posted one member on Feb. 20.

“Shutup fed and stop trying to spread your NWO BS,” was the top reply, which came from one of the group’s admins. (NWO is a reference to the “new world order” conspiracy theory.)

A few days later the group’s name was changed to “Kim Jong Un Fan Club,” and members continued to wonder what was going on.

The simple answer is they were being trolled. The more complicated one is that while Facebook groups may offer a positive experience for millions of people around the world, they have also become a global hotbed of spam, fake news, conspiracies, health misinformation, harassment, hacking, trolling, scams, and other threats to users, according to reporting by BuzzFeed News, findings from researchers, and the recent indictment of 13 Russians for their alleged efforts to interfere in the US election.

And it could get worse. Facebook recently announced that group content will receive more prominent placement in the News Feed and that groups overall will be a core product focus to help reverse an unprecedented decline in active users in the US and Canada. The unintended consequence is that the more than a billion users active in groups are being placed on a collision course with the hackers, trolls, and other bad actors who will follow Facebook’s lead and make groups even more of a focus for their activities.

“This vision could backfire terribly: an increase in the weight of ‘groups’ means reinforcement of Facebook’s worst features — cognitive bubbles — where users are kept in silos fueled by a torrent of fake news and extremism,” wrote Frederic Filloux, coauthor of the influential weekly media commentary newsletter Monday Note.

Renee DiResta, a security researcher who studied anti-vaccine groups on Facebook, told BuzzFeed News that groups are already a preferred target for bad actors.

“Propagandists and spammers need to amass an audience, and groups serve that up on a platter. There’s no need to run an ad campaign to find people receptive to your message if you can simply join relevant groups and start posting,” she said.

To her point, the recent indictment of the 13 Russians repeatedly cited Facebook groups as a focus of workers at the Internet Research Agency.

“By 2016, the size of many organization-controlled groups had grown to hundreds of thousands of online followers,” the indictment said.

One part of the Russian effort on Facebook left unmentioned in the Mueller indictment — and that has not previously been reported — is that Facebook’s group recommendation system encouraged fans of troll-run pages such as Blacktivist and Heart of Texas to organize themselves into groups based on their shared affinity for a page. To this day there remain many groups with automatically generated names such as “Friends Who Like Blacktivist” or “Friends Who Like Heart of Texas.” These groups appear to be small and inactive, but the fact remains that Facebook prompted Americans to organize into fan groups of Russian troll pages.

There’s no question that groups offer value to many Facebook users. They keep friends and former classmates close, and they played a key role in organizing the West Virginia teachers’ strike. They are the basis for a thriving global community of buy, sell, and barter networks. Groups were used to raise money for victims of the Las Vegas shooting, and they help provide support to, and prevent suicide among, US military personnel. Facebook groups can restrict membership by being designated “secret” or “closed,” and as a result people often share deeply personal information and experiences because they feel protected from the prying eyes of search engines and the public at large.

That’s the case for Female IN, a secret Facebook group that today boasts a membership of more than 1.5 million women around the world. It was created by Lola Omolola, a Nigerian-born journalist now living in Chicago. She started it after more than 250 schoolgirls were kidnapped by a terrorist group in Nigeria. Many of the women in the group live in, or are originally from, African countries. But Omolola said it has grown to include women from other parts of the world.

Secret groups like FIN are completely invisible to anyone who is not a member, and Facebook users can only join if they are invited by a current member. This is different from closed groups, which can be found via search, show up on user profiles, and enable anyone to request to be added to the group.

“We discuss everything from everyday stuff like your children, your healthcare, and your individual interactions with people. We go from there to talking about domestic violence. People are sharing in real time what happens at their homes,” Omolola told BuzzFeed News in an interview facilitated by Facebook’s PR team. “Generally we are very actively and thoughtfully healing a population of women who are used to being silenced, and whose voices have never really historically meant that much.”

But even a group that Facebook itself points to as an example of the product’s value has to wage a battle against bad actors. Omolola and other FIN admins work constantly to keep men and spammers out of the group. They have to warn members about imposter FIN groups on Facebook that try to leverage her group’s success and brand among women. A post to the public FIN Facebook page from last fall warned women of a “FAKE FIN run by A MAN!!!”

“Some people try to use the [FIN] name to get a lot of people to their groups,” Omolola said. “We have contacted Facebook and Facebook knows about it, and are working on ways to protect us.”

Omolola says she is working on trademarking the group’s name so she can ask Facebook to enforce her trademark and take imposter groups down faster. Right now, it’s mostly up to her and her members to go into fake FIN groups and warn people that they’re scams.

Jennifer Dulski, the head of groups and community at Facebook, told BuzzFeed News that there is now a dedicated “integrity” team for groups, which includes people who specialize in security, product, machine learning, and content moderation. She pointed to new tools launched last year that help admins screen and easily remove members, and that let admins clearly state the rules of a given group on its About page. (Omolola said these new features are all big time savers.)

“There is some negative content and behavior happening on Facebook, including in groups, and we take that really seriously,” Dulski said.

She emphasized that the experience for most users is positive.

“It’s amazing to see what is actually happening in Facebook groups, and the vast majority of activity and content in these groups is not only positive but also meaningful for the people involved in them.”

That messaging — bad content and actors make up a tiny fraction of overall groups activity — echoes what Mark Zuckerberg said after questions were raised about the spread of misinformation on Facebook and its impact on the 2016 election.

DiResta sees a parallel, suggesting groups are at the same stage pages and the News Feed were prior to Facebook’s 2016 wake-up call about fake news: rife with manipulation and lacking proper oversight.

“We need to avoid a repeat of the kind of persuasive manipulation we saw on pages, so Facebook needs to pay attention to these issues as they continue to increase their emphasis on groups,” DiResta said.

That emphasis on groups is now a cornerstone of Zuckerberg’s vision for the company and product. Last summer he announced an ambitious goal to drastically increase the number of Facebook users in “meaningful” groups from 100 million to 1 billion.

“If we can do this, it will not only turn around the whole decline in community membership we've seen for decades, it will start to strengthen our social fabric and bring the world closer together,” he said in a speech at Facebook’s first Community Summit last summer.

But to get to a billion, Facebook will have to acknowledge and battle the spammers, hackers, and trolls who constantly exploit, take over, and buy and sell groups in order to make money or sow chaos. It will also have to confront the fact that its own recommendation engine at times pushes users into conspiracy theorist groups, or into those geared toward trolling, harassment, or illicit online activity.

The global exploitation of groups

As the 2016 election moved into its final stretch, members of a Facebook group called Republicans Suck suddenly experienced an influx of new members who spammed the group with links to questionable stories on websites they’d never heard of.

Bobby Ison, a Kentucky man who was a member of the group, told BuzzFeed News that he noticed the administrators and moderators of the group began to change and “some of them were obvious foreign accounts. Eastern European countries. Serbia, Bosnia, Croatia.”

Ison said he and others learned that the Facebook account of one of the group’s original admins had been hacked. With control of the admin’s Facebook profile, the hackers were then able to add whoever they wanted as admins, and remove the group’s original leaders.

“When they got his [account], they booted the other admins, and moved more of theirs in,” Ison said. “It was rather sad to witness.” (The admin whose account Ison said was hacked did not respond to inquiries from BuzzFeed News.)

Eventually, former admins and members fled to start a new, closed group called Republicans Suck....again!!!!

“As some of you know, the old REPUBLICANS SUCK has been hacked and taken over by hackers and trolls, which is something that is happening in several groups,” reads the pinned message for the group from one of its administrators.

Ison said the old group’s new overseas admins stepped up their spamming efforts in order to drive traffic from the group to their sites, presumably so they could earn ad revenue.

“It turned into a horrible page after they finally took over. Posting some of the most outlandish stories I've seen,” he said.

BuzzFeed News spoke via Facebook Messenger with one of the profiles currently listed as an admin of the group. That account has previously posted online about a Ford van for sale in Macedonia and written posts in Macedonian. Many of the account’s earliest posts in groups were for online moneymaking schemes. The person running the account said they were only added as an admin of the group two months earlier and didn't know anything about the group takeover last year.

“For the first time I hear from you that this group is stolen,” they wrote. They said they are not profiting in any way from the group. (BuzzFeed News could not confirm whether the profile’s name is linked to a real person.)

“We have no interest, we have no profit, we do not do with advertisements,” they said, and offered to make a BuzzFeed News reporter an admin of the group as a sign of goodwill.

BuzzFeed News previously documented how political Facebook groups were exploited to spread political fake news to Americans during the 2016 election and beyond. An article published on Election Day revealed that Macedonian spammers used fake profiles to spam Trump and Bernie Sanders groups to generate engagement for their often false pro-Trump stories.

The tactic of spamming groups is by no means restricted to US politics or those targeting US audiences.

Rappler, an independent news website in the Philippines, last month published a story about a seemingly fake Facebook profile that spammed groups with links to websites that carried positive news about President Rodrigo Duterte. At one point the profile took over a fan group for a South Korean actor and changed its name to “President Duterte Supporters.”

Rappler found that the account had posted hundreds of links to pro-Duterte news in Facebook groups since last summer. “She had also posted 307 times in various groups linking to a website called ‘Philippine Republic News’ which carries fake stories such as Oprah's support for President Duterte,” the story said.

A report on troll farms and fake news in the Philippines by two researchers included a case study showing that paid trolls tasked with helping win an election “came up with fake profiles and populated a Facebook group dedicated to the city they were trying to win.”

Facebook groups are also an engine of misinformation in Myanmar. Thura, a worker with a Myanmar social media research organization, told BuzzFeed News by email that “Facebook groups are being used widely to spread hatred.” (BuzzFeed News agreed to conceal his identity due to security and safety concerns related to the work he does in Myanmar.)

Thura said people are using groups to spread hate and misinformation because they’re aware that less of the content posted to pages will reach the News Feed. So they focus on groups as a cheaper and more reliable way to reach people.

“Since Facebook pages do not normally go into News Feed unless they pay ads, groups are a good way to raise visibility of some of the fake news,” Thura wrote. “Groups are also a good strategy for [generating] engagement for a post from a page, meaning once a page posts a content, their network share them in several different groups.”

Spamming fake stories into groups is also the preferred tactic of American fake news publishers. Jestin Coler is a California man who ran more than a dozen early fake news sites such as National Report starting in 2013. He exited the fake news game last year and now says he’s focused on trying to use his experience to help battle online misinformation. Coler told BuzzFeed News that spamming groups was, and continues to be, a key traffic strategy for fake stories — and it’s still not being dealt with by Facebook.

“Joining Facebook groups that are related to the content being promoted, then spamming links throughout those groups using various aliases, is quite effective,” he said. “Members of the group then essentially become ‘bots’ and often share content to their network (outside of the group) creating a more organic-looking campaign.”

Coler said it was standard practice to identify specific Facebook groups related to the topic of a fake story and then join them all at once to enable link spamming.

“It would not be unusual to request to join 30 to 40 groups at a time, wait a few minutes to get approvals, and then spam links throughout those groups,” he said.

Facebook did not respond to questions about spam in groups.

Spamming groups is a global tactic on Facebook, and it often involves clickbait and misinformation. For example, in the summer of 2016, a network of sites sprang up that published false stories about bombings and terrorist attacks taking place in different cities. The stories were almost identical except that the location changed in each version. While investigating the origins of the hoaxes, BuzzFeed News documented how a young man in the Republic of Georgia used his Facebook account to share a link to a false story about a bombing in Philadelphia in a series of Facebook groups focused on that city.

His lack of knowledge of American pop culture also caused him to share the link in a group dedicated to the TV show It’s Always Sunny in Philadelphia. He apparently thought it was a group focused on the city. (He did not reply to requests for comment at the time.) One town in Kosovo is also home to a network of groups that spread fake news to Americans, according to a recent report by Media Matters.

Facebook’s intention to give even more visibility to group content in the News Feed will now make this tactic even more appealing. It could also result in more takeovers of groups. There’s evidence this is already happening.

Two weeks after Facebook announced its latest News Feed algorithm changes, former admins and members of “Mystic Fire Native Heart,” a group with tens of thousands of members, began posting to warn that it had been taken over by hackers from Eastern Europe.

“Our whole group ... with 62000+ members has been hacked today morning (Indiana time) by a person from Albania,” wrote an Indiana radio host named Bruce Lee Robinson. The group was dedicated to Native American content and discussions, which is a topic that’s rife with abuse and exploitation on Facebook.

Keyword squatting and targeted harassment

Another tactic that could become more frequent with the elevation of groups by Facebook is what researcher Joan Donovan calls “keyword squatting.” The group that changed its name to “March for Our Lives 2018 Official” in the wake of the school shooting in Parkland, Florida, is a perfect example. That group has over time gained tens of thousands of members thanks to the admins changing its title to different topical names that are likely to attract new members.

When Stephen Hawking died last week, the group changed its name to “RIP STEPHEN HAWKING 2018” and as a result showed up as a top-three result in a groups search for “stephen hawking.”

“This is a feature that they have learned to leverage and you can imagine if they have learned to do it there are hundreds if not thousands of other groups changing the keyword weekly in order to get attention,” said Donovan, who leads the media manipulation project for the research institute Data & Society. (Facebook declined to comment on that group’s tactics.)

Thura said keyword squatting is a common tactic in Myanmar, too. People will start a Facebook group with a topic that has broad appeal and is not connected to news. Then, once it’s gained enough members, the admins switch its name and start sharing false stories.

“The strategy they used often is that they start with a different name and then once they have a large amount of users, they change it to a different name such as ‘Let’s Share News’ group,” Thura said.

Dulski, who oversees groups at Facebook, said the company has invested in new tools to enable group members to report content that may violate the terms of service, or a group’s specific rules.

“One is a tool for members to report posts that might be violating the rules, and they can report this to the admins of the group and to Facebook,” she said. “This allows admins to quickly and easily review and keep bad content from appearing.”

In cases of hate speech or offensive content, Dulski said Facebook uses machine learning to automatically flag posts that may be violating the terms of service, regardless of whether people report them.

Thura said he and others have found it’s often pointless to flag content or complain to group admins about fake news since they’re often the ones spreading it.

There are in fact groups that exist solely to engage in activities that violate Facebook’s terms of service. One group, Republic of Kekistan, amassed more than 30,000 members before being removed by Facebook earlier this year. BuzzFeed News joined the group last year and watched as members coordinated online harassment campaigns, most of which were targeted at trans women and left-leaning Facebook pages. The group was also home to hate speech, such as a post in which one member shared a photo of a handgun and asked, "What kind of Muslim repellant do y'all carry?"

The process for planning raids was simple: One group member shared a link to a piece of content they didn’t agree with and added the word “raid” to signal like-minded members to flood the comments of the content in question with nasty, often hate-filled messages and other types of trolling. Some calls for raids went unnoticed by the rest of the group, but others funneled dozens of hateful comments to unsuspecting victims.

At one point, a post in the group asked members to “raid the comments” of a fundraising post for the family of Heather Heyer, the woman who was killed during the Charlottesville demonstrations last year. Another post encouraged members to post laughing reactions to a Facebook post about a trans teen who committed suicide. (The group’s description page at one point warned members, “No raid posts allowed. The safety of the nation must be guarded against retaliatory attacks.” But it appeared that policy was enforced loosely, if at all.)

Planning coordinated online attacks is against Facebook policy, but members of Republic of Kekistan seemingly stayed under Facebook’s radar and went unpunished for months. Sophie Labelle, a cartoonist from Montreal and a trans woman, was targeted last May. Labelle writes a web comic called Assigned Male and is the author of several comic books about being trans. Labelle told BuzzFeed News she’s used to the online harassment that comes with being a trans public figure, but this was an extreme case.

Trolls posted her home address online and she had to move. Labelle said her roommate, who’s also trans, felt threatened too. The trolls also targeted a bookstore hosting her book launch, and as a result the event was canceled. (That raid worked out in Labelle's favor: the added attention helped her book sell out its first printing.)

“I feel pretty powerless. It’s kind of a given that when you show support to me it makes you a target,” she said.

Labelle said there are conspiracy theories about her online and even her readers are sometimes targeted if they leave a comment on one of her social media posts.

Labelle suspects Republic of Kekistan was behind the May harassment, but she does not know for sure. As payback, her supporters decided to band together to coordinate mass-reporting of the Facebook group for violating the platform’s terms of service, and Facebook eventually shut it down temporarily that same month. The group reemerged at some point after the ban and was active again as recently as January, according to an online archive. Facebook did not respond to a request for comment about when and why Republic of Kekistan was taken offline.


A marketplace of black-hat group services

Groups have become attractive enough for both legitimate and bad actors that there’s now a flourishing online marketplace of apps and services related to exploiting them.

There are multiple software apps that can automate posting links in groups and also automate the process of joining groups and inviting other profiles into them. For a small fee, you can buy new members to make your group seem more popular than it is, pay to spam a link of your choosing in groups with large numbers of members to try to drive traffic, or hire someone to go into a group and capture the email addresses of members in order to enable further spamming and targeting.

All of the aforementioned services violate Facebook’s terms of service, but they are easily found with a simple online search, or by going to a freelancing site such as Fiverr. Many of these services are offered by people based in countries such as Bangladesh, Pakistan, and India.

“Looking for a Facebook promotion expert to help you post and promote your products, services to millions of real and active Facebook group members ???” read one recent Fiverr ad. “Search no more.I am here to give you full expert service that will promote your ads or links to about 40 million real and active Facebook group members.”

To learn how these providers add hundreds or thousands of members to groups, BuzzFeed News purchased a total of 6,000 members for three groups it set up. One person on Fiverr charged $6 to provide 5,000 members for a group. BuzzFeed News then purchased two more sets of 500 members from different providers that were found by simply searching Google for “buy facebook group members.”

One company charged $25 to add 500 members located in the US or UK. Another charged just $6 for 500 members whose accounts were not specifically based in those two countries. In each case the process was the same: After the order was placed, BuzzFeed News was contacted by email or Facebook and told to make a specific Facebook profile an administrator of the group. Once that was done, the administrator quickly began adding other Facebook accounts to the group. Suddenly these newly created groups appeared more popular than they were, and they were likely to show up higher in search results on Facebook.

The profiles added to the groups appeared to belong to real people, though a sampling of them found accounts that had been inactive for a year or two. The orders that did not specify US and UK profiles saw accounts added primarily from Bangladesh and India.

A man in Pakistan who helped fulfill one of the orders told BuzzFeed News in a Facebook Messenger chat that he could easily add large numbers of profiles to any group in a short amount of time.

“I do even 500,000 members in single group easily,” he said.

He said the accounts he adds to a group belong to real people. The key is that he controls many Facebook accounts that he uses to friend real people. Once two accounts are friends, Facebook easily allows one profile to add the other into a public group without requiring their permission. (In the case of closed and secret groups, the invited profile must accept the request.)

The result is that people all over Facebook are being involuntarily added to groups because at some point they unknowingly friended a dubious online marketer.

The man told BuzzFeed News he uses a script to automatically add members in batches of 500 to 700. If he does more than that at a time he risks being banned by Facebook. But that limit appears to do little to deter those selling group members. (Once he saw that the BuzzFeed News reporter had a verified Facebook profile, he offered to pay to get profiles he controls verified.)

Facebook groups themselves are also places to purchase black-hat online marketing services. BuzzFeed News joined a group dedicated to Google’s AdSense advertising network and saw a post from an admin that listed his services. One item for sale? Facebook groups.

“I have facebook trade and sell groups for sale 20k to 40k members,” he wrote.

Facebook declined to comment on the black-hat group services offered on Fiverr and within Facebook groups themselves.


Radicalization by the recommendation engine

DiResta, the security researcher, first saw the risks of groups in 2015 while researching health conspiracy content and communities on social networks. She joined anti-vaccine parenting groups on Facebook and watched as well-meaning parents became increasingly radicalized in their views of Western medicine. She also saw false and misleading links spread quickly within groups.

DiResta also documented how group members coordinated the sharing of specific links on Twitter and other platforms to create the impression of an outpouring of support for a specific point of view. “They didn’t have bots but they were effectively replicating the function of bots,” DiResta said. “They were using these groups to coordinate and spread these messages.”

Facebook’s recommendation system also began its own process of algorithmic radicalization. DiResta noticed that as her account became more involved in anti-vax groups, Facebook shifted the kinds of groups it recommended to her. She was soon being shown groups about chemtrails, fluoride conspiracies, and flat Earth. As the 2016 election moved into its final stretch, Facebook suggested she join groups dedicated to the Pizzagate conspiracy theory. DiResta calls it “radicalization via the recommendation engine.”

She provided two screenshots to BuzzFeed News, both taken in December 2016, that show Pizzagate group recommendations for her profile. The largest group being pitched had more than 12,000 members at the time.

“The groups recommendation engine is a conspiracy correlation index,” DiResta said.

Similarly, after BuzzFeed News joined the group dedicated to gaming and cheating AdSense, Facebook’s recommendation engine surfaced a list of groups dedicated to the same topics, as if to say, “We see you enjoy ad and click fraud. Try these groups to learn more.”

When Republic of Kekistan was active, one of the groups Facebook suggested people interested in it join was called “Stop Geoengineering, Chemtrails & Weathercontrol.”

Facebook also uses data about group membership to recommend new friends. At the same time that the groups recommendation engine can push people further to the fringe, Facebook will suggest new friends who reinforce these perspectives.

Another concern for DiResta is how secret groups are used to further the radicalization process. When she was researching the anti-vax community she found it was easy to join open and closed groups. But she soon discovered that people used secret groups to exchange the most sensitive information and even engage in illegal activity.

“People would go into a group and say, ‘Hey, I need to get a medical exemption [for vaccines] for my kid’ and someone would say, ‘Hang on, I’ll DM you.’ Then gradually they recruit you into a secret group,” she said. “There was a whole underground network of information sharing around which doctors to go to in order to get fraudulent medical exemptions.”

DiResta said Facebook must find a way to let people search for and talk about the topics they want without actively radicalizing or polarizing them.

“People have a right to search for Pizzagate and join Pizzagate groups, yes,” she said. “But for Facebook to be continuously, proactively surfacing these types of groups to people, that I think is where the ethics of the recommender system is insufficiently developed right now.”

DiResta said Facebook needs to recognize groups are “a potential radicalization pathway.”

“At a minimum, it needs to be studying how often people respond to these suggestions, and how groups may be impacting polarization,” she said. “Only Facebook has this data.”

Tellingly, even those happily engaging in conspiracy theory groups on Facebook express frustration with how the product is being exploited. Last month a member of the PizzagateUncompromised group (7,500 members) posted a warning.

“The groups Exposing the Rothschilds and Pizzagate Exposed were hijacked,” he wrote.

It happened the same way that “Republicans Suck” and “Mystic Fire Native Heart” were taken over.

“It happens in a lot of groups,” replied one person. “Time to look for some good alternatives to FB imo.” ●
