After the 2016 election, Facebook knew it had a problem. The fake pages and accounts created by the Kremlin-backed Internet Research Agency had spread across the social network and attracted heavy engagement from real users. Facebook knew it had to get the situation under control.
But years later, Facebook’s own internal investigation teams found that troll farms were still reaching mass audiences, even though they had few direct followers. The company’s own algorithms pushed troll content to users who had never expressed interest in the pages, exponentially expanding the trolls’ reach. A report detailing the investigation was leaked to MIT Technology Review by a former employee.
At the time the report was written in 2019, troll farms were reaching 100 million Americans and 360 million people worldwide each week. In any given month, Facebook showed troll farm posts to 140 million Americans, most of whom had never followed any of the pages. Rather, Facebook’s content recommendation algorithms had pushed the content on them. “A vast majority of their ability to reach our users comes from the structure of our platform and our ranking algorithms rather than user choice,” the report says.
Troll farms had a particularly outsized footprint among users in the US. While more people viewed the content globally in gross numbers (360 million each week, by Facebook’s own accounting), troll farms were reaching more than 40 percent of all Americans.
The report, written by Jeff Allen, a former Facebook data scientist, attributes the problem to the company’s prioritization of engagement. Facebook, he said, knows very little about content producers: who posted something was simply not a factor in the News Feed ranking algorithm.
“There are many extremely sophisticated collaborative filtering algorithms, but they are all based on engagement,” Allen wrote. “When the content producers who win in that system are exploiting communities on our platform rather than building and supporting them, it is clear that the ranking system does not reflect our company values. So much so that it is actually working against us.”
Many of the pages originate in countries on the Balkan Peninsula and are aimed at foreign audiences, with Americans the primary focus, the report says.
The popularity and reach of the troll farms led Allen to conclude that agents of Russia’s Internet Research Agency (IRA) could likely exploit the same techniques, or even use the same pages, to reach American users. “If troll farms are reaching 30 million US users with content targeting African Americans, it shouldn’t surprise us at all if we find that the IRA currently has a large audience there as well,” Allen wrote.
Additionally, troll farms were able to slip their content into Instant Articles and Ad Breaks, two Facebook programs that give partners a cut of the ad sales that run alongside their content. “In Instant Articles, there was a period where maybe up to 60% of Instant Article reads were on scraped content, which is the preferred article-writing method of troll farms,” Allen said. Facebook, in other words, had been inadvertently paying the troll farms.
Facebook “had already been investigating these issues” when the report was circulated internally, a Facebook spokesperson told MIT Technology Review. “Since then, we have formed teams, developed new policies, and collaborated with industry peers to address these networks. We have taken aggressive enforcement measures against such inauthentic domestic and foreign groups and have shared the results publicly on a quarterly basis.”
Ars has submitted additional questions to Facebook and we will update this story if we get a response.
Users who viewed content from troll farms tended to split into two camps, Allen wrote. “One camp doesn’t realize that the pages are run by inauthentic actors who are exploiting their communities. They tend to love these pages. They like how entertaining the posts are and how the posts reaffirm beliefs they already hold,” he wrote. “The other camp realizes that the pages are run by inauthentic actors. They hate the shit out of everyone who loves these pages. They hate these pages with a passion that even I find impossible to match.”
The latter group was actively telling Facebook about the problem. “Our users are literally trying to tell us that they feel exploited by these pages,” Allen said.
As an example, Allen cited a user who discovered a troll farm page targeting Native Americans. The trolls stole art and sold reprints of it on T-shirts that were often never shipped to customers, the user said. “This whole group is a fraud network,” the user wrote.
The troll farms highlighted in the report primarily targeted four groups: Native Americans, Black Americans, Christian Americans, and American women. As of October 2019, when the report was written, troll farms ran the majority of the top pages for several of those groups, including all of the top 15 pages targeting Christian Americans, 10 of the top 15 targeting Black Americans, and four of the top 15 targeting Native Americans. When MIT Technology Review published its story, five of the troll farm pages were still active: three targeted Black Americans, one targeted Christian Americans, and one targeted Native Americans.
Much of the content these groups posted, though frequently stolen, apparently did not violate Facebook’s content rules. Still, that doesn’t mean it was harmless, Allen said. “The bottom line is that regardless of whether or not an intellectual property violation is occurring, posting strictly non-original content violates our policies and exposes our communities to exploitation,” Allen explained.
Allen believed the problem could be solved relatively easily by incorporating “Graph Authority,” a way of ranking users and pages similar to Google’s PageRank, into the News Feed algorithm. “Adding even a few simple features like Graph Authority and turning the dial down on purely engagement-based features would probably pay off in both the integrity space and … probably engagement as well,” he wrote.
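To see why a PageRank-style signal would demote troll pages, consider that authority flows along real links: a page is important if important users and pages point to it. The report does not describe Graph Authority’s actual implementation, so the following is only an illustrative sketch of the general PageRank idea; the follow graph, damping factor, and iteration count are hypothetical.

```python
# Illustrative PageRank-style "graph authority" sketch.
# The graph, damping factor, and iteration count are assumptions
# for demonstration, not details from Allen's report.

def graph_authority(links, damping=0.85, iterations=50):
    """links maps each node to the nodes it points to (e.g. follows)."""
    nodes = set(links) | {t for targets in links.values() for t in targets}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        # Every node gets a small baseline; the rest is passed along links.
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for source, targets in links.items():
            if targets:
                share = damping * rank[source] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical example: real users follow a genuine community page,
# while a troll page has no inbound links despite posting heavily.
follows = {
    "user_a": ["community_page"],
    "user_b": ["community_page", "user_a"],
    "community_page": ["user_a"],
    "troll_page": [],
}
scores = graph_authority(follows)
```

In this toy graph the troll page, which no one links to, ends up with only the baseline score, while the community page accumulates authority from its followers. This is the intuition behind Allen's point: a producer-aware signal like this, unlike pure engagement, is hard to inflate just by posting attention-grabbing content.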
Allen left Facebook shortly after writing the report, MIT Technology Review reports, in part because the company “effectively ignored” his research, according to a source.