
Facebook has known it has a human trafficking problem for years. It still hasn’t fully fixed it

Jakub Porzycki/NurPhoto/Getty Images
Facebook has for years struggled to crack down on content related to what it calls domestic servitude.

By Clare Duffy, CNN Business

Facebook has for years struggled to crack down on content related to what it calls domestic servitude: “a form of trafficking of people for the purpose of working inside private homes through the use of force, fraud, coercion or deception,” according to internal Facebook documents reviewed by CNN.

The company has known about human traffickers using its platforms in this way since at least 2018, the documents show. It got so bad that in 2019, Apple threatened to pull Facebook and Instagram’s access to the App Store, a platform the social media giant relies on to reach hundreds of millions of users each year. Internally, Facebook employees rushed to take down problematic content and make emergency policy changes to avoid what they described as a “potentially severe” consequence for the business.

But while Facebook managed to assuage Apple’s concerns at the time and avoid removal from the app store, issues persist. The stakes are significant: Facebook documents describe women trafficked in this way being subjected to physical and sexual abuse, being deprived of food and pay, and having their travel documents confiscated so they can’t escape. Earlier this year, an internal Facebook report noted that “gaps still exist in our detection of on-platform entities engaged in domestic servitude” and detailed how the company’s platforms are used to recruit, buy and sell what Facebook’s documents call “domestic servants.”

Last week, using search terms listed in Facebook’s internal research on the subject, CNN located active Instagram accounts purporting to offer domestic workers for sale, similar to accounts that Facebook researchers had flagged and removed. Facebook removed the accounts and posts after CNN asked about them, and spokesperson Andy Stone confirmed that they violated its policies.

“We prohibit human exploitation in no uncertain terms,” Stone said. “We’ve been combatting human trafficking on our platform for many years and our goal remains to prevent anyone who seeks to exploit others from having a home on our platform.”

CNN has reviewed internal Facebook documents included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen‘s legal counsel. The redacted versions were obtained by a consortium of 17 US news organizations, including CNN. In addition to information about human trafficking content on Facebook’s apps, the documents provide deep insights into the company’s approach to misinformation and hate speech moderation, internal research on its newsfeed algorithm, communications related to the Capitol Riot and more.

The Apple threat, first reported by The Wall Street Journal last month, represents the potentially dire consequences of Facebook’s continued challenges with moderating problematic content on its platforms, especially in non-English-speaking countries. In one SEC complaint related to the issue, representatives for Haugen wrote: “Investors would have been very interested to learn the truth about Facebook almost losing access to the Apple App Store because of its failure to stop human trafficking on its products.” The revelation also comes as tensions between Facebook and Apple have risen in recent months over user privacy issues.

Stone directed CNN to a letter Facebook sent last summer to several United Nations representatives about its efforts to combat human trafficking on its platform. In the letter, the company notes that domestic servitude content is “rarely reported to us by users.”

“To counter these challenges … we have also developed technology that can proactively find and take action on content related to domestic servitude,” Facebook said in the letter. “By using it, we have been able to detect and remove over 4,000 pieces of violating organic content in Arabic and English from January 2020 to date.”

Facebook has tried to discredit some earlier reporting by the Wall Street Journal and Haugen’s testimony to a Senate subcommittee earlier this month. In a tweet last week ahead of “The Facebook Papers” publication, Facebook Vice President of Communications John Pinette said: “A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us.”

A ‘severe’ risk to Facebook’s business

In the fall of 2019, the BBC approached Facebook about an investigation it was soon to publish into an illegal online marketplace for domestic workers — which operated in part using Instagram — and shared the hashtags it had used to locate the content, according to an internal Facebook report. In response, Facebook removed 703 Instagram profiles promoting domestic servitude, but “due to the underreporting of this behavior and absence of proactive detection,” other domestic servitude content remained on the platform, the report states.

Following the publication of the BBC investigation, Apple contacted Facebook on October 23, 2019, threatening to remove its apps from the App Store for hosting content that facilitated human trafficking. In a November 2019 internal document titled “Apple Escalation on Domestic Servitude — how we made it through this [Site Event],” a Facebook employee detailed the actions the company took over the course of a week to mitigate the threat, including taking action against more than 130,000 pieces of domestic servitude-related content in Arabic on Facebook and Instagram, expanding the scope of its policy against domestic servitude content and launching proactive detection tools in Arabic and English.

“Removing our applications from Apple platforms would have had potentially severe consequences to the business, including depriving millions of users of access to IG & FB,” the document states. “To mitigate against this risk, we formed part of a large working group operating around the clock to develop and implement our response strategy.”

Despite the scramble during that week, Facebook had been well aware of such content before the BBC reached out. “Was this issue known to Facebook before the BBC enquiry and Apple escalation?” the internal report asks. “Yes.”

In March 2018, Facebook workers assigned to the Middle East and North Africa market flagged reports of Instagram profiles dedicated to selling domestic laborers, internal documents show. At the time, these reports “were not actioned as our policies did not acknowledge the violation,” a September 2019 internal report on domestic servitude content states.

Stone, the Facebook spokesperson, said the company did have a policy prohibiting human exploitation abuses at the time. “We have had such a policy for a long time. It was strengthened after that point,” he added.

Internal documents show that Facebook launched an expanded “Human Exploitation Policy” on May 29, 2019, that included a prohibition on domestic servitude content related to recruitment, facilitation and exploitation.

In September 2019, a Facebook employee posted to the company’s internal site a summary of an investigation into a transnational human trafficking network that used Facebook apps to facilitate the sale and sexual exploitation of at least 20 potential victims. The criminal network used more than 100 fake Facebook and Instagram accounts to recruit female victims from various countries, and used Messenger and WhatsApp to coordinate transportation of the women to Dubai, where they were forced to work in facilities disguised as “massage parlors,” the summary said.

The investigation identified $152,000 spent to buy advertisements on its platforms related to the scheme, including ads targeting men in Dubai. The company removed all pages and accounts related to the trafficking ring, according to the report. Among the recommended “action items” listed for response to the investigation is a request that Facebook clarify policies for how it handles ad revenue associated with human trafficking to “prevent reputational risk for the company (not to profit from ads spent for HT).”

About a week later, a subsequent report outlined more broadly the issue of domestic servitude abuse on Facebook’s platforms. The document includes samples of advertisements for workers posted to Instagram; one describes a 38-year-old Indian woman for sale for the equivalent of around $350 (the company says it removed the related accounts).

Ongoing challenges

More recent documents show that, despite the steps Facebook took to remove such content immediately after the Apple threat and in the weeks and months that followed, the company has continued to struggle to police domestic servitude content.

A report distributed internally in January 2020 found that “our platform enables all three stages of the human exploitation lifecycle (recruitment, facilitation, exploitation) via complex real-world networks,” and identified some commonly used naming conventions for domestic servitude accounts to help with detection. Traffickers from labor “recruitment agencies” used “FB profiles, IG Profiles, Pages, Messenger and WhatsApp to exchange victims’ documentation … promote the victims for sale, and arrange buying, selling and other fees,” the document said of one trafficking network the company identified.

In a February 2021 report, researchers found that labor recruitment agencies often communicated with victims via direct messages but rarely posted violating content publicly, making them difficult to detect. The report also said Facebook lacks “robust proactive detection methods … of Domestic Servitude in English and Tagalog to prevent recruitment,” even though the Philippines is a top source country for victims, and that the company didn’t have detection capabilities turned on for Facebook Stories. The report laid out plans for a preventative educational campaign for workers, and said researchers identified at least 1.7 million users who could benefit from information about workers’ rights.

“While our previous efforts are a start to addressing the off-platform harm that results from domestic servitude, opportunities remain to improve prevention, detection, and enforcement,” the February report stated. The company has implemented on-platform interventions to remind people seeking employment of their rights, and has information on its Help Center for users who encounter human trafficking content, Stone said.

And although Facebook researchers have heavily investigated the issue, domestic servitude content appears to still be active and easily found on Instagram. Using several common account naming trends highlighted in one domestic servitude internal research document, CNN last week identified multiple Instagram accounts purporting to offer domestic workers for sale, including one whose account name translates to “Offering domestic workers” that features photos and descriptions of women, including their age, height, weight, length of available contract and other personal information. Facebook confirmed these posts violated its policies and removed them after CNN asked about them.

In early 2021, Facebook launched “search interventions” in English, Spanish and Arabic that create “friction” in search when users “type in certain keywords related to certain topics (that we have vetted with academic experts),” according to Stone. He added that the company launched these search interventions for sex trafficking, sexual solicitation and prostitution in English, and for domestic servitude and labor exploitation in Arabic.

“Our goal is to help deter people from searching for this type of content,” Stone said. “We’re continuing to refine this experience to include links to helpful resources and expert organizations.”

This article is part of a CNN series published on “The Facebook Papers,” a trove of over ten thousand pages of leaked internal Facebook documents that give deep insight into the company’s internal culture, its approach to misinformation and hate speech moderation, internal research on its newsfeed algorithm, communication related to Jan. 6, and more. You can read the entire series here.

The-CNN-Wire™ & © 2021 Cable News Network, Inc., a WarnerMedia Company. All rights reserved.
