Facebook content moderation is an ugly business. Here’s who does it

Some of the workers saw video of a man being stabbed to death. Others viewed acts of bestiality or animals being tortured. Suicides and beheadings popped up too.

The reason for watching the gruesome content: to determine whether it should be pulled from Facebook before more members of the world’s largest social network could see it.

Content moderators protect Facebook’s 2.3 billion users from exposure to humanity’s darkest impulses. Scouring posts that have been flagged by other members of the social network or by the Silicon Valley giant’s artificial intelligence tools, they quickly decide what stays up and what comes down. But reviewing the posts comes with a cost. Constant exposure to violence, hatred and sordid acts can wreak havoc on a person’s mental health. Former content moderators have already filed a lawsuit against Facebook in which they say repeated exposure to violent images caused psychological trauma. There’s a reason being a content moderator has been called “the worst job in technology.”

It’s also an important job, and one that isn’t handled by Facebook employees. Instead, it’s outsourced to contractors, some of whom turn to drugs and sex in the workplace to distract themselves from the abhorrent images they see every day, according to a February story in The Verge, which reported that some of the workers make as little as $28,800 per year. That’s just over the federal poverty level for a family of four. Facebook said in May that it plans to raise the minimum hourly wage for contract workers, which is currently $15 per hour. 

Details of the working conditions of content moderators are still coming out. On Wednesday, The Verge reported that a content moderator who worked at a site in Florida operated by Cognizant died after having a heart attack at his desk. The site in Tampa was reportedly a stressful, dirty and unhealthy environment. 

Cognizant says that it works to “ensure a safe, clean, and supportive work environment.”

Contracting in the tech industry has reached a flashpoint, escalating tensions in Silicon Valley’s world of haves and have-nots. Contractors and temps don’t get the health care or retirement benefits that full-time employees do, a difference that hasn’t gone unnoticed. Last year, contract workers at Google protested, demanding higher wages and benefits.

Facebook said Wednesday it works with its contractors “to provide a level of support and compensation that leads the industry.” The social media giant also said its thoughts go out to family, friends and co-workers of the deceased moderator.

“There will inevitably be employee challenges or dissatisfaction that call our commitment to this work and our partners’ employees into question,” a Facebook spokesperson said in a statement. “When the circumstances warrant action on the part of management, we make sure it happens.”

Here’s a look at five of the companies that have worked with Facebook to police content.

Cognizant

A multinational provider of services to technology, finance, health care, retail and other companies, Cognizant offers services including app development, consulting, information technology and digital strategy.

Based in Teaneck, New Jersey, Cognizant has roughly 281,600 employees around the world, according to its annual report. Nearly 70 percent of its workforce is in India.

The company’s role in supporting Facebook’s content moderation activities was the subject of recent stories in The Verge, which reported that roughly 1,000 Cognizant employees at its Phoenix office evaluate posts that may violate Facebook’s rules against hate speech, violence and terrorism.

Cognizant Technology Solutions office in Chennai, India. The company works with Facebook on content moderation. (Madhu Kapparath/Getty Images)

The workers get two 15-minute breaks, a 30-minute lunch and nine minutes of “wellness time” per day. They also have access to counselors and a hotline, according to the report.

Still, some workers said that constant exposure to depravity has taken its toll. One former content moderator said he started to believe conspiracy theories, such as 9/11 being a hoax, after reviewing videos promoting the idea that the terrorist attack was faked. The former employee said he had brought a gun to work because he feared that fired employees would return to the office to harm those who still had jobs.

Cognizant said in February it looked into “specific workplace issues raised in a recent report,” that it had “previously taken action where necessary” and that it has “steps in place to continue to address these concerns and any others raised by our employees.”

The company outlined the resources it offers employees, including wellness classes, counselors and a 24-hour hotline.

Cognizant also runs a site in Tampa, Florida, that employs about 800 workers, according to The Verge. Workers at that facility have filed two sexual harassment complaints against coworkers since April.

“Like any large employer, Cognizant routinely and professionally responds to and addresses general workplace and personnel issues in its facilities,” Cognizant said in a statement on Wednesday. “Our Tampa facility is no different. Cognizant works hard to ensure a safe, clean, and supportive work environment for all of our associates.”

PRO Unlimited

Based in Boca Raton, Florida, PRO Unlimited provides services and software used by clients in more than 90 countries. 

Last year, Selena Scola, a former PRO Unlimited employee who worked as a Facebook content moderator, filed a lawsuit alleging that she suffered from psychological trauma and post-traumatic stress disorder caused by viewing thousands of disturbing images of violence. Scola’s PTSD symptoms can pop up when she hears loud noises or touches a computer mouse, according to the lawsuit.

The lawsuit was amended to include two more former content moderators who worked at Facebook through staffing companies. 

“Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator,” the lawsuit states, referring to Scola.

Filed in superior court in Northern California’s San Mateo County, the lawsuit alleges Facebook violated California law by creating dangerous working conditions. Facebook content moderators are asked to review more than 10 million posts per week that may violate the social network’s rules, according to the lawsuit, which seeks class-action status.

At the time the original lawsuit was filed, Facebook acknowledged the work can be stressful and said it requires the companies it works with for content moderation to provide support such as counseling and relaxation areas.

Facebook in a court filing denied Scola’s allegations and called for the case to be dismissed. 

A Facebook spokeswoman said the social media giant no longer uses PRO Unlimited for content moderation. PRO Unlimited didn’t respond to a request for comment.

Accenture

One of the most prestigious consultancies in the world, Dublin-based Accenture has more than 459,000 people serving clients across 40 industries and in more than 120 countries, according to its website.

People enter an Accenture office in downtown Helsinki. (Jussi Nukari/Getty Images)

In February, Facebook content reviewers at an Accenture facility in Austin, Texas, complained about a “Big Brother” environment, alleging they weren’t allowed to use their phones at their desks or take “wellness” breaks during the first and last hour of their shift, according to a memo obtained by Business Insider.

“Despite our pride in our work, Content Moderators have a secondary status in [the] hierarchy of the workplace, both within the Facebook and the Accenture structure,” the memo read.

Accenture didn’t respond to a request for comment. At the time, Facebook said there had been a “misunderstanding” and that content moderators are encouraged to take wellness breaks at any time throughout the day.

Some of Accenture’s clients have included other tech giants such as Google, Microsoft and Amazon. More than three-quarters of Fortune Global 500 companies work with Accenture.  

Arvato

One of Facebook’s largest content moderation centers is in Germany, a country that started enforcing a strict hate speech law last year that would fine social media companies up to 50 million euros ($58 million) if they didn’t pull down hate speech and other offensive content quickly enough.

Arvato, owned by the German media company Bertelsmann, runs a content moderation center in Berlin. The company has faced complaints about working conditions and the toll the job takes on workers’ mental health.

In 2017, Arvato said in a statement that it takes the well-being of its employees seriously and provides health care and access to company doctors, psychologists and social services.


The company, based in Gütersloh, Germany, has 70,000 employees in more than 40 countries. It’s been providing Facebook with content moderation services since 2015.

Arvato, which was rebranded last week as Majorel, said it offers content moderators a salary that’s 20 percent above minimum wage and support such as wellness classes and counselors. Workers can also take “resiliency breaks” at any time of the day.

“We are proud to be a partner of Facebook and work in alignment with them to offer a competitive compensation package that includes a comprehensive benefits package,” a company spokesperson said in a statement. “We will continue to work together to improve our offerings and support of our employees.”

Genpact

New York-based professional services firm Genpact won a contract with Facebook last year to provide content moderation, according to The Economic Times.

Concerns about the mental health of Facebook content moderators weren’t enough to scare off applicants in India, who flocked to jobs that paid between 225,000 and 400,000 rupees a year (about $3,150-$5,600). Genpact was searching for content moderators fluent in Tamil, Punjabi and other Indian languages.

Some Genpact workers have complained about low pay and a stressful work environment, according to a report this week by Reuters. One former Genpact employee told the news outlet that at least three times he’s “seen women employees breaking down on the floor, reliving the trauma of watching suicides real-time.”

Facebook pushed back against the allegations of low pay and outlined the work it was doing to improve working conditions for content moderators.

In an email, a Genpact spokesperson confirmed that it partners with Facebook but said it doesn’t comment on work with clients.

“As a company we bring our extensive experience in the field of content review and operations to our partners by providing industry-leading support for our team of content reviewers and a best-in-class working environment,” the Genpact spokesperson said in a statement. “We take very seriously this work and the services that we provide to our clients.”

First published on March 1 at 4:00 a.m. PT.
Update, 4:03 p.m. PT: Includes new material from Facebook about PRO Unlimited.
Update, 5:24 p.m. PT: Includes material about an amended lawsuit against Facebook.
Update, June 19: Includes new reported details of a Cognizant facility in Tampa, Florida.
