
Moderator sues Facebook for mental trauma


A woman employed as a moderator for Facebook has sued the company for mental trauma.

The woman, Selena Scola, who monitored content on Facebook, alleged that the company does not properly protect those who face mental trauma as a result of viewing distressing images.

Moderators are “bombarded” with thousands of images depicting child sexual abuse, torture, bestiality and beheadings, the legal action alleges, adding that the social network was “failing to provide a safe workplace”.

However, Facebook said its moderators had full access to mental health resources.


Scola, a contract worker employed through Pro Unlimited, a Florida-based staffing company that is also named in the suit, worked at Facebook’s offices in Menlo Park and Mountain View for nine months from June 2017. She filed the legal action in California.

According to her lawyers, she developed post-traumatic stress disorder as a result of constant exposure to “highly toxic and extremely disturbing images” at her workplace. They added that there is potential for a class action on behalf of “thousands” of current and former moderators in California.

Reacting, Facebook said in a statement: “We recognise that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources.

“Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counselling – available at the location where the plaintiff worked – and other wellness resources like relaxation areas at many of our larger facilities.”

The social network has come under fire in recent months over how it handles fake news and hate speech on its platform and has committed to employing more content moderators.

Facebook currently has 7,500 content reviewers, a mix of full-time employees and contractors.

It also uses artificial intelligence and has stated that one of its main priorities is to improve the technology so that the unpleasant job of monitoring disturbing images and videos can be done wholly by machines.

 
