New lawsuit may force YouTube to own up to the mental health consequences of content moderation
For big tech platforms, one of the questions to arise during the pandemic’s early months was how the forced closure of offices would change their approach to content moderation.
Facebook, YouTube, and Twitter all rely on huge numbers of third-party contract workers to police their networks, and traditionally those workers have worked side by side in big offices.
When tech companies shuttered their offices, they closed down most of their content moderation facilities as well.
Happily, they continued to pay their moderators — even those who could no longer work, because their jobs required them to use secure facilities. But with usage of social networks surging and an election on the horizon, the need for moderation had never been greater.
And so Silicon Valley largely shifted moderation duties to automated systems.
Around 11 million videos were removed from YouTube between April and June, says the FT, or about double the usual rate.
Around 320,000 of these takedowns were appealed, and half of the appealed videos were reinstated.
Again, the FT says that’s roughly double the usual figure: a sign that the AI systems were overzealous in their attempts to spot harmful content.
As YouTube’s chief product officer, Neal Mohan, told the FT: “One of the decisions we made [at the beginning of the pandemic] when it came to machines who couldn’t be as precise as humans, we were going to err on the side of making sure that our users were protected, even though that might have resulted in [a] slightly higher number of videos coming down.”
A former content moderator is suing Google-owned YouTube after she allegedly developed depression and symptoms associated with post-traumatic stress disorder from repeatedly watching videos of beheadings, child abuse and other disturbing content.
The law firm involved in the suit was also part of a similar suit against Facebook, Wong reported.
That’s a significant detail, in large part because of what Facebook did in that case: agree to settle it, for $52 million.
That settlement, which still requires final approval from a judge, applies only to Facebook’s US moderators. And with similar suits pending around the world, the final cost to Facebook will likely be much higher.
When asked what YouTube made of the new lawsuit, a spokesman had this to say:
“We cannot comment on pending litigation, but we rely on a combination of humans and technology to remove content that violates our Community Guidelines, and we are committed to supporting the people who do this vital and necessary work.”
“We choose the companies we partner with carefully and work with them to provide comprehensive resources to support moderators’ well-being and mental health, including by limiting the time spent each day reviewing content.”