The Verge's Casey Newton has a great story about the pitiful conditions that Facebook's content moderators work in.
One of the earlier reports on the horrific nature of this job came from Wired's Adrian Chen back in 2014, who tracked contractors at a content moderation firm in the Philippines working on behalf of US-based companies like Facebook. Their task was to flag content falling into categories like "pornography, gore, minors, sexual solicitation, sexual body parts/images, and racism" so it could be taken down swiftly.
Since then, a lot of content moderation work has moved to the US, and that's the case with The Verge's story from this week, which looks at moderators contracted by Cognizant in Phoenix, Arizona.
From the sound of things, the job has not gotten any better. Salaries are above minimum wage, but not by a whole lot; people develop symptoms of post-traumatic stress disorder, and some start to believe the conspiracy theories in the posts they review, because they are exposed to them far more often than the average user.
Even more troubling is the nature of the relationship between moderators and their superiors, who audit their decisions on whether a post or video should stay up or come down. Newton noted that these calls were sometimes subjective, and disagreements could lead to moderators' 'accuracy scores' going down.
It is disappointing to learn that using artificial intelligence systems and thousands of humans for this task is still not enough to stem the flow of content that violates the policies of social networks and media platforms. Back in 2016, Facebook said its users were watching 100 million hours of video per day. Granted, this is not an easy problem to solve, but it looks like companies have done little to make the work easier, or to help people cope with the endless stream of disturbing posts.
Newton's post is well worth a read, and includes his account of a visit to the Phoenix content moderation facility; find it on this page.