
Moderating content does not have to be so traumatic



Enduring trauma may not be written into the job description, but it is a real risk for workers who watch graphic or violent videos as part of their jobs. Content moderators, journalists, and activists often have to work through horrific pictures, videos, and texts to get their work done, a task that jeopardizes their mental health and can lead to post-traumatic stress disorder. Fortunately, experts say there are ways to minimize the damage caused by so-called vicarious or secondary trauma.

Post-traumatic stress disorder (PTSD) is caused by experiencing or witnessing a frightening event. Symptoms include acute anxiety, flashbacks, and intrusive thoughts. Although most people think of PTSD in the context of war, or of being physically present during a crisis, in recent years it has become accepted that repeated exposure to traumatic material can also cause the disorder. Pam Ramsden, a psychologist at the University of Bradford in the United Kingdom, presented research at a 2015 conference of the British Psychological Society finding that a quarter of people who viewed distressing images of violent events developed symptoms of PTSD.

Meanwhile, research by Roxane Cohen Silver, a psychologist at the University of California, Irvine, has shown that more than six hours a day of exposure to media coverage of the Boston Marathon bombings in the four weeks following the attack was more stressful than actually having been there. Even the latest version of the Diagnostic and Statistical Manual, the bible of psychiatric diagnosis in the US, acknowledges that PTSD can result from repeatedly viewing graphic images as part of one's work. Facebook moderators at a contractor in Arizona, for example, have developed serious mental health issues related to constantly viewing graphic content, as The Verge revealed this week.

This does not mean that everyone who sees these images is traumatized, and some people seek out traumatic content without being affected. "Not even one hundred percent of people who go to war get PTSD, so there are different risk factors," says Cohen Silver. "There are certainly ways to reduce stress, like taking breaks and not watching anything for eight hours a day without interruption."

Although there is little research in this area, the Dart Center, which supports journalists who cover violence, has created two best-practice tip sheets for working with traumatic imagery. Some of the tips can be implemented by moderators themselves, such as minimizing the size of the image window, taking notes to avoid repeatedly reviewing the same footage, and keeping "distraction files" of cute puppies, but many can only be implemented by management.

"There must be plenty of opportunities to take breaks and swap tasks, and offices where people can focus on something beautiful," says Elana Newman, director of research at the Dart Center and psychologist at the University of Tulsa. "You need to routinely screen your employees for mental health issues and provide these services." Newman and other experts agree that ultimately, the company itself must make these changes to protect workers.

Sam Dubberley is director of Amnesty International's Digital Verification Corps, a group of volunteers who check whether digital images are real, work that often exposes them to traumatic imagery. (Dubberley has also worked with the Dart Center.) "I firmly believe the change must come from the top down," he says. Dubberley has done his own research on what he calls the "drip and perpetual misery" of watching graphic images online, and by interviewing people on these "digital frontlines" he created a report with further suggestions for people at every level of an organization.

A healthier environment, according to his report, could mean moderators using mindfulness tools, regularly checking in on their own workloads, and learning which images they feel safe working with. Perhaps more importantly, it means trauma-awareness training for everyone. That means all employees must be told that disturbing graphic content is part of the job, and that traumatic triggers vary from person to person. It also means developing a culture where mental health is as important as physical health, and advising managers on how to check in with their teams, both one-on-one and in groups.

While there are increasing efforts to prevent secondary trauma in journalism and human rights organizations, the picture is more complicated when it comes to content moderation. Awareness of this kind of harm has increased dramatically over the past two years, and social media companies are certainly aware of the problem, says Sarah Roberts, a professor of information studies at UCLA. For example, YouTube CEO Susan Wojcicki said at SXSW last year that the company would limit moderators to four hours a day.

At the same time, these companies are responding to controversy of every kind, whether "fake news" or objectionable advertising, by promising to hire more moderators and be better gatekeepers, yet there are few detailed studies on how those moderators can be protected. The resources that the Dart Center and Dubberley have created are general guidelines drawn from interviews, not the result of rigorous longitudinal studies of content moderation and secondary trauma. We do not have those studies because they would require a level of access that companies are unlikely to allow.

"My immediate answer to [the YouTube news] was" Okay, that means you need to double your workforce "and we have no evidence that four hours a day is the magic number," says Roberts. "How do we know that four hours a day are feasible and 4.5 hours bring you into a mental crisis?"

Finally, the business model of content moderation makes it difficult to implement even these common-sense suggestions. Because the ecosystem of commercial content moderation is global and often contract-based, companies are frequently afraid of losing a contract to a firm that appears more efficient. "There are these opposing forces," explains Roberts. "On the one hand, we want the well-being and resilience of workers, and on the other hand, this is a bottom-line productivity measure. If a person is on a 'wellness break,' they are not doing the work during that time."

Ultimately, change must happen on multiple levels: not just distraction files or faith in artificial intelligence, but a shift in corporate culture and a rethinking of business models. Facebook has committed to better oversight of its contractors, but protecting content moderators will require industry-wide effort.

