Infinit Care Shares Why Mental Health Support is Essential for Content Moderators

How is information sifted and checked in the online worlds we love to immerse ourselves in? Websites and applications, big and small, have community guidelines that protect their users from being exposed to harmful information, but who exactly are the people working behind the scenes and doing the heavy lifting of screening this information? This article talks about the sentinels of the internet, the plight that comes with their profession, and how mental health-tech companies like Infinit Care are supporting them.

Protecting the Protectors: Why Mental Health Support is Essential for Content Moderators

Being on the internet has become a regular part of our everyday life. According to the latest statistics, 3.96 billion people use social media globally, with each person spending an average of 147 minutes, or two hours and 27 minutes, on digital platforms every day.

In this article, we take a closer look at these sentinels of the internet and the plight that comes with their profession.

Meet the Content Moderators.

Content Moderation in a Nutshell

Content moderation, at its simplest, is the process of screening and monitoring user-generated content posted on online platforms. Whenever a user submits or uploads something to a website, moderators go through the content to make sure that it follows the community guidelines and is not illegal in nature. Examples of banned content that moderators screen for include material containing sexual themes, drugs, bigotry, homophobia, harassment, and racism.


There are two types of content moderation that websites use: AI-automated and human moderation. In the first type, a machine learning system is trained to moderate posts based on data previously gathered from the internet. AI moderation is significantly faster, sometimes taking only seconds to review a post, but it is not always accurate because the underlying models may not pick up the right cues.
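To make the idea concrete, here is a minimal, hypothetical sketch of how an automated first pass might handle posts: obvious rule violations are removed automatically and everything else is escalated to a human reviewer. The categories, phrases, and function names below are illustrative assumptions, not any specific platform's (or Infinit Care's) system.

```python
# Hypothetical sketch of an automated first pass over user-submitted posts.
# Clear rule matches are removed; ambiguous content is escalated to a person.
# Categories, phrases, and names here are illustrative only.

BANNED_PHRASES = {
    "harassment": ("kill yourself", "nobody wants you here"),
    "drugs": ("buy pills online", "cheap narcotics"),
}

def auto_moderate(post):
    """Return a (decision, category) pair for a user-submitted post."""
    text = post.lower()
    for category, phrases in BANNED_PHRASES.items():
        if any(phrase in text for phrase in phrases):
            return "removed", category           # clear rule violation
    return "escalate_to_human", None             # unclear content goes to a moderator

if __name__ == "__main__":
    print(auto_moderate("Cheap narcotics, buy pills online now!"))    # ('removed', 'drugs')
    print(auto_moderate("Check out this sunset photo from my trip"))  # ('escalate_to_human', None)
```

In practice, platforms typically replace keyword rules like these with trained classifiers, which is precisely what makes automated review fast but imperfect, as noted above.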

Human moderation, on the other hand, is a manual process in which an actual person reviews the posts. Under this approach, the screener follows specific platform rules and guidelines to check the user-generated content submitted to the website. While this type of moderation is more reliable than its automated counterpart, it also takes more time because of its manual nature, and it presents a serious problem within its workforce that, unfortunately, is not often well addressed: mental distress.

The Dark Side of Content Moderation

For hours at a time, sometimes with only limited breaks, moderators sift through hundreds of submissions containing triggering content, including depictions of death, torture, mutilation, and violence. The nature of the work can lead to mental distress and psychological issues such as post-traumatic stress disorder (PTSD), anxiety, and even depression. This is supported by studies of journalists, law enforcement officers, and child protection workers, which find that repeated exposure to trauma can lead to psychological distress. Workers in these fields have also been found to suffer more from burnout, relationship challenges, and even suicide.


Protecting the Protectors

While overarching guidelines to protect and support the mental health of content moderators are being developed on a global scale, a large share of the responsibility falls on employers, who are in the best position to observe and improve best practices in this area. At Infinit Care, for example, the team follows a tried-and-tested framework, the Mental Health Continuum, to make sure that every employee working in a high-risk profession gets the mental health support they need wherever they fall on the scale, whether they are excelling, surviving, or in crisis. (Click here to learn more about the Mental Health Continuum.)

Infinit Care’s Head of Clinical Care Shyne Mangulabnan suggests several ways employers can put this to work. “Having a counseling professional who can help these employees is essential, as well as having a solid support and assessment system for them. For example, surveys given to agents, which can be used as a reference for the design of a wellness strategy, are a good place to start. Constant monitoring of employees should also be done to make sure that their needs are met.”

Mangulabnan also suggests creating proper escalation procedures for concerns relating to the mental health challenges of content moderators. Properly educating key stakeholders within the company (the human resources team, upper management) about the mental health risks of the job is also necessary, since they are the decision-makers who create the systems that take care of employees.

“It would be best to have an end-to-end solution: an onboarding process that gives candidates the training and education they need to understand the risks and concepts of well-being, round-the-clock onsite and virtual counseling services, community support groups, yoga and meditation activities, and workshops are just some of the many things that employers can initiate to make sure that they give the support that their workforce needs.”

True enough, it is the responsibility of employers to make sure that they ‘protect the protectors’ of the internet. However, content moderators are not the only ones who need this kind of support, especially with 43 percent of the global workforce saying that the COVID-19 pandemic has increased their work-related stress. This story is just the first chapter of a series that will shed light on the professions that need mental health support most in these trying times.

Infinit Care is a mental health-tech company that helps companies and organizations provide comprehensive mental health care support to their employees through science-backed methodologies and technology. Reach out to Infinit Care here to learn more.


