Infinit Care Spotlight Series: A closer look at workers that need mental health support


Why Mental Health Support is Essential for Content Moderators

Being on the internet has become a regular part of our everyday lives. According to the latest statistics, 3.96 billion people use social media globally, with each person spending an average of 147 minutes (two hours and seven minutes) on digital platforms every day.

These are significant figures when you consider the level of exposure we get from the digital platforms we access. Over the last few years, the internet has played a pivotal role in society: building businesses and new industries, creating new needs, and, of course, shaping the mindset of the public. Without a doubt, the internet has become powerful enough to shape entire generations and the way they think and act.

But have you ever wondered how information is sifted and checked in the online worlds we love to immerse ourselves in? Websites and applications, big and small, have community guidelines that protect their users from being exposed to harmful information, but who exactly are the people working behind the scenes and doing the heavy lifting of screening this information? In this article, we will talk about the sentinels of the internet and the plight that comes with their profession. Meet the Content Moderators.

Content moderation in a nutshell

Content moderation, at its simplest, is the process of screening and monitoring user-generated content posted on online platforms. Whenever a user submits or uploads something to a website, moderators go through the content to make sure the material follows the community guidelines and is not illegal in nature. Examples of banned content that moderators screen for include material containing sexual themes, drugs, bigotry, homophobia, harassment, and racism.

While content moderation is applied on the majority of online platforms, it is practiced even more heavily on websites that lean toward user-generated uploads. These include social media platforms, online marketplaces, communities and forums, the sharing economy, and even dating sites.

There are two different types of content moderation that websites use: AI-automated and human moderation. In the first type, a machine learning system is trained to moderate posts based on data previously gathered from the internet. AI moderation is significantly faster, sometimes taking only seconds to review a post, but it is not always accurate because the machine learning behind it may not pick up the right cues.

Human moderation, on the other hand, is a manual process in which an actual person reviews the posts. Under this category, the screener follows the platform's specific rules and guidelines to check the user-generated content submitted to the website. While this type of moderation is more reliable than its counterpart, it also takes more time due to its manual nature. Moreover, it presents a serious problem within its workforce that, unfortunately, is not often well addressed: mental distress.

The dark side of content moderation

While content moderation remains a discreet profession, at least in the Philippines, more and more people who have worked in the field have come forward in recent years to speak about the challenges and dangers prevalent in the industry. ‘The Cleaners’, a riveting, internationally produced 2018 documentary, gave an exhaustive look at the plight of moderators in the country who worked for online giants like Facebook, Twitter, and Google, and tackled the mental health struggles that came with their jobs. Facebook itself has acknowledged the difficulties of the profession, while Microsoft has faced lawsuits from former employees who claim they were not given proper support despite the psychological dangers of their work.

Moderators sift through hundreds of submissions containing triggering content, including depictions of death, torture, mutilation, and violence, for hours at a time, sometimes with only limited breaks. The nature of the work can lead to mental distress and psychological issues such as post-traumatic stress disorder (PTSD), anxiety, and even depression. This is supported by data from studies in journalism, law enforcement, and child protection, which have found that repeated trauma exposure can lead to psychological distress. On top of that, workers in these fields have also been reported to suffer higher rates of burnout, relationship difficulties, and even suicide.

The following are other mental health problems that can arise from exposure to toxic content:

  • Panic attacks - after repeated exposure to violent videos, some moderators have reported experiencing attacks when around animals and children, fearing that something will happen to them.
  • Normalization/Desensitization to disturbing humor and language - repeated exposure to disturbing content can change the mindsets and perspectives of its audience, leading to inappropriate humor and language.
  • Self-destructive habits - alcoholism, drug use, and indiscriminate sexual behavior have also reportedly surfaced in moderators' workplaces, with workers presumably engaging in them as a form of emotional escape from the job.
  • Skewed beliefs - in some cases, content moderators can develop fringe views not supported by hard facts (e.g., believing conspiracy theories) because of constant exposure to such materials.

The cost of internet safety

Without a doubt, content moderators serve as the general public's first layer of protection from disturbing and harmful material. Unfortunately, they are not always properly protected from the rigors of their profession. Unlike other workplaces (for example, those in the health sector, law and policing, and journalism) that have more solid guidelines for taking care of the mental needs of their workforce, there is an obvious lack of the same systems for those working in content moderation. An article published by Harvard even notes that companies are very restrictive about letting outsiders investigate their existing procedures and their treatment of these workers. Not only are there no third parties monitoring employee welfare, but people working in the industry are also commonly barred from talking about their work through non-disclosure agreements.

Fortunately, some companies have taken the initiative to develop workplace guidelines that can improve the treatment of those in the industry. Facebook, for example, helped create the Technology Coalition, which designed the Employee Resilience Guidebook, a guide that outlines rules protecting the occupational health and safety of workers who review distressing content. While the guidebook was written primarily for employees dealing with child pornography, it also contains terms that can be applied to other professions that expose workers to distressing imagery and content.

Specifically, the guide includes rules such as providing mandatory individual and group counseling sessions with a certified trauma specialist, limiting exposure to disturbing content to four hours, giving employees the choice to opt out of viewing specific disturbing content, encouraging them to switch to other projects as a form of relief, and giving them enough time to take breaks and recover from their work.

Protecting the protectors

While overarching guidelines are already being developed on a global scale, it cannot be denied that a huge chunk of the responsibility should fall on the shoulders of employers, who are in a better position to observe and improve best practices in this area. Here at Infinit Care, for example, we follow a tried and tested framework, the Mental Health Continuum, to make sure that every employee working in a high-risk profession gets the mental health support that they need, wherever they are on the scale - whether they are excelling, surviving, or in crisis. (Click here to learn more about the Mental Health Continuum.)

Our Head of Clinical Care Shyne Mangulabnan suggests several ways employers can put this to work. “Having a counseling professional who can help these employees is essential, as well as having a solid support and assessment system for them. For example, surveys given to agents, which can be used as a reference for the design of a wellness strategy, are a good place to start. Constant monitoring of employees should also be done to make sure that their needs are met.”

On top of that, Mangulabnan suggests creating proper escalation procedures for concerns relating to the mental health challenges of content moderators. Properly educating important stakeholders within the company (the human resources team, upper management) about the mental health risks of the job is also necessary, since they are the decision-makers who create the systems that take care of employees.

“It would be best to have an end-to-end solution: an onboarding process that gives candidates the training and education they need to understand the risks and concepts of well-being, round-the-clock onsite and virtual counseling services, community support groups, yoga and meditation activities, and workshops are just some of the many things that employers can initiate to make sure that they give the support that their workforce needs.”

True enough, it is the responsibility of employers to ‘protect the protectors’ of the internet. However, content moderators are not the only ones who should be given this kind of support, especially with 43 percent of the global workforce saying that the COVID-19 pandemic has increased the stress they suffer from work. This story is just the first chapter of a series that will shed light on the professions most in need of mental health support in these trying times.

Do you need help getting started with caring for your employees in this aspect? We’d be more than happy to guide you here at Infinit Care. We are a company that helps other companies provide comprehensive mental health care support to their employees through science-backed methodologies. You can reach out to us here to learn more about how we can help.