
The worst job in tech: Keeping Facebook clean

Published Jan 19, 2018 12:00 am
By Lauren Weber and Deepa Seetharaman (The Wall Street Journal)

By her second day on the job, Sarah Katz knew how jarring it can be to work as a content moderator for Facebook, Inc. She says she saw anti-Semitic speech, bestiality photos and video of what seemed to be a girl and boy told by an adult off-screen to have sexual contact with each other.

Ms. Katz, 27 years old, says she reviewed as many as 8,000 posts a day, with little training on how to handle the distress, though she had to sign a waiver warning her about what she would encounter. Coping mechanisms among content moderators included a dark sense of humor and swiveling around in their chairs to commiserate after a particularly disturbing post.

She worked at Facebook's headquarters campus in Menlo Park, Calif., and ate for free in company cafeterias. But she wasn't a Facebook employee. Ms. Katz was hired by a staffing company that works for another company that in turn provides thousands of outside workers to the social network. Facebook employees managed the contractors, held meetings and set policies. The outsiders did the "dirty, busy work," says Ms. Katz, who earned $24 an hour. She left in October 2016 and now is employed as an information-security analyst by business-software firm ServiceNow, Inc.

Deciding what does and doesn't belong online is one of the fastest-growing jobs in the technology world, and perhaps the most grueling. The equivalent of 65 years of video is uploaded to YouTube each day. Facebook receives more than a million user reports of potentially objectionable content a day. Humans, still, are the first line of defense. Facebook, YouTube and other companies are racing to develop algorithms and artificial-intelligence tools, but much of that technology is years away from replacing people, says Eric Gilbert, a computer scientist at the University of Michigan.
Facebook will have 7,500 content reviewers by the end of December, up from 4,500, and it plans to double the number of employees and contractors who handle safety and security issues to 20,000 by the end of 2018. "I am dead serious about this," Chief Executive Mark Zuckerberg said last month.

Facebook is under pressure to improve its defenses after failing during last year's presidential campaign to detect that Russian operatives tried to use its platform to influence the outcome. Russia has denied any interference in the election. Susan Wojcicki, YouTube's chief executive, said this month that parent Google, part of Alphabet, Inc., will expand its content-review team to more than 10,000 in response to anger about videos on the site, including some that seemed to endanger children.

No one knows how many people work as content moderators, but the number likely totals "tens of thousands of people, easily," says Sarah Roberts, a professor of information studies at the University of California, Los Angeles, who studies content moderation. Outsourcing firms such as Accenture PLC, PRO Unlimited, Inc. and SquadRun Inc. supply or manage many of the workers, who are scattered among the corporate headquarters of clients, work from home or are based at cubicle farms and call centers in India and the Philippines. The arrangement helps technology giants keep their full-time, in-house staffing lean and flexible enough to adapt to new ideas or changes in demand. Outsourcing firms also are considered highly adept at providing large numbers of contractors on short notice.

Facebook decided years ago to rely on contract workers to enforce its policies. Executives considered the work relatively low-skilled compared with, say, the work performed by Facebook engineers, who typically hold computer-science degrees and earn six-figure salaries, plus stock options and benefits. Reports of offensive posts from users who see them online go into a queue for review by moderators.
The most serious categories, including terrorism, are handled first. Several former content moderators at Facebook say they often had just a few seconds to decide whether something violated the company's terms of service. A company spokeswoman says reviewers don't face specific time limits.

Pay rates for content moderators in the Bay Area range from $13 to $28 an hour, say people who have held such jobs recently. Benefits vary widely from firm to firm. Turnover is high, with most content moderators working anywhere from a few months to about a year before they quit or their assignments ended, according to interviews with nearly two dozen people who have worked such jobs.

Facebook requires that its content moderators be offered counseling through PRO Unlimited, which actually employs many of those workers. They can have as many as three face-to-face counseling sessions a year arranged by an employee-assistance program, according to an internal document reviewed by The Wall Street Journal. The well-being of content moderators "is something we talk about with our team members and with our outsourcing vendors to make clear it's not just contractual. It's really important to us," says Mark Handel, a user research manager at Facebook who helps oversee content moderation. "Is it enough? I don't know. But it's only getting more important and more critical."

Former content moderators recall having to view images of war victims who had been gutted or drowned and child soldiers engaged in killings. One former Facebook moderator reviewed a video of a cat being thrown into a microwave. Workers sometimes quit on their first or second day. Some leave for lunch and never come back. Others remain unsettled by the work, and by what they saw as a lack of emotional support or appreciation, long after they quit.

Shaka Tafari worked as a contractor at messaging app Whisper in 2016, soon after it began testing a messaging feature designed for high-school students. Mr. Tafari, 30, was alarmed by the number of rape references in text messages he reviewed, he says, and sometimes saw graphic photos of bestiality or people killing dogs. "I was watching the content of deranged psychos in the woods somewhere who don't have a conscience for the texture or feel of human connection," he says. He rarely had time to process what he was seeing because managers remotely monitored the productivity of moderators. If the managers noticed a few minutes of inactivity, they would ping him on workplace messaging tool Slack to ask why he wasn't working, says Mr. Tafari. A Whisper spokesman says moderators were expected to communicate with managers about breaks. Whisper no longer employs US-based moderators; it uses a team in the Philippines along with machine-learning technology.
