Microsoft sued by Commercial Content Moderators (#CCM) in first-of-its-kind case


In a first-of-its-kind case, a Washington state attorney has filed suit on behalf of two Microsoft employees who were exposed to disturbing and abhorrent content as a condition of their work at Microsoft as commercial content moderators (CCM). According to reporting from the McClatchy newspaper syndicate, the two workers now suffer PTSD symptoms after reviewing material, such as child pornography, and removing it on behalf of the software giant.

While this may be the first instance of CCM workers suing their employer, I suspect it is not likely to be the last. Indeed, this must be worrisome not only for Microsoft and other technology and social media companies that engage CCM workers for their platforms and services (e.g., Google, Facebook and their properties), but also for the firms with whom they frequently contract to outsource such work, even when the workers themselves remain onsite to undertake their CCM tasks. Countless other websites, social media platforms and apps also rely on CCM to manage the user-generated content they depend upon. What is particularly interesting about this case is that the workers filing suit appear to be direct employees of Microsoft rather than a third party, which will make it more difficult for Microsoft to appeal to the kind of plausible deniability that such administrative and bureaucratic distance typically provides.

Although there are no definitive estimates of the number of CCM workers who toil on web and social media sites worldwide, the numbers are certainly well into the thousands, given the sheer number of platforms requiring content adjudication in order to function. My own research with both Silicon Valley-based and Philippines-based workers confirmed that CCM work exposes workers to shocking and repulsive material, as a condition of their work, on a daily basis. The long-term effects of such exposure are not yet known, but this new lawsuit will certainly contribute to the larger landscape of academic research and public concern over the issues it touches on, including worker safety, employer liability, social media policy and practice, and legal responsibility, among others.

Now that employee harm has been brought to light, first through research, then through journalistic coverage and artistic interventions, legal and policy battles are the next frontier. Stay tuned.

Forced to watch child porn for their job, Microsoft employees developed PTSD, they say

Contact me for comment: