Way back in the mid-1990s, when the web was young and the online world was buzzing with user-generated content, a worrying problem loomed. If you were an ISP that hosted discussion boards or personal sites, and one of them contained material that was illegal or defamatory, you could be held legally responsible and sued into bankruptcy. Fearing that this would dramatically slow the growth of a vital technology, two US lawmakers, Chris Cox and Ron Wyden, inserted 26 words into the Communications Decency Act of 1996, which became section 230 of the Telecommunications Act of the same year. The words in question were: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The implications were profound: from then on, platforms bore no legal liability for content posted by their users.
Even with that legal shield, platforms found they had to moderate what users posted, if only to keep their services palatable to users and advertisers. Moderation, however, has two problems. One is that it is very expensive because of the sheer scale of the task: 2,500 new videos are uploaded to YouTube every minute, for example, and 1.3bn photos are shared on Instagram every day. The other is that the dirty work of moderation is often outsourced to workers in poor countries, who are paid a pittance to watch videos of unspeakable cruelty and are traumatised by doing so. The costs of keeping western social media feeds relatively clean are thus borne by the poor of the global south.