Controlling the Conversation: The Ethics of Social Platforms and Content Moderation
As social platforms dominate how audiences find news, debates have intensified over who owns information, content, and the audience itself: the publisher, or the platform where the content is discovered—or not discovered, as the case may be. Platforms rely heavily on algorithms both to decide what to surface to users around the globe and to decide what content is taken down. Publishers make similar decisions, but on a far smaller scale and not necessarily algorithmically or as uniformly. How are any of these decisions made? And what factors are taken into account to ensure that the decision-making is fair and ethical?
On February 23, 2018, the Tow Center for Digital Journalism at Columbia University and the Annenberg Innovation Lab at USC Annenberg School for Communication and Journalism hosted a Policy Exchange Forum followed by a conference on the topic of “Controlling the Conversation: The Ethics of Social Platforms and Content.”
The Policy Exchange Forum was a closed-group discussion held under the Chatham House Rule. The discussion focused broadly on three topics: "Ethics of Moderation," "Moderation Tools," and "Technological Challenges."
- Download: PEF III write-up (Ethics of Content Moderation) (PDF, 272 KB)