
Major social media platforms rush to manage Charlie Kirk assassination clips following Utah shooting incident


Jessica Melugin, director of the Competitive Enterprise Institute's Center for Technology and Innovation, has spoken out on the challenges social media companies face in moderating content. In an interview with FOX Business, Melugin highlighted the distinctive nature of these platforms, where users can upload content directly.

Melugin made it clear that she does not endorse clickbait or sensational content, distinguishing it from the current situation. She pointed out that social media companies are investing significant resources into removing, labelling, and warning about inappropriate content, particularly in the aftermath of major events.

However, Melugin acknowledged that content moderation at scale is a daunting task, especially for large platforms. She likened it to watching a live event unfold, adding that she is "hoping and praying" that the majority of Americans have no appetite for such content.

Melugin also contrasted the voluntary moderation efforts of social media platforms with government calls for the forced removal of certain content, which she viewed as a different issue, both legally and in terms of free speech. A platform curating content to keep its users happy, she argued, is not the same as a government prohibiting citizens from seeing certain content.

Carsta Maria Müller, a former Meta employee, has been actively involved in content moderation discussions and has emphasised questions of freedom of expression on platforms like Facebook and Instagram. Her involvement underscores the ongoing debate about the role and responsibility of social media companies in regulating content.

In conclusion, Jessica Melugin's insights provide a thoughtful perspective on the challenges and complexities of content moderation on social media platforms. Her comments underscore the need for a balanced approach that respects both the rights of users and the responsibilities of social media companies.
