TikTok Content Moderation Concerns

“Children are the world’s most valuable resource and its best hope for the future.” – John F. Kennedy

I am deeply concerned about the ongoing challenges that TikTok faces regarding its content moderation practices, particularly in safeguarding our most vulnerable population: children.

A lawsuit against the social media platform TikTok, brought by thirteen states and Washington, D.C., alleges that the platform does not do enough to protect young users from the dangers of social media.

The platform’s reliance on automated moderation technology and human moderators, while essential, has proven insufficient to protect young users from harmful content. According to NBC News, TikTok removes only about 46% of violent and graphic content and 39% of harassment and bullying content. The platform also failed to remove roughly 35% of posts categorized as “Normalization of Pedophilia,” 33% categorized as “Minor Sexual Solicitation,” and 100% categorized as “Fetishizing Minors.” These gaps in TikTok’s content moderation raise significant concerns about the platform’s ability to provide a safe environment for children.

The lawsuit underscores the urgent need for effective content moderation measures, and it is alarming that harmful content continues to slip through the cracks. This situation not only puts children at risk but also undermines the trust that parents and guardians place in social media platforms. The Mental Health Center of Passaic believes that protecting children should be a top priority, and we urge TikTok to invest more resources in improving its content moderation systems.

Beyond technical solutions, it is crucial that TikTok engage with mental health experts, such as licensed psychotherapists and psychiatrists, to develop comprehensive strategies that address the unique challenges of content moderation. By collaborating with professionals in the field, TikTok can better understand the psychological impact of harmful content on young users and implement measures that go beyond technical fixes alone. This collaborative approach would ensure that TikTok’s content moderation practices are informed by the latest research and best practices in child safety and mental health.

As a non-profit behavioral health organization, the Mental Health Center of Passaic remains committed to advocating for the safety and well-being of children in the digital age. We call on social media platforms to take immediate and decisive action to protect young users from harmful content. Our children deserve a safe online environment where they can explore, learn, and connect without fear of exploitation or harm. Together, we can create a digital world that prioritizes the mental health and safety of our children.

Christopher Alcazar is the Executive Director of the Mental Health Center of Passaic. He is a Licensed Clinical Social Worker.