The Minnesota Daily
Serving the UMN community since 1900

Editorial: Student calls for Minnesota kids code to hold Big Tech accountable

Mental health is an important consideration in legislating social media.
Image by Sarah Mai

In January, I made the trip from Macalester College, where I am a first-year, to Washington, D.C., to advocate as five social media platform CEOs appeared before Congress in what will hopefully be looked back on as a historic moment in efforts to address Big Tech’s transgressions against society.

While Meta and TikTok received most of the attention for their blatant failures to stop kids from experiencing harm, many of the youth advocates I sat with in the audience struggled to recognize Discord’s executive.

I first signed up for Discord, a live chat platform, as a middle-schooler searching for communities of young people who played video games. Initially, I was awestruck by the variety of seemingly welcoming communities available to me, offering connections I couldn’t find offline. However, this sense of wonder was short-lived. 

Over time, I was exposed to a slew of graphic, violent and sexually explicit content, none of which was removed due to Discord’s decentralized design and lax approach to moderation. These incidents led me to feel helpless and depressed, and I felt powerless to take action against the purveyors of harm due to the platform’s cumbersome reporting system. 

Despite being tied to one of the largest intelligence leaks in U.S. history and to the facilitation of child sexual exploitation, Discord does not receive appropriate public scrutiny for the harm it does to children and young people like me, especially young men, and those harms continue to worsen with each passing day.

As a child of the digital age and an advocate for building healthy relationships with social media and online platforms, I have witnessed too many members of my generation suffer at the hands of predatory design techniques. Yet, despite internal research proving otherwise, Big Tech has denied that its products harm children while lobbying against legislation like the Minnesota Kids Code (SF 2810/HF 2257), which would protect our most vulnerable online.

This bill, currently being considered by our state legislature, lays out common-sense provisions to make technology safer for kids. It requires tech companies whose products are “reasonably likely” to be accessed by kids to assess risks to young users and take steps to prevent physical and psychological harm.

This would involve defaulting accounts for younger teens to be private and could have implications for addictive features like infinite scroll and autoplay. Companies would also have to consider the needs of children throughout the entire product development process. 

Teenagers now spend an average of nine hours a day on screens. Exposing young people almost continuously to harmful, hateful content via addictive algorithms powered by stolen data has consequences. The Kids Code aims to end this crisis by targeting its source: design. It would ensure user privacy for children while forcing companies to make structural changes that make day-to-day use of social media safer.

America’s youth are experiencing their worst mental health crisis in decades, a crisis my colleagues and I believe social media platforms are largely responsible for. Rates of teen anxiety, depression and self-harm have skyrocketed over the past decade following the introduction of apps like Instagram and Snapchat on mobile phones. Nearly one in three high school-aged girls seriously considered suicide in 2021, a 60% increase from 2011. 

This has been the year our institutions woke up to the frightening reality of social media’s role in causing harm to children. Just as we have laws preventing the marketing of harmful substances like cigarettes and alcohol to kids, we need safeguards for the digital world. 

As someone who has grown up on an unregulated internet that wasn’t designed for me, I can only hope that future generations will get to experience a different paradigm of online safety — one that prioritizes the well-being of children over Big Tech’s bottom line. 

Matthew Allaire is a freshman at Macalester College, a Design It For Us coalition advocate and a member of the LOG OFF movement.
