We all want to protect children online. And hopefully, we understand that, with supervised use, the Internet is a powerful tool that can expand a child’s educational horizons, connect them with friends, and reveal the beauty of the world around them. Where disagreement arises is over the balance among parental responsibility, corporate responsibility, and government responsibility for ensuring that digital services used by minors are as safe as possible.
Sadly, in an apparent admission of their failure to articulate or legislate meaningful steps to protect children, Senate lawmakers are poised to vote on the Kids Online Safety Act (S.1409), a bill that would eliminate innovative, child-centric products and services altogether.
Policymakers are undoubtedly feeling pressure to do more to protect children online. But while the name of this bill suggests it shields children from online predators, its practical effect is to completely upend how businesses produce digital content for children. The Kids Online Safety Act does this by forcing social media companies to adopt default settings that prohibit algorithmically targeted content from appearing in a minor’s activity feed.
Supporters claim this is a good thing because they believe targeted ads are “deliberately exploiting children.” This assertion ignores the long-settled debate over advertising age-appropriate content to America’s youth. Think of all the commercials that air on Disney Channel, Cartoon Network, and Nickelodeon – are they exploiting children too? As a society, we have long accepted this kind of targeted advertising; the only thing that has changed is the medium through which kids receive the ads.
Lawmakers have not thoughtfully considered the impact of eliminating age-appropriate commercials aimed at children. Such advertisements are necessary not only for the aforementioned television programming to exist but also in the digital space: my generation grew up with gaming websites such as Neopets, RuneScape, and AddictingGames, and without the ads that supported these platforms, these kid-centric gaming sites would not have survived for long. The ads paid the bills. Fast forward to the present: if ads are taken away, why would social media companies invest in making their platforms more child-focused or child-safe?
That would be a shame, because this ad-supported investment is what allowed those gaming websites, and later social media companies, to get better at both delivering age-appropriate content and providing more robust safety tools. Take Instagram, which has added significant parental supervision features: parents can set daily limits on the time teens spend on the platform; accounts for users under 16 default to private (which prevents children from receiving DMs from anyone they aren’t connected to); the visibility of sensitive content is sharply reduced; and supervised accounts let parents see who their child has been in contact with.
One may still say Meta can do more. While I’m not acquainted with Mark Zuckerberg, I can’t imagine he’d disagree. But it’s through trial and error that companies like Meta improve their technology and serve children and families better. Meta, like the other big tech companies, must constantly evolve to stay on top.
This evolution of safety practices isn’t limited to social media companies; we see it across the kids’ entertainment market. Children’s toys of the past carried real hazards: Easy-Bake Ovens caused accidental burns, hoverboards caught fire because of poor-quality batteries, and action figures were made with lead paint. Over time, Easy-Bake Ovens adopted new technology to reduce heat, hoverboard batteries improved, and lead paint was eliminated from toys.
Disney World offers another example. Over the years, it has made critical improvements to the theme park experience to keep kids safe, including training cast members to identify lost or distressed children and introducing MagicBands that help locate a child within the park should they go missing.
Bringing a new product or service to market – whether for kids or adults – inherently comes with risks. But it’s through this information-gathering process that safety standards improve for everyone. For this reason, a very real case can be made that the Kids Online Safety Act endangers children by prohibiting such innovation in the digital space.
But I’m sure the Senators who vote for this legislation will claim they solved the problem nonetheless.