Teen Trust in Tech Is Low, But Solutions Already Exist

Common Sense Media Study Highlights Teen Trust Gap While Proven Frameworks Wait to Be Implemented

Derek E. Baird, M.Ed.
Jan 29, 2025
Photo by Solen Feyissa on Unsplash

A new Common Sense Media report reveals a stark reality: today's teens doubt Big Tech's commitment to their wellbeing. The study found that around 60% of teens don't trust technology companies to protect their safety or care about their mental health. When it comes to AI, half of the teens surveyed expressed skepticism about tech companies making responsible decisions.

But here's the encouraging news: we already have well-developed frameworks that address these teen concerns directly. The Age-Appropriate Design Code (AADC), Child Rights by Design, Wellness by Design, and the RITEC Design Toolbox provide clear guidelines for creating digital spaces that prioritize young users' privacy, safety, and wellbeing.

These frameworks aren't just theoretical; they offer practical solutions to the exact issues teens are worried about. For instance, while 74% of teens in the Common Sense study want privacy safeguards and transparency in AI systems, the AADC already includes detailed requirements for privacy protection and transparent data usage.

Similarly, when teens express concern about AI-generated content, the Child Rights by Design framework provides specific guidelines for ensuring content integrity and age-appropriate interactions.

The disconnect isn't in knowing what to do—it's in implementing it. If tech companies embraced existing frameworks and policymakers aligned regulations with these established principles, we could significantly improve teens' trust in digital spaces.

The roadmap exists; we need the will to follow it.

The Common Sense report on teens and AI doesn't just highlight problems; it validates the importance of these existing frameworks. As AI becomes increasingly embedded in the infrastructure of teens' daily lives, implementing these guidelines is not just good practice but essential for rebuilding trust between young users and technology companies.

Related Links

🔸What Teens Say Adults Should Know About Their Use of AI (HopeLab)
🔸Most US Teens Use AI. Most Of Their Parents Don't Know. (WIRED)
🔸Students Are Using AI. Here's What Adults Should Know (Harvard GSE)

Derek E. Baird, M.Ed., is the former Chief Youth Privacy & Policy Officer at BeMe Health and the award-winning producer of BeingMe: A Teen Mental Health Podcast.

A Disney Inventor Award recipient and co-author of The Gen Z Frequency, Derek brings 25+ years of expertise in children's media, kids' technology, privacy, policy, and educational technology to his work.

His research, patents, and innovations in youth digital culture, online learning, and digital trust and safety have shaped industry practices and appeared in numerous peer-reviewed publications.
