California’s AB 2877: A Step Towards Child-Centric AI Governance

Learn how this legislation could change the way AI models are trained on minors’ data, safeguard children’s digital rights in California, and serve as a national model for protecting children’s data.

Derek E. Baird, M.Ed.
2 min read · Jun 24, 2024

As artificial intelligence (AI) continues to reshape our digital landscape, California’s AB 2877 marks a significant step forward in protecting our youngest digital citizens.

This pending legislation proposes crucial amendments to the California Consumer Privacy Act (CCPA) of 2018, focusing on AI training practices involving minors’ data.

Key Points of AB 2877:

  • Prohibits using personal information of consumers under 16 for AI training without explicit consent.
  • Mandates affirmative authorization from the minor or their parent/guardian for such data use.

Why This Matters:

Children’s personal information is increasingly swept into the datasets used to train AI systems, often without meaningful consent. By extending the CCPA to cover AI training, AB 2877 would give minors and their parents a real say in whether that data is used at all.

Looking Ahead:

While AB 2877 is a commendable start, it’s important to consider the following:

  • Implementation challenges: How will age verification and consent be practically managed? (A rough sketch of what a consent gate might look like follows this list.)
  • Broader implications: How might this affect AI development and innovation?
  • National implications: Will this bill prompt similar federal action? Should comparable AI protections for children be added to COPPA 2.0 or the Kids Online Safety Act (KOSA)?
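
To make the age-verification and consent question a bit more concrete, here is a minimal, purely illustrative Python sketch of a consent gate in a training-data pipeline. Everything in it is hypothetical: the TrainingRecord fields, the consent_gate function, and the assumption that a reliable age signal and a consent record already exist upstream. It sketches the kind of check the bill implies, not an implementation of the bill.

    from dataclasses import dataclass

    @dataclass
    class TrainingRecord:
        # Hypothetical record in an AI training-data pipeline.
        user_id: str
        age: int                         # assumes a verified age signal exists upstream
        affirmative_authorization: bool  # consent from the minor or a parent/guardian
        text: str

    def consent_gate(records):
        """Keep only records an AB 2877-style rule would allow into training.

        Under-16 data passes only when affirmative authorization is on file;
        everything else is excluded before the training set is assembled.
        """
        return [
            r for r in records
            if r.age >= 16 or r.affirmative_authorization
        ]

    if __name__ == "__main__":
        sample = [
            TrainingRecord("a1", 14, False, "..."),  # excluded: under 16, no consent
            TrainingRecord("b2", 15, True, "..."),   # kept: consent on file
            TrainingRecord("c3", 34, False, "..."),  # kept: not covered by the under-16 rule
        ]
        print([r.user_id for r in consent_gate(sample)])  # prints ['b2', 'c3']

The hard part, of course, is everything this sketch takes for granted: how the age signal is verified and how parental authorization is collected and recorded in the first place.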

As trust and safety professionals, we must anticipate regulatory shifts like this one. AB 2877 goes beyond compliance: it is a step towards an AI ecosystem that learns from past missteps and prioritizes children’s digital rights.

Derek E. Baird, M.Ed.

Minor Safety Policy | Trust & Safety | Digital Child Rights + Wellbeing | Youth Cultural Strategy | Author | 2x Signal Award-winning podcast writer and producer