On Thursday, California Governor Gavin Newsom signed into law a bill intended to safeguard children’s online safety, privacy, and data.

Online platforms must comply with the California Age-Appropriate Design Code Act (AB 2273), which mandates that they consider “the best interest of child users” and set privacy and security defaults “that protect children’s mental and physical health and well-being.”

The bill, proposed by Assemblymembers Buffy Wicks (D-Oakland) and Jordan Cunningham (R-San Luis Obispo), forbids online services that kids are likely to access (such as most social media sites, game developers, streaming platforms, etc.) from misusing young people’s personal information; collecting, selling, or tracking their geolocation; or profiling them by default.

“We’re moving quickly in California to safeguard the health and welfare of our children,” Newsom said in a statement. “As a father of four, I am aware of the actual problems our kids are having online. I appreciate Assemblymembers Wicks and Cunningham, as well as the tech sector, for working on these measures and prioritizing the welfare of our children.”

The measure, which passed with bipartisan support, also mandates that internet businesses provide responsive tools to help young users “exercise their privacy rights,” along with freely accessible privacy information, terms of service, policies, and community standards. By January 2024, the state legislature will receive a report on best practices from the Children’s Data Protection Working Group.

As the mother of two young girls, Wicks remarked, “I am personally motivated to ensure that Silicon Valley’s most influential corporations rebuild their products in children’s best interest.”

Cunningham added, “We still have work to do to address the youth mental health crisis,” citing eating disorders, depression, and suicidal thoughts as examples. “In particular, we know that certain Big Tech social media corporations design their products to addict kids, and a significant proportion of those kids suffer serious harm as a result,” he said. “Protecting children online is not just good sense; doing so will also save lives.”

The law comes amid public backlash over children’s excessive use of social media apps, and some platforms have already made changes.

Last year, Instagram said users under the age of 16 who sign up for the app will automatically get private profiles, while existing account holders will receive prompts encouraging them to switch. More recently, it said it will urge teen users to review their privacy settings as part of its ongoing efforts to make the platform safer for younger users.

TikTok, meanwhile, stopped sending push notifications after 9 p.m. to users between the ages of 13 and 15, and after 10 p.m. to those between 16 and 17. It also disabled direct messaging by default for younger users.

A Meta spokesperson called the law an “important development,” but said some of its provisions raise concerns for the company, which prefers industry-wide standards.

Cunningham and Wicks had earlier sponsored a bill intended to bar social networks from using or selling young people’s personal data in ways designed to foster addiction. But last month, after intense lobbying pressure from social media firms, that bill was shelved.
