California lawmakers have passed the California Age-Appropriate Design Code (AB 2273), a bill that seeks to build on federal standards and hold online platforms liable unless they take steps to make their services safer for kids. Privacy advocates, however, are divided over the efficacy of the upcoming legislation. Some welcomed the bill, saying it finally places the privacy, safety, and well-being of children over commercial interests; others aren’t impressed.

“The bill is heavy on ensuring the ‘health and well-being’ of children,” Tom Garrubba, Director of Third-Party Risk Management Services at Echelon Risk + Cyber, told Lifewire over email, “[which is] something that many parental organizations and privacy activists for years have been criticizing big tech for turning a blind eye to.”
A Good Start
The California Senate has already passed the bipartisan AB 2273 bill, which now awaits Governor Gavin Newsom’s signature to become law. The California proposal resembles rules passed in the UK in 2021 that govern how tech firms can target kids with things like push notifications, in order to “put the best interests of the child first.” In essence, the bill seeks to establish stringent default privacy settings for users under 18 while giving them the option to easily access, comprehend, and report any concerns with a platform’s privacy policies.

“I like the tone of this bill, mostly because it applies an additional ‘privacy by design’ concept focusing on children, called ‘age appropriate by design,’” said Garrubba. Platforms, he added, will have to demonstrate adherence or face serious penalties, ranging from $2,500 per affected child if the violation is due to negligence to $7,500 per affected child if the violation is found to be intentional.

Interestingly, the new privacy rules proposed in the bill wouldn’t just apply to social apps like TikTok, YouTube, and Instagram but also, very broadly, to any other online platform likely to be accessed by children. “I am very interested in seeing if Instagram, TikTok, and YouTube will take this bill as seriously as the UK law, as they reportedly beefed up their systems with child protections to be in alignment,” added Garrubba.
But Too Little Too Late?
However, some privacy advocates, like Melissa Bischoping, Director of Endpoint Security Research at Tanium and parent to a teenager, believe AB 2273 isn’t properly designed to protect young people from what they describe as manipulative practices employed by online platforms.

“No one can argue that websites and applications should be designed and maintained in such a way that children—really everyone—are safer when using their technology,” Bischoping told Lifewire over email. “However, AB 2273 may be a poorly designed solution to the current gaps in protection and enforcement of privacy and online safety.”

For instance, Bischoping points out that to comply with the proposed law, a site must establish the age of each visitor with a “reasonable level of certainty.” She fears platforms will abuse this requirement as cover for more invasive tracking of every visitor. Her other concern is that if age determination is implemented as a lazy checkbox that simply asks users their age, it will only rile users already annoyed by cookie-permission popups and could easily be bypassed by typing in a false birth year.

“While the spirit of the bill is in the right place, the administration and implementation of the law is bound to cause more headaches for all consumers with very little benefit to the digital well-being of our kids, in addition to being costly to enforce loosely defined criteria,” said Bischoping.

Instead of pinning their hopes on a bill, Bischoping encourages parents to become more involved in determining how their data, and their children’s, is handled by the sites and services they use, “so they can make informed choices on subscriptions, relevant ad or cookie blockers, and train their kids how to keep online data secure as they come of age in a digital world.”