California Legislature Pushes Sweeping Children's Online Privacy Bill


A new children's online privacy bill is moving quickly through the California state legislature and could set national safety standards for websites likely to be accessed by children.

The California Age-Appropriate Design Code calls for turning off location tracking for children on social media platforms, stopping "nudge" techniques that trick children into giving up information, reducing exposure to harmful content and limiting the potential for risky connections with adults.

The bill, co-authored by Oakland Democrat Buffy Wicks and Templeton Republican Jordan Cunningham, is modeled on a law recently passed in the UK. In a state where teenagers spend an average of 8.5 hours online every day, it would force social media companies to enable the strongest safety settings by default for users under the age of 18.

"If you don't have government regulation making this a priority, you think about it later," Wicks told CBS News. "That's what regulation can and should do, and it's about forcing the conversation at a much higher level within the company."

More than a dozen bills in Congress

Efforts to pass a children's online safety law in California come as federal efforts have stalled. More than 15 bills are currently circulating in Congress, several with bipartisan support, with goals such as modernizing internet safety standards, making it easier for people to sue large companies and creating a data privacy agency. The American Innovation and Choice Online Act, which focuses on antitrust enforcement in the industry, is the only one of those bills to have made it through a committee vote.

Despite multiple congressional hearings and explosive testimony from executives at Meta, Twitter, Snapchat, TikTok and YouTube, as well as from whistleblower Frances Haugen, progress on those measures has been hampered by other legislative priorities and now looks unlikely with the midterm elections imminent.


Safety over profit

The California bill states that, in the event of a conflict, social media platforms and all websites "likely" to be accessed by children must put the best interests of children ahead of their "commercial interests."

The phrase echoes Facebook's controversial Senate hearing last year, when lawmakers accused the company of prioritizing profits over safety, allegations the company denied.

"Their own data makes the strongest argument for this type of legislation, for why this type of protection is important," Wicks said.

Meta paused its Instagram for Kids project after backlash from advocates and lawmakers last year. The company told CBS News that it wants to create age-appropriate features, give teens control over their privacy and online experiences, and include parents in the process. Its recently released Family Center gives parents more access to supervision tools.

Teen accounts on Instagram are set to private by default. In addition to notifications from Instagram's "Take a Break" feature, Meta said it will soon begin nudging teens toward other topics if they dwell on one topic for a long time.



What teen activists say

For Emily Kim, this is a welcome change. Kim downloaded Instagram as soon as she got her first phone at age 13.

"I found myself staring at my screen as I scrolled through my peers' profiles," Kim said at a legislative hearing last month. Her "online pain" continued even after an autoimmune disease led to severe hair loss, she told lawmakers.

"Girls would post pictures of themselves participating in trends that I didn't participate in," Kim said.

Now 18, Kim works with LOG OFF, a teen-led digital wellness advocacy group, to educate her peers about "the dangers of social media and how to use it safely." Testifying in favor of the California bill, she said legislation is needed "to protect young people from increasing mental and physical risks."

Privacy by default

Wicks and Cunningham's bill passed the Assembly's Privacy and Consumer Protection Committee unanimously in April. It could come up for a floor vote this month.

Wicks said the UK's new children's law is already working and that California could have a "significant impact" if it successfully follows the same model.

According to the 5Rights Foundation, a London-based nonprofit that supported the UK law and backs the California legislation, "a wide range of services has changed privacy settings hundreds of times" to comply with the UK law.

In August, Google made SafeSearch the default option and turned off location history for users under the age of 18 worldwide. YouTube disabled autoplay and turned on bedtime reminders by default for those under 18. TikTok also announced improved safety features, including disabling direct messaging between children and adults and turning off push notifications after 10 p.m. by default for underage users.

"There is a history of businesses taking whatever the strongest state law on a particular subject is and making it their guiding principle nationwide," Eric Null, director of the privacy and data project at the Center for Democracy and Technology, told CBS News.

Websites or services that target children under the age of 13 must already comply with the Children's Online Privacy Protection Act of 1998 (COPPA).

Null explained that while COPPA focuses on parents taking action before their children can use websites or have their data collected by companies, the California legislation focuses much more on what companies are and are not allowed to do.

Unintended consequences?

While there is "a significant amount" of positive progress in the California legislation, Null warns that it could have unintended consequences.

"One of the biggest privacy implications of this kind of bill is that it essentially requires all websites to enforce age restrictions and collect information about the age of all users," Null said. "That means a lot of data collection for every single user on almost every single website," he added.

Meta and Google have not weighed in on the bill, but some industry trade groups are raising concerns.

Two groups opposing the bill, TechNet and the California Chamber of Commerce, say it overreaches by covering all sites "likely to be accessed by children," not just those aimed at children.

The groups also argued that the bill's "new standard for age verification" would force businesses to collect more information about users, such as "birthdates, addresses, and government IDs."

The Electronic Frontier Foundation (EFF) told Wicks it could not support the bill unless it was amended to cover only users under the age of 13, in line with federal law. The EFF also said many terms in the bill were "ambiguous" and that its enforcement mechanisms were unclear.

"We're working on the enforcement component right now, and we're trying to figure out the best way to do it," Wicks said. She said the bill is not intended to "harm big tech" and that she wants social media executives to support it.

“They are parents too,” Wicks said.
