Senators push for more online child privacy protections

U.S. senators expressed frustration with social media giants for declining to endorse specific legislation to enhance child privacy protections online.

Congressional leaders attempting to increase privacy protections for children online are growing frustrated with big tech's lack of support for new regulation. But social media platform providers told lawmakers at a U.S. Senate hearing Tuesday that they are already taking on the task of safeguarding the security and privacy of children.

Representatives from TikTok, Snap Inc. and YouTube testified before the U.S. Senate Committee on Commerce, Science and Transportation at a hearing on privacy protections for kids online. Sen. Richard Blumenthal, D-Conn., raised concerns about the harms social media platforms cause children and teens, such as promoting content related to self-harm and eating disorders.

Concerns about privacy protections for children online stem from internal research recently leaked by whistleblower Frances Haugen showing that Facebook's platforms, including Instagram, threaten teens' mental health. Blumenthal and other senators pressed the social media company representatives on what legislative steps their companies would support to ensure children's data, privacy and well-being are protected online.

While the TikTok, Snap and YouTube representatives voiced general support for federal data privacy legislation, something social media giants like Twitter have also called for, they did not endorse particular bills introduced to protect children online. Such measures include the bipartisan Children and Teens' Online Privacy Protection Act and the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act (EARN IT Act).

"I join in the frustration felt by many of my colleagues that good intentions, support for goals, endorsement of purposes, are no substitute for actual endorsement," Blumenthal said. "I would ask that each and every one of you support … specific measures that will provide for legal responsibility."

TikTok, YouTube, Snap talk privacy protections

Jennifer Stout, vice president of global public policy at Snap, said the company has already undertaken privacy and security efforts outside of regulation.

Snap, operator of the popular video and photo platform Snapchat, has deployed tools to prevent children from viewing age-regulated content and ads. Snap also does not allow children under the age of 13 to create accounts, a restriction many platforms claim to enforce.

"Given the different speeds at which technology develops and the rate at which regulation can be implemented, regulation alone can't get the job done," she said. "Technology companies must take responsibility to protect the communities they serve. If they don't, government must act to hold them accountable."

Michael Beckerman, vice president and head of public policy for the Americas at TikTok, said the company has implemented privacy controls such as keeping accounts of users under 16 private and preventing young teens from livestreaming and direct messaging. TikTok has also built parental controls that let parents link their own accounts to their teens' accounts to "enable a range of privacy and safety controls."

Beckerman said he believes the Children's Online Privacy Protection Act (COPPA) should be updated, which is what the Children and Teens' Online Privacy Protection Act, introduced by Sens. Ed Markey, D-Mass., and Bill Cassidy, R-La., aims to do. However, before supporting such legislation, Beckerman said TikTok would like to see additions, such as a clearer way for social media platforms to verify user age.

Leslie Miller, YouTube's vice president of government affairs and public policy, said the platform has removed millions of videos that violate YouTube's child safety policies and in 2015 launched YouTube Kids, which provides parents with tools to control and customize their children's experience.

Miller said her team is participating in discussions on various proposals -- such as the EARN IT Act -- to update Section 230 of the Communications Decency Act, which shields social media platforms from liability for third-party content posted on their platforms.

While YouTube does "support the goals" of the EARN IT Act, Miller said there are some portions of the legislative proposal that need additional work.

"We see 230 as the backbone of the Internet, and it is what allows us to moderate content," she said. "So we want to make sure we continue to have protections in place so we can moderate our platforms so they are safe and healthy for users."

Blumenthal expressed dissatisfaction with responses from all three representatives and said he hopes for continued discussions with the platforms to determine the best path forward when it comes to regulation.

"I know with certainty that we need more than the kind of answers that you have given to persuade me that we can rely on voluntary action by any of the big tech companies," Blumenthal said. "I think you're going to hear a continuing drumbeat and I hope you'll be part of the effort for reform."

Makenzie Holland is a news writer covering big tech and federal regulation. Prior to joining TechTarget, she was a general reporter for the Wilmington StarNews and a crime and education reporter at the Wabash Plain Dealer.
