Facebook, Google CEOs blasted in Congress over apps for kids
The virtual hearing before two House Energy and Commerce subcommittees also examined how social media companies police falsehoods on Covid-19, vaccines and the election.
Facebook Inc. and Google came under fire at a congressional hearing for the impact their social media services have on children, with lawmakers zeroing in on Facebook's plan for a new app for kids and YouTube's feature that serves up a continuous stream of videos.
At a hearing focused on disinformation and extremism, lawmakers pressed Facebook Chief Executive Officer Mark Zuckerberg and Sundar Pichai, CEO of Alphabet Inc. and Google, which owns YouTube, to answer questions about whether their products are designed to keep kids addicted and pose a threat to their well-being.
“Your platforms are my biggest fear as a parent,” said Representative Cathy McMorris Rodgers, a Washington Republican and the mother of three school-aged kids. “My husband and I are fighting the big tech battles in our household every day. It's a battle for their development, a battle for their mental health, and ultimately, a battle for their safety.”
The tech executives appeared alongside Twitter Inc. CEO Jack Dorsey on Thursday before members of two U.S. House Energy and Commerce subcommittees during a virtual hearing that also examined how social media companies police falsehoods on Covid-19, vaccines and the election. Questions at the hearing, which ran more than five hours, addressed points ranging from the U.S. Capitol riots to corporate diversity reports, but one topic that surfaced over and over was whether the internet giants do enough to protect children from their services' harmful effects on mental health and privacy.
The bipartisan attack from lawmakers over the effect of social media on kids marked an escalation on one front of a broader effort to rein in the tech giants. While the lawmakers pressed the executives aggressively on the issue, they have thus far offered little detail on how they would regulate the companies' services for young people.
Several lawmakers cited the news earlier this month that Facebook is building a version of its photo-sharing app Instagram specifically for children younger than 13, an age group that is currently prohibited from using most of the social media giant's services. Facebook previously launched Messenger Kids, a version of the company's messaging app for pre-teens that gives parents the power to keep tabs on their children's activity on the service. Google has also created a separate YouTube Kids app to provide safer, youth-oriented video content.
Representative Bob Latta, an Ohio Republican, asked Zuckerberg whether Facebook shoulders part of the blame for an underage girl's suicide after a man showed a compromising photo of her to her peers on the social network.
Zuckerberg said it was “an incredibly sad story,” and said his company bears responsibility to build systems to remove that kind of content. In another exchange, he sought to highlight the good social media can do when it enables meaningful interactions.
“Using social apps to connect with other people can have positive mental health benefits,” Zuckerberg said.
Representative Bill Johnson, an Ohio Republican, likened Facebook's and Google's products aimed at kids to tobacco companies selling cigarettes to youth. He argued that long-term risks to children are one reason Congress should consider curtailing Section 230 of the Communications Decency Act of 1996, the legal shield that protects internet platforms from lawsuits over third-party content. “By allowing big tech to operate under Section 230 as is, we'll be allowing these companies to get our children hooked on their destructive products for their own profit,” he said. “Big tech is essentially handing our children a lit cigarette and hoping they stay addicted for life.”
McMorris Rodgers criticized the power of tech companies' algorithms to determine what children see online, and linked that responsibility with the liability shield provided by Section 230.
“Over 20 years ago, before we knew what Big Tech would become, Congress gave you liability protections. I want to know, why do you think you still deserve those protections today?” said McMorris Rodgers, the committee's top Republican. “What will it take for your business model to stop harming children?”
Massachusetts Democrat Lori Trahan questioned the CEOs over what she called the companies' “manipulative design features intended to keep them hooked,” such as the auto-play function on YouTube, which rolls a viewer directly into a new video when one ends. She questioned Zuckerberg over whether Facebook would enable “endless” scrolling and the ability to add filter effects on photos on the new Instagram app for kids. Zuckerberg said that Instagram Kids is still in early development, and the company is looking into safety measures as part of that process.
“This committee is ready to legislate to protect our children from your ambition,” Trahan said. “What we're having a hard time reconciling is while you're publicly calling for regulation, which comes off as incredibly decent and noble, you're plotting your next frontier of growth which deviously targets our young children.”
Social media services targeted at kids and teens have also caught the attention of the Federal Trade Commission, which has fined companies for violating children's privacy laws. In 2019, YouTube agreed to pay a record $170 million fine for failing to obtain parental consent before collecting data on kids under the age of 13. The FTC has also spoken with Facebook after it was revealed that a flaw allowed some kids to chat with people their parents hadn't approved.
Representative Kathy Castor, a Florida Democrat, asked Pichai and Zuckerberg how much revenue their companies make from advertising shown to kids.
Pichai said kids aren't allowed to use most of Google's products. When Zuckerberg answered Castor with a similar refrain, the lawmaker interrupted the CEO.
“Every parent knows there are kids under the age of 13 on Instagram,” she said. “The problem is you know it, and you know that the brain and social development of our kids is still evolving at a young age.”
Castor vowed to strengthen laws protecting children online. Last year, she introduced a bill that would bolster the Children's Online Privacy Protection Act to force companies to gain consent from people under age 18 before collecting or sharing their personal information.
“Because these platforms have ignored it, they have profited off it, we're going to strengthen the law,” she said.