Twitter CEO takes some responsibility for Stop the Steal spread
Twitter Inc. Chief Executive Officer Jack Dorsey said he takes some responsibility for the online organizing that led to the Jan. 6 riot at the U.S. Capitol, while the leaders of Facebook Inc. and Alphabet Inc. deflected blame during a congressional hearing focused on social media disinformation.
Representative Mike Doyle, a Pennsylvania Democrat, asked Dorsey, Facebook CEO Mark Zuckerberg and the head of Google and its parent Alphabet, Sundar Pichai, if their platforms bear any responsibility for disseminating “Stop the Steal” disinformation alleging the 2020 presidential election was stolen from Donald Trump. Doyle demanded a yes or no answer.
“Yes, but you also have to take into consideration a broader ecosystem,” Dorsey said. “It’s not just about the technology platforms we use.”
Doyle cut off Zuckerberg after the Facebook CEO responded that Facebook’s responsibility is to “build effective systems,” adding that the individuals who organized the events and those who questioned the election’s outcome, including Trump, deserved blame. When Doyle asked Pichai if his statement that “we always feel a deep sense of responsibility” amounted to a yes, the CEO said it was a “complex question.”
This exchange set the tone for a tense back and forth between the leaders of the world’s most powerful social media networks and lawmakers eager to hold them accountable over how they police falsehoods on Covid-19, vaccines and the election on their internet services. Many committee members also pressed the executives on the negative impact of their products on children and teenagers.
The executives appeared on Thursday before members of two U.S. House Energy and Commerce subcommittees during a virtual hearing examining social media’s role in promoting extremism and disinformation.
While some lawmakers have been seeking tighter regulations of online content for years, pressure is increasing on tech companies to more aggressively curtail violent and misleading material on their platforms following the Jan. 6 riot at the U.S. Capitol, which left five people dead and many more injured.
“People died that day, and hundreds were seriously injured,” Doyle said on Thursday. “That attack and the movement that motivated it started and was nourished on your platforms. Your platforms suggested groups people should join, videos they should view, and posts they should like.”
Trump’s supporters used social media sites -- particularly alternative platforms such as Parler and Gab, but also larger services -- to organize the riot, which was staged in protest of Trump’s loss to President Joseph Biden in the November election.
In recent months, Democrats have been pushing the tech giants to do more to rid their websites of conspiracy theories about Covid-19 and the vaccines developed to combat it.
“The witnesses here today have demonstrated time and again that promises to self-regulate don’t work,” said Jan Schakowsky, chair of the Consumer Protection and Commerce Subcommittee, in an opening statement. “They must be held accountable for allowing disinformation and misinformation to spread across their platforms, infect our public discourse, and threaten our democracy.”
Thursday’s hearing is sparking renewed debate in Washington over whether Congress should weaken or even revoke Section 230 of the Communications Decency Act of 1996, the decades-old legal shield that protects social media companies from liability for user-generated content posted on their sites.
While both parties have proposed bills to reform the law, they have sparred over how tech companies should change their content moderation practices. Republicans have threatened to weaken the legal protection for tech companies over unfounded accusations that social media firms are systematically censoring conservative viewpoints. Democrats want internet companies to do more to curb the spread of misinformation, hate speech and offensive content.
“Given your promises in the fall, the events that transpired on January 6 and your true incentive that you yourself admit, I find it really difficult to take some of these assurances you are trying to give us today seriously,” Debbie Dingell, a Michigan Democrat, said.
Dingell followed up by asking Zuckerberg if he would be opposed to a law to enable regulators’ access to tech companies’ algorithms that promote disinformation and extremism.
“I don’t agree with your characterization,” Zuckerberg said. “I do think giving more transparency into the system is an important thing.” He added it might be hard to separate the algorithms and people’s data, arguing such a proposal might risk users’ privacy.
Representative Robin Kelly, an Illinois Democrat, said the companies’ business models to promote engagement on their platforms come at the cost of spreading disinformation.
“To build that engagement, social media platforms amplify content that gets attention -- that can be cat videos or vacation pictures -- but too often it means content that’s incendiary, contains conspiracy theories or violence,” she said. “This is a fundamental flaw in your business model.”
The tech executives also differed in their support for making changes to the legal shield. Before the hearing, Zuckerberg told the committee he supports making the liability protection conditional on having systems in place for identifying and removing unlawful material. Under Zuckerberg’s proposal, a third party would determine whether a company’s systems are adequate.
Google’s Pichai, whose company owns the most popular video website, YouTube, signaled that he was more skeptical of making changes to the law. Reforming it or repealing it altogether “would have unintended consequences -- harming both free expression and the ability of platforms to take responsible action to protect users in the face of constantly evolving challenges,” he said in prepared testimony.
Later, under questioning, Pichai said he was open to Zuckerberg’s approach.
“There are definitely good proposals around transparency and accountability, which I’ve seen in various legislative proposals as well, which I think are important principles,” Pichai said. “We would certainly welcome legislative approaches in that area.”
Dorsey said he supported the idea of encouraging tech companies to be more transparent about their practices.
Several bills being considered by Congress seek to weaken the legal shield in an effort to encourage the platforms to bolster their content moderation practices. Democratic senators, led by Mark Warner of Virginia, introduced the SAFE TECH Act, which would hold companies liable for content violating laws pertaining to civil rights, international human rights, stalking, harassment or intimidation.
And a bipartisan bill -- the PACT Act -- from Democratic Senator Brian Schatz of Hawaii and Republican Senator John Thune of South Dakota would require large tech companies to remove content within four days if notified by a court order that the content is illegal.
Several members of the subcommittees used the hearing to ask pointed questions about how the companies’ products affect children, amid Facebook’s push to make a version of Instagram for kids under age 13.
Representative Cathy McMorris Rodgers, a Washington Republican, criticized the power of tech companies’ algorithms to determine what children see online.
“Over 20 years ago, before we knew what Big Tech would become, Congress gave you liability protections. I want to know, why do you think you still deserve those protections today?” said McMorris Rodgers, the committee’s top Republican. “What will it take for your business model to stop harming children?”
Representative Bob Latta, an Ohio Republican, asked Zuckerberg whether Facebook shoulders part of the blame for the suicide of an underage girl after a man showed a compromising photo of her to her peers on the social network.
Zuckerberg said it was “an incredibly sad story,” and said his company bears responsibility to build systems to remove that kind of content.