Lawmakers press Big Tech CEOs on speech responsibility
Washington – The CEOs of social media giants Facebook, Twitter and Google faced a grilling Thursday as lawmakers tried to draw them into admitting responsibility for helping fuel the January insurrection at the U.S. Capitol and rising COVID-19 vaccine misinformation.
In a hearing by the House Energy and Commerce Committee, lawmakers pounded Facebook CEO Mark Zuckerberg; Sundar Pichai, the CEO of Google, which owns YouTube; and Twitter chief Jack Dorsey over their content policies, use of consumers’ data and media use by young children.
Republicans raised long-running, unproven conservative grievances that the platforms are biased against conservative viewpoints and censor material on political or religious grounds.
There is increasing support in Congress for legislation to rein in Big Tech companies. “The time for self-regulation is over. It’s time we legislate to hold you accountable,” said Rep. Frank Pallone, D-N.J., the committee’s chairman.
That legislative momentum, plus the social environment of political polarization, hate speech and violence against minorities, was reflected in the impatience of panel members as they questioned the three executives. Several lawmakers demanded yes or no answers and repeatedly cut the executives off.
“We always feel some sense of responsibility,” said Pichai. Zuckerberg used the word “nuanced” several times to insist that the issues can’t be boiled down. “Any system can make mistakes” in moderating harmful material, he said.
The three staunchly defended their companies’ efforts to weed out the increasingly toxic content posted and circulated on services used by billions of people while striving to balance freedom of speech.
“I don’t think we should be the arbiters of truth and I don’t think the government should be either,” Dorsey insisted.
Democrats are laying responsibility on the social media platforms for disseminating false information on the November election and the “Stop the Steal” voting fraud claims fueled by former President Donald Trump, which led to the deadly attack on the Capitol. Rep. Mike Doyle, a Pennsylvania Democrat, told the CEOs that the riot “started and was nourished on your platforms.”
Support is building for Congress to impose new curbs on the legal protections covering speech posted on the companies’ platforms. Both Republicans and Democrats – including President Joe Biden as a candidate – have called for stripping away some of the protections under so-called Section 230 of a 25-year-old telecommunications law that shields internet companies from liability for what users post.
The tech CEOs defended the legal shield under Section 230, saying it has helped make the internet the forum of free expression that it is today. Zuckerberg, however, again urged the lawmakers to update that law to ensure it’s working as intended. He added a specific suggestion: Congress could require internet platforms to gain legal protection only by proving that their systems for identifying illegal content are up to snuff.
Trump enjoyed special treatment on Facebook and Twitter until January, despite spreading misinformation, pushing false claims of voting fraud, and promulgating hate. Facebook banned Trump indefinitely a day after rioters egged on by Trump swarmed the Capitol. Twitter soon followed, permanently disabling Trump’s favored bullhorn.
Facebook hasn’t yet decided whether it will banish the former president permanently. The company punted that decision to its quasi-independent Oversight Board – sort of a Supreme Court of Facebook enforcement – which is expected to rule on the matter next month.
Researchers say there’s no evidence that the social media giants are biased against conservative news, posts or other material, or that they favor one side of political debate over another.
Democrats, meanwhile, are largely focused on hate speech and incitement that can spawn real-world violence. An outside report issued this week found that Facebook allowed groups – many tied to QAnon, boogaloo and militia movements – to glorify violence during the 2020 election and in the weeks leading up to the deadly riot at the Capitol.
The report from Avaaz, a nonprofit advocacy group that says it seeks to protect democracies from misinformation, identified several hundred pages and groups on Facebook that it says spread violence-glorifying material to a combined following of 32 million users. Facebook acknowledged that its policy enforcement “isn’t perfect,” but said the report distorts its work against violent extremism and misinformation.
Ortutay reported from Oakland, California. AP Technology Writer Michael Liedtke in San Ramon, California, contributed to this report.