Legal experts back Section 230 reform

Section 230 reform, which would address the liability protections afforded to large tech companies, is a divisive debate that has yet to produce bipartisan bills.

Congress is attempting to rein in big tech companies through multiple avenues of regulation, including Section 230 reform.

This week, the House Committee on Energy and Commerce held a hearing discussing four bills aimed at limiting Section 230's protection of big tech companies. Section 230 of the Communications Decency Act protects companies from liability for content that third parties post on their platforms. Testifying was Frances Haugen, the former Facebook employee who leaked internal documents revealing that Facebook was aware of the harms its platforms could cause teenage users.

Facebook has come under fire for allowing the sharing of violent, extreme rhetoric across its platforms. Haugen advocated for new online rules and revisions to Section 230 that make systems like Facebook's safer -- a task she said is feasible.

"Facebook has hidden from you countless ways to make the platform itself safer that don't require anyone to pick and choose what ideas are good," Haugen said during the hearing. "Facebook hid these options from you because the status quo made them more money."

Section 230 reform remains a divisive regulatory measure: none of the four bills discussed during the hearing had Republican support. Even so, some lawmakers say bipartisan agreement on antitrust and Section 230 reform is still possible.

"There is a bipartisan desire to reform the court's interpretation of Section 230, and the American public wants to see us get things done," committee chairman Rep. Mike Doyle, D-Pa., said during the hearing. "I urge all my colleagues, Republican and Democratic, to bring their ideas forward now and let's work together on bipartisan legislation because we can't continue to wait."

Legal experts weigh in on Section 230 reform

The four bills discussed during the hearing included the Justice Against Malicious Algorithms Act of 2021 and the Protecting Americans from Dangerous Algorithms Act.

Matthew Wood, vice president of policy and general counsel at independent media advocacy group Free Press Action, said that while the four bills feature promising concepts, he has concerns about language in some of them, particularly the two algorithm-focused measures. Crafting legislation around algorithms could lead to hard questions about definitions and exemptions rather than keeping the focus on a platform provider's knowledge or liability, he said.

"We don't want to prevent accountability when platforms' actions cause harm, even in the absence of personalized recommendations," he said.

Additionally, Wood said Section 230 is a "foundational and necessary" law that benefits not just tech companies, but also people who share ideas online. Wood said it's important for Congress to preserve Section 230's benefits while considering revisions to "better align court outcomes with the statute's plain text."

Section 230, properly read, should still allow injured third parties to hold platforms liable for the platforms' own conduct, Wood said. Though courts have let some lawsuits proceed, most have opted not to hold platforms liable. Rulings such as Zeran v. America Online Inc. (AOL), which held that Section 230 gives internet service providers immunity from online libel suits, have prevented plaintiffs from testing liability for platform conduct, Wood said.

Mary Anne Franks, a professor at the University of Miami School of Law, also supported Section 230 reform.

Franks said big tech lacks the incentive to behave responsibly. The liability protection provided by Section 230 means the "drive to create safer or healthier online products and services simply cannot compete with the drive for profit," she said.

As long as tech companies can continue to operate without liability, they will continue to cause harm, she said.

"The possibility of liability forces people and industries to take care, internalize risk and prevent foreseeable harm," she said.

Also this week

  • The Federal Trade Commission has sued to block U.S. chip supplier Nvidia Corp.'s controversial $40 billion acquisition of U.K.-based chip designer Arm Ltd. The merger has already been under investigation by the U.K. Competition and Markets Authority (CMA) for months. According to a news release from the FTC, the "proposed vertical deal would give one of the largest chip companies control over the computing technology and designs that rival firms rely on to develop their own computing chips." The FTC alleges the merger could stifle innovation.
  • The U.K. Competition and Markets Authority has told Facebook, now Meta, to sell GIF-sharing platform Giphy, which the company acquired last year. According to the CMA, before the merger, Giphy's innovative advertising services had the potential to compete with Facebook's advertising services. After the merger, Facebook ended Giphy's advertising services and the potential competition. "By requiring Facebook to sell Giphy, we are protecting millions of social media users and promoting competition and innovation in digital advertising," said Stuart McIntosh, who is leading the Facebook-Giphy investigation, in a press release.
  • Ride-hailing firm DiDi announced plans to delist from the New York Stock Exchange after facing mounting pressure from Chinese regulatory authorities. The Chinese government launched an investigation into DiDi after the company went public on the exchange five months ago.

Makenzie Holland is a news writer covering big tech and federal regulation. Prior to joining TechTarget, she was a general reporter for the Wilmington StarNews and a crime and education reporter at the Wabash Plain Dealer.
