Section 230 likely to take center stage at Big Tech hearing: 'Everyone's looking for something to blame'

The CEOs of Facebook (FB), Google (GOOG, GOOGL), and Twitter (TWTR) will appear before Congress again Thursday to answer lawmakers’ questions about the spread of disinformation and misinformation on their platforms.

And just like with previous hearings involving Big Tech, Section 230 of the Communications Decency Act is expected to take center stage. Considered a foundational law for the modern internet, Section 230 protects websites from legal liability for third-party content posted to their sites, as well as from liability for moderating that content.

The law has come under an increasingly harsh microscope as members of Congress on both sides of the aisle have sought to air their grievances about whether social media companies regulate speech on their platforms too much or too little — especially in light of the role of social media in the deadly assault on the Capitol on Jan. 6.

The debate over 230 could have long-lasting implications not only for how some of the biggest sites on the internet operate, but also the smallest, including whether they continue to host user-generated content.


Thursday’s hearing before the House Energy and Commerce Committee will mark the third time in the past year that the three CEOs — Google’s Sundar Pichai, Facebook’s Mark Zuckerberg, and Twitter’s Jack Dorsey — have testified about how they moderate their platforms and about Section 230. In October they were called before the Senate Commerce Committee, and in November they appeared before the Senate Judiciary Committee. Topics during those hearings have run the gamut from online censorship to whether 230 has outlived its usefulness.

“Everyone's looking for something to blame and Section 230 is convenient,” David Ardia, associate professor of law and co-director of UNC’s Center for Media Law and Policy, told Yahoo Finance in advance of the November hearing. “But unless we are really clear about what it is we're hoping to achieve, it’s hard to know whether these efforts to change Section 230 are going to make any difference.”

Split views on moderation and Section 230

Republicans and Democrats generally hold different views on how social media companies should moderate speech on their platforms and on whether Section 230 gives them too much leeway to do so.

Some Republicans say tech companies show an anti-conservative bias when they use Section 230’s broad immunity to remove conservative content that violates their platform rules, or to attach warnings to it. They further charge that social media companies hide right-leaning content. Still, right-wing pages and posts have tended to perform well on sites like Facebook.

CEO of Twitter Jack Dorsey gives his opening statement remotely during the Senate Commerce, Science, and Transportation Committee hearing 'Does Section 230's Sweeping Immunity Enable Big Tech Bad Behavior?' on Capitol Hill in Washington, DC, U.S., October 28, 2020. (Greg Nash/Pool via Reuters)

Many Democratic lawmakers, meanwhile, say the companies aren’t doing enough to stop the spread of misinformation and disinformation on their platforms. They point to the “Stop the Steal” campaign’s ability to flourish on social networks ahead of the storming of the Capitol on Jan. 6, as well as COVID-19 disinformation, as proof that the companies do too little to keep such content off their sites.

“Congress doesn't really care about the facts. They don't really care about what their constituents want. They just want to do something,” Eric Goldman, Santa Clara University School of Law professor and associate dean for research, told Yahoo Finance.

“Congress is really complaining about speech that is lawful...and they just don’t like it,” Goldman said. “We call that censorship.”

The platforms already have a right under the First Amendment, regardless of Section 230, to decide what content to host on their sites. That means the government can’t dictate what social media companies should and shouldn’t allow.

But according to Goldman, 230 still provides useful protections. Effectively, he said, the law supplements the First Amendment by immunizing online platforms from claims where First Amendment defenses are not typically successful, and by opening a procedural “fast lane” that reduces the volume and cost of litigation.

Mark Zuckerberg, Chief Executive Officer of Facebook, testifies remotely during the Senate Judiciary Committee hearing on 'Breaking the News: Censorship, Suppression, and the 2020 Election' on Capitol Hill on November 17, 2020 in Washington, DC. (Bill Clark/Pool/AFP via Getty Images)

Without Section 230’s protections, companies and scholars have predicted, online platforms would take an all-or-nothing approach to content. Either they would allow everything to avoid accusations of bias, or they would allow nothing to avoid accusations that they don’t moderate enough. And that could ruin the internet as we know it, some argue.

Still, some scholars say now may be the best time to amend Section 230, and that the social media sites brought the pressure on themselves.

“I think the time is ripe for revisiting Section 230, given recent examples of social media platforms having to exercise editorial control over user-generated content,” S. Shyam Sundar, professor of media ethics at Pennsylvania State University, told Yahoo Finance. “When Twitter put warning labels on Trump’s tweets last November, Twitter, Inc. ended up looking more like a publisher than a newsstand.”

And that distinction, some argue, is important because publishers such as The New York Times or The Washington Post are not entitled to Section 230 liability protections for content that they, and sometimes others, create. If a website alters third-party content to a certain extent, they say, the platform should be stripped of its immunity shield.

Yet Goldman says Congress intentionally made no legal distinction between publishers and platforms in order to preserve the law’s core value proposition: that websites would not be discouraged from doing the work that might otherwise make them a publisher.

“The whole point of Section 230 was to say, ‘We don’t care about where that distinction is. We want internet services to do this socially valuable content moderation work, and we'll give them legal protection if they choose to do that.’ Anyone bringing back the publisher-platform distinction doesn't really understand the basic architecture of Section 230.”

For his part, Sundar says lawmakers will need to thread the needle when addressing any issues with 230.

“I am hopeful of finding a meaningful middle ground that will assign responsibility to platforms for monitoring dangerous content without regulating speech, holding them accountable for their decisions by demanding more transparency in their content-moderation operations,” he said.

Alexis Keenan is a legal reporter for Yahoo Finance and former litigation attorney. Follow Alexis Keenan on Twitter @alexiskweed.

Got a tip? Email Daniel Howley at dhowley@yahoofinance.com or via encrypted mail at danielphowley@protonmail.com, and follow him on Twitter at @DanielHowley.
