
Subtitles

  • Section 230 has become the single most important law shaping online platforms.

  • In simple terms, it's the law that protects services if a user posts something illegal.

  • But as these services have become bigger and more controversial, people are more seriously calling for the law to change.

  • If you have any kind of problem with Facebook or YouTube, somebody's probably proposed fixing it by changing Section 230.

  • And as pressure in Congress builds, those fixes could have huge consequences for the rest of the Internet.

  • When 230 is created, it's created by people newly creating this nascent Internet.

  • Our laws shouldn't remain stagnant as they were 25 years ago.

  • It doesn't mean we get rid of them.

  • It means we change them.

  • So let's talk about how we got here.

  • In the mid-nineties, a lot of the Internet was message boards: some tiny and independent, others on bigger services like CompuServe, Prodigy, and AOL.

  • And when a user posted something libellous or otherwise illegal, a couple of court rulings effectively punished companies for moderating content.

  • They basically said that if a site took down some posts, it could become legally responsible for other posts, while just letting a service run wild could keep companies out of trouble.

  • Lawmakers thought this was a bad idea.

  • So when they passed the Communications Decency Act in 1996, they included Section 230 to settle the question.

  • The first part says that interactive computer services (apps, websites, newspaper comment sections, AOL, and so on) can't be legally treated as the publisher of their users' posts.

  • The second part says you can't sue a platform for good-faith moderation efforts.

  • Overall, the point is simple.

  • When a site or app accepts user-generated content, it typically doesn't accept liability for that content, no matter how it moderates.

  • A lot of conservative politicians have gotten this wrong, claiming there are separately regulated categories of platforms and publishers, and that moderation makes you a publisher with more legal liability.

  • In reality, Section 230 was literally written so this wouldn't happen.

  • So some of these same politicians have tried to roll back or abolish Section 230 to change this, trying to make it tougher for sites to suspend users like former President Donald Trump.

  • As you know, I have long been concerned about Twitter's pattern of censoring and silencing individual Americans.

  • Why do you block anything?

  • Well, we have policies that are focused on making sure that more voices on the platform are possible.

  • We see a lot of abuse and harassment, which ends up silencing people and having them leave the platform.

  • But the First Amendment also says companies probably don't have to publish speech they don't like, including Trump's posts.

  • But Section 230 makes it easier to fight a potentially expensive lawsuit.

  • When Democrats like current President Joe Biden talk about abolishing Section 230, though, they're usually referring to a more complicated problem.

  • If somebody slanders you in a YouTube video or criminally harasses you on Facebook, Section 230 says you should sue or prosecute the original offender, not the platform that hosts them. But on an Internet with billions of users, troll mobs and automated systems can spread this content really fast, and going after each user is almost impossible.

  • Lots of Democrats and some Republicans want sites to take a more active role in policing illegal content, and they think that right now, Section 230 lets companies escape responsibility.

  • I just think that social media has to be more socially conscious of what is important in terms of our democracy.

  • I, for one, think we should be considering taking away the exemption that they cannot be sued for knowingly engaging in promoting something that's not true.

  • Trusting that they will be regulated into actually caring about the problems of hate speech is, in my opinion, a very silly way to do it, if you're going to do it by 230.

  • The Internet's issues are way bigger than Section 230.

  • To explore them, the Verge held a panel discussion with the general counsels of the Wikimedia Foundation and Vimeo, as well as a writer and researcher.

  • Together, they laid out the complex problems that any legal reform would have to deal with, and the potential impact of getting it wrong.

  • No amount of Section 230 reform is going to fix the fact that the First Amendment also protects a lot of speech online.

  • Even if you took away Section 230 today, you're still not going to be able to sue platforms for misinformation about the election.

  • Misinformation about vaccines, you know, things that are hateful to minority groups.

  • But if you look at the people who actually do sue for things like defamation.

  • It's generally the people in power.

  • It's not the people out of power.

  • And so, by removing Section 230 protections, do you privilege those people over people who can't afford a lawyer and will never sue anyway?

  • I think you probably do.

  • It just becomes billion-dollar tech companies throwing money around to silence whoever is going to be a problem. And no matter what, if it comes down to money and law, the people will lose, and that is the worst possible outcome.

  • These issues go way beyond quote unquote big tech.

  • There are smaller sites that basically exist just to promote harassment, nonconsensual pornography, and other criminal behavior.

  • And right now, Section 230 protects them along with every other site on the Internet.

  • But a lot of companies, nonprofits, and ordinary people are just trying to run a site or app in good faith, and changing Section 230 could catch them in the crossfire.

  • Are you going to create a situation where you need a so-called army of moderators to deal with the problem?

  • Because, if so, you risk entrenching the large players like Facebook and others because they already do have virtually unlimited resources and can deal with the issue.

  • So far, Congress has introduced some potential changes, and they basically fall into two categories.

  • The first is making sites earn Section 230 protections.

  • The PACT Act, for instance, says that sites need to publish a clear moderation policy.

  • They need to remove illegal content within 24 hours of hearing about it, and they need to staff a phone helpline with actual human beings.

  • The PACT Act also includes a small-site exemption, so it could theoretically have a smaller impact on the Internet.

  • The second category is limiting what kind of content Section 230 covers.

  • That's the goal of the new SAFE TECH Act, which adds liability for a bunch of different kinds of offending content, including harassment and wrongful death.

  • But when Congress has limited protections before, it's often accidentally created more problems.

  • I really worry about the kind of abuse if there were overly broad restrictions on people's ability to host what is actually knowledge.

  • I would want to make sure that any intervention was narrowly and specifically tailored, without creating the kind of unintended consequences that have made real people less safe.

  • Section 230 doesn't cover copyright infringement, for instance, so sites have created takedown systems and automated filters to catch pirated content.

  • But trolls abuse these systems to censor users, and the flagging systems make a lot of mistakes.

  • And that is the crux of this: the same things that protect us and the same things that enable harm against us are intertwined, and that is what we're teasing out.

  • And I am not comfortable with any discussion that pretends we will solve the problems of people or solve the problems of democracy with a large, sweeping, one-size-fits-all discussion.

  • Section 230 often gets looked at as the only tool in the toolbox.

  • If privacy is our concern, we should focus on privacy laws.

  • If it's data security, we should focus on data security laws.

  • I worry about locking in sort of the technology of 2021.

  • Frankly, I don't think we've got it right.

  • The Internet has real problems, and since Section 230 helped create the Internet as we know it, you can blame it for some of them.

  • But Section 230 also helped create a lot of what we love about the Internet.

  • So as lawmakers try to change it, the real question is whether good intentions will translate into good results.

  • I ordered some potatoes.

  • This is my fault, not the online company.

  • This is what I received: three potatoes of this size.

  • These kinds of things have happened to me.

  • Often I take the blame.

  • We know that the virtues of technology have helped us very much get through this pandemic.

  • But at the same time, there's so much we need to fix.
