
  • Peter Kafka: I'm not going to do a long wind-up here, because I have a lot of questions

  • for my next guest.

  • I'm delighted she's here.

  • Please welcome Susan Wojcicki, CEO of YouTube.

  • They gave you a good hip-hop theme for your way in.

  • Susan Wojcicki: Thank you.

  • Thank you for coming.

  • Sure.

  • Thank you for having me.

  • I'm really glad we get to have this conversation.

  • I'm glad we get to do it in public, on a stage, on the record.

  • That's great.

  • Let's start here.

  • There was a bunch of news last week.

  • Some of it involved you.

  • Some of it involved vox.com, where I work.

  • There was a policy change.

  • I think they all sort of happened at the same time.

  • Can we just walk through what happened, and if they're parallel tracks, or if they were

  • connected?

  • Sure.

  • So, first of all, thank you.

  • A lot of things happened last week, and it's great to be here and talk about what happened.

  • But I do want to start, because I know that the decision that we made was very hurtful

  • to the LGBTQ community, and that was not our intention at all.

  • Should we just set context, for anyone who was not following this?

  • What decision this was?

  • Yeah.

  • So, let me ... I'll go into that.

  • But I thought it was really important to be upfront about that, and to say that was not

  • our intention, and we were really sorry about that.

  • But, I do want to explain why we made the decision that we did, as well as give information

  • about the other launch that we had going on.

  • Really, there were two different things that happened at the same time.

  • The first one I'll talk about is, we made a really significant change involving hate

  • speech.

  • This is something we had been working on for months, and we launched it on Wednesday of

  • last week.

  • And this is a series of policy changes you've been rolling out for years now.

  • So, just to be clear ... Yeah.

  • So, we've been making lots of different policy changes on YouTube.

  • We have made about 30 changes in the last 12 months, and this past week, we made a change

  • in how we handle hate speech.

  • That took months and months of work, and we had hundreds of people working on that.

  • That was a very significant launch, and a really important one.

  • What we did with that launch is we made a couple big changes.

  • One of them was to make it so that if there's a video that alleges that some race or religion

  • or gender or group, protected group, is superior in some way, and uses that to justify discrimination

  • or exclusion, that would now no longer be allowed on our platform.

  • Similarly, if you had a video alleging inferiority, that another religion or race or group

  • was inferior, and they used that to justify discrimination in some way.

  • Those were changes that we made.

  • So, examples would be like, “Race X is superior to Y, and therefore Y should be segregated.”

  • Is it weird to you that you had to make a rule that said, “This shouldn't be allowed”?

  • That this wasn't covered either by an existing rule?

  • That you had to tell your community, “Look.

  • This is not acceptable”?

  • Well, actually, a lot of this ... We're a global company, of course.

  • And so, if you look at European law, there are a number of countries that have a really

  • strong hate speech law.

  • And so, a lot of this content had never been allowed in those countries, but had actually

  • been allowed in the US and many other countries.

  • And so what we had actually done with it a few years ago is we had actually limited its

  • features, meaning that it wasn't in the recommendations.

  • It wasn't monetized.

  • It had an interstitial in front of it to say that this was content that we found offensive.

  • And when we did that, we actually reduced the views to it by 80 percent.

  • So, we found that it was effective, but we really wanted to take this additional step,

  • and we made this step on Wednesday.

  • We also added, which is really important, a few other definitions to protected groups.

  • So, we added caste, because YouTube has become so significant in India.

  • Then, we also added victims of verified violent events.

  • So, like saying the Holocaust didn't happen, or Sandy Hook didn't happen, also became

  • violations of our policies.

  • And so, this was happening on Wednesday, and we launched it on Wednesday.

  • There were thousands of sites that were affected.

  • And again, this is something that we had been working on ...

  • This was coming already.

  • It was coming already.

  • We had started briefing reporters about it in Europe over the weekend, because they're

  • ahead.

  • You know, the train had left the station.

  • And then at the same time, on Friday, there was a video.

  • We heard the allegations from Mr. Carlos Maza, who uploaded a video on Twitter with a compilation

  • Works at vox.com.

  • Who works at vox.com, yes.

  • With a compilation of different video pieces from Steven Crowder's channel, putting them

  • together, right?

  • And asked us to take action.

  • Each of these videos had harassment

  • Saying, “He's directing slurs at me, and the people who follow him are attacking me

  • outside of YouTube, as well.”

  • Yes.

  • So, he alleged that there was harassment associated with this, and we took a look at this.

  • You know, we tweeted back and we said, “We are looking at it.”

  • You know, Steven Crowder has a lot of videos, so it took some time for us to look at that

  • and to really understand what happened, and where these different snippets had come from

  • and see them in the context of the video.

  • Actually, one of the things I've learned, whenever people say, “There's this video

  • and it's violative.

  • Take it down or keep it up,” you have to actually see the video, because context really,

  • really matters.

  • And so, we looked through a large number of these videos, and in the end we decided that

  • it was not violative of our policies for harassment.

  • So, were you looking at this yourself, personally?

  • Vox is a relatively big site.

  • It's a big creator.

  • Were you involved in this directly?

  • I mean, I am involved whenever we make a really important decision, because I want to be looking

  • at it.

  • So, you were looking at the videos.

  • Well, so we have many, many different reviewers.

  • Mm-hmm.

  • They will do a review.

  • Again, there are lots of different videos produced by Steven Crowder.

  • He's been a longtime YouTuber.

  • But in this case, did you weigh in personally?

  • Did you look at the stuff?

  • I mean, yes.

  • I do look at the videos, and I do look at the reports and the analysis.

  • Again, I want to say there were many videos, and I looked certainly at the compilation

  • video.

  • So, when the team said, “We believe this is non-violative.

  • This doesn't violate our rules,” you agreed with that?

  • Well, let me explain to you why.

  • Mm-hmm.

  • Why we said that.

  • But you agreed?

  • I agreed that that was the right decision, and let me explain to you why I agreed that

  • was the right decision.

  • Okay?

  • So, you know, when we got ... first of all, when we look at harassment and we think about

  • harassment, there are a number of things that we look at.

  • First of all, we look at the context.

  • Of, you know, “Was this video dedicated to harassment, or was it a one-hour political

  • video that had, say, a racial slur in it?”

  • Those are very different kinds of videos.

  • One that's dedicated to harassment, and one that's an hour long. So, we certainly

  • looked at the context, and that's really important.

  • We also look and see, is this a public figure?

  • And then the third thing that we look at is, you know, is it malicious?

  • Right?

  • So, is it malicious with the intent to harass?

  • And for right or for wrong right now, malicious is a high bar for us.

  • So the challenge is, like when we get an allegation like this, and we take it incredibly seriously,

  • and I can tell you lots of people looked at it and weighed in.

  • We need to enforce those policies consistently.

  • Because if we were not to enforce it consistently, what would happen is there would be literally

  • millions of other people saying, “Well, what about this video?

  • What about this video?

  • What about this video?

  • And why aren't all of these videos coming down?”

  • And if you look at the content on the internet, and you look at rap songs, you look at late-night

  • talk shows, you look at a lot of humor, you can find a lot of racial slurs that are in

  • there, or sexist comments.

  • And if we were to take down every single one, that would be a very significant ...

  • So, to stipulate that you take it seriously.

  • I want to come back to the idea that there's a ton of this stuff here.

  • Well, so what we did commit to, and really, this is I think really important, is we

  • committed, like, “We will take a look at this, and we will work to change the policies

  • here.”

  • We want to be able to ... when we change a policy, we don't want to be knee-jerk.

  • We don't want it to be like, “Hey, I don't like this video,” or, “This video is offensive.

  • Take it down.”

  • We need to have consistent policies.

  • They need to be enforced in a consistent way.

  • We have thousands of reviewers across the globe.

  • We need to make sure that we're providing consistency.

  • So, your team spends a bunch of time working on it.

  • They come to you at some point and they say, “We don't think this is violative.”

  • You say, “We agree.”

  • You announce that.

  • And then a day later you say, “Actually, we do have problems with this.”

  • Well, so what ... Okay.

  • So, we did announce it, and when we announced it, if you look carefully at the tweet, what

  • we actually said at the end is, “We're looking at other avenues.”

  • Mm-hmm.

  • That's because we actually have two separate processes.

  • One of which is like, “Is this content violative,” from just the purely community guidelines.

  • But then we also have monetization guidelines, and that's because we have a higher standard

  • for monetization.

  • We're doing business with this partner.

  • Our advertisers also have a certain expectation of what type of content they are running on.

  • And so, we had the first review.

  • We said, “It doesn't violate the community guidelines on harassment, but we'll take

  • a look at our harassment guidelines and commit to updating that.”

  • Which actually had been on our plan anyway.

  • I had actually put that in my creator letter that I had just done a few weeks ago, saying

  • we were going to take a hard look at it.

  • But we had been working so hard on the hate speech, and so our teams were caught up on

  • that.

  • But that really had been next on our list.

  • So, we have a higher standard for monetization, so then we did announce the monetization change.

  • That Steven Crowder's monetization was suspended.

  • So, was that in reaction to people reacting to you not reacting?

  • No.

  • Or was that something that you were already planning to do and just hadn't gotten around

  • to announcing?

  • No.

  • We were in the process of looking at that, and there were ... when we look at these accounts,

  • there are many different components that we look at, and that's actually why we put

  • the line, “There are other avenues that we're still looking at.”

  • And that might have been too subtle.

  • If I were to do it again, I would put it all into one ...

  • Do it in one go.

  • Yeah, I would do it all in one go.

  • But we were also ...

  • So you said, “We're not kicking you off, but we're not going to help you make money

  • on YouTube.

  • At least not directly, through ads.”

  • We're suspending monetization.

  • Meaning, “We're not going to run ads against your stuff.

  • If you still want to sell racist coffee mugs or whatever you're selling, that's your

  • business, but we're not going to help you.

  • We're not going to put an ad in front of your stuff.”

  • Well, we said we're not going to put an ad in front of it, but the conditions by which

  • we will turn it on can be broader than just that.

  • So, for example, if they're selling merchandise and linking off of YouTube, and that is seen

  • as racist or causing other problems, that's something that we will discuss with the creator.

  • So, one more question specific to this.

  • Because again, we're putting advertising there, so we need to make sure that the advertisers

  • are going to be okay with it, and we have a higher standard.

  • And so, we can sort of look at all different parts of that creator and what they're doing,

  • and basically apply that higher standard there.

  • So, people I work with at Vox and other people are saying the one problem we've got with

  • all this, in addition to what seems like a back and forth, is that we don't understand

  • why you made the decision you made.

  • There's not enough transparency.

  • We can't figure out what rules he did or didn't break.

  • And also, by the way, it seems clear that he did break these rules.

  • But they're asking for transparency, they're asking for more understanding of what went

  • on here in this specific case.

  • Is that something that's reasonable for someone to expect out of you and out of YouTube?

  • To say, “Here's exactly what happened.

  • Here's exactly what broke the rule for us.

  • Here's exactly why we're demonetizing it”?

  • Which case are you talking about?

  • Well, in the case of the Crowder/Maza stuff.

  • But for anything, right?

  • So, we tried to be really transparent.

  • We communicated numerous times, including publishing a blog explaining some of the rationale

  • for our decision.

  • We try to be really transparent with our community, with our guidelines.

  • We get that request actually a lot from our creators, because they want to know what's

  • allowed on our platform, what's not allowed, from a monetization standpoint.

  • And so, we do get the request for transparency, and we are working to continue to be more

  • transparent and explain why something is a violation of our policies or not.

  • So, you were talking earlier.

  • You said, “We have to take this very seriously, because if we make a ruling here, someone

  • else is going to say, 'Look at this rap video where they use this slang term.'”

  • Yes.

  • This, to me, seems like the actual issue you've got across YouTube, which is you're doing

  • this at scale, 2 billion users.

  • What, it's 500 hours of content uploaded every minute?

  • It seems like no matter what decision you make at any particular case, someone is always

  • going to come up and say, “What about this?”

  • Or they're going to say, “If this is the new rule, I'm going to figure out a way

  • to skirt around it.”

  • Or someone's going to say, “By the way, you're going to see content that you have

  • never contemplated showing up.”

  • It seems like at the scale you're working, on an open platform where anyone can put anything

  • up, that you guys are always going to be on this treadmill, and no matter how many humans

  • you throw at it and how much AI you train up, you can't actually solve this problem.

  • Do you have confidence that this is something you can actually get a handle on?

  • We can definitely do, and continue to improve, how we manage the platform.

  • I see how much improvement we've already made.

  • For example, if you just look a few years ago, two years ago, there were a lot of articles,

  • a lot of concerns about how we handle violent extremism.

  • If you talk to people today who are experts in this field, you can see that we've made

  • tremendous progress.

  • At the end of the day we're an information company.

  • We have access to Google, some of the algorithms there.

  • We have the resources to deploy.

  • Last year, we committed to having over 10,000 people who are working

  • on controversial content, so I see how much progress we have already made.

  • Like I mentioned, we've made all these different changes to our policy.

  • We actually have just made changes to our recommendation algorithms as well, for content that's not

  • violative, but borderline.

  • We announced that we've seen a 50 percent reduction in the views coming from recommendations

  • from that.

  • If you take the combination of much better policies, tighter policies, and we consult

  • with many third parties who try to make sure that we get them right, and we're hearing

  • from all parties, you combine that with technology to be able to do that at scale, I think you

  • can be in a much better place.

  • I'm not ...

  • So you can get better, but can you get it to the point where we're not seeing a story

  • about something awful happening within YouTube on a sort of weekly basis?

  • Can you get it where this thing is an exception to the rule?

  • I mean, I think there's always going to be ... at the scale that we're at, there

  • are always going to be people who want to write stories, but ...

  • Well, there's also people who want to put terrible things on your website.

  • Right?

  • You guys talk about the fact that you took down 8 million terrible pieces of content

  • in the last quarter.

  • Mm-hmm.

  • Right?

  • And that you're proud of that because you were able to ... what was it, 75 percent of it,

  • no human ever saw.

  • If I ran a business where people were dumping that much sludge onto my property on a quarterly

  • basis, I would really rethink what I'm doing.

  • It seems like, I mean, I just can't fathom why there's 8 million pieces of terrible

  • things coming onto your site on a quarterly basis, but that would really upset me and

  • worry me.

  • Well, it matters what the denominator is.

  • You gave the numerator.

  • Right?

  • We have a large denominator, meaning we have lots of content that's uploaded, and lots

  • of users, and lots of really good content.

  • When we look at it, all the news and the concerns and the stories have been about

  • this fractional 1 percent.

  • If you talk about the other 99-point-whatever that number is, that's all really valuable

  • content of people who are sharing valuable points of view that we haven't heard about,

  • educational content, addressing really important issues.

  • I think it's important to remember that and put that in perspective.

  • I say that not because we are not committed to solving the fractional 1 percent.

  • We are very committed, and I've been really clear that that responsibility is my No. 1

  • priority.

  • There is a lot of work for us to do.

  • I acknowledge that, but I also know that we have tremendous tools at our fingertips that

  • we can continue to invest in to do a better job.

  • So, yes, while there may be something that slips through or some issue, we're really

  • working hard to address this, and I think we have some good tools to do so.

  • What if YouTube wasn't open?

  • What if I couldn't upload a video that I wanted to without asking you for permission?

  • What would that do to YouTube?

  • What if you had some sort of barrier to entry that required me to get some kind of permission

  • to upload something before it went up there?

  • I think we would lose a lot of voices.

  • I don't think that's the right answer because we would lose a lot of voices and

  • a lot of people who share content.

  • Sometimes we hear from creators that they started sharing content, like they started

  • doing ...

  • I don't know, you look at Sal Khan.

  • Right?

  • He got started almost by accident, right, and now creates very valuable educational

  • content.

  • But what I think is the right answer that you're alluding to but not quite — I'm

  • going to

  • I haven't got there yet.

  • I'm going to improve a little bit your suggestion here, which is having more trusted tiers.

  • In a sense, we've already started doing that with monetization, saying, “Look, you

  • can't just come onto the platform and have monetization on day one.”

  • You have to have a certain number of views.

  • You have to have a certain number of hours.

  • You have to earn your way into getting ads.

  • You have to be in good standing.

  • That's an example of where we have more of a trusted relationship with them.

  • We did a similar thing with livestreams in terms of certain number of views that you

  • have to have ... sorry, subscribers that you need to have.

  • I think this idea of not everything is automatically given to you on day one, that it's more

  • of a — we have trusted tiers, and ...

  • But there's still, at bottom, you can put stuff on YouTube without asking for permission.

  • Yeah.

  • Everyone can start.

  • Yep.

  • Everyone can start.

  • Everyone can be their own media provider, but in order to get some of the broader distribution

  • or to have monetization, it's something that you work your way up to.

  • Is this insistence that the platform has to be open ... I can think of a business reason

  • for that.

  • I can think of legal reasons for that.

  • Right?

  • Section 230 gives you legal protections, but you have to not actually prevent someone from

  • uploading something.

  • I can think of ideological reasons why you want the platform to be open.

  • Which one is most important to you?

  • Sorry, I'm trying to remember them all.

  • You said business reasons?

  • Business, legal, ideology, but you can add as ... I mean ...

  • I mean, they all ...

  • I get that more voices is better, but if having unlimited voices has a problem that is always

  • going to trouble you ...

  • Look, it's a core issue.

  • If you want to limit it and say, “Hey, we're only going to have a select set of people,”

  • then what are the factors by which you're determining that?

  • How are you deciding who is getting to be on the platform and have speech, and who's

  • not?

  • But you are deciding, fundamentally, right?

  • You have teams, and you have software, and you weigh in.

  • You are fundamentally making the decision at some point.

  • We do, but that's after they've been on the platform, and we have an understanding

  • of what they're doing.

  • It's based on the content that people have uploaded.

  • Look, we see all these benefits of openness, but we also see that that needs to be married

  • with responsibility.

  • You do need to have more of this responsibility in place, and you need the different ways

  • to understand what is the content that you should be putting in recommendations, what

  • is the content that should be promoted for different use cases.

  • YouTube deals with a lot of different cases.

  • We deal with entertainment, music.

  • Why should we say only some people can ... new musicians or old musicians, why would we want to close

  • something like that?

  • But there is a world, right, where people make those decisions all the time.

  • Traditional media does this.

  • The New York Times or NBC or Vox knows what's on their website because they said, “Yes,

  • let's publish this.

  • Let's distribute this.”

  • You guys don't know.

  • That would, again, make me just fundamentally nervous to be running a platform with 2 billion

  • people on it and I don't know what's happening on it.

  • Well, we have a lot of tools.

  • We work hard to understand what is happening on it and really work hard to enforce the

  • work that we're doing.

  • I think that if you look across the work, I think you can see that we've made tremendous

  • progress in a number of these areas.

  • Again, I'm not saying we're done, but I think you can see that we've made progress.

  • If you were to fast-forward a couple years and say, “Well, what would that look like

  • in 12 months and then another 12 months?

  • What are all the different tools that have been built in place?”

  • I think you'll see that there'll be a lot of progress from that perspective.

  • Your boss, Sundar, described the problem you guys are dealing with in an interview this

  • week on Axios as a search problem.

  • Is that the correct analogy to think about the problems you guys are facing?

  • Well, I think what he — I don't want to interpret, necessarily, what he said, but

  • if you look at Google, Google has done ... Google was first founded on delivering information.

  • Right?

  • That's been the mission of the company, to organize the world's information.

  • You look at our first patent, it was about PageRank.

  • PageRank is actually about figuring out who is authoritative on the web.

  • Who are the people that you should bring up for any different query?

  • Being part of Google, for informational queries, is a really, really important part of how

  • we'll build some of those systems going forward.

  • Again, when you talk about music or you talk about cooking or you talk about DIY, these

  • other categories, you want to discover new people.

  • You want to discover those fresh ideas, and you want to see a variety of content, but

  • when you're dealing with really sensitive areas like news or medical, you want to get

  • those from authoritative sources.

  • I think there's an opportunity ... that's where we work with Google, and we learn a

  • lot from Google, and we think we can do a better job that way.

  • There's a lot of discussion about regulation and what kind of regulation makes sense to

  • use with tech companies, including yours.

  • What regulation would be helpful to you, and what regulation would you not want to see

  • at all?

  • Well, I agree there's definitely more regulation in store for us.

  • One of the things I've been working on, that we've been spending a lot of time on, has been

  • Article 13, which is the copyright directive in Europe, now named Article 17.

  • I think what we have seen is that a lot of times regulation is really well-intended.

  • People want to change society for the better, but it has all these unintended consequences.

  • What I would say is really important is for us to be able to work closely with these different

  • providers, the different governments, and be able to explain how we can really implement

  • it in a reasonable way, and how to make sure there aren't unintended consequences that they

  • didn't think about that actually might make the situation worse ... because

  • they're not running our businesses, they don't necessarily have that visibility.

  • There's a push to break up some of these companies.

  • What would happen if you were split off from Google?

  • I don't know.

  • I've been really busy this week working with all these other concerns.

  • I mean, I don't know.

  • I'll worry about it — I don't know.

  • I mean, we would figure it out.

  • That's a good answer.

  • One more specific video question.

  • BuzzFeed wrote about a creator with some popularity who'd made a series of problematic videos

  • — I hate that I just used the word problematic ... offensive videos.

  • One of them was a death threat aimed at you.

  • You took down that video and another video.

  • Her channel is still up, which brings me back to this sort of whack-a-mole where you guys

  • are sort of sifting through videos as they come to your attention and saying, “This

  • one's a problem.

  • This one is okay.

  • This one threatened to kill me, but she can still stay on the site.”

  • How do you feel about having ... so her name is Soph ... how do you feel about having her

  • on YouTube?

  • Well, like I mentioned beforehand, every single time there is a controversial video, you really

  • need to see the video, and you need to understand what's happening.

  • She has a large amount of political satire and political speech that's in her channel.

  • The video where she made the threat to me was struck, and we removed monetization from

  • her channel because, like I said, we have a higher bar for monetization.

  • That was an issue.

  • Look, I think there's certainly different levels of threats in the videos and different

  • ways that we would interpret it depending upon the context of who is the person and

  • what they're saying.

  • In many cases, we would terminate the channel if we thought it was a channel that was dedicated

  • to a number of, I don't know, hateful videos, harassment, etc.

  • But in this case, it was a large amount of political speech.

  • Seeing it in context, we struck the video, removed monetization, but left the rest of

  • the channel up.

  • There's been a lot of threats directed at you and other YouTube executives in the last

  • few days since you announced your policy.

  • I assume some of that, again, is ideological, but some of it is that you've got these

  • creators who are trying to make money on your platform.

  • I think that's a difference between what you do and what Google and Twitter do.

  • Generally, most people aren't making a living on Google or Twitter, but on YouTube people can legitimately

  • do that.

  • Then you had a YouTuber who was angry about being demonetized, I think, and came onto

  • your campus and killed someone a couple years ago.

  • What's that like to manage that teeming group of people that you don't really actually

  • manage?

  • Well, first of all it's difficult, but I will say there was something in your question

  • that I don't totally agree with.

  • I think it's really important to clarify.

  • You talked about, you made this association of creators who could be upset, or angry,

  • or difficult and then you alleged, right, that it ties to monetization.

  • Again, I want to ...

  • They'll often make that argument.

  • Well, I want to just be clear that, again, it's this 99 fractional 1 percent problem,

  • but I also want to point out that because we have a higher bar for monetization that

  • we're really ... this isn't a business problem from that perspective.

  • Right?

  • We're focused on having high-quality content available as part of our ecosystem, but we

  • also want to have a broad range because we wanted to keep it open and enable lots of

  • different points of view to be on the platform.

  • It's certainly ... I think, like any time that you have a bunch of creators

  • or people who are upset, it's difficult.

  • Certainly, this year, this week it was unfortunate.

  • We managed to upset everybody and we're working really hard to try to do the right

  • thing.

  • It was really in our effort to be consistent that we wound up upsetting people.

  • It's not an easy job.

  • It's a tough job, but I am encouraged by the fact that I hear so many good stories

  • of people who have been able to pursue their passion, start a career, have a business on

  • YouTube, talk about how they learned an instrument or learned something new they never thought

  • they could do beforehand.

  • And so it's really all the good, it's the 99 percent of good, valuable content that

  • I'm really encouraged by, that keeps me motivated and passionate about what I do.

  • I have more questions but I want to open it up to the audience because I think they're

  • going to have questions, too.

  • I assume there's a microphone, maybe two microphones, lights?

  • Can you introduce yourself?

  • Steve Kopack: Yeah, I'm Steve Kopack from CNBC.

  • I was wondering what goals and metrics Sundar and your other bosses give you that you're

  • evaluated on and how or if those have changed over the years.

  • Yeah, well, so you know, I think like any company we use multiple metrics to evaluate

  • our success.

  • We have sort of at the top of our pyramid quality and responsibility right now.

  • When we talk about quality we're talking about recommendation quality, search quality,

  • comment quality, etc.

  • We talk about responsibility.

  • We talk about rolling out changes to our recommendation

  • services to make sure that we're recommending

  • the right content to our users.

  • Those are the top-line metrics that we have and so we put that at the top of the pyramid

  • because it's really important to ...

  • Steve Kopack: Have those changed over the years?

  • Excuse me, what?

  • Steve Kopack: Has that pyramid changed over the years?

  • It definitely has.

  • It's certainly evolved over time.

  • There are lots of other metrics, to be fair, that we look at, like how many creators do

  • we have on our platform, revenue, etc., how are users engaging with our videos, right,

  • number of daily active users.

  • So yes, there are many other metrics that we use.

  • But we have been really focused on putting these at the top to be clear.

  • Steve Kopack: For a long time you guys were pushing for watch time, right?

  • You wanted to get to a billion hours.

  • In retrospect, was that emphasis on watch time a mistake?

  • We definitely ... yeah, we've always had lots of different metrics, to be clear, not

  • just one.

  • When we actually introduced the watch time metric, the goal was a quality one,

  • at the time, because what we saw is that views a lot of times caused click spam, or people

  • clicking on content that they didn't really want.

  • And we thought watch time was the best way to show that people were really engaged.

  • But we have evolved in many different ways.

  • We also started adding satisfaction into our recommendations, so after people watch a video,

  • were they satisfied with those videos?

  • That's an example of how our metrics have evolved.

  • Okay, we've got a lot of folks.

  • We'll try to go quick.

  • Okay.

  • Mark Mahaney: Susan, Mark Mahaney at RBC.

  • Two questions, you pick which one you want, the near term and the long term.

  • Near term, in the March quarter there's a real sharp deceleration in growth, ad revenue

  • growth at Google, some talk about product changes maybe that occurred at YouTube.

  • Do you have anything you can share, any light you can shed on that?

  • Then the other question is bigger and it's what Peter was asking you earlier, you've

  • been at Google since the beginning of the beginning.

  • Do you think the company has become so big that it really should be much more closely

  • regulated than it ever was in the past?

  • Google grew up against, in a way, one tech monopoly.

  • Do you think it's become another tech monopoly?

  • I'll take the second question, given that you've given the two.

  • I did just celebrate my 20-year anniversary at Google and it's pretty amazing to see

  • the way Google has evolved over time.

  • I can tell you from my vantage point at YouTube, if you look at all the people who are getting

  • into video, and big, well-resourced companies all getting into online video, this seems

  • like an incredibly competitive time.

  • I'm also often asked the question like, “What about X, Y, Z, all these other providers?”

  • I see that competitiveness.

  • I also see some of the advantages.

  • I mentioned beforehand, like if we are going to focus on some really hard computer science

  • information problems ...

  • I think that's what Sundar referred to, like being part of Google really helps us because

  • we're able to use different signals and technology that they have to be able to figure

  • out the right way to do our recommendations and rank them.

  • Peter Kafka: You need to be this big to solve the scale of the problem that you have because

  • you're so big?

  • Having enough scale does help us build systems that are world class to be able to address

  • this.

  • I do think there are some advantages of that and there are some technologies that we're

  • able to learn from Google and those techniques that really benefit us.

  • I do see that there are some real advantages that we're able to deliver a better product

  • because we have some of ... we run on shared infrastructure and there's some technology

  • that we can borrow or just even techniques that we can learn from on how to deliver the

  • right information.

  • We'll take two more questions real quick.

  • Nilay Patel: Hi, Nilay Patel from the Verge.

  • You're CEO of YouTube.

  • Do you think you've set a fair goal for your policy team to create a single, coherent

  • content moderation policy that covers beauty influencers, tech YouTubers, how-to channels,

  • and political comedians?

  • It feels like you've set them up to fail.

  • When we think about ... first of all, we have many policies, right?

  • One of the things I've learned from spending a lot of time on these different policies

  • is that there could be certain areas that need a different set of policies, like I'll

  • just say news and information is a whole category in and of itself.

  • This is actually why it hasn't been so simple to solve because we need to be able to solve

  • each of these different areas with a different set of experts, deep understanding, different

  • technology.

  • For example, in news we have misinformation cards.

  • We have authoritative ranking, breaking news shows, right, so we have a bunch of different

  • product solutions.

  • Then I think from a policy standpoint we also are going to have different policies as they

  • appear in different verticals.

  • Nilay Patel: Do your creators know that they fall under different policy regimes?

  • Well, people know that if they create ... look, there are some standards, like how we handle

  • harassment or how we handle hate that applies to everyone, but what I'm trying to say

  • is that if you're in the news category and you're producing misinformation we're

  • going to ... there could be a set of cases that really only relate to news.

  • I'm just saying that we're trying to understand how to cover everything from a policy standpoint.

  • But when we identify, “Look, there's a problem over here,” it's specific to this

  • area, specific to this industry, we're also going to look and say, “Well, what policy

  • change should we make that will really address this effectively?”

  • I'm going to change my policy.

  • I'm going to let two more because I want to have Ina go and I'm going to let Kevin

  • from the Times ask a question because he wrote a really good story this week.

  • Sure.

  • So, super quick, Ina?

  • Ina Fried: Yep, Ina Fried with Axios.

  • You started off with an apology to the LGBTQ community but then you also said that you

  • were involved and that you think YouTube made the right call.

  • A lot of people don't really feel like that's an apology and are concerned that YouTube

  • flags LGBT-positive content just for being LGBT as sometimes sensitive and yet slurs

  • are allowed.

  • I'm curious, are you really sorry for anything to the LGBTQ community or are you just sorry

  • that they were offended?

  • Yeah, so first of all, I'm really personally very sorry and it was not our intent.

  • Our goal was ... YouTube has always been a home of so many LGBTQ creators and that's

  • why it was so emotional and that's why I think this really ... that's why even though

  • it was a hard decision, it was made harder that it came from us because it has been such

  • an important home.

  • Even though we made this hard decision, which I'm going to get into a little bit

  • more with your questions, I'm saying we've really ... like, YouTube, we have so many people

  • on YouTube from the LGBTQ community and we've always wanted to support the community openly,

  • in spite of this hard issue that we've had right now.

  • I'm just saying people have gotten a lot of criticism like, “Why are you still ... why

  • did you change your logo to rainbows even though you made this hard decision?”

  • It's because as a company we really want to support this community.

  • It's just that from a policy standpoint we need to be consistent, because if we ... look,

  • if we took down that content there would be so many other ... so much other content that

  • we would need to take down.

  • We don't want to just be knee-jerk.

  • We need to think about it in a very thoughtful way and be able to speak with everyone.

  • We'll speak to people from the LGBTQ community, make sure that we're incorporating that

  • going forward in terms of how we think about harassment and then make sure that we are

  • implementing that in a fair and consistent way going forward.

  • I think that it was a hard week across the board and I am truly, truly sorry for the

  • hurt that we caused that community.

  • It was not our intention at all.

  • I do want to ... not that I want to take away in any way from the hurt or the challenge

  • of what happened this week, but I do want to let you know that many of the changes that we

  • made in the hate policy, which, again, is a different set of policies, we think will be

  • really beneficial for the LGBTQ community, and there are a lot of videos there and a

  • lot of ways that that community is attacked where we will be taking down those videos

  • going forward, and we will be very consistent.

  • If we see that we will take it down.

  • So again, thank you for your question and I really do apologize for the hurt that we

  • caused.

  • Kevin really does get the last question.

  • Okay.

  • Kevin.

  • Kevin Roose: Hi, the story that Peter mentioned was about radicalization and a 21-year-old

  • man who says that YouTube helped to radicalize him into the far right through his recommendations.

  • As part of that story I got to ask a lot of people about YouTube and radicalization but

  • I never got to ask you, so I wanted to ask you, do you think YouTube is having a radicalizing

  • effect on our politics?

  • Um, I mean, so, um ... We offer, as you know and you've researched, a broad range of

  • opinions.

  • We have looked in all of these different areas across the board and we see that we are offering

  • a diversity of opinions, right?

  • So when people go and they look up one topic, whether it's politics or religion or

  • knitting, we're going to offer a variety of other content associated with it, but we

  • have taken these radicalization concerns very seriously and that's why at the beginning

  • of January we introduced some new changes in terms of how we handle recommendations.

  • I know you're familiar with them.

  • You've referenced them, but what we do just for background for the other people is that

  • we basically have an understanding of what's content that's borderline and when that

  • content is borderline we determine that based on a series of different raters that we use

  • that are representative of different people across the US.

  • And when we determine that, we're able to build an understanding of what's considered

  • borderline content and then reduce our recommendations.

  • We have reduced by 50 percent the recommendations that we're making of borderline content

  • and we're planning to roll that out to all countries ... to 20 more countries the rest

  • of this year.

  • I think the combination of the changes that we're making of our policies as well as

  • the changes that we've made to our recommendations are going to make a really big difference.

  • Just to restate Kevin's question, do you think YouTube helps radicalize people, particularly

  • on the right?

  • I, I mean, I, it's, like, you know, you've done lots and lots of research on this.

  • Our view has been that we are offering a diverse set of content to our users and that users ...

  • we're providing that diverse set, and users will choose different types of content

  • for them to see, but we see that we're offering a diverse set over time.

  • All the same, we really want to be able to address it and that's why we've made the

  • changes that we have made.

  • We could do this double-length, but we don't have time.

  • Susan, thank you for coming.

  • Sure, thank you.

  • Thanks for taking our questions.

  • Thank you.
