
  • ♪ (music) ♪

  • Hi, everybody, and welcome to TensorFlow Meets.

  • It's my honor to be chatting with Karmel Allison, Engineering Manager

  • on TensorFlow today.

  • Now, Karmel, you do all of this outreach stuff,

  • as well as engineering management.

  • You've had these video series on YouTube,

  • you've done talks at conferences,

  • and I know you did a great talk at the TensorFlow Developer Summit.

  • Can you tell us a little bit about yourself, what you do,

  • and all this great stuff?

  • Yes, so as you mentioned, I'm an Engineering Manager for TensorFlow.

  • My team works on high-level APIs, so that's Estimator and Keras.

  • And my talk at the Dev Summit this year was about what we're bringing in 2.0

  • for high-level APIs, and specifically about Keras:

  • how it's becoming the primary high-level API

  • that we're consolidating a lot of what we have under,

  • how we're bringing the scale of Estimator into Keras,

  • and how we're going to be doing that in 2.0.

  • That's really interesting-- bringing the scale of Estimator into Keras.

  • I know there's going to be tons of questions about that.

  • One of the things I thought was really interesting,

  • was that you had this slide

  • where there was this "spot the difference."

  • You had something like training a DNN, I think it was on Fashion-MNIST, in 1.13,

  • and then, in 2.0, and there was no difference between them.

  • So, what's the real message behind that?

  • Yes, so François Chollet, the creator of Keras,

  • is really, really one of the champions of the user experience,

  • and he's done a great job of that with the Keras API thus far,

  • and we wanted to be able to keep that: to minimize, first of all, the overhead

  • of converting for people who are already using Keras,

  • but also just to retain that simple API as we move into 2.0.

  • At the same time,

  • we were able to bring everything we're bringing to the table in 2.0,

  • into that same model in Keras.

  • So, that same model works in Eager Mode and in Graph Mode,

  • so the same one in 1.13 would run in Graph Mode,

  • but in 2.0, it's going to run in Eager Mode,

  • which allows you to more easily debug, prototype, and all of that.

  • It also works with Distribution Strategies,

  • with Feature Columns.

  • There are all these different tools and pieces that we're bringing in 2.0,

  • and we wanted to make sure that same Keras API works with all of them,

  • just as it does now.
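
To make the "spot the difference" slide concrete, here is a minimal sketch of the kind of model it compared. The exact architecture and hyperparameters are assumptions, not from the talk; the point is that this same code runs under TF 1.13 (graph mode) and TF 2.0 (eager mode) unchanged.

```python
# A minimal sketch: a small dense network on Fashion-MNIST. The layer sizes
# and training settings are illustrative assumptions. The same code runs in
# graph mode under TF 1.13 and in eager mode under TF 2.0.
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.fashion_mnist.load_data()
x_train = x_train / 255.0  # scale pixel values to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5)
```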

  • I see, okay, cool.

  • Now, you've mentioned Distribution Strategy.

  • So one of the things is that it's not just about making the coding part

  • of the cycle easier,

  • but also the training part of the cycle, and being able to go big.

  • So could you tell us a little bit about what Distribution Strategy

  • is all about and how it works?

  • Yes, so Distribution Strategies is a set of strategies,

  • a set of ways to distribute your model.

  • There are a number of them, including MirroredStrategy,

  • which is distribution across multiple GPUs on the same machine.

  • There is also MultiWorkerMirroredStrategy

  • where you're mirroring across multiple machines,

  • all with their own devices.

  • And coming in the future, we've also got ParameterServerStrategy,

  • which is going to distribute asynchronously

  • across hundreds of nodes,

  • which is the kind of training we also do at Google.
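
As an editorial sketch of the strategies just listed: the code below shows how they are constructed in TF 2.0. The multi-worker cluster setup (the TF_CONFIG environment variable) is an assumption about deployment, not something specified in the interview.

```python
# A hedged sketch of choosing among the strategies described above.
import tensorflow as tf

# One machine, several GPUs: variables are mirrored and updated in sync.
strategy = tf.distribute.MirroredStrategy()

# Several machines, each with its own devices (alternative): each worker
# usually reads its cluster role from the TF_CONFIG environment variable.
# strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy()

# ParameterServerStrategy (asynchronous, across hundreds of nodes) was
# still on the roadmap at the time of this conversation.
```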

  • It's really exciting to build that in as a simple, flexible API

  • that works for DeepMind, for Google researchers,

  • but also for the people who are outside of the TensorFlow repository right now,

  • making it easy to use, but also really performant.

  • There's a lot of work that's gone in under the hood

  • to make sure that the scaling efficiency is really high

  • even though the code stays simple.
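
A hedged sketch of the "code stays simple" point: assuming the standard Keras workflow from the earlier illustrative model, distributing across local GPUs adds only the strategy and its scope.

```python
# Minimal sketch: distributing the earlier illustrative Keras model across
# local GPUs. Only the strategy construction and the scope are new.
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

# Variables created inside the scope are mirrored onto every replica.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# model.fit(...) then trains synchronously on all replicas, unchanged.
```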

  • Nice, nice. And now this becomes available

  • - to almost anybody who wants to use it. - Yes, we hope.

  • You don't have to be a Googler.

  • So now, I'm going to put you on the spot for a minute,

  • because this is all great new stuff that we've been talking about

  • in TensorFlow 2.0-- do you have a favorite?

  • I think one of the things I'm most excited about is tf.function.

  • So, this is some of the magic we're bringing in with Eager Execution,

  • where you can actually continue to use graph-style code

  • and get the performance of graph-style code,

  • even though you are in Eager Execution.

  • That's one thing.
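
As an illustrative sketch (the function and shapes below are hypothetical, not from the talk): decorating ordinary code with tf.function makes TensorFlow trace it into a graph, so it runs at graph speed while the surrounding program stays eager.

```python
# A minimal sketch of tf.function in TF 2.0; dense_step is a hypothetical
# example function, not an API from the interview.
import tensorflow as tf

@tf.function
def dense_step(x, w, b):
    # Traced into a graph the first time it's called with a new signature.
    return tf.nn.relu(tf.matmul(x, w) + b)

x = tf.random.normal([8, 4])
w = tf.Variable(tf.random.normal([4, 2]))
b = tf.Variable(tf.zeros([2]))
print(dense_step(x, w, b))  # executes the traced graph, returns a tensor
```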

  • Another is what I just mentioned, which is ParameterServerStrategy.

  • That's the way that we're going to be able to distribute

  • some of the largest workloads we have at Google, using Keras models.

  • And I know that there are a lot of researchers, internal to Google

  • and external, at some of the largest companies we work with,

  • and we're excited to be able to take the same model they prototype

  • and take it all the way to production, training, and serving,

  • using Distribution Strategies.

  • That's something I'm really excited about.

  • Nice. It's hard to pick just one favorite, right?

  • - Thank you so much, Karmel! - Thank you!

  • This has been, as always, very informative,

  • and if you've got any questions for me, or any questions for Karmel,

  • please, just leave them in the Comments below.

  • And whatever you do, don't forget to hit that Subscribe button,

  • so you'll be able to see the rest of Karmel's videos

  • right here, on YouTube.

  • Thank you.

  • ♪ (music) ♪
