♪ (music) ♪
Hi, everybody, and welcome to TensorFlow Meets.
It's my great honor today to be chatting with Megan Kacholia,
Engineering Director on TensorFlow, right?
So can you tell us a little bit more about what it is that you do?
Sure. I'm an Engineering Director on the TensorFlow team.
I've been with the team for a little over two-and-a-half years now.
I've gotten to participate in all of the Dev Summits
- which has been really exciting. - Nice.
Just to get to come, be part of the keynote, be part of the event.
In terms of what I actually do,
I'm one of the leads for the TensorFlow team.
I just help with, obviously, the team itself,
make sure we're going in the right direction,
and make sure that we're doing the right things by our community.
One of the things that you spoke about
that I saw that a lot of people are excited about
and I know I'm personally excited about
is just really how we're doubling down on making it very developer-friendly
- in TensorFlow. - Yes.
How did that come about? Can you tell us a bit more about it?
Obviously we've taken a lot of feedback from the community,
just to see what are the pain points,
what works well, what doesn't. I think that's a big part of it.
And just in general TensorFlow has been around--
We just had our third birthday in November, right?
So things change; the industry is really moving quickly,
machine learning is advancing in lots of different places
that we might not have anticipated when TensorFlow was first developed.
So we need to make sure that we're making it very, very easy
for folks to just come in, get started, and be able to take advantage
of all of the cool things happening
- in the machine learning space. - Right.
I really think it comes from kind of both of those angles,
just how things have moved so fast,
and just the feedback we've gotten from the community
as we've had TensorFlow out there for a few years.
I've been working on it for about a year and a half
and I came in with a developer background
and when I started trying to kick the tires on TensorFlow,
there were some things where it was like...
it just wasn't quite intuitive to me.
But a lot of that has been changing, right?
There's stuff like Keras
- and other things, - That's correct.
and the eager execution by default that you've been adding
just hopefully will make it a lot easier for other developers.
(Megan) That's correct. We want to make sure, obviously,
that the flexibility and the power of TensorFlow is there,
but also make sure it's approachable and easy for people to just come in
and be like, it's fine, start here on this surface,
and if you need to dive down to this part
and really get into the guts of it, you can, it's fantastic,
but you don't have to if you don't need to.
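The "start on the approachable surface, dive down into the guts if you need to" idea Megan describes can be sketched in TF 2.0-style Python. This is a minimal illustration, assuming the `tensorflow` package (2.x) is installed; the layer sizes and values are arbitrary examples, not anything from the conversation.

```python
import tensorflow as tf

# Eager execution is on by default in TF 2.x: ops run immediately,
# with no session or graph boilerplate.
x = tf.constant([[1.0, 2.0]])
print(tf.executing_eagerly())   # True in TF 2.x

# The approachable surface: a tiny Keras model, called like a function.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(1),
])
y = model(x)                    # immediate, eager result
print(y.shape)                  # (1, 1)

# Diving down into the guts: raw ops and explicit gradients
# via tf.GradientTape, when you need that level of control.
w = tf.Variable(tf.ones((2, 1)))
with tf.GradientTape() as tape:
    loss = tf.reduce_sum(tf.matmul(x, w))
grad = tape.gradient(loss, w)
print(grad.numpy().ravel())     # [1. 2.]
```

Both levels interoperate: a Keras model's forward pass can run inside a `GradientTape` just like the raw ops do, which is what lets you drop down only where you need to.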
(Laurence) Right. And all of this doesn't come
at the cost of performance, right?
Because you had that great slide with performance and...
Can you tell us a little bit about the performance improvements
that you've been seeing with TensorFlow?
(Megan) So for a lot of those improvements as well,
we're looking at just the core of TensorFlow, right,
what you'd think of as kind of the heart of it,
and that's where the majority of the improvements come,
meaning that those improvements will be applicable
whether we're talking about TensorFlow 2.0
or whether we're talking about using Keras or something else.
It's that big meaty part kind of under the hood.
And a lot of it comes
from just making better use of the different types of hardware,
making sure that we're using the different types of accelerators appropriately,
understanding the limitations and restrictions for things like mobile.
We talked about a lot of improvements for performance on TF Lite as well,
and again, some of it is just understanding
what do the [workflows] really look like,
what does it look like on some of these different devices
and Edge TPUs and things that we're trying TensorFlow out more with now,
and then kind of always going back and closing the loop and being like,
okay, this is how we can make it better and this is how we can make sure
that the users get that performance by default,
and don't have to necessarily know
what magic had to happen under the covers for it to happen,
they should just get it.
(Laurence) Right. Now, you mentioned mobile with TensorFlow Lite.
We lovingly call it TF Lite.
One of the things that I do get a lot of questions about
is that there are all of these different parts of the TensorFlow family,
and can you give us a quick summary of the different runtimes
that are available for developers,
between TensorFlow Lite and... what are the others?
(Megan) So there's TensorFlow Lite,
there are higher-level types of things
that we talked about with TensorFlow Extended,
so that's not necessarily a different runtime
but it's more kind of just how you put things together end to end,
especially if you're looking more towards a production environment.
We talked a bit about JavaScript as well, so TensorFlow.js,
and I think that one, it's a language-type thing,
because you want to make sure that the JavaScript community--
it's such a large community-- that they have access
to machine learning as well,
but there's also just the whole in-browser experience
that it plugs into well, and the server side because of Node.js.
So I think a lot of it depends on what kind of applications
people are interested in
and kind of where they're coming at machine learning from.
There are so many different ways of applying machine learning,
whether you're thinking of it like, "Oh, I work in a large company,
so I need it for these enterprise use cases,"
or "I'm just an app developer, I'm trying things out myself,"
or "I'm someone who's really familiar with this language,
I'm really familiar with JavaScript,
so that means I can use that as kind of my entryway
to start doing other things in machine learning."
So some of it depends on where you're at,
where you're coming from,
and then we try and make sure we have the right way for you
to get into the machine learning community.
The JavaScript stuff is amazing in the fact
that you can actually train models in the browser.
(Megan) Yes.
When I first heard about that, I was like "Nah, come on, really?"
- But you actually can do that, so... - (Megan) Yes, it's really cool.
And as you mentioned, with Node you're not just limited to the browser,
you can also run it on backend servers and that kind of stuff.
One of my personal favorites actually, is with things like Cloud Functions
or Cloud Functions for Firebase because they run Node.js,
you can actually start putting models in there.
Now, TensorFlow 2.0 is currently an alpha release, right?
Now, lots of people are asking what comes next,
when are we going to see a release candidate or something like that?
Yeah, so like we talked about at the Dev Summit,
we're going to have our release candidate kind of coming up sometime in Q2.
We want to make sure that we're giving ourselves
enough time to make sure we have good performance,
make sure we have the right fit, finish, and polish for everything,
but there is, like you said, an alpha available.
We want feedback from the community.
We want to understand what are people enjoying,
what's working well, what are they concerned about.
And you can also follow along on GitHub
and kind of see what features are coming out,
what status are things at.
That way the community knows what's going on
and we can make sure that we're engaged appropriately
to be building the right things
as we finalize and finish up with the 2.0 release.
Sounds perfect. Thank you!
So people can stay tuned to GitHub and keep track of everything
- that's going on. - (Megan) Yeah, that's correct.
Thank you so much, Megan.
And thanks, everybody, for watching this episode of TensorFlow Meets.
If you've got any questions for me or any questions for Megan,
just please leave them in the comments below,
and check out the GitHub page for TensorFlow
if you want to learn more about the TensorFlow 2.0 release.
Thank you so much.
♪ (music) ♪