
  • [UPBEAT MUSIC PLAYING]

  • [MUSIC PLAYING]

  • TODD KERPELMAN: Hey there, I/O Live.

  • Todd Kerpelman here, and I am here

  • with Jen Tong, who is apparently using the power of the cloud

  • to find new planets.

  • No big deal.

  • JEN TONG: Yeah, why not?

  • TODD KERPELMAN: Sure.

  • So, Jen, what is going on here at this booth?

  • JEN TONG: So behind us, we see Project PANOPTES,

  • which is an open source project that includes open source

  • hardware that is composed entirely

  • of commercial, off-the-shelf components.

  • And it is intended to be low-cost enough

  • for educational institutions and hobby

  • astronomers to build these robotic telescopes

  • and contribute to the project.

  • TODD KERPELMAN: And so what happens once they

  • build these robotic telescopes?

  • How are they discovering stars--

  • or planets, I should say?

  • JEN TONG: So the way they discover

  • planets around other stars is by combining their efforts using

  • the cloud.

  • So what happens is, each night, the telescopes

  • will all wake up and look at the sky.

  • And they'll get a bunch of great images of the sky.

  • And then, when the day comes, they'll

  • go to sleep-- but not really.

  • They're going to start uploading all their data to Cloud

  • Platform.

  • And from there, we can combine all the data together,

  • aggregate it, and then, from that,

  • we can infer the existence of planets around other stars.
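
The upload-then-aggregate workflow Jen describes can be sketched in a few lines. The snippet below is only an illustration, not the actual PANOPTES pipeline: it assumes each telescope's nightly data has already been pulled down as CSV files with hypothetical star_id, timestamp, and flux columns, and merges every telescope's samples into one light curve per star.

```python
# Minimal aggregation sketch (assumed CSV layout, not the real PANOPTES format).
import glob

import pandas as pd

# Each file holds one telescope's measurements for one night.
frames = [pd.read_csv(path) for path in glob.glob("downloaded/*.csv")]
observations = pd.concat(frames, ignore_index=True)

# Combine every telescope's samples into a single light curve per star.
light_curves = observations.sort_values("timestamp").groupby("star_id")

for star_id, curve in light_curves:
    print(star_id, len(curve), "samples")
```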

  • TODD KERPELMAN: And how do you infer the existence of planets?

  • JEN TONG: So planets are very hard to see directly,

  • because their stars are very bright,

  • and there's lots of glare, and they're very far away.

  • In fact, a star only looks like one pixel on the camera.

  • So we have to use some trickery to do that.

  • So instead of looking for the planet directly,

  • we look for a dimming of the star

  • when the planet moves between us and the star,

  • kind of like an eclipse.

  • Or we call it a transit in the more general case.
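
To make the transit idea concrete, here is a toy sketch rather than real survey code: it folds a simulated light curve at a trial orbital period and reports how much dimmer the dimmest phase is than the baseline. A dip of a fraction of a percent that repeats at a fixed period is the transit signature; real searches use more careful statistics (for example, box least squares).

```python
# Toy transit detection: fold a light curve at a trial period and look for
# a phase that is systematically dimmer than the rest. Thresholds and the
# injected signal are made up for illustration.
import numpy as np

def transit_depth(times, fluxes, period, window=0.02):
    """Return the fractional dimming at the most-dimmed phase for a trial period."""
    phase = (times % period) / period          # fold onto [0, 1)
    baseline = np.median(fluxes)
    depths = []
    for center in np.arange(0.0, 1.0, window):
        in_window = np.abs(phase - center) < window / 2
        if in_window.sum() >= 5:               # need a few samples to trust the bin
            depths.append(1.0 - np.median(fluxes[in_window]) / baseline)
    return max(depths) if depths else 0.0

# Example: a 1% dip repeating every 3.5 days stands out against the noise.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 60, 5000))
f = 1.0 + rng.normal(0, 0.002, t.size)
f[(t % 3.5) < 0.1] -= 0.01                    # inject a fake transit
print(transit_depth(t, f, period=3.5))        # prints roughly 0.01
```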

  • TODD KERPELMAN: And so the general idea

  • is you've got hundreds of telescopes, all

  • around the world, taking pictures of the night sky.

  • They send all those images up to Google Cloud, which

  • analyzes them all and looks for a star that

  • dims on a regular enough basis that you

  • think it must be because there's a planet passing

  • in front of it.

  • JEN TONG: That is exactly correct.

  • So we're able to infer that just by having

  • a whole bunch of samples from a very large fleet

  • of telescopes.

  • TODD KERPELMAN: Wow, that's very interesting.

  • Now, if I remember from being three years old,

  • stars do twinkle.

  • How can you tell that a star is dimming because

  • of a planet passing in front of it versus normal star

  • twinkling?

  • Which is totally a technical term.

  • JEN TONG: Totally technical.

  • And that's a great question, because that's

  • part of the stuff that makes PANOPTES special.

  • Because we're using commercial, off-the-shelf cameras instead

  • of specialized astronomy sensors,

  • we have to compensate for the fact

  • that those cameras are designed to take color photos.

  • Because when a camera takes color photos,

  • it filters some of the light out using a thing

  • called a Bayer filter.

  • And when the star twinkles, it moves around that Bayer filter.

  • And it makes it much harder to count the number of photons.

  • Because we don't know how many are getting

  • filtered out by the Bayer filter.

  • So the way we compensate for that is we

  • look for another star, in the same picture,

  • that has the exact same amount of twinkle.

  • And from that, we can do a relative brightness

  • measurement, because we know how bright those stars should be,

  • because we can identify the star.

  • And that's how we kind of cancel out the twinkle.
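
The "cancel out the twinkle" step is differential photometry. The numbers below are invented purely for illustration: because the target star and a comparison star in the same frame look through the same atmosphere, dividing one by the other removes most of the shared variation, and what remains is the kind of small, repeatable dip a transit produces.

```python
# Minimal differential-photometry sketch with made-up counts per exposure.
import numpy as np

target_counts = np.array([10450.0, 9980.0, 10510.0, 10020.0])       # target star
comparison_counts = np.array([20900.0, 19950.0, 21000.0, 20100.0])  # same frames

# Shared atmospheric variation largely cancels in the ratio.
relative_flux = target_counts / comparison_counts

# Normalize so an undimmed target sits at 1.0; a transit would show up as a
# repeatable dip of a fraction of a percent below that level.
relative_flux /= np.median(relative_flux)
print(relative_flux)
```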

  • TODD KERPELMAN: And I'm contractually obligated to ask,

  • what awesome Google Cloud Platform features are you using

  • to power Project PANOPTES?

  • JEN TONG: PANOPTES definitely kind of illustrates

  • that boring uses of the cloud can enable really cool stuff.

  • So we are using some simple stuff.

  • We're using some of the simple security controls.

  • We're putting a service account on each one of the devices,

  • so we can control which telescope

  • can access which part of our storage buckets.

  • And then we're using Google Cloud Storage

  • to store all of that data.

  • And then we're aggregating it on Google Compute Engine.
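
As a rough sketch of that storage path, here is how a per-telescope upload to Cloud Storage might look using the google-cloud-storage client library. The bucket name, object prefix, key file, and file names are placeholders rather than PANOPTES's real layout; the point is that each telescope's service account can be restricted to its own prefix.

```python
# Hypothetical per-telescope upload using the google-cloud-storage library.
from google.cloud import storage

# Each telescope authenticates with its own service-account key.
client = storage.Client.from_service_account_json("telescope-key.json")
bucket = client.bucket("example-panoptes-images")   # placeholder bucket name

def upload_night(image_paths, telescope_id, night):
    """Upload one night's images under this telescope's own prefix."""
    for path in image_paths:
        blob = bucket.blob(f"{telescope_id}/{night}/{path.split('/')[-1]}")
        blob.upload_from_filename(path)

upload_night(["obs/img_0001.cr2"], telescope_id="PAN001", night="2017-05-17")
```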

  • TODD KERPELMAN: And so the general idea

  • is anybody can get involved.

  • They can build their own telescope.

  • And they can be part of this project.

  • And then, if they find a star, they

  • can name it after themselves.

  • Is that basically it?

  • JEN TONG: Well, naming stars is a more complicated issue.

  • And individual PANOPTES telescopes

  • don't actually discover a star themselves.

  • It's kind of like a team that collaboratively accomplishes

  • the goal.

  • But yeah, anybody can get involved.

  • We especially like to work with hobbyist astronomers

  • and educational institutions, because we

  • want to kind of inspire a love of astronomy in the youth

  • around the world.

  • TODD KERPELMAN: That's good.

  • And how would I get started if I wanted to?

  • JEN TONG: So I encourage you to go check

  • out projectpanoptes.org.

  • TODD KERPELMAN: All right, so you heard it here.

  • Go to projectpanoptes.org, and you too

  • can name a star after you.

  • Jen totally promised that you can do that.

  • I'm here with Sara Robinson.

  • Sara, what is this that we're looking at here?

  • SARA ROBINSON: We're looking at a demo of our Cloud Machine

  • Learning APIs, specifically highlighting

  • our Video API, Speech API, and Translation API.

  • So what these let you do is they let

  • you access a pre-trained machine learning model,

  • with a single REST API request, so you don't

  • need to know anything about how machine learning works

  • to use them.
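
To give a feel for that "single REST API request", the sketch below asks the Video Intelligence API for label detection on a video already sitting in Cloud Storage. The bucket URI and access token are placeholders; the call returns a long-running operation that you poll for the finished annotations.

```python
# Sketch of one REST request to the Video Intelligence API (placeholder URI/token).
import requests

ACCESS_TOKEN = "YOUR_OAUTH_ACCESS_TOKEN"   # e.g. obtained via gcloud auth
body = {
    "inputUri": "gs://example-bucket/trips-ad.mp4",   # hypothetical video location
    "features": ["LABEL_DETECTION"],
}
resp = requests.post(
    "https://videointelligence.googleapis.com/v1/videos:annotate",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=body,
)
print(resp.json())   # contains an operation name to poll for the results
```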

  • TODD KERPELMAN: Awesome.

  • And what is this game that we have set up?

  • SARA ROBINSON: So we're going to see how

  • we compare to our Video API.

  • Shall we take a look?

  • TODD KERPELMAN: You mean like me against the computer?

  • SARA ROBINSON: You against our pre-trained model.

  • TODD KERPELMAN: All right, well, let's see how it goes.

  • SARA ROBINSON: Are you up for the challenge?

  • TODD KERPELMAN: I think so.

  • What are we going to do in this game?

  • SARA ROBINSON: So what we're going to do

  • is we're going to play a video.

  • And I'm going to have you try to annotate the video

  • as it's going.

  • And then, when it's done, we'll compare

  • the annotations you came up with, with our Video API.

  • TODD KERPELMAN: All right, I'm feeling confident.

  • Let's see if I can beat this computer.

  • SARA ROBINSON: All right.

  • TODD KERPELMAN: All right, here we go.

  • SARA ROBINSON: So let's take a look at this video.

  • And we're going to play against the APIs.

  • We're going to play against the Video API.

  • So when I hit Play, type what you see.

  • [BACKGROUND CROWD SOUNDS]

  • TODD KERPELMAN: Keys.

  • Book.

  • Keys.

  • A door, books, plane-- my typing is terrible.

  • Woman, room, lost, suitcase, goat.

  • She's still lost.

  • Wait, no, now she has Google Trips.

  • SARA ROBINSON: So we can skip to the end of the video.

  • TODD KERPELMAN: But wait-- aw, OK.

  • SARA ROBINSON: We won't play the whole thing.

  • But we saw about half of the video.

  • So we can see that you found 13 items.

  • The Video API found 89 items.

  • But it was a valiant effort.

  • TODD KERPELMAN: Once again, beaten by a machine.

  • SARA ROBINSON: Do you want to take a look at the API response

  • in a little bit more detail?

  • TODD KERPELMAN: Yeah, let's take a look at what we got here.

  • SARA ROBINSON: So this is everything

  • that the Video API found.

  • So we can actually skip to the points in the video

  • where it found these things.

  • And then, if we want to look at the JSON response from the API,

  • we can do that here.

  • So we can see all of the different entities it picked up

  • in the video.

  • And we can also take a look at what the Speech API transcribed

  • from this video.

  • So this is an entire transcription

  • of the audio from that video using our Speech-to-Text API.

  • And we can even use this to skip to various points in the video.
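
For readers curious what that JSON looks like, the sketch below walks a tiny hand-written stand-in shaped like a v1 Video Intelligence response; treat the exact field names as an assumption and check the API reference. A real response comes back from polling the annotate operation shown earlier.

```python
# Hand-written stand-in for a Video Intelligence response (shape assumed).
result = {
    "annotationResults": [{
        "segmentLabelAnnotations": [{
            "entity": {"description": "suitcase"},
            "segments": [{"segment": {"startTimeOffset": "12.400s",
                                      "endTimeOffset": "15.000s"}}],
        }],
        "speechTranscriptions": [{
            "alternatives": [{"transcript": "planning a trip can be stressful"}],
        }],
    }]
}

annotation = result["annotationResults"][0]

# Entities the Video API picked up, with where they occur in the video.
for label in annotation.get("segmentLabelAnnotations", []):
    starts = [seg["segment"]["startTimeOffset"] for seg in label["segments"]]
    print(label["entity"]["description"], starts)

# The transcript, available when SPEECH_TRANSCRIPTION is also requested.
for transcription in annotation.get("speechTranscriptions", []):
    print(transcription["alternatives"][0]["transcript"])
```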

  • And then finally, I'll show our Translation API.

  • So let's say you're a user somewhere else in the world,

  • and you want to translate a video into your own language.

  • You can go over here.

  • And we can try it out in French, for example.

  • You can also translate the Video API entities there.
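
Behind that part of the demo is the Cloud Translation API. A minimal sketch of translating a few detected entities into French through the v2 REST endpoint might look like this, with the API key as a placeholder.

```python
# Sketch of a Cloud Translation API (v2 REST) call; API key is a placeholder.
import requests

API_KEY = "YOUR_API_KEY"
resp = requests.post(
    "https://translation.googleapis.com/language/translate/v2",
    params={"key": API_KEY},
    json={"q": ["suitcase", "airport", "passport"], "target": "fr"},
)
for item in resp.json()["data"]["translations"]:
    print(item["translatedText"])
```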

  • So that's an overview of our Machine Learning API demo.

  • TODD KERPELMAN: Hey, did you like this video?

  • Want to see more like it?

  • Head on over to g.co/io/guide to see all of our I/O guide

  • videos.

  • Come on, let's go.

  • [UPBEAT MUSIC PLAYING]
