>> SUNDAR PICHAI: Good morning everyone.
Thank you for joining us.
As we were preparing for this event, we were all devastated by
the news coming out of Las Vegas as I'm sure all of you were.
And that is coming off of a challenging past few weeks with
hurricanes Harvey, Irma, and Maria and other events around
the world.
It's been hard to see the suffering but I have been moved
and inspired by everyday heroism: people opening up their homes, and first responders literally risking their lives to save other people.
Our hearts and prayers are with the victims and families
impacted by these terrible events.
We are working closely with many relief agencies in affected
areas and we are committed to doing our part.
It is a true privilege to be at the SFJAZZ Center.
It's a great American institution for jazz performance
and education and it is really good to see familiar faces in
the audience.
As always, I want to give a shout-out to the people joining us on the livestream from around the world.
Since last year and since Google I/O, we've been working hard,
continuing our shift from a mobile first to an AI
first world.
We are rethinking all of our core products and working hard
to solve user problems by applying machine learning
and AI.
Let me give you an example.
Recently, I visited Lagos in Nigeria.
It is a city of twenty-one million people.
It is an incredibly dynamic, vibrant, and ever growing city.
Many people are coming online for the first time.
So it's very exciting, unless you happen to be on the Google Maps team and you have to map this city, because it is changing so fast. Normally we map a place using Street View and a lot of automation, but it's difficult to do that in a place like Lagos because the city is changing so quickly.
You can't always see the signage clearly and there are variable
address conventions.
Things aren't sequential.
So for example, take that house there.
If you squint hard, you can see the street number there.
It is number three to the left of the gate.
That was relatively easy.
Onto a harder problem now.
That house, that is what we see from Street View.
I think as humans, it's probably pretty hard.
Maybe one or two of you can spot it.
But our computer vision systems, thanks to machine learning, can
pick it out, identify the street number, and start
mapping, mapping the house.
So we approach Lagos completely differently.
We deployed machine learning from the ground up and just in
five months, the team was able to map five thousand kilometers
of new roads, fifty thousand new addresses, and a hundred
thousand businesses.
It's something which makes a real difference for millions of
users there as Google Maps is popular.
And we think this approach is broadly applicable.
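For illustration, here is a minimal sketch of one small piece of a pipeline like this: cropping an image region believed to contain a house number and reading the digits with off-the-shelf OCR. This is a simplified stand-in, not Google's actual system; the crop filename is hypothetical and the use of Tesseract is an assumption.

```python
# Minimal sketch: read a street number from a cropped facade image with OCR.
# A simplified stand-in for a learned digit detector, not Google's pipeline.
# Requires: pillow, pytesseract, and the Tesseract binary installed.
from PIL import Image
import pytesseract

def read_street_number(crop_path: str) -> str:
    """Return the digits found in a crop assumed to contain a house number."""
    img = Image.open(crop_path).convert("L")  # grayscale tends to help OCR
    text = pytesseract.image_to_string(
        img,
        config="--psm 7 -c tessedit_char_whitelist=0123456789",  # one line, digits only
    )
    return text.strip()

if __name__ == "__main__":
    print(read_street_number("house_number_crop.jpg"))  # hypothetical Street View crop
```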
Let's come closer to home: parking in San Francisco. I don't even try it anymore, but for those of you who do, we again use machine learning.
We understand location data.
We try to understand patterns.
Are cars circling around?
And the color shows the density of parking and we can analyze it
throughout the day and predict parking difficulty and in Google
Maps, give you options.
A simple example but it's the kind of everyday use case for
which we are using machine learning to make a difference.
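To make the idea concrete, here is a toy sketch of predicting parking difficulty from aggregated driving patterns such as circling. The features, labels, and data are synthetic assumptions, not Google's actual signals.

```python
# Toy sketch: predict parking difficulty (0 = easy, 1 = medium, 2 = hard)
# from simple aggregate features. Synthetic data stands in for real signals.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
hour = rng.integers(0, 24, n)                    # hour of day
circling = rng.poisson(2 + (hour > 17) * 3, n)   # laps around the block before parking
dwell = rng.normal(5 + circling, 1.5, n)         # minutes spent searching (label source)

X = np.column_stack([hour, circling])
y = np.digitize(dwell, [6, 9])                   # bucket search time into difficulty

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000)
model.fit(X_tr, y_tr)
print("held-out accuracy:", round(model.score(X_te, y_te), 2))
```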
The best example I can think of, which we have talked about before, is Google Translate. I literally remember many years ago adding translation in Chrome and making it automatic so that if you land on a page in a language different from your own, we translate it for you.
Fast forward to today.
With the power of machine learning and our neural machine
translation, we serve over two billion translations in many,
many languages every single day.
To me, it shows the power of staying at a problem, constantly
using computer science to make it better, and seeing users
respond to it at scale.
This is why we are excited about the shift from a mobile first to
an AI first world.
It is not just about applying machine learning in our products
but it's radically rethinking how computing should work.
At a higher level in an AI first world, I believe computers
should adapt to how people live their lives rather than people
having to adapt to computers.
And so we think about four core attributes as part of
this experience.
First, people should be able to interact with computing in a
natural and seamless way.
Mobile took us a step in this direction with multi-touch but
increasingly, it needs to be conversational, sensory.
We need to be able to use our voice, gestures, and vision to
make the experience much more seamless.
Second, it is going to be ambient.
Computing is going to evolve beyond the phone, be there in
many screens around you when you need it, working for you.
Third, we think it needs to be thoughtfully contextual.
Mobile gave us limited context.
You know, with your identity and your location, we were able to improve the experience significantly.
In an AI first world, we can have a lot more context and
apply it thoughtfully.
For example, if you're into fitness and you land in a new
city, we can suggest running routes, maybe gyms nearby, and
healthy eating options.
In my case being a vegetarian and having a weakness for
desserts, maybe suggest the right restaurants for me.
Finally and probably the most important of all, computing
needs to learn and adapt constantly over time.
It just doesn't work that way today.
In mobile, developers write software and constantly ship updates, but let me give you a small example.
I use Google Calendar all the time.
On Sundays, I try to get a weekly view of how my week looks, but once the work week starts, say on a Monday or a Tuesday, I'm trying to get a view into what the next few hours look like.
I have to constantly toggle back and forth.
Google Calendar should automatically understand my
context and show me the right view.
It's a very simple example but software needs to fundamentally
change how it works.
It needs to learn and adapt and that applies to important things
like security and privacy as well.
Today, a lot of us deal with security and privacy by putting
the onus back on users.
We give them many settings and toggles to manage those.
But in an AI first world, we can learn and adapt and do it
thoughtfully for our users.
For example, if it is a notification for your doctor's
appointment, we need to treat it sensitively and differently than
just telling you when you need to start driving to work.
So we are really excited by the shift and that is why we are
here today.
We have been working on software and hardware together because
that is the best way to drive the shifts in computing forward.
But we think we are at a unique moment in time where the combination of AI, software, and hardware lets us bring a different perspective to solving problems for users.
We are very confident about our approach here because we are at
the forefront of driving the shifts with AI.
Three months ago at Google I/O, our Google AI teams announced a
new approach called AutoML.
AutoML is just our machines automatically generating machine learning models. Today, these models are handcrafted by machine learning scientists, and literally only a few thousand scientists around the world can do this: design the number of layers, and weight and connect the neurons appropriately.
It's very hard to do.
We want to democratize this.
We want to bring this to more people.
We want to enable hundreds of thousands of developers to be
able to do it.
So we have been working on this technology called AutoML and
just in the past month for a standard task like image
classification, understanding images, our AutoML models are
now not only more accurate than the best human generated models,
but they are more resource efficient.
So it is pretty amazing to see.
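As a deliberately tiny illustration of "machines generating models": randomly sample small architectures, train each briefly, and keep the best by validation accuracy. Google's AutoML uses learned controllers and far larger search spaces; this random-search sketch, on a toy dataset with an assumed search budget, is only meant to show the shape of the idea.

```python
# Toy illustration of automated model search: randomly sample small MLP
# architectures and keep the one with the best validation accuracy.
# (A simplification; AutoML itself uses a learned controller, not random search.)
import random
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

best_score, best_arch = 0.0, None
for _ in range(10):                                   # tiny search budget (assumed)
    layers = random.randint(1, 3)
    width = random.choice([16, 32, 64, 128])
    arch = tuple([width] * layers)
    model = MLPClassifier(hidden_layer_sizes=arch, max_iter=300, random_state=0)
    model.fit(X_tr, y_tr)
    score = model.score(X_val, y_val)
    if score > best_score:
        best_score, best_arch = score, arch

print("best architecture:", best_arch, "val accuracy:", round(best_score, 3))
```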
We are now taking it a step further.
Let me talk about another use case, object detection.
When we say object detection, it's just a fancy name for
computers trying to delineate and understand images, being
able to draw bounding boxes and distinguish between all of the
vehicles there, scooters, mopeds, motorcycles, and even
pick out the bike in front.
It has a lot of practical use cases.
The Street View example for Lagos works based on
object detection.
Google Lens, which you will hear about later, as well as our
photography in Pixel, uses object detection.
This is really hard to do.
The best human generated models we have only have a 39% accuracy
but our AutoML models as of the past couple of weeks have
reached around 43% accuracy and they are constantly
getting better.
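For readers who want to see object detection in practice, here is a minimal sketch using an off-the-shelf pretrained detector from torchvision. The image filename and score threshold are assumptions, and this is of course a generic public model, not the AutoML-generated detector described above.

```python
# Minimal object detection sketch with a pretrained torchvision model:
# print class names, bounding boxes, and scores above a confidence threshold.
import torch
from PIL import Image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)
from torchvision.transforms.functional import to_tensor

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights)
model.eval()

img = to_tensor(Image.open("street_scene.jpg").convert("RGB"))  # hypothetical image
with torch.no_grad():
    out = model([img])[0]

categories = weights.meta["categories"]
for box, label, score in zip(out["boxes"], out["labels"], out["scores"]):
    if score > 0.6:  # assumed threshold
        print(categories[label.item()],
              [round(v) for v in box.tolist()],
              round(score.item(), 2))
```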
So the rate at which we are seeing progress with AI is
amazing which is why we are really excited about combining
it with our software and hardware to bring it together
for our users.
Let me give you a concrete example.
I was recently very inspired by this tweet from a journalist who
was attending the Little League World Series.
As you can see, there are two Little Leaguers here, one from
the Dominican Republic and the other from South Dakota, and they are
talking to each other using Google Translate.
It's a great example but I looked at it and I feel like we
could do a whole lot better.
Computing needs to evolve where this happens in a more natural,
seamless, and a conversational way.
Later today you will get an early glimpse of how we can push
this experience further by thinking about AI, software, and
hardware together.
I couldn't be more excited by the direction our hardware team
is taking in approaching their work here.
So to give you a lot more insight let me welcome Rick onto
the stage. Thank you.
>> RICK OSTERLOH: Hi everyone.
I want to welcome everyone here today, Google fans, the media.
In the front row we have some of our top sales reps from Verizon
and other partners.
It's great to have you all here.
Yes. Welcome. Welcome.
We've got a lot to show you today, including a
few surprises.
So I thought I would quickly set the scene and explain what we've
been working toward and just how it ties into the vision that
Sundar outlined.
A year ago, we stood on a slightly smaller stage and introduced the world to Google Pixel, Google Home, Google Wi-Fi, Chromecast Ultra, and Daydream View.
This was the first generation of hardware made by Google.
So much has happened since then and we have made tremendous
progress in our first year.
For starters, our team is getting a lot bigger.
We're bringing on two thousand very talented engineers from HTC, along with some important IP.
It's an amazing team that has created a series of industry
first innovations and by working more closely together, we'll be
able to better integrate Google hardware and software.
And our products have built up a lot of momentum going into our
second year.
Let's take a look.
>> SPEAKER: Okay Google.
>> SPEAKER: Okay Google.
>> SPEAKER: Okay Google.
>> TY BURRELL: Okay Google.
Play the Soundtrack to Spaghetti for Pepe.
>> SPEAKER: Google set to hold a product event in San Francisco.
There are a ton of new announcements.
>> SPEAKER: Google is showing off new hardware.
>> SPEAKER: Speaking German
>> SPEAKER: It's called Google Home, the new way to make your
home a little bit smarter.
>> JIMMY KIMMEL: Google debuted a new virtual reality headset.
>> SPEAKER: Get the Daydream View.
>> SPEAKER: I actually feel like I'm there.
>> SPEAKER: It's sick as hell; I can tell you that much.
The Google Pixel.
>> SPEAKER: Pixel is my new favorite camera.
>> SPEAKER: Okay Google. Take a selfie.
>> SPEAKER: Oh, that camera.
The pictures that come out of this thing are incredible.
>> SPEAKER: Google's new Chromecast Ultra.
>> SPEAKER: Four stars and editor's choice.
>> TY BURRELL: Okay Google.
Show me a video of a kangaroo. What?
>> SPEAKER: The camera is phenomenal.
>> SPEAKER: The best ever in a smart phone.
>> SPEAKER: Beautiful.
>> SPEAKER: Tops in my book.
There, I said it.
>> SPEAKER: My mom now can save every picture, unlimited space.
>> SPEAKER: Mama got a new toy.
>> SPEAKER: It can recognize each voice.
>> SPEAKER: Enter Google Assistant.
>> SPEAKER: She's so smart.
>> SPEAKER: She's just amazing.
>> SPEAKER: Let me see what else my Google Assistant can do.
>> SPEAKER: Hey Google. Call Carrie.
>> SPEAKER: Speaking French
>> SPEAKER: What is a dog?
>> JIMMY KIMMEL: That's your question?
What is a dog?
>> CONAN O'BRIEN: I don't want to blow your mind with my high
tech talk.
>> SPEAKER: It is amazing what it can do.
>> SPEAKER: I think we've got a winner.
>> SPEAKER: It's top notch.
>> SPEAKER: Yes, Google.
>> SPEAKER: That is amazing.
>> SPEAKER: Okay Google.
What's next?
>> RICK OSTERLOH: We're still in the early days of our hardware
line but we're off to a great start.
We're thrilled to announce that there are now fifty-five million
Chromecast devices around the world.
Google Wi-Fi has been the number one selling mesh router since its launch in the US and Canada.
And in the last twelve months alone, the Assistant has gotten
a lot smarter thanks to Google.
We've added more than one hundred million new, unique
featured snippets from the web, video results from YouTube, and
new local places in Google Maps and a lot more.
And Pixel had a great year.
I just wish we had a few more of them to go around but user
satisfaction among Pixel owners is among the highest of any
Google product ever.
Industry analysts and the media gave Pixel rave reviews too.
Our performance scores led the industry and Pixel had the best
and top rated smartphone camera.
We're really proud of how well the Pixel did for our first
generation smartphone in such a competitive space.
You all know this better than anyone, but the playing field
for hardware components is leveling off and I don't envy
those of you who have to write reviews for a bunch of
smartphones with very similar specs.
Megapixels in the camera, processor speed, modem
throughput, battery life, display quality, these core
features are table stakes now.
Moore's Law and Dennard Scaling are ideas from the past.
To be honest, it's going to be tougher and tougher for people
to develop new, exciting products each year because
that's no longer the time table for big leaps forward in
hardware alone.
And that's why we're taking a very different approach
at Google.
As I said last year, the next big innovation will happen at
the intersection of AI, software, and hardware.
Smartphones might be reaching parity on their specs but as we
just heard from Sundar, we're seeing huge breakthroughs in the
kinds of experiences we're able to deliver to users.
And it all starts with reimagining hardware from the
inside out.
AI and machine learning have helped us to do this across
our products.
For one, Pixel completely revolutionized the end to end photo experience for users and removed all of the hassles.
Machine learning works throughout the experience to
make your Pixel photos more vibrant, to smooth out your
videos, and to make all of those thousands of memories easy to
find on any of your devices.
We did the same thing with home networks.
Google Wi-Fi uses machine learning not only to keep your
signal strong but to reduce Wi-Fi congestion.
While you're moving throughout the house, your router is
intelligently transitioning your devices to the best Wi-Fi point
and placing you on the right channel.
When you're using the Assistant on Google Home, you'll notice it
can pick up your voice even in a noisy room.
Our deep learning capabilities and neural beamforming help
Google Home locate the source of sound and hear you accurately so
we can do with just two microphones what others normally
need six or eight to do.
And this is what it means to design hardware from the
inside out.
It's this combination of AI, software, and hardware working
together that provides a helpful experience for our users.
And that is where the big leaps forward are going to happen in
the next ten years.
We're still in the early days for our hardware line but we
know what it takes to build great products in a
crowded field.
We weren't first with many of our most successful products,
Gmail, Chrome, Android, and even search, but in each case, we
succeeded by doing what we're best at, reimagining the
experience to make it radically helpful for the user.
And as you'll hear today, our next generation of devices is
radically helpful too.
They're fast, they're there when you need them, they're simple to
use, and they anticipate your needs.
Everything is designed for you to keep the tech in the
background and out of your way.
Interact with your devices naturally through your voice or
by touching them.
And by building hardware around our AI and software, we're
creating products that get even better over time.
They are constantly getting faster and more helpful the more
you interact with them thanks to machine learning.
As a family, Made by Google products represent the ultimate Google experience.
Today we're going to show you how we're creating a family of
products that are there when you need them, at work, at school,
and on the go.
But we're going to start with the most important place,
your home.
So let's hear next from Rishi Chandra who leads our Home team.
Over to you, Rishi.
>> RISHI CHANDRA: Thank you, Rick.
It's been a really exciting year for us.
Google Home was the first product to bring the Google
Assistant to the home and we've been constantly learning from
our users so that we can make the Assistant radically helpful
for you.
For example, as Rick mentioned, in the last year we answered
over one hundred million new questions.
The best part is you don't have to talk like a computer or teach
Google Home any new skills.
It just works.
Now we're really happy with the positive feedback thus far but
we know this is a journey.
So we've been working hard to bring Google Home to more people
and more countries.
This year, we launched in five new countries and we're happy to
announce that we'll be launching in Japan later
this week.
Now bringing the Assistant to people all around the world is
no easy task.
We had to make sure we can understand people of different
age groups, genders, and accents.
So we trained the Assistant at a scale that only Google could
with over fifty million voice samples from hundreds of
different ambient environments.
We've been investing in voice search for over a decade which
is why we have the best voice recognition in the world.
Now earlier this year, we had a major breakthrough with the
ability to recognize your voice.
We call it Voice Match.
With your permission, we build a model of your voice by looking
at dozens of different voice characteristics like vocal
construct, pitch, and tone.
This is a really big deal.
Google Home is great for the whole family but it doesn't mean
I want to get the same answer to every question.
An Assistant can only be truly useful if it knows who you are.
With Voice Match, we're the only assistant that can bring
personal help to each individual member of your household.
So when you ask a question, we match your voice and we respond
with your calendar, your commute, and your
personal reminders.
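A rough sketch of the general enroll-and-match idea behind speaker recognition: average a few utterance embeddings into a voice profile, then compare new utterances by cosine similarity. The embedding vectors below are random stand-ins and the threshold is an assumption; Voice Match's actual features and model are not public.

```python
# Sketch of speaker verification by embedding comparison. Random vectors stand
# in for utterance embeddings that a real voice model would produce.
import numpy as np

def enroll(utterance_embeddings: np.ndarray) -> np.ndarray:
    """Average several utterance embeddings into one normalized voice profile."""
    profile = utterance_embeddings.mean(axis=0)
    return profile / np.linalg.norm(profile)

def matches(profile: np.ndarray, embedding: np.ndarray, threshold: float = 0.75) -> bool:
    """Cosine similarity against the enrolled profile; threshold is an assumption."""
    emb = embedding / np.linalg.norm(embedding)
    return float(profile @ emb) >= threshold

rng = np.random.default_rng(0)
voice = rng.normal(size=64)                          # pretend "true" voice direction
enrollment = voice + 0.1 * rng.normal(size=(3, 64))  # three enrollment utterances
profile = enroll(enrollment)

same_speaker = voice + 0.1 * rng.normal(size=64)
other_speaker = rng.normal(size=64)
print(matches(profile, same_speaker))   # expected: True
print(matches(profile, other_speaker))  # expected: False
```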
Now Voice Match has already become one of the most popular
features on Google Home today.
Over half of all queries are from people who have trained the
Assistant to recognize their voice.
And starting today, we're rolling out Voice Match to all seven countries where Google Home is available.
Now another popular feature is hands-free calling.
You can use your Google Home to call any landline or mobile
number in the US or Canada for free without any additional apps
or accessories.
It just works.
And I am happy to announce that we'll be bringing hands-free
calling to the UK later this year.
And starting today, you can call out with your own personal
mobile number so whoever you're calling will know it's you.
You just need to verify your number through the
Google Home App.
And of course with Voice Match, we can recognize your voice to
make sure we call your contact with your personal number.
Hands-free calling also has the intelligence of Google built
right in.
Just say hey Google, call the bakery on 24th street.
>> GOOGLE ASSISTANT: Okay.
Calling Noe Valley Bakery.
>> RISHI CHANDRA: We do the hard work to figure out which
business you want to call thanks to Google's deep expertise in
local search.
Making a call has never been easier.
Now we have a lot of great news to share with you today so to
kick things off, I would like to welcome Isabelle, lead designer
for Home hardware.
>> ISABELLE OLSSON: Thank you, Rishi.
I am thrilled to share with you how we think design is
redefining technology in the home.
You heard Rick mention earlier today that everything in our
hardware line is designed to fit into your life.
The home is a special, intimate place, and people are very
selective about what they welcome into it.
You don't want to fill it with black plastic, complicated
buttons, and random blinking lights.
Our vision is to build simple, helpful solutions that work in
the background, helping you when you need it and staying out of
the way when you don't.
So here is what that means to us.
We design for the spaces our products live in and we take
inspiration from the materials and colors that people already
use to make their home more warm and inviting.
Second, when designing for real homes, form and size really matter for creating something that actually fits into any room or on any surface.
And lastly, the way you interact with our products has to be so
intuitive, you never even have to think about it, and so simple
that the entire household can use it.
With that, we are introducing Mini, the newest member of the
Google Home family.
The first thing you might notice is the striking simplicity of
the design.
It is sleek and smooth with no corners or edges.
And it is small enough to be placed anywhere in your home.
It makes Google Home more accessible to more people in
more rooms.
And as you can see, almost the entire enclosure is made out of
fabric and it is not just about aesthetics, it's core to the
product experience.
That is why we created this material from scratch, right
down to the yarn.
It needed to be durable and soft but also transparent enough to
let through both light and sound.
This perfect balance allows for all of Mini's technology to be
thoughtfully tucked away underneath the clean,
simple enclosure.
So the four LED lights under the fabric are there when you need
them, lighting up to show you that it hears you or
it's thinking.
And you can control it by touch.
Give the fabric a quick tap to pause your music, adjust the
volume, or talk to your Assistant.
We thought a lot about how to get great sound with such a
small product too.
We embraced Mini's circular shape to project three hundred
and sixty degree sound so it is really crisp no matter where you
stand in the room.
You'll be surprised how amazing the quality is.
And if you want bigger sound, you can easily connect Mini to
any Chromecast built-in speaker wirelessly.
We created Mini in three beautiful colors, coral, chalk,
and charcoal.
Color really matters in the home.
This is the place where we choose our drapes and carpets
and paint colors with so much care.
We brought that same level of care to Mini.
For the chalk color alone, it took us one hundred and
fifty-seven tries before we found the perfect shade of gray.
So that is Google Home Mini with all of the power of the
Google Assistant.
You can make it your sous chef in the kitchen, your white noise
machine in the nursery, your voice remote for your TV in the
living room, or your alarm clock in the bedroom, or all of
the above.
Mini will retail for just forty-nine dollars in the US.
And it is going to be available for preorder online starting
today and in stores at major retailers starting October 19th.
We are also really happy to announce that Mini is coming to
all seven Google Home countries.
So with that, let's take a look at this ad that will launch
later this week.
>> SPEAKER: What's this little thing?
Well, it's Google Home Mini.
You put it in your house and control it with your voice.
Hey Google.
Play my fun playlist.
>> GOOGLE ASSISTANT: Okay.
Playing now.
>> SPEAKER: It's also a remote control and an alarm clock and a
sous chef.
>> GOOGLE ASSISTANT: Let it cool for ten minutes.
>> SPEAKER: It can play this on TV or that on the internet.
It can tell you the weather or if your flight's delayed because
of the weather.
And it knows the difference between you -
>> SPEAKER: Hi.
>> SPEAKER: And your husband.
>> SPEAKER: Hey.
>> SPEAKER: So if you go, hey Google, call Alex, it won't call
your husband's friend Alex.
It will call your friend Alex.
>> GOOGLE ASSISTANT: Calling Alex.
>> SPEAKER: Hey. How are you?
Okay. Got to go, girl.
>> SPEAKER: Anyway.
It's made by the same people who help you find stuff on the
internet so you know, no biggie.
It's smaller than a donut and weighs less than a full-grown
chipmunk without the nuts.
It's powered by the Google Assistant so it has Google Maps,
Google Calendar, it gets you to all of your music, lots of your
favorite shows, all of YouTube, and lots and lots and lots of
other stuff you love.
All for less than fifty bucks.
Yeah. It's Google Home Mini.
A little help at home like only Google can.
>> RISHI CHANDRA: Thanks Isabelle.
We can't wait for you to try out Mini.
We also want to share how the Google Assistant continues to
get better over time.
From the start, we've been helping people with their everyday routines.
For example, one of my favorite features is My Day which gives
me a personalized briefing of my important events of the day.
And with Voice Match, each member of the house can get
their own unique start to their morning routine.
Well soon, we're going to be extending routines in two
important ways.
First, we're adding more routines to help you with those
everyday moments like getting ready to leave for work, coming
home in the evening, or turning in for the night.
Second, we're adding additional actions to the routines so now
when I say good morning, I not only get my personal briefing
but I also can turn on the lights, start the coffee maker,
and even play my favorite morning playlist.
This kind of help is exactly what I need to get my
day started.
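To picture how a multi-action routine hangs together, here is a hypothetical configuration-and-dispatch sketch. The action names and structure are invented for illustration and are not the Assistant's actual routines API.

```python
# Hypothetical sketch of a "good morning" routine: one trigger phrase mapped
# to an ordered list of actions. Not the real Assistant routines API.
ROUTINES = {
    "good morning": [
        {"action": "personal_briefing"},
        {"action": "lights_on", "room": "kitchen"},
        {"action": "start_device", "device": "coffee maker"},
        {"action": "play_playlist", "name": "morning playlist"},
    ],
}

def run_routine(phrase: str) -> None:
    for step in ROUTINES.get(phrase.lower(), []):
        # A real implementation would call device and media APIs here.
        details = {k: v for k, v in step.items() if k != "action"}
        print(f"running {step['action']} {details or ''}".strip())

run_routine("Good morning")
```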
Now another ritual for me in the morning is I'm always looking
for my phone before heading to work.
Well the Google Assistant can help there too.
Just say hey Google, find my phone, and we'll automatically
ring your Android phone, even if it's on silent mode.
And for iPhone users, we just give you a call.
It saves me five or ten minutes every morning.
Yes.
Now we're also working to make the smart home work better
for you.
The Google Assistant can now voice control over a thousand
different smart home products from over one hundred brands.
And even more importantly, we brought our deep expertise in natural language processing to make it easier to control things with your voice in a more conversational way.
So for example if I want to change the temperature, I
shouldn't have to remember the current thermostat setting.
I should just be able to say hey Google, make it warmer.
>> GOOGLE ASSISTANT: Okay. Warming up the living room.
>> RISHI CHANDRA: Your Assistant knows what warmer means and will
just take care of it for you.
Now voice control is just the start.
We believe the next evolution of the smart home is to bring real
intelligence to the home so it can behave in a more thoughtful
way. Your home should smartly adjust to you, not the other
way around.
So Google and Nest are working together to reimagine
the smart home.
To tell you more, I would like to welcome Yoky, CTO of Nest.
>> YOKY MATSUOKA: Thank you, Rishi.
At Nest, we like to talk about the thoughtful home, one that
takes care of the people inside it and the world around it.
We recently took a big step forward in achieving that goal
by doubling our portfolio, shipping six new
hardware products.
What I love about Nest's growing ecosystem is that we combine best-in-class hardware and machine learning to help make people's lives easier.
And your Nest experience reaches a whole new level when it is
combined with Google products.
Let me show you three examples of how we can provide you with
even more help at home.
With Nest Cam, Google Home, and Chromecast, we can help you keep an eye on your home just with your voice. So if I hear some unexpected sound by the front door and I am in the back in the family room without my phone, I can just say okay Google, show me the entryway.
>> GOOGLE ASSISTANT: Okay.
Streaming the entryway.
>> YOKY MATSUOKA: And then I realize it's not an intruder but
my pet pig, Cayenne, and she has found a leftover lunch in my
son's backpack.
Sadly this has happened more than once.
And if I want to keep it for later for my kids, I can simply
say okay Google, save me this clip to show them why we don't
leave food in our backpacks.
With our new video doorbell, Nest Hello, we can also start to
solve some of our common pain points through Google and Nest
integrated computer vision technologies.
Of course, most of us want to know who is at the door before
we get off the couch.
Through a feature called familiar faces, Nest Hello will
be able to recognize the people that you've asked it to remember.
So when the doorbell rings and Nest Hello recognizes the person
at the door, it will automatically broadcast that
information to Google Home devices in the house.
>> GOOGLE ASSISTANT: Auntie Suzy is at the front door.
>> YOKY MATSUOKA: And you don't even have to ask.
And if Nest Hello doesn't know who is at the door, you can just
say okay Google, show me who is at the front door and we will
stream your camera feed right to your TV.
This is another great example of what Google and Nest can
do together.
And this last one is my favorite.
Your home can soon get ready for the night just with a
simple command.
As Rishi mentioned, you can personalize your routine to
include actions for Nest products.
So all I have to say is hey Google, goodnight.
>> GOOGLE ASSISTANT: Okay.
Let's get ready for bed.
I have armed your security system.
Your first calendar event is tomorrow at 9:00 AM.
Your alarm is set for 7:00 AM.
Sleep well.
>> YOKY MATSUOKA: In the background, my Assistant is also
turning my security camera on, adjusting the thermostat, and
turning off the lights.
This really simplifies my life and gives me the peace of mind
that I cannot get any other way.
Together, Nest and Google deliver an ecosystem of products
that make your home more secure and energy efficient, more
connected and entertaining, with the best home
assistant experience.
Our close collaboration makes these products work seamlessly
together and help us get one step closer to a truly
thoughtful home.
On that note, to tell you more about Google Assistant, let me
welcome Rishi back on stage.
>> RISHI CHANDRA: Thank you, Yoky.
You know, making a more thoughtful home is just one way
to help make our joyfully hectic family lives a little easier.
You know, the Google Assistant already provides a lot of help,
from using your voice to order more diapers, playing a lullaby
in a nursery, checking the traffic for those piano lessons,
or easily giving Grandma a call.
Now if you're like me, it can be hard to gather the household
together for those family moments like dinner time or
movie night.
Well to help, we're launching a new feature called Broadcast.
It's really easy.
Just say hey Google.
Broadcast it's time to leave for school.
>> GOOGLE ASSISTANT: Got it.
Broadcasting now.
>> RISHI CHANDRA: The Assistant will broadcast a message to all
of the other Google Home devices in my house.
It's really great.
Parents are going to love this feature.
Kids are going to hate this feature.
But that's what makes it useful.
Speaking of kids, we think Google Home has a lot of potential to help kids and families.
Voice interaction is so simple and easy to use, kids are
naturally drawn to it.
So we want to be thoughtful about what it would take to deliver a great kids' experience.
We conducted research with hundreds of parents and kids
from across the country from different backgrounds.
So first, we're announcing support for Family Link accounts
on Google Home.
These are Google accounts that parents can create for children
under thirteen.
They can manage them within the Family Link app.
And as we all know, kids have their own unique way
of speaking.
Sometimes it can be a little hard to understand, even for
us parents.
Well fortunately, we have improved our voice recognition
accuracy for kids so the Assistant can understand
them too.
And finally, we're introducing over fifty new experiences with
the Google Assistant to help kids learn something new,
explore new interests, imagine with storytime, or just share
laughs with the whole family.
So let me show you how my kids and their friends had some fun
this weekend.
>> SPEAKER: Let's go to the living room.
>> SPEAKER: Okay Google.
Play musical chairs.
>> GOOGLE ASSISTANT: Ready to see who our musical chair
champion will be.
>> SPEAKER: Okay Google.
Beat box for me.
>> SPEAKER: Hey Google.
Let's play freeze dance.
>> MICKEY MOUSE: Hiya pal.
It's me, Mickey Mouse.
>> SPEAKER: Okay Google.
Play what kind of fruit are you.
>> SPEAKER: Okay Google.
What sound does a seal make?
>> GOOGLE ASSISTANT: This is a seal.
>> SPEAKER: Okay Google.
Let's learn.
>> SPEAKER: Hey Google.
Let's play space trivia.
>> GOOGLE ASSISTANT: Which was the first planet to be
discovered using a telescope?
>> SPEAKER: Uranus.
>> SPEAKER: Uranus.
>> SPEAKER: Hey Google.
Can you tell me a story?
>> SPEAKER: One day I was hanging out at home with my dad.
>> SPEAKER: There lived a sweet tempered girl whose name
was Cinderella.
>> RISHI CHANDRA: I can tell you, as a parent, it's great to see the kids without their screens, sharing experiences together.
And it's easy to get started.
Just say hey Google, let's learn, to play science quizzes
or talk to an astronaut.
Or try hey Google, let's play a game, to play musical chairs or
tackle a riddle.
And finally hey Google, tell me a story, to hear classics like
Snow White or original stories like the Chef who
Loved Potatoes.
Now to bring this to life, we have partnered with Disney.
We are bringing exclusive experiences to Google Home
featuring Mickey Mouse, Lightning McQueen, and
Star Wars.
We are working with many other top brands in the family space
including Warner Brothers and Sports Illustrated Kids.
And starting today, we're opening up the Actions on Google
platform for developers to create experiences specifically
for families and kids.
All of these new family features will be rolling out later this
month across all Google Home devices.
Okay. Finally, I have one more exciting addition to share that
is coming to the Google Home family.
Say hello to Max, our biggest and best sounding
Google Home ever.
Just like the Pixel reimagined the camera, we will do the same with sound, with a combination of great hardware and software powered by Google's machine learning capabilities.
Now it starts with a strong foundation of great hardware.
The first thing you'll notice is how we obsessed over the bass.
Its two 4.5 inch woofers have twenty-two millimeters of
excursion and extremely high range for their size.
That means these woofers can move a lot of air, allowing Max
to really hit those low frequencies.
And Max can play loud, really loud.
It's more than twenty times more powerful than Google Home so it
will fill any room in your house with amazing audio.
Now great hardware alone isn't sufficient for great sound.
The challenge of speakers today is that they are tuned for ideal
acoustic conditions, but they fall short in the real world.
That's why it always sounds different in the store than in
your home.
To sound great, the speaker needs to adjust to you and
your home.
So today we're announcing Smart Sound, a new audio experience
powered by Google's AI.
It allows Max to adapt to you, your environment, your context,
your preferences.
So for example, if you set up a speaker near a wall, on a shelf,
in a corner, fairly common places, it can dramatically
change the sound balance of the speaker and make the music sound
muddy and the vocals lose clarity.
Well with Smart Sound, we automatically tune the speaker
to correct for this effect using our machine learning model
trained with thousands of different room configurations.
What's really cool, this is all done dynamically.
So if you decide to move Max a few feet, it will compensate
within seconds.
And over time, Smart Sound will automatically adapt the sound to
fit your context, lowering the volume in the morning, raising
the volume when the dishwasher is running, or just adjusting
the tuning based on the type of media you're listening to,
whether it be music, podcasts, or news.
It's about delivering a consistent, crisp, thoughtful
sound experience, one that is tailored to your home and
your moments.
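As a rough illustration of room correction in general: estimate how a measured per-band response deviates from a flat target and apply bounded inverse gains to a per-band equalizer. Smart Sound's actual tuning comes from a model trained on thousands of room configurations; the bands and numbers below are synthetic assumptions.

```python
# Simplified room-correction sketch: push a measured per-band response back
# toward a flat target with bounded inverse gains. Synthetic numbers only.
import numpy as np

bands_hz = np.array([60, 120, 250, 500, 1000, 2000, 4000, 8000])
measured_db = np.array([+6.0, +4.0, +1.5, 0.0, -0.5, -1.0, -2.0, -1.5])  # wall boosts the lows
target_db = np.zeros_like(measured_db)                                   # flat target

correction_db = np.clip(target_db - measured_db, -6.0, 6.0)  # cap boosts/cuts at 6 dB

for f, c in zip(bands_hz, correction_db):
    print(f"{f:>5} Hz: apply {c:+.1f} dB")  # gains that would feed a per-band EQ
```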
Now great sound only matters if you can play a lot of
great content.
With YouTube Music, you'll have access to the world's broadest
catalog of songs, remixes, and covers, and we also support all
free and paid Spotify users along with other top
music services.
Just use your voice to start playing your favorite tunes.
Max also supports Cast, Bluetooth, and stereo AUX input
so you can play anything from your phone or plug in that
record player to breathe new life into that vinyl collection.
And of course, Max has a Google Assistant built right in.
Your Assistant can hear you even when the music is blasting, using our unique microphone placement and Google's neural beamforming technology.
And with Voice Match, the Assistant can become your own
personal DJ.
We recognize your voice and we can play playlists and music
stations personalized to you.
And Google Home Max was designed to be incredibly versatile and
fit naturally into your home.
You can stand it up vertically, place it horizontally, whatever
makes sense for your space.
Its base magnetically pops into place so when you change Max's
orientation, there aren't any rubber feet where they don't
need to be.
We thought through every detail.
And finally, Max works seamlessly with the Google Home
family and hundreds of Chromecast built-in speakers.
So you can use multi-room to sync every room in your house
with immersive sound.
Google Home Max will be available starting December for
$399 and it will come in two colors, chalk and charcoal.
We will initially launch in the US, with more countries coming
early next year.
And to make sure you have an amazing sound experience out of
the box, we're providing a free twelve month subscription to
YouTube Red which includes YouTube Music ad-free.
We think you're going to really love the newest members of the
Google Home family.
Along with the updates to the Google Assistant, they represent
a big leap forward in the type of helpfulness, fun, and sound
you can expect from Google in the home.
And to close, we gave Max to a music lover to get their
thoughts. Let's take a look.
>> THOMAS WESLEY PENTZ: I'm a producer and a DJ so I am not
classically trained but I am obsessed with music.
I listen to music all day long.
I'm constantly being critical of my own music, of music that
I hear.
Watch this.
Hey Google.
Play Particular by Major Lazer.
>> GOOGLE ASSISTANT: Sure.
Particular by Major Lazer and DJ Maphorisa.
Here it is on YouTube.
>> THOMAS WESLEY PENTZ: Every room I have is set up to play
music, bedrooms, the living room.
Hey Google.
Turn it up.
Hey Google.
Play the song that goes wrist looking like it been dipped,
dipped in that, dipped in that, dipped in that.
>> GOOGLE ASSISTANT: Alright.
Know No Better by Major Lazer.
Here it is on YouTube.
>> THOMAS WESLEY PENTZ: The speaker can find the periphery
of walls and where it is located in that room.
To have that technology to understand where it is and to
adjust appropriately is amazing.
That sounds great.
Oh, that's - this is dope.
If people could actually hear what we're giving them from when
it leaves our studio, that would be amazing.
That is what any producer or artist would love.
Hey Google.
Turn it all the way up.
>> MATT VOKOUN: As you just saw, there is so much amazing
innovation happening in the Home space.
And at the same time, our phones have evolved to become the most
trusted devices in our lives.
Have you ever wondered why the laptop experience has been
basically the same for the past twenty years?
While the laptop experience hasn't changed all that much,
how we use technology in our lives definitely has.
We live online.
We create and collaborate more than ever.
We use tons of apps every day and we're constantly jumping
between our phone, tablet, and laptop to get things done and
have fun.
Shouldn't we be able to do whatever we want on whatever
device is in front of us?
We think so.
We have worked hard to combine the most helpful parts of a
laptop, a tablet, and a smartphone to fit how we use
technology today.
Meet Google Pixelbook, the high performance Chromebook.
Pixelbook reimagines the laptop experience by marrying the best
of premium performance hardware, the speed, simplicity, and
security of Chrome OS, the smarts of the Google Assistant,
and all of your favorite apps.
Pixelbook is the perfect example of how we combine the best of
hardware and software with AI at the center.
We will start with the stunning versatile design.
Pixelbook is the thinnest, lightest laptop we have ever
made at just around ten millimeters thin and a
kilogram light.
In fact, it is so light I find myself constantly checking my
bag to make sure I haven't forgotten it.
Pixelbook easily adapts with a four in one design, so it's
built for the many ways you want to use it.
When you're at your most productive, it's an
incredible laptop.
Fold the keyboard underneath and easily watch videos or play your
favorite games.
Use Pixelbook as a tablet to catch up on the latest news or
read a book.
It's the first convertible laptop that actually feels
natural to use as a tablet.
Prop it up into tent mode to share your screen or follow
along with a recipe in the kitchen.
Use it however you want because Pixelbook adapts to fit the
office, the classroom, movie night, or even a long flight.
Pixel phone fans will appreciate our signature glass inlay on the
lid, giving it not only a refined look but improved
wireless performance.
The 12.3 inch touchscreen display is gorgeously high
resolution, with vibrant colors, deep blacks, and enough
brightness to use outside even on a sunny day.
We developed an extremely thin keyboard that is comfortable to
use with soft touch keys and a backlit design that helps you
work from anywhere.
And Pixelbook's trackpad uses touch processing algorithms that
improve accuracy and palm rejection.
With Intel Core i5 and i7 processors, Pixelbook packs
a punch.
It offers plenty of RAM to handle your multitasking needs
and gives you up to 512 gigs of storage.
The long lasting battery provides up to ten hours of use
from a single charge and plugging in for just fifteen
minutes gives you two hours of battery life.
You can even use your Pixelbook charger with your Pixel phone,
giving you one less thing to carry around.
And in moments when you don't have access to Wi-Fi, Pixelbook
is smart enough to automatically connect through your Pixel
phone. We call this instant tethering.
It is so much simpler than setting up a hotspot on my phone
and then manually connecting to it on my laptop.
Now let's take a further look at the software experience inside
Pixelbook. This laptop is fast.
It starts up in seconds, stays fast throughout the day, and
won't slow down over its lifetime.
Chrome OS provides a more secure computing experience, with
multiple layers of security and automatic updates.
No need to worry about manual downloads or annoying security
patches to keep your machine safe.
And your most important files are available on your local
drive and securely backed up with Google Drive so you can
access them from anywhere, online or offline.
We're also excited to announce that Pixelbook is the first
laptop with the Google Assistant built in, making it
radically helpful.
Just like you can with your other devices, you can check the
weather before heading out for the day, control your smart
home, set reminders, check if my Packers won.
Your Assistant is there for you no matter what you're doing.
And we've made it easy to interact with the Assistant in
all of the ways you use Pixelbook.
When it's in tent or entertainment mode and the
keyboard is tucked away, simply say okay Google, play Future Me.
And that exact music video I had in mind starts playing
on YouTube.
Other times though, it's not the right moment or place to talk to
your computer.
Like if you're sitting at a coffee shop or out in the
audience right now.
So Pixelbook has a key that puts the power of the Assistant right
at your fingertips, letting you type your request.
And when you're using your Pixelbook as a tablet, it's
easiest to show your Assistant what you need help with on
your screen.
That is why we created the new Pixelbook Pen, a smart
responsive companion for your Pixelbook.
Just hold the pen's button and circle an image or text on the
screen and the Assistant will take action.
When you're browsing through a blog and discover a new
musician, you can circle their photo and the Assistant will
give you more information about them.
From there, you can check out their Instagram page, go to
their YouTube channel, get a list of their songs, and more.
Or if you're doing homework, the Assistant can help with
your research.
Like if you get stuck on a really unfamiliar concept, just
circle a word and the Assistant will give you everything you
need to know.
The pen is also handy for moments when it feels more
natural to write, draw, or take notes.
Like when you're using your Pixelbook as a tablet, the
experience using a pen should be like writing on paper.
To bring that experience to life, we partnered with the
engineers at Wacom and together we built the fastest, most
responsive stylus experience ever with just ten milliseconds
of latency, sixty degrees of angular awareness, and two
thousand levels of pressure sensitivity.
Pixelbook Pen uses machine learning for handwriting
recognition and demonstrates how performance hardware combined
with our expertise in machine learning delivers a user
experience that just works better.
And shouldn't the apps you know and love on your phone also be
available on your laptop?
With Pixelbook, they are.
Pixelbook fully supports Google Play so popular smartphone apps
are now available on a laptop.
In fact, it's never been easier to put the final touches on your
favorite photos in Adobe Lightroom and then post them
straight to Instagram.
You can listen to music from Spotify or download Netflix
shows and movies for viewing on the go.
Pixelbook is also perfect for creativity apps like Evernote,
AutoCAD, and more.
Snapchat is already one of the most popular apps among early
users and we're thrilled to announce that the Snap team is
working with us to bring an amazing Snapchat experience to
the larger screen on Pixelbook.
In addition to these amazing Pixelbook experiences you've
seen so far, there are many more developers building for the
larger screen.
Now let's take a look at how Pixelbook will fit into
your life.
>> SPEAKER: Okay Google.
Play Discover Weekly on Spotify.
>> GOOGLE ASSISTANT: Okay.
>> MATT VOKOUN: So that is a first look at Pixelbook, a
radically helpful laptop that combines the best of Google's
AI, software, and hardware.
We think you'll love the beautiful, versatile design,
high performance hardware and software, new ways to access the
Assistant, and all of your favorite apps.
It's available in three configurations so you can choose
the performance, memory, and storage that is right for you.
Pixelbook starts at $999.
And the Pixelbook Pen is $99.
Both products will be available in the US, Canada, and the UK on
the Google Store and over a thousand retail locations,
including Best Buy.
Preorders start today and they will be in stores beginning
October 31st.
We can't wait to see how you use Pixelbook to work, play,
and create.
Next, Mario is going to come out and talk about our Pixel phones.
>> MARIO QUEIROZ: At Google, we believe in questioning the
status quo; asking more from the world around us is in our
nature. After all, Google was built on a single question.
What if all of the information in the world was available to
everyone, everywhere?
A big question for us is what if smartphones got smarter
and simpler?
We set out to design a phone ourselves because we believed we
could make the smartphone experience better.
The first Pixel phone delivered a clean and fast UI with the
Google Assistant in your pocket and brought you the best camera
ever in a smartphone.
And we didn't stop asking big questions or working to solve
big challenges for users.
Today, we're proud to introduce the Google Pixel 2, designed for
you with the best of Google built in.
We've created the new Pixel in two sizes, a five inch and a six
inch XL with thoughtful design elements and improvements to the
UI throughout.
We're bringing you more Google Assistant capabilities.
We've added new smart experiences to the camera.
And we'll continue to deliver the best photography.
Plus, we're introducing some innovative new products to
complement your Pixel experience.
We do all of this by bringing together the best of hardware,
software, and AI.
Let's begin with the design.
We've evolved the Pixel's iconic look to be even bolder.
The all-aluminum body, with a premium hybrid coating, gives it
a sleek and clean profile.
The soft sculpted back with the distinct side band feels
comfortable and secure in your hand.
The refined visor is made of sturdy, gently curved colored
glass and transitions seamlessly into the metal body.
We've placed a fingerprint sensor where your finger
naturally falls.
It is extremely accurate, secure, and it unlocks faster
than any other smartphone.
We also added a small pop of color on the power button for a
bit of playfulness.
Over on the front, Pixel 2's front facing stereo speakers are
precision tuned to deliver the perfect balance of volume,
clarity, and frequency response.
We placed them so you can comfortably watch videos in
landscape mode without muffling the audio with your hand.
If you prefer headphones, use your existing analog headphones
with the included adapter or your digital USB-C headphones
for the highest resolution audio all the way to your ears.
Or avoid cables altogether with upgraded Bluetooth 5.0 support and hi-fi codecs.
The smaller five inch Pixel 2 features a full HD OLED display
and it is as smart as it is beautiful.
The display intelligently selects which specific pixels to
turn on, leaving the others perfectly dark for a 100,000 to
one contrast ratio.
That is twice the resolution and more than ten times the contrast
ratio of phones in its category with LCD displays.
Pixel 2 comes in three colors: Kinda Blue, Just Black, and Clearly White.
We love picking the color names.
So let's talk about the display on the XL.
The six inch Pixel 2 XL was designed with innovative POLED
display technology.
We elegantly curved the glass to bring the display to the edges
for a full screen, immersive viewing experience.
The screen supports an eighteen by nine QHD+ resolution with over 4.1 million pixels, which equals 538 pixels per inch, for amazing detail.
We tuned the display to take advantage of the wide color
gamut for vivid and realistic colors.
We have integrated a circular polarizer which lets you enjoy
the screen even while wearing sunglasses.
And we've optimized the displays on both phones for
virtual reality. They are both Daydream ready.
The Pixel 2 XL comes in two colors, Just Black and the stylishly simple Black & White.
Feel free to choose whichever size Pixel you prefer because
you'll get the same great experience on both.
We don't set aside better features for the larger device.
Both devices are IP67 water and dust resistant and have an all-day battery that charges superfast.
all-day battery that charges superfast.
You get up to seven hours of use from just fifteen minutes of charging.
Now Sabrina from the Pixel product team is going to take us
through the new Pixel 2 UI and some cool and useful features.
>> SABRINA ELLIS: Thanks, Mario.
We're constantly working to make the smartphone experience easier
and more helpful and you'll see improvements throughout the
Pixel 2's UI.
That helpfulness starts before you even unlock the phone.
Let me show you the Pixel 2's new always-on display.
You can see important information like the time, date,
email, and text notifications and reminders all without
pushing any buttons or unlocking the device.
And there is more than just notifications on your
always-on display.
Have you ever heard a song you loved and wondered what it
was called?
With Now Playing, you can just glance down to see the song name
and artist on your phone.
What makes this feature so special is that the music is
identified without your Pixel 2 sending any information
to Google.
On-device machine learning detects when music is playing
and matches it to a database of tens of thousands of song
patterns on your phone.
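A toy sketch of the general fingerprint-lookup idea behind on-device song matching: index each song by a set of hashes derived from its audio and match an incoming clip by counting hash overlaps. The hashes below are made up; the real Now Playing matcher uses a compact learned fingerprint database.

```python
# Toy fingerprint lookup: index songs by a set of hashes and match a clip
# by counting hash overlaps. Made-up hashes stand in for real audio features.
from collections import Counter
from typing import Optional

song_index = {
    "Perfect Places - Lorde": {0x1A2B, 0x3C4D, 0x5E6F, 0x7A8B, 0x9C0D},
    "Know No Better - Major Lazer": {0x1111, 0x2222, 0x3C4D, 0x4444, 0x5555},
}

def identify(clip_hashes: set) -> Optional[str]:
    scores = Counter()
    for title, hashes in song_index.items():
        scores[title] = len(clip_hashes & hashes)
    title, best = scores.most_common(1)[0]
    return title if best >= 3 else None  # require a few matching hashes (assumed)

print(identify({0x1A2B, 0x5E6F, 0x9C0D, 0xDEAD}))  # -> "Perfect Places - Lorde"
print(identify({0xBEEF, 0xCAFE}))                  # -> None
```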
In this example, you can see that Pixel recognized the song
Perfect Places by Lorde.
From here, I can tap on the song title and I am taken to the
Google Assistant.
I can add the song to my library on my favorite music service
like Google Play Music or Spotify, search the web for the
song, or even watch the video on YouTube.
This feature has brought me some delightful moments over the past
few months and I am finally learning who sings my
favorite songs.
Okay. So let's go to a live demo.
Here we are on the new Pixel 2 home screen.
You can see here, I've selected one of our live wallpapers.
This is from our new collection called the Living Universe and
if you look closely, you'll notice subtle movements like the
waves crashing on the beach.
I love that.
At the top of the screen, there is a space called At a Glance where you can easily see the latest updates on your day.
We're starting with calendar events today, with traffic,
flight status, and more coming soon.
In this case, I can see that I've got a mysterious meeting
coming up.
Not sure what that's about.
We've placed the Google Quick Search Box at the bottom of the
home screen where we found that it's easier to reach and we're
making it even more powerful.
It stays with you as you swipe across your screens of apps
and widgets.
A quick tap into the search box can help me find anything that
I need.
As I start typing, I can see web search results at the top but
also contacts and apps from my phone.
I can even drag an app from here and drop it right on my
home screen.
We've also made it faster and easier to get to the Google
Assistant on Pixel 2 with a new feature called Active Edge.
Just give your Pixel a quick squeeze where your hand
naturally holds the phone and ask for whatever you need.
Here is a quick demo.
I just squeeze the phone, take a selfie, one hand, no buttons.
Help me out, front row. Nice.
Our research team investigated dozens of ways to trigger the
Assistant but squeezing the phone felt the most natural,
satisfying, and efficient.
After a lot of work in machine learning, we're able to
accurately identify an intentional squeeze.
We're really proud of how it turned out and it even works
when your phone is in a case.
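As a crude illustration of the sensing problem: decide whether a window of side-sensor readings looks like a deliberate squeeze rather than ordinary handling. The threshold-and-duration heuristic below, with synthetic signals, is only a stand-in; the real Active Edge classifier is a trained model on the phone's strain-gauge data.

```python
# Crude squeeze detector: both edges must exceed a pressure threshold for a
# minimum number of consecutive samples. A stand-in for a trained classifier.
import numpy as np

def is_intentional_squeeze(left: np.ndarray, right: np.ndarray,
                           threshold: float = 0.6, min_samples: int = 10) -> bool:
    pressed = (left > threshold) & (right > threshold)
    longest, run = 0, 0
    for p in pressed:
        run = run + 1 if p else 0
        longest = max(longest, run)
    return longest >= min_samples

t = np.linspace(0, 1, 100)
squeeze = 0.8 * np.exp(-((t - 0.5) ** 2) / 0.02)       # brief, firm grip on both sides
print(is_intentional_squeeze(squeeze, squeeze))        # True: both sides pressed
print(is_intentional_squeeze(squeeze, 0.1 * squeeze))  # False: only one side pressed
```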
So your Assistant is just a quick squeeze away, making lots
of common tasks on your phone fast and easy.
From calling and texting friends to controlling your smart home
to enjoying your favorite music and videos.
And having the same Assistant at home and on your phone means the
experience is connected across devices.
For example, I can say to my phone okay Google, broadcast.
Hey, I'm almost home with the pizza so get ready for dinner.
And my voice message is shared with my family through the
Google Home in our kitchen.
The routines Rishi mentioned are coming to the phone as well and
we have also customized routines specifically for when you're on
the go.
The one I use most is when I'm on my commute.
I jump in my car, drop my phone in the cup holder, and say okay
Google, let's go home.
>> GOOGLE ASSISTANT: Sure.
Let's go.
Your commute home is currently twenty-seven minutes with heavy
traffic if you take US 101 South.
You have one unread text message.
It's from Mom.
It says we'll bring dessert to dinner tonight.
Here is where you left off in Fresh Air.
Have a safe drive.
>> TERRY GROSS: I'm Terry Gross with Fresh Air.
>> SABRINA ELLIS: Super useful.
All from one quick command.
I love how it resumes my podcast right from where I left off.
My Assistant even adjusts the volume so I can hear clearly on
the road and automatically sends a message to my husband letting
him know I'm on my way.
Like all of the new routines that the Assistant will support,
you can customize this one based on your preferences.
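As a rough illustration only, and not the Assistant's actual configuration format, a customizable routine like the one above can be thought of as a spoken trigger mapped to an ordered list of actions:
    # Hypothetical sketch of a customizable routine: one spoken trigger
    # mapped to an ordered list of actions. Not a real Assistant format.

    commute_home = {
        "trigger": "let's go home",
        "actions": [
            {"do": "navigate", "destination": "home"},
            {"do": "read_unread_messages"},
            {"do": "resume_media", "source": "last_podcast"},
            {"do": "set_volume", "level": "driving"},
            {"do": "send_text", "to": "husband", "message": "On my way home."},
        ],
    }

    def run_routine(routine, dispatch):
        """Execute each action in order via a device-specific dispatch function."""
        for action in routine["actions"]:
            dispatch(action)

    # Example dispatcher that just prints what would happen.
    run_routine(commute_home, lambda a: print("->", a["do"]))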
So with a quick squeeze, useful routines on the go, and an
Assistant that keeps getting smarter, Pixel continues to be
the most helpful phone around.
Pixel 2 will launch with pure Android Oreo with features like
notification dots, picture in picture, and many more.
If you already have a Pixel, you recently received the Android
Oreo update and we hope you're enjoying all of
the improvements.
As a Pixel user, you'll always be the first to get OS and
security updates without having to do anything.
Your phone becomes faster, more capable, and more secure all
the time.
Next up, Aparna is going to share some smart, helpful new
ways to use your Pixel 2.
>> APARNA CHENNAPRAGADA: Have you ever asked yourself what
kind of puppy is that?
Or I want pillows with that pattern.
Our phones can help us in many ways but sometimes the questions
we have are difficult to describe in a search box.
As you heard Sundar talk about, we believe that as computers can
understand what you see, it unleashes a whole new kind
of assistance.
That is why we started working on Google Lens, a way to do more
with what you see.
We are super excited to bring a preview of Lens to Pixel phones
this year so let's see it in action.
Say you see this flier for piano lessons.
As a guilty mom, I've done this.
As you know, the email addresses on them are always so long.
Like best music teacher SF 2001 at Gmail dot com.
Now you can simply tap on the lens icon, grab the right
information, and fire off that email.
This also works for phone numbers, websites,
and addresses.
Yeah, pretty neat.
Here is another example.
You are at your friend's place at a party.
You see this nice print and you wonder who is the artist.
Now you can just Lens it.
You can also follow up with what else did he paint?
You can use Lens to answer many more questions like is this
movie worth watching?
It totally is.
How good is this book from Zadie Smith?
Tell me all about this album that my musically hip friend Rishi
sent screenshots of.
What was the temple from our trip five years ago?
Look, across all of these examples, there are a lot of
things happening under the hood that are all coming together so
I am going to geek out for a moment and call out a
few things.
First, the computer vision systems: we have had some major
breakthroughs in deep learning, and now they go from pixels in
the photo to things in the scene.
Second, the power of Google Search, which helps us train these
algorithms with millions and millions of images from across
the entire web.
And third, the power of Google's Knowledge Graph, with its
billions of facts about people, places, and things.
We put it all together and now you know:
This Buddhist temple in Kyoto was built in the year 1236 AD.
Who would have known?
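To make that pipeline concrete, here is a heavily simplified Python sketch of the idea just described: a vision model turns pixels into an entity, and a knowledge-graph lookup attaches facts to it. The classifier, the entity name, and the tiny in-memory graph are hypothetical stand-ins for Google's production models and Knowledge Graph.
    # Conceptual sketch of the Lens pipeline: recognize an entity in the
    # image, then look it up in a knowledge graph. Everything below is a
    # hypothetical stand-in for the real systems.

    KNOWLEDGE_GRAPH = {
        "kyoto_temple_example": {
            "type": "Buddhist temple",
            "location": "Kyoto, Japan",
            "built": 1236,  # year taken from the example above; entity name is a placeholder
        },
    }

    def classify_image(image_bytes):
        """Stand-in for a deep-learning image classifier: pixels -> entity id."""
        return "kyoto_temple_example", 0.92  # (entity, confidence)

    def lens_lookup(image_bytes, min_confidence=0.5):
        entity, confidence = classify_image(image_bytes)
        if confidence < min_confidence:
            return None
        facts = KNOWLEDGE_GRAPH.get(entity, {})
        return {"entity": entity, "confidence": confidence, **facts}

    print(lens_lookup(b"...photo bytes..."))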
Now this has to work in the noisiest of conditions.
You know, these photos can be taken at different angles,
different times of the day, different weather conditions,
out of focus pictures, shaky hands, you get the picture.
To be clear, sometimes computers do trip up, especially when
things kind of look similar like this.
Let's take a moment.
Is this a muffin or is this a Chihuahua?
You tell me.
It's early days for Lens but you know, this is a familiar journey
for us.
When Google started, our search index contained about 25 million
pages. Now it is in the trillions.
When we started working on voice recognition, our algorithms
couldn't always catch what you were saying, especially with an
accent like mine.
Now we have an accuracy rate of more than 95%.
That is as good as humans.
We are just getting started with helping you do more with what
you see, be it a photo from five years ago or something right in
front of you.
In time, we want to bring Lens everywhere.
We are super excited to bring a preview of Lens to Pixel users
first, so you will start to see this icon appear in Google
Photos and soon, the Google Assistant.
We just talked about how the phone can help you understand
the world but it can also help you place things into the world.
That's right.
That is augmented reality.
We see it as a powerful way to bring the physical and the
digital together.
That is why we started working on ARCore, our augmented reality
SDK for Android.
And we see the developers are already using it to create some
amazing experiences and I want to show you a few examples here.
So let's take Houzz.
They are bringing the showroom into your living room using AR.
With League of Legends, you can watch one of the most popular
e-sport games on an AR map built by Grab Games.
And with Lego, my favorite, you can build virtual models with
characters that really come to life.
All of the fun, no Lego pieces to step on.
We at Google are also making some fun things and our teams
have been working hard to create this new experience.
We call it AR Stickers.
It is built right into the camera and it is exclusive
to Pixel.
With AR Stickers, you can express yourself with playful
emojis like these fun food characters here.
There are lots to choose from and more will be added
over time.
Our partners are also making some fun stickers here and I
want to show some of them to you right now.
So let's start with the Netflix original, my favorite show,
Stranger Things.
I can't wait for season two.
Ready to see it?
Okay. Time to set the mood.
Lights please.
Okay. Here is Leah.
Using the Pixel camera, we are looking at the AR Stickers.
We are in the Stranger Things collection, yes?
Okay. So let's pick a character.
Let's pick Demogorgon.
And place him right next to me.
Wow. Okay.
You need to have a cup of coffee, dude.
Notice how the creature is not just sitting around where you
put it, right?
It's not a dead sticker.
It's moving.
It's reacting to what's around it.
It is alive.
We really paid attention to the motion, the physics,
the lighting.
We want to make it feel like it's actually here.
Is it?
Let's have some fun.
This is a great photo or video by the way.
Cheese. But let's have some fun here.
Leah, can we drop Eleven into this?
Alright. Okay.
This is getting interesting.
Now we have two characters in the scene and notice how they
are not just reacting to the environment around them.
They are reacting to each other.
They know each other's presence.
That's a big reaction.
These interactions make for really interesting moments here.
Wow. That was fun.
Thank you, Leah.
Now you can be the director of all kinds of stories and share
these with your friends.
You will have more AR Stickers to play with, your favorite
characters from SNL, YouTube, the NBA, and just in time for
Episode VIII, Star Wars.
And these stickers are easy to use and look great, and that's
because the Pixel camera is specially calibrated for
augmented reality.
It allows robust tracking even in low light conditions.
It also supports sixty frames per second rendering of
AR content.
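ARCore itself is a Java/Kotlin SDK, so the following is only a conceptual Python sketch (using numpy) of what calibrated tracking buys you: the camera pose is re-estimated every frame, and an anchored sticker is re-projected from its fixed position in the world within the roughly 16.7 millisecond budget that sixty frames per second allows. None of this is the actual ARCore API.
    # Conceptual per-frame AR loop: track the camera pose, keep the sticker
    # anchored to a fixed world point, and redraw it every frame.
    import numpy as np

    FRAME_BUDGET_S = 1.0 / 60.0   # about 16.7 ms per frame at 60 fps
    print(f"frame budget: {FRAME_BUDGET_S * 1000:.1f} ms")

    def world_to_camera(camera_pose, world_point):
        """Transform a 3-D world point into camera coordinates.

        camera_pose is a 4x4 matrix mapping camera coordinates to world
        coordinates, so we invert it to go the other way.
        """
        p = np.append(world_point, 1.0)            # homogeneous coordinates
        return (np.linalg.inv(camera_pose) @ p)[:3]

    # The sticker is anchored to one fixed point in the world...
    anchor_world = np.array([0.0, 0.0, -1.5])      # 1.5 m in front of the start pose

    # ...so as the tracked camera moves, the on-screen position changes while
    # the world position stays put, which is what keeps the sticker "in place".
    camera_pose = np.eye(4)
    for step in range(3):
        camera_pose[0, 3] = 0.1 * step             # camera slides 10 cm per frame
        print(step, world_to_camera(camera_pose, anchor_world))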
You saw how the camera can help us do more with what we see with
Google Lens.
You saw how the camera can bridge the physical and digital
with AR.
But the Pixel camera also takes awesome photos.
So to tell you all about it, we'll have Mario back on stage.
>> MARIO QUEIROZ: Thanks Aparna.
We hear every day from users of the first Pixel phone that the
camera is one of their favorite features.
You can see why in the amazing photos that people post with the
Team Pixel hashtag.
Last year Pixel received a DxO score of eighty-nine.
That was the highest rating that DxOMark had ever issued to a
smart phone.
With Pixel 2, we have reimagined smartphone photography through
great hardware and unparalleled imaging software, tuned by our
engineers and researchers.
Today we're proud to announce that DxOMark has issued Pixel 2
an unprecedented score of ninety-eight.
That is the highest score of any smartphone camera.
Let's talk about how we achieved that and it starts with
great hardware.
The Pixel 2 has a 12-megapixel, f/1.8 aperture rear camera with
optical image stabilization.
We have integrated these components with our HDR plus
computational photography.
For every photo, we take a burst of shots with short exposure
times to avoid blowing out the highlights.
We then align and combine each pixel algorithmically for
optimal color and contrast.
All of this happens instantly with zero shutter lag.
The results are stunning, with high dynamic range even in
challenging low light conditions.
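As a heavily simplified sketch of that align-and-merge idea (real HDR+ uses robust tile-based alignment and far more sophisticated merging and tone mapping than this), averaging a burst of short exposures cuts noise while the short exposures themselves keep highlights from clipping:
    # Simplified illustration of burst merging with numpy; synthetic data only.
    import numpy as np

    def merge_burst(frames):
        """Average an aligned burst of short-exposure frames to cut noise."""
        return np.stack(frames).astype(np.float32).mean(axis=0)

    def simple_tone_map(merged, gain=4.0):
        """Brighten the merged short-exposure result without clipping highlights."""
        return np.clip(merged * gain, 0.0, 255.0)

    # Synthetic example: a dim, noisy scene captured eight times.
    rng = np.random.default_rng(0)
    scene = np.full((4, 4), 40.0)                  # true (dim) brightness
    burst = [scene + rng.normal(0, 10, scene.shape) for _ in range(8)]

    merged = merge_burst(burst)
    print("single-frame noise:", np.std(burst[0] - scene).round(1))
    print("merged noise:      ", np.std(merged - scene).round(1))
    print("tone-mapped pixel: ", simple_tone_map(merged)[0, 0].round(1))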
This year, we've increased the dynamic range and improved
texture and sharpness.
Combined with a new autofocus mechanism and optical image
stabilization to reduce blur from shaky hands, you get amazing
shots every time, like this and this.
Pixel 2 also brings you portrait mode but we're implementing it a
little bit differently.
We're applying Google's unmatched computational
photography and machine learning.
Portrait mode lets you take professional looking shots with
a blurred background to make the foreground pop like what you
might capture with an SLR camera.
Other smartphones do this by combining the images from
two cameras.
Pixel 2 does portrait mode with just one camera and
machine learning.
What is unique about Pixel 2 is that it can generate a true
depth map with a single lens so you get beautiful portrait shots
without needing a second camera.
The way this works is that the Pixel 2 camera includes dual
pixel sensor technology.
This means that each pixel in an image contains a right and a
left view.
The difference in perspective from those pixels combined with
machine learning models trained on almost a million photos means
that this works on really hard cases like a busy,
colorful background.
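As a rough, purely geometric illustration of why dual pixels give depth (the real pipeline also relies on trained models, so treat this as intuition rather than the actual algorithm): the left and right sub-images are offset slightly more for near objects than for far ones, and that local shift, the disparity, can be estimated directly.
    # Illustrative disparity estimate between left and right sub-images.
    import numpy as np

    def best_shift(left_row, right_row, max_shift=3):
        """Find the integer shift that best aligns two 1-D signals."""
        errors = []
        for s in range(-max_shift, max_shift + 1):
            diff = left_row - np.roll(right_row, s)
            errors.append((np.mean(diff ** 2), s))
        return min(errors)[1]

    # Synthetic example: a "near" edge shifted by 2 pixels between the views,
    # and a "far" edge not shifted at all.
    left = np.zeros(32);  left[10] = 1.0;  left[25] = 1.0
    right = np.zeros(32); right[8] = 1.0;  right[25] = 1.0

    print("foreground disparity:", best_shift(left[:20], right[:20]))  # 2 -> closer
    print("background disparity:", best_shift(left[20:], right[20:]))  # 0 -> farther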
And yes, this does work with objects too.
And there is more.
Portrait mode doesn't just work on the main camera.
Because of the quality of Google's computational
photography with a single camera, you can also take
portrait selfies through the front camera on both the Pixel 2
and the Pixel 2 XL.
The Pixel 2 takes amazing videos too.
We're applying optical image stabilization and electronic video
stabilization at the same time.
Other smartphones do optical or digital stabilization, but they
don't do both at the same time.
During video recording on a Pixel 2, the OIS hardware
actively corrects shake artifacts while at the same
time, the software based video stabilization uses intelligent
frame look ahead to stabilize the overall video.
We call this fused video stabilization and it is another
example of Google's hardware plus software plus AI working
together. The result is less motion blur even in low light.
Your videos look smooth and professional.
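Here is a simplified sketch of just the electronic half of that idea (the real system also folds in the OIS hardware data and performs full image warps, so this only illustrates the frame look-ahead): estimate the camera's path, smooth it with a window that includes upcoming frames, and shift each frame onto the smoothed path.
    # Frame look-ahead stabilization sketch on a synthetic 1-D camera path.
    def smooth_path(path, lookahead=5):
        """Smooth a 1-D camera trajectory with a centered window of +/- lookahead."""
        smoothed = []
        for i in range(len(path)):
            window = path[max(0, i - lookahead): i + lookahead + 1]
            smoothed.append(sum(window) / len(window))
        return smoothed

    def stabilizing_shifts(path, lookahead=5):
        """How far to shift each frame so it lands on the smoothed path."""
        return [s - p for p, s in zip(path, smooth_path(path, lookahead))]

    # Jittery horizontal drift, in pixels, accumulated per frame.
    jittery = [0, 3, 1, 4, 2, 6, 3, 7, 5, 9, 6, 10]
    for frame, shift in enumerate(stabilizing_shifts(jittery)):
        print(f"frame {frame:2d}: shift by {shift:+.1f} px")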
The Pixel 2 camera also supports a feature called motion photos.
For the photos you take, your Pixel captures up to three
seconds of video behind the
scenes, removing uninteresting motion and applying our unique
fused video stabilization technology to the result.
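How the uninteresting motion gets removed isn't visible from the outside, but conceptually it amounts to scoring frames and keeping only the span where something is happening; in this hypothetical sketch the scores stand in for whatever motion and quality analysis the camera actually runs.
    # Hypothetical sketch of trimming a motion photo to its interesting span.
    def trim_motion_photo(frame_scores, threshold=0.3):
        """Return (start, end) indices of the span of frames worth keeping."""
        keep = [i for i, score in enumerate(frame_scores) if score >= threshold]
        if not keep:
            return None                    # nothing interesting; keep the still only
        return keep[0], keep[-1] + 1

    # Example: 90 frames (~3 s at 30 fps); motion only in the middle second.
    scores = [0.05] * 30 + [0.8] * 30 + [0.05] * 30
    print(trim_motion_photo(scores))       # -> (30, 60)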
But of course, the proof is in the photos and videos.
So we invited some photographers and producers to test drive the
new camera and they were stunned by what they can do with the
Pixel 2.
Timmy McGurr, better known as 13th Witness on Instagram, and
filmmaker Zack McTee documented a recent trip to New Orleans
using just the Pixel 2.
The film you are about to see uses no attachments, fancy rigs,
or lighting.
All of the video footage and stills were captured on Pixel 2
phones without any image enhancements or
color corrections.
Again, each photo and video in what you just saw was shot using
only Pixel 2 phones, no image enhancements or
color corrections.
Now the Pixel 2's camera is simple enough and smart enough
that your photos and videos can look just as good.
So the Pixel 2's photo experience doesn't end when you
take a photo or shoot a video.
We want users to be able to easily find them and store them
securely without having to worry about running out of storage.
After all, simple storage and search is part of what makes
us Google.
So we're happy to announce that Pixel 2 users will continue to
get free, unlimited storage for all of the photos and videos
they capture in the highest resolution, including 4K videos
and motion photos.
This is a big deal.
Pixel users take twice as many photos as typical iPhone users
and store an average of 23 gigabytes of photos and videos
per year in Google's Cloud.
If you had to use iCloud, you'd reach your free limit in less
than three months.
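That claim checks out arithmetically if you assume iCloud's 5 gigabyte free tier and uploads spread evenly across the year:
    gb_per_year = 23
    free_tier_gb = 5                        # assumed iCloud free allowance
    months_to_fill = free_tier_gb / (gb_per_year / 12)
    print(f"{months_to_fill:.1f} months")   # about 2.6 months, under three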
With Pixel, you can safely and effortlessly store all of your
memories for free.
With your permission, Google Photos automatically tags and
categorizes the thousands of photos and videos you will take
with your Pixel 2.
You can search for photos of your friends, of a specific
location, of dogs, all without the hassle of manually tagging
or sorting.
Finding the photo you're looking for is our problem,
not yours.
We're delighted by what we have been able to achieve with Pixel
2 in terms of photography and are so excited by all of the
moments that you are going to capture.
So to recap, the smart and simple Pixel 2 has everything
you'd expect from a smartphone and some things you might not.
A gorgeous display that brings your apps, photos, and videos
to life.
An all-day battery that charges superfast, plus water resistance
and security features to keep your phone and your data safe.
Easy, quick access to the Google Assistant which understands you
and gets smarter and more helpful all the time.
Fun and useful ways to use your camera like Google Lens and AR.
And the best photography from any phone for brilliant photos
and smooth, professional videos.
We're also making it easier than ever to switch to Pixel.
Most new Pixel users will be able to transfer their stuff
from their old phone in less than ten minutes, including all
of their photos, apps, and even their iMessages.
So let's talk about availability and pricing.
Pixel 2, with its five inch cinematic display starts at $649
for the 64 gigabyte version.
The Pixel 2 XL has a six inch full screen POLED display and
starts at $849.
Both phones will be available in six countries, Australia,
Canada, Germany, India, the UK, and the US.
In the US, we're teaming up with Verizon to bring you Pixel 2 and
you can also buy it on the Google Store and through
Project Fi.
You can preorder starting today.
We're also excited to make the Pixel 2 XL available in Italy,
Singapore, and Spain later this year.
And we have a new family of cases that make your Pixel
truly yours.
Pixel owners can customize a live case with a favorite photo,
place, artwork, or one of our Google Earth designs.
We're also launching a range of soft knit fabric cases in
four colors.
And more than two hundred and fifty Pixel 2 accessories from
twenty-five popular brands will be made available through our
Made for Google program.
Finally, to make sure you're getting the most out of your new
phone and the Google Assistant, we're including a Google Home
Mini with each purchase of a Pixel 2 and Pixel 2 XL for a
limited time in Australia, Canada, Germany, the UK, and
the US.
So that's the new Pixel, the radically helpful smartphone
that combines the best of Google hardware, software, and AI.
We hope you'll love it as much as we do.
>> JUSTON PAYNE: Alright.
So as you've seen, we've spent the past year reimagining a
simpler, more helpful smartphone experience.
We've also been working on some new products to make your Pixel
2 experience even better.
Let's start with the newly updated Google Daydream View,
our virtual reality headset.
We kept everything from last year that people loved and we
made it even better.
To start, we have a new lineup of colors that complement this
year's Made by Google family.
We also upgraded the lenses and the fabrics so the headset has
our widest field of view yet and is super light and comfortable.
Of course, what matters most in a VR headset is where it can
take you.
Last year we launched with twenty-five apps and games and
now we have more than two hundred fifty high quality
VR titles.
We've also seen that people love going places with VR videos so
we're excited to announce a slate of premium video content
coming to Daydream, including multiple new original series on
YouTube VR.
And on Google Play Movies, you can experience new IMAX movies
available for free to Pixel 2 owners.
So you get to hang out with tigers, run alongside elephants,
dive into the ocean with some sharks, and blast into space.
With Pixel 2's new front-firing stereo speakers, these movies
sound as amazing as they look.
And now you can even cast the experience to the TV so your
friends and family can see what you're seeing.
The new Google Daydream View is the best mobile VR headset on
the market and will be available for $99.
Today we're also announcing a smart new audio accessory that
works perfectly with Pixel 2, our first pair of premium
wireless headphones.
They're called Google Pixel Buds.
Our team designed Pixel Buds for great sound, delivering clear
highs and deep bass, so whether it's jazz, house, or a symphony,
it all sounds incredible.
With Pixel Buds, controlling your music is super simple
because all of the controls are built into the right earbud.
You simply tap to start and stop your music or swipe forwards and
backwards to change the volume.
Now what's amazing is when you pair your Pixel Buds with your
Pixel 2, you get instant access to the Google Assistant.
We optimize the experience to be quick and easy.
Just touch and hold the ear bud and ask your Assistant to play
music, send a text, or give walking directions all while
keeping your phone in your pocket.
It can also alert you to new notifications and read
your messages.
Pixel Buds work great with Pixel.
They even give you access to a new real-time
translation experience.
It's an incredible application of Google Translate powered by
machine learning that is like having a personal translator by
your side.
To show you how easy this experience is, I would like to
invite Isabelle back on stage for a conversation in her native
language, Swedish.
So Isabelle is going to speak Swedish into her Pixel Buds and
I'll hear the English translation out of Pixel 2's
front speakers.
And then I'll respond in English and she'll hear the Swedish
translation in her Pixel Buds.
To illustrate this today, you'll hear both sides of
the conversation.
Hey, Isabelle.
How's it going?
>> GOOGLE ASSISTANT: Speaking Swedish
>> ISABELLE OLSSON: Speaking Swedish
>> GOOGLE ASSISTANT: Absolutely okay.
Thank you.
>> JUSTON PAYNE: What do you think of these cool
new headphones?
>> GOOGLE ASSISTANT: Speaking Swedish
>> ISABELLE OLSSON: Speaking Swedish
>> GOOGLE ASSISTANT: My team designed them so I think they're
pretty cool.
>> JUSTON PAYNE: Alright.
So while you're up here, what color should I get?
>> GOOGLE ASSISTANT: Speaking Swedish
>> ISABELLE OLSSON: Speaking Swedish
>> GOOGLE ASSISTANT: I think kind of blue would suit you.
>> JUSTON PAYNE: Alright.
>> GOOGLE ASSISTANT: Speaking Swedish
>> ISABELLE OLSSON: Speaking Swedish
>> GOOGLE ASSISTANT: I think I should go now.
The audience would like to see some more new stuff.
>> JUSTON PAYNE: Thank you, Isabelle.
So with Pixel Buds, I can use real-time Google Translate to
have a natural conversation in forty languages.
We're letting you connect with the world around you in a more
natural way by rethinking how headphones should work,
connecting them to cloud-based machine learning, and giving you
access with the touch of a finger.
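Conceptually, each turn of that conversation is a three-step round trip: transcribe speech in the source language, translate the text, and speak the result in the target language. The sketch below uses hypothetical stand-in functions rather than the real speech and Google Translate services.
    # Hypothetical stand-ins for the real speech and translation services.
    PHRASEBOOK = {("sv", "en"): {"hej": "hello"},
                  ("en", "sv"): {"hello": "hej"}}

    def transcribe(audio, language):
        return audio                       # pretend the audio is already text

    def translate(text, source, target):
        return PHRASEBOOK[(source, target)].get(text, text)

    def speak(text, language, device):
        print(f"[{device} says in {language}]: {text}")

    def relay(audio, source, target, out_device):
        """One direction of the conversation: hear, translate, speak."""
        text = transcribe(audio, source)
        speak(translate(text, source, target), target, out_device)

    relay("hej", source="sv", target="en", out_device="phone speaker")  # Swedish in, English out
    relay("hello", source="en", target="sv", out_device="earbuds")      # English in, Swedish out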
Google Pixel Buds come in a pocket-sized charging case and
provide up to five hours of listening time on just
one charge.
And the case holds multiple charges so you get up to
twenty-four hours of listening time when you're on the go.
Pixel Buds are available in clearly white, kind of blue, and
just black to beautifully complement your Pixel 2.
And they will be available in November for $159 and preorders
begin today.
We have one more product to show you.
It's about photography.
Now we all love photos.
A lot of our photos let us step back into a moment with friends,
with family.
Some of our favorites are the candid ones like this, that
capture the essence of a moment.
But the problem is to get those photos, someone needs to be the
photographer on duty waiting to press the shutter button at just
the right moment.
And we were thinking, how do you capture those fleeting,
spontaneous moments while letting yourself be part of
the moment?
So we've been working on that.
And I am really excited to share an entirely new type of camera
that works with your Pixel.
We call it Google Clips.
And this first edition is specifically designed with
parents and pet owners in mind.
So you'll capture more of those spontaneous smiles, first steps,
and first tricks.
Now Clips has all of the things you would expect from a great
camera: a high-performance lens, a powerful image sensor.
It has a shutter button.
But that's not what this camera is all about.
We reimagined this camera from the inside out, starting with
the software and applying machine learning to build a
camera that takes photos for you so you can enjoy the moment and
instantly get shots you couldn't get before.
Thank you.
How does all of that work?
It starts with an AI engine at the core of the camera.
Let's talk about what that engine does.
When you're behind a camera, you look for people you care about.
You look for smiles.
You look for that moment your dog starts chasing her tail.
Clips does all of that for you.
Turn it on and it captures the moment so you can be in
the moment.
And because the software is at the core of the camera, it's
going to get smarter over time.
We also reimagined what camera hardware should do for you.
Clips is tiny.
It weighs almost nothing so you can easily toss it into
your pocket.
And it's a hands-free camera so you can get these amazing new
perspectives on your experience and get yourself in the shot by
attaching Clips to almost anything or setting it down.
Now from the beginning, we knew privacy and control really
matter and we've been thoughtful about this for Clips users,
their family, and their friends.
Let's start with the basics.
It looks like a camera and it has an indicator light so
everyone knows what the device does.
Next, it looks for stable, clear shots of people you know and you
help the camera learn who is important to you.
And finally, all of the machine learning happens on the device
itself so just like any point and shoot, nothing leaves your
device until you decide to save and share it.
This approach to designing and engineering Clips required some
significant technical breakthroughs to miniaturize the
type of powerful machine learning that only a few years
ago needed a super computer.
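As a minimal sketch of that on-device idea, assuming a stand-in scoring heuristic in place of the real miniaturized models: score each candidate clip locally for familiar faces, smiles, and steady framing, and keep only the best few, with nothing ever uploaded automatically.
    # Hypothetical on-device capture loop: score clips locally, keep the best.
    import heapq

    def score_clip(clip):
        """Stand-in scorer favoring known faces, smiles, and steady framing."""
        return (2.0 * clip["known_faces"]
                + 1.5 * clip["smiles"]
                + 1.0 * clip["stability"])

    def capture_best(candidates, keep=3):
        """Keep only the top-scoring clips; nothing leaves the device."""
        return heapq.nlargest(keep, candidates, key=score_clip)

    candidates = [
        {"id": "dog chasing tail",  "known_faces": 0, "smiles": 0, "stability": 0.9},
        {"id": "kid's first steps", "known_faces": 1, "smiles": 1, "stability": 0.8},
        {"id": "blurry ceiling",    "known_faces": 0, "smiles": 0, "stability": 0.1},
    ]
    for clip in capture_best(candidates, keep=2):
        print(clip["id"])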
Let's take a look at what Clips actually captures.
So here are some Clips I've captured with my family at home.
Now as you can see, it's not just photos that Clips
is taking.
It's taking little snippets that capture the whole moment.
We call those, of course, Clips.
You can save them as a motion photo or choose any frame to
save as a high resolution still.
It's like having my own photographer shooting and
choosing the best moments for me.
And I just have to swipe to save to Google Photos which can
automatically make movies out of your clips in seconds.
So that's the new Google Clips, a new type of camera that brings
together the best of Google AI, software, and hardware to help
people more easily capture genuine, candid moments of
family, friends, and pets.
Clips will be coming soon and selling for $249.
Take a look at some of these great Clips from our earliest
users. Enjoy.
>> RICK OSTERLOH: Hello again everyone.
What you just saw in Pixel Buds and Clips blends AI with the
best of Google software and high performance hardware to create
new, radically helpful experiences for users.
This is our strategy for the Google hardware team, and you've
seen it come to life in all of our new products: the two newest
members of the Google Home family, the beautiful new Pixel
phones, and our premium Pixelbook laptop.
And of course, the Google Assistant is the primary way we
bring AI to our users.
Your personal Google is ready to get you the help you need when
you need it.
Talk to it, type, show it what you see, or just give your phone
a squeeze.
It's the simplest way to get things done and it just works.
So here it is, this year's Made by Google Family.
We've created them as a single set of products that are
designed for you with familiar elements, from soft materials to
curved lines, and simple, stylish colors.
They look great together and they also work great together
and perfectly demonstrate our approach of reimagining hardware
from the inside out.
Thanks so much for joining us today.
For our livestream viewers, take a closer look at our new
products on our redesigned Google Store.
For everyone here in SFJAZZ, you'll get to see everything up
close in our demo area outside.
Thank you so much and enjoy your day.