[MUSIC PLAYING]
MAGNUS HYTTSTEN: Welcome to release 7.8
of Google Play Services.
You are here because you want to build better apps,
and that's exactly what Google Play
Services is all about-- allowing you to build better apps.
Let's get started with the highlights in this release,
because we are excited to present some really
cool new stuff.
Let's start it off with-- yeah, that's right.
We have a new API.
Let's welcome the Mobile Vision API with a round of applause.
[APPLAUSE]
The new Vision API enables you to build really amazing things.
One part of it is the Faces API that
allows you to detect faces and different characteristics
of faces in images or even in real time video.
So here, for example, are some of my friends working
at Google.
And as you can see, the location of all of their faces
has been detected by the Faces API.
And faces can be detected at different angles, too.
A face doesn't have to be oriented straight ahead.
It can, for example, be tilted and turned
and all the other possible positions
that you can see here, but that I don't have time
to perform with my own head.
And once a face has been detected,
you can retrieve the different landmarks
in the face-- for example, the location
of the eyes, the mouth, the cheeks, the nose
base, and even the ear tips.
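To make this concrete, here is a minimal sketch of still-image face and landmark detection, assuming a context and a bitmap supplied by the caller (the detectLandmarks method name is just illustrative):

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.PointF;
import android.util.SparseArray;
import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;
import com.google.android.gms.vision.face.Landmark;

void detectLandmarks(Context context, Bitmap bitmap) {
    // Build a detector that reports all facial landmarks.
    FaceDetector detector = new FaceDetector.Builder(context)
            .setLandmarkType(FaceDetector.ALL_LANDMARKS)
            .build();

    // Wrap the still image in a Frame and run detection.
    SparseArray<Face> faces = detector.detect(
            new Frame.Builder().setBitmap(bitmap).build());

    for (int i = 0; i < faces.size(); i++) {
        for (Landmark landmark : faces.valueAt(i).getLandmarks()) {
            PointF position = landmark.getPosition();  // image coordinates
            if (landmark.getType() == Landmark.NOSE_BASE) {
                // position is the nose base; the eyes, mouth, cheeks,
                // and ear tips have their own Landmark constants.
            }
        }
    }
    detector.release();  // free the underlying native resources
}
```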
How cool is that?
And believe it or not, you get even more,
and this one is my favorite.
You can detect whether certain facial characteristics
are present using classifications-- for example,
if an eye is closed or open, or how
much the face is smiling.
For example, this face here would have a smiling probability
close to zero, but this one would be close to one.
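Classifications can be read in much the same way; a minimal sketch, assuming the same imports as the landmark example above:

```java
void readClassifications(Context context, Bitmap bitmap) {
    FaceDetector detector = new FaceDetector.Builder(context)
            .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
            .build();
    SparseArray<Face> faces = detector.detect(
            new Frame.Builder().setBitmap(bitmap).build());

    for (int i = 0; i < faces.size(); i++) {
        Face face = faces.valueAt(i);
        // Each value is a probability in [0, 1], or
        // Face.UNCOMPUTED_PROBABILITY if it could not be computed.
        float smiling = face.getIsSmilingProbability();
        float leftEyeOpen = face.getIsLeftEyeOpenProbability();
        float rightEyeOpen = face.getIsRightEyeOpenProbability();
    }
    detector.release();
}
```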
And remember, since the Faces API
can detect these things in real time video,
you can create really cool apps.
Let's look at one in real life.
So here I am, sitting on the sofa,
trying out an app that uses the Faces
API to detect if my eyes are open and whether I'm smiling
or not.
Let's try it out, starting with eyes closed and not smiling.
FACES API: I detect that your eyes are closed.
Please open your eyes.
That's beautiful.
You should always keep your eyes open.
Now wipe that grumpy grin off your face and smile instead.
Lovely.
Smiling is good for you.
You should always smile.
MAGNUS HYTTSTEN: Amazing.
Now, you may think that was it for the Vision API,
but it's not.
You get even more, because in addition to the Faces API,
the Vision API also has a Barcode API.
So pretty much in the same way that you can detect faces
in images or videos, you can also detect barcodes.
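A minimal sketch of the barcode path, which mirrors the face example; the formats passed to the builder are just an illustrative subset:

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.util.SparseArray;
import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.barcode.Barcode;
import com.google.android.gms.vision.barcode.BarcodeDetector;

void detectBarcodes(Context context, Bitmap bitmap) {
    // Restrict detection to the formats you expect, or use Barcode.ALL_FORMATS.
    BarcodeDetector detector = new BarcodeDetector.Builder(context)
            .setBarcodeFormats(Barcode.QR_CODE | Barcode.EAN_13)
            .build();
    SparseArray<Barcode> barcodes = detector.detect(
            new Frame.Builder().setBitmap(bitmap).build());
    for (int i = 0; i < barcodes.size(); i++) {
        String value = barcodes.valueAt(i).rawValue;  // decoded payload
    }
    detector.release();
}
```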
So go out there and create some great apps with the Vision API.
And remember, it has both a Faces API, as well as a Barcode
API.
And it can detect things in images as well as
in real time video.
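For the real-time video case, a minimal sketch pairs a FaceDetector with the Mobile Vision CameraSource; the tracker, the 0.5 threshold, and the method name are illustrative, and permission and lifecycle handling are omitted:

```java
import java.io.IOException;
import android.content.Context;
import com.google.android.gms.vision.CameraSource;
import com.google.android.gms.vision.Detector;
import com.google.android.gms.vision.Tracker;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;
import com.google.android.gms.vision.face.LargestFaceFocusingProcessor;

void startLiveDetection(Context context) throws IOException {
    FaceDetector detector = new FaceDetector.Builder(context)
            .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
            .build();

    // Track the most prominent face and react as each frame is processed.
    detector.setProcessor(new LargestFaceFocusingProcessor(
            detector, new Tracker<Face>() {
                @Override
                public void onUpdate(Detector.Detections<Face> detections, Face face) {
                    boolean smiling = face.getIsSmilingProbability() > 0.5f;
                }
            }));

    // Requires the CAMERA permission; start() begins delivering frames.
    CameraSource cameraSource = new CameraSource.Builder(context, detector)
            .setFacing(CameraSource.CAMERA_FACING_FRONT)
            .build();
    cameraSource.start();
}
```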
In this release, we also have announcements for Google Cloud
Messaging, where we have expanded notifications
to support localization.
So you can now construct notification messages
that will be displayed using the locale of the device it
is sent to.
And to get ready for the Android M release,
we've added high and normal priority to GCM messaging.
This allows you to mark messages that need immediate attention
as high priority-- for example, a chat message
alert or an incoming voice call alert.
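A minimal sketch of a downstream GCM payload combining both announcements; promo_title and promo_body are hypothetical string-resource names defined by the receiving app, which the device resolves in its own locale:

```java
import org.json.JSONException;
import org.json.JSONObject;

// Builds the JSON body for a downstream GCM send request.
JSONObject buildMessage(String registrationToken) throws JSONException {
    JSONObject notification = new JSONObject()
            // The receiving app defines these string resources; the device
            // displays them in its own locale.
            .put("title_loc_key", "promo_title")
            .put("body_loc_key", "promo_body");
    return new JSONObject()
            .put("to", registrationToken)
            .put("priority", "high")  // "high" for urgent, "normal" otherwise
            .put("notification", notification);
}
```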
This brings us to-- yeah, that's right.
It's time to announce another new API in this release.
Let's welcome the Nearby Messages API
with a round of applause.
[APPLAUSE]
Nearby Messages is a cross-platform API
to find and communicate with mobile devices and beacons
based on proximity.
You may recall that in a previous version,
we introduced a Nearby Connections
API that lets you discover other devices
and create connections so they
can communicate in real time.
It's great for local multiplayer gaming,
where players can use their own devices
as remote controls connected to a central game server.
And now we're extending the Nearby API
with the introduction of Nearby Messages.
Nearby Messages allows your users
to find devices and share messages with them
through a Publish and Subscribe API.
This allows you to build apps with rich local interactions
between devices.
For example, to collaborate on content, form a group,
vote on something, or broadcast a resource.
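A minimal sketch of that publish-and-subscribe flow, assuming a GoogleApiClient built with Nearby.MESSAGES_API and already connected; the option-42 payload is illustrative:

```java
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.nearby.Nearby;
import com.google.android.gms.nearby.messages.Message;
import com.google.android.gms.nearby.messages.MessageListener;

void shareVote(GoogleApiClient googleApiClient) {
    // Publish a small payload that nearby subscribers can discover.
    Nearby.Messages.publish(googleApiClient, new Message("option-42".getBytes()));

    // Subscribe to messages published by nearby devices.
    Nearby.Messages.subscribe(googleApiClient, new MessageListener() {
        @Override
        public void onFound(Message message) {
            String payload = new String(message.getContent());
            // e.g., tally the vote carried in the payload
        }
    });
}
```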
So how does Nearby Messages send these messages?
Well, first of all, it can use Bluetooth or Wi-Fi signals
to connect and exchange data.
But it goes even beyond that.
Ultrasonic sounds can also be used to transmit messages.
That's right.
Sounds that we humans cannot hear and that carry
the information.
That's pretty, pretty amazing stuff.
And there's even more.
This API is also used to pick up information from beacons.
So beacons are these devices you can put up that broadcast
information through the Nearby Messages API.
And any device that is close by can then pick this information
up and act on it.
And that's it for this release of Google Play Services.
But be sure to check out these resources as well.
Now you have some work to do to use these fantastic
APIs to build better apps.
That's right.
Go out there and create some great apps.
And don't forget to tell us all about it.
[MUSIC PLAYING]