MALE SPEAKER 1: A data centre's the brains of the Internet.
MALE SPEAKER 2: The engine of the Internet.
FEMALE SPEAKER 1: It is a giant building with a lot of power,
a lot of cooling and a lot of computers.
MALE SPEAKER 3: It's row, upon row, upon row of machines,
all working together to provide the services that
make Google function.
JOE KAVA: I love building and operating data centres.
I'm Joe Kava, Vice-President of Data Centres at Google.
I'm responsible for managing the teams globally that design,
build and operate Google's data centres.
We're also responsible for the environmental health
and safety, sustainability and carbon offsets for our data
centres.
This data centre, here in South Carolina,
is one node in a larger network of data centres
all over the world.
Of all the employees at Google, only a very,
very small percentage are authorised
to even enter a data centre campus.
The men and women who run these data
centres and keep them up 24 hours a day, seven days a week,
they are incredibly passionate about what they're doing.
MALE SPEAKER 2: In layman's terms, what do I do here?
FEMALE SPEAKER 1: I typically refer to myself
as the herder of cats.
MALE SPEAKER 4: I'm an engineer.
MALE SPEAKER 3: Hardware site operations manager.
MALE SPEAKER 2: We keep the lights on.
MALE SPEAKER 1: And we enjoy doing it.
JOE KAVA: And they work very hard,
so we like to provide them with a fun environment
where they can play hard as well.
FEMALE SPEAKER 2: We just went past the three-million-man-hour
mark for zero lost-time incidents.
Three million man-hours is a really long time,
and with the number of people we have on site, that
is an amazing accomplishment.
JOE KAVA: I think that the Google data centres really
can offer a level of security that almost no other company
can match.
We have an information security team
that is truly second to none.
You know the expression, "they wrote the book on that."
Well, there are many of our information security
team members who really have written
the books on best practices in information security.
Protecting the security and the privacy
of our users' information is our foremost design criterion.
We use progressively higher levels of security
the closer you get to the centre of the campus.
So, just to enter this campus, my badge
had to be on a pre-authorised access list.
Then, to come into the building, that
was another level of security.
To get into the secure corridor that leads to the data centre,
that's a higher level of security.
And the data centre and the networking rooms
have the highest level of security.
And the technologies that we use are different.
Like, for instance, in our highest-level areas,
we even use underfloor intrusion detection via laser
beams.
So, I'm going to demonstrate going into the secure corridor
now.
One, my badge has to be on the authorised list.
And then two, I use a biometric iris scanner
to verify that it truly is me.
OK, here we are on the data centre floor.
The first thing that I notice is that it's
a little warm in here.
It's about 80 degrees Fahrenheit.
Google runs our data centres warmer
than most because it helps with the efficiency.
You'll notice that we have overhead power distribution.
From the yard outside, we bring in the high-voltage power
and distribute it across the bus bars to all of the customised bus
taps, which are basically plugs where we plug
in all the extension cords.
Our racks don't really look like traditional server
racks.
These are custom designed and built for Google
so that we can optimise the servers
for hyper-efficiency and high-performance computing.
It's true that sometimes drives fail,
and we have to replace or upgrade them,
because maybe they're no longer efficient to run.
We have a very thorough end-to-end chain-of-custody
process for managing those drives
from the time that they're checked out from the server
until they're brought to an ultra-secure cage, where
they're erased and crushed if necessary.
So any drive that can't be verified as 100%
clean is crushed first, and then we
take it to an industrial wood chipper,
where it's shredded into little pieces like this.
In the time that I've been at Google – for almost six
and a half years now – we have changed
our cooling technologies at least five times.
Most data centres have air-conditioning units
along the perimeter walls that force cold air under the floor.
It then rises up in front of the servers
and cools the servers.
With our solution, we take the server racks
and we butt them right up against our air-conditioning unit.
We just use cool water flowing through those copper
coils that you see there.
So the hot air from the servers is contained in that hot aisle.
It rises up, passes across those coils,
where the heat from the air transfers
to the water in those coils, and
that warm water is then brought outside the data centre
to our cooling plant, where it is cooled down
through our cooling towers and returned
to the data centre.
And that process is just repeated over and over again.
JOE KAVA: The thing that amazes me about Google and the data
centres is the pace of innovation,
and how we're always challenging the way we're doing things.
So, when people say that innovation in a certain area
is over, that we've kind of reached the pinnacle of what
can be achieved, I just laugh.
[MUSIC PLAYING]