Hello? Hello, everybody. How is the day going? We have to do that again: how is the day going? HTTP headers for the responsible developer. Let me tell you about my journey
on the web. I was one of these lucky people that had an Internet connection at home very
early because my father was a tech-savvy person. But in the beginning, I had no idea what to do with the Internet; I could Google or surf, that was it. I was whatever, 12 years old
and it was not that exciting. But then I discovered this website, woot.com. Any Germans around?
That was one of the first social networks in Germany. And there I was, 1999. This is
where I'm coming from, the middle of nowhere in Germany. And I found myself chatting to
people in Berlin, mainly about music. This is when I found out for the first time that
the web actually connects people. It's not about Googling stuff, it's about connecting
people. Then I moved to Berlin and I had a different
job initially. In 2010 something interesting happened, I became a web developer. Which
was cool. But then the statement that the web connects people actually changed. We as
developers have a lot of power. And I 100% believe that we connect people. Because we
are building stuff for the Internet. We enable people. And we help people by the
stuff that we build. So, let me quickly introduce myself. Hey, I'm Stefan. I work for a company
called Twilio. We do communications as an API, SMS and phone calls, so check that
out. And the most important thing is, hey, I'm Stefan, and I want to be a responsible
developer. So, when we look at the global population,
you will find out that in 1999, there I was in my little bubble in the north of Germany. Not much going on. But where are the most Internet users actually coming from today? They're coming from China, India and the United States. But these are just global statistics, right? But
I also run a private blog. And in the last month alone I had 300 people from Brazil reading
articles. I had 100 people from Vietnam. And I had 80 people from South Africa. That gets
me really excited, right? Who am I? I'm writing stuff and people in Brazil are reading my
stuff. But in the end that really doesn't matter because we as web developers should
be building for everybody. And when responsive web design came up, I heard statements like this way too often: we don't have users coming from a certain area; we don't have users that
use a certain device. When you say these kinds of statements, you're creating a chicken and
egg problem. When you don't build stuff that works for people under certain conditions,
these people won't use your stuff. But building a good website these days is
actually very, very hard. What do you have to do? You have to consider design. Colors
around the world are different. You have to bring up good content. You have to consider
web performance, which is a big topic by itself. You have to make your site accessible because there may be people that visit your site with assistive technology. And you have to figure out what you should actually use, because frameworks are a thing. You have to optimize the network
stack and you have to make it work on several devices. And there are many, many more factors
out there. And in the next 21 minutes I want to talk about the network stack. So, let's
talk about HTTP. So, basically, when your browser makes a request for a resource, it sends key-value pairs, and these are called request headers. The server responds with another set of key-value pairs, the response headers, plus the actual resource that you're asking for. So, we're dealing with request and response headers.
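To make that concrete, here is a minimal sketch, assuming a modern runtime with top-level await and a global fetch, and an example URL of my choosing, that requests a resource and prints the response headers it gets back:

    // Request a resource; the request headers are sent for us automatically.
    const response = await fetch("https://example.com/");

    // The response carries another set of key-value pairs next to the actual resource.
    response.headers.forEach((value, name) => {
      console.log(`${name}: ${value}`);
    });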
And when I started the preparation for this talk, dot dev domains were trending, so I had to buy one. You can see responsible.dev here, and the site right now has a third-party script running in it. You see, when I refresh it, there is some stuff going on. First of all, the script is hijacking the page, adding unicorns and stuff. And with a script like this, somebody can pretend to be my site and track what I'm doing. It's pretending to be my unicorn, but it's not my responsible unicorn, not my dev site. I wanted to make it better just using headers.
And the web is a scary place. A few months ago, this was in the news: lots of websites were mining cryptocurrency, and the developers that built these sites didn't even know about it. And the reason for that is that when we're building for the web, we're relying on other people building software. Open source is a thing, and we're also pulling in third parties from different domains. So, we always rely on others, and that's why I believe that the web has to be safe. The
biggest part of being safe is HTTPS. So, without HTTPS, what is possible is that someone can open up a public WiFi, and when you're browsing over HTTP, this person can pretend to be the WiFi, intercept your requests and mess with you, or grab the passwords that you're sending over HTTP. HTTPS also unlocks HTTP/2, service workers and media features. And in my bubble,
I think it's the standard. But there's a site out there called Why No HTTPS, and it lists many, many sites not enforcing HTTPS yet. When you go through the list, you find out that there's a massive German media outlet not enforcing HTTPS, which is surprising.
When you run on HTTPS, you want to ensure that it is always HTTPS and always a secure connection. So, what you can do is set the Strict-Transport-Security header, where you define a max-age property in seconds. This tells the browser, hey, please only load this website and these resources over HTTPS. You can define how far it should reach, for example whether subdomains are included. And if you want to go the extra mile, you can define a preload directive, which basically allows you to submit your site to a list called the HSTS preload list. And the thing about that is that browsers internally keep lists of websites that should only ever be loaded over HTTPS.
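As a minimal sketch, assuming a plain Node server, setting the header could look like this (the max-age value is one year and just an example):

    import { createServer } from "node:http";

    createServer((req, res) => {
      // Tell the browser: only ever talk to this origin over HTTPS.
      res.setHeader(
        "Strict-Transport-Security",
        "max-age=31536000; includeSubDomains; preload"
      );
      res.end("Hello, hopefully over HTTPS!");
    }).listen(8080);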
This is the configuration profile in Chromium. It's thousands of sites that will never work over plain HTTP. If you wonder why dot dev domains don't work over HTTP anymore, that's the reason: they're in this file. But HSTS is not only about security. What happens when you type an address into the browser? It makes an HTTP request first and gets redirected, which can lead to a delay of several seconds depending on the connection. With HSTS you can skip this request, because the browser already knows we're going HTTPS. That's it. So, how is the support for Strict-Transport-Security? We're pretty
green here. Pretty sweet. But enforcing HTTPS is not the easiest thing in the world when you're dealing with big projects where a lot of people are putting a lot of source code into your codebase. So how do you tackle the move to HTTPS? When you run your site over HTTPS and you make a request to an HTTP resource, it may be blocked by the browser. You can set a Content-Security-Policy header with the upgrade-insecure-requests directive, and this magically upgrades all the requests to be HTTPS and secure.
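A minimal sketch of that directive, assuming a Node-style response object:

    import type { ServerResponse } from "node:http";

    // Ask the browser to transparently upgrade http:// subresource requests to https://.
    function upgradeInsecureRequests(res: ServerResponse): void {
      res.setHeader("Content-Security-Policy", "upgrade-insecure-requests");
    }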
That is supercool. But this is not the main purpose of CSP. The main purpose of CSP is
that you want to limit what is allowed inside of your websites. And you can configure a
lot of things. So, this is the complete list of what is possible with CSP. You can define where fonts should be loaded from, where images should be loaded from. And there are a few cutting-edge things in there like disown-opener and navigate-to. You can basically trim down what is allowed in your website, and you can avoid mining cryptocurrency in your website just because a third party got hacked. You can use CSP with a meta element in your HTML, or you set a header like this. This is the header I ship for my private website, and it includes all the third parties that I use. And coming up with this is actually very, very hard. I deployed CSP three times, broke my site and rolled back. So, what you want to do is set a different header, Content-Security-Policy-Report-Only, which allows you to define an HTTP endpoint where you then get warnings about all the requests that would be blocked if the policy were active. And you can start monitoring what is going on.
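Here is a rough sketch of that setup, assuming a Node server and a /csp-reports endpoint that I made up for this example:

    import { createServer } from "node:http";

    createServer((req, res) => {
      if (req.method === "POST" && req.url === "/csp-reports") {
        // Collect violation reports instead of breaking the page.
        let body = "";
        req.on("data", (chunk) => (body += chunk));
        req.on("end", () => {
          console.log("CSP violation report:", body);
          res.statusCode = 204;
          res.end();
        });
        return;
      }

      // Report-Only: nothing is blocked yet, the browser only reports what *would* be blocked.
      res.setHeader(
        "Content-Security-Policy-Report-Only",
        "default-src 'self'; report-uri /csp-reports"
      );
      res.end("<h1>Monitoring CSP violations</h1>");
    }).listen(8080);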
When you have a detailed look at my policy, you will find out there is something not right. I have unsafe-inline and unsafe-eval right there, and it bothers me. But I'm using a JavaScript framework that inlines JSON into the body so that the framework on the client side gets its state, and I'm working on fixing that. It's not the easiest when dealing with
a framework. The way this should work is that for inline scripts, when you have CSP enabled, you have two options. You can either define a hash value in your CSP directive, so that you can say, hey, this exact script content is allowed in my website. This is a little bit brittle, though, because when you update the contents of the script, you have to change the header or your meta element as well. Or you can define a nonce value, giving the script an ID and saying, hey, this part is cool, please allow this. So, what about support for CSP? CSP is out there in multiple levels. Level one is all green. Pretty cool. The cutting-edge and more fancy stuff is a little bit jumpy still.
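Going back to the nonce approach for a second, here is a minimal sketch of how it could work on a Node server; the nonce is generated fresh for every response:

    import { randomBytes } from "node:crypto";
    import { createServer } from "node:http";

    createServer((req, res) => {
      // Only inline scripts carrying this one-time nonce are allowed to run.
      const nonce = randomBytes(16).toString("base64");

      res.setHeader("Content-Security-Policy", `script-src 'nonce-${nonce}'`);
      res.setHeader("Content-Type", "text/html");
      res.end(`<script nonce="${nonce}">console.log("this inline script is allowed");</script>`);
    }).listen(8080);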
And there might be things missing. But you can have a look at that. So, I think this
technology is very exciting because it makes the web a safer place. How many websites use CSP? There is a service that crawls the Internet and lets you write search queries, and when you do that, you will see that only 6% of the crawled websites use CSP. This is surprisingly low, and I think we can do better. So, when you want to start working with CSP, start in report-only mode. Monitor what is coming in, and don't break your site. And then, only when you are safe and you know that all your requested resources are whitelisted, you turn report-only off and enforce the policy. For my site, I added a new route, slash safe, and there you see that the unicorns are gone. Chrome is saying that nothing has been reported yet. But when I go to CodePen and try to frame the site, Chrome will say, hey, this is not allowed for this site. People cannot hijack my stuff, which is pretty cool. This makes the website safer.
This is important, because the web is important for people. I travel quite a bit. I was in Ukraine two months ago. I got off the plane and I got this message from my mobile provider. It basically tells me this, and just because it is this ridiculous: I get six megabytes for 2 Euros, but I have to use them within 24 hours. This didn't last me
30 minutes. The web needs to be affordable. Don't request the same content over and over
and over again. You can set proper caching headers. Caching is very, very tricky.
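Here is a minimal sketch of the kind of header we are talking about, with values and the file-name check being just examples on my side:

    import type { ServerResponse } from "node:http";

    function setCachingHeaders(res: ServerResponse, fileName: string): void {
      if (/\.[0-9a-f]{8,}\./.test(fileName)) {
        // Hashed assets like style.3f2a9c1b.css never change:
        // cache them for a year and skip revalidation entirely.
        res.setHeader("Cache-Control", "max-age=31536000, immutable");
      } else {
        // Everything else: cache briefly, then revalidate.
        res.setHeader("Cache-Control", "max-age=300, must-revalidate");
      }
    }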
And I'm only going briefly into it. I'm defining there, in seconds, a max-age property telling the browser how long this resource could potentially be cached. But this doesn't necessarily mean that there are no requests flying, because the browser also revalidates whether the resource has changed. So, what is cool is that you can define an immutable directive, telling the browser, hey, this is a hashed file, maybe style.[hash].css or something, and the browser will not revalidate it. That's cool. You can save some requests. Unfortunately, the support for immutable is not that great anymore. It was supported in Edge, but now Edge switched to Chromium. If you
want to learn more about caching, I highly recommend this article by Harry Roberts. It goes over all the directives, and if you want to set proper Cache-Control headers, check it out. It's
not only about requests, it's about sending the right data. What happens when your browser requests HTML? It sends a header called Accept-Encoding and tells the server, hey, this is what I understand. Cool. And you see gzip, deflate and Brotli in there; these are different compression algorithms. And I took a CSS file and compressed it with two different compression algorithms.
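On the server side, honoring that header could look roughly like this; a sketch using Node's built-in zlib and a made-up CSS payload:

    import { createServer } from "node:http";
    import { brotliCompressSync, gzipSync } from "node:zlib";

    const css = "body { margin: 0; }".repeat(100);

    createServer((req, res) => {
      // The browser announces what it understands, e.g. "gzip, deflate, br".
      const accepted = String(req.headers["accept-encoding"] ?? "");
      res.setHeader("Content-Type", "text/css");

      if (accepted.includes("br")) {
        res.setHeader("Content-Encoding", "br");
        res.end(brotliCompressSync(css));
      } else if (accepted.includes("gzip")) {
        res.setHeader("Content-Encoding", "gzip");
        res.end(gzipSync(css));
      } else {
        res.end(css);
      }
    }).listen(8080);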
You can see that Brotli by itself produces a slightly smaller file than gzip. But surprisingly, people are not using it that often, because what the whole industry thinks is that Brotli compression is too slow to encode on the fly. With gzip, the server compresses on the fly and serves a gzipped response, and Brotli might be a little bit slower there. But when you say this sentence, what you're basically doing is you're comparing
apples with oranges. Let's look at gzip and Brotli. Gzip has nine compression levels, Brotli has eleven. When you use them in default mode, gzip runs at level 6 and Brotli runs at level 11. Level 6 in gzip is optimized for a balance of speed and compression ratio; Brotli's default is there to squeeze out the smallest file size. When you tweak it a little bit and go not with the default mode but rather with Brotli level 4, it compresses better than gzip at roughly the same speed on the fly. You can configure this. Or maybe you have a build process in place and you don't want to compress on the fly at all; then you could compress all the files upfront and serve them to save some kilobytes. If you want to learn more about this topic, there is extensive research on what that means.
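For example, a build step could precompress assets at the highest Brotli level, since speed does not matter there; a sketch with file paths I made up:

    import { readFileSync, writeFileSync } from "node:fs";
    import { brotliCompressSync, constants } from "node:zlib";

    // At build time, go for the smallest possible file.
    const source = readFileSync("dist/styles.css");
    const compressed = brotliCompressSync(source, {
      params: { [constants.BROTLI_PARAM_QUALITY]: 11 },
    });
    writeFileSync("dist/styles.css.br", compressed);

    // For on-the-fly compression you would pick a lower quality, for example 4 or 5.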
What's the support? Pretty much there. All the big sites like Facebook and Dropbox are shipping Brotli, and I would hope that we do more things with this. It's not only about compression. It's also about serving tailored media, because images cost the most amount of data on the web.
When you're doing frontend work and you want to ship, for example, an image format that is a little bit smaller than JPEG, you find yourself building things like this. This is a picture element: responsive images for several sizes, and it ships WebP when the browser understands it. This is horrible, right? You're doing feature detection in markup, and this will break when the next person comes in. But guess what? The browser also tells the server what image formats it accepts. So, what you could potentially do for browsers that support WebP is read this Accept header and serve a WebP image instead of a JPEG when the browser tells you that this works. But
you can go even further. When someone requests your website and you set, for example, the Accept-CH header, which stands for client hints, you can tell the browser, hey, I would like to know how wide your viewport is, and please tell me for the next 100 seconds. What happens then, for the additional image requests, is that the browser tells you the dimensions of the images. How cool is that? This means that you can use normal images without all the responsive images stuff. But you have to give them a sizes attribute, so that the browser knows up front how the image is laid out. And this is then the request that goes out: it will tell you the viewport width and the width of the image. And guess what? When you're on a high pixel density display, it will tell you the real size of the image. You can then serve proper images via server-side generation or a service worker, which is pretty cool. I'm excited about this. This is a little bit cutting edge. If you want to learn more about client hints, check out this resource. He does a lot of cool things around this topic.
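Putting the two ideas together, a server sketch could ask for client hints and check the Accept header before deciding which image to serve; the file handling here is a simplified assumption:

    import { createServer } from "node:http";

    createServer((req, res) => {
      // Ask the browser to send client hints with subsequent requests.
      res.setHeader("Accept-CH", "Viewport-Width, Width, DPR");

      // The Accept header tells us whether WebP is understood.
      const acceptsWebP = String(req.headers.accept ?? "").includes("image/webp");

      // The Width hint (if present) tells us how large the image will be displayed.
      const width = Number(req.headers["width"] ?? 0);

      console.log("serve", acceptsWebP ? "image.webp" : "image.jpg", "at width", width);
      res.end();
    }).listen(8080);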
So, I tweaked another page of my responsible dev site. You see there, slash affordable. You see that the markup, the image element, points to a JPEG, but the server is shipping WebP without any markup changes, and the browser tells me how big the image is. And without me changing anything, the server knows which resources to ship and what image would be the perfect fit here.
The web has to be affordable, because the web is with us every day. Unfortunately, we have reached this state on the web right now. It's not playing? Here we go. This is made by a former colleague of mine. This is where we are, right? We are web developers; this is what we build. I believe that the web has to be respectful. And one of the things that we should honor more is people's time. And we should get the stuff that we ship down
more quickly. So, what you can do is optimize the loading process for certain things: you can use link rel preload in your HTML or set a Link header, which gives information to the browser, telling it, hey, you will need these resources, please start loading them early. Because a person doesn't want to wait for a font to kick in while watching a blank screen.
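As a sketch, the header variant could look like this; the font path is made up, and nopush is explained right below:

    import type { ServerResponse } from "node:http";

    function preloadFont(res: ServerResponse): void {
      // Tell the browser early that this font will be needed.
      // "nopush" opts out of HTTP/2 server push on proxies and CDNs.
      res.setHeader(
        "Link",
        "</fonts/my-font.woff2>; rel=preload; as=font; crossorigin; nopush"
      );
    }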
When you use the HTTP header, you have to be a little bit careful, though, when you're using certain proxy servers or CDNs. The Link preload header can be turned into an HTTP/2 server push, which then does not take the browser cache into consideration. So, if you use this header, you have to add nopush. This is great to speed up critical resources. How is the support today for link preload? It's pretty good. You can
add these things and optimize the loading process to get your fonts or header images
down quicker and optimize for the time of your visitors.
The next thing I want to talk about is the reaction to AMP. So, two and a half years ago I was giving a talk about AMP and how it works technically. It's actually very interesting what they do. And when AMP was released, someone who does a lot of standards work and spec writing in the web ecosystem saw that we need to come up with an alternative very, very quickly, because AMP is very JavaScript-driven. And this is very cool, because now, two years later, what is slowly arriving is the Feature-Policy header. What you can do there is define what should be allowed in your website, and there's a lot of cool stuff that you can configure. It could be possible that a third-party script coming from somewhere wants to access my camera. I think it is cool that you can block a lot of things that shouldn't be possible by default. Unfortunately, we're now entering a little bit of cutting-edge technology here. But there are cool things in there like unoptimized images and unsized media: you can limit yourself and tell the browser, I don't want to ship, even accidentally, massive images. You can then also define this for iframes. And, already shipping in Chromium browsers, there's also a JavaScript API that lets you access the values that came in via the header.
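The JavaScript API aside, setting the header itself could look roughly like this; the directives I picked are illustrative, and their exact names have changed over time:

    import { createServer } from "node:http";

    createServer((req, res) => {
      // Nobody on this page, including third-party iframes, gets camera or microphone
      // access, and only our own origin may ship oversized images.
      res.setHeader(
        "Feature-Policy",
        "camera 'none'; microphone 'none'; oversized-images 'self'"
      );
      res.end("<h1>A more respectful page</h1>");
    }).listen(8080);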
This JavaScript API is still under discussion, though, so please use it carefully. If you looked at that list of features very carefully, you might have noticed that there was one thing missing in this huge list of things that you don't want to allow on your site.
What's the most annoying thing on websites these days? Push notification prompts, right? The people working on these kinds of things figured out that handling push notifications and disallowing them is trickier than first thought. If you want to know how this goes on, you can follow this issue. What's the support for Feature-Policy? We're not that bad, which is pretty cool. So, with these kinds of headers, I set up
another route, slash respectful. And what you see is that the permission dialogue is gone, because I don't allow it. And also the JSConf PNG at the bottom, which is defined in CSS, I preloaded it so that it comes down quicker, which makes the experience a little bit better. So, building for the web is very, very, very hard these days. There are so many things to consider. You have to think about the design, the content, web performance, accessibility, frameworks, the network and devices. And there are surely more things that you have to consider when you're building for the web. So, on all things headers, this 20-minute rundown is not a complete one.
If you want a complete overview, my friend Schepp maintains this slide deck. It is a massive resource if you want to know what HTTP headers are out there. And you can Google "Headers for Hackers": Andrew Betts gives this fantastic talk if you want to know
more. I really believe that the web has to be safe. My mother shouldn't browse the web and mine cryptocurrency without knowing it. The web has to be affordable, because people pay different amounts of money depending on which situation they're in and where they are in the world. And it has to be respectful, because nobody likes a person that asks random questions and demands permissions all the time. So, the web has to be safe, affordable and respectful, so that it really is for everybody. Thank you very much.
[ Applause ]