Hi, everyone. I am excited to be here and really, really excited that we get to start the day and the conference off talking about peer-to-peer protocols, because it is something I think is super fascinating, and I have interesting stuff to show off later in my talk. This talk is called Reclaiming the Web with Peer-to-Peer Protocols, but before I dive in and tell you what in the world that is all about, I want to introduce myself. I am Tara Vancil and I am a web developer. I'm also really, really into anything to do with nails and nail art, and I love music, especially if it is from Beyoncé. If you like nails or Beyoncé or both, we should definitely talk later on.

I am a web developer, but I have spent the last year and a half in a strange role. Instead of working on a team where my job is to build websites, I have been building a browser with these two guys, Paul Frazee (pfrazee) and Mathias Buus (mafintosh). We have experiments we want to run, and we figure the browser is the best place to do that. The browser is called Beaker, and it is experimental, meaning we are doing experiments in the science way, not that it is buggy. It actually works. You can download it. The cool experiment we are running in Beaker is: what happens when you put a peer-to-peer protocol in the browser? Pretty interesting things happen, like being able to publish a website from the browser and being able to offer experimental peer-to-peer APIs to developers. I will tell you a lot more about Beaker and show you Beaker later.

Before I do that, I want to spend some time reflecting on what the web even is, because I think it is relevant to our conversation. If I am asking us to reclaim the web, we should probably talk about who took it in the first place and what they have done with it. We are all here in this room at a conference dedicated to the web, so surely the web can't be in too much trouble, right? I have to agree. I am extremely optimistic about the future of the web. I am really pleased with where the web is at right now. I am really happy to see so many new people still coming to the web every day to learn how to build things with HTML, and JavaScript, and CSS. I am pleased to see that the tools we depend on, like npm and webpack and Babel, are improving to make workflows seamless. And I am thrilled that browsers are working hard to improve the web and compatibility across browsers.

If you are not optimistic about the web, I might highlight the Chrome Dev Summit that happened this week. This is an opportunity for Chrome developers to share ideas and prototypes about how the web is going to move forward. They announced cool proposals like virtual scrolling, which should help improve loading as you scroll down a page, and other neat stuff was demoed, like Houdini, which is an improvement on how CSS works. Seeing the neat announcements earlier this week gave me time to reflect on the wins of the web, like CSS Grid, which, if I dare say, makes composing layouts kind of fun, or the fact that Bocoup has been working hard with the W3C and other browser vendors to build a huge test suite for compatibility across the web. The web is making progress, and I think it is important to recognize that and the people who make it happen, because their jobs are not easy. Also because the web is a miracle of human cooperation, if you think about it. When you take a second to think about what the web is, it is a miracle that it exists, let alone that it is improving.
The web is this strange thing where 7 billion people on Earth have come together and decided on a language for how we build digital stuff and get it from one computer to another. Like, it is absolutely miraculous that we pulled that off. You have a web page, and no matter what context you are browsing in, you have a reasonable expectation it will work consistently. If you will allow me a moment to be sentimental, I just want to say I think that is badass.

This is a talk about reclaiming the web, though. Even though I am really optimistic about the web, I am a little bit worried too. I am worried because the web isn't perfect, and honestly that is OK, especially when you think about how the web is this weird amorphous set of technologies that we have all agreed to use, and that is pretty much the only thing binding it together: our shared agreement to use it. The web is only 28 years old. The first website was built in 1990 by Tim Berners-Lee. It was only 25 years ago that the first mainstream browser was released. We are operating on a small time scale here. The web is a baby, so you would expect it to have some problems. The web isn't perfect. We can accept that, but I think the next step is to ask ourselves how we are going to shape the next 30 years, and I mean we, the people in this room who are web developers and people who influence standards. We do have a say over how the web works. We know that the web is going to change, because there are standards bodies and browser vendors and other interested parties who want to shape the web, but the question is: what values are we going to choose to uphold in the next 30 years? What new features are we going to enable? How do we decide those things?

Oftentimes, I think they are decided by personal experience. Some of you probably work in e-commerce, and you might be paying more attention to the Web Payments API. Or maybe some of you in this room have been the target of a focused harassment campaign on social media, and you might have an interest in seeing how the web learns from what we have seen about how humans engage online in the last 28 years. The web is very, very new, and we are still learning so much about how communities work online and how humans behave. When I think about what I want the web to look like in 30 years, honestly the community bit of it is what interests me the most. Yeah, the graphics and all the cool technical stuff about the web are amazing, and they are what make the web the web, but we come to the web because we want to talk to each other. We want to share interests, make friendships, and form communities. I am extremely interested in asking how the web platform itself can change, however subtly, the ways we interact with each other online.

Whether you like it or not, this guy right now has a lot of say over how online communities work. And I am not sure he knew what he was getting into when he started Facebook, but the point is that Facebook is a massive, global online community, and we have learned some kind of terrifying things about humans interacting with each other online. We are mean, we are nasty, we are reactive, we are just not very good at talking to each other, are we? This guy and other folks are in charge of helping us move forward, and I frankly don't think they have stepped up to that responsibility very well. I want us, as web developers and a web community, to think about what we can do to adjust how people talk to each other online. Are there some knobs we can turn in terms of the technical architecture of the web that can improve the situation?
I don't know, but this question motivated me to work on Beaker, and it has been the guiding star in my exploration and the experiments we have been running with Beaker. So, to go back to the question I mentioned earlier: what happens when you put a peer-to-peer protocol in the browser? Would that be the right knob to turn to maybe make it a little nicer to communicate with people online? I don't know, but I can show you some of the experiments we have done and share some of the things we have learned in the process.

Before we do that, let's just take a look at HTTP, because this is the knob we turn in Beaker: the protocol knob. This is the distillation of how HTTP works. It is a client-server model where one person uploads data to, let's say, Facebook, Twitter, or WhatsApp, and another person gets the data from the service. There is nothing wrong with that architecture, but I would like to propose that HTTP and this client-server architecture are a big part of why massive online communities have turned out to be so problematic. In contrast, this is what a web based on a peer-to-peer protocol looks like. This is a contrived example, but it is a network wherein one individual can connect directly to another individual. In this case, we are looking at someone sending a message, but we can also think of websites being transmitted from computer to computer, cutting out servers.

Why don't I just show you Beaker, because I think it is a lot more exciting to see this stuff in action. This is Beaker. It is a browser. It is not terribly exciting to look at when you are just looking at the start page, but it works like you would expect a browser to work. You can browse HTTP websites. This is the Beaker website, and nothing is remarkable here. But if you look in the top corner, there is a tab that says P2P version. When I click that, it will take me to the peer-to-peer version of this website. There it is. You probably didn't even see the change. The only thing that's different is the protocol. It says dat://, and that is the protocol we use in Beaker. This website doesn't look like anything special. It is just a bunch of files: CSS, HTML, images, links, everything. It works just the way you would expect.

I mentioned that by putting a peer-to-peer protocol in the browser we enable things like publishing a website from the browser. Let me show that off. I will go to the top-right menu and click Create New, and I will create a website using a basic template Beaker provides. Beaker will literally create a new URL for me and populate the website with basic template files. I will click it, and it happens fast. Boom. We are looking at Beaker's view-source tool here. I will show you more, but let me set the title of the website for now. Then I will open it up. Here it is. It is just a website. It doesn't do anything except let you change the background color. What is interesting is that this is a website and I can share the URL with any of you, and you can download the website and its files directly from me. I didn't publish the files on a server anywhere. I did it all inside the browser.

How do you edit websites? If we jump back to the view-source tool, we can look at all the files that compose the website. Why don't we open index.html and actually edit it. I will change this to say hello, Seattle. Hit the save button. When I refresh, I will see the change. Cool.
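For readers following along in the transcript, here is a rough sketch of what that create-and-edit flow looks like when done programmatically, assuming Beaker's DatArchive API that the demo turns to shortly; the title and markup below are made up for illustration.

```js
// Run from the dev-tools console in Beaker (a sketch, not the demo's exact code).
// DatArchive.create() asks the user to confirm, then returns a brand-new archive.
const site = await DatArchive.create({
  title: 'Hello, Seattle',
  description: 'A website published straight from the browser'
})

// The archive gets its own dat:// URL; peers who visit it fetch the files from me.
console.log(site.url)

// Write the index page, then read it back to confirm the change took.
await site.writeFile('/index.html', '<h1>Hello, Seattle</h1>')
console.log(await site.readFile('/index.html'))
```

The Create New menu flow in the demo accomplishes essentially the same thing without touching the console.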
I also want to show off this neat feature we have, which is live reloading. We put that right into the browser because it is so convenient. Like a lot of us in this room who are developers, I have a preference about where I write code, like Sublime or VS Code. Beaker can sync the site's files to a directory on my computer. OK. I am going to open index.html in Sublime and go back to this website. We have live reloading on, so I will edit the h1 tag to say hello from Sublime, and when I hit save you will see the update. Boom. There we go.

This is neat, but I mentioned earlier that Beaker also has peer-to-peer APIs. This is really where it gets exciting, because static websites are cool and make up a lot of the web, but they are not every part of the web. We still need applications where people can have profiles and data linked to their profiles. Beaker's APIs are the key to making that possible. I am going to open the dev tools and show you a little bit of how this works. To start off, we need to get access to the files that compose this website. I am going to do that using Beaker's DatArchive constructor, which basically gives you access to the functions that help you connect to the peer-to-peer network. Now we have a variable that we will be able to work with here. Let's start off with just listing all the files in this website. You might notice this looks a lot like the Node fs API, and that was on purpose. Let's do files.readdir, and we will read the top-level directory. Then we will console.log the output. We are seeing a listing of all the files that compose this website. There are only 65 of them right now. Let's see if we can read an individual file and its contents. So files.readFile, and let's do the index.html file. We are looking at the actual content of this page now. We can do all sorts of things: read files, read file listings, and we can write to files also. This gets interesting when you think about storing data in websites. I am actually going to overwrite the HTML file for this website: files.writeFile, and I am going to replace it with a new h1 tag. OK. Now when I refresh, I get a totally new index.html. Again, this is sort of a contrived example.
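For reference, the console session described above looks roughly like this, assuming Beaker's DatArchive API; the replacement markup is illustrative.

```js
// Open the archive behind the current dat:// page.
const files = new DatArchive(window.location.href)

// List every file in the site's top-level directory.
console.log(await files.readdir('/'))

// Read the contents of an individual file.
console.log(await files.readFile('/index.html'))

// Overwrite the index page entirely; after a refresh, this is what peers receive.
await files.writeFile('/index.html', '<h1>A totally new index.html</h1>')
```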
But I want to take it back to a real-world example. I said I am interested in thinking about how online communities can be shaped by the technical architecture that underpins them, so I want to show you an application that we built called Fritter. This is Fritter. As you can probably guess by the name and appearance, it is inspired by Twitter. That was on purpose. I really like Twitter. I like a lot of things about Twitter. I dislike some things about Twitter, but I like that it gives me a nice feed I can follow my friends on, and I like that the content is short, sweet, and enjoyable most of the time. So we asked how far we could get with building something like Twitter using peer-to-peer protocols, and we got pretty far actually. It is pretty cool.

This doesn't look like much, but I will break down the architecture. Fritter works in two pieces. There is the application itself, which is what we are looking at. It is a JavaScript application that uses Beaker's APIs to read a profile, fetch data from the peer-to-peer network, and render it into a nice feed. When I write a post, it uses Beaker's writeFile API to write the post to my profile. It works like you would expect. But what does a profile actually look like? What is a profile? Well, it is not a row in a database that lives on a server somewhere. It is just a website. This is it. It has some metadata: my name, my bio (well, this is a fake profile), information about the people I follow, and it contains my posts as JSON. This is interesting because we have separated the data from the application, which is not news. We are used to doing that as developers. What is different is that your profile on Fritter is just a website. It is not tied to Fritter. If you wanted to customize your own version of Fritter, you could do that, and you wouldn't need to give up your circle of friends or your content. You could carry on like normal. I think this demonstrates a lot of potential for building meaningful applications with peer-to-peer protocols.

This is my demo profile. There are posts from my friends, like two friends. And people talk to each other. I am not going to say this is the ideal architecture for moving forward on the peer-to-peer web, or that it won't have its own kinds of problems. But it is a kind of community-controlled social media, and that is really, really exciting to me, because if there is anything I have learned from being a woman online in the last year and a half, it is that sometimes you really do want to take control over who you do and don't talk to, because otherwise it gets a little bit noisy.

So that is a quick and dirty tour of Beaker. We are experimenting. We are doing a lot of things that are strange and most certainly don't adhere to what is standard, but we think it is worthwhile to be a little messy and see what happens. We don't have the kind of reputation that Apple or Mozilla or Google has to influence standards bodies. We are just random people who had an idea, and we built it. We hope you will find it interesting. If you want to try out Beaker and see what other people have built, I recommend checking out my website. I have a P2P subdomain on my website with a huge selection of apps, games, and other things people have built on the peer-to-peer web. Thank you for coming to learn about Beaker and the peer-to-peer web. Come talk to me later, and you can check out the slides on my website. Thank you so much. [APPLAUSE]
Reclaiming the Web with Peer-to-Peer Protocols // Tara Vancil // CascadiaJS 2018