>> Sean: Just to remind people here who, perhaps, haven't seen our earlier videos with you, Brian: you worked at Bell Labs and at AT&T. I mean, you did research there. You were looking to the future! How right were you? How wrong were you? Can you tell us?

>> BWK: Yeah, so I was at AT&T. I went there for a couple of summers, starting in the late 1960s, and I stayed there, essentially full-time, until 2000. I was in Bell Labs. At the time AT&T was a very big company - well over a million people. It was the biggest company in the country, and it provided communication services - telephone - for essentially all of the United States. And Bell Labs was the research arm of that. So that's the part that, in theory, is looking at the future, trying to improve the services they have and build the technology, in all kinds of ways, that will make it easy for - or make it possible to improve - telecommunications services. And so AT&T did things like the transistor and the laser and miscellaneous other things that were useful. And also a lot of computing-related things, as they gradually realized that computers were here to stay, and that a lot of mechanical things, like relays and so on, could be replaced by electronic devices and then controlled by programs running on general-purpose computers. In some sense the Golden Age for me was probably the 1970s. I had just gotten out of school, and in the early days of computing there was a lot of really interesting work going on. That was when UNIX was developed, in the early 1970s, and the C programming language and a variety of other things. And it's also when I played with programming languages as well. So that was definitely a good time. Sort of behind your question, I think, is: "Well, how good were you at seeing the future?" And the answer is: "Pretty awful!" I think most people are pretty awful at seeing the future, and I could hardly deny it, so I'm perfectly willing to admit it. I don't think we - and I would include all my colleagues and friends at Bell Labs - had much of a clue of how the world would change. We knew about Moore's Law, I guess. I think Moore published that paper in 1965, based on a fairly short period of data - maybe five years. And I don't think many people - certainly not I - realized the implications of an exponential rise in capability at a fixed price, if applied for 40 or 50 years. I don't think we realized any of that.

>> Sean: So, nobody believed it? Nobody took it that seriously for this period of time?

>> BWK: I don't know whether I would say "nobody believed it", but I don't think people realized the implications. Certainly I did not realize the implications, and we were, I think, often surprised when we discovered that something we had done had actually been noticed in the outside world. I remember at one point Dennis Ritchie saying to me something like: "We have arrived!", because he had found, in the New York Times - which was still running classified ads for programmers - an ad wanting a programmer who knew UNIX. And this was, I would guess, maybe in the late 70s or something like that. And this was us [saying]: "Boy, what we've done has actually had some influence in the outside world", in a way that was completely unpredictable.
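(To make the compounding Kernighan describes concrete, here is a minimal editorial sketch, not from the interview. It assumes the commonly quoted doubling period of roughly two years; the interview gives no specific figure.)

```python
# Minimal sketch of Moore's Law compounding: capability at a fixed
# price doubles every DOUBLING_PERIOD_YEARS. The ~2-year period is an
# assumption (the commonly quoted figure), not taken from the interview.

DOUBLING_PERIOD_YEARS = 2

for years in (5, 10, 20, 40, 50):
    factor = 2 ** (years / DOUBLING_PERIOD_YEARS)
    print(f"after {years:2d} years: ~{factor:,.0f}x the capability")
```

Over the roughly five years of data Moore had in 1965, that is only about a 6x change; carried forward for the 40 or 50 years Kernighan mentions, it is a factor in the millions, which suggests why the implications were so hard to foresee.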
>> Sean: So is there anything you can think of that you think should have made it but didn't? I'm kind of putting you on the spot here!

>> BWK: Yeah, that's definitely on the spot! It's hard to say. One gets the feeling that a lot of things that were obvious to us have been kind of lost in going forward. One [is] UNIX, the operating system; C, the programming language; many of the tools. Those all had a flavor of minimality, of being small and compact and good for their particular purpose, and not with too many bells and whistles, and very carefully written, and so on. And as Moore's Law came along, and we got more and more processing power and more and more memory capacity, people sort of forgot the merits of simplicity, perhaps. And so things became rather more baroque or rococo or - pick your architectural period - and I think that is in some ways to everyone's detriment, that systems are very complicated. The other thing is that at one point there was really only one - let's call it UNIX - system, and so things were fairly compatible. So another thing that I think we've lost is compatibility - it's harder to get things to work together, perhaps, than it was. But in some ways perhaps that's necessary. I think the big change in a lot of things has been the rise of networking: the idea that computers are not self-contained things any more but, rather, devices that talk to other devices using a variety of networking technology - the Internet itself, and of course, increasingly, phones. The phone system. And in fact there's no real difference between the phone system and the Internet in some sense; it's kind of an accidental separation that will disappear over time. And as different things talk to each other, what happens there gets more and more complicated. There are more ways that things can break. You need standards, but the standards aren't necessarily there. And so, yeah, I think that's the place where we need the most improvement in some sense: ways to make things work better together and to make them simpler - and I guess those are related.

>> Sean: I think there's a kind of irony, as well, that you've got this massive phone company effectively pushing forward on computer technology, and now the main computer many people use is the phone in their pocket!

>> BWK: Yes, right, I think that's again something that people didn't predict. There's a famous story that a consulting company in the United States, McKinsey, did a study for AT&T and told AT&T that there was basically no market for cell phones; that people didn't want portable phones. And this, I think, is a triumph of how consultancy can go wrong and yet not suffer for it in any sense. But AT&T did back that area as well - a lot of the early work on mobile telephony was done at AT&T. Absolutely. They figured out the notion of cells and how you would pass conversations from one cell to another - something that's completely invisible to people today, but all of that was in fact invented originally at AT&T, and then of course evolved tremendously by other people over the years. And again, cell phones today profit from Moore's Law: the incredible power you've got in a very, very compact device would not have been possible thirty years earlier.

>> Sean: And the last thing they're used for is phone calls!

>> BWK: It's hard to say. You're hinting that people don't use their phones to talk, and I think that's true. As far as I can tell, the students I deal with in the United States use their phones primarily for texting each other, and perhaps for checking their Facebook pages - or whatever the modern equivalent is.
And this is so foreign to me that I don't actually know how to use a phone in many ways. I had an extended discussion with Dave [Brailsford], earlier this morning, about how you deal with phone numbers, and whether you need a 44 and not a zero, or is it ... ? Hopeless! And that was the reason I had trouble communicating with him this morning. So, culture shock, in some sort of way.