  • Good afternoon, everybody. I'm Andrew Ross Sorkin. It is a privilege to have with me Peter Thiel this afternoon, one of the great legendary investors in Silicon Valley. He has been involved in just about everything that you touch and feel, including being the co-founder of PayPal and the co-founder of Palantir. He made the first outside investment in Facebook. His firm Founders Fund is a big backer of Stripe and SpaceX, and he has backed numerous other startups through Founders Fund and Thiel Capital. He also started the Thiel Fellowship, a two-year program that's an alternative to a college degree, which I want to get to at one point. And more importantly than all of it, he has touched and found some of the people who you read about in the headlines every day, from Mark Zuckerberg to Elon Musk to Sam Altman and so many others.

  • And it is great to have you here.

  • Thanks for having me.

  • We're also going to talk a little politics as well, along with maybe some of the issues and culture conversations that are happening in Silicon Valley. But here's where I want to start the conversation, because I want to start by talking about people, because I think there's something actually extraordinary when you think about your track record over the years of investing not just in companies, but ultimately in people. You wrote a book, which is coming up on its 10-year anniversary. And by the way, I reread it, and it stands up in a very big way. It is called Zero to One. And you wrote the following about founders, the idea of founders. You wrote that the lesson for business is that we need founders. If anything, we should be more tolerant of founders who seem strange or extreme. We need unusual individuals to lead companies beyond mere incrementalism.

  • And I mention that because I also just mentioned a number of individuals which we read about all the time. And some of those people would be described as unusual, perhaps, or even strange. And I'm curious about how you think over the years you have found these individuals, what it is that has made these individuals as successful as they have become.

  • Yes. Obviously, if there was some simple magic formula, this is what a founder looks like, and you invest in this category of people, it would probably get faked. It's like, I don't know, it's a 20-year-old with a T-shirt and jeans, or something like this, or you end up with all kinds of really fake ideas. But yeah, I think a lot of the great companies that have been built over the last two decades were founded by people where it was somehow deeply connected to their identity, their life's project.

  • They had some kind of idiosyncratic, somewhat different vision of what they were doing.

  • They did something new, and then they built something extraordinarily big over the years.

  • And of course, they have these sort of extreme personalities, often have a lot of blind spots, and there sort of are all these ways in which it's a feature, and there are ways in which it can be a little bit buggy. But it's sort of a package deal, and I net out to it being massively advantageous versus, let's say, a professional CEO being brought in.

  • The prehistory of this, I would say, would be in the 1990s. The Silicon Valley formula was you had various people found the company, and then you'd replace them as quickly as possible with professional CEOs, professional management. And there are variations of this that happened with Netscape, and Yahoo, and even Google, all these companies.

  • The Gen X people founded them. The baby boomers came along and took over the companies and stole them from the Gen X founders in the 90s. In the 2000s, when the millennials founded the companies, they were given more of an opportunity, and it made a big difference.

  • The Facebook story I always tell is, it was 2006, two years in, Zuckerberg was like 22 years old, and we got a $1 billion offer to sell the company to Yahoo. And we had a board meeting. There were three of us, and we thought we should at least talk about it.

  • It was a lot of money. Zuckerberg would make $250 million, and it was sort of an eight-hour-long discussion, and he didn't know what he'd do with the money. And he'd just start another social networking company. He kind of liked the one he had, and he didn't know what else he would do, and so he really didn't want to sell. And if you had a professional CEO, it would have just been, man, I can't believe they're offering us $1 billion, and I'm going to try not to be too eager, and we better take the money and run. And getting that one thing right makes a big difference.

  • Let me ask you a different question. All of these individuals have had a huge impact on society and have enormous individual power. And I think one of the things that you've argued in this book, and that you've argued over the years, is that we need to give them that power. We need to offer them a latitude that in many ways we don't offer others.

  • Well, I think one of the frames I always have is that there are many ways in which the United States, the developed countries, have been relatively stagnant for the last 50 years. Progress has slowed. We've had progress in computers, the internet, software; in many other domains, things have kind of stalled out. And it sort of manifests in low economic growth, in the sense that the younger generation is going to have a tough time doing as well as their parents. And there is sort of this way that there has been this broad stagnation for 40, 50 years, and we need to find ways to do new things. I don't think tech startup companies are the only way to do them, but they are a vehicle for doing it. And yeah, if you don't allow these companies a certain latitude and flexibility to try to do new things, if we shut it down right away, the stagnation will be worse than ever.

  • Okay, but here's a separate almost philosophical question. I'm going to read back something you said to The New Yorker. There was a piece about Sam Altman. This is right around actually when OpenAI began, 2016. And I think it actually might even be representative of how you might think about Mark Zuckerberg or Elon Musk or some of these other kinds of major players.

  • This is what you said. You said, Sam's program for the world is anchored by ideas, not people. And that's what makes it powerful, because it doesn't immediately get derailed by questions of popularity. And I thought that that was actually very indicative of most of the people that you have invested in. It's really been about ideas, and in some ways, you could even argue it's disconnected from people.

  • I think it is really about a whole wide spectrum. Good founders are able to think about a wide spectrum of things. They have theories of how to hire people, how to manage them, how to build teams. They have theories about where the culture of work in society is going. They have technical ideas about the product, the design. They have ideas about how they should market their company. So they're sort of polymaths who are able to think about a lot of these things. But yeah, I'm biased towards a lot of the ones where it's more intellectual. But I think that quote has held up pretty well with Sam Altman. Maybe he needed to pay a little bit more attention to the board and things like that. There was probably a people dimension that he had ignored a little bit too much in November 2023.

  • Since we're on the Sam Altman of it all, and since Sam was here yesterday, I'm so curious. You were a mentor of his. What do you think of OpenAI? What do you think of AI more broadly right now? I mean, are we in a bubble? Is this the future? What is this?

  • That's a broad question. I'm always hesitant to talk about it because I feel there are so many things I would have said about AI where I would have been very wrong two, three years ago. So maybe I'll start by just saying a little bit about the history of what people thought was going to happen, and then the surprising thing that OpenAI achieved that did happen. If you had this debate in the 2010s, it was framed in terms of two paradigms, two books. There was the Bostrom book, Superintelligence (2014), which said AI was going to be this godlike superhuman intelligence; it was heading towards this godlike oracle. That was what AI was going to be. And then there was the Kai-Fu Lee rebuttal, AI Superpowers (2018), which was sort of the CCP rebuttal to Silicon Valley: no, AI is not about godlike intelligence, that's a science fiction fantasy Silicon Valley has. AI is going to be about machine learning, data collection. It's not conscious. It's not any of these weird things. It's surveillance tech. And China is going to beat the US in the race for AI because it has no qualms about this totalitarian (not the word he used) collection of data in its society. And that was sort of the way the AI debate got framed. And the thing I always said was, man, it's just such a weird word. It means all these different things. It's annoyingly undefined. But then the surprising and strangely unexpected thing that happened is that, in some sense, what OpenAI achieved with ChatGPT (GPT-3.5 and then GPT-4) in late 2022, early 2023 was passing the Turing test. That's not superintelligence. It's not godlike. It's not low-tech surveillance. But it had been the Holy Grail of AI for 60 or 70 years. The Turing test is: you have a computer that can convince you that it's a human being. It's a somewhat fuzzy line. But it pretty clearly hadn't been passed before, and it pretty clearly is passed now. And that's a really extraordinary achievement. It raises all sorts of interesting, big-picture questions. What does it mean to be a human being in 2024? The placeholder answer I would have been tempted to give a couple of years ago would be something like the Noam Chomsky idea that there is something very important about language; this is what sets humans apart from all the other animals. We talk to each other, and we have this rich semantics and syntax.

  • And so if a computer can replicate that, what does that mean for all of us in this room?

  • And so it's an extraordinary development. And it was also somehow, even though it had been the Holy Grail, not expected at all in the decade before. And so there's something very significant about it and very underrated. And then, of course, you get all these questions, like the econ 101 question: is it a complement, is it going to make people more productive? Or is it a substitute good, where it's going to replace people?

  • What do you think of all of this, and how bullish as an investor are you on this? And what do you think it means for our society? When you hear Sam Altman talk about this, do you say he's right, that's what it's going to be? Do you think it's going to be something else? You lived through 1999. There are some people who say this is a hype cycle. Other people say this is the future.

  • Well, I'm very anchored on the '99 history. And I somehow always like to say that '99 was both: the peak of the bubble was also, in a sense, the peak of clarity. People had realized the new economy was going to replace the old economy. The internet was going to be the most important thing in the 21st century. And people were right about that. And yet the specific investments were incredibly hard to make, even the no-brainer market leader. So in 1999, the no-brainer investment would have been Amazon stock: it's the leading e-commerce company, and they're going to scale and get bigger. And it peaked in December '99 at $113 a share. It was $5.50 in October 2001, 22 months later. You then had to wait until the end of 2009 to get back to the '99 highs. And then if you'd waited until today, you would have made 25 times your money from '99. You would have first gone down 95% and then made 500x. So even the no-brainer investment from '99 was wickedly tricky to pull off in retrospect.
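As an aside, the return arithmetic in that Amazon example can be sanity-checked in a few lines (a quick illustration using the rough share prices quoted in the conversation; the figures are Thiel's approximations, not exact market data):

```python
# Sanity check of the Amazon return arithmetic from the conversation:
# a $113 peak in December 1999, a $5.50 trough in October 2001,
# and 25x the 1999 peak by "today".

peak_1999 = 113.0        # December 1999 high, dollars per share (approximate)
trough_2001 = 5.50       # October 2001 low (approximate)
multiple_from_1999 = 25  # "25 times your money from '99"

# Peak-to-trough loss: 1 - 5.50/113 is about 95%, matching "gone down 95%".
drawdown = 1 - trough_2001 / peak_1999

# Trough-to-today multiple: (25 * 113) / 5.50 is about 514, matching "made 500x".
recovery = multiple_from_1999 * peak_1999 / trough_2001

print(f"drawdown from peak: {drawdown:.0%}")   # roughly 95%
print(f"multiple from trough: {recovery:.0f}x")  # roughly 500x
```

So the three numbers in the anecdote are internally consistent: a 95% drawdown followed by a roughly 500x recovery nets out to about 25x the original peak.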

  • And I sort of think that AI, the LLM form of AI

  • These are the large language models.

  • Large language models.

  • The OpenAIs of the world.

  • Again, passing the Turing test, I think, is roughly on the scale of the internet. And so it's an incredibly important thing. It's going to be very important socially, politically, philosophically, for all these questions about meaning. And then the financial investment question I find unbelievably hard and confusing. And yeah, it's probably quite tricky.

  • If you had to sort of concretize it, one thing that's very strange is, if you just follow the money: at this point 80% to 85% of the money in AI is being made by one company. It's NVIDIA. And so it's all at this sort of very weird hardware layer, which Silicon Valley doesn't even know very much about anymore. We don't really do hardware; we don't do silicon chips in Silicon Valley anymore. I get pitched on these companies once every three or four years, and it's always: I have no clue how to do this. It sounds like a pretty good idea, but man, I have no clue, and we never invest.

  • And so then there's sort of this theory that the hardware piece makes the money initially, then gets more commodified over time, and the money shifts to software. And the multi-trillion-dollar question is: is that going to be true again this time, or will NVIDIA sort of have this incredible monopoly position?

  • And what's your bet at the moment?

  • I suspect NVIDIA will maintain its position for a while. I think the game theory on it is something like: all the big tech companies are going to start to try to design their own AI chips so they don't have to pay the 10x markup to NVIDIA.

  • And then how hard is it for them to do it? How long will it take? If they all do it, then the chips become a commodity and nobody makes money in chips. And so then, do you go into hardware? You should do it if nobody else is doing it; if everybody does it, you shouldn't. I'm not sure how that nets out, but probably people stay stuck for a while and NVIDIA goes from strength to strength for a while.

  • I have a related but maybe personal question for you. You happen to have this very interesting relationship with Sam Altman and then also a very interesting relationship with Elon Musk.

  • You both worked at PayPal. You famously were part of a coup effectively to push Elon Musk out of the company. You're now friends with him all over again and have a stake in SpaceX.

  • You can maybe walk us through that friendship.

  • We had some rough moments in 2000 and 2001.

  • We can get into that if you want, but where I was going to go with this actually is: one of the things that's been fascinating to the Valley, and I think to the rest of the country, has been the commentary we've heard from Elon Musk, who helped build OpenAI with Sam, and the break actually between the two of them over creating this not-for-profit and what's happened to it. In fact, Elon Musk originally sued Sam earlier this year and then dropped the suit recently. But how do you think about this idea of a company that was started as a not-for-profit, and all of the safety concerns and things that you hear from Elon on one side and Sam on the other?

  • Man, it's whichever person I talked to last I find the most convincing, probably. So I talked to Elon about it, and he made this argument: it's just completely illegal for a non-profit to become a for-profit company, because otherwise everyone would set up companies as non-profits, take advantage of the tax laws, and then turn them into for-profits. It's the most obvious arb, and they just can't be allowed to do this. It's obviously just totally illegal what Sam's trying to do at OpenAI. And in the moment, it's like, oh, that's a really strong argument.

  • And then half an hour later, it's like, but, you know, the whole history of OpenAI is that the biggest handicap they had was a non-profit and it led to all these crazy conflicting things culminating in this non-profit board that thought it was better to shut down the company or the whole venture, whatever you want to call it, rather than keep going. And nobody is ever going to take the lesson from OpenAI to start a non-profit and turn it into a for-profit later given what a total disaster that was. But yeah, whoever I listened to last I find the most compelling.

  • Let me ask you a different question. You left Silicon Valley. You have now moved to Los Angeles. That's your home.

  • We left San Francisco specifically. Yeah.

  • San Francisco specifically.

  • It just felt like it was time to get out.

  • So tell us why it was time to get out, because I think a lot of the issues that we actually read about, whether around OpenAI or the culture issues at a lot of these companies, are the reason you decided you didn't want to live there anymore.

  • Man, it's hard to say; it's a bunch of things that came together. But there was a sense that it was sort of ground zero, the most unhinged place in the country.

  • You had this catastrophic homeless problem, which maybe is not the most important problem, but it was never getting better. By 2018, when we moved to LA, it felt like the city had become extraordinarily self-hating, where everybody who was not in tech hated the tech industry. This is very odd. It would be like the people in Houston hating oil, or people in Detroit hating cars, or people in New York hating finance. And so it had this unhinged, self-hating character in the city itself.

  • And there were all these things that seemed extraordinarily unhealthy. If you had asked me in 2021, I would have said, man, they created all this wealth, they're sitting on the biggest gusher, and yet they are going to succeed in committing suicide. Three years later, I think the jury is a little bit more out, because maybe the AI revolution is big enough that it will save even the most ridiculously mismanaged city in the country.

  • It seemed to me that part of the issue that you had with San Francisco was the politics of it. And not just the politics of it, but how politics had seeped into the culture of so many of the companies. I think you felt that it had moved in a very progressive way.

  • Yeah, that's always a very clear dimension of it. But that's sort of the tip of the iceberg. That's the part that's above the surface that people always focus on. And then the part that's below the surface is just the deep corruption: the mismanagement of the schools, the buses, all the public services, the way things don't work, the way the zoning is the most absurd in the country. There was a house I was looking to buy where you couldn't build access into the garage. And Gavin Newsom, who was the Lieutenant Governor of California at the time, said he'd help me get a garage access permit. Again, it's not clear that's what the Lieutenant Governor of the fifth-largest economy in the world should be doing. But he said he knew how to do this in San Francisco, and this was circa 2013. And then you needed to get the neighbors to sign off, which was maybe doable. And then you needed to go to the Board of Supervisors, because you had to build a staircase, and it was a public walkway, and the whole public had to comment. Nobody knew what happened then. But then, even harder, a tree had grown where the driveway was supposed to be, and you needed a tree removal permit. And this was the sort of thing that you would never get. So you can describe all this as crazy left-wing ideology, but I think it's more like really, really deep corruption. And this is, in a way, not just the San Francisco problem; it's the California problem. The analogy I have, if you want to think about the economy of California: in some ways it's analogous to Saudi Arabia. You have a very mismanaged state government. There's a lot of insane ideology that goes with it. But you have these incredible gushers called the big tech companies, and there's a way the super-insane governance is linked to the gold rush of the place. And so there's some point where it'll be too crazy even for California, but California can get away with a lot of stuff you wouldn't get away with elsewhere. San Francisco, in my judgment, had gone a little bit too far. Maybe with the AI thing they found one more giant gusher.

  • And maybe you don't have any Saudi money in your fund, I hope.

  • Virtually none, no.

  • Just in case. Here's a different question, though, because it gets to the politics of this. There's been, it seems, a shift inside Silicon Valley, and a shift in terms of even the way the companies are managed, along a political dimension. And you were very outspoken, obviously; you supported President Trump in the last go-around. I want to get to that part too, but I want you to speak first to the shift in the Valley, at least what seems like a shift, perception-wise, from being a very progressive place to maybe less so. Or maybe not. Larry Summers and I spoke this afternoon, and he said there are 10 people he thinks are very loud on Twitter, and that's why the world thinks, between David Sacks and a bunch of other people and Elon Musk, that there's a shift; it's not representative. And I think you may have a different view.

  • Well, I don't think you'll get a majority of tech people to support Trump over Biden or anything like that. I think you'll get way more than you had four or eight years ago. So I don't know if you're measuring a relative shift or an absolute number; those are probably two different measures. But I would say that if we ask a very different question about, let's say, extreme wokeness, or whatever you're supposed to call it, there is probably a broad consensus among the good tech founders, startup CEOs, people across a pretty broad range, that it's gone way too far. I talk to a lot of these people. A lot of them are, I'd say, more centrist Democrats, but it is: we need to have a secret plan to fight this. And what they tell me behind closed doors is way, way tougher than what they dare say in public. And so it is: we need to have a plan to hire fewer people from San Francisco, because that's where the craziest employees are. If you want to have a less woke workforce, we're going to have targets for how we steadily move our company out of San Francisco, specifically.

  • And yeah, these are the sort of conversations that I've...

  • And do you agree with this? And by the way, let me just read this. You probably know Alexandr Wang, Scale AI CEO.

  • Yes.

  • Who said that he's put together what he calls a merit-based hiring program. He said he's getting rid of DEI. It says: hiring on merit will be a permanent policy at Scale. It's a big deal whenever we invite someone to join our mission, and those decisions have never been swayed by orthodoxy or virtue signaling or whatever the current thing is. I think of our guiding principle as MEI: merit, excellence, and intelligence. Bill Ackman went on to say that he thinks DEI is actually inherently a racist and illegal movement.

  • Yeah, again, my feel for it is there aren't that many people who are willing to say what Alex says, but I think there are an awful lot of people who are pretty close to thinking this. There were ways they leaned into the DEI thing. It was like an anti-Trump thing.

  • Everything was sort of polarized around Trump for the last four years of his presidency.

  • And so you have to demonstrate that you're anti-Trump by being even more pro-DEI. That's of course not necessarily a logical thing. But yes, people somehow ended up in this place that was very different. And then, you know, there probably, there always are questions what drove the DEI movement, the wokeness in these companies. And it probably is over-determined.

  • There probably is a bottom-up version: woke millennial people who were brainwashed into DEI in their colleges. That's the bottom-up theory. And then there's sort of a cynical corporate version, where the leadership of the company either believed it or used it as a way to manage and control their companies in certain ways. The part that I always feel is a little bit underestimated is that there was probably also some top-down level, from a government regulatory point of view, where if you don't do DEI, there is some point where you do get in trouble.

  • This is part of the ESG movement now. I mean, look, we talked about ESG here for a long time.

  • There was an ESG movement, and then there were probably all these governmental versions.

  • And so my candidate for the company in Silicon Valley that is still probably the most woke would be something like Google. It's less woke than it was two, three years ago, but in some ways they have a total monopoly in search. And so there's some way in which, if wokeness is a luxury good, you can afford it more if you're a monopoly than if you're not.

  • And then the problem for Google, as a pretty big monopoly, is that it's always going to be subject to a lot more regulatory pressure from the government. And so you get something like the Gemini AI engine, and it's sort of this comical, absurdist thing where it generates these black women Nazis. You ask it for famous Nazis, and the diversity criterion gets applied across the board, and so it just generates fake black women who are Nazis, which is, you know, a little bit too progressive, I think.

  • But then, if you think of it in terms of this larger political context, Google will never get in trouble for that. The FTC will never sue them for misinformation or anything like that. That stuff does not get fact-checked. You don't really get in trouble. And you probably even get some protection: OK, you're going along with the woke directives from the ESG people or the government. Maybe you overdid it a little bit, but we trust you to be good at other things. So there may be a very different calculus if you're a sort of large, quasi-regulated monopoly.

  • Let me ask you about large quasi-regulated monopolies and also concentration, but I want to read you something you actually wrote in your book 10 years ago about Google and it being a monopoly. You said: since it doesn't have to worry about competing with anyone, it has wider latitude to care about its workers, its products, and its impact on the wider world. Google's motto, don't be evil, is in part a branding ploy, but it's also characteristic of a kind of business that's successful enough to take ethics seriously without jeopardizing its own existence. In business, money is either an important thing or it's everything. Monopolists can afford to think about things other than making money; non-monopolists can't. In perfect competition, a business is so focused on today's margins that it can't possibly plan for a long-term future. Only one thing can allow a business to transcend the daily brute struggle for survival: monopoly profits. Were you writing in favor then of the monopoly idea, or against?

  • Well, my book was giving you advice for what to do, and from the inside, you always want to do something like what Google did. If you're starting a company, competition is for losers. Capitalism and competition, people always say they're synonyms; I think they're antonyms, because if you have perfect competition, you compete away all the capital. If you want to have Darwinian competition, red in tooth and claw, you should open a restaurant. It's an awful, awful business. You will never make any money. It's perfectly competitive and completely non-capitalist. So from the inside, you always want to go for something like a monopoly.

  • And then, yes, in other parts of my book I also qualify it: there are dynamic monopolies that invent something new, that create something new for the world, and we reward them with patents and things like that. And then at some point, there's always a risk that these monopolies go bad, that they become like a troll collecting a toll at a bridge, that they're not dynamic and they become fat and lazy.

  • Are we there yet? I mean, Lina Khan, if she were sitting here, would say we got there a long time ago.

  • Man, if I had to defend Google, I would still say that it's still better run, even in its silly woke way, even in a slightly troll-like, toll-collecting way, than whatever completely destructive path Lina Khan would have for the company. And so we're still getting more good from Google as it is.

  • Do you feel that way about all the big tech companies? I mean, you have lots of investments in smaller companies that need to access the App Store on Apple's phone. Do you say to yourself that that should be opened up? Do you say they created the store, therefore they should control the store? How do you think about that kind of stuff?

  • There are a lot of complicated questions on all these things. Obviously, they're much bigger; we're in a very different place from where we were 10 years ago on these things. I still worry that, in many cases, the remedy is worse than the disease.

  • For a lot of these businesses, if you have a natural monopoly, the remedy is not to break it up. It's like a utility company, and the remedy is to regulate it or tax it or do various things like that. So if you could convince me that these companies are as static as a utility company, then maybe the remedy is to do something like that. But the real monopoly problems in our society, I think, are much more these old-economy, racket-like companies. I spent three months during COVID in Maui, and there's a single hospital in Maui, and there was sort of this line: if you have a pain, get on a plane. Because it's a local racket, it's completely mismanaged. The really dysfunctional monopolies in our society are these pretty big ones that control local markets and that are 100% troll-like toll collecting. And I think, even with all my misgivings about something like Google, it's a vastly morally superior place to your local hospital.

  • How do you feel about it in the context of AI, which is to say that if you believe AI is this transformative product, and that there's only going to be three or four players who are going to control all of these models, whether it be Google or Microsoft with OpenAI,

  • or maybe an Amazon along the way. I don't know where you think Apple is going to land in this conversation. But is that a good thing or a bad thing? And also, I would argue, even as an investor who looks at startups, how do you even look at startups down the line that could effectively get competed away, because I'm going to basically build my app with AI, and I'm just going to copy what you've made?

  • Well, I think it's in a very different place from the consumer internet type businesses, which there's been a history, they've been around for decades. If I had to make the anti-Google argument, it would be they won at search in 2002, and there's been no serious competition for 21, 22 years. They beat Microsoft and Yahoo in 2002. And then it's somehow very hard to disrupt that. And then I think the AI piece is extremely fluid. It's extremely hard to know. It's very hard to know where the value is. And as I said, it's like the obvious monopoly right now is NVIDIA. But it doesn't seem that durable. If you thought

  • NVIDIA is as durable as Google, I mean, the stock's really cheap. You should just buy it like crazy. And so what the market pricing is telling you is, yeah, they have a temporary monopoly, but it's not very robust. And then on the level of the software companies, I worry that even though OpenAI has a lead, all sorts of other people are going to be able to catch up pretty quickly. And if you have three or four doing the same thing, that's a lot more than one. Very, very, very different set of economics.

  • I want to pivot the conversation again, because another investment that you've made and been very public about is Bitcoin. And you have remained a very big bull. You have come out publicly and you said that enemy number one to Bitcoin is the sociopathic grandpa from

  • Omaha that you described as Warren Buffett. Can you tell me what you were thinking when you said that? It got a lot of laughs. So somehow it probably hit some kind of a nerve. But it was in a 2022 Bitcoin convention talk I gave. And there were three separate enemies. There was Jamie Dimon, Larry Fink.

  • Larry Fink, who's no longer an enemy, by the way.

  • He sort of shifted, but maybe I'll save what Larry Fink thinks for another time. And then there was Warren Buffett. And the rough context was, my sort of political, sociological analysis was that the cryptocurrencies were a revolutionary youth movement, but for them to really take over, it couldn't just be a student uprising like 1968. You needed to get the rest of the society on board. And as long as the old people were going to sit on their hands, that was the big blocker for cryptocurrencies to go to the next level.

  • Are you still convinced?

  • I think it's gotten partially unlocked with the Bitcoin ETF. But then probably the part where I'm less convinced of is this question of the sort of ideological founding vision of Bitcoin or these cryptocurrencies as sort of a cypherpunk, crypto anarchist, libertarian, anti-centralized government thing.

  • Isn't that what got you interested in the first place?

  • That's what I thought was terrific about it. And then the question is, does it really work that way? Or has that thread somehow gotten lost? And so when people in the FBI tell me that they'd much rather have criminals use Bitcoin than $100 bills, it suggests that maybe it's not quite working the way it was supposed to.

  • Have you sold any of your Bitcoin?

  • I still hold some. There are all these ways. I didn't buy as much as I should have. And I'm not sure it's going to go up that dramatically from here.

  • From here?

  • Yeah, I think we got the ETF addition. And I don't know who else buys it quickly from here.

  • That's some interesting investment advice. That actually surprised me because I don't think I've heard you say that. I thought you were still all in.

  • I still have a small position. It probably still can go up some, but it's going to be a volatile, bumpy ride. And I had a dual reason. One was this sort of ideological, decentralized future of computing world that I really do believe in, really believe would be better.

  • And it seemed like the perfect vehicle for that for such a long time. And I am just much less convinced of that.

  • Interesting.

  • So maybe Larry Fink with the BlackRock ETF surrendered to the forces, the anti-ESG forces, or maybe it's more like Bitcoin's been co-opted by them. And I worry it was more the latter.

  • Okay, different question. SpaceX, that's another big investment for you. After ousting Elon Musk, you became friends with him again. What does that look like to you in the future? Is that going to be the biggest and best investment you've ever made when this is all said and done?

  • Man, it's – I'm always sort of hesitant to pitch these companies too much. But I think, yeah, there were sort of a lot of different things that came together.

  • When Elon was building both Tesla and SpaceX in the 2000s, people thought he was just really, really crazy. And I think even a lot of those of us who had worked with him at PayPal – there was this PayPal book that David Sacks and I thought of writing.

  • The Elon chapter was, I think, entitled something like The Man Who Knew Nothing About Risk or something like this. And there were all these sort of crazy Elon stories I could tell.

  • And then if one of the two companies had succeeded, you would say, well, maybe he still got really lucky. But when two out of two companies that people thought were completely harebrained in the 2000s both succeed, man, you have to reassess it: somehow the rest of us are too risk-averse, or there's something about risk he knows that we don't, or something like this.

  • And so, yes, I think there's – and then

  • You didn't invest in Tesla.

  • We did not invest in Tesla. We should have invested in that one. It was public at a much earlier date. And then there's always sort of a self-imposed limitation that we tend not to invest in public companies.

  • There's 20% of a venture fund you could put into public companies, but that was sort of the – and I think they started Tesla in 2003. It went public in 2010.

  • I remember test driving the Model S in October 2012, and it was just, wow, this is just a terrific car.

  • And you could have – I think the correct thing would have been to wait until they came out with it, and then nobody liked it.

  • It was such a hated stock, shorted by everybody, and you could have just waited 10 years and just bought the shares in the public market, and you would have made 10 times your money in 18 months and 100 times in the next six, seven years.

  • Seven, eight years.

  • And then there was something also about SpaceX that looked like it was a very crazy, harebrained idea, and yet it was very straightforward.

  • It was the rocket launch business.

  • The government will pay – or the customers pay – for the vehicles before you build them, so it's actually cash flow positive from a very early stage. There's some money they needed for expansion, but it was basically a cash flow positive business.

  • It was a weird investment in 2008.

  • They didn't need any of the money, but there was some – I think some NASA or government rule where they needed outside investors.

  • And so they were forced to take investors, and then we were on good enough terms that we did it, and everyone else thought it was too crazy.

  • If you had been a Tesla shareholder, we've all been reading about it, would have you paid him the big compensation package?

  • I would have – well, I think the nuanced answer is I would have voted in favor of the compensation package, because you would know that if it failed, the share price would have gone down a lot the next day, because people would wonder whether Elon would quit, and that would be bad for the company.

  • So whether you believe in the package or not, the rational thing would be that you should vote for it.

  • And then if you think it's a bad idea, maybe you sell your shares after you get a pop or something like this.

  • So – and that's the obvious game theory on why Elon was going to win that vote no matter what.

  • And it was really crazy that we listened to people in the media – I'm not sure about you yourself – who were all saying it was this harebrained thing, and the shareholders were all going to vote against it.

  • And if you just did the basic analysis, it was obvious Elon was going to win the vote regardless of what the shareholders

  • What did you think of him investing in X? Did you – he thinks – by the way, X is what he wants – what he wanted PayPal to be.

  • Did you give him money for that?

  • I – we did – yeah, we didn't do anything on the Twitter one.

  • We didn't do anything on the current XAI company.

  • I guess there's sort of a lot of different things that have X in the name with Elon.

  • But yeah, it basically – I don't know.

  • I think it was an incredibly – I do think we need a broader surface area for debate in our society.

  • And so I think – and obviously there are all these very complicated tradeoffs between how much speech do you suppress, how much good speech are we suppressing, how much bad speech are we allowing, how do you get those tradeoffs right.

  • Very, very hard to do.

  • My judgment is we should have just a lot more surface area for debate, discussion.

  • And I think what Elon did with Twitter was I think extremely important.

  • And I support it as an ideological project.

  • I worry about it as a financial thing.

  • I don't know if that works.

  • We've looked over the years.

  • We've looked over and over again at starting some kind of media company.

  • And there's always sort of this thought that you could do – can't you do something else in the sort of right-of-center media space?

  • And does it have to all be as lame as Fox News?

  • Isn't there an opening to do something else?

  • And then the question you always have to ask is it the Murdoch family that keeps it lame or is it

  • Why do you think – you called it lame.

  • So why do you think it's lame?

  • I think it's lame because they're controlled by the advertisers, and there's a very narrow limit on what they can do.

  • And then the Elon question with Twitter was are you really allowed to do this and keep the advertisers?

  • And so that's where

  • Would you make it harder?

  • It's super important what Elon did as a nonprofit.

  • But it's – yeah, it may not – it's going to be tough as a business.

  • What about Truth Social?

  • They have a few other problems they have to solve first.

  • Not something you'd invest in.

  • Can you get your head around the $6 billion valuation?

  • If I wanted to secretly funnel money to the Trump campaign and get around the campaign limitations so the stock price goes up and he can sell some stock and fund his campaign, that might be a reason to invest.

  • Do you think people are doing that?

  • Probably not.

  • They probably don't think of it in quite those literal terms, but maybe that's what's going on.

  • Do you know people?

  • Have you talked to people in your realm who have said, hey, this is how we're going to get money?

  • Nobody has said that, but, yeah, it's probably – I suspect a lot of the investors are going to vote for Trump.

  • So they're thinking about it at least on some subconscious, not articulated level.

  • I want to talk about Trump – a little more about Trump in just one more second, but I want to ask you one last related social media question, which is the surgeon general was here in Aspen.

  • And I think you've probably seen in the last couple weeks that he came out and genuinely believes that social media and the Facebooks of the world really have done a real disservice to young people in the country.

  • And I just wonder what you think of that as somebody who invested early in Facebook.

  • Man, there's – I think – I can't say that he's 100 percent wrong.

  • The place where I always push back on is that I feel it's too easy to turn tech or the social media companies into the scapegoat for all of our problems.

  • And so, yes, there's probably – there is some kind of an interesting critique one can make of the tech companies.

  • And if you ask how many of the executives in those companies, how much screen time do they let their kids use?

  • And there's probably sort of an interesting critique one could make.

  • What do you do?

  • Not very much, and I think that's very

  • What's not very much?

  • An hour and a half a week.

  • Hour and a half a week.

  • Something like that.

  • How old are your kids?

  • Three and a half, five years.

  • Three and a half and five years old, OK.

  • But – and I think that is sort of – if I were to make the anti-tech argument, it's that there are probably a lot of people in tech who do something quite similar for their own families, and there are some questions that might lead you to ask.

  • And then on the other hand, I don't think this is the main cause for all the different types of social dysfunction we have, and maybe it's a 15%, 20% cause.

  • There's sort of a lot of other things that have gone super haywire in our society, and by putting all the blame onto tech or onto one company, you are really ignoring a lot of other stuff.

  • We could do a whole panel on this, but one related question, because we haven't mentioned it, TikTok.

  • Do you think of TikTok as a national security threat?

  • It's – yeah, it's a very – there's something very strange going on, since obviously the TikTok algorithms for the U.S. are very different from the ByteDance algorithms in China.

  • And so

  • Would you shut it down in this country?

  • I think – I probably would lean towards a tougher response.

  • I think just to shift from the normative to the – I don't think we're going to do anything.

  • I met the TikTok CEO last summer, and I – the Singaporean guy, the TikTok CEO.

  • And I told him he didn't need to worry about it being shut down in the U.S., and maybe I'm wrong, but I think

  • Because you don't –

  • Because it will – we are incompetent and slow and bureaucratic, and we will never get our act together in dealing with the problems of China until the day they invade Taiwan, and then it will be shut down within 24 hours.

  • And since I think there's a 50-50 chance that China will invade Taiwan in the next five years, my advice to the TikTok CEO was you should take all your people and computers and get them out of China, because once Taiwan gets invaded, it'll be too late.

  • So that's my advice.

  • But you don't need to worry about us doing anything before then.

  • And then his somewhat – I'm not sure good or worrisome answer was that they had studied World War I and World War II very carefully, and there were a bunch of companies who were able to trade with all sides in those wars.

  • By the way, that implies that he also thinks that China is going to invade Taiwan.

  • He did not – you know, again, it was – again, I didn't frame it deterministically.

  • I said 50 percent chance, five years.

  • We are over time, but we're going to keep going just for a little bit, because I promised you we were going to talk a little bit about politics, and I want to talk about your own politics, your own personal politics.

  • You were very vocal and outspoken about supporting who is now the former president the last time.

  • You have been less outspoken this time.

  • We're all going to watch the debate tonight.

  • So before we even get into the lessons and everything that you've learned and all of your prior experience, are you planning to support the president this time?

  • You know, I –

  • The former president, I should say.

  • You know, you hold a gun to my head.

  • I'll vote for Trump.

  • I'll still – I'd rather have him than Biden.

  • I'm not going to give any money to his super PAC.

  • I'm going to be less involved in all these ways.

  • And, man, look, it is – and then I don't know.

  • I think Trump will win.

  • I think he will win quite solidly.

  • I don't think it's going to even be close.

  • And then my pessimistic look-ahead function is, after he wins, there will be a lot of buyer's remorse, because elections are A/B tests.

  • You know, if you ask me to make a pro-Trump argument, I wouldn't, but I can probably come up with anti-Biden arguments, and Biden is not going to make a pro-Biden argument.

  • He's going to make anti-Trump arguments.

  • And it's these two different hate factories that we have targeted at each other, and that's the way the politics work.

  • And my judgment is Trump will easily win that.

  • But, yeah, the election is a relative choice.

  • The post-election is absolute.

  • And then it will be like if Biden wins, how did we get this senile old man?

  • And if Trump wins, it will be, wow, this is

  • Can I just ask you this?

  • It's still like this clown show or whatever people will say.

  • I'm not going to ask you to make the pro-Trump argument.

  • But that's sort of

  • I understand.

  • But let me ask you about what I imagine is your anti-Biden argument.

  • I look at the last four years and say to myself, if you were in Silicon Valley and you owned stock in these tech companies over the last four years, they virtually did nothing but go up.

  • And I know – and I just – I wonder if you can make the argument, because we can talk about Lina Khan and we can talk about regulations and potential taxes and all sorts of things, but it's hard for me to look at the last four years and say, especially if I was sitting, I would imagine, in your seat, and say, this was a terrible travesty.

  • But maybe I don't understand.

  • Well, I mean, I don't know.

  • This may not be believable to you, but I don't think the only thing I care about is whether the country is good for the tech billionaires.

  • And I think there are a lot of people who have not experienced the last three or four years this way.

  • I think one of the things Carville said in the earlier presentation just before this one that I thought was quite good was that there's been a shocking loss of support for the Democrats among 18- to 35-year-old voters, and it's because you can't get on the housing ladder.

  • You can't – the college debt's overwhelming.

  • You can never get started.

  • And so there's sort of a sense that

  • And you think he'd be able to fix that?

  • I don't – there's – I think it's just an up-down referendum on the incumbent at this point.

  • And my guess is that the sense is Biden's definitely not going to fix it and his time will run out, and that's –

  • And then this is where I'm not overly excited.

  • I don't think Trump will particularly fix it.

  • But, look, the place where people at this – in the audience here, I think, are just maximally divergent is, yes, the stock market has been great for people here.

  • You're in this wonderful bubble in Aspen where it's like, I don't know, Clinton is still president, and it's 1995.

  • And everything is just getting better every day in every way, and it's like some New Age chant.

  • If you just say that to yourselves, it's true.

  • And then the part of the Trump statement that I think was the most offensive thing he said – it was very offensive not just to Democrats but to Republicans, and especially to Silicon Valley – was "Make America Great Again," because that was a pessimistic slogan.

  • It was the most pessimistic slogan a major presidential candidate ever had because what it says implicitly is this is no longer a great country.

  • And that's what you are never supposed to say, especially if you're a Republican.

  • That's why the Bush people probably hate him more than anybody in this audience.

  • And then Silicon Valley was – it's somewhat offensive to people in New York City, but the bankers on Wall Street don't really think they're making the country a great place, so it's not personally offensive.

  • It was personally offensive to Silicon Valley.

  • And yet I always think there is this problem of stagnation.

  • There is this problem we're stuck.

  • There's a sense that we're not progressing in all these ways as a society as much as we once did.

  • I don't think Trump has all the answers, but I think what I said in 2016 is the first step towards solving problems is to at least talk about them.

  • What about the polarization part?

  • The polarization part, the uncertainty part, the questions about democracy and the rule of law and the future of a country, and I think there's a lot of people who worry about those things.

  • Sure. Those are all still

  • Those are things you probably wouldn't worry about if Biden was the president, right?

  • I feel the country is still very polarized.

  • It's been getting more polarized for decades.

  • It was polarized against Bork in the 80s, and that was sort of a new crescendo in polarization.

  • That's the way Fox News was polarized against the Clintons, and we've been – I don't know.

  • It's always what's cause and effect.

  • Is the polarization causing the stagnation, or does the stagnation lead to the polarization?

  • I don't think the polarization just happens in a country where everything is growing.

  • Let me ask you a tonal question.

  • You come across as a very sensible, reasonable person, I think.

  • There are people here who I'm sure will disagree with you about lots of different issues, but my question is about tone, about the president and the tone of the president – by the way, I should say, and I'm not speaking out of school.

  • You also, I would say, by the way, you've been a Republican for a long time now, public about that.

  • You're also proudly gay, openly so, and I wonder if you can tie

  • I think there's a lot of people who say to themselves that President Trump, when he talks about some of the issues around LGBT issues in this country and other people, there are people in those communities who say they don't feel safe about it.

  • Yeah, well, we go through all these different versions of that.

  • I think – I mean there was never any thought of reversing gay marriage or any of those things, by Trump at least.

  • And I think the – and look, I think the – yeah, there are all these ways.

  • They're not the way I would articulate these things.

  • But the sort of polite tone – there was – people had attempted to say something's gone very wrong in our country.

  • The house is on fire. It's burning to the ground.

  • We are a society in decline, stagnation, where – I mean maybe AI will save us, but this is like the way people talk about AI, just to come back to that.

  • It's like if it doesn't lead to this cornucopian growth, I mean we're just completely going to be buried by budget deficits and debt for decades to come.

  • And it's – like I think AI is a big thing.

  • Is it big enough to solve our budget deficit problem?

  • I don't believe it is.

  • And so yeah, we have a lot of these problems.

  • And at some point, extra politeness is not quite the thing.

  • It was an inarticulate shriek for help.

  • And look, my sort of fantasy in 2016 in supporting Trump, this was where I was completely delusional, was this would be the way you start to have a conversation.

  • And that's another reason why I'm off-ramping it.

  • I'd much rather have the sort of conversation we had here, because if I lean in all the way to support Trump, it'll be all about that and we can't talk about all these other things, which is the way we are going to substantively solve the problems.

  • Well, I want to thank you for this conversation and for addressing all these issues.

  • It really was a phenomenal discussion.

  • Thank you so, so much.

  • All right. Thank you very much.

  • Thank you.
