Transcript of E111: Microsoft to invest $10B in OpenAI, generative AI hype, America's over-classification problem
This episode of the All In podcast covers Slate's article on All-In, a San Francisco business owner caught spraying a homeless person, Microsoft's investment in OpenAI, Biden's documents, and the best cabinet positions and ambassadorships.
This transcription was generated by AWS Transcribe + GPT-3 and might be inaccurate. Full copyright belongs to the All In Podcast hosts.
Chamath: Is anyone else seeing a half-second lag with Jake? It looks like a second delay.
Jason: Testing, testing, 1, 2, 12.
Chamath: It always happens here; it comes and goes. Relax, relax.
Sacks: Are we going? We're recording this; it's plus material. This is Chamath at the punchline.
Jason: Alright everyone, welcome to episode 111 of the world's greatest podcast, according to Slate. The podcast that shall not be mentioned by the press, apparently. Well, they did mention it; this is the conundrum. It's such a phenomenon, we're the number one business and the number one tech podcast in the world, hands down, that the press has a hard time giving us any oxygen: they want to hate us, and yet they want to cover it.
Chamath: You're saying they take the ideas but they don't want to cite it?
Chamath: They don't want to cite it.
Jason: They don't want to cite it, but anyway, shout-out to Slate.
Friedberg: What I found interesting was the point that we don't want to be subject to independent journalists asking us independent questions, so we go direct. That does seem to be the trend nowadays: people say they want to go direct because they don't want to be subject to independent journalists.
Jason: One might ask oneself why the subjects of these stories don't want to go through journalists. Yeah, exactly.
Sacks: Yeah, there's a specific reason why principals, the subjects of stories, do not want to have the press interpret what they're saying: they don't feel they're getting a fair shake. The challenge, they say, is that we avoid independent scrutiny and independent points of view, and meanwhile they're constantly writing hit pieces about us. The question is, when we want to present our side of it, do we need to go through their filter or not? Why would you go through their filter when it's always gonna be a hit piece? They have a class hatred, basically, of technology entrepreneurs and investors. You're right, Jake, they don't hate you, because you genuflect to their political biases. You see, if you do what SBF did, which is basically agree with all of their biases, then yes, they'll treat you better. That's the deal. As for specific large media outlets, you can name one. I'll trade you. I'll tell you what, I'll trade you Fox for MSNBC and CNN and The New York Times, The Washington Post, and The Atlantic, and on and on and on. You get a lot of mileage out of being able to name Fox. The fact of the matter is, that's a podcaster; she's independent. Now name one, literally one, outlet that is not part of this mainstream media, because they all think the same way; there are very small differences in the way they think. It's all about clicks at this point, and it's all about journalism as advocacy. Is that what you're calling advocacy: bias?
Jason: That's what I'm talking about, activism journalism. Yes.
Friedberg: I think Draymond also highlights a really important point: he started his podcast and it's become one of the most popular forms of sports media, and he can speak directly, without the filtering and characterization done by journalists. It seems to be a really powerful trend; the audience really wants to hear direct, unfiltered, raw points of view. And there may still be a role for journalism separate from that, which is to scrutinize, analyze, question, and critique. Are they activists, or are there also journalists out there, Sacks?
Sacks: Well, it depends on what the topic is and what the outlet is. But I would actually argue that most of these journalists are doing what they're doing for the same reason that we're doing what we're doing, which is they want to have some kind of influence because they don't get paid very much. But the way they have influence is to push a specific political agenda. I mean, they're activists; they've basically become advocacy journalists.
Jason: Yes, that's a term I coined for it.
Chamath: You guys saw this brouhaha where Matt Yglesias wrote this article about the Fed and the debt ceiling, and through this whole multi-thousand-word piece, he didn't understand the difference between a percentage point and a basis point when calculating the interest?
Jason: Wait a second. You're saying the Fed is raising rates by 25%?
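For readers who want the arithmetic behind the joke: a basis point is one hundredth of a percentage point, so reading 25 basis points as 25 percentage points overstates a rate move by a factor of 100. A quick sketch with illustrative numbers (not figures from the Yglesias piece):

```python
# Illustrative arithmetic only; the debt figure below is made up, not from the article.
# A basis point (bp) is 1/100 of a percentage point, i.e. 0.0001 as a decimal.

BP = 0.0001  # one basis point as a decimal fraction

hike_bps = 25
correct = hike_bps * BP        # 0.0025 -> 0.25 percentage points
misread = 25 / 100             # 0.25   -> 25 percentage points

print(correct, misread, misread / correct)   # 0.0025 0.25 100.0

# On a hypothetical $1,000,000 of debt, the difference in annual interest:
print(1_000_000 * correct)     # $2,500 at 25 basis points
print(1_000_000 * misread)     # $250,000 if you misread it as 25 percentage points
```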
Friedberg: That's a huge difference, 25 basis points versus 25%. And there is a difference between a principal and an outside analyst, right? A principal typically has a better grasp of the material. But, you know, the argument from a journalist is considered the conventional wisdom within their circle.
Friedberg: I understand. But the argument from a journalist is that, by having that direct access, the principal is also biased, because they are an agent, a player on the field; they have a point of view and a direction they want to take things. So it is a fair comment that journalists can theoretically play a role as an off-field analyst: they may be less educated on the subject, but also less biased than we are.
Chamath: That may or may not be true; what the two of you are debating is a very subjective take. But the thing that is categorical and cannot be denied is that there are zero checks and balances when something as simple as the basis-point-versus-percentage-point difference isn't caught in proofreading, isn't caught by any editor, and isn't caught by the people who help them review this. And what that says is that all kinds of trash must get through, because there's no way for the average person on Twitter to police all of this nonsensical content. This one was easy because it was so numerically illiterate that it just stood out. But can you imagine the number of unforced errors journalists make today, in their search for clicks, that don't get caught and may actually tip somebody to think A versus B? That, I think, is the thing that's kind of undeniable.
Jason: There is a very simple test for this. If you read journalists writing about a topic you are an expert on, whatever the topic happens to be, you start to understand, "Okay, on that story I'm reading, they understand about 10 or 20 or 30% of what's going on." But then when you read stories you're not involved in, you know, a story about Hollywood or, I don't know, pick an industry or a region you're not super aware of, you're like, "Okay, well that must be 100% correct." And the truth is, there's a name for that: it's called the Gell-Mann Amnesia effect. You just plagiarized Michael Crichton, who came up with that. Yeah. So, you know, it's exactly right. But I think it's worse than that, because now the mistakes aren't being driven just by sloppiness or laziness or a lack of expertise; I think they're being driven by an agenda. So, just to give an example on the Slate thing, the Slate article actually wasn't bad. It kind of made us seem, you know, cool. The sub-headline was "A close listen to All In, the infuriating, fascinating safe space for Silicon Valley's money men." Okay. But the headline changed. I don't know if you guys noticed this. The headline now is "Elon Musk's Inner Circle Is Telling Us Exactly What It Thinks." First of all, they're trying way too hard to describe us in terms of Elon, who has been on maybe two episodes out of 110. But before "Inner Circle," the word they used was "Cronies," and then somebody edited it, because I saw "Cronies" in one of those tweet summaries, you know, where it does a capsule or whatever.
Sacks: Yeah, and those get frozen in time. So, you know, they were trying to bash us even harder, and then somebody took another look at it. What happened? I'll tell you what happens in the editorial process. Whoever writes the article submits it, and maybe it gets edited and proofread; maybe it doesn't even get that, because in some publications they don't have time for it, they're in a race. Then somebody who's really good at social media picks six or seven headlines. They A/B test them, and they even have software for this where they will run a test. Sometimes they'll do a paid test: they put $5 of ads on social media and see which one performs the best, and that's the one they go with. So it's even more cynical, because people who read the headlines sometimes don't read the story, right? Obviously, most people just see the headline and interpret that as the story. That's why I told you, when they did that New Republic piece on you with that horrific monstrosity of an illustration, don't worry about it. People just read the headline; they know you're important. Nobody reads the story anyway.
Sacks: But it wasn't a bad article actually.
Jason: It was well written; I was actually in shock. I was like, "Who is this writer who actually took the time to write something so decent?"
Sacks: Yeah, he had listened to a lot of episodes. Clearly.
Chamath: That was a really good moment, actually. That was great advice because you gave it to him and you gave it to me, because both of us had these things, and Jason said the same thing: just look at the picture and if you're okay with the picture, just move on. And I thought, "This can't be true," and it turned out to mostly be true.
Sacks: But my picture was terrible. Yeah.
Jason: But it's close to reality. So, I mean, there was Peter Thiel.
Sacks: But that just shows how ridiculously biased it is, right? Major look.
Jason: It looks like Hugh Grant. Let's put that up one more time here. Not bad; it kind of looks like Hugh Grant in Notting Hill.
Sacks: I knew that article was going to be fine when the first, you know, item they presented as evidence of me doing something wrong was basically helping to oust Chesa Boudin, which was something that was supported by like 70% of San Francisco, a 90% Democratic city. So, not exactly evidence of some, you know, out-of-control right-wing movement.
Jason: Look at the headline: "The Quiet Political Rise of David Sacks, Silicon Valley's Prophet of Urban Doom." I'm just letting you know, people don't get past the six words in the image. 99% of people were like, "Oh my God, congrats on the New Republic article." The body could have literally been, what do they call it, lorem ipsum? It could have just been filler words from the second paragraph down and nobody would know.
Sacks: But now, apparently, if you notice that San Francisco streets look like The Walking Dead, then apparently you're a prophet of urban doom. I mean, people are so out of touch that they can't even acknowledge what everyone can see with their own eyes.
Jason: That's the bias; it's gotten crazy. And I don't know if you guys saw this really horrible, dystopian video of an art gallery owner who's been dealing with owning a storefront in San Francisco, which is challenging: having to clean up feces and trash and whatever every day. I guess the guy snapped, and he's hosing down a homeless person who refuses to leave the front of the store. The inhumanity in this is just insane. You're hosing down a human being, someone who is obviously not living a great life. It's terrible.
Sacks: I can feel for both of them. I agree that it's not good to hose a human being down. On the other hand, think about the sense of frustration that store owner has, because he's watching his business go in the toilet, because he's got homeless people living in front of it. So they're both being mistreated: the homeless person is being mistreated, and the store owner is being mistreated by the city of San Francisco. That store owner is not in a privileged position; he's probably fighting to stay in business. I'm not saying what he did is right, but what you're trying to do is say, oh my God, look at this homeless person being horribly oppressed. Know that the store owner is a victim too.
Jason: Yeah, there's no doubt it's horrible to run a business, you know. This is symbolic of the breaking down of basic society. Like, these two people are obviously... It's just a horrible moment to even witness. It's like something...
Chamath: Jason, do you have equal empathy for the store owner and the homeless person or no?
Jason: Under no circumstances should you hose a person down in the face who is homeless. It's just horrific to watch; it's inhumane. This is a human being. Now, as for the person who owns the store: my dad grew up around local businesses. People would abuse the store, you're trying to make a living, and you've got to clean up, you know, whatever excrement every day, which...
Sacks: In that moment, the empathy is not equal; I think you obviously have more empathy for the person on the receiving end of that hose. Okay, but in general, our society has tons of empathy for homeless people. We spend billions of dollars trying to solve that problem. You never hear a thing about the store owners who are going out of business. So, on a societal level, not in that moment but in general, the lack of empathy is for these middle-class or working-class store owners who are struggling to stay afloat. And you look at something like, what is it, a quarter or a third of the storefronts in San Francisco are now vacant. This...
Jason: The shocking thing is like this person is running an art gallery storefront in San Francisco. Like, why would you even bother? Why would you bother to have a storefront in San Francisco? I mean, everybody's left. It's just...
Chamath: What do you mean, why would you bother? He opened the store. What is he supposed to do all of a sudden?
Jason: Well, no, I mean you would shut it down at some point and find an exit and just like, businesses have large fixed costs, right? 10 years ago.
Jason: Exactly. At some point you have to shut down your store in San Francisco, as soon as you can.
Chamath: The solution to everything, Jake, isn't to go to coding school online and then you know...
Jason: It was just a suggestion to consider moving to another city. True.
Chamath: A lot of folks in Silicon Valley, I think, in this weirdly messed up way do believe the solution to everything is to learn to code or learn to get a job, get a gig.
Friedberg: The guy spent years building his retail business. A homeless person camps out in front; he calls the police, but the police don't come and move the homeless person, so the homeless person stays there. He asked nicely for them to move.
Chamath: Customers are uncomfortable going into stores.
Friedberg: Yeah, I stopped going to certain stores in my neighborhood because of homeless tents being set up right in front of the store, so I go to the store down the road to get my groceries or whatever. It's not an uncommon situation for a lot of these small business owners; they don't own the real estate, they're paying rent, they have high labor costs, and everything is inflating. Generally, the city population is declining. It's a brutal situation all around.
Chamath: The problem with everyone learning to code or driving an Uber is that, in the absence of things like local stores and small businesses, you hollow out communities. You have these random, detached places where you kind of live, and then you sit in your house, which becomes a prison, while you order food from an app every day. I don't think that is the society people want. So, I don't know, I kind of want small businesses to exist, and I think the homeless person should be taken care of. But the small business person should have the best chance of trying to be successful, because it's hard enough as it is; the mortality rate of small businesses is already 90%.
Jason: It's impossible in San Francisco. Let's be honest, I'm not trying to push people to leave.
Sacks: You are, because here's what you're saying: "I'm just shocked that the guy even has a storefront; I would have left already."
Sacks: You're sharing a tweet. That's a moment in time, and you're not showing the ten steps that led up to it. Oh, a thousand steps. He called the police five times, he heard it from the customers, all the stuff that we were just talking about.
Friedberg: Maybe there was physical conflict that we didn't see, and he's resolving it. It's really hard to look at these videos and know what's going on. It's awful to see, but we don't know—it's a whole thing.
Sacks: Actually, you want to know another reason why we can't solve this problem? It's the language we use. The fundamental problem here is not homelessness, it's addiction and mental illness. Shellenberger has done the work. Like he said, 99% of the people he talks to, it's either mental illness or addiction, but we keep using this word 'homeless' to describe the problem. The issue here is not the lack of housing, although that's a separate problem in California; it's basically the lack of treatment. So we should really be calling them the 'untreated.'
Friedberg: And mandates around this, and enforcement. You can't have a super drug be available for a nominal price, give people, you know, a bunch of money to come here and take it, and not enforce the law. You have to draw the line at fentanyl. I'm sorry, fentanyl is a super drug.
Friedberg: The alternatives are mandated rehab, mandated mental health services, jail, or, if you're not breaking the law and don't have a drug addiction, housing services. Those are the four paths. If all four of these paths were both mandated and available in abundance, this could be a tractable problem. Unfortunately, the mandate... you guys remember that Kevin Bacon movie where Kevin Bacon was locked up in a mental institution but he wasn't mentally ill? It's a famous story. What's that famous story? You guys are probably going to call me an idiot for messing this whole thing up. But I think there's a story where mandated mental health services, like locking people up to take care of them when they have mental health issues, became kind of inhumane.
Friedberg: And a lot of the institutions were shut down and a lot of the laws were overturned. There were many of these cases where what happened to people who weren't mentally ill came across as quite torturous. And so the idea was, let's just abandon that, right? And it's unfortunate. But we talk a lot about nuance and gray areas, and there's certainly some solution here that isn't black or white. It's not about not having mandated mental health services, and it's not about locking up everyone who has some slight problem. There's some solution that needs to be crafted where you don't let people suffer, both the victim on the street and everyone else. I'm talking about a 5150, I think, when people are held because they're a danger to themselves or others, that kind of thing. Right?
Sacks: But think about the power of language here. If we referred to these people as untreated persons instead of homeless persons, and that was the coverage 24/7 in the media, that this is an untreated person, the whole policy prescription would be completely different. We'd realize there's a shortage of treatment, and a shortage of remedies for getting people into treatment, as opposed to a shortage of housing. But why? Why?
Friedberg: And laws that mandate it, not just enable it. Because if you don't mandate it, then you enable the free rein: the free living on the streets, the open drug markets, and all this sort of stuff.
Jason: There's a really easy test for this. If it were yourself or a loved one, one of your immediate family members, would you want them to be picked up off the street and held with a 5150 or whatever code, involuntarily, against their will, because they were a danger? Or would you want them to be allowed to remain on the street? What would you want for yourself if you were in those dire straits? And the answer, of course, is you would want to be picked up.
Friedberg: From a policy perspective on this, J-Cal, let me ask you as our die-hard liberal on this show: what percentage of the time do you vote Democrat?
Jason: I'm not a die-hard liberal; I vote Democrat 75% of the time.
Sacks: What percentage of independents or Democrats vote for Democrats, 5% Republicans?
Friedberg: Is it not that your individual liberties are infringed upon if you were to be, quote, "picked up and put away"?
Jason: You know my position on it: if you're high on fentanyl, you're not thinking straight, you're not thinking for yourself, and you could lose that liberty for a small period of time, 72 hours, a week, especially if you're a danger to somebody, yourself or other people. And in this case, if you're on fentanyl, if you're on meth, you're a danger to society. And I think if more people had that point of view and had that debate in a more open way, you could get to some path to resolution.
Chamath: That's not how it happens. So, you guys know this, we won't say who it is, but someone in my family has some pretty severe mental health issues. And the problem is, because they're an adult, you can't get them to accept any form of treatment whatsoever, right? You only have the nuclear option, and the nuclear option is you basically take that person to court and try to seize their power of attorney, which is essentially saying that their individual liberties are gone. And by the way, it is so unbelievably restrictive what happens if you lose that power of attorney and somebody else has it over you. It's just a huge burden that the legal system makes extremely difficult. And the law is a backstop: if the person is doing something illegal, like camping out or doing fentanyl, meth, whatever, you can use the law as a backstop against that person. But all that can really do is get them arrested, and even that is not a high enough bar to actually get power of attorney over somebody. The other thing I just wanted you guys to know, a little historical context: a lot of this crisis in mental health started because Reagan defunded the psychiatric hospitals; he emptied them in California. And that compounded because, for whatever reason, his ideology was that these things should be treated in a different way. And when he got to the presidency, one of the things he did was repeal the Mental Health Systems Act, which broke down some pretty landmark legislation on mental health. I don't think we've ever really recovered, and we're now 43 years on from 1980. Just something for you guys to know. His legacy has a lot of positives, but that's one definite negative check in my book: his stance on mental health in general and what he did to defund it.
Sacks: Well, let me make two points there. I'm not defending that specific decision. There were a bunch of scandals in the 1970s, epitomized by the movie One Flew Over the Cuckoo's Nest with Jack Nicholson, about the conditions in these mental health homes, and that did create a groundswell to change the laws around this. But I think this idea that somehow Reagan is to blame, when he hasn't been in office for 50 years, as opposed to the politicians who have been in office for the last 20 years, just lets them off the hook. I mean, Gavin Newsom, 10 or 15 years ago, when he was mayor of San Francisco, declared that he would end homelessness within 10 years. He just made another declaration like that as governor. So you're saying it's interesting historical context, but I think it's letting them off the hook. Society needs to start thinking about changing priorities.
Sacks: We didn't have this problem of massive numbers of people living on the streets ten or fifteen years ago; it was a much smaller problem, and I think a lot of it has to do with fentanyl. The power of these drugs has increased massively, and there are other things going on here too. So, in any event, you can question what Reagan did in light of current conditions, but I think this problem became an order of magnitude bigger in the last ten to fifteen years.
Jason: Until people realize that these are a different class of drugs and start treating them as such, it's going to just get worse. And as far as I know, Reagan never handed out $800 a week to addicts to feed their addictions while letting them live on the street in San Francisco, which is the current policy of the city.
Chamath: All I wanted to do was provide that color. We had a system of funding for mental health infrastructure, particularly local mental health infrastructure, and we took that away and it never came back. All I'm saying is, I'm just telling you where that decision was made. And I think that's part of the solution here: yeah, we're going to have to basically build up shelters, to support your point. The problem now, for example, is that Gavin Newsom says a lot of these things, but he's gone from a massive surplus to a $25 billion deficit overnight, which we talked about even a year ago, because that was just the law of large numbers catching up with the state of California, and he's not in a position now to do any of this stuff. So this problem may get worse.
Sacks: Well, they did appropriate, I forget the number, but it was something like $10 billion to try and solve the problem of homelessness. I would argue that they're not tackling it in the right way, because a giant special interest has formed around this problem: the building industry, which gets contracts to build 'affordable housing,' the homeless-industrial complex. They end up building 10 units at a time on Venice Beach, which is the most expensive land you could possibly build on. So now there's a giant special interest and lobby that's formed around this. If you really wanted to solve the problem, you wouldn't be building housing on Venice Beach; you'd be going to cheap land just outside the city and building large-scale shelters that can house 10,000 people, not 10, with treatment services built into them. That's not what they're doing.
Chamath: By the way, do you guys want to hear This Week in Grift? This is a great example of grift. I read something today in Bloomberg that was unbelievable. There's about $2 trillion of debt owed by the developing world that has been classified by a nonprofit, the Nature Conservancy in this case, as eligible for what they call nature swaps. So this is two trillion of the umpteen trillions of debt that's about to get defaulted on by countries like Belize, Ecuador, Sri Lanka, the Seychelles, you name it. And what happens now is the big bulge-bracket Wall Street banks and the Nature Conservancy go to these countries and say, "Listen, you have a billion-dollar tranche of debt that's about to go upside down, and you're gonna be in default with the IMF. We'll let you off the hook, and we will negotiate with those bondholders to give them 50 cents on the dollar. But in return, you have to promise to take some of that savings and protect the rainforest, or protect a coral reef, or protect some mangrove trees." All sounds good, except then what these folks do is take that repackaged debt, call it ESG, mark it back up, and sell it to folks like BlackRock, who have decided that they must own this in the portfolio. So it literally just goes from one sleeve, where it was marked as toxic emerging-market debt, into someone's 401(k) as ESG debt. Is that unbelievable?
Jason: So you take your savings and buy some ESG investments to make yourself feel good, and there's two trillion dollars' worth of this.
Sacks: All you need to know about ESG is that Exxon is like the number seven top-ranked company according to ESG, and Tesla isn't even on the list. It's crazy; it's a complete game. All of those letters individually mean so much and should be worth a lot to a lot of people, but when you stick them together it creates this toxic soup where you can just hide the cheese. Yeah, governance is important in companies, of course the environment is important, social change is important, but why are these things grouped together? It just perverts all of them. Alright, speaking of Microsoft, they're gonna put $10 billion or something into OpenAI, the maker of ChatGPT. Generative AI, or degenerate AI as I'm calling it now, is the hottest thing in Silicon Valley. The technology is incredible. I mean, you can question the business model maybe, but the technology is pretty powerful. Yeah, so what I'd say is, it's a $29 billion valuation, and a lot of it is Azure credits.
Friedberg: That's one way to look at it. We should also look at a lot of other businesses that ended up being worth a lot down the road. I mean, sure, you can model out the future of a business like this and create a lot of really compelling, big outcomes, potentially.
Jason: So Microsoft is close to investing $10 billion in OpenAI in a very convoluted transaction that people are trying to understand. It turns out that they might get 75% of the cash and profits back over time until they recoup the investment, and then wind up owning 49% of OpenAI. So they would get paid back the $10 billion over some amount of time, and this obviously includes Azure credits. GPT-3, as everybody knows, is an incredible demonstration of what AI can do in terms of text-based creation of content and answering queries, and it's taking the net by storm. People are really inspired by it. Do you think that this is defensible, real technology, or do you think this is a crazy hype cycle?
Sacks: Well, it's definitely the next VC hype cycle. Everyone's kind of glomming onto this because VC really needs a savior right now. Just look at the public markets; everything we're investing in is in the toilet. So we all really want to believe that this is gonna be the next wave. But just because something is a VC hype cycle doesn't mean that it's not real. As I think one of our friends pointed out, mobile turned out to be very real. Cloud turned out to be, I'd say, very real. Social was sort of real in the sense that it did lead to a few big winners. On the other hand, Web3 and crypto was a hype cycle that hasn't turned into anything big yet; probably just a hype cycle so far. No one can even explain what Web3 is. In terms of AI, if I had to guess, I would say the hype is real in terms of its technological potential. However, I'm not sure how much potential there is yet for VCs to participate, because right now it seems like this is something that's going to be done by really big companies. OpenAI basically looks like kind of a Microsoft proxy. You've got Google, which I'm sure will develop it through their DeepMind asset, and I'm sure Facebook is gonna do something huge in AI. So what I don't know is whether this is really a platform that startups are gonna be able to benefit from. I will say that some of the companies we've invested in are starting to use these tools. So, I guess where I am is: I think the technology is actually exciting, but I wouldn't go overboard on the valuations. I wouldn't buy into that level of the hype.
Jason: But you think there could be hundreds of companies built around an API for something like GPT-3?
Sacks: Maybe. Yeah, I don't think startups are going to be able to create the AI themselves, but they might be able to benefit from the APIs. Maybe that's the thing that has to be proven out.
Friedberg: There's a lot of really fantastic machine learning services available through cloud vendors today, right, as there have been a number of these kinds of vendors, and obviously OpenAI is building tools a little bit further down the stack, but there's a lot of tooling that can be used for specific vertical applications. Obviously the acquisition of InstaDeep by BioNTech is a really solid example, and most of the big dollars that are flowing in biotech right now are flowing into machine learning applications, where there's some vertical application of machine learning tooling and techniques around a specific problem set. And the problem set of mimicking human communication and doing generative media is a consumer application set that has a whole bunch of really interesting product opportunities. But let's not be blind to the fact that nearly every other industry and nearly every other vertical is being transformed today. There's active progress being made in funding and getting liquidity on companies, and progress with actual products being driven by machine learning systems, and there are a lot of great examples of this. The fundamental capability, large data sets plus learning techniques in software and statistical models to make predictions and drive businesses forward in a way they couldn't with human knowledge and human capability alone, is really real, and it's here today. So let's not get caught up in this really interesting consumer-market hype cycle and miss that these tools are being validated and generating real value across many other verticals and segments.
Jason: When you look at this Microsoft deal and you see something structured like this, what does it signal to you as a capital allocator and company builder?
Chamath: I would categorize deals into two buckets: one is easy and straightforward, and the other is more complex. This one clearly falls into the latter category, but that doesn't mean it won't work. In our group chat with the rest of the guys, someone said there's a lot of complex law when transitioning from a nonprofit to a for-profit, and there's a lot of complexity in the deal construction. The original investors have certain expectations that they want to see met.
Chamath: There may or may not be legal issues at play here, which you encapsulated in the last episode. There's a lot of stuff we don't know, so it's important to give those folks the benefit of the doubt. If you're asking me, it's in the "too hard" bucket for me to take seriously. That being said, I haven't seen the deal, so I can't comment. Here's what I will say: the first part of what Sacks said is really important for entrepreneurs to internalize, which is "Where can we make money?" I think Google will open-source their models, because the most important thing they can do is reinforce the value of search, and the best way to do that is to make the models widely available and as free as possible. That will cause Microsoft to have to catch up, and Facebook to really look in the mirror and decide whether they're going to cap the bet they've made on AR/VR and reallocate very aggressively to AI. I mentioned this on the Lex Fridman podcast, but that is what Facebook should do. The reason is, if Facebook, Google, and Microsoft have roughly the same capability and the same models, there's an element of machine learning called reinforcement learning from human feedback. These RLHF pipelines are what will make your stuff unique. So if you're a startup, you can build a reinforcement learning pipeline, build a product that captures a bunch of usage, and that data set is unique to you as a company. You can feed that into these models, get back better answers, and make money from it. Facebook has an enormous amount of reinforcement learning inside of Facebook: every click, comment, like, and share. Twitter has that data set, Google has it inside of Gmail and search, and Microsoft has it inside of Minecraft and Hotmail. So my point is, David's right: the huge companies will create the substrates, and they'll be forced to scorch the earth and give it away for free.
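To make the idea concrete, here is a minimal, hypothetical sketch of what "capture your own usage as a reinforcement-learning dataset" can look like in practice; nothing here is an actual OpenAI or Facebook pipeline, and every file, class, and function name is invented for illustration. The product logs prompts, answers, and user ratings, then turns them into the preference pairs that RLHF reward models are typically trained on.

```python
# Hypothetical sketch: a product logs its own usage (prompt, answer, rating)
# and converts it into preference pairs for RLHF-style training.
# All names are illustrative; a real pipeline would feed the pairs into a
# reward model and a fine-tuning job rather than a JSONL file.

import json
import time
from dataclasses import dataclass, asdict

@dataclass
class FeedbackRecord:
    prompt: str        # what the user asked
    answer: str        # what the base model returned
    rating: int        # 1-5 rating collected in the product UI
    timestamp: float

def log_feedback(path: str, prompt: str, answer: str, rating: int) -> None:
    """Append one interaction to the app's proprietary feedback dataset."""
    record = FeedbackRecord(prompt, answer, rating, time.time())
    with open(path, "a") as f:
        f.write(json.dumps(asdict(record)) + "\n")

def to_preference_pairs(path: str):
    """Group records by prompt and emit (prompt, better_answer, worse_answer)
    tuples, the raw material a reward model is usually trained on."""
    by_prompt: dict[str, list[FeedbackRecord]] = {}
    with open(path) as f:
        for line in f:
            r = FeedbackRecord(**json.loads(line))
            by_prompt.setdefault(r.prompt, []).append(r)
    pairs = []
    for prompt, records in by_prompt.items():
        records.sort(key=lambda r: r.rating, reverse=True)
        if len(records) >= 2 and records[0].rating > records[-1].rating:
            pairs.append((prompt, records[0].answer, records[-1].answer))
    return pairs

if __name__ == "__main__":
    log_feedback("feedback.jsonl", "Best duck in Yountville?", "Bouchon.", 5)
    log_feedback("feedback.jsonl", "Best duck in Yountville?", "Try a diner.", 2)
    print(to_preference_pairs("feedback.jsonl"))
```

The proprietary part, the logged usage itself, is exactly the edge being described: the base model may be a commodity, but nobody else has your users' feedback.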
Chamath: And then on top of that is where you can make money, and I would just encourage entrepreneurs to think, "Where is my edge in creating a data set that I can use for reinforcement learning that I think is interesting?" That's kind of saying, "I buy the ingredients from the supermarket, but then I can still construct a dish that's unique. The salt is there, the pepper is there, but how I use that will determine whether you like the thing or not." And I think that, you know, that is the way that I think we need to start thinking about it.
Jason: Interestingly, as we've all pointed out here, when OpenAI was started as a nonprofit, the stated philosophy was, "This technology is too powerful for any company to own. Therefore, we're going to make it open source." And then, somewhere in the last couple of years, they said, "Well, you know what? Actually, it's too powerful for it to be out there in the public. We need to make this a private company and we need to get $10 billion dollars from Microsoft." That is the disconnect I am trying to understand.
Friedberg: That's the most interesting part of the story, Jason. If you go back to 2014, that's when Google bought DeepMind, and immediately everyone started reacting to a company as powerful as Google having a toolkit and a team as powerful as DeepMind inside it, saying that sort of power shouldn't sit in any one company's hands. I heard people close to the organization comment that they thought Google's control of DeepMind was the scariest thing, the biggest threat to humanity. And that was a naive point of view, but it was one that was deeply held by a lot of people. So Reid Hoffman, Peter Thiel, Elon Musk, a lot of these guys funded the original OpenAI in 2015. Here's the link, I'm putting it out here; you guys can pull up the original blog post.
Friedberg: So, what happened was that it all started in a non-profit, and then the non-profit ended up owning stock in a commercial business. But your point is interesting because, at the beginning, the idea was that instead of having Google own all of this, they would make it all available. Here's the statement from the original blog post in 2015: "OpenAI is a non-profit AI research company. Our goal is to advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return. Since our research is free from financial obligations, we can better focus on a positive human impact." And then they went on about Sam, Greg, Elon, Reid, Jessica, Peter Thiel and others donating to support OpenAI, with donations and commitments of over a billion dollars, although they expected to spend only a tiny fraction of that in the next few years. That's a really interesting historical perspective on how this thing started seven years ago and how quickly it's evolved, as you point out, into the necessity of a real commercial alignment to drive it forward, without any of these models being open-sourced. During that same period, we've seen Google share AlphaFold and a number of predictive models and toolkits, make them publicly available, and put them in Google's cloud. So there's both tooling and models and outputs of those models that Google has open-sourced and made freely available. Meanwhile, OpenAI has kind of diverged into this profit-seeking enterprise model. And when you invested in OpenAI, in the round they did before this one, you could generate a financial return capped at 100x, which is still a pretty amazing financial return: you put a billion dollars in, you can make $100 billion. That's funding a real commercial endeavor at that point.
Friedberg: Well, that is the most striking question about this whole thing, about what's going on in AI, and it's one that Elon has talked about publicly and others have taken sides on. AI offers a glimpse into one of the biggest and most existential threats to humanity, and the question we're all going to be tackling, and the battle that's going to be happening politically, in regulation, and perhaps even between nations in the years to come is: who owns the AI, who owns the models, what can they do with it, and what are we legally going to be allowed to do with it? This is a really important part of the story, yeah.
Jason: To build on what you were saying, let me just mention PyTorch, which people may not know: P.Y.T.O.R.C.H. That's another framework. It was largely built inside of Facebook, and then Facebook said, "Hey, we want to democratize machine learning," so they made it open source and put a bunch of executives to work on the project. They have a huge stake in this. Then TensorFlow, which you don't have an investment in, came out of Google, and you invested in another company.
Chamath: I don't have an investment in TensorFlow. We were building silicon for machine learning.
Jason: Right, but it's based on TensorFlow.
Chamath: No, no, no. The founder of this company was the creator, not of TensorFlow, but of the TPU, which was Google's internal silicon that they built to accelerate TensorFlow. Does that make sense?
Jason: And so, that's the thing. I don't mean to be cynical about the whole project or anything; it's just the confounding part of what is happening here. It reminds me, I don't know if you remember this...
Chamath: The biggest opportunity here is for Facebook. I mean, they need to get into this conversation ASAP. To think that PyTorch was a pretty seminal piece of technology that a lot of folks in AI and machine learning were using for a long time, and TensorFlow before that, and what's so funny about Google and Facebook is that they're not really making that much progress. I mean, Facebook released this kind of random version of AlphaFold recently, and it's not that good. I think these companies really need to get these products out in the wild as soon as possible. It cannot be the case that you have to email people and get on some list. I mean, this is Google and Facebook, guys. Come on, get it out there.
Jason: This is, I think, the big innovation of OpenAI, to bring you into the conversation: they actually made an interface and let the public play with it, to the tune of $3 million a day in cloud credits or costs. Which, by the way, just on that, my son was telling me, he's like, "Hey, Dad, do you want me to tell you when the best time to use GPT-3 is?" I'm like, "Huh?" He's like, "Yeah, my friends and I have been using it so much, we know when we can actually get resources." It's such an interesting thing that a 13-year-old kid knows when it's so compute-constrained that it's unusable and when to come back and use it later.
Sacks: It's been a while since we've seen something like this; I don't know, maybe the iPhone or something. Yeah, look, it's powerful. There's no question it's powerful. I'm of two minds about it, because whenever something is in the hype cycle, I just reflexively want to be skeptical of it. But on the other hand, we have made a few investments in this area, and I think it is powerful and it's going to be an enabler of some really cool things to come. There's no question about it.
Jason: I have two pieces of more insider information. One, I have a ChatGPT iOS app on my phone; one of the nice folks at OpenAI included me in the TestFlight, and it's the simplest interface you've ever seen. You basically type in your question, it keeps your history, and then you can search your history. So it looks sort of like you're in iMessage, basically, with your threads. And so I asked it, "Hey, what are the best restaurants in Yountville?", a town near Napa, and then I said, "Which one has the best duck?" And it literally gave me a great answer. And then I thought, "Wait a second, why isn't this using a Siri- or Alexa-like interface?" Why isn't it? Oh, here's a video of it; I gave the video to Nick.
Chamath: By the way, Jason, this is what you're doing right now: you're creating a human-feedback reinforcement learning pipeline for ChatGPT. Just the fact that you asked that question, and, you know, over time, if ChatGPT has access to your GPS information and knows that you went to restaurant A versus B, it may actually prompt you and ask, "Hey, Jason, we noticed you were in the area. Did you go to Bottega? If you did, how would you rate it, one through five?" That reinforcement learning now allows it to tell the next person that asks, "What are the top five restaurants?": "Well, you know, over 1,000 people have asked this question; here is actually the best answer," versus a generic ranking of the open web, which is what the first dataset is. That's what's so interesting about this. This is why, if you're a company that already owns the eyeballs, you have to be running to get this stuff out there.
Jason: And then this answer, uh, you know, cited Yelp. This is the first time I've actually seen ChatGPT cite a source, and this is, I think, a major legal breakthrough. It didn't put a link in, but if it's gonna use Yelp's data, I don't know if they have permission from Yelp, but it's quoting Yelp here. It should link to French Laundry, Bottega, and Bouchon. Bouchon actually has the best duck confit, for the record, and I did have that duck. So I asked this afterwards to check it, but in a scenario like this, if I were talking to it, I could also say, "Hey, which one has availability this afternoon or tomorrow for dinner?" and have it make the phone call for me, like Google Assistant does, or any number of tasks. This was an incredibly powerful display in a 1.0 product.
Chamath: I was thinking about what you said last week, and I thought back to the music industry in the era of Napster. What happened was there were a lot of musicians, with Metallica being the most famous, suing Napster, saying, "Hey, listen, you're allowing people to take my content which they would otherwise pay for; there's economic damage that I can measure." That legal argument was meaningful enough that ultimately Napster was shut down. Now, there were other versions of that that folks created, including us at Winamp; we created a headless version of it. But if you translate that problem set here, is there a claim that Yelp can make, in this example, that they're losing money? That is, if you were going through Google or through their app, there's the sponsored-link revenue and the advertising revenue that they would have gotten that they won't get from this. Now, that doesn't mean that ChatGPT can't figure that out, but it's those kinds of problems that are going to be a little thorny these next few years and have to really get figured out.
Sacks: If you were a human reading every review on Yelp about duck, you could write a blog post in which you say many reviewers on Yelp say this is the best duck. So the question is, is GPT held to that same standard, or something different? And is linking to it enough?
Chamath: This is the question that I'm asking. I don't know.
Jason: It should be, and I'll argue it should be because of the four-part test for fair use, which I had to go through because blogging had the same issue. We would write a blog post and mention Walt Mossberg's review of a product and somebody else's, and people would say, "Oh, I don't need to read Walt Mossberg's review, I have a Wall Street Journal subscription." And we would say, "Well, we're doing an original work. We're comparing two or three different reviews and adding something to it. It's not interfering with Walt Mossberg's ability to get subscribers for the Wall Street Journal." The effect on the potential market is one of the four tests. Reading from Stanford's page on fair use: another important fair use factor is whether your use deprives the copyright owner of income or undermines a new or potential market for the copyrighted work, and depriving a copyright owner of income is very likely to trigger a lawsuit, even if you're not competing directly with the original work. We'll put the link to Stanford here. This is the key issue. In this example, I would not use Yelp, I would not open the Yelp app, Yelp would get no commerce, and Yelp would lose out. So ChatGPT and all these services must use citations of where they got the original work, they must link to them, and they must get permission. That's where this is all going to shake out.
Sacks: The permission part I don't believe. I mean, you can't get a big enough data set if you have to get permission in advance, right?
Jason: It's going to be about the large datasets: Yelp, the App Store reviews, Amazon's reviews. There are large corpora of data that you would need. Craigslist, for example, has famously never allowed anybody to scrape it. The amount of data inside Craigslist, as one example, would be extraordinary to build ChatGPT on, but ChatGPT is not allowed to use it. As you brought up robots.txt last week, there's gonna need to be an AI.txt: are you allowed to use my dataset for AI, and under what conditions? And how will I be compensated for it? I'll allow you to use Craigslist, but you have to link to the original post and you have to note that the...
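There is no such standard today, so purely as a hypothetical sketch of the "AI.txt" idea: an illustrative file format modeled on robots.txt, with a few lines of Python to read it. Every directive name below is invented for illustration.

```python
# Hypothetical only: "ai.txt" is the proposed analogue of robots.txt.
# No such standard exists; all directive names below are invented.

HYPOTHETICAL_AI_TXT = """
# ai.txt (illustrative, modeled on robots.txt)
User-agent: *
Allow-training: no

User-agent: example-research-bot
Allow-training: yes
Require-attribution: yes    # must cite and link the original post
Compensation: negotiated    # terms agreed out-of-band
"""

def parse_ai_txt(text: str) -> dict:
    """Parse the hypothetical ai.txt into {user_agent: {directive: value}}."""
    policies, current = {}, None
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()   # drop comments and blank lines
        if not line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key == "user-agent":
            current = value
            policies[current] = {}
        elif current is not None:
            policies[current][key] = value
    return policies

def may_train(policies: dict, agent: str) -> bool:
    """True if the named agent may use the site's data for model training."""
    rules = policies.get(agent, policies.get("*", {}))
    return rules.get("allow-training", "no").lower() == "yes"

if __name__ == "__main__":
    policies = parse_ai_txt(HYPOTHETICAL_AI_TXT)
    print(may_train(policies, "example-research-bot"))  # True
    print(may_train(policies, "some-other-crawler"))    # False
```

The file format is the easy part; the attribution and compensation terms are where the real negotiation would happen.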
Chamath: Another gray area that isn't there today but may emerge is when Section 230 gets rewritten. If they take the protections away from the Facebooks and Googles of the world for being algorithmic publishers, saying an algorithm is equivalent to a publisher, what that essentially says is that an algorithm is kind of doing the work of a human in a certain context. And I wonder whether that's also an angle here. David, you used the example of "I read all these blog posts and I write something," but if an algorithm does it, can you then say, no, actually, there was intent there that's different than if a human were to do it? I don't know. My point is, these are very complicated issues that are going to get sorted out. And I think the problem with the hype cycle is that you're going to have to marry it with an economic model for VCs to really make money, and right now there's just too much betting on the come. So, to the extent you're going to invest, it makes sense to put money into OpenAI, because that's safe; the economic model of how you make money for everybody else is so unclear.
Jason: It's clear, actually. I just signed up for ChatGPT premium. They shared a survey on their Discord server and I filled it out; it was a price-discovery survey, Friedberg: what's the least you would pay, the most you would pay, what would be too cheap a price for ChatGPT Pro, and what would be too high a price? I put in fifty bucks a month as what I would pay. But I was just thinking: imagine ChatGPT allowed you, Friedberg, to have a Slack channel called research, and you could go in there, or any time you're in Slack you do /chat or /chatgpt, and you say, "/chatgpt, tell me what venues are available..." We did this, actually, I did this for venues on Sunday. I said, what are the venues that seat over three thousand people in Vegas? And it just gave us the answer. Okay, well, that was the job of the local event planner; they had that list. Now you can pull that list from a bunch of different sources. I mean, what would you pay for that? A lot.
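As a rough sketch of the "/chatgpt in Slack" idea Jason describes, not an integration that exists today: a minimal slash-command handler that forwards the user's text to a GPT-3 completions endpoint and posts the answer back to the channel. The route name, command name, and model choice are assumptions.

```python
# Hypothetical sketch of a "/chatgpt" Slack slash command.
# Assumes a Slack app configured to POST slash-command payloads to /chatgpt,
# and an OpenAI API key in the OPENAI_API_KEY environment variable.

import os
import requests
from flask import Flask, request, jsonify

app = Flask(__name__)
OPENAI_URL = "https://api.openai.com/v1/completions"

@app.route("/chatgpt", methods=["POST"])
def chatgpt_command():
    # Slack sends slash-command data as form fields; "text" is whatever the
    # user typed after the command, e.g. "/chatgpt venues in Vegas over 3,000 seats"
    question = request.form.get("text", "").strip()
    resp = requests.post(
        OPENAI_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "text-davinci-003",   # a GPT-3 completion model
            "prompt": question,
            "max_tokens": 300,
        },
        timeout=30,
    )
    answer = resp.json()["choices"][0]["text"].strip()
    # "in_channel" makes the reply visible to everyone in the channel
    return jsonify({"response_type": "in_channel", "text": answer})

if __name__ == "__main__":
    app.run(port=3000)
```

In practice a handler like this would acknowledge within Slack's three-second slash-command timeout and post the full answer asynchronously via the payload's response_url, but the shape of the integration is about this simple.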
Friedberg: Well, I think one of the big things that's happening is that all the old business models don't make sense anymore, in a world where the software is no longer just doing what it's done for the last sixty years, which is what's historically defined as information retrieval. You have this kind of hierarchical storage of data, you have some index against it, and then you go and search, pull data out, and present that data back to the customer or the user of the software. That's effectively how data has been utilized in all systems for the past sixty years of computing. Largely what we've done is build an evolution of application layers, software tools to interface with the fetching of that data, the retrieval of that data, and the display of that data. But what these systems are now doing, what AI-type systems or machine learning systems now do, is the synthesis of that data, and the presentation of some synthesis of that data to you, the user, in a way that doesn't necessarily look anything like the original data that was used to make that synthesis. And that's where business models like Yelp's, for example, or like a web crawler that crawls the web and presents web-page directories to you, no longer make sense: the software can now present a synthesis of that data, with a much better signal-to-noise ratio, and basically resolve whatever objective you used to pursue by consuming and interpreting that data yourself, which is how you historically used these systems.
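A toy illustration of the distinction being drawn here, under deliberately simplified assumptions: classic information retrieval returns the stored records that match a query, while a generative system hands those records to a model that synthesizes an answer which may not resemble any single source. The call_llm function below is a stand-in, not a real API.

```python
# Toy contrast between information retrieval (fetch matching records)
# and synthesis (generate new text from those records).
# call_llm is a placeholder for a real model call.

REVIEWS = [
    "Bouchon: the duck confit was the best I've had in Napa Valley.",
    "Bottega: great pasta, service was slow on a Saturday.",
    "French Laundry: unforgettable tasting menu, book months ahead.",
]

def retrieve(query: str, docs: list[str]) -> list[str]:
    """Classic retrieval: return the stored documents that match the query."""
    terms = query.lower().split()
    return [d for d in docs if any(t in d.lower() for t in terms)]

def call_llm(prompt: str) -> str:
    """Stand-in for a generative model call (e.g. a completions API)."""
    return "(model-synthesized answer based on the prompt above)"

def synthesize(query: str, docs: list[str]) -> str:
    """Generative step: hand the retrieved records to a model and return
    a new piece of text rather than the records themselves."""
    context = "\n".join(retrieve(query, docs))
    prompt = f"Using only these reviews:\n{context}\n\nAnswer: {query}"
    return call_llm(prompt)

if __name__ == "__main__":
    print(retrieve("best duck", REVIEWS))    # returns the matching review verbatim
    print(synthesize("best duck", REVIEWS))  # returns new text, not a stored record
```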
Friedberg: And that's where, going back to the question of the hype cycle, I don't think it's just a hype cycle. I think it's about the investment opportunity of fundamentally rewriting all compute tools. If all compute tools can use this capability in their interface and in their modeling, then it very much changes everything. One of the advantages businesses will latch onto, which we've talked about historically, is novelty in their data: being able to build new systems and new models that aren't generally available. In biotech and pharma, for example, having screening results from very expensive experiments, running lots of experiments, and having a lot of data from those experiments gives a company an advantage in drug discovery, and we'll talk about that in a minute, versus everyone using publicly known screening libraries or publicly available protein-modeling libraries and screening against those, where everyone ends up with the same candidates, the same targets, and the same clinical objectives they're trying to resolve from that output. So I think novelty in data is one way that advantage arises; that's where there's an edge. But fundamentally, every business model that depends on the legacy of information retrieval as the core of what computing is used for can and will need to be rewritten.
Jason: On my other podcast, I was having a discussion with Molly about the legal profession. What would the impact be if ChatGPT took every court case, every argument, every document, the filing of a lawsuit, the defending of a lawsuit, public defenders, prosecutors, and somebody ran all of that through it? What could you figure out? And then, just to take recent history, look at Chesa Boudin: you could literally take every case, every argument he made, put it through, compare it against outcomes in another state, and figure out what's actually going on. What impact could this technology have on the legal field? You are a non-practicing attorney with a law degree.
Sacks: I never practiced law other than one summer at a law firm. But yes, I did pass the bar. Yes, I just tried. Yes, of course.
Chamath: I may not have passed the bar, but I know enough to know that you can't. I would be curious about a very common question that an associate at a law firm would get asked, such as summarizing the legal precedents in favor of X, right? I can imagine GPT doing that instantly. Now, I think there are two questions. One is, can you prompt GPT in the right way to get the answer you want? I think Matthew shared a really interesting video showing that people are developing skills around knowing how to ask GPT questions in the right way, which is called prompt engineering. Why? Because GPT is a command-line interface. If you ask GPT a simple question, like what's the best restaurant in Napa, it knows how to answer that. But there are much more complicated questions that you need to know how to prompt in the right way. So it's not clear to me that a command-line interface is the best way of doing that; I can imagine apps developing that create more of a GUI. We're an investor, for example, in Copy AI, which is doing this for copywriters and marketers, helping them write blog posts and emails, so I imagine putting that kind of GUI on top of ChatGPT. They've already been doing this. So I think that's part of it. The other part of it is on the answer side: how accurate is it? In some professions, 90 or 95 or 99% accuracy is okay, but in other professions you need six nines of accuracy, meaning 99.9999%. So for a lawyer going into court, you probably need, I don't know, it depends on... Yeah, exactly. So is 99% accuracy good enough? Is 95% accuracy good enough? I would say for a court case, 95% is probably not good enough, and I'm not sure GPT is even at 95% yet.
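To illustrate what "prompt engineering" means in practice, a hedged example using the legal-research case from the conversation; the prompts are illustrative, and nothing here is specific to GPT or any particular model. The same request is shown as a bare question and as a structured prompt that pins down role, scope, output format, and how to handle uncertainty.

```python
# Illustrative only: the same request as a bare prompt vs. an engineered prompt.
# How much structure helps depends on the model; nothing here is GPT-specific.

NAIVE_PROMPT = "What are the precedents in favor of X?"

ENGINEERED_PROMPT = """You are a legal research assistant.
Task: summarize case-law precedents that support position X.
Scope: U.S. federal appellate decisions only.
Format: a numbered list; for each item give the case name, the year,
and one sentence on why it supports X.
If you are not confident a case exists, say so instead of guessing.
"""

def build_prompt(engineered: bool = True) -> str:
    """Return the prompt a hypothetical GPT call would be sent."""
    return ENGINEERED_PROMPT if engineered else NAIVE_PROMPT

if __name__ == "__main__":
    print(build_prompt(engineered=False))
    print("---")
    print(build_prompt(engineered=True))
```

The structured version is the kind of thing a "prompt engineer" iterates on; the accuracy question raised above is separate and doesn't go away with better prompting.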
Sacks: But could it be helpful? Could the associates start with ChatGPT to get an answer and then validate it? Probably. Yeah.
Chamath: If you had a bunch of associates working on some law model for a year, that's the reinforcement learning we just talked about. I think you'd get precision and recall off the charts and it would be perfect. Just a cute thing: I don't know if you guys got this email that came about an hour ago from Reid Hoffman. Reid said he created a Fireside Chat Bot, a special podcast miniseries where he has a set of conversations with ChatGPT. You can go to YouTube and see Reid having a conversation with it. It has an AI-generated voice powered by the text-to-speech platform Play.ht.
Jason: I mean, we have a conversation with the two Davids every week. What's the difference? We know how this is gonna turn out.
Sacks: But actually, synthesizing Chamath's point about reinforcement learning with something you said, J-Cal, in our chat, which I actually thought was pretty smart.
Jason: Well, that's the first thing.
Sacks: Yeah, so I'm gonna give you credit here because I don't think you've said it on this episode. As you said, these AI capabilities are eventually going to become commoditized or certainly much more widely available. I don't know if that means that they'll be totally commoditized or if there will be four players, but there'll be multiple players that offer them. And you said the real advantage will come from applications that are able to get ahold of proprietary data sets, and then use the proprietary datasets to generate insights. And then, layering on what you said about reinforcement learning, if you can be the first out there in a given vertical with a proprietary data set, and then you get the advantage of reinforcement learning, that would be the way to create, I think, a sustainable business.
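To make the proprietary-data-plus-reinforcement-learning idea concrete, here is a rough sketch of the feedback-collection step that RLHF-style training builds on: domain experts compare pairs of model answers, and their preferences are logged for a later reward model. Everything here (field names, file format, the example prompts) is an illustrative assumption, not a description of how any actual system is trained.

```python
# Illustrative sketch of expert-feedback collection, the raw material for
# RLHF-style fine-tuning (a reward model is trained on these preferences,
# then used to optimize the base model). Names and formats are assumptions.

import json
from dataclasses import dataclass, asdict

@dataclass
class PreferenceExample:
    prompt: str      # the question shown to the model
    answer_a: str    # first candidate answer
    answer_b: str    # second candidate answer
    preferred: str   # "a" or "b", as judged by the reviewing expert

def record_preference(example: PreferenceExample, path: str = "preferences.jsonl") -> None:
    """Append one expert judgment to a JSONL file for later reward-model training."""
    with open(path, "a") as f:
        f.write(json.dumps(asdict(example)) + "\n")

# An associate reviews two drafts and records which one is better:
record_preference(PreferenceExample(
    prompt="Summarize Delaware precedents on trade-secret misappropriation.",
    answer_a="Draft citing three on-point cases with accurate holdings...",
    answer_b="Draft citing two cases, one of them misquoted...",
    preferred="a",
))
```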
Chamath: Just to build on what you said, this week is the J.P. Morgan healthcare conference. Friedberg mentioned it last week. At dinner on Wednesday I met this really interesting company based in Zurich, and what they have is basically a library of ligands, right? These ligands are used as a substrate to deliver all kinds of molecules inside the body. What's interesting is that they have a portfolio of like a thousand of these, but really what they have is all the nuclear medicine data about whether it works. So, you know, they target glioblastoma, and all of a sudden they can say, "Well, this ligand can actually cross the blood-brain barrier and get to the brain," and they have an entire dataset of that and a whole bunch of nuclear imagery around it. They have something for soft cell carcinoma, so then they have that dataset. To your point, that's really valuable, because that's real work that Google or Microsoft or OpenAI won't do, and if you have that and you bring it to the problem, you can probably make money. There's a business there to be built.
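A rough sketch of how a proprietary dataset like that becomes the moat: turning in-house results into supervised training records that nobody else can produce. The column names and the prompt/completion JSONL layout below are generic assumptions for illustration, not any vendor's actual fine-tuning schema.

```python
# Hypothetical sketch: converting a proprietary results table (e.g. in-house
# screening data) into supervised fine-tuning records. Column names and the
# prompt/completion JSONL layout are assumptions for illustration only.

import csv
import json

def build_finetune_file(csv_path: str, out_path: str = "train.jsonl") -> int:
    """Write one prompt/completion pair per proprietary data row; return the count."""
    count = 0
    with open(csv_path, newline="") as src, open(out_path, "w") as dst:
        for row in csv.DictReader(src):
            record = {
                "prompt": f"Compound {row['compound_id']}: predict the assay outcome.",
                "completion": row["assay_result"],
            }
            dst.write(json.dumps(record) + "\n")
            count += 1
    return count

# e.g. build_finetune_file("screening_results.csv") -> number of training examples
```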
Jason: Just building on this conversation, I just realized that "prompt engineer" is going to become a title and an actual skill: the ability to interface with these AI engines. A prompt engineer is somebody who is very good at talking to these instances, maximizing and refining the results they get. Just like a detective who asks great questions, that person is going to be 10 or 20 times more valuable; they could be the proverbial 10x engineer of the future. As we talk about austerity and doing more with less, with Twitter now running on 80% fewer people, Amazon laying off 18,000 people, Salesforce laying off 8,000, Facebook laying off 10,000 and probably another 10,000, what catalytic effect could this have? We could be sitting here in three, four, or five years, and instead of running a company like Twitter with 80% fewer people, maybe you could run it with 98% fewer people.
Friedberg: Look, I think directionally it's the right statement. I've made the point a number of times that I think we've moved from the idea of a creator economy to a narrator economy. Historically it was a labor economy, where humans used their physical labor to do things; then we were knowledge workers, using our brains to make things; and ultimately we end up in this narrator economy, where the better you can state intention and manipulate the tools to drive your intended outcome, the more successful you're gonna be. You can think about it through the artists of the past. What made Da Vinci so good was that he was technically incredible at reproducing photographic-like imagery using paint, and there are these really great museum exhibits on how he did it using these interesting split-mirror systems. Then the better artist of the 21st century is the best user of Adobe Photoshop, and that person is not necessarily the best painter. And the artist of the 22nd century isn't going to look like the Photoshop expert, and it's not going to look like the painter; it's going to look like something entirely different. It could be whoever has the most creative imagination in driving the software to new outcomes. I think the same analogy applies across every market and every industry. One thing to note, though, Jake: it's not about austerity. The Luddite argument is that when you have new tools and you get more leverage from those tools, there's less work for people to do and therefore everyone suffers. The reality is that new work and new opportunities emerge, we level up as a species, and when we level up, we fill the gaps and expand our productivity and our capability set.
Chamath: I thought what Jake was saying was more that Google will be smaller, but that didn't mean that the pie wouldn't grow. It's just that that individual company is run differently, but there will be hundreds or thousands more companies or millions more.
Jason: More, exactly. I actually have a punch-up for you: instead of the narrator economy, it's the conductor economy. It's like you're conducting a symphony. But I do think there's going to be somebody sitting there, like Tom Cruise in Minority Report, the detective moving stuff around the interface with the gloves and everything; this is kind of that, manifested. Even if you're not an attorney, you could say, "Hey, I want to sue this company for copyright infringement. Give me my best arguments," and then on the other side say, "Hey, I want to know the next three features I should put into my product. Can you examine who my top 20 competitors are, who they hired in the last six months, and what those people are talking about on Twitter?" You could have this conductor who becomes really good at that.
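As a rough sketch of that "conductor" workflow, here is what chaining a few model calls might look like, with each answer feeding the next question. `call_model` is again a hypothetical stand-in for a completion API, and the questions simply mirror the example above.

```python
# Hypothetical sketch of a "conductor" chaining model calls: list competitors,
# then summarize their recent hires, then propose features. call_model() is a
# placeholder for a real LLM completion API.

def call_model(prompt: str) -> str:
    raise NotImplementedError("stand-in for a real LLM completion API")

def conduct_competitive_analysis(product: str) -> str:
    competitors = call_model(
        f"List the top 20 competitors to {product}, one per line."
    )
    hires = call_model(
        "For each company below, summarize notable hires from the last six months:\n"
        + competitors
    )
    return call_model(
        "Given these competitors and their recent hires, suggest the next three "
        f"features {product} should build, each with a one-line justification:\n"
        + hires
    )

# Usage: conduct_competitive_analysis("my scheduling app")
```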
Friedberg: Yeah, the leveling up that happens in the book Ender's Game is a good example of this, where the guy works his way up from the ground level and ultimately he's commanding armies of spaceships in space, and his orchestration of all of those armies is actually the skill set that wins the war. Yeah.
Chamath: People predicted that there would be like all these people that create these next gen forms of content. But I think this Reid Hoffman thing could be pretty cool. Like what if he wins a Grammy for his computer-created podcast mini-series?
Friedberg: The thing I'm really excited about is when's the first AI novel gonna get published by a major publisher? I think it happens this year. When's the first AI symphony gonna get performed by a major symphony orchestra? And when's the first AI-generated screenplay get turned into an AI-generated 3D movie that we all watch? And then the more exciting one I think is when do we all get to make our own AI video game where we instruct the video game platform what world we wanna live in? I don't think that's happening for the next three or four years, but when it does, I think everyone's got these new immersive environments that they can live in.
Chamath: I have a question: when you say live in, you mean video games? To use game theory for a second: these models are iterating rapidly, and these are all mathematical models. So let's say you get to the perfect answer, right? If you had perfect precision and recall, and multiple models get there, then at a system-wide level everybody gets to the game-theory optimum; they're all at Nash equilibrium, right? With all these systems working at the same time, the real question becomes: what the hell do you do then? Because if everybody knows how to ask the exact right question and keeps getting the same answer, and you go through these iterations, maybe there is a dystopian hellscape where there are no jobs. Maybe that's the Elon world, where you can recursively find a logical argument in which no job is possible, right? I'm not saying that path is the likely path, but it is important to keep that range of outcomes in the back of our minds as we figure these things out.
Jason: Well, Friedberg, you were asking before whether more work will be created. Of course: artistic pursuits, podcasting is a job now, being an influencer is a job, yada yada, new things emerge in the world. But here in the United States, I'm looking at FRED, the St. Louis Fed data: in 1970, 26.4% of the country was working in manufacturing. You want to guess what that was by 2012? Ten percent. It was 26% in 1970, and by the time they discontinued the series, around 2015, it was 10%. So it's possible the concept of office work, of knowledge work, is going to follow a pretty similar path to manufacturing. That seems like a pretty logical theory, or no? I think we should move on. So, how would we like to ruin the show now? Should we talk about Biden and the documents and ruin it with political talk, or, since it's been such a great episode so far, what do we want to talk about next? Couple of choices, guys.
Sacks: We all know, we all know, J-Cal, that according to you, when a president is in possession of classified documents in his home that apparently have been taken in an unauthorized manner, basically stolen, he should have his home raided by the FBI.
Jason: Close, almost. Anyway, Biden, as of this taping, has now said there's a third batch of classified documents. I guess there was one at an office, one at a library, and now this third group is in his garage with his Corvette. Certainly not looking good. Uh...
Sacks: In his defense, they say the garage was locked, meaning that you could use a garage door opener to open or close it; it was locked when it was closed. Pretty much as secure as the documents at Mar-a-Lago. Same equivalency.
Sacks: No, no, no. To be perfectly fair, the documents at Mar-a-Lago were locked in the basement. The FBI came and checked it out. They said, "We'd like you to lock those up," so they were locked up, making them a little bit safer, but functionally the same.
Jason: The same. The only difference here would be what stands out when you look at these two cases.
Sacks: Well, in one case, Merrick Garland has appointed an independent counsel to investigate Trump, and there's no such special counsel or investigator appointed to investigate Biden. I mean, these things are functioning, but...
Jason: Somebody's on it, though.
Sacks: I don't think they've appointed a special counsel yet.
Friedberg: No, they did. As of an hour ago, a special counsel was appointed. Okay. Did that just happen? Yeah, one hour ago. Robert Hur.
Sacks: Okay, I guess there are some real questions to look into here. The documents were apparently removed twice. Why were they moved and in what order? What was a classified document doing in Biden's personal library? What do the documents pertain to? They seem to touch on the Biden family's business dealings in Ukraine and China. So, there are some real things to look into here. But let me just take a step back. Now that the last three presidential candidates have been ensnared in these classified document problems—Biden now, then Trump, and Hillary Clinton before Trump—I think it's time to step back and ask: Are we over-classifying documents? I mean, are we fetishizing these documents? Are they all really that sensitive? It seems to me that we have an over-classification problem. Meaning that, ever since FOIA—the Freedom of Information Act—was passed, the government can avoid accountability and prying eyes by simply labeling any document as classified. So, over-classification was the logical response by the permanent government to the Freedom of Information Act. And now it's gotten to the point where just about everything handed to a president or vice president is classified. So, I think I can understand why they're all making this mistake, and I think a compounding problem is that we never declassify anything. There are still all these records from the Kennedy assassination—that's crazy.
Jason: They were supposed to have declassified these documents, but the C.I.A. keeps filibustering on the release of the JFK assassination documents. They've been told they have to stop and release them, but they keep redacting stuff which is making it difficult. I hate to be a conspiracy theorist here, but what are they trying to cover up? I mean, this happened a long time ago.
Sacks: That's the only way to interpret it. But even for more mundane documents, there are very few that need to be classified after, say, five years. You could argue that we should be automatically declassifying them after five years, unless they go through a process to get reclassified. I mean, what do you guys in business do? I know it's not government business. How many of the documents that you deal with are still sensitive or trade secrets five years later?
Jason: Certainly not 20 years later. Like, five years?
Sacks: Yeah, five years. I mean, the only documents in business that I think I deal with that you could call sensitive are the ones that pertain to the company's future plans, right? Because you wouldn't want a competitor to get those. There's a handful of things.
Chamath: Even the cap table isn't that sensitive, because by the time you go public, it has to be public. Yeah, there are like 100 people who have that. So in business, I think our experience has been that there are very few documents that need to remain secret. Now, look, if Biden or Trump, whoever, is reviewing the schematics to the Javelin missile system, or how we make our nuclear bombs or something, obviously that needs to stay secret forever. But I don't believe our politicians are reviewing those kinds of documents, and we don't really understand what it is they are reviewing.
Jason: Why are they keeping it classified for five years?
Jason: Why are they keeping them? Didn't we already agree on that when we discussed it before? I think they're just keeping mementos.
Sacks: I think there's a simple explanation for why they're keeping them, Jason, which is that everything is over-classified and there are a zillion documents, both from Biden and Trump. These documents were mixed in with a bunch of personal effects and mementos. My point is, if you work in government and handle documents, they're all classified.
Jason: If the National Archives asks for them back, or you find them, you should just give them back. I mean, that's what's going to wind up being the right thing to do. Trump didn't give them back? So that's the only difference here.
Sacks: Well, no, no, no. Hold on. The FBI went to Trump's basement. They looked around, they said, "Put a lock on this." They seemed to be okay with it initially, then maybe they changed their minds. I don't know. I'm not defending Trump, but that's the point I'm making: now that Biden, Trump, and Hillary Clinton have all been ensnared in this, is it time to rethink the fact that we're over-classifying so many documents? I mean, just think about the incentives we're creating for our politicians. Number one, never use email. Remember Hillary Clinton and the whole email server? You'd have to be nuts to use email. Number two, never touch a document. Never let anyone hand you a document. If you're a politician or an elected official, the only time you should ever be handling anything is going into a clean room.
Sacks: You know, make an appointment, go in to read something, don't take notes, don't bring a camera, and then leave. I mean, this is no way to run a government. It's crazy. Who does this benefit? Who does this benefit? It doesn't benefit our elected officials. It makes it almost impossible for them to act like normal people. It benefits the insiders. The permanent government.
Jason: You're missing the most important part about this, Sacks. If you want to go into conspiracy, there is this: Biden planted the documents so that we can create the false equivalency and start up Biden versus Trump 2020. This ensures that now Trump has something to fight with Biden about, and this is going to help Trump because...
Chamath: They're both tainted, equally tainted from the same source, puts Trump in the news cycle.
Sacks: No, I think it's the opposite. I think Merrick Garland is now going to have to drop the prosecution against Trump for the stolen documents, or at least that part of what they're investigating him for. They might still be investigating him over January 6th or something. It's going to be stickier with Georgia; yeah, I agree with that. Actually, I think it's gonna be hard to do.
Sacks: But my point is, just think about it: both sides are engaged in hyper-partisanship right now. The conservatives and the right are attacking Biden for the same thing the left was attacking Trump for. My point is, take a step back and think about the incentives we're creating for how to run our government. You can't use email, you can't touch documents, and by the way, if you're a business person, don't ever go into politics because they will investigate every deal you ever did privately prior to getting into politics.
Jason: What are you going to do when you try to get your Treasury position? What's it going to be like?
Sacks: Nuts! You'd have to be nuts to go for it. So the point I'm making is that the Washington insiders, by which I mean the permanent Washington establishment, i.e. the deep state, are creating a system in which they're running things and the elected officials can barely operate like normal functioning humans. That's what's going on.
Chamath: I heard a great rumor. This is total gossip mongering. Oh, here we go. I heard that Ken Griffin's best bet is to get selected as Treasury Secretary. I mean, if he wanted it, Ken Griffin would get it, and then he would be able to divest all of Citadel, call it $30 billion, tax-free, which is a genius way to go out. Then it occurred to me, "Oh my God, that's my and Sacks's path, with a lot less money." But why would it be tax-free? When you get appointed to those senior posts, you're allowed to either stick it in a blind trust or sell with no capital gains. What? Yeah.
Sacks: What? Because they want you to divest. Yes, anything that presents a conflict, they want you to divest. And so the argument is, if you're forced to divest to enter government, you shouldn't be penalized for it.
Jason: If I become mayor of San Francisco or Austin,
Friedberg: Secretary of Transportation, I'm qualified for that.
Jason: I'd take the bus. I got an electric bike.
Chamath: To answer Friedberg's point: for Citadel Securities, there are a lot of folks that would buy that, because that's just the securities trading business. And then Citadel the hedge fund would probably go to something like a big bulge-bracket bank or Blackstone, probably Blackstone, because Blackstone could plug it into a trillion-dollar asset machine. I think there would be buyers lined up out the door.
Jason: This is an incredible gift. Now I know why a cabinet position comes with no cap gains.
Chamath: It's not a gift at all; it's an incredible loss. They force you to sell everything. I think you're misusing the word to continue to genuflect to the left, or you're just not understanding.
Friedberg: That's what I think.
Jason: So you'd take a cabinet position? Where else does that exist?
Friedberg: If you were asked to serve.
Sacks: Look, any normal person who wants to serve in government can't use email, can't touch a document, and every deal they've ever done gets investigated. Why would you want to do it? Because you get to divest tax-free?
Chamath: I think you protest too much, David. The fact is...
Jason: That you know this rule and...
Friedberg: I don't know. It's like people I look up to...
Chamath: To grift means to engage in a petty or small-scale swindle. I don't think selling a $31 billion company to BlackRock or Blackstone would be considered a petty, small-scale swindle.
Friedberg: Did any of you watch the Madoff series on Netflix? No? Oh my goodness, it's so depressing. I have to say, in that Madoff series there's no glimmer of light, hope, positivity, or recourse; everyone is a victim and everyone suffers. It's just so dark. Don't watch it; it's so depressing and so victimizing.
Chamath: One person died of cancer.
Friedberg: It was a hard watch. I didn't realize that the trustee had gone and taken money from people who were 80 years old and retired and had spent that money decades ago. He sued them and took their homes away, and they had no idea they were part of the scam. It was a brutal and awful situation.
Chamath: By the way, that's going to be really interesting as we enter this SBF trial, because that's the track we're on. The Southern District of New York has said this case is becoming too big for them, because of all the places SBF sent money, all those PACs and all those political donations. They have to go and investigate where that money went and see if they can get it back, and it's going to open up an investigation into each one of these campaign finance and election issues, ProPublica and so on. On the other end of the spectrum, I did watch 'Triangle of Sadness' this weekend. Have you guys watched it?
Jason: God, this is great. It's so dark.
Chamath: To the Davids: listen, this is one of the best setups you'll see in a movie. Basically, it's a bunch of people on a luxury yacht. You have a bunch of rich people as the guests, then you have the staff that interacts with them, who are mostly Caucasian, and then down in the bowels of the ship what you see are Asian and Black workers supporting them. So in some ways it's a little bit of a microcosm of the world.
Chamath: And then what happens is there's a shipwreck, right? So the plot is that this Caucasian patriarchy gets flipped upside down, because after the shipwreck, the only person who knows how to make a fire and catch fish is the Filipino woman who was in charge of cleaning the toilets. So she ends up in charge, and now you have an immigrant matriarchy instead.
Jason: That's a great meditation on class and it's really well done.
Chamath: It didn't end the way I thought it would.
Jason: It's hard to wrap that one up. Well, you know what they say: "Steal a little and they throw you in jail; steal a lot and they make you king." That's a famous Bob Dylan line. Alright, well, it's been a great episode. Great to see you, besties. What's on the austerity menu tonight? What are we having, salad and some tuna sandwiches?
Chamath: No, I think Kirsten is doing a dorade. I once had a great dorade in Venice; the best meal I've ever had. So good. I agree.
Chamath: When it's done well, the dorade is incredible.
Sacks: There's only one way to cook a fish. Do you know what that is? You gotta do it the way they did in Venice; you gotta cook the whole fish. Okay, and then after you cook the fish, then you debone it, and that's the way to do it.
Jason: That was back when Sacks and I used to enjoy each other's company. Podcasts have made us into mortal enemies.
Sacks: J-Cal, I'm a little disappointed you couldn't agree with my take on this document scandal instead of dunking in a partisan way. I tried to explain why it's a problem with our whole political system.
Jason: I like your theory. I think Biden is a grifter; I told you these guys are grifting. I just think your party is a little bit more into it, but, you know, compare the grifts.
Chamath: Are we going to play on Saturday after the Wild Card game? Are you guys interested in playing on Saturday as well? Because I have the hall pass. I can arrange a game on Saturday.
Jason: I'm not sure. I'll need to check with my boss.
Friedberg: What about Sacks? Are you guys all going? Are you gonna come play poker at that livestream thing in L.A. for the day? I doubt it.
Jason: He doesn't want to interact with humans.
Chamath: That doesn't play well in confirmation hearings.
Sacks: He destroyed me on camera, and every time he bluffed I folded. Every time he had the nuts I called. It was brutal. It was a classic Sacks shellacking.
Jason: That's what's going on here. No, no, no.
Friedberg: It has to do with the cabinet positions. He doesn't need to be seen recklessly gambling.
Friedberg: You could take any cabinet position if you wanted. Which one would it be? State? Treasury? That's a lot of travel; you'd never be home, you'd always be on a plane.
Sacks: I don't think those cabinet positions are that important. I mean, they run these giant bureaucracies that are permanent - you can't fire anyone. So if you can't fire a person, do they really report to you?
Friedberg: The idea was to put a bunch of hardline CEO-type people in charge and have them blow up these things and make them more efficient. It didn't really work, did it?
Sacks: Yeah. Well, you know why a CEO is actually in charge? Like Elon Musk - he walks in and if he doesn't like what you're doing, he can just fire you. You can't fire anyone in government. How do you manage them when they don't have to listen to anything you say? That's our whole government right now - our cabinet heads are figureheads for these departments, these giant bureaucracies.
Chamath: Is that a no, or is that a yes, you'd still take State?
Jason: I think I'd want an ambassadorship first. What is the best ambassadorship, London or France?
Sacks: Historically, you can tell which ambassadorship is the best one based on how much they charge for it. Yeah, I think London is the most expensive. I think that's 10 to 15 million.
Chamath: 10 to 15 million.
Jason: Ten to fifteen million? That's what Sacks's fourth least expensive home cost.
Friedberg: No, no, you have to spend that every year to run it, Jason. You could be the ambassador to Guinea or the ambassador to the U.K.; you get the same budget.
Sacks: Actually, what's kind of funny is I know two people who served as ambassadors under Trump, and it was really cheap to get those, because no one wanted to be part of the Trump administration. They were on fire sale, because people wanted to stay untainted by Trump.
Chamath: By the way, one of them, and I won't name names, was telling me it was the best thing, because he ended up selling everything at the all-time high to take the job. He was like, "I gotta get out of all of this stuff."
Sacks: No, but listen, let me tell you about the ambassadorships. It was a smart trade by those guys, because an ambassador is a lifetime title. So, no one remembers who was president when you were ambassador; no one cares.
Jason: So you're going for the ambassador?
Chamath: I think it's fair to say that I'm not interested in ceremonial things; I'm more interested in making an impact, and the problem with all these positions is that being a cabinet official is not much different than being an ambassador.
Chamath: So you're going to enlist in the Navy?
Jason: You know what has a bigger impact overall, being an ambassador for All in Pod or being the ambassador of Sweden? Actually, All in Pod is more impactful.
Friedberg: By the way, this is why I take issue with your statement about the term "mainstream media" because I think you have become more of the mainstream media than most of the folks that are independent media.
Jason: Trust me, it's independent.
Chamath: And stop genuflecting. It's independent. Who knows if this thing's gonna last another three episodes? I just like saying the word genuflect.
Jason: Do you like genuflecting? That's the top word of 2023 for you so far?
Jason: Oh, is that someone doing an analysis with ChatGPT? Of the words used here? No, but Sacks brought that word up. It's a wonderful word; it's not used enough.
Jason: Alright everyone, we'll see you next time on the All In Podcast. Comments are turned back on, so have at it, you animals!