
Kara Swisher: A Tech ‘Tough Love’ Story

March 1, 2024
Tech has allowed some very bad people to do some very bad things—including in the democracy arena. Swisher joins Tim for the weekend pod to share her burns of Elon, Trump and the vampires of Silicon Valley. Plus, a capitalist case for government doing more to rein in Big Tech.

show notes:
Kara’s new memoir, “Burn Book”
Tim’s Walter Isaacson interview

This transcript was generated automatically and may contain errors and omissions. Ironically, the transcription service has particular problems with the word “bulwark,” so you may see it mangled as “Bullard,” “Boulart,” or even “bull word.” Enjoy!
  • Speaker 1
    0:00:08

Hello, and welcome to the Bulwark Podcast. I’m delighted to be here with Kara Swisher, host of the On with Kara Swisher podcast and co-host of the Pivot podcast, which are always right around the Bulwark Podcast in the Apple charts, not that I obsess over those.
  • Speaker 2
    0:00:21

    I do. I haven’t loved.
  • Speaker 1
    0:00:22

And she’s the author of three books, including the brand-new memoir Burn Book. Thanks for doing this, Kara.
  • Speaker 2
    0:00:27

    Thank you for having me.
  • Speaker 1
    0:00:28

So if you don’t mind, I wanna start... as I was reading the book, I became obsessed with one question that’s kind of not really a topic of the book, but it’s adjacent. So if you don’t mind, can we do one big-picture question?
  • Speaker 2
    0:00:40

    Nope. Not at all. It’s your podcast. Sure.
  • Speaker 1
    0:00:42

So, you know, because you started this, what? In the nineties? In the nineties, you started reporting
  • Speaker 2
    0:00:46

    in the early nineties. Mhmm.
  • Speaker 1
    0:00:48

So I was thinking about this as I was reading through, and I was going back through the nineties into the mid-aughts, maybe even the late aughts. If you had asked people then if they thought the technological advances to date were, like, good or bad for society, basically everybody would have said good. There were always some Luddites, but basically everybody would have said positive. Today, if I raise that question with my peers, there’s a lot of uncertainty. And so I wonder where you fall on that spectrum sitting here now in twenty twenty-four. Has it been a net positive, this transformation we’ve covered, or a net negative?
  • Speaker 2
    0:01:19

It’s interesting. I don’t think you could even net it out. Right? I think one of the key quotes in the book is the Paul Virilio quote, which is: when you invent the ship, you invent the shipwreck; when you invent electricity, you invent electrocution. And I think that is, you know... is electricity a net positive?
  • Speaker 2
    0:01:33

Are cars a net positive? It is, but maybe not if the planet burns up. Right? Sure.
  • Speaker 2
    0:01:40

We don’t know what we don’t know in the future and how weird things are gonna come out, and we never will, because things change over time. I would say it’s a net positive and has the potential to be a real net positive, but I would say the negatives have far outweighed the positives in some critical areas, like democracy. Right? And the deleterious effects of wealth, the deleterious effects of partisanship, have been boosted and amplified by social media and these technologies. And it’s given some very bad people an ability to be very bad at scale.
  • Speaker 2
    0:02:13

And so that’s been super problematic for any kind of comity. And it doesn’t have to pull us apart, but that’s what these tools have been used for, for the most part.
  • Speaker 1
    0:02:23

Yeah. I mean, I wonder if that’s true, though, that it doesn’t have to. And I guess my challenge with the question is I think about the phones. Right? Because, with the book, I admitted to you in the green room...
  • Speaker 1
    0:02:31

I hopped around. Good. It’s long. But, you know, I’m trying to get through everything. But I had to run, and I had to do the Apple chapter, the various parts where you talk about Apple. And, you know, Cook and Jobs are, on balance...
  • Speaker 1
    0:02:43

I mean, you paint three-dimensional pictures, but on balance, you know, they’re more towards the white-hat side of things.
  • Speaker 2
    0:02:47

    I would agree.
  • Speaker 1
    0:02:48

You know, I really think about this question, and it kind of comes down to: a lot of the negatives that have happened have been downstream of the hardware, the phone question. Right? If you think about the loneliness, the teen suicide, the democracy stuff, the polarization. Right? The fact that we’re getting all of this right now on our handheld device.
  • Speaker 1
    0:03:06

And I just... I do wonder if you look back on that with a little bit of... I don’t know if this is a cigarettes situation.
  • Speaker 2
    0:03:12

I’m not doing a, you know, guns-don’t-kill-people, people-kill-people thing here, but in this case, you know, it’s just a phone. It could be used for a lot of things. It’s a tool. You know, Brad Smith at Microsoft had a book, which I thought was very smart.
  • Speaker 2
    0:03:25

I think it was called Tools and Weapons. It’s either a weapon or a tool, every single one. Nuclear power? Tool. Yes. For sure. And a very promising one.
  • Speaker 2
    0:03:33

Weaponized? Absolutely. You know what I mean? That kind of thing. And I think it’s just how you use these things. In this case, a screen, a TV screen.
  • Speaker 2
    0:03:41

Is that a weapon? Because it’s used to broadcast propaganda by Donald Trump. Yes. But it really is the propaganda. Right?
  • Speaker 2
    0:03:49

Really, it’s what he’s saying. And he would find any media. I mean, as I say, you know, Hitler didn’t need Instagram, did he?
  • Speaker 2
    0:03:54

He had lots of other tools. But if he’d had it, it would have been a very powerful piece of technology for him. And so I tend not to blame the items themselves, even the software itself. What I do blame is when they... for example, Facebook, when they allowed lots of people to come into some of these chat groups, the way they did it, they didn’t limit it. So rage could move very fast.
  • Speaker 2
    0:04:16

I do blame certain social networks for pushing virality over context and accuracy. I blame them for doing it at speed. I blame them for not putting safety measures in place. Once it’s deployed, how they manage it is more what I worry about. And so I think it’s very hard to blame the device itself, except if it’s built in a way that is addictive, which I think some of these things are, some of the software is, or it’s built in a way so you can’t put it down.
  • Speaker 2
    0:04:47

    It’s like cigarettes. It’s addiction.
  • Speaker 1
    0:04:49

    Yeah. I basically agree with that. It’s just so I’m addicted. So I’m on the addict list. So I’m just trying to work through
  • Speaker 3
    0:04:54

    it, but,
  • Speaker 2
    0:04:54

    you know
  • Speaker 1
    0:04:54

    I can
  • Speaker 2
    0:04:54

    see that. I can see it from your activity.
  • Speaker 1
    0:04:56

How about that? I don’t hide it. I don’t hide it at all. And so, you know, when I’m on X, you’ll see somebody put up a viral chart, and they’ll be like, oh my goodness. Loneliness, like, look at how much it’s up. Or happiness.
  • Speaker 1
    0:05:10

Look how much it’s dropped. Or polarization, look how much it’s up. It always feels like it’s, like, twenty eleven, twenty thirteen when that chart starts to go up or down. And it’s like, what was that?
  • Speaker 1
    0:05:20

And it’s like, well, it was about the time when everybody had phones with social media in their hands. Yeah.
  • Speaker 2
    0:05:24

It wasn’t phones as much. It was the social media on top of it and the addictive nature. You know,
  • Speaker 1
    0:05:28

the smartphone element, right? Not
  • Speaker 2
    0:05:29

    like the flip phones.
  • Speaker 1
    0:05:30

Yeah. The social media on your phone, I guess. The combination.
  • Speaker 2
    0:05:33

    Yeah. Exactly. But it’s a combination of all of them. Tristan Harris has become an advocate against this stuff, you know, against the way tech is used. He worked for Google.
  • Speaker 2
    0:05:42

And one of the things, you know, he is absolutely right about is this stuff crawls down your brain stem. Right? It appeals to human nature. There’s a part of human nature that wants to coalesce and be with people, and then there’s a part of human nature that wants to be by itself, you know, sniffing the... whatever, right, your addictive whatever-it-happens-to-be, in many cases a phone. These things are built for addiction, and they don’t have to be.
  • Speaker 2
    0:06:06

Right? For example, if you hit the side button of an iPhone three times, it turns black and white. And it becomes less interesting to people on a visceral level. And you don’t touch it as much if it’s black and white. I know it sounds dumb, but it really does work.
  • Speaker 2
    0:06:19

And they could do a lot of things that don’t make you descend into addiction. Take, like, an Uber app. Are you spending a lot of time on your Uber app? No. You call it.
  • Speaker 2
    0:06:28

    You use it. It goes. That should be on the front page. Facebook should be deep in a folder, so it takes a minute to get to it. But they don’t naturally do that.
  • Speaker 2
    0:06:37

The other thing is when they design these things, it’s very much like a casino, where if you push this button... you wanna push it, don’t you, push this button? And it goes way back to AOL days when I was covering it. It was here in Washington, DC. Britney Spears was a big clicker.
  • Speaker 2
    0:06:52

I mean, people would click on anything about Britney Spears back then. Yeah. And one of the pictures of her on the front page of AOL was fuzzy. And I said, why is that fuzzy?
  • Speaker 2
    0:07:02

    Why is that picture fuzzy? And this guy who ran the the front page said, well, we make it fuzzy so they click in. They’re in. They lean in. They click in.
  • Speaker 2
    0:07:09

You know, it’s the same thing as a casino or whatever else is used. And so
  • Speaker 1
    0:07:13

    Well, now we literally have the casino on our phone too.
  • Speaker 2
    0:07:16

    The sports, you
  • Speaker 1
    0:07:16

know, the sports gambling. So we do have both.
  • Speaker 2
    0:07:18

We don’t have to design it like that. It doesn’t have to be designed so it appeals to addictive qualities. And that is on the tech companies. The way they design the software is to make you not be able to put it down.
  • Speaker 1
    0:07:31

You described yourself in the book kind of as a Cassandra about some of these threats. There have been some critics that have said, like, well, even you were too chummy early on. Can you just kind of talk about that process for you? You’re living amidst this.
  • Speaker 2
    0:07:43

I’m sorry to tell you, it’s largely from men I competed with and beat. So, fine. Alright. We’ll put it there.
  • Speaker 2
    0:07:49

When I started my career at the Washington Post here in Washington, I was a beat reporter. You know what that is. You can’t just go after these assholes. Right? Like, you cannot do that.
  • Speaker 2
    0:07:57

This happened today at Google. This happened today. You know, you were a news reporter. That’s what you do. And, you know, it’s like when they accuse people who cover Trump of that. I’m like, they’re beat reporters.
  • Speaker 2
    0:08:07

    What do you want? You want them to go in with like a hammer at him? Like, I’m not sure what
  • Speaker 1
    0:08:11

We need to learn. We need to know. I’m always a Maggie Haberman defender on this. There need to be Maggie Habermans, and people shout about how terrible what she reports is.
  • Speaker 2
    0:08:19

She’s just not. It’s just not true. But I see why they do it: because they hate him, and they want her to do something about it because she’s near him. Fine. She’s a beat reporter.
  • Speaker 2
    0:08:28

I’m sorry, kids. That’s what she does. She tells you what they’re doing. Every now and then the Times makes a mistake, but usually they do a pretty good job covering it in general. I know they’re mad at the age thing, but whatever.
  • Speaker 2
    0:08:38

I don’t... that’s because they wanna win. That’s different from anything else. So I was a beat reporter, and then I left to do All Things D. A lot of these things are written by people who were born years after we were covering this stuff. We did very heavy-duty coverage and very critical coverage of Google trying to take over the market. We did extensive coverage about sexual harassment in Silicon Valley around the Ellen Pao trial.
  • Speaker 2
    0:09:02

We were one of the leading groups of people pushing on the disaster that was Uber, including its terrible CEO, Travis Kalanick. This idea that I wasn’t tough until twenty twenty? Hey, why don’t you look at the archives. In the Times in twenty eighteen, my very first column for them, I called them digital arms dealers. That’s nice. Like, give me a break. I’m sorry.
  • Speaker 2
    0:09:23

It’s just not true. And so, you know, chummy? I don’t know. I have to know them. I have to speak to them.
  • Speaker 1
    0:09:31

You don’t seem very chummy to me, by the way. Just so you know.
  • Speaker 2
    0:09:34

I would pick any ten Fortune covers back then over Kara. So I’m not sure. Like, we were known as mean. And what was really interesting in this phenomenon is all these PR people that I had to deal with back then were like, I don’t know who the fuck you were talking to, but she terrified us and was really not very nice to us, like, was tough on us. And yet now the PR people are defending you, I guess. I don’t know.
  • Speaker 2
    0:09:55

But one of the things that drives me crazy about it is that we were among the first to call into question, through these interviews, Mark Zuckerberg and the anti-Semitic stuff. In two thousand ten, I did an interview with Mark in which Walt and I really drilled him on privacy. So much so that he started sweating and had to take off... that was...
  • Speaker 1
    0:10:12

You asked him to take off his hoodie.
  • Speaker 2
    0:10:13

Yeah. Exactly. So in a lot of ways, I’m like, what do you want me to do? I have to speak to them. A lot of it centers around Elon.
  • Speaker 2
    0:10:20

    I literally say in the book, I really loved what he was doing when he was doing space and cars, and he was a little bit of a narcissist. He was a little bit juvenile. I didn’t see this coming. And for some reason, people are like, we knew it was coming. I’m like, where?
  • Speaker 2
    0:10:35

Where did you write it was coming? Nobody did. Nobody saw this dramatic shift. Very few people, maybe one or two in the industry around him.
  • Speaker 2
    0:10:45

Everybody loved this guy, and he was more interesting than everyone else because he was doing significant things. Starlink was amazing. I’m sorry. It just is. And if you say Starlink is amazing, everyone’s like, you love Elon. I’m like, I really don’t.
  • Speaker 2
    0:10:58

You can see, I don’t. But Starlink was amazing. What he did with Tesla pushed forward electric vehicles. I’m sorry to tell you, but it was dead until he pushed it forward.
  • Speaker 2
    0:11:08

It was. Same thing with space. He’s innovated space. And this is a guy who attacks me regularly. I still say, I’m sorry, but you have to be honest about his accomplishments, even though he’s become one of the more dangerous figures in technology.
  • Speaker 2
    0:11:24

    And now he is. So what are we gonna do about that now? So that’s what drives me nuts. I’m sort of like
  • Speaker 1
    0:11:29

Who do you think... one of my questions, when I was asking around, what do I ask Kara? You said he’s one of the most dangerous. Who is the most dangerous right now of our overlords?
  • Speaker 2
    0:11:36

He is. Elon? Yeah. He’s got money and means to sue. He’s been suing all kinds of people. He sued OpenAI today.
  • Speaker 2
    0:11:42

Because he’s, you know, he’s hurt that they kicked him out, I think. But he has some cockamamie reason for it. You know, he sued, in another Roberta Kaplan case, this group that was pointing out the hate on X. He’s trying to quash their free speech is what he’s doing. You know, he’s got his mitts all over space, and he can decide things that our government should be deciding.
  • Speaker 2
    0:12:01

He’s got his mitts in Ukraine. He really is ill-equipped to do so. Our space program depends on Elon Musk right now. So that’s not good.
  • Speaker 1
    0:12:10

I interviewed Walter about this. I love Walter Isaacson. His point is, right, like, this is a government problem. I heard your interview, and we talked about your interview with him, and there are some very good critiques of his book. Yeah.
  • Speaker 1
    0:12:21

But he is right on this point about the Starlink thing. This is on our government. How did we get in the situation where this crazy person is responsible for this?
  • Speaker 2
    0:12:29

I would agree. I think that’s correct. It is our government’s fault, but the fact of the matter is, that’s a privatization that’s been going on forever. Right? The privatization of space. Our government, which built the internet, by the way, paid for the internet, created it, and then everybody else made money off of it except our government, really has abrogated this responsibility, and basic research in AI... AI now is being run by private companies.
  • Speaker 2
    0:12:53

That’s why Elon’s suing. He wants to get in, right? He wants to get in on it. And he’s doing his own thing. But right now, AI is dominated by Microsoft. OpenAI is a smaller company, but it’s dominated by all the bigs again.
  • Speaker 2
    0:13:05

    And so this is a critical national security issue and everything else, and our government is sitting on its hands. So, you know. Yeah. I want to, I
  • Speaker 1
    0:13:14

wanna get to the AI thing. Just a couple more things on Elon. Really quick, there’s an NBC story from yesterday, it’s really good, that lists all of the various oversight things Elon’s dealing with right now, from the SEC to, you know, all the various agencies. And I think he’s also lost his mind, but he’s financially motivated, incentivized, to try to help Trump this time because of all the threats facing him.
  • Speaker 2
    0:13:34

    He is.
  • Speaker 1
    0:13:35

I’m curious your take on the psychology of this. He did a tweet yesterday, “I never went to therapy,” on my gravestone. We highly recommend therapy on this podcast. I don’t know if you have any mutuals anymore with Elon, if you can help him get there.
  • Speaker 2
    0:13:48

No. You know what I think it was? I said something publicly, and they said, what can Elon do? I said, seek therapy.
  • Speaker 1
    0:13:54

It might have been a joke, but it’s good advice. Actually, he should seek therapy. You know,
  • Speaker 2
    0:13:57

and I think he probably read that.
  • Speaker 1
    0:13:59

I think the big thing about Elon is: is it related to Twitter? Right? Is there something about the Twitter platform that breaks people’s brains? Because he’s not alone on this. Or was this underlying? And you have this hilarious story in the book
  • Speaker 1
    0:14:14

where, like, you introduce Sulzberger to him. And you knew him personally. It’s like, you know, was this craziness always underneath and something triggered him, or was it something about the app? Like, how do you assess it?
  • Speaker 2
    0:14:26

Well, you know, he’s always been a troubled person, and he doesn’t hide it. Like, if you go back and look at some New York Times stories, he was sort of very emotional around when Tesla was in big trouble. And he’s talked about it. Compared to a lot of people, he talks about his mental health struggles. He has several times.
  • Speaker 2
    0:14:42

He said he’s manic-depressive, I think, at one point. He doesn’t hide his unhappiness, and he never has. And that made him unlike a lot of tech people, because a lot of them feel robotic. And Elon always felt emotional all the time. You know, you could see it. I ran into him at a party once, and I go, how’s it going?
  • Speaker 2
    0:14:58

    He goes, I’m really lonely. And I was like, Oh, okay. TMI, but okay. Nine children. Yeah.
  • Speaker 2
    0:15:03

    I know. I was like, oh, I I didn’t know what to say. I was like, oh, well. Okay. Right.
  • Speaker 2
    0:15:08

Then I’ll go get a drink over here.
  • Speaker 1
    0:15:10

    Maybe if you had hugged him. Maybe if you’d hugged him in that moment, we wouldn’t be here right now, Kara.
  • Speaker 2
    0:15:14

No. Thank you. You know, he wasn’t dating someone at the time, I think, or... it was weird. It was weird. I remember... I felt bad for him.
  • Speaker 2
    0:15:20

And I think he has long mental health struggles. I think, as you saw in the Wall Street Journal, he enjoys self-medicating with a variety of drugs. And I think that story was very important to write because it links to some of the behavior. I think COVID was a real moment for him. You know, a lot of people got radicalized during COVID, the vaccine stuff. For some reason, he got pulled into that whole anti-vax kind of thing, or questioning the vax, and then he got into ivermectin.
  • Speaker 2
    0:15:51

And we had an interview during that period where he just went off the rails, and he had never done that, I have to say, in an interview, for sure, where he threatened to leave the interview because I doubted his intelligence on COVID. And I was like, I just don’t think you know what you’re talking about, and that offended him greatly. And, you know, he didn’t leave the interview, of course, because he’s such a paper tiger in that regard. And so, you know, I think it built. And there was always an element of these dank memes, boob and penis jokes, you know. And I remember thinking when he did it a couple of times, God, this guy is in his forties.
  • Speaker 2
    0:16:25

What is he doing? This is kinda sad. Like, how sad, kind of thing. It was a minor part of it.
  • Speaker 1
    0:16:30

Maybe it’s the prefrontal cortex.
  • Speaker 2
    0:16:32

    Yeah. I was like, oh, whatever. It’s so juvenile, but okay. But I think Twitter did help do that. I think it was a combination of COVID.
  • Speaker 2
    0:16:39

I think, as he got richer... you know, all these people, it happens in politics too, and they’re not even rich. They have people around them licking them up and down all day. They think they own the world. They’re so hypocritical. Like, you saw that Hunter Biden thing with Matt Gaetz where he goes... Yeah.
  • Speaker 2
    0:16:55

You know, did you take drugs? He’s like, you’re not the person to be talking to me about that. But that’s how Matt Gaetz is. He’s, you know... come on, Matt Gaetz. We know you’re a partier.
  • Speaker 2
    0:17:03

It’s ridiculous. And to be so high-handed about drug use. By the way, I don’t mind, whatever. Take whatever drugs you want. But I think he changed, got radicalized.
  • Speaker 1
    0:17:13

    Stay away from needles, kids.
  • Speaker 2
    0:17:14

Needles, needles, kids. Yeah. Stay away from needles, kids. You know what I’m talking about. So he changed.
  • Speaker 2
    0:17:19

He became radicalized. I know it sounds crazy, but the one thing that I remember him getting so upset about in one of the interviews was Biden did not invite him to a car confab, an electric-car confab. He had all the big ones there.
  • Speaker 1
    0:17:32

    Right.
  • Speaker 2
    0:17:33

And I gotta say, he is the pioneer of that. Right? He was the pioneer of that. And he wasn’t invited, and he was so mad not to be invited. And he was, like, a little much.
  • Speaker 2
    0:17:43

“I deserved to be there.” And that’s where he turned on Biden. And I remember calling someone from the Biden administration. I was like, you should have invited him. They’re like... you know, they didn’t because of the union issues.
  • Speaker 2
    0:17:53

That was the problem there, because Tesla isn’t a unionized shop. And he was furious about that. It was fascinating to me. I’m like, what do you care?
  • Speaker 2
    0:18:03

And he was like, I deserve to be there. So to sum it up, I think he’s become more radicalized. I think he’s changed. And because he’s so rich, he thinks he’s untouchable. And who does that remind you of? Who has changed also, by the way?
  • Speaker 2
    0:18:18

    Trump was not this way.
  • Speaker 1
    0:18:20

Are you sure?
  • Speaker 2
    0:18:21

He was a little bit, but it was harmless. It was harmless and silly and performative. When he was on that show, a lot of it was tongue in cheek. You know what I mean? And then he became the character he was playing on TV, and it fed into the way he was.
  • Speaker 2
    0:18:37

And by the way, now that we see all the sexual assault stuff over the years, it’s like, oh, no, he was always like this. But he hid it well, I guess. He hid it well. I see
  • Speaker 1
    0:18:46

a little bit of a different parallel that’s kind of similar, though, when you talk about this rich-guy resentment. That’s hard for me to get. And one thing I was dying to ask you about is the Andreessen manifesto. Marc Andreessen, for those who don’t know, is one of these big venture capitalists, also a brilliant guy, started Netscape, and, you know, he wrote a manifesto about tech optimism. I’m interested in your take, and I just wanna read one bit from it.
  • Speaker 1
    0:19:10

“Our enemy is the ivory tower, the know-it-all credentialed expert worldview, indulging in abstract theories, luxury beliefs, social engineering, disconnected from the real world, delusional, unelected, and unaccountable, playing God with everyone else’s lives with total insulation from the consequences.” I have two questions about this. One, why are the richest people in the world so resentful of people in the supposed ivory tower? And two, why do they not realize he’s talking about himself here?
  • Speaker 2
    0:19:34

    He’s talking about himself.
  • Speaker 1
    0:19:35

    Very confused.
  • Speaker 2
    0:19:36

He’s always been a very troubled person. I don’t know what else to say. He’s a very difficult, complex person. And when I knew him, I used to talk to him almost nightly, which was interesting.
  • Speaker 2
    0:19:47

Really? Yeah. We used to text... no, we’d text, we’d talk about politics or text about, you know, different things.
  • Speaker 2
    0:19:52

He’s very gossipy. He’s a very gossipy personality. He was. I’m sure he still is. That’s about him.
  • Speaker 2
    0:19:58

That is about him. These people in Silicon Valley, it’s a miracle that they can see themselves in mirrors. You know what I mean? It’s a miracle. They’re like vampires.
  • Speaker 2
    0:20:07

    They can’t see themselves. And so Why?
  • Speaker 1
    0:20:10

What is this about? What is the resentment?
  • Speaker 2
    0:20:11

Well, it’s, again, a combination of mental challenges, of extreme wealth, godlike tendencies. They all think they’re in a video game, like they’re in Ready Player One. They think they know better because they know about one thing. They know about... oh, I’m gonna tell you about Ukraine or whatever. By the way, one of the good things about tech is a natural questioning of the status quo.
  • Speaker 2
    0:20:32

    That’s a good thing.
  • Speaker 1
    0:20:33

    Like,
  • Speaker 2
    0:20:33

why are we doing it like this? But instead of “why are we doing it like this,” the thing now is: what they’re doing is bad, and we must kill it. Like, it’s changed from “let’s try a new way” to “let’s kill them because they’re hurting us.” And so they’re contrarian for contrarian’s sake, which is ridiculous, and it’s infected people in the media too. Yeah.
  • Speaker 2
    0:20:55

You know, really badly. Some people, not everybody. Like, there’s a whole bunch. There’s a whole strain of, you know, Matt Taibbi, those people who are, like, lapdogs to Elon Musk, and then he kicked them, which was a surprise. You know, he kicked all of them.
  • Speaker 2
    0:21:08

Yeah. He’s kicked everybody in that Twitter Files thing. He’s kicked them all. It’s fantastic. I knew it would happen.
  • Speaker 1
    0:21:12

    That was kind of satisfying.
  • Speaker 2
    0:21:14

It was sad. It was sad, actually, for them, but you knew where that was going. You know, they really feel like they’re victims. One of the things that I used to get... because I was considered mean, although many men think I’m not tough enough, too bad, or I’m not mean enough, too bad.
  • Speaker 2
    0:21:29

These tech moguls used to call me when I’d write something, and they’re like, you’re mean to me. And I’m like, what are you talking about? Your company collapsed. I said it collapsed. They’re like, yeah, that’s real mean.
  • Speaker 2
    0:21:43

And I was like... again, I would always be like, I’m not your fucking mama. I don’t know what your problem is. We’re not friends. I’m not trying to get you. It’s just facts.
  • Speaker 2
    0:21:52

Your company collapsed. I would get that a lot. You’re not nice to me, or... There was that scene in the book with the Google guys where I called them. I said I was writing a story about them trying to take over search, and this was the early two thousands at some point. Two thousand eight, maybe.
  • Speaker 2
    0:22:07

And I wrote this thing in Dr. Seuss style, “would not, could not have a monopoly,” or something like that. I made it rhyme. I had covered the Microsoft antitrust trial many years before, in the nineties, and I said, at least Microsoft knew they were thugs. These people pretend they’re not. You know, they have their giant colorful balls and their pogo sticks and their soft food, but they’re the same. It’s the same killer.
  • Speaker 2
    0:22:30

    It’s the same killer. So they called me all hurt. They’re like, that really hurt us calling us thugs. And I was like, well, I think you’re thugs. I don’t know what to tell you.
  • Speaker 2
    0:22:39

And they said, we’re not bad people. And, you know, they referenced their “don’t be evil” thing. And I said, you know what, guys? I don’t think you’re evil. I really don’t, actually.
  • Speaker 2
    0:22:51

I said, I’m worried about what you’re building. The next person is gonna be evil, and they’re coming. You know, they’re coming. Evil is coming for this. These tools are so powerful. They’re so pervasive.
  • Speaker 2
    0:23:00

They can amplify really bad things. What you’re building is dangerous. Even if you aren’t bad, the next guy is sure to be bad, and he’s coming. The bad guy’s coming.
  • Speaker 2
    0:23:11

And they never got that. They never understood that. They never understood history or anything else. And that was very troubling to me about these people. And they would always say, you’re mean. And I’m like, I’m not mean.
  • Speaker 2
    0:23:22

    I’m just... I’ll tell you one other example. I wrote a column in the New York Times in two thousand nineteen in which I said, and this was my hypothetical: if Trump loses the election, he’s gonna start saying it was stolen. He’s gonna say it was a lie. He’s gonna perpetrate it up and down the online ecosystem. It’s gonna have resonance because it’s gonna go up and down, up and down, and it’s gonna radicalize people.
  • Speaker 2
    0:23:43

    And then he’s gonna ask people to do something about it in the real world. It’s gonna jump from online into offline, and we are fucked if he does that. Like, this is gonna get violent. Yeah. Because he had already started with violent phrases on Twitter before that. And I said, I think
  • Speaker 1
    0:23:58

    it goes back to twenty sixteen. He was doing it in twenty sixteen.
  • Speaker 2
    0:24:01

    Right. I put this scenario out, right, which has happened. Right? And I said, this is the most likely scenario based on what I’ve seen this guy do. I got calls from every one of those social media sites saying, how dare you say this?
  • Speaker 2
    0:24:16

    This will never happen. I’m like, this will happen. This is exactly where this is headed. And they were mad at me for saying so. And, you know, I said, I think at this moment, you are quickly becoming handmaidens to sedition.
  • Speaker 2
    0:24:30

    That’s what you’re doing.
  • Speaker 1
    0:24:33

    Yeah. Let’s do the Trump thing, because JVL, in the newsletter yesterday, in the Triad, wrote and created JVL’s law, which I really liked, which is relevant to this: any person or institution not explicitly anti-Trump will become a tool for Trump’s authoritarianism eventually. He was talking about courts and Mitch McConnell, but this is true of the tech companies too. And, you know, all these guys, you write about this in the book, about how Trump wins and then they all go to try to work him over, to try to meet with him, to try to be on the inside.
  • Speaker 1
    0:25:01

    And that even includes the white hat guys. Tim Cook is out there, you know, trying to work Trump over, and they’re putting out press releases together about manufacturing or whatever. Mhmm. Talk about how that was happening in real time and what you write about in the book about these guys accommodating Trump and the dangers of that.
  • Speaker 2
    0:25:17

    Well, I hadn’t been a beat reporter for a while, but I got the tip that they were all going, which was a shock to me, because nobody sent an announcement. And you know, these people are so performative. Everything they do requires a press release or a tweet or whatever. But suddenly, it was silent, because they were embarrassed. They had trashed Trump to me off the record a million times.
  • Speaker 2
    0:25:34

    Right? Like, oh, what a clown, what a buffoon. Buffoon was the common word. And, he can’t win. He’s an idiot. We can work with Hillary.
  • Speaker 2
    0:25:42

    You know, that’s what they thought was gonna happen. And some of them were more explicit. Sheryl Sandberg, who was at Facebook, was a big supporter of Hillary Clinton. Meg Whitman had famously shifted. Now, by the way, she didn’t go to the meeting.
  • Speaker 2
    0:25:55

    She said he’s a... Right. He’s a demagogue is what she said. She was a Republican. She was like the only Republican.
  • Speaker 1
    0:25:59

    The Never Trumpers do the right thing. We were the ones. We see it clearly. Meg, Meg is a fellow traveler.
  • Speaker 2
    0:26:04

    Yeah. She was. And for her to shift like that was really quite something to watch. Because she’s conservative. I’m not a conservative, but, you know, she’s a typical Republican. And being a Republican in Silicon Valley, in California, right there she was a unicorn. There were a couple. John Chambers, I think, was one.
  • Speaker 2
    0:26:22

    There were a couple, but not many. And there certainly were no Trumpers. There were no Trumpers. And so when I heard about this, I was literally with my son at a farmer’s market, and I’m like, they’re going where? All of them? What? And then I started to see who was going.
  • Speaker 2
    0:26:35

    And I was like, it’s all of them going. And so I said, surely they’re gonna say something publicly about his comments on immigration, because immigration built Silicon Valley. Surely they can’t go to this meeting without making a statement about immigration. And I got on the phone with all of them, including Elon. He was the one who actually was like, listen.
  • Speaker 2
    0:26:51

    I don’t think he’s gonna do this Muslim ban. I’m gonna stop him, blah, blah, blah, like an I’m-Jesus kind of thing. And I said, you’re not gonna stop him. He said he’s gonna do it. This guy, for all his ridiculous clownishness, I think he’s gonna do it.
  • Speaker 2
    0:27:03

    Like, he said so. He promised his people. This is not a hard thing to do, unlike the wall or whatever, and he said it. And I had counted it up, and I was like, he said it seven hundred and twelve times on the campaign trail. He’s gonna do it. He’s a racist.
  • Speaker 2
    0:27:16

    He’s a longtime racist. This guy has persistently been attacking people of color and different people. So I feel like he’s gonna do it. And it’s just, I don’t know. Anyway, I talked to all of them.
  • Speaker 2
    0:27:27

    They thought he wasn’t gonna do it, and they’re like, we’ll talk to him off the record. And I’m like, no. You’re the powerful people. You’re the ones who should stand up for immigration, because it helped build your industry. And none of them did, and it was really something to see.
  • Speaker 2
    0:27:40

    And then they skulked out, they never made a statement, and Trump used the entire thing as a press release. Trump was smart enough to use it. And he did, a little bit.
  • Speaker 1
    0:27:48

    Multiple times. He used all of them for press releases.
  • Speaker 2
    0:27:50

    They love me. Like, I put them on my council. They’re on my side. The smart guys are on my side.
  • Speaker 1
    0:27:54

    Tim Apple’s bringing the jobs back to America from China, the whole thing.
  • Speaker 2
    0:27:58

    Which he wasn’t. But, okay, you know, he got it wrong in lots of ways. And when he got it wrong, they didn’t correct him either. Right. By the way.
  • Speaker 2
    0:28:07

    Right. Which was fine. I got that. You know, someone from Apple was like, what are we gonna do? Say the president’s an idiot?
  • Speaker 2
    0:28:11

    I said, well, that could be a start. Maybe. Yeah. But they can’t. I got that one.
  • Speaker 2
    0:28:14

    I got that he’s a polite man. He’s not gonna call him out right there. But all of them were happy to call me and insult him, and none of them were happy to do it on the record, which I thought was really nefarious. I just was like, you’re kidding me.
  • Speaker 1
    0:28:27

    Like... Welcome to my life, Kara.
  • Speaker 2
    0:28:28

    Yeah. I know. They wanted their money back. There was all this income, and they wanted the money repatriated. It hadn’t been repatriated, this cash that they wanted.
  • Speaker 2
    0:28:36

    They wanted tax breaks, and they wanted no regulation. And so that’s what they got.
  • Speaker 1
    0:28:41

    I wanna do another area where you were warning people, and how it ties to today’s media stuff. You warned all these companies, Murdoch too. You told Don Graham, in the book, that the internet would sort of wipe out his classified business. He laughed and said, Ouch. Guess he was wrong on that one.
  • Speaker 1
    0:28:56

    You can talk about that if you want, but I’m more curious about where your warnings would be now to these media companies, particularly with regards to AI and how things are gonna get even worse, frankly, or more complicated at least, maybe not worse.
  • Speaker 2
    0:29:10

    When we have these technological upheavals... there was one in farming a long time ago. You know, a third of people used to be farmers. Now it’s so tiny, the population of people who do farming. Same thing with manufacturing: mechanization and robots and things like that.
  • Speaker 2
    0:29:25

    That’s changed those industries. Now it’s coming for the white collar. This AI stuff is for white collar, really. And it’s gonna decimate certain industries, and it’s gonna really change the way we work. And media is one of those places. I don’t think decimation, but I do think we’ve already had the shit kicked out of us in terms of online advertising, which is now dominated by two tech companies: Google and Facebook, or Alphabet and Meta.
  • Speaker 2
    0:29:48

    They have sucked up all the digital advertising for the most part. And then some companies do okay, like the New York Times and some others. So the economic stuffing is knocked out of it to start with. And now these tools will make it so every single company that has information will be able to be much more efficient on costs. And where do you think the costs are? People. That’s where most costs are.
  • Speaker 2
    0:30:08

    And so, you know, one of the lines I have in the book is, anything that can be digitized will be digitized. Now it’s not just digitized, but it’s smart digitization. Like, in media, it’ll do headlines. It’ll do all kinds of things. Now, it’s not gonna write stories or report them.
  • Speaker 2
    0:30:23

    That is not true. But it can collate and collect information in a way that people used to do, so we don’t need people to do that anymore.
  • Speaker 1
    0:30:34

    I worry a little less about the jobs, though I worry about that, than I do about the consumers. You know, your cohost Scott, because he’s kind of an AI optimist, ish, with caveats, you know, smart about it. And so I was pushing back on him on this. The one area where we kind of both are like, yeah, this one’s tough, is... Yeah.
  • Speaker 1
    0:30:50

    I said to him, I sometimes get confused. Not that often, but every once in a while, I get tricked by something online. And, as we just talked about, I’m an addict. I consume more information than anyone. So if I’m getting tricked, what is my aunt gonna do?
  • Speaker 1
    0:31:05

    What’s my... you know what I mean? What are people that didn’t go to college gonna do with AI? And I don’t think anybody’s even trying to come up with an answer to this.
  • Speaker 2
    0:31:13

    Well, I think it’s not gonna affect, you know, blue collar workers as much at all. I mean, some of it will. People are worried about, say, autonomous vehicles. I think there are not enough truck drivers, and I think a lot of truck driving should be automated. It’s a dangerous job.
  • Speaker 2
    0:31:26

    And so it could change that industry in a good way, actually, you could see it. But it’s very hard. One of the problems with tech is that it’s addictive, but it’s also necessary. You can’t do your job in a white collar situation without digitization. You just can’t.
  • Speaker 2
    0:31:41

    And so it’s unavoidable. It’s unavoidable and addictive, and it knocks the stuffing out of the economics of most businesses. That’s really scary.
  • Speaker 1
    0:31:49

    But what about the misinformation side? What about people getting confused, people not knowing what’s real and what’s fake?
  • Speaker 2
    0:31:55

    Well, it started with cable, like with Fox News, which is very effective, but now it’s at scale. Right? Now it’s at scale. So if people are getting all their news from Facebook, what Facebook picks to put in front of them is important. The problem is the people at Facebook don’t care what they put in front of people.
  • Speaker 2
    0:32:11

    You know, Nazis or cat videos, it’s all the same to them, right, kind of thing. And then it also gives you what you want. So if you start down one road, you get to the other road. Right?
  • Speaker 2
    0:32:23

    And so it’s a path of radicalization that happens. It used to be called propaganda, but now it’s propaganda at scale, and it’s do-it-yourself propaganda. They don’t have to put up a poster, you know, in Berlin in the thirties depicting Jewish people as vermin, for example. They don’t have to do that. You know what they can do?
  • Speaker 2
    0:32:43

    They can send an individual message to one person. They know their fears. They know what they like. They know their habits.
  • Speaker 2
    0:32:51

    They can send messaging that is so designed for them that it’s dangerous. It’s designer propaganda is what it is, and very much aimed at individual people. You know, I’ve talked about my mom just being totally, you know, just completely... And that’s just Fox News.
  • Speaker 2
    0:33:09

    During COVID, she’s like, it’s just the flu. That went on for a while. One time, which was incredible, I did an interview with Hillary Clinton, and my mom called me and she goes, that Hillary Clinton, she’s saying this and this about people like me.
  • Speaker 2
    0:33:22

    People like me is their favorite phrase. Right? They’re trying to get us. People like me. And I said, well, that sounds vaguely familiar.
  • Speaker 2
    0:33:28

    And I said, can you just tell me more about it? And she started to repeat it. And it was my interview she was quoting, except it was through the lens of right wing media. Right?
  • Speaker 1
    0:33:37

    Right.
  • Speaker 2
    0:33:37

    Which wasn’t accurate at all. They had twisted it. And I said, mom, that’s not what she said. She’s like, no. That’s what she said.
  • Speaker 2
    0:33:43

    And I go, it’s your daughter’s interview, it was my interview. That’s not what she said. I made her go listen to it, and she did. And she came back. She’s like, okay.
  • Speaker 2
    0:33:52

    That’s not what she said. But she’s still, you know, plotting against our country and really secretly running it. And I was like, yeah, yeah, yeah, yeah, like, it didn’t matter. Yeah. So that’s propaganda.
  • Speaker 2
    0:34:01

    And it’s very good. It’s propaganda on speed is what it is.
  • Speaker 1
    0:34:04

    My hope level for our politicians’ ability to actually regulate this is just basically nil. I know that Marc Andreessen’s worried that he’s being overregulated. Our mutual friend Luther Lowe texted me when I asked him, what should I ask Kara? He said, these guys can’t even, you know, end the self-preferencing that he’s obsessed with. Right?
  • Speaker 1
    0:34:23

    Which is, like, Google putting its own products at the top of Google search. So if these guys in government can’t regulate just the basic stuff about privacy, about self-preferencing, how in God’s name are they gonna handle the AI side of this? Like, what is the optimistic angle on that?
  • Speaker 2
    0:34:44

    There isn’t, because this is so private. Right? That’s the issue. Not just AI, but space is private. Everything that’s important that government used to have a hand in is private.
  • Speaker 2
    0:34:55

    AI is run by big companies. It’s not run by our government. It’s not. Our government doesn’t have a handle on it. This is something our government, because of national security issues, because of all kinds of things, should be deep into, and they just aren’t, in the way they used to be, at least.
  • Speaker 2
    0:35:09

    And so now decisions are being made by big companies. I’m not sure what could happen. And also, there’s a ton of money at stake. Like, Sam Altman is raising seven trillion dollars for a chip factory.
  • Speaker 2
    0:35:21

    Microsoft is a multi-trillion-dollar company. Nvidia, which makes chips, truly a multi-trillion-dollar company. Apple, a multi-trillion-dollar company. You know, they’re all... We’ve
  • Speaker 1
    0:35:31

    got guys making a hundred eighty grand a year in DC, in Congress, trying to, you know, put some bumpers on this, and there’s just no hope.
  • Speaker 2
    0:35:39

    But they haven’t. They’ve had the chance for three decades now, and they haven’t. And, you know, I was at an event last night, and Amy Klobuchar has tried her hardest to get even a basic antitrust bill through, you know. And she got kneecapped by the tech companies, who were spending in districts, and by Democrats, FYI, who just pulled away from her bills because they got kneecapped in, you know, their own districts or whatever. These companies have enormous lobbying organizations now that really can move the needle here.
  • Speaker 2
    0:36:10

    And it’s like Standard Oil got it together before we could break it up. Right? They got it together. And there’s so many of them. There’s so many of
  • Speaker 1
    0:36:19

    This is my free market, my old free market side coming through. I think that the regulatory stuff is more important than the competition side. Right? I don’t know. To me, I always think that the obsession with antitrust is a little overstated.
  • Speaker 1
    0:36:30

    Like, with the exception of Google. Right? Because some of these we see now. I mean, even Google now, with ChatGPT, like, there are other companies that are disrupting. You know, people are like, Facebook’s a monopoly.
  • Speaker 1
    0:36:42

    I’m like, really? They’re one of, like, nineteen social media companies. Alright.
  • Speaker 2
    0:36:45

    I’m gonna push back on that, because I am also a capitalist myself. I’ve built lots of businesses. But in that time, they got to dominate digital advertising. See, in that time, Amazon didn’t have to pay sales taxes when other retailers did. They got to dominate.
  • Speaker 2
    0:37:00

    So they get to dominate and build a great business off our backs, and they don’t pay the cost. Facebook got to dominate and didn’t pay the price for propaganda, antisemitism. It’s like they’re opiate makers, and we’re saying thank you, and you don’t have to pay the price for your damage. You know, pharmaceutical companies don’t get to do that. They definitely break laws, but there’s laws to break.
  • Speaker 2
    0:37:22

    There are zero laws in place. I mean, there should be more than one. Right? There just should be. And there aren’t any.
  • Speaker 2
    0:37:29

    Antitrust is just one of them. I just think we need to update our antitrust laws. They’re a hundred years old now. Companies have shifted, and it should be done smartly. I think we’ve gotta look at the algorithms.
  • Speaker 2
    0:37:39

    What is in those algorithms? How do they make decisions? Let’s talk about safety. These are just basic things that don’t hinder capitalism. Privacy: why are they scraping your content and mine?
  • Speaker 2
    0:37:50

    Like, at least we have copyright laws that are good. Should they be able to just shoplift your shit, Tim? No. I mean, why? And you have to sue them to get it back.
  • Speaker 1
    0:37:59

    I want a penny. Give me that Spotify cash. That’s right. I want point zero zero one pennies for every time they use my tweets.
  • Speaker 2
    0:38:06

    You know, AOL was doing it. At one point, they’re like, we make fifty dollars from every user. I’m like, where’s my vig? Because it’s my information. Paths are made by walking.
  • Speaker 2
    0:38:15

    It’s a very famous quote.
  • Speaker 1
    0:38:16

    There
  • Speaker 2
    0:38:17

    that’s my walking. I wanna be paid for my... You know, they scrape everybody’s information, and then they... We’re
  • Speaker 1
    0:38:23

    down here in the content mines.
  • Speaker 2
    0:38:25

    Right. And then they say, you’re welcome. And they also say, oh, you know, it’s capitalism. I’m like, is this capitalism? No.
  • Speaker 2
    0:38:33

    It’s really the top people getting to use their positions to help them in other ways. I don’t think that’s competition. The way America wins is through innovation at the lower level. And if all of AI is dominated by big companies, do you think there’s gonna be a lot of little companies, which are the lifeblood of this country in terms of innovation? You know, I like these big companies, good for them, but we need little companies growing almost constantly to, one, be innovative, and two, keep up.
  • Speaker 2
    0:39:01

    In terms of things. And that’s what’s gonna get killed here: real capitalism, which is what I am for. So
  • Speaker 1
    0:39:09

    okay. We’re about out of time. Couple rapid fires. I got my happy person, my happy character from the book. I did not know this about you.
  • Speaker 1
    0:39:16

    But you’re kind of responsible for America’s best governor, Jared Polis?
  • Speaker 2
    0:39:19

    I am. Like, I do like him.
  • Speaker 1
    0:39:21

    You told his mother to cash out on their digital greeting card company. One sentence on that.
  • Speaker 2
    0:39:27

    Oh, Jared. He was like, you know that Michael J. Fox show where he was the conservative and his parents were hippies? That was
  • Speaker 1
    0:39:32

    Alex P. Keaton.
  • Speaker 2
    0:39:33

    He was Alex P. Keaton. A gay one, but he was Alex P. Keaton. And so
  • Speaker 1
    0:39:38

    Yes, sir. Maybe this is why...
  • Speaker 2
    0:39:40

    he went so long. Yeah. That was interesting. It was called Blue Mountain Arts. At the time, they were buying traffic, everybody was trying to buy traffic.
  • Speaker 2
    0:39:46

    And so they sold their company. It was a greeting card company. And he was doing candies and flowers. He was so funny. He was a funny little partner.
  • Speaker 2
    0:39:53

    And you
  • Speaker 1
    0:39:54

    got no vig on that either. But you suggested to his mom, maybe it was time to sell.
  • Speaker 2
    0:39:58

    They used to send me cards. I was like, I don’t want your cards, real cards and stuff like that. I’d like cash on the barrel.
  • Speaker 1
    0:40:04

    On the other side of the equation are the worst people in the world: the friends of the actual genius entrepreneurs who get rich off RSUs, like David Sacks. Are those the worst characters in the book?
  • Speaker 2
    0:40:16

    He’s not in the book. He’s just on the back
  • Speaker 3
    0:40:17

    of the book.
  • Speaker 2
    0:40:18

    I avoid them completely. I’m not interested
  • Speaker 1
    0:40:21

    in talking about... That’s probably the best way to
  • Speaker 2
    0:40:22

    do it. I’m not interested in talking about enablers and minions. They don’t interest me. Suck-ups.
  • Speaker 1
    0:40:27

    They only interest me because they’re real, I do think. In some ways, I at least dislike Elon and Thiel and these people, but at least they were innovators. It’s the people that are riding on their backs, the ones that bug me. Okay.
  • Speaker 1
    0:40:39

    This is a request from a listener. You can reject it if you want, and we’re gonna PG it up a little bit. Kiss, marry, disappear: Elon Musk, Mark Zuckerberg, Sam Altman.
  • Speaker 2
    0:40:50

    It’s tough. Disappear, Elon. Go to Mars. Enjoy yourself. Would be great.
  • Speaker 2
    0:40:54

    We’ve had enough of you, and take Bill Ackman with you. I probably would marry Sam Altman, because it would be a beautiful gay marriage together with us. I guess I’d have to kiss Mark.
  • Speaker 1
    0:41:07

    He’s a he’s a nice fella.
  • Speaker 2
    0:41:08

    I, like... I’m not gonna kiss Mark’s diaper.
  • Speaker 1
    0:41:12

    And he’s got his muscles now, though. He’s been
  • Speaker 2
    0:41:14

    working out. He was so skinny. So was Jeff Bezos. He was skinny, skinny. They’re both skinny little things.
  • Speaker 1
    0:41:19

    So, speaking of the gays, I think the most lesbian clause I’ve ever read is in this book: the hardware store is my safe space. Okay. My final request. I asked if you could tell one story that was in the book. I wanna hear the ice sculpture story.
  • Speaker 2
    0:41:31

    Okay. So during this period of craziness, when everything was adorable, my wife worked for Google. Many years after I’d started covering them, she went there. I stopped covering them when she went there. We went to this baby shower for Anne Wojcicki and Sergey Brin. They’ve since divorced, but they were having their baby, and you walked in and there were all these baby photos.
  • Speaker 2
    0:41:48

    And when you walked in, they always have these assistants. They’re full of assistants, these people. And they all have swingy blonde hair. All of them women. They’re all women.
  • Speaker 2
    0:41:56

    And they said, would you
  • Speaker 1
    0:41:57

    like a blanket? Do you have swingy blonde hair?
  • Speaker 2
    0:41:58

    No. As you can see, I do not. It’s just me and my Eeyore back there. They said, would you like a onesie or a diaper? And I was like, what?
  • Speaker 2
    0:42:08

    And so... they love dressing up. These people like forced fun. I used to call it forced joy. So they made you put on a diaper or a onesie and then gave you a sucker and a baby hat and a fake baby bottle to put liquor in. And I said, I’m not putting any of this on.
  • Speaker 2
    0:42:24

    There’s no fucking way, you know, I’m putting any of this stuff on. And a rattle. There was a rattle involved. And so I ran inside before they could make me do it. And they, like, chased me.
  • Speaker 2
    0:42:33

    And I was like, I’m not putting on this shit. And I walked inside, and it was like a dystopian version of people pretending they were babies. There was a bounce house, there was baby food, everything in little baby food jars. All the food was in baby jars. Sergey was in a onesie on roller skates.
  • Speaker 2
    0:42:49

    There were all kinds of bouncy balls. It was a nightmare. And I had toddlers. And so I was like, this is bullshit. Like, this is really weird.
  • Speaker 2
    0:42:58

    You know, I was like, I don’t need this. I have it at home. I don’t need this stuff, and I didn’t like it. They always tried to get you to act like a child, which I hated. They had slides in their offices.
  • Speaker 1
    0:43:05

    There’s a metaphor there, I think.
  • Speaker 2
    0:43:07

    Yeah. Like, we’re childlike. And I was like, you’re childish. That’s for sure.
  • Speaker 2
    0:43:11

    And so there was an ice sculpture there too, which I was riveted to. It was a torso of a woman, and out of the breasts came White Russians. You put your cup up to the boob like it was breastfeeding. It was so ridiculous.
  • Speaker 2
    0:43:27

    And I look over, and right near the breastfeeding is Gavin Newsom, who was mayor of San Francisco at the time, and he’s in one of his fantastic suits. That guy can dress. Right? And he didn’t have a diaper on. And he’s like, how did you get out of it?
  • Speaker 2
    0:43:41

    I said, I ran. I wasn’t gonna lose my dignity. And I said, how did you get out of it? He said, I knew you’d be here, and you would take my picture, and that would be the end of my political career, in a diaper at the behest of billionaires. And we were laughing, because calling to fact-check this with him was so funny.
  • Speaker 2
    0:43:57

    We were laughing so hard. He was like, oh my god.
  • Speaker 1
    0:44:00

    This was not a hallucination, right? It really happened. I just need you to confirm. And then
  • Speaker 2
    0:44:04

    we’d had some of the White Russian, and it was delicious. So I love that. It was everything wrong and right with that period of time. It was so fucking ridiculous, but at the same time, it was kind of sweet. It was weird and sweet and strange, and also, what is wrong with these people?
  • Speaker 1
    0:44:21

    Also therapy-inducing. Yeah. To take this full circle: Kara Swisher, host of the podcasts On with Kara Swisher and Pivot. She’s got a new book, Burn Book.
  • Speaker 1
    0:44:29

    Go get it. Thank you so much for taking
  • Speaker 2
    0:44:31

    the time with us. Just remember, I never wore a diaper, and neither did Gavin Newsom. So vote for him for president. We resisted the diaper. We resisted the diaper.
  • Speaker 2
    0:44:39

    No.
  • Speaker 1
    0:44:39

    It’s a low presidential bar these days, but, you know, we’re gonna take it.
  • Speaker 2
    0:44:42

    That’s where we are right now. Thank you so much, Tim. I love your podcast. I love your work. But I would get off the internet a little bit, if I were you.
  • Speaker 2
    0:44:48

    I have to say you’re very present.
  • Speaker 1
    0:44:49

    Thank you. That’s good advice. I appreciate that. My husband agrees. We’ll talk to you later.
  • Speaker 2
    0:44:54

    Okay. Thanks, Tim.
  • Speaker 3
    0:44:55

    [Outro music plays]
  • Speaker 1
    0:46:30

    The Bulwark Podcast is produced by Katie Cooper with audio engineering and editing by Jason Brown.