How Privacy’s Defender Cindy Cohn Changed the Future of Encryption

Guy Kawasaki:
Good morning. It's Guy Kawasaki. This is the Remarkable People Podcast, and this is gonna be a particularly joyous podcast because my guest is Cindy Cohn and she is the executive director of the EFF, the Electronic Frontier Foundation. And I have to say, she's one of my most favorite tech people.
And, she has a new book out. I'll hold it up, so you recognize it when you're at the store. And basically she's talking about defending privacy. I read the book yesterday for the first time, and I knew her from before because she's been on the podcast before. But I read the book, which is her memoir, and I had no idea.
You're such a badass. I mean, I knew you were a badass Cindy, but like the Department of Justice just shits a brick whenever they see you coming. Right?

Cindy Cohn:
We try to give them a run for their money, you know.

Guy Kawasaki:
So listen, I know you've announced that you're gonna be leaving the EFF midyear and they're probably popping champagne.

Cindy Cohn:
Well, they shouldn't be because I'm leaving EFF because I wanna get a little closer to the fight than you can be as an executive director. So, I'm not done. I'm not done with them because they're not done, right? If nothing needed fixing, then I could go off and do something else.
But, no, I feel like, especially now, a lot of the things that I warn about in the book are really happening at a pretty unprecedented speed and size. So no, part of the reason I'm stepping down is because I miss those days. I wanna get back into the fight at a level that is pretty hard to do when you're the executive director of a 125-person organization. Even one as great as EFF.

Guy Kawasaki:
So have you decided what you're gonna do?

Cindy Cohn:
Not yet. I'm talking to a bunch of folks. There are several options on the table. I'm trying to figure out what the best fit is. And I'll make a decision probably later this spring, but the first job was to make sure that EFF was in good hands, and I'm feeling really comfortable about that.
They're at the very end of their search, and I'm very happy how that's gone. So I wanted to get EFF buttoned up before I really turned in earnest to what I'm gonna do next. But I've had lots of conversations with a lot of people and something fun will come out of it.

Guy Kawasaki:
If I was president, I would appoint you to the Supreme Court.

Cindy Cohn:
Yeah I’m not sure that’s gonna happen with this president, but you never know, you know.

Guy Kawasaki:
If you got appointed, the confirmation vote would be very close.

Cindy Cohn:
It would be hard. One of the things that you do when you're an impact litigator is you know that that's a very different posture than being a politician where you try to make everybody like you. I mean, I certainly try to be nice and kind, but it's not my job to make everybody feel good.

Guy Kawasaki:
Well, if you can run over people while you're smiling, that's a valuable skill. I wanna go first back into your checkered past, and I would just like you to tell us about the hacker culture that you started in. I mean, John Perry Barlow, Mitch Kapor, like what were those days like?
Many people listening to this will probably not even know what we're talking about here.

Cindy Cohn:
EFF was founded in 1990, so we predate the worldwide web. So the internet was really made up of a lot of academics, people who were in government, and a few people who were kind of learning how to play online. And I talk a little bit about one of the early networks called The WELL, which was one of those.
Most of the people involved in this knew that the internet was gonna spread, but they didn't exactly know how. And we started to see that some of the government's policies, created in an earlier time, just weren't working in the digital age. And then in other situations, we kind of felt like the government was taking liberties with the new technology, or at least didn't understand it.
And EFF was founded because of a series of Secret Service raids on people who were engaging in early internet conversations. Think of the kind of early version of Reddit, right? Where there's these open forums where people can talk. It wasn't owned by one company, but that was kind of the experience of a user of some of these newsgroups.
And then the Secret Service would show up and they would seize everything that was plugged into the wall. They nearly bankrupted a little company called Steve Jackson Games, which is a game company. It still exists right now out of Austin, Texas.
And John Perry Barlow, Mitch Kapor, and John Gilmore, and later Steve Wozniak, who was one of the early funders, kind of realized that there needed to be a pretty strong voice on the user side making sure that we really protected people's rights as we moved into this digital age. And they founded the Electronic Frontier Foundation to do just that.
The story that they tell is that, early on, they wanted it to be a fund. It's called Foundation, but it's not actually a foundation in the way most people think about it because they thought they were gonna raise money and hire lawyers to help some of the people who were involved in this.
And they couldn't find any lawyers that knew enough about the internet to really do this. And so we kind of started growing our own. And that's why the organization at this point has a pretty full slate of lawyers and technologists and activists all aimed at this. But it started because, in the early 1990s, there just weren't a lot of lawyers who knew enough about digital technologies to do this.
You know, the hacker culture was deep in the early internet. To be on the internet at that particular point in time, you had to be fairly technical.
You had to be able to navigate your way around systems. And really, a kind of basic hacker mentality is to try to do something, realize there isn't really a tool to do that thing, and then decide, well, you're just gonna have to build it yourself, and then give it back to everybody else.
And that kind of spirit still exists in the internet, but it's a little overshadowed now by this idea that four large companies decide all the things that we get to do.
And if they don't want a feature, you don't get it. And I think that's too many people's experiences of the internet now. And things were better and are still better in pieces of the internet where people really kind of take ownership of their reality and are willing to dig in, build it, and then offer that to others.

Guy Kawasaki:
This is a massively hypothetical question, but do you think like John Gilmore and John Perry Barlow and you and Mitch Kapor, if you were doing this ten, fifteen years later, would you now be the Peter Thiels and the Alex Karps? I mean, what happened to Silicon Valley?

Cindy Cohn:
I mean, it's so hard to know. I don't think so, because I think that where we came from was really about empowering the end users, not about scaling and blitz scaling, you know, gigantic companies. And I think that in some ways the voices that we tried to lift up were very different voices than the Peter Thiels.
And I think there's a philosophical difference. Again, I don't wanna paint them all with one brush. But I think that the idea that it's important to lift up the people who are alive today as opposed to some kind of theoretical future is really where EFF is grounded and where the founders grounded us.
We are a civil liberties organization. We care about who people are today and what they're doing today. And I think some of those Silicon Valley people really kind of convinced themselves that today didn't matter, only some kind of theoretical future people did.
You know, since there are more theoretical future people than people alive today, we ought to use cost-benefit analysis and decide there's more benefit to those future unborn people. And I think there's a bit of dorm-room sophistry at the bottom of all of that. I'd rather help the people who are alive today.
And that is kind of EFF's founding ethos. We're a little closer to the people who built the human rights movement than we are to the kind of Silicon Valley people. I feel a lot closer to the Eleanor Roosevelts than the Peter Thiels in terms of my approach to thinking about these things.

Guy Kawasaki:
Okay. That is a drop the mic moment. You're closer to Eleanor Roosevelt than Peter Thiel. I mean, let me think about that. I mean, that's quite a spectrum there, Cindy.

Cindy Cohn:
Well, look, they're all people who've had a lot of power and had to decide what to do with it. And one of them tried to build international structures so that everybody could have rights and liberties, and others have, let's just say, made some other choices.

Guy Kawasaki:
I don't know about you Cindy, but when I die, I don't want people to say, “Yeah, he made crypto successful and ensured low rates on long-term capital gains.” I wanna do more in my life than that.

Cindy Cohn:
The couching of long-term vision in short-term greed is fairly obvious right now. And I just think it's important to try to be clear-eyed about what's happening. A lot of what I see these days is couched in defending Western civilization.
And I don't think you defend Western civilization by dehumanizing people, by throwing them in detention, by taking away their rights. Western civilization is not about that. But when something gets clothed in it, somehow people think they're talking on a higher philosophical level.
But that's why I prefer the human rights framing over the Western civilization framing. I just think it's kind of phony, and I'm not sure if they're fooling themselves, but they're not fooling many people who have steeped themselves in the real-world work of trying to lift up everybody and give everybody dignity.

Guy Kawasaki:
If somebody believes that preserving Western civilization is done by doing what they're doing now, they are delusional. I mean, if you said, “How can we destroy Western civilization?” You could not do a better job than what they're trying to do.

Cindy Cohn:
I think that's right. I mean, again, I come from a classical educational background. I was an English major. We read all the greats, and I don't think any of them were talking about throwing people in concentration camps and detention centers, or lifting up people based on their skin color as superior intellects.
Like, none of that is what those people were about, and I really think that it's important to look not just at what people are saying, but what they're doing. And when you look at what's happening right now, it's contradictory to why EFF was founded. But it's also contradictory to, I think, a lot of the things that they're claiming they're trying to do.

Guy Kawasaki:
Cindy, I had two conservative listeners and I just lost them after that first fifteen minutes of this.

Cindy Cohn:
And that's so wrong too. You know, EFF, you know, John Perry Barlow and John Gilmore, they're libertarians. Barlow was especially a libertarian. He felt like human consciousness and human beliefs were freer when we were very careful about what power the government had.
That is a legit way to think about human rights and liberties. And I think it's completely consistent with a conservative viewpoint. What's sad for me right now is that what I think of as the classic conservative mindset has been abandoned: what role should government play in people's lives, and how do we best lift up everybody?
What are the risks of government intervention versus the risks of government nonintervention? That's a legit argument to have. Signing up to some guy as your king isn't conservative.

Guy Kawasaki:
Cindy, we're eighteen minutes into this.

Cindy Cohn:
We haven't even talked about the book.

Guy Kawasaki:
And we've still got, you know, thirty years to catch up here, or it's actually thirty-eight years. So, okay. Fast forwarding, relatively speaking for this interview, I want you to explain the case of Bernstein versus the Department of Justice and why encryption and software was a matter of free speech.

Cindy Cohn:
Yeah, this is one of the ways in which the government approach to technology kind of just had to change over time. Encryption is old. Caesar had a cipher. The founders of the U.S., you know, Jefferson was in Paris and Adams was in London. They had to talk to each other in a secure way.
They used encryption, so the government kind of thought of encryption as something around military and governmental affairs. And so encryption was on the U.S. munitions list, next to surface-to-air missiles and tanks, when we started in the nineties. That was understandable in an earlier era.
But one of the things that the founders of EFF knew, and John Gilmore was very central to this, was that we were gonna need to have privacy online. We were gonna need to be able to control who could hear what we were saying to each other if we were gonna have rights online. And so these encryption regulations were in the way.
Bernstein v. Department of Justice was a case that EFF approached me, a young lawyer then, and asked me to take on, to try to get rid of those encryption regulations so all of us could have privacy online. The tool of privacy online is encryption. It's also the tool for security online, it turns out.
And so they asked me if I would take on a case. And the way that these encryption regulations worked was you needed a license before you could export encryption from the U.S. But export means putting it in a place where a foreign person can get it. And of course, everything you do on the internet is amenable to foreign people seeing.
And in fact, that's how the science of cryptography developed in the nineties, is people sharing ideas back and forth online. And so we put together a lawsuit, actually, I mean, they presented it to me already. It was already being developed for a math PhD student named Daniel Bernstein.
He was a Berkeley PhD student who wrote a little encryption program and wanted to publish it online as part of this conversation about the science. And the way the regulations worked, if he did that without a license, he would go to jail as an arms dealer, which is a very, very serious offense.
And when John Gilmore called me and I said, “Well, what does this program do? Does it like blow things up?” And he said, “No, it keeps things secret.” And I said, “Well, you should be able to publish instructions about how to keep things secret online. That's a First Amendment problem.”
And he said, “Well, we think so too. Will you take the case?” So, we did, and the way that we argued this was really based in science. The publication of scientific ideas has always been protected under the First Amendment.
It was very clear even in the early 1990s that some scientific ideas are published via code, that coding is a way to be very precise about how to do things, much like a recipe. We use the analogy of a recipe a lot.
Coding breaks down a problem into little, teeny pieces that are small enough, once you compile it, for a computer to be able to do. So what we said was, “Look, this is how science happens, and the science of cryptography is a real science, and we need to protect it, and the First Amendment needs to protect the publication of this code.”
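The Caesar cipher Cindy mentions earlier is a good illustration of this code-as-recipe idea: a few lines spell out exactly, step by step, how to shift each letter. A minimal sketch in Python (the function name and shift value are illustrative, not anything from the case):

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` positions, wrapping around the alphabet."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation unchanged
    return ''.join(out)

# Encrypting and decrypting are the same recipe with opposite shifts.
secret = caesar("attack at dawn", 3)   # "dwwdfn dw gdzq"
plain = caesar(secret, -3)             # "attack at dawn"
```

The recipe analogy is apt: nothing here "blows things up"; it is simply a precise, publishable description of a procedure, which was the heart of the First Amendment argument.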
And so code is speech, for sure, but really in this context of the ongoing development of science, it was a very, very strong argument, I think, both metaphorically and practically. And we were successful with it. The lower court and the Ninth Circuit both held that the publication of Dan's code was speech for purposes of the First Amendment and that the regulations ran afoul of it.
And then ultimately as we were making our way up through the courts, the government actually backed down. And so we didn't get the clean win that we wanted. As a lawyer, I like a clean win, but we actually got encryption freed, and that's part of the reason why we all have encryption in all of the things we do online now, and it's available to us.

Guy Kawasaki:
And if the Department of Justice had won, what would we be looking at today? What would the internet be like today?

Cindy Cohn:
I think the internet wouldn't be a place where you could have any kind of private conversation, with tools like Signal and WhatsApp that a lot of people are using these days to do everything from communicating with their loved ones to political organizing. You know, the web itself is encrypted now, and that's some work that EFF and our friends at an organization called Let's Encrypt did.
So when you type in your bank's address to do a transaction, you're actually going to your bank. There's encryption that you might not see, but it is on the back end, and it makes the internet secure, because otherwise people can get hijacked really easily. Some people still do get hijacked, but it's much better than it would be.
So whether you're doing transactions or surfing the web, getting access to information, any of those kinds of things, they really wouldn't be possible in anything like a trustworthy way, if we hadn't freed up encryption.

Guy Kawasaki:
So, pardon my ignorance, but is this essentially saying that this is HTTPS? I mean.

Cindy Cohn:
Yep.

Guy Kawasaki:
That's why we're HTTPS as opposed to HTTP?

Cindy Cohn:
Correct. That's what the “S” stands for, I believe: secure. Hypertext Transfer Protocol Secure. The early internet was HTTP, and we have slowly worked on that. EFF had an actual tool that we used for a while called HTTPS Everywhere that would help things present in HTTPS, but now it's built into the browsers.
Most of what people do browsing is HTTPS because we were able to get the security turned on in the protocol and used in most of the internet.
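The difference the "S" makes is visible from code: HTTPS wraps plain HTTP in TLS, and the client insists on verifying the server's certificate before any data flows. A small sketch using only Python's standard library, just to show what the defaults enforce (this inspects the client-side settings; no network connection is made):

```python
import ssl

# The "S" in HTTPS is TLS. Before a browser (or this client) sends a request,
# it checks that the server presents a certificate that (a) matches the
# hostname you typed and (b) is signed by a trusted certificate authority.
# Python's default context enforces both checks out of the box.
context = ssl.create_default_context()

print(context.check_hostname)                     # True: cert must match the hostname
print(context.verify_mode == ssl.CERT_REQUIRED)   # True: a valid cert is mandatory
```

Plain HTTP performs neither check, which is why an attacker on the path could impersonate your bank; with HTTPS, that impersonation fails the certificate check and the connection is refused.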

Guy Kawasaki:
So, in my mind I see young Cindy Cohn, fresh outta law school, in this small, little practice, right? And now she's taking on the Department of Justice. And, you may not have a clean win, but you won. I mean, let's face it, you won, right? So now is this the equivalent of, you went to your first Olympics, and you won a gold medal. I mean, how do you top that for the rest of your career?

Cindy Cohn:
It definitely was an amazing victory, and it was important and lots of people helped. I was able to sit at the front of a fight, but there were other people doing a lot of work on it. But yeah, I do feel like I got my first chance to play in the big leagues and we were able to win.
And the sad part about it is we keep having to defend that win. Right now, the U.K. and Australia are very hostile to encryption. In the U.S., there are periodic efforts to try to break encryption. We learned from the Snowden files that the NSA, once it lost its regulatory authority through the export regulations, started going in, kind of, through the back door and weakening encryption.
So on the one hand, we won this tremendous victory, and it was a huge victory and made a big difference. On the other hand, it feels like I've spent the last twenty years since trying to defend what it is we stood up for, which is the right to have a private conversation online.

Guy Kawasaki:
All right, so now we're gonna take another leap in time to September Eleventh, and my first question about September Eleventh is, did the September Eleventh Commission find that it was security leaks or insufficient security that led to successful terrorist attacks? Was the security the issue?

Cindy Cohn:
No, absolutely not. They very clearly rejected that theory. And you know what they found was that there were communication problems between the NSA and the FBI and the various government entities.
But the big thing was something called groupthink. Remember, the people at the top, including George Bush, were warned that Osama bin Laden was gonna strike inside the U.S.
They just couldn't take in information that was inconsistent with their worldview. And this is a psychological phenomenon that has been studied. So no, it wasn't about the lack of security. It wasn't because they were somehow limited in their ability. You know, the things that we do to protect our rights limit some of the things the government can do.
Those things weren't what the September Eleventh Commission found were the problem.

Guy Kawasaki:
Okay, so the September Eleventh Commission did not find that security was the issue. But correct me if I'm wrong: since September Eleventh, the words “National Security” have been used to justify many things, as if the Commission found that it was security.
So the Commission findings were not addressed, but we introduced this whole new concept of, whenever you wanna do something, you just say it's for National Security. Is that an unfair judgment on my part?

Cindy Cohn:
I think that's true, sadly. And you know, the September Eleventh Commission report happened a little later than a lot of the laws. The Patriot Act was passed right after the attacks; the Commission's report came out significantly later. Congress rushed in because they were frantic to do something, because the attacks were horrible and clearly things had gone wrong.
And a lot of the things they did were things that the FBI and the NSA had really wanted for a while, but they didn't map to the problems that the September Eleventh Commission found nearly as closely as I think people would hope, or as a lot of people think. We felt insecure as a result of the attacks, which were terrible.
Anything that said it was gonna bring us security got a lot of support. You and I were both adults during this time, and I think a lot of people now might not remember it the way that we do. But making people feel more secure was a government priority; whether it actually made us secure or not seemed to be not nearly as important as the story they were telling.

Guy Kawasaki:
So Cindy, is the bottom line that National Security and privacy are kind of a zero-sum game? Are they mutually exclusive, or can we have both?

Cindy Cohn:
Oh, we can definitely have both. And again, this is a point the September Eleventh Commission made: it didn't find that the thing that made us unsafe was our civil liberties protections. It found that there were other things.
The thing that will make us most secure is if the people at the top of our country actually believe it when analysts tell them there is a threat coming, instead of not believing it. There are things that we can do to make ourselves more secure. I think the number one was putting doors between the cockpit and the rest of the plane. That is a very significant difference.
And something that actually has good data behind it about making it a lot harder to hijack a plane. So there's lots of things we can do that make us secure. The problem is that not everything that goes by the banner of making us secure actually makes us more secure.
And for many of those things, our rights are what we give up. There's a thing that we lose in exchange for this false promise.

Guy Kawasaki:
Well, so in a sense you are saying it's a zero sum game.

Cindy Cohn:
No, I'm saying that real security is consistent with liberty.
There are plenty of things you could do that provide real security and do not undermine our civil liberties or our privacy, and those ought to be the things we do first. And I think privacy and security go hand in hand. On a technical level, encryption does that, right?
Encryption gives you privacy. It also gives you security. They quite literally are the same thing. And in terms of protecting our rights, those things go together most of the time as well.
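The point that privacy and security are two faces of the same machinery can be made concrete. In a toy sketch below (standard library only, and emphatically NOT real cryptography; real systems use vetted authenticated-encryption libraries), the same shared key provides both: the keystream XOR hides the message (privacy), and the HMAC tag detects any tampering (security). All function names here are illustrative inventions:

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream by hashing key + nonce + counter blocks."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, nonce: bytes, message: bytes) -> bytes:
    """Encrypt (privacy) and authenticate (security) a message."""
    ct = bytes(m ^ k for m, k in zip(message, keystream(key, nonce, len(message))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # tamper-evidence
    return nonce + ct + tag

def open_sealed(key: bytes, blob: bytes) -> bytes:
    """Verify the tag first; only then decrypt."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("message was tampered with")  # the security half
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

key, nonce = os.urandom(32), os.urandom(16)
blob = seal(key, nonce, b"meet me at the well")
assert open_sealed(key, blob) == b"meet me at the well"  # the privacy half
```

Flip a single ciphertext byte and `open_sealed` refuses to decrypt: the same key that keeps the conversation private is what proves nobody altered it in transit, which is the sense in which the two are "quite literally the same thing."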

Guy Kawasaki:
As I was reading your book and I read about all the efforts in mass surveillance and stuff, it seems to me that with all this mass surveillance that was going on, shouldn't the Epstein files be like totally documented? Shouldn't every conversation, every email, everything be in the Epstein files if all this surveillance was going on?

Cindy Cohn:
I think this is one of the other pieces that's hard for people to reckon with: there's a lot of road between technically surveilling everything and actually making that information available to people. And we've seen this with police body cams, which are another one where, in theory, that ought to give a lot of accountability to the police.
But what ends up happening is it really doesn't. It's very hard for defendants to get ahold of the footage. The cops take a look at it first and then conform their stories to what it shows. So it ends up helping them mislead the public about what they're doing. So you have to look both at the technology and at how we're using it.
And I think the Epstein files are a good example. There are tons and tons of these emails; it's not like there wasn't a pretty strong trail. And yet getting them out, in a way that both protects the victims and identifies the things that we as a society need to know in order to make judgments, is hard.
We're still fighting to make that happen. I think people think that sometimes that mass surveillance will automatically make us safer. And it's far more complicated than that. And I think in many instances it doesn't. And again, you have to look both at does it make us safer and what are we losing as a result of it?
And we just had an incident with the Ring doorbell cameras, right? People saw in the Super Bowl commercial this idea that Ring doorbell cameras can help find a lost puppy. And I'm a big dog lover. I mean, it pulled on my heartstrings too. But I think immediately people saw, what else can you track with that through a neighborhood?
And do we feel comfortable that our law enforcement and ICE will never misuse these mass surveillance technologies in ways that harm us or our neighbors?
And part of what happened there was there was a huge percentage of the people who saw that ad who immediately made that connection and said, “This mass surveillance isn't gonna make us safer. We don't want any part of it.”
And, as a result, Ring dropped the arrangement it was making with Flock, which is a company that does mass surveillance with license plate readers, and basically said, “We can't do this.” People are understanding how this mass surveillance can be misused at a level that surprises us.

Guy Kawasaki:
When I saw that ad, all that could go through my mind was, are you telling me that your agency and your CMO and your advertising people, none of you discussed the thought that people might be appalled by the thought that all these Ring cameras are gonna be taking pictures and they're not just gonna find your little pet, they're gonna find anything?
How can that get to the Super Bowl without somebody being a devil's advocate? Is this more groupthink? I mean.

Cindy Cohn:
It sounds like groupthink to me. I mean, I don't know, but I do think that sometimes people get so caught up in their own stories that they really can't see the reality. And we see this in the higher echelons of a lot of parts of our government and our corporate America sometimes.
And it's good that there was a reality check, right? Enough people made enough noise that they realized this wasn't landing the way they thought it was gonna land. But a similar thing's going on now. You know, Meta has been saying that they're gonna put facial recognition in the glasses they sell to the public.
And I think there was this kind of nasty internal memo saying, “This is a time when civil liberties groups that might criticize us are too busy with the Trump administration, and so they won't notice,” which of course we immediately were like, “Ah, we noticed, and we're gonna attack it.”
But I think it's another moment where the people at the top of these organizations really are not connected with what's going on with real people on the ground who are gonna see that like facial recognition in glasses is not gonna just be used for finding puppies, right?
It's gonna be used for the kind of horrible, you know, detentions and civil liberties violations and discrimination that we're seeing.

Guy Kawasaki:
There must be multiple times a week that you say to yourself, “How can people who are supposed to be so smart be so dumb?” I mean, it's kind of astounding, right?

Cindy Cohn:
Well, it's possible that we do it as well. It's easier to see this in other people than yourself, but I think you get caught up in your own story, and it's important to have a range of people in the room or that you're talking to so that you're not so surrounded by sycophants and people who always agree.
The nice thing about being at EFF is like we are not surrounded by people who agree with us, but the more power you get, the more that tends to happen. And recognizing that and finding ways to get a voice that isn't beholden to you to give you a straight answer is, I think, a challenge for anybody who gains a lot of power.

Guy Kawasaki:
Okay, so now, because I have the rare privilege of interviewing someone I consider a constitutional expert, I have some constitutional questions.

Cindy Cohn:
All right.

Guy Kawasaki:
First of all, will you please explain for the layman or layperson, what is the First Amendment? Because a lot of politicians throw the First Amendment around as if, you know, Facebook is not putting out enough conservative articles. That's a violation of the First Amendment.
And I went to law school for only one week, but I recall that the First Amendment prohibits Congress from controlling the press, not Meta or Apple or Hunter Biden. So what exactly is the First Amendment?

Cindy Cohn:
You're correct that this, “Congress shall make no law,” has meaning. It means that this is about the government, and it has been interpreted to mean government regulations. So, it doesn't have to be an actual law. It can be something, say, that the FCC does, or the FTC does. It's about how the government has to interact with us.
And it says that the government can't restrict our speech except in very narrow circumstances. So, yes, when a big company decides whose speech it wants to host, and whose speech it doesn't wanna host, that's generally outside of the First Amendment. Again, the doctrine is big and complicated, so there are tiny little exceptions.
But on the whole, you know, it's just like if you run a bar: you can't discriminate based on race, but you can say, “This is a bowling bar. If you don't wanna be a bowler, don't come here.” You can do lots of things to control who you're hosting, right?
And there's some famous First Amendment cases about a parade, and that the people who organize a parade get to decide who gets to put a float in the parade, and that's generally not something the government gets to dictate, or if you have a bookstore, you get to decide what books you put in your bookstore and the government doesn't get to tell you, “No, you can't put that book in.”
So these are all things where the kind of private nature of it means that the fundamental decision making needs to stay with the company. So, that's the first part of the First Amendment. One of the things that sometimes people say is, “You know, it says ‘Congress shall make no law.’ That means no law.”
And you know, traditionally that hasn't meant zero. What it's meant is that there's a very high standard for when the government gets to regulate in ways that impact speech and that it has to be a very close fit between what it's saying it's doing and how it actually works. And they call this strict scrutiny or intermediate scrutiny or rational basis.
There's levels, depending on what the government says it's doing. A lot of what the First Amendment requires is for it to be actually doing that and not having a lot of side effects. So, the government saying, “Well, we just wanna make sure that there's no criminals on the internet. So therefore everybody has to show their ID before they go online.”
Well, there's not a very close fit between having a government issued ID and being a criminal. So that would be an example of something that doesn't have a very good fit between the means and the ends.

Guy Kawasaki:
So when the FCC prevents Stephen Colbert from having a Senate candidate on, and it says it's because of equal time on broadcast TV, is that a First Amendment issue? Did we cross the line there when they put the squeeze on Stephen Colbert?

Cindy Cohn:
I think so. I mean, we're still waiting to hear the facts, right? Did CBS do this voluntarily and basically over-censor Colbert? Did the FCC actually say something? We know in the context of Jimmy Kimmel a while ago that the FCC chair actually did say something.
In this one, I think we're still trying to understand whether CBS News, CBS, which is of course, you know, Bari Weiss's shop now and really controlled by some people very friendly to the president, self-censored or was actually censored by the FCC. Again, this just happened yesterday, so the facts will come out.
But either way, it's a problem. It's a problem if the rules are so unclear that self-censorship happens. Vagueness is a doctrine in the First Amendment.
Even if the government doesn't directly threaten, if it writes its rules so unclearly or so broadly that it causes self-censorship, that can be a First Amendment problem. Or maybe this isn't about the First Amendment at all, and it's about CBS News kowtowing because it's trying to go on bended knee to Trump. And that's the piece that I just can't tell yet.

Guy Kawasaki:
Yeah. And, when Stephen Colbert then puts this episode on YouTube, is that an act of civil disobedience? What is that? I mean, can they stop that too?

Cindy Cohn:
Well, you know, less so, right? Because the FCC’s authority really doesn't include the internet at the same level that it includes broadcast. The whole point of the FCC even having the power to regulate broadcast networks like CBS is that the airwaves are limited. You know, there's only so many things you can put on the airwaves.
And so the government is charged with giving licenses out for who gets to broadcast. And those licenses can come with terms. The internet, you don't need a license to publish anything on the internet. The government doesn't regulate the internet in the same way that it regulates broadcast.
And so this is why Colbert did what he did: if you don't put it on the broadcast networks, it shouldn't be subject to this FCC rule, because the internet isn't regulated in the same way as broadcast networks are. And so he came to the conclusion, I think the right one, that regardless of whether he could broadcast it on TV, he could put it on YouTube.
And then his only limitation is whether Google, which owns YouTube, Alphabet owns YouTube, will allow it or not. Now that's its own other problem, right, on the corporate side. But I think he was right to kind of figure out where the rules were and where the rules weren’t and go to where the rules weren't in order to get his voice heard.
That's great and I'm glad he was able to find a way through it so the rest of us could see for ourselves what it was he was actually doing and saying, but you shouldn't have to do that.
And the First Amendment also has a doctrine that says, “Just because you can speak in another place doesn't mean that it's okay for the government to scare you away or regulate you away from speaking in the place that you wanted to.”
So there's a pretty time honored idea that simply because you can find your way around the block and still get your voice heard, that doesn't mean that the First Amendment problem goes away.

Guy Kawasaki:
Although Cindy, I gotta admit that if I were Talarico, I would say this is the best thing that ever happened to me, right?

Cindy Cohn:
There is a thing that was coined by my friend Mike Masnick, called The Streisand Effect, which says that the more you try to squelch something, the more attention it gets, right? Because the fact that it was tried to be censored makes it even more enticing to people like you and me who are interested in this stuff.
But it was named after Barbra Streisand. You know, this is in the early internet days, there was somebody who had a plane that flew over and would take pictures of people's properties, and you could buy one and have an overhead picture of your property. This is before drones were really such a common thing.
And Barbra Streisand got very upset that there was somebody taking a picture of her property from overhead and made a big noise about it. And so suddenly all of us learned about this service, and that you could take a picture of Barbra Streisand's property from the sky. None of us ever would've heard about it had she just not done that.
And so we call this The Streisand Effect, in honor of Barbra Streisand, who was the first one to have this experience at such a scale. And I think they just Streisanded, you know, Mr. Talarico's fight in Texas at a level that you can't pay for that kind of publicity.
And I hope that's the lesson that people learn is that censorship backfires. It's not maybe your best strategy.

Guy Kawasaki:
The bummer is poor Jasmine Crockett is probably pulling her hair out. Like, why didn't they ban me? Why did they ban my opponent, right? In a bizarre way, maybe if your book was banned and my book about Signal was banned, it would help us, right?

Cindy Cohn:
Exactly. I think that's right. And I really appreciate your book about Signal. The fact that you're trying to help as broad an audience as possible understand how to use this tool is one of the ways that we continue to honor the need for privacy that people have. So thank you for it.
But yeah. I wish somebody would come along and censor it for us. And then, we could be on all the late night TV shows talking about how we were censored.

Guy Kawasaki:
And then we'll be banned from the broadcast TV, so we will all go viral on YouTube.

Cindy Cohn:
It could just keep going, right? And then we end up back on the internet.

Guy Kawasaki:
Alright, so now we covered the First Amendment, skip a few amendments, the Fourth Amendment. What is the Fourth Amendment?

Cindy Cohn:
Well, the Fourth Amendment says that “You have a right to be secure in your houses, your papers, your things, from the government coming and searching and seizing them.” It is a basic privacy right against the government coming into your private space and taking your stuff.
And it requires something called reasonableness. It says that “The government can't do it unless it gets a warrant.”
That term gets used a lot, but in the context of the Fourth Amendment, it generally means that a judge has signed off on the police coming to, not just into your house, but that's kind of the canonical thing, but also having access to your papers, which can mean your email, it can mean your other stuff that might not physically be in your house.
Quite famously, the Fourth Amendment “protects people, not places.” So, your phone calls are protected by the Fourth Amendment, even though they, as a technical matter, leave your house to go to the places they go. And the same thing should be true for your emails and other things that you do.
So the Fourth Amendment is your basic privacy protection against the government having access to you, your home, and your stuff. And it creates this zone of privacy or should create this zone of privacy around you.

Guy Kawasaki:
You know, a lot has happened in 250 years or so. So, if you were able to amend the Fourth Amendment to bring it up to speed to 2025, what would you do?

Cindy Cohn:
Oh, gosh. I think there's a lot that one needs to do. I think that we need to be really clear, again, about this idea that it protects people, not places; that needs to be very explicit. Now, most of us, you know, perhaps in colonial times, we stored our papers in our home.
Now our papers are stored all over the place and one of the places that they are stored is with these third parties, your bank or your ISP or your other places. And making sure that the Fourth Amendment really protects those things wherever they are.
That would undo something called the third-party doctrine, which has been used to undermine people's privacy in the digital age. That would be one of the core things I would do to rewrite the Fourth Amendment.
This reasonableness standard has really created a situation in which the amount of privacy we get has shrunk over time. And I think that that's a very bad ratchet, because this reasonableness standard basically means that once people are used to something, it's not a privacy violation anymore.
And I just don't think that that's right. I don't think the fact that you get used to having your rights violated means that you should have fewer rights. And I think that that's what's happened with a lot of the Fourth Amendment.

Guy Kawasaki:
Now I'm having a hard time keeping two conflicting thoughts in my brain. So on the one hand I now understand the Fourth Amendment, but then this thing called the NSLs. So how does an NSL exist if the Fourth Amendment exists? So can you explain the NSL, and you know, poor Guy, the apparent contradiction between those two things?

Cindy Cohn:
There's another part of the Fourth Amendment that I would get rid of, which is the distinction between the content of your messages and the metadata. So, the classic way to think about this is the text of your letter inside an envelope is the content and the addressing on the outside is the metadata.
Same thing with your email, right? The To, From, and Subject lines versus the actual content of it. And this is another area where the Fourth Amendment has really been undermined, because the courts have basically said that the only thing that gets protected is the content, not the metadata.
And that has meant that things like national security letters, NSLs, which are subpoenas, so they're not approved by a judge, can be issued to your ISP or your service provider, your phone company, and get the metadata about you: who you talk to, how often you talk to them, where you are when you talk to them.
Your location is often tracked by these kinds of things, and that doesn't require a warrant, that doesn't require a judicial warrant. It can be done with this NSL, which is really a form of a subpoena, which doesn't require a judge. And this is what has allowed national security letters to exist.
They're very, very commonly used to get information about millions, hundreds of millions of people from their service providers. And it's done under a cloak of secrecy: national security. And the national security letter has meant that the FBI has claimed that it can gag people who receive them.
Your ISP can't tell you that it has received one of these. When we started the fight, it was forever. They were gagged forever. We've poked a few holes in that now.
It's still longer than I would like, but the combination of being able to get this information about us and prevent anybody from ever telling us that we've gotten it really has meant that the government has a lot of free range to spy on us and to know what we're up to, which I think is pretty inconsistent with the Fourth Amendment, with what the founders are trying to do with the Fourth Amendment.
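The envelope distinction Cindy draws here, content inside, metadata outside, can be made concrete with a few lines of Python using the standard library's email parser. This is an illustrative sketch, not anything from the interview; the addresses are invented.

```python
# Minimal sketch: even without reading the body of a message, its headers
# (the "outside of the envelope") reveal who talked to whom, and when.
from email import message_from_string

raw = """\
From: alice@example.org
To: bob@example.org
Date: Tue, 04 Mar 2025 02:13:00 -0800
Subject: meet tonight?

The body is the protected "content"; everything above is metadata.
"""

msg = message_from_string(raw)

# The metadata alone maps the relationship: sender, recipient, timing.
metadata = {k: msg[k] for k in ("From", "To", "Date", "Subject")}
for field, value in metadata.items():
    print(f"{field}: {value}")
```

The point of the sketch is that a provider (or anyone with a subpoena to the provider) can build this picture without ever touching the body, which is exactly the line the content/metadata doctrine draws.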

Guy Kawasaki:
Every chapter of our book about Signal has a little epigraph, you know, a little quote. And when I read your book yesterday, I'm gonna alter our book because there's a quote in there about the metadata where somebody said, “We murder people based on metadata.”

Cindy Cohn:
“We kill people based on metadata.” Yeah.

Guy Kawasaki:
And I read that. I said, “Man, that is a great quote for the chapter about metadata in my book.”
So now people are wondering, oh, what's the big deal about, you know, knowing the outside of the envelope? Well, somebody said, “They killed people based on it.”

Cindy Cohn:
Oh yeah. Yeah, they definitely do, because it helps you figure out who the networks are, who's talking to who, and how often. If you know who's talking to who and how often they're talking to them, you know a lot about a person. If you know where they are, and most of us, we've got these phones, they're pinging all the time in order for our calls and texts to go through.
If you know where somebody is at night and where somebody is during the day most of the time, you can find them pretty easily. And most of us are pretty trackable with just three or four pieces of metadata about us. And again, it doesn't just implicate you, it can implicate all the people you talk to.
And one of the things I try to tell people about privacy is that even if you think you don't have any need for privacy, I suspect you know somebody who does.
Whether that's somebody who doesn't have papers, or who even has papers but ends up a target in this country right now. Lots and lots of people who have legitimate asylum claims or green cards or other things are finding themselves swept up in these kinds of things.
If they're a protestor, if they're an activist, if they're just somebody that the government doesn't like, all of this stuff. You are very findable, trackable, and the people that you communicate with and associate with are trackable as well just from your metadata.
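Cindy's claim that "three or four pieces of metadata" make most people trackable can be illustrated with a toy example. The people and the location pings below are invented for the sketch; the real research version of this idea works on anonymized phone records at population scale.

```python
# Toy illustration: a handful of (place, hour) observations -- pure
# metadata, no content -- can single out one person from a dataset.
records = {
    "alice": {("cafe", 9), ("office", 11), ("gym", 18), ("home", 22)},
    "bob":   {("cafe", 9), ("office", 11), ("bar", 18), ("home", 23)},
    "carol": {("park", 8), ("office", 11), ("gym", 18), ("home", 22)},
}

def candidates(observed):
    """Everyone whose record contains all of the observed (place, hour) points."""
    return [who for who, pings in records.items() if observed <= pings]

# One observation leaves several candidates...
print(candidates({("office", 11)}))                          # ['alice', 'bob', 'carol']
# ...three observations pin down exactly one person.
print(candidates({("cafe", 9), ("gym", 18), ("home", 22)}))  # ['alice']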

Guy Kawasaki:
I've encountered this every day since I wrote this book about Signal. A lot of people say, “Well, what's the big deal? WhatsApp and Apple Messages are end-to-end encrypted, so I'm completely secure.” And I just go, “Oh my God. Let me explain to you what you just said.”
I mean, they collect enough metadata to, I dunno about kill you, but really pin you down, right? Metadata that Signal does not collect.

Cindy Cohn:
I think that that's right and I think again, it's not like it's not useful to use these things. It could be very, very useful and important and I don't want people to think, Oh, well, all hope is lost because they've got my metadata. But it is the next frontier of something that we need to protect if we wanna give people real privacy.
It helps by requiring the cops to actually identify who they're looking for. Although there are these things called geofence warrants, which sweep in all the people in a particular area, that we also have to fight, but it does help quite a bit. There's no golden shield of protection, though.
You know, it's not like one of those Harry Potter things where you can just, you know, do that. These fights have to be fought throughout, and just because it helps a little, doesn't mean it goes all the way, but it does help quite a bit.

Guy Kawasaki:
I just wanna point out to you, Cindy, that I'm more the Maxwell Smart’s “Cone of Silence” than the Harry Potter, but you know, that's just an age thing, right?

Cindy Cohn:
They're both great. They're both getting at the same problem for different generations.

Guy Kawasaki:
Next question is about Section 230. What the hell is Section 230?

Cindy Cohn:
Section 230 is pretty straightforward. It basically means that everyone is responsible for their own words. It means that if you say or do something that's harmful, you are the primary person responsible for that, and that the people who provide you with services should not be responsible, except in some narrow areas.
And this is about civil liability. Sometimes this gets very confusing for people. This means like if I say something that's defamatory about you or somebody else, I'm responsible for that. The website that we're using, the tools that we're using, they're not responsible for that, I am.
And that they can continue to provide services to lots of people and not have to be held liable just because somebody says something defamatory through one of these services.
So it's about what lawyers call secondary liability. Primary liability should be with the speaker and when do you give secondary liability? And Section 230 says that “If you're an internet service provider of any kind, you're providing a digital service. You are generally not responsible for what your users say.”
The exceptions to that are criminal things, or if you've been involved in the creation of the content. You know, Meta is not responsible for what I say on Facebook about you, but they might be responsible for their own ads, for Meta's ads.
So that's all 230 is. It's a form of secondary liability protection for services in the digital age and puts responsibility for illegal speech, illegal behavior on the people who do the behavior.

Guy Kawasaki:
Let's say I'm a devil's advocate and I say, “All right, so there's no responsibility for the platform, but what if the platform's algorithm changes and feeds people stuff?” Didn't the platform just assume some responsibility because of its algorithm?

Cindy Cohn:
This is being fought in the courts right now, and I think the answer's no. And the reason I think the answer's no is that like the First Amendment clearly protects the newspaper's decision about what it puts on the front page and how it presents information to you.
A magazine is protected regardless of how it presents information to you and what it decides to put in and what it doesn't decide to put in.
An algorithm isn't very different from that. Again, there are lots of criticisms you can make about the way these algorithms work, but I don't think that they should break Section 230, because I think a publisher or a platform should be able to make choices about what they wanna provide to people, and in what order, in ways that are protected.
Again, I really think it's wrongheaded, this idea that using technology to help people figure out what they wanna see somehow creates liability. I think it's very dangerous, and I don't think it's dangerous because I care about Facebook or any of these companies.
I think it matters because if you think about how the internet works, nobody gets on the internet all by themselves, right? You need service providers upon service providers, upon service providers. Some of them can be public, some of them can be nonprofit, but whatever, like nobody hosts every little piece needed to speak on the internet, right? Here, we're using a browser.
We all have ISPs. We've got this app inside the browser. All of those pieces are necessary in order to speak online. So if we start creating liability for what people say online through all the pieces of this, from the platform on down to the ISP or the telecommunications company, people are not going to do it.
And the only people who are gonna do it are the big companies who can afford to withstand litigation. So I worry that we're cutting off our nose to spite our face. We're gonna empower the powerful who can afford to withstand liability, and we're gonna make it so that nobody brings us the next app, right?
I always say, “Craig Newmark started Craigslist, and he's an old friend and supporter of EFF, and while I really like Craig, I'm not interested in Craig and Craigslist. I'm interested in the next Craigslist.”
And, you know, the next person who wants to create a space for the rest of us to speak to each other, they have enough barriers already in the way that the world is consolidating, but they shouldn't also have the risk that they are gonna get sued out of existence.
The first time somebody says something actionable on their platform, they're just not gonna provide it. I think it's a recipe for continuing consolidation in our online world if we get rid of Section 230. And it dismayed me to see so many people who I think are genuinely concerned about bad things happening online, not recognizing the consequences of the attacks that they're launching on 230.

Guy Kawasaki:
Huh. So the irony is that all these big companies who might complain about setting aside 230 would actually benefit because they can withstand the attack. And two guys in a garage, two gals in a garage, starting the next WELL could not.

Cindy Cohn:
Yeah, I think that that's right, ultimately, in how it would play out. I don't think that they wanna be sued. I think that they would benefit more if they weren't sued, but they've supported lots of poking of holes into 230 already.
And I think the reason that they've done that is because they see a competitive advantage that some of these holes in 230 create. You know, they can bear the cost in exchange for blocking out their competitors, and they're willing to do that.

Guy Kawasaki:
The next subject is in your book, there is just a wonderful picture of you and Ed Snowden.

Cindy Cohn:
Mm-hmm.

Guy Kawasaki:
Okay. So now what do you think Ed Snowden's legacy will be? Was he a traitor or was he a patriot?

Cindy Cohn:
It depends on what you think about somebody who tells the truth to us about what our government is doing. And I think that that's somebody who's a patriot.
I think what Snowden did was bust through a lot of lies that the government was telling the American people about what it was doing and gave us all a clear picture of some really important things that the government was doing around mass spying, around undermining encryption, around lots of other things.
The little kid who says, “The emperor has no clothes,” like, that kid is a hero to me. I'm sure that he pisses off powerful people who benefit from secrecy. But I think that the American people deserve to know. If you have a self-governing society, you have to be able to make reasoned decisions about what your government should and shouldn't do.
And so I think that Mr. Snowden, at great personal risk, told the rest of us things that we needed to hear.
I think we're still needing to hear them. So I think of him as a hero in the same way I think of Daniel Ellsberg as a hero who helped us see the truth through the lies about what the government was doing in Vietnam back in the day.
And some of the American founders who were calling out the British for their lies about what they were doing. I think those people are heroes and I think Snowden deserves to be up in that pantheon.

Guy Kawasaki:
And how do you wrap your mind around the fact that Ed Snowden is living in Russia? That is kind of a contradiction, no?

Cindy Cohn:
Look, I will tell you: where you are when the government takes your passport away is where you stay. He can't go anywhere else. He doesn't have a passport. He doesn't have papers to travel.
He was in Moscow, in a transit lounge, trying to get someplace else, when the government pulled all his papers and locked him in. He's stuck in Russia. He didn't pick it. And again, I think that he is trying to make the best of that situation, but if you look at the story about how he ended up in Russia, he didn't pick it.
He was trying to get other places. He was trying to travel in a way that avoided U.S. airspace, because the U.S. was threatening to shoot down or force down planes that he was in. And I'll tell you, if you've traveled around the world and you don't have a passport, you don't get to keep traveling.
And that is what happened to Ed. And I think that he's the first to tell you that's not how he planned his life. That's not what he wants, but he's kind of making the best of the situation. Mr. Putin has let him stay in part because he likes the black eye it gives the American government over what it does to people who tell the truth.
I think that we should recognize that. I'm glad that he's let Mr. Snowden stay and let him live, but the reason that Ed Snowden is in Russia is because that's where he got stuck and because Mr. Putin wants the American people to see the hypocrisy of their government. And if you don't see that, then I don't think you're looking at this situation based on the facts.

Guy Kawasaki:
So it's not an ideological choice. It's a lack of choices where he can go.

Cindy Cohn:
Yeah. He can't go anywhere else. If you talk to the people at the NSA, they want him to come here and face the Espionage Act, which has the death penalty for what he did. And I think that's the wrong way to look at somebody who told the American people the truth about what their government was doing, some of which was very highly illegal, right?
Like, he unveiled the government's illegality, and throwing the death penalty at somebody is not the right way to address that.

Guy Kawasaki:
Okay, so now my next topic is we're gonna have a little privacy lightning round. Now when you're on a podcast and the host says, “We're gonna do a lightning round,” it's usually like, “Okay, Guy, barbecue or sushi, iOS or Android, Mac or windows,” you know, but my lightning round is different.

Cindy Cohn:
All right.

Guy Kawasaki:
You are the goddess, the queen of privacy; you're privacy's defender, as this book says.
So I'm gonna ask you about your real day-to-day practical use of technology because I am so curious. So we're just gonna go down a list. All right? And let's start with who is your phone carrier?

Cindy Cohn:
Verizon.

Guy Kawasaki:
Okay.

Cindy Cohn:
We don't have the choices we should have.

Guy Kawasaki:
Okay. What messaging app do you use?

Cindy Cohn:
Oh, largely Signal.

Guy Kawasaki:
Okay, what email do you use? Are you a Proton girl or are you just using a hosted system?

Cindy Cohn:
For my personal one, I use something that's hosted by Tucows.

Guy Kawasaki:
Okay. And what about EFF?

Cindy Cohn:
EFF at this point is a Microsoft shop, because Microsoft lets you hold your encryption keys. But we are not happy about having to do that. We hosted our own email until long after everybody else had stopped, and the amount of resources we put into doing that became unsustainable for a nonprofit.
Ultimately, we try to take people's donations and use them to build good in the world, and I couldn't have a growing percentage of that money going just so that we could self-host our email and try to fight the spam and the bots on our own. And I think that's a real problem that we need to fix.
But ultimately, I want people's donations to go to the fighty lawyers, activists, and technologists, not to a huge chunk for us supporting our own infrastructure.

Guy Kawasaki:
Okay.

Cindy Cohn:
It was a hard decision.

Guy Kawasaki:
Okay. Cloud storage, zero-knowledge? Are you using iCloud or Google Drive or Proton Drive? What do you use for your cloud?

Cindy Cohn:
I don't use much. I mean, again, EFF is on Microsoft, and so that's what we use for it. I don't use a lot of cloud storage myself. I am an Apple user largely. So I guess to the extent that there is some stuff, it's there.

Guy Kawasaki:
We've talked briefly about email and messaging and cloud storage, but it seems to me that maybe the chat bot that you use knows more about you than all of those things. Right?

Cindy Cohn:
It definitely can. I mean, you know, people are trying to build some chatbots that are a little more privacy protective, but so far we are not seeing that at the level that we need to.

Guy Kawasaki:
Yeah. I mean, basically we need a Signal of chat bots, right?

Cindy Cohn:
Correct. And there are people trying to build them. The key is that it's not constantly training the model on you: you can download a local version of the model and have it interact with you and not feed anything back up to the cloud. That's how a lot of the enterprise uses work, and I think the rest of us should have easy access to those kinds of things.

Guy Kawasaki:
So vis-a-vis your iPhone, are you using biometrics to unlock your phone?

Cindy Cohn:
Occasionally. I just wanna fight the hypo a little bit here. My work is to try to set the law and the policy and the rules in a way that supports you, and individual choices are sadly not enough to get us there. So, I appreciate and support people who make individual choices: who use Proton Mail, who don't use proprietary software.
I have tons of people in my life who won't use any services from any of those people. I have decided that what I wanna be is the sharp end of the spear, and that I will do what I need to do in order to get there, rather than making the choice to use only the purest technology. So, if people email me and they wanna have a conversation, I will try to move them over to a secure thing.
But if they don't wanna do it and they wanna do it over an insecure channel, I will talk to them. I will not push. We wanna be available to people and be able to help them wherever they are. So, sorry, that's a long way of saying I'm not sure that my personal choices are the best way to think about what you should do.

Guy Kawasaki:
But still, you have to admit that it's a reasonable line of questions to ask: what is someone who knows so much about what's going on actually using?

Cindy Cohn:
I think that's fair. And I'm not saying it was an illegitimate thing to ask me. I'm just saying that many, many, many of these problems need to be solved not by personal choices. And that sometimes when you reduce it to personal choices, you make people think that they can totally protect their privacy if they just make all the right choices.
And we need to fix this on a whole other level. It is not up to you alone to protect your privacy by the choices you make. We need better tools available to more people, and we need a regulatory and a legal and a constitutional framework that supports that. And that's the work that I do. So, I just don't want people to think that it's all your fault.
It is a little self-blamey: if you use Google, it's all your fault, you don't have any privacy, and you don't care. And I don't want people to get to that kind of place with themselves because it's not fair. It's not a fair fight. It's not like you could just make all the right choices and protect yourself.

Guy Kawasaki:
Well, Cindy, pardon my French, but this is why you are Cindy fucking Cohn, because of that answer, man. That is a great answer. Alright, so Cindy fucking Cohn, my last question for you is: what is your next big battle? I mean, we kind of started the interview with that, but I'm really curious, like, what do you wanna do next?

Cindy Cohn:
Well, I mean, look, I would like to spend time trying to plug all of these national security holes that we built into our Constitution. And I'm looking for ways to best do that. I miss being in the courtroom. I like being in the courtroom. I'm not sure the courts are the right place to start with that.
So these are the things I'm puzzling over a little bit. I'm starting to puzzle a little bit about AI and how do we actually build a privacy protective AI world.
And so I'm kind of trying to take a little bit of this time with the book tour and the other things I'm doing to close up EFF to see what shows up and think about what I wanna do and try to find the way to connect my skills with the right fight.
I know what fires me up, and certainly suing the government fires me up. So I will do something along those lines. But I'm trying to give some space to see which is the right fight next. And I haven't landed on that yet, but that's in part because I think the world presents you with things.
And I'm trying to create space so that the world can present me with things. I didn't come outta law school and think, Oh man, I really wanna free up encryption or I really wanna stop the NSA mass spying. Like those fights presented themselves to me and I'm like, “Great, this is a place that I can plug in.”
And that's how I've made all the choices so far, and that's kind of where I am now.

Guy Kawasaki:
Well, Cindy, I, for one, am so happy that you took up these fights and I sleep a little better knowing that you're out there fighting for privacy.

Cindy Cohn:
Well, thank you. That's so kind and it's been such a joy to get to know you from the first time we did this together, Guy. It's been really such a ray of light in the world. So thank you.

Guy Kawasaki:
Well, thank you very much and best of luck. I'll hold up your book one more time. This is Cindy's book, and you must read it and you also should write a nice little check to the EFF to support the EFF. I got my letter last night that says, “Thank you for your support in 2025, Guy, we look forward to working with you in 2026.” I got that yesterday.

Cindy Cohn:
Absolutely. Good job team. I had nothing to do with that. I mean, other than telling them to do it, we did not schedule the timing. But we stand on the shoulders of people who realize that these are important issues and want to support us.
It takes people who do this work for the long haul, who are really in it, and who can make their lives work doing it, in order to really win. I always say the Department of Justice lawyers spend ten to twenty years getting good at what they do.
We need to be able to do that as well, and we can do that because of the ongoing support of people who make it so that I can grow up young lawyers into sophisticated ones. So, thank you.

Guy Kawasaki:
So again, thank you Cindy, and, you know, my best wishes to you. If you have a big retirement soiree, please get me invited. I would love to come.

Cindy Cohn:
I will. There's definitely a few things planned. I'm not privy to all of them. I think they wanna surprise me, but I will make sure that the team lets you know.

Guy Kawasaki:
I think we should rent out Levi's Stadium and we'll get Bad Bunny to perform for you.

Cindy Cohn:
I would love that.

Guy Kawasaki:
Get Beyoncé, Bad Bunny, Snoop Dogg. We get them all.

Cindy Cohn:
Get everybody. Yeah, sure. That would be fabulous.

Guy Kawasaki:
Let me just thank my team, which is Madisun Nuismer, she's the brains behind me. And there's Tessa Nuismer, who's a researcher for us. There's Shannon Hernandez, a sound design engineer, and the incomparable Jeff Sieh, also a co-producer.
So that's my team. And you know, Cindy, I know someday I'm gonna get an Apple News notification, Cindy Cohn now appointed, and I'm gonna say, “Oh my God, she did it.” I look forward to that day. Yeah.

Cindy Cohn:
Thank you so much.
