On March 5, 2014, Claremont McKenna College President Hiram Chodosh sat down with Eric Schmidt, the executive chairman of Google, and Jared Cohen, head of Google Ideas. The two Google executives co-wrote The New Digital Age: Transforming Nations, Businesses, and Our Lives (2014), which addresses questions about how technology will change privacy, security, war, intervention, diplomacy, revolution, and terrorism. In the book, Cohen and Schmidt argue that technology offers hope for a future of promise and innovation. To conduct their research, they traveled to volatile and repressive societies and met with activists, political leaders, and entrepreneurs from around the world. The discussion took place at the Athenaeum on the campus of Claremont McKenna College.
Presenter: Good afternoon. And welcome to the Marian Miner Cook Athenaeum. We are honored to host two very influential and visionary leaders and recent authors in a conversation today at the Athenaeum. Eric Schmidt and Jared Cohen's recent book is called The New Digital Age: Reshaping the Future of People, Nations, and Business. It examines the increasingly important intersection of technology and world affairs, depicting a world where everyone is connected, a world full of challenges and benefits that are ours to meet and to harness. Eric Schmidt is the executive chairman of Google and also served as Google's CEO for a decade. During this time, he oversaw Google's technical and business strategy, scaled infrastructure, and diversified product offerings. Regarded as one of Silicon Valley's greatest leaders, Schmidt helped grow Google from a small startup to one of the world's most influential companies and, possibly more impressively, the filming location for The Internship. Jared Cohen is the director of Google Ideas and a former adviser to Secretaries of State Condoleezza Rice, a somewhat recent Athenaeum guest as well, and Hillary Clinton, hopefully a future Athenaeum guest. He also serves as an adjunct senior fellow at the Council on Foreign Relations. In January 2013, Mr. Cohen was one of three members of the first high-level American delegation to North Korea following the death of Kim Jong-il. In 2013, he was also named one of Time magazine's 100 Most Influential People. He studied at Oxford as a Rhodes Scholar. We must thank the Athenaeum, the Office of Student Affairs, and the Silicon Valley Program for cosponsoring this talk. And with that, please join me in welcoming our moderator, Claremont McKenna College President Hiram Chodosh, Eric Schmidt, and Jared Cohen to the Athenaeum.
Hiram Chodosh: So what advice would you each give us? I mean, what do you observe around the country and what do you observe in places like this that we should take stock of if we're trying to work hand-in-hand to, again, maximize the benefits of technology and minimize the harms?
Jared Cohen: I think the most useful piece of advice I could give everybody here is something that might contradict what many of you intuitively think, given how tough the job market is, which is: you're very lucky to be graduating as part of this generation. And the reason I say that is when I graduated, the technology boom was still in its earlier stages, and I came to it later because I didn't grow up with it in elementary school and middle school and so forth. You all are part of the first generation that's really had an opportunity to engage with these tools from start to finish. So whether you want to be a lawyer, go into government, go into medicine, or be an academic, there's a fundamental truth that all of you will realize if you choose to, which is that you will know more about technology than whoever you go and work for. And that really is your comparative advantage. This notion that you're a technologist or you're something else is really something that ends with your generation, because technology is now part of everything, and ultimately making sure that you use it as your comparative advantage is how you're going to get ahead.
Eric Schmidt: I think it's helpful to have a sort of prediction of what the world will look like in ten or 20 or 30 years. And I think it's fair to say that we'll have many of the same geopolitical problems, as much as we've tried to make them better. We're not going to get to world peace, no war, no conflict; there are going to be tensions. There'll be new structures in the world. We'll still have Republicans and Democrats fighting; we'll have, you know, a new president and a new president and a new president. A lot of those things will be the same. The things that will be really different are the level of connectivity of everybody and the level of automation of some things. So if you extrapolate: first we're going to get everybody connected—we're working on that now. But ten or 15 years after that happens, what's going to happen? They're going to want more bandwidth. They're going to want faster phones. They're going to want more apps. They're going to have more games, they're going to have more of a voice, they're going to want to see the problems around them addressed. The book is really about that trend and the issues it brings with it. So I think when you think about a university, what you want to do is prepare the next leaders for a world where everybody's connected, where ideas spread very fast, where global competition for labor means that you have to do a smart job or you get automated out. And just as an example, we talk about this a little bit in the book: here's China doing incredibly well, good for them, seven and a half percent growth, et cetera, et cetera, lots of people being lifted out of poverty. Great job. We'll ignore the bad parts of China in this conversation. Now, what happens when robots replace all those jobs in Shenzhen? Now you've got a real problem.
Hiram Chodosh: So you started that answer by saying, “We're still going to have a lot of the same geopolitical problems that we have now.” But one of the premises of your book is that technology itself offers some hope for transforming even that. Could you talk about the book in that context? What is the central claim of the book in that dimension?
Jared Cohen: I think what we argue in the book that's new is the unprecedented empowerment of individuals in places that have traditionally been highly autocratic and totalitarian. In the introduction, you heard that we went to North Korea—North Korea offers a useful way to answer this question, because it's the extreme case study. So we went to North Korea about a year ago because it's the Internet's last frontier; it's a cult of personality; it's the most totalitarian society on earth. It's a horrible place. But you leave there, and what you realize is that history is filled with examples of countries just like North Korea and worse. You also realize that North Korea is the only one that is truly this bad left on earth, and it's because it's the only country on earth where there's an absence of doubt. So in the future, you are still going to have autocracies; you're still going to have dictatorships. But that level of autocracy, where you literally control human minds by preventing doubt from existing anywhere, has been eliminated, or is in the process of being eliminated, by the Internet, in the same way that scientists were able to eradicate smallpox.
Eric Schmidt: I agree with that, and going back to the point that many things stay the same: those of you studying history will see the same themes over and over again. Power-hungry people, governments, culture, religion, conflict, and so forth. So what's new? Well, there are at least two things. The first is that the empowerment of the individual by smartphones and other kinds of devices has never occurred on this scale. And shockingly, not everyone is perfect. There are, in fact, bad people who are being empowered, but the overwhelming effect of getting billions of people empowered by these devices is good—it lifts them out of poverty, it educates them, it serves as a great way to protect women against the terrible things that occur to them in the Third World, and on and on. The other new thing is the combination of data-leaking and data permanence. Now somebody like Snowden can basically take a whole bunch of documents and leak them, and you can't get them back, right? And by the way, that's also true of all of those videos of you drunk at 16. You remember, right? Well, even if you don't remember, the Internet remembers. Data permanence. Society will adapt to these things, but they're material changes, right, in how people are organized, how governments will work, how the law will work.
Hiram Chodosh: You raise Snowden. One of our students asks, “What is Google's stance in light of what Edward Snowden has exposed?” What is your view? What are your individual views?
Eric Schmidt: We debated this at quite some length, and I tend to have more of the sort of California-ish view, Jared has more of the East Coast view, and we came to sort of a common view on this.
Hiram Chodosh: Midwestern view.
Eric Schmidt: Yeah, a Midwestern view. We met in Chicago or what have you. It is clear that the Snowden revelations were useful to us, 'cause they exposed that the NSA, through GCHQ, was watching traffic between our data centers. So we fixed that—we fixed that, for the computer scientists, using 2048-bit keys with perfect forward secrecy, which, as best we can tell, and our math majors can do the math, are unlikely to be breakable by anyone within current human lifetimes. So we think we're okay there. I don't think, however, that's an endorsement for random people taking large numbers of documents that they're not supposed to leak and leaking them, because people can get hurt.
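For the technically curious reader: "perfect forward secrecy" means each connection is encrypted under a throwaway session key, so even a later compromise of a server's long-term key cannot decrypt traffic recorded earlier. The transcript doesn't describe Google's actual deployment, so what follows is only a minimal sketch of the underlying idea, an ephemeral Diffie-Hellman exchange, using Python's cryptography package:

```python
# Sketch of an ephemeral key exchange, the mechanism behind forward secrecy.
# Illustrative only; this is not Google's implementation.
# Requires the "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates a fresh ("ephemeral") key pair for every session.
client_eph = X25519PrivateKey.generate()
server_eph = X25519PrivateKey.generate()

# Each side combines its own private key with the peer's public key;
# both arrive at the same shared secret without it ever crossing the wire.
client_secret = client_eph.exchange(server_eph.public_key())
server_secret = server_eph.exchange(client_eph.public_key())
assert client_secret == server_secret

# Derive a short-lived session key from the shared secret. Once the
# ephemeral private keys are discarded, recorded ciphertext stays
# unreadable even if a long-term key leaks years later.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"demo session").derive(client_secret)
print(session_key.hex())
```

In real TLS, the server also signs its ephemeral public key with its long-term key (the 2048-bit key Schmidt mentions), so the client knows who it is talking to; the long-term key authenticates the connection but never encrypts the traffic itself.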
Jared Cohen: Well, and what I would add to this is, if you think about it, Snowden leaked literally well over a million documents. It is impossible for a human being to read all of those documents and make a determination that they've read through them and nobody's going to get hurt from this. But, you know, everyone's talking about Snowden now; people were talking about Manning before. The reality is, we're going to keep seeing more of these people.
Eric Schmidt: So you believe that there will be more such leakers?
Jared Cohen: And there are a couple of reasons for this. One is that the celebrity factor around leakers like Snowden and Manning is something we haven't seen before. And if you think about Snowden's tactics versus Manning's tactics, Snowden waited until he could be in a physical environment where he couldn't be grabbed by U.S. law enforcement.
Eric Schmidt: So the rule is, when you do the leaking—and this is not an endorsement of leaking—do so from your future country of residence.
Jared Cohen: So what's happened is, Snowden's been able to shape public opinion about himself over time. And regardless of what one thinks of Edward Snowden, we have to assume that in the future there'll be a bulk leaker with much different intentions, who's far less careful, far less discriminate, and people, as Eric mentioned, really will get hurt. The other question is, we think about bulk leaking right now in the context of leaking classified government documents. But what happens when somebody decides to bulk-leak a major law firm's documents? Or a major corporation's documents?
Eric Schmidt: Or healthcare records?
Jared Cohen: I think the sort of inevitability of this is scary.
Hiram Chodosh: Are you suggesting, though, that bulk leaking is always per se to be condemned? That is, to go back to your North Korean context, or in the context of really horrendous abuse—is there any scenario in which we might celebrate a bulk leak?
Eric Schmidt: We can probably come up with an example involving the Nazis. Right? So there are cases where systematic evil of that level can be—should be—countered by every known means, and I think we would all agree to that. But without being a defender of the National Security Agency, they at least were attempting to operate legally—we can debate whether it was legal or not, and I happen to disagree with them. But I don't doubt that their motives were what they said. So it's dangerous to arrogate the power of this bulk leaking, or any form of data permanence, to yourself. “I'm upset with you, I'm going to leak information about you, I'm going to destroy your life.” Who gave me the right to do this to my new friend? I mean, again, you want governments to have processes—sorry, we had a nice conversation about this. Who gave me the right to make that decision unilaterally? Proper governments, functioning governments, have checks and balances. They have courts. They have trials, right? Maybe that's an appropriate penalty after you've been convicted of something. But the presumption of innocence says, “Leave me alone.”
Jared Cohen: If I can just jump in on it: if you have five and a half hours of time to spare, you can read our debate with Julian Assange on this, from when he was under house arrest in the U.K., which he subsequently leaked.
Eric Schmidt: Yeah, he leaked it.
Hiram Chodosh: So in other words, the empowerment you're talking about at an individual level also comes with responsibility. And I want to talk about two dimensions of this that come out of some of our students' questions assembled for today. One has to do with the disclosure of private information. So Carter Wilkinson, who you met, asked, “Do you believe that the rise of wearable electronic devices may lead to an Orwellian society where citizens fear that their lives are broadcast without consent?”
Eric Schmidt: So let's ask people in the audience here: Who is taping or recording this lunch? Fess up, it's okay.
(Audience member stands up wearing Google Glass and admits he is recording it)
Eric Schmidt: Thank you for identifying that. So Google Glass is a good example, because clearly it could be misused. So rather than just throwing it over the wall, we in fact certify the apps, so in our view they can't be misused with things like face recognition that could be used to stalk people. This is an example of where having an informed debate, you know, makes a difference. What's interesting to me is that when I'm in a private meeting, I presume that I have privacy of speech. I presume that I'm not being secretly taped. But if I were a spy, which I can assure you I'm not, I would assume that I was always being taped. And if I lived in an authoritarian country, I would always have in the back of my mind that there was a microphone and so forth. When we showed up in North Korea, Jared announced that we were going to be spied on, and so I said, “Okay, what does that mean, Jared?” He says, “It means I have to take a shower with my bathing suit on.”
Jared Cohen: By the way, it's like the smartest decision I've ever made.
Eric Schmidt: And he said he did not want to be in a porn movie for the North Koreans!
Jared Cohen: And by the way, just so you know—I don't know if you know the background on why I came to this conclusion—I had a colleague of mine who was posted at the U.S. Interests Section in Havana. The government videotaped him taking a shower—obviously he didn't know it—and then one day he got out of his car and was approached by an agent, who showed him the video of himself showering.
Eric Schmidt: And this was some Cuban attempt at oppression or pornography?
Jared Cohen: Possibly a combination of the two?
Eric Schmidt: So I talked to Jared and I thought, “That's the stupidest idea I've ever heard. I've never taken a shower with my bathing suit on.” Although he had told me to bring my bathing suit, interestingly enough. So I decided to sleep with the door open. 'Cause I figured I had no privacy anyway. So you can deal with it mentally if you know about it—it's when you don't know about it, it's a problem.
Hiram Chodosh: Well, you said when you're in a private meeting, you presume privacy. But there are lots of other contexts that may not be meetings or business contexts—they're social contexts—where increasingly, the way I experience it, I can't presume that something I say won't be recorded.
Eric Schmidt: What you're seeing is a change—largely because of the digital age—in private space versus public space, and it goes something like this. There were many things that were sort of public but known locally that are now known globally. Let's say there's a local article here in the San Bernardino Valley that's critical of you, right? And it's incorrect and so forth and so on—it wouldn't have had any legs, if you will. But now all of a sudden it's globally available to everyone, right? And now you've got a problem: somebody's made a misstatement about you that would've been buried 20 years ago. There are lots of examples. I remember getting a letter from a nice lady in Australia who said that she had been convicted of drunk driving and had lost her job, and she decided to appeal in the Australian courts, and under their law, appeals are public. And by virtue of her exercising her appeal, which she lost, the information about her conviction became globally known, and she, in her view, suffered great damage. So the question here is, what does Google do with this kind of information? Well, we ultimately decided that it's not our job to make these kinds of censorship decisions. But the Australian case is an example where, when you appeal something, you are also making a choice to give up some future privacy. And maybe that's the right decision, maybe that's the wrong decision, but it's an example of where these things hit each other.
Hiram Chodosh: Now that's a great segue to the second dimension of the same question, which has to do with Google's corporate responsibility. We have a question from Tokyo: “What role do governments and large organizations like Google hold in regards to the abuse of this empowerment, and what inroads could be made?” In other words, how does Google see its own corporate responsibility in these terms? That is, against the benefits and the harms?
Eric Schmidt: We spent lots of time on this issue and we ultimately decided that we can't tell what's truthful or not, especially when it involves two people having a dispute. So we always rely on what a government process will tell us, and our default is to leave the information up. So again, if we have a mistake, if we have a fact disagreement, right, we'll end up letting the consumer see the two facts and they can decide—let ranking work it out. This can produce some problems, 'cause if you have a psychotic, dedicated attacker whose job is to produce bad, false information about you, they can make a fair amount of progress on the Web destroying your reputation before the other forms of information can come back. And there are certainly examples where celebrities and powerful people have such people who become unusually focused on them. So I'm not minimizing this. But from our perspective, how would you solve this problem? Do you want to have a division of truth run by the government that tells us? Well, that has lots of problems, right? If you allow people to edit their own descriptions, that has a lot of appeal except then I can assure you that everyone would take all the bad stuff down, right, so you'd only have good stories about everyone. That's not very responsible either. There doesn't seem to be a good way algorithmically to answer this question.
Hiram Chodosh: I know that you've built the principle from medicine of “do no harm” into your corporate fabric. Is it “do no harm” or “do lesser harm”?
Eric Schmidt: Well, again, these are where the rubber meets the road, right, when you make these sorts of decisions. A good example is the Innocence of Muslims video, which, by the way, we were ordered a week ago to take down by a U.S. judge, for reasons that we disagree with, involving a lawsuit from a woman who claims to have been duplicitously cast in the movie. With Innocence of Muslims, we decided to IP-block it—that is, prevent its spread—in a couple of countries during the riots over it, because we didn't want to fuel them. But what was interesting was that after people lost interest in this video, we put it back up and it was fine. So that's an example where we looked at—we try to do it on a case-by-case basis—what's the thing that will inform the most but hurt the least?
Hiram Chodosh: So you started talking earlier about technology taking over mechanized human functions. You talked about what the Chinese workers in Shenzhen are going to do when we have 3D printing and other kinds of things. How should we think about the artificial intelligence side of this? That is, in an age of big data, where we have the prospect of the singularity. Matthew Lee from California asks, “Where do you stand on the debate surrounding the ethics of how much technology should improve our lives, i.e., whether the singularity may dehumanize us as a society and dilute culture?” But more on the decision-making and the use of technology…
Eric Schmidt: But let's be clear that the singularity is a view of a possible future. It's not the only future, and there are an awful lot of assumptions in between here and there. I think it's better to talk in the nearer term. We know that computers are going to get smarter, and we know that they're going to be used to make people more effective and have happier lives. You choose to use Facebook, Twitter, Instagram, WhatsApp, and so forth, and all the Google products, because you in your judgment think it makes you more powerful. This is a good thing. You're smarter as a result. I think that trend should continue. If we get to the point where our obsession with technology really does affect the quality of life, in a way that is measured and people agree on it and it becomes a political debate, then you can imagine political action. But we're not there.
Hiram Chodosh: Early in your book, you talked about going to Iraq and the perceived gap between those people making decisions in Iraq about Iraq and their technological sophistication. Could you tell us a little bit about that experience? And then relate it, Jared, to what you see happening in Syria, in the Ukraine, and maybe we can get into some topical issues that way.
Jared Cohen: So Eric and I met and became very good friends in Baghdad, of all places—go figure. When we traveled to Baghdad together, what we realized is that a lot of critical decisions were being made about how to reconstruct the country—where to lay roads and so forth. Nobody was thinking about laying fiber-optic cables underneath the roads, so having Eric on the ground ensured that somebody was asking the technology questions, which had previously either been falling through the cracks or been delegated to somebody so far down the chain of command that their ideas never got surfaced to decision-makers. So imagine rebuilding an entire country and not thinking about communications infrastructure, not thinking about fiber, not thinking about connectivity. And I would say it's safe to suggest that that trip was in many respects what catalyzed both this book and the ideas in it, which is that we need to look at what have traditionally been geopolitical issues through the lens of technology. Now, Eric always likes to say that 80 percent of life is showing up—it's really important to go to these environments and meet people who are using technology but have a totally different set of challenges than we all have. So to give you an example on Syria, I went all the way up to the Syrian border about two or three weeks ago to see a number of Syrian friends of mine who I hadn't seen in a long time. I wanted to understand not what the physical war looks like, because we know that already—140,000 people killed, chemical weapons used, et cetera—but the more invisible war that's happening at the same time, which has to do with nefarious cyber activity. You have Russian software engineers being recruited to come fight with the Syrian Electronic Army. You have the government doing everything it can to figure out who the dissidents are and who the opposition is. My friends told me stories about checkpoints they've encountered in Damascus and Aleppo where the government stops your car, asks for your phone, holds a gun to your head if you don't give them your password, and then goes on to see what you or your friends have posted on your wall or somewhere else. A friend of mine was telling me that her brother got stopped at one of these checkpoints, and one of his friends had posted a page on his wall that was associated with some faction of the opposition. Immediately the man running the checkpoint gave a signal to somebody up in a building, who then shot him in the head. Right? So the stakes are really high. So when you talk about privacy, when you talk about security, let's not get lost in only having that conversation in the context of a very narrow slice of the world's population. The stakes have never been higher. So how are you supposed to tell literally millions of Syrians who are already caught in a physical crossfire—how are you supposed to help them understand what I just described to you? You can't just talk about two-factor authentication, use this browser, use that browser. There's a fundamental challenge that we as an industry need to meet, which is: how do we make the experience more seamless, so we're not just protecting people in the physical world but also protecting them in the online world? Which is why I keep saying that what Syria needs is a humanitarian intervention online, one that the private sector and all the technical people around the world are uniquely positioned to help orchestrate.
Hiram Chodosh: And just describe to us what those strategies would be.
Jared Cohen: Well, I think it's a combination of things. You know, in many respects, the biggest challenge is a nontechnical one, which is the translation gap that exists between the people who are building the tools that make people more secure and the people who need them most on the front lines of these challenges.
Eric Schmidt: But I would also say that there are some limits to our vision. We started off with the sort of view that many tech people have, that technology can solve every problem. And Jared has been tutoring me in foreign policy, 'cause, you know, I was busy learning engineering, right? Everybody can relate to that. So where does the limit happen? Let's use Syria as an example—140,000 people killed, nine million people displaced, chemical weapons, proxy wars from lots of different players, the Internet has been shut down. How does our vision play there? If you can't get the communications in there, you can't talk to the people; you're sort of left hanging. You've got to have some basic things: you've got to have some power, either solar or better generated power; you've got to have some rights of way and some networks. And if you can get those, and the governments don't block them too much, citizens can take over. So the only thing I can think of, and we keep debating every day what to do about Syria, is that perhaps you could airdrop phones in that were peer-to-peer, so you could empower the citizens to tell us what's really going on. But how is it okay that there's a part of the world that is today blacked out? We don't know what's going on, and we suspect terrible things. How is that okay, right? In this year, with this sensibility, in this room? How could we have allowed that? Right?
Jared Cohen: And I think the answer is one that we believe is maybe unsettling but honest, which is that for all the great work that technology can do, for all the individual empowerment that it ushers in, for all of the heightened awareness that it creates, the world is still run by states with serious militaries. And at the end of the day, there's no shortage of horrific videos coming out of Syria, each one worse than the last, and by the way, each one now with fewer and fewer page views. So all that technology is doing in the Syrian context is helping create a demand for intervention, but if the states aren't willing to do it, then perhaps the old adage of “Never again, always remember” that popped up after the Rwandan genocide is incorrect. The assumption had always been that if only we knew that all these people were being slaughtered, it would change things.
Eric Schmidt: So let's go through some thought experiments. There's this horrific, horrific case in 1994 of the Rwandan genocide over four months—Jared actually wrote a book on it. We went to visit—it's a perfectly great place, everybody lives in peace today. Could you have prevented the 800,000 deaths, is that the number?
Jared Cohen: 800,000.
Eric Schmidt: 800,000 deaths, by new technology? And I wonder—certainly, if you'd had your phone, you would've gotten an alert saying your neighbor's about to come kill you. So you could've at least taken some steps to prepare yourself for the machetes. But let's use a current example today, and I'll ask you this, Jared. I was reading an article on the plane—basically what's happened is the Russians have invaded Crimea, which is a lower part of Ukraine which they have historically had a lot of links to, and as for the rest of the country, the guy fled and God knows where he is. What was interesting in the article was the question of whether the Russians have in fact taken over the Ukrainian military or not. There were rumors as to what they said, and there's a story that the Russians told the Ukrainian military to give up their arms, and the Ukrainian military had a nice, pleasant meeting and decided not to, right? What I suspect is that a hundred years ago, they would've shot each other at that point. And now, because they are connected to the Internet, because they know people are watching, there's a little bit more principle at work. Do you agree with that?
Jared Cohen: I agree with that, and we spoke about this the other day. There's this question of why this happened to Yanukovych now. He's been an autocrat since the first time he was in office.
Eric Schmidt: I should explain: Yanukovych was the president of Ukraine until this past weekend, and he was busy trying to negotiate a deal with the Europeans, and then all of a sudden he just stopped.
Jared Cohen: And what happens to a lot of dictators, what happens to a lot of autocrats is, there's a set of tactics that work for them, that they can get away with until the population gets more and more connected, and then one day those tactics basically cause an outcome that leads to an uprising. So I was in Egypt, for instance, the day of the revolution, and it was very interesting. The notion of shutting down the Internet was something that dictators could use quite effectively. It was very effective in quelling the tide of revolution in Iran in June of 2009 and many other examples before that. But when I talked to a lot of young Egyptians on the street in Tahrir Square and elsewhere and I asked them why they were protesting against Mubarak, the answers were hysterical. It was “Yeah, I wasn't that politically involved—I didn't like Mubarak, but this wasn't my fight. And you know what? Then he shut down my mobile device for three days and he really pissed me off.” But that's not something that an autocrat can anticipate until it's too late. And so the threat to autocrats in the future—and we like the idea of these autocrats having a bit of a dilemma—is they don't know at what point they're overreaching until it's too late, and I would argue that's actually good for the world.
Hiram Chodosh: So those are the people we want not to understand anything about technology, right?
Jared Cohen: We want them overreacting to all of the noise that keeps us sort of frustrated with politics.
Eric Schmidt: We were in Tunisia, where the dictator Ben Ali lost his job after the Bouazizi self-immolation. And in talking to the bloggers who helped make this happen, we learned he had regulated television, the phones were tapped, and so forth, but he hadn't regulated the Internet. He was also 80 years old. He didn't use the Internet. Right? So maybe part of this is also generational.
Hiram Chodosh: Before we open it up—because I do think I would like to open it up, I've asked a number of questions that we received before, but I think I'm going to have David go around and open up—I did want to shift a little bit to the personal. I think it's really inspiring for all of us here to see the amount of time and dedication that you've committed to these world problems. I mean, that is not, I think, what anyone from the outside looking in would have expected. How do you each calibrate what you're doing—your day job—with everything else? I think that's something that we all struggle with in terms of how much we do that is just central to those key results and how much we do because we care about the communities around us, we care about our society, we care about our world. And just by way of personal reflection, if you could each sort of talk to us and our students in particular about how you have managed your own time and energy in light of what sometimes can be very difficult tradeoffs.
Eric Schmidt: The old model of the CEO as solely focused on shareholder profits and earnings is really being replaced by CEOs and executives who lead by inspiration and by education, because you're dealing with knowledge workers. They don't just do what you tell them—you can't just scream at them and have them do it. Maybe that worked in the past, I'm not so sure, but it certainly doesn't now. So it's extremely helpful to work in a company or an institution that has a purpose that's higher or bigger than you. You lead a clearly outstanding institution that's trying to make the world a better place through education. We are 100 percent supporters of what you're trying to do. Google is trying to get the world connected and believes that connectivity and the power of information are fundamental to the future. So at least for me—and I think, Jared, it's similar for you—I decided I'd spend my time just trying to get the world connected and trying to get these barriers out of the way, 'cause I figure that's the best leverage of my time. And it turns out that if you have a passion like that, you can spend lots of time on it. People have a lot of free time, right? You only work so many hours a day, and you see your family and so forth—you've got free time. Do it on Saturday, Sunday; think about it. So if you discipline yourself a little bit, you can have a lot of impact. But more importantly—and this is sort of my core advice—life is not what you do but who you travel with. The quality of your life's outcome will be the people you enjoy—and I don't mean laughing, I mean the people you are doing things with, whether it's your family or your colleagues, and the sense of purpose that they have—that's what you'll look back on. So spend your time figuring out who you want to hang with to change the world and you'll have a great outcome.
Jared Cohen: And what I would add is I was very lucky that at a young age I figured out that I liked traveling to unstable parts of the world.
Eric Schmidt: In fact, he doesn't like to go to normal, safe places, I've discovered.
Jared Cohen: But what's interesting is, you spend time in places like Iran, you spend time in these highly autocratic countries, these very poor environments, and you meet people—it really affects you, especially since I did this when I was a student, an undergrad and a grad student. And you can't meet somebody in Iran who lives in a very different world from you and not want to help rectify the horrible situation that they're in. You become deeply impacted by these people that you meet, even if you don't remember all of their names, and I'd say for me that that's what drives me. So it's always been about who are the people that I'm trying to be helpful to, and then the next question is, where's the place I can do that most effectively? For a while, it was working for the State Department. But then I realized there were limits, because we couldn't build things there, so I came to Google, an engineering company, because we could build things there. But what's interesting is that the days of a job that you're in for 30 years don't apply to our generation. And so the piece of advice that I would give all of you is that there's going to be a lot of peer pressure that you're going to encounter for having gamed out a career path. And the biggest mistake that you can make is to choose your next job as a means to get some other job that you've gamed out in the future. I'm a big believer that ambition is best fueled by uncertainty, and the safest bet that you can make is uncertainty that's tied to doing something that you're really passionate about. If you're really passionate about something, you're going to do it well, everything's going to work out, and you really should also pick your next job by who your boss is. Ultimately, who you work with matters more than anything. Again, I really believe that lots of things work themselves out. When I was in grad school at Oxford, everybody would sit around getting coffee at this coffee shop called Blackwell's, and they literally would have meetings about how to be governor by the age of 40. What's happened to a lot of them is they've woken up, now 32, 33 years old, and realized that they've been doing one job after the next, one degree after the next, because it's part of a plan, and they've now missed the window where it's safe to figure out what they actually care about, and they're kind of lost. You don't want to fall into that trap.
Hiram Chodosh: This is wonderful advice. Let's open it up to some questions from the audience, hear some other voices.
Audience: The question is, (inaudible) to consumers, to enterprises—you're empowering them, and in return, sometimes very subtly, you're receiving revenue. You're gaining from your products and services just as much as we are, but we often don't realize that. Is this a model that can be applied to larger technology companies, new ventures, projects, and products that we have now and into the future? Not to say it's a new model, but it's a model that you have scaled so effectively that we can now use it.
Eric Schmidt: Well, you're seeing this strategy playing out in many different ways. And the fact of the matter is that free is better than charged-for. So if you can come up with a 100 million to a billion-unit network of people and activities, you can find a way to monetize it. So it looks like the current sort of aggression in the industry is to try to find as many corners as possible where you can aggregate users to solve some problem, however local, but do it for free or do it for a nominal charge, and then grow that as fast as you can and then sort of see how far you get. And the market is now rewarding such fast-growing networks with enormously high valuations on the anticipation of great future revenue.
Audience: I know you talked earlier about technology like robots possibly taking over people's mechanical roles in places like China, but what's your opinion on cognitive systems like IBM's Watson, or other deep-learning software, that have the ability or the possibility of taking over thinking capacities rather than just mechanical ones?
Eric Schmidt: Well, it's clear that things like Watson are going to be tremendous assets for knowledge workers. There's a series of trials now with Watson in, essentially, cancer care—oncological information, which is a very complicated field. And the trial is whether Watson can survey the information in a relatively unstructured way, and whether it can tell a doctor something that he or she didn't already know about a particular patient. We'll find out pretty soon. It's very exciting. We just bought a company called DeepMind whose underlying technology allows it to do generalized learning: you let it wander around the information space and it figures things out. It does not then follow that it becomes a good judge of character, you know, or a good predictor of the future. So I think we want to distinguish between knowledge and insight. We can probably build enormously successful knowledge systems—you know, the information you need right now at your fingertips, the whole bit. As to whether that will replace human judgment, that's a big leap.
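An aside on what "generalized learning" means in practice: DeepMind's early results came from reinforcement learning, in which an agent wanders an environment, tries actions, and updates its estimates from the rewards it receives, paired with deep neural networks. The network side is beyond a short example, but a minimal tabular Q-learning sketch in Python, on a toy four-state chain environment invented purely for illustration, shows the core mechanism:

```python
# Minimal tabular Q-learning sketch (an illustrative toy, not DeepMind's system).
import random

# Toy environment: states 0..3 in a chain. Action 1 moves right, action 0
# moves left. Reaching state 3 yields reward 1 and ends the episode.
N_STATES, ACTIONS = 4, [0, 1]
alpha, gamma, epsilon = 0.5, 0.9, 0.1  # learning rate, discount, exploration

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    nxt = max(0, min(N_STATES - 1, state + (1 if action else -1)))
    return nxt, (1.0 if nxt == N_STATES - 1 else 0.0), nxt == N_STATES - 1

for episode in range(500):
    s = 0
    for _ in range(100):  # cap episode length so the demo always finishes
        # Epsilon-greedy: mostly exploit current estimates, sometimes wander.
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r, done = step(s, a)
        # Q-learning update: nudge the estimate toward the observed reward
        # plus the discounted value of the best next action.
        best_next = max(Q[(s2, act)] for act in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2
        if done:
            break

# The learned policy should prefer moving right (action 1) toward the reward.
print({s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES)})
```

Nothing here constitutes judgment or insight, which is exactly the distinction Schmidt draws: the agent learns a policy for the world it can explore, and no more.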
Audience: There's a funny little cartoon of a guy sitting at his computer saying, “Oh, I think my next-door neighbor's fire alarm is going off, because I just got pitched an offer for temporary housing and a fire extinguisher.” I was wondering, when do you think we're going to see an opening of APIs and an open standard governing the Internet of things? And how is that going to affect display advertising?
Eric Schmidt: It's a very good question. So the core question is: will the current closed systems that automate much of our world have open interfaces? And I don't think we know. There are some reasons to think it will be much harder than with computer platforms, and one reason is exactly the kind of property you describe. Let me give you an example. You have some home automation system, and there's some evidence of a crime in the home. Does the computer then call the police? You know, assume that you have some smoke alarm that can smell—I'm making this up, 'cause our smoke alarms don't do this—and imagine it smells marijuana, right? What to do? There are so many questions like that, right? Or your watch detects that you have a problem with your heartbeat—does the watch call the doctor now, or does it wait for you to press the button? Oops, you've had a heart attack; you can't press the button. All of those are issues society's going to have to work out—I don't think we know. The other issue has to do with abuse: the moment you open the API, the Chinese can come in and decide to set the fire alarm off. Not a good thing.
Hiram Chodosh: Eric, you've raised many big ethical questions, some of them very important in certain contexts. How would you describe the resources that Google puts into ethics to help it think through some of these things as it develops new technologies?
Eric Schmidt: We have Jared's group, which is trying to grapple with these questions—trying to solve traditional problems with new ideas. But most of our decision-making is much more pragmatic: how do we build a product people actually want? Ultimately, because we're Google, when we launch a product, we're so heavily watched, if you will, that if we do something that's on the other side of some line, we get lambasted and we get feedback very fast. A classic example is that one of our engineers invented a particularly clever way of doing face recognition, and then used the face recognition to do predictive modeling—to predict not only who a person was but when you would meet up with him. So I'm sitting in a product review with Larry and Sergey, and this 23-, 24-year-old stands there and demonstrates this, and I get this white-as-a-sheet look. And everyone starts laughing, and he thinks that he's being praised, right? And I am reacting to the impact that such a tool would have—it has realtime data collection, realtime face recognition. Think of the misuse. We had a huge debate over what to do with this, and ultimately decided not to proceed.
Hiram Chodosh: Does that impose on you as a company the need to cordon off some of those technologies into a special space that's very high-security?
Eric Schmidt: This is where judgment—no, in a physical sense, the answer is no. But you want your companies to use judgment on these things. The fact that you can build it does not necessarily mean that you should unleash it on the world, even if it's legal. Let me give you face recognition as an example. Face recognition databases are legal in the United States and illegal without a license in Europe. So that's an indication right there that it's an issue we can debate—we have to think about that.
Hiram Chodosh: How early in the technological process do those ethical judgments take place? Because obviously, if the engineers are free to develop what they want…
Eric Schmidt: We actually encourage the invention of these things, and then a relatively early review. But when the hammer comes down, the engineers get it. You don't want to go and say, “Don't work in this space,” and indeed, the technological ideas in these particular areas were repurposed. For example, the face recognition work was added to Picasa, where it became the most successful form of grouping the faces that you then name. So we used the technology in a completely different way, and the engineers were actually happy.
Audience: How do you think the model that he described earlier will respond to this generation's demonization of advertising and things like that?
Jared Cohen: Well, I guess I don't know that I would take it as a given that all people of a particular generation have a demonized view of advertising, but I think your earlier question was about transparency. I mean, the reality is, as a public company, there's plenty of information available about the revenue associated with advertising.
Audience: You gave the example of the Russian invasion of Crimea and how technology sort of provided a space that enabled diplomacy. Could you speak more to the effects of technology on the way wars are fought—for example, with the (inaudible)—such that traditional determinants of the outcome of war, like population size and the power of your military, aren't necessarily decisive, but rather your technology and the ethics involving it?
Jared Cohen: Yeah, it's a really excellent question, and one I think we could probably spend hours on. I'll try to touch on a few pieces, and if you read the book, there's an entire chapter on the future of war and the role that technology plays in it. One question we get a lot is about the future of drones and UAVs, and the conclusion that we come to is that we don't think there's any state on earth that will ever be comfortable with the idea of robots making the decision of when to pull the trigger. You have to assume that with the proliferation of UAVs and drones, there are also going to be treaties that make sure a human hand is always behind the trigger. Where it gets complicated is that there are two types of wars happening right now: physical wars and cyber wars. One is invisible and attribution is very difficult; the other is very obviously visible. So you can have two nations that are at peace with each other in the physical world but basically at war with each other in the online world. The U.S. and China is actually a very interesting example: they're sort of frenemies. But if you look at the relationship in the online world, it's about as adversarial—probably more adversarial—than the physical one between the U.S. and Iran, in the sense that there are attacks happening every single day. And so the question that we raise in the book is: the online world and the physical world are still part of the same system, they're just two different fronts, so at what point does the virtual front spill over into the physical front? When does a Chinese cyber attack carry such significant consequences that it warrants a physical-world response? We haven't seen that spillover moment yet. What Eric will say is that he believes the spillover moment is when lives are lost.
Eric Schmidt: And these doctrines are evolving now. Let me ask a much simpler question. We have students here on the campus and you're in charge, and a bunch of students buy these inexpensive drones that are photography drones and they're flying them around. Is that okay with this college? I ask not because I have a strong opinion about it but because the next generation of technology enables GPS-enabled, self-directed, pilotless drones for photography, right? I suspect that people will be uncomfortable with these things. Right? For all sorts of reasons.
Hiram Chodosh: Yes.
Eric Schmidt: And there'll be a whole conversation about what's appropriate and what's not.
Hiram Chodosh: Absolutely, and I think too that just thinking about this example, even now I think we all have some discomfort. Forget flying photography—just photography.
Eric Schmidt: Well, it's interesting at Google—we have a person wearing Google Glass here. In Google, it is highly accepted to wear Google Glass in all of our meetings. But also remember we have control over who the people in our meetings are because they're employees and guests. Right? I'm not sure we would be quite so liberal if everyone randomly walking around Mountain View, California were wearing Google Glass and just walking into our meetings, right? That would be a different conversation, I suspect.
Hiram Chodosh: There's also the issue, it seems to me, of accountability. Both in the communication systems but also in the production of the photograph. And I'm often very concerned about the lack of accountability that we have in cyber communication more generally. I don't like it, because I think the harms are far greater than the benefits of allowing people to speak anonymously. That's my own normative view.
Eric Schmidt: And by the way, I actually agree with you. What you'll discover, if you think about anonymity for a while, is that absolutely perfect anonymity is not going to be allowed. Because ultimately, true anonymity—truly hidden speech without any repercussion—can be used to violate the law: libel, you know, and so forth. And some judge is going to say, “You mean I can't find the perpetrator?” And yet the technology, in particular the cryptography, allows for truly anonymous speech.
Hiram Chodosh: Yes.
Eric Schmidt: So if you go look at the Manning case with Julian Assange, they spent three months using a system called Tor to try to figure out if they were dealing with legitimate versions of each other, and they did various tricks using Tor. What they didn't know at the time was that the NSA had broken Tor and was busy watching what they were doing. So when you start playing with these anonymity games, you’ve got to be really sure you're anonymous or you're going to get caught, and you have to be very careful. So this is like a whole new set of issues that society has to face.
Jared Cohen: There's also one other important piece, and I'm sorry, I forget your name, but your question also raises another interesting point which is about the asymmetric nature of this. So as it pertains to a physical conflict, if you don't have the tanks and the weapons, there's not much that you can do, but as it pertains to cyber war and nefarious cyber activity, one individual can be as destructive as a state in some cases.
Eric Schmidt: Or a very, very highly educated technical country can be equally effective as a much larger rival that's not so technical.
Jared Cohen: And one of the things that we say in the book is, as we think about a dangerous terrorist organization in the future, it may be less about who their charismatic leader is and more about who their chief technology officer is. The Mexican drug cartels, you know, they build their own submarines—we see them increasingly using over-the-counter drones for various things. So the nature of warfare will change online as well as you have these asymmetric battles between terrorist groups and illicit networks and states, and because attribution is so difficult, you may be engaged in a war where you don't even know who you're fighting against.
Eric Schmidt: And just to finish this out, there was an article last week by David Sanger, which I don't know to be true or not, that said the president was given a set of options to destroy Syrian infrastructure using various technological solutions—effectively forms of cyber war. And this would be the first use, quote, “on the battlefield” of this kind of technology, and he decided that it was unlikely to have the kind of impact that was claimed and it would set a precedent. So that's an example of the kind of decisions—if this is true; it's just a leak, I have no idea—if it's true, it's an example of the kind of decisions facing policymakers of a complexity we probably haven't seen since the atomic bomb.
Hiram Chodosh: There's a major challenge in all criminal justice in which the government is always trailing behind the criminals. In fact, there's a kind of evolutionary process of advancement of technology in which crime is always a little step ahead of government in preventing it or cracking down on it. This gives us some pause in terms of your more optimistic view about technology generally. And so I'm just wondering, against that context, why we should be optimistic about technology in this particular dimension.
Jared Cohen: Well, I would argue, actually, that with regard to terrorists and criminals, while there are new threats, they're going to have a much more difficult time in the future. And I think it's important to understand that terrorism's never going away and criminality's never going away. What we want to do is increase the likelihood that people who are planning attacks get caught before they happen or, if, God forbid, they're successful, get caught after it happens. There's a great story from about six months ago that illustrates the point I'm going to make. You had a $45 million ATM heist that was orchestrated over the course of 12 hours. It was the largest ATM heist in history. You literally had tens of thousands of ATMs hit up around the world in about 45 different countries. Now, what's interesting is the whole thing was orchestrated by a big, elusive, transnational organized criminal ring. But they needed people to physically go to all these ATMs and take the cash out. So they turned to basic street criminals, and it turns out that basic street criminals are kind of moronic: they went to these ATMs, took out the money, and then decided to post Instagram photos of themselves celebrating with the cash they'd just stolen, not realizing that Instagram geolocates by default. And by catching those criminals, investigators were able to find the communications they'd had with somebody, which led to somebody else. So the larger point that I'm making here is, wherever you see criminal activity, wherever you see terrorist activity, wherever you see illicit activity, there's always one moron in the group. And the nice part is, every one of these people has to opt in to technology, because you can't be relevant operating out of a cave in Tora Bora, and so the digital trail means that the margin for error goes down. A great example is the arrest of El Chapo, the leader of the Sinaloa cartel. He had escaped from prison about a decade ago, and all the interrogations of all the captured Sinaloa cartel members produced absolutely nothing in terms of finding him. Ultimately what found him was one SIM card leading to another SIM card leading to another SIM card.
Eric Schmidt: An even more humorous story has to do with the founder of the McAfee antivirus software, who wound up living in Belize. There was a suspicious death next door, he was a person of interest, and he fled, but he decided to inform everyone through his blog of where he was. And he's actually having a nice time, and he's in different countries and so forth, and then a picture of him taken with an iPhone in Guatemala is posted on a Twitter page. Within about two seconds, the metadata has his GPS coordinates, and those go right to the Belizean police, who called the Guatemalan police, who arrest him. They deleted the picture within some number of minutes—like, you know, four or five—so they knew they had made a mistake, but the data permanence was already there. The curious ending of the story is that the Guatemalan police talked to him and decided they had no basis to send him back to Belize, but he was also in the country illegally, so they deported him to Miami, which is exactly where he wanted to go.
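The "metadata" doing the work in both of these stories is EXIF data, which smartphones embed in photos by default, GPS coordinates included. As a concrete sketch of how easily those coordinates can be read back out, here is a minimal Python example using the Pillow library (9.3 or later, which provides the ExifTags enums used below); the file name is just a placeholder:

```python
# Minimal sketch: read GPS coordinates out of a photo's EXIF metadata.
# Requires Pillow 9.3+ (pip install Pillow); "photo.jpg" is a placeholder.
from PIL import ExifTags, Image

def gps_from_photo(path):
    exif = Image.open(path).getexif()
    gps = exif.get_ifd(ExifTags.IFD.GPSInfo)  # GPS sub-directory, if present
    if not gps:
        return None

    def to_degrees(dms, ref):
        # EXIF stores (degrees, minutes, seconds); south/west are negative.
        deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
        return -deg if ref in ("S", "W") else deg

    lat = to_degrees(gps[ExifTags.GPS.GPSLatitude],
                     gps[ExifTags.GPS.GPSLatitudeRef])
    lon = to_degrees(gps[ExifTags.GPS.GPSLongitude],
                     gps[ExifTags.GPS.GPSLongitudeRef])
    return lat, lon

print(gps_from_photo("photo.jpg"))  # (latitude, longitude) in decimal degrees
```

Services differ in how they handle this: some strip EXIF on upload, others preserve it or record location separately, which is why a single tagged photo was enough in each of these cases.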
Hiram Chodosh: Before we conclude, I just wanted to quote from the conclusion of this really stimulating book. It's really clear to me that our guests today are committed to our individual and collective roles in minimizing the harms and maximizing the benefits of the digital age. In their conclusion, they warn that creating net benefits, quote, “may not happen if all of us were just to sit back and pretend that only the optimistic developments are relevant for us.” Making change in our communities, expanding the middle class, and access to the best teaching and learning methods, they write, “are all choices, and we can continue to choose in ways that enable the digital age to live up to its promise.” I want to thank you both for coming.
***