Former Federal Communications Commission (FCC) Chairman Tom Wheeler says that we have been through information and technology revolutions before, going all the way back to Gutenberg. Now it is our turn to stand at the terminus of history, where the rules of industrial capitalism may no longer apply to Internet capitalism. Our task, therefore, is not to flee but to stand up, recognize the challenge, and respond to it.
JOANNE MYERS: Good morning, everyone. I'm Joanne Myers, and on behalf of the Carnegie Council I'd like to thank you for beginning your day with us.
Our speaker, Tom Wheeler, is a man who has worn many hats. Most recently he was chairman of the Federal Communications Commission (FCC), but before taking on that position he helped start or was responsible for founding several companies offering new cable, wireless, and video communication services. Currently he's a visiting fellow in governance studies at the Brookings Institution.
But that's not all. He is also a noted historian of the Civil War, and his two books on this period, Take Command!: Leadership Lessons from the Civil War and Mr. Lincoln's T-Mails: How Abraham Lincoln Used the Telegraph to Win the Civil War, reflect his deep interest in how past events and present trends reflect—or don't—what happened long ago.
It's not surprising then that the theme of his most recent book, From Gutenberg to Google: The History of Our Future, looks at how network revolutions of the past have shaped the present and set the stage for the future. It is a perfect vehicle for Tom to combine his passion for history with his experience in communications.
Just a brief footnote: When you hear the word "network" used today, it is being used as shorthand for the physical links that bind society together.
In an age of smartphones, smart appliances, smart watches, drones, and driverless cars, it's easy to think that new technology is uniquely disruptive, revolutionizing our lives in unprecedented ways. History, however, tells a very different story: life-changing innovations did not simply appear out of nowhere; the free flow of information began a long time ago. As many of you know, the original network revolution began in the 15th century with the invention of the printing press. Four centuries later, the next great network-driven transformation appeared when the telegraph and the railroads came into being, drawing people closer together than ever before. You could say that the benefits of connectivity were at last being realized.
Now as we enter the third network revolution, our speaker tells us that the disruption brought about by cutting-edge technology is not new. We have been here before. And, as in times past, it is no surprise that with each new wave of change, social upheavals are to be expected.
In From Gutenberg to Google Tom writes that it is important to understand the historical context for how technology evolves. To this end he traces the pattern of network-driven outcomes over time and tells us that while networks may be the primary enabling force, history shows it is the secondary effects of such networks that are transformational. More specifically, it is how we react to new technology that is of primary importance.
The question is: What exactly are new social, scientific, and technological advances doing to our lives? What is the history that we are making? For the answers, please join me in welcoming a very special guest, Tom Wheeler. Thank you for joining us.
TOM WHEELER: Thank you very much, Joanne, and thank you everybody for coming out early this morning.
I've been introduced with this book by a lot of people but nobody who clearly had read the book and internalized it as much as Joanne did. I think I can just go straight to the Q&A because Joanne has explained it all. But thank you very much for your caring about the book, Joanne.
I've got with me here a copy of a tweet that went viral recently listing the things we take for granted that did not exist in 2003: Facebook, Twitter, iPhone, iPad, Android, YouTube, AWS, the Apple App Store, Uber, Airbnb, blockchain, Bitcoin, Square, Spotify, Dropbox, Instagram, Snapchat, WhatsApp, Pinterest, Kickstarter, Messenger, Quora, Tumblr, hotspots, Hulu, Box, Nest, Fitbit, Oculus, Kindle, and 4G. None of these existed 15 years ago, and they are pretty common and defining in how we live our lives today.
Technology has impacted our personal activities and also our economic activities. Robots wrote a lot of your newspaper this morning. Financial reports, sports scores, this sort of thing, are being done now by computers, by artificial intelligence (AI). Last week Warner Music announced a 20-album deal with an algorithm. It's not Beyoncé. It's a computer algorithm.
Fifty-two percent—this is a shocking statistic—of the companies that were in the Fortune 500 at the turn of the century—2000—don't exist anymore. Fifty-two percent. And we are at a period where we have the highest level of wealth inequality in a hundred years. No wonder there is all of this anxiety around the country and around the world. No wonder that there is this search for stability amidst all of this change.
I get a kick—when we were talking at the table here, people go around today and say, "Oh, we've never seen so much change." No. Baloney. And what From Gutenberg to Google is about is talking about the predicates that created where we are today.
Let me just put this in perspective, and Joanne gave you a preview of this: We talk about how we're in the middle of the information revolution. The first information revolution, the original information revolution, was Gutenberg's printing press, because Gutenberg picked the lock that had kept information shut away to be exploited by the priestly and the powerful. That helped bring about the end of the feudal economy, and it also challenged the Catholic Church, which began a campaign to say that this information was fake news.
Four hundred years later came the first high-speed network, in the middle of the 19th century: the steam railroad. Think about it for a second. For as long as mankind had existed, geography and distance defined the human experience. Your personal experiences and your economic activities were constrained by how far muscle power, your own or an animal's, could carry you, and suddenly the railroad destroys that, and I mean suddenly. It destroyed the agrarian economy. Self-sufficient agriculture became corporatized agriculture. It destroyed the artisan economy, which was the heart of local communities. It created all kinds of fears.
One of my favorite stories is that in some of the early trains in the United States going at the incredible speed of 20 miles an hour, people would take notebooks and pencils onboard to see if they could write because they weren't sure their brains would be able to operate at that kind of speed.
And then, boom, boom, immediately on the heels of this revolution in physical speed came the first electronic network, the telegraph, and the fact that information could be available essentially everywhere at the same time, creating the monster that we live with today. The telegraph opened an environment in which, because it is possible to know a piece of information in real time, it becomes essential to know that piece of information, which is why we spend our lives looking at our mobile phones.
Does all this sound familiar? Social discord and economic upheaval as a result of new technologies.
These are the stories that I call "The History of Our Future," the subtitle of the book From Gutenberg to Google: The History of Our Future, and it's rooted in two facts: One is the Darwinian evolution of technology. We have this image of two guys and a dog in a garage having an epiphany and all of a sudden something happens. No. There is an evolutionary process. The other is that the products of that evolutionary process produce upheaval and change.
If you peel back the onion of Internet technology, for instance, and look at its evolution, you find Johannes Gutenberg, because the genius of Gutenberg was not a printing press. Printing had been done for centuries: wooden plates carefully carved with all of the information and then pressed onto the page. Gutenberg saw information not as a collective whole but in its smallest usable parts, and that's the concept behind the Internet and digital code today: Break a thing into its smallest parts, send it out across the Internet, and reassemble it at the other end. That is exactly the breakthrough that Johannes Gutenberg had in the middle of the 15th century.
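[Editor's note: Wheeler's "smallest usable parts" idea is, in modern terms, essentially packetization. The sketch below is purely illustrative and not from the talk: it breaks a message into numbered chunks and reassembles them, the way digital networks break content into packets and reassemble it at the other end. The chunk size and function names are invented for illustration, and real Internet protocols are far more elaborate.]

```python
# Illustrative sketch only: break a message into its smallest usable parts,
# "send" them (here, just shuffle their order), and reassemble at the other end.

def split_into_packets(message: str, size: int = 8) -> list[tuple[int, str]]:
    """Break a message into numbered chunks of at most `size` characters."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Put the chunks back together, even if they arrive out of order."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = split_into_packets("Break a thing into its smallest parts and reassemble it.")
packets.reverse()  # simulate out-of-order arrival
assert reassemble(packets) == "Break a thing into its smallest parts and reassemble it."
```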
If you follow the evolution of computer chips, you come back to the steam engine, and in particular you come back to a man by the name of Charles Babbage, a British mathematician, who interestingly happened to be present at the running of the first commercial trip by a railroad in the history of the world, the Stockton and Darlington Railway in Northeast England, where this speeding locomotive—again that was about 17 miles an hour—ran over somebody. I don't mean to chuckle at the running over of somebody, but Babbage, who was there, invented the cowcatcher to prevent that kind of thing from happening in the future.
But, as I say, he was a mathematician. He was a fellow of the Royal Astronomical Society, which meant that he had the responsibility to calculate astronomical tables, which is an awful process. Think of your multiplication tables and how we suffered through that. You can imagine these large numbers being constantly calculated and the potential for error.
So the process was that you would have two mathematicians do the calculation, and then they would check their work against each other to find out if one of them had made a mistake. One day Babbage looks across the table at the guy he's working with and says, "I wish to god these calculations had been performed by steam." And his life changed, and he spent the rest of his life designing what today we would call a computer using gears, cogs, and spokes. He left behind hundreds of pages of blueprints. He only built a small demonstration model; he was never able to put the whole thing together, but it contains all the components of computing: input, output, calculation, storage, everything that today we look at and say, "That's what makes a computer," Charles Babbage did in the Victorian Era.
In the 1990s the London Science Museum decided, "Well, let's see if we can build Babbage's computer," and they did, and it worked.
The interesting thing is that Babbage was then forgotten. When he died, he was forgotten. We think of Alan Turing and these other great pioneers in computing. They had never heard of Babbage.
But we can trace computing back to the steam era, and whether it is printing or computing it drove significant upheaval. Printing gave us the Reformation. When Martin Luther tacked his Ninety-Five Theses on the church door at Wittenberg, it wasn't breakthrough theological thinking. Those ideas had been cruising around Europe with theologians for some time, but the thing was that those who were advocating them, their reach was only as far as their voice.
Gutenberg happened to coincide with the growth of a network of commercial printers who were looking for material, and that's what took off. And of course the Reformation only brought us decades of war. So was there upheaval from that? You bet.
The printing press also helped the Renaissance get out of Northern Italy probably faster than it otherwise would have. We look back today on the Renaissance and say: "Oh, such beautiful things that were going on, art and thought and discovery." It must have been hell to live through that period because everything that anybody had been taught to believe was torn asunder by new ideas that were spread by the printing press.
We think about how the railroad and the telegraph together drove the Industrial Revolution. It destroyed local economies, it destroyed agrarian self-sufficiency; it did to the basic center of economies in communities what Google is doing today.
If a farmer wanted a plow in the middle of the 19th century, he went to the local blacksmiths. Two blacksmiths performing 11 tasks produced a plow in about 118 hours. When the raw materials were centralized in a factory 52 men doing 97 different tasks produced the same plow in three-and-three-quarter hours, and the railroad then took it back out to an interconnected market. That destroyed the economy for local artisans such as blacksmiths.
That's the history of our future. Every history has a terminus. We are the terminus of history.
Each of us is living at our own terminus of history, which means that it is now our turn, and our time has been developed by our progress down two simultaneous paths: One is ubiquitous connectivity; over here let's say it moves from the printing press to the telegraph to the telephone to ubiquitous wireless connectivity, and the other is low-cost computing that starts with our friend Charles Babbage, goes to mainframe, goes to Moore's law, and chips being in everything.
We have all grown up watching these two things happen. Sometimes they would tickle each other. But now they've had sex.
Somebody always chuckles when I say that in the audience. I had a call from a friend of mine who read the book. The line is in the book, by the way. There is a one-sentence paragraph that says, "And then they had sex." A friend of mine calls and says, "How in the world did your editor ever allow that to stay in?"
But the fact of the matter is that that combination created a new asset, created the capital asset of the 21st century, digital information. Computers talking to computers universally connected have made the digital information that is exchanged the asset of the 21st century.
Ninety percent—try this one on for size—of the data that exists today has been created in the last two years. We are on an exponential curve. There are 39 million volumes in the Library of Congress. Every single day we produce the equivalent of 3 million Libraries of Congress in data, the result of this ubiquitous connectivity and ubiquitous computing coming together.
You may recall that The Economist ran a cover a couple of years ago saying that data was the new oil. Well, they were wrong. There is a huge difference between the asset of the 21st century and the assets that have existed throughout the history of humankind, and that is that data is inexhaustible. You can't compare it to oil. There is a limited amount of oil in the ground, a limited supply. There is a limited amount of gold, a limited supply. For all of history the assets that drove the economy have been defined by limits, and now we have an inexhaustible supply: computers talking to computers create new data, which creates new software-driven products, which create new data, which creates . . . and it becomes our own perpetual motion machine.
And about half of that data that is created every day is data about you and me: the collection of information about our activities on a scale that Big Brother could have only imagined, being delivered not by totalitarian governments like Orwell envisioned, but by capitalists. And that union of computing and communications leads us to what's next, four quick things.
One, the nature of our networks is changing. Network has always been, "Take something from A to B." When you have a network that is nothing more than connected microchips out here, suddenly the role of the network evolves to become an orchestrating activity rather than a transporting activity.
Quick example: When you bring up Waze or Google Maps or whatever in your car for the connected car, that is a classic example of how a network works: get information, bring it back, point to point.
An autonomous car takes thousands of inputs to make the decision of how not to hit the school bus. And all of that work is done out here in the network. These devices are this size because of the computing power that has to be in here, the chips that have to be in here.
You move that computing power out into the network, and suddenly the size of this disappears, and you start seeing on your glasses, for instance, the screen displays. The nature of networks is moving from transporting to orchestrating.
Second, artificial intelligence. We hear an awful lot about artificial intelligence. We can talk more about it if you want, but what is artificial intelligence? It is the use of computing power to create and store data, with ready access to it over the network, and to apply a known algorithm that does a pretty good job of making a prediction.
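[Editor's note: as a purely illustrative picture of that definition, the sketch below stores a small, made-up data set and applies a well-known algorithm, an ordinary least-squares line fit, to make a prediction. Nothing here is specific to any system Wheeler describes; the data and names are invented for illustration.]

```python
# Illustrative sketch only: stored data plus a known algorithm (least squares)
# that does a reasonable job of making a prediction.

data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1)]  # (input, observed output) pairs

n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in data) / sum((x - mean_x) ** 2 for x, _ in data)
intercept = mean_y - slope * mean_x

def predict(x: float) -> float:
    """Predict an output for a new input using the fitted line."""
    return slope * x + intercept

print(predict(5.0))  # a prediction for an input the data set has never seen
```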
Third big change: blockchain. And I mean a lot more than Bitcoin. Forty years after Gutenberg, Luca Pacioli, a mathematician in Venice, codified double-entry bookkeeping, and Gutenberg's printing spread it throughout Italy and then throughout Europe. What that enabled was for two banks to know they were each keeping score the same way, and that meant that I could cash a draft drawn on this bank at that one, and banks would have relationships back and forth with each other because it was a matter of trust—we knew we were keeping score the same way—and that trust was what the bank sold.
The same thing exists today in credit cards. You trust that the merchant will take your card; the merchant trusts that they will get paid. Blockchain takes advantage of this distributed, orchestrating network to move that trust out into the network and to redefine trust.
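[Editor's note: the sketch below is a purely illustrative, toy version of that idea: a hash-chained ledger in which every entry is bound to the entries before it, so tampering with any old record is detectable by anyone holding a copy, without a single trusted bookkeeper. Real blockchains add distributed consensus, signatures, and much more; the entries and names here are invented.]

```python
# Illustrative sketch only: a toy hash-chained ledger where trust comes from
# the chain itself rather than from one institution keeping the books.

import hashlib

def block_hash(prev_hash: str, entry: str) -> str:
    """Hash each entry together with the hash of the previous block."""
    return hashlib.sha256((prev_hash + entry).encode()).hexdigest()

def build_ledger(entries: list[str]) -> list[tuple[str, str]]:
    ledger, prev = [], "genesis"
    for entry in entries:
        prev = block_hash(prev, entry)
        ledger.append((entry, prev))
    return ledger

def verify(ledger: list[tuple[str, str]]) -> bool:
    prev = "genesis"
    for entry, stored in ledger:
        prev = block_hash(prev, entry)
        if prev != stored:
            return False
    return True

ledger = build_ledger(["Alice pays Bob 10", "Bob pays Carol 4"])
assert verify(ledger)
ledger[0] = ("Alice pays Bob 100", ledger[0][1])  # tamper with an old entry
assert not verify(ledger)
```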
The fourth thing that is going to define our future is cybersecurity. We are somehow surprised at all the attacks, the cyberattacks we're constantly hearing about. Networks have always been attack vectors, so why should we be surprised that the network of the 21st century is an attack vector?
I don't care whether it was Native Americans following animal paths to attack the tribe next door or Caesar using a road network to conquer the world. Networks have always been attack vectors. We have a new network. Why are we surprised that it is the attack vector of the 21st century? The only thing that's surprising is that we didn't think about that ahead of time, and we're now playing whack-a-mole with how we deal with cybersecurity.
Joanne quoted the book as saying: "It is never the primary network that is transformational but is the secondary impact of that network," and I want to review four quick secondary impacts and their effect on us, and I'd be happy to talk about anything you want to talk about.
The first is that all of this data being created has, through some form of what I call "digital alchemy," turned information about each of your personal lives into a corporate asset. Just as pre-printing information was hoarded to allow people to exert power by denying information to others, we now have a handful of companies using the hoard of information they have collected about you to control and dominate markets. They also use that hoard of information to target you, because they are running what's called an "attention economy": the longer they keep you online, the more ads they can sell, and therefore they deliver to you things you want to hear, things you want to see, things that they know, as a result of monitoring your activities, you are interested in seeing. And that drives tribalism.
The basic human instinct is tribal. Democracy calls on us to rise above tribalism in the hope that together we can do something better for everybody, but technology encourages tribalism, and the economic effects of the technology discourage the "Hey, if we all pull together, we can lift ourselves up," and as a result eats away at the underpinnings of democracy.
As I said, we have the highest wealth concentration since the 1920s. Here's a statistic that will blow you away: When most of us in this room were born, the expectation was that you would grow up and earn more than your parents. A recent study shows that for those born in 1945 there was a 90 percent probability that you would earn more than your parents. For those born in 1955, that fell to 70 percent; in 1965, to 59 percent; in 1975, to 57 percent; and if you were born in 1985 there is only a 50 percent probability that you will end up earning more than your parents.
The dichotomy in economic opportunity, the focusing and targeting of the information that has been collected about each of us is acting to destroy what our founders called E pluribus unum, "out of many, one," and to produce the sense of anxiety and insecurity that we discussed at the outset.
So, should we be surprised that that insecurity, producing a demand for fast answers, has resulted in the growth of authoritarian governments across the world? It used to be that liberal democracy was on the rise; now authoritarianism is on the rise. Why? Because they say they've got answers.
Should we be surprised that we have slogans instead of solutions? Watch what our friends in the United Kingdom are going through right now, and you shake your head and say, "How could this slogan of Brexit be so destructive?" And then we realize: Well, let's see. We haven't got many stones to throw ourselves if we're sitting here obsessing about a wall and the president talking about shutting down trade with our largest trading partner, which will have an immediate impact on prices and the economy of this country.
The conclusion of the book is that the promise of the "good old days," the idea that we can go back to something that was once halcyon, is fake history. What the history of our future really teaches us is that we move forward not by retreating but by responding. These forces that have been in motion for the last 600 years have now come together and created for us a new challenge, a new reality. Our challenge is: How do we use democracy, which is inherently a slow process, and capitalism, which is inherently an expansive process that we previously put guardrails around, and how do we say: "All right. Let's look."
When the Industrial Revolution came along we looked and said, "Hey, the rules that worked for agrarian mercantilism no longer work for industrial capitalism," and we created anti-trust rules, consumer protection rules, worker protection rules, and guardrails on capitalism that allowed capitalism to succeed and in fact protected capitalism.
We now find ourselves in a situation where the rules that had worked for industrial capitalism are probably no longer adequate for Internet capitalism, and so our moment is not to flee but to stand up, recognize the challenge, recognize that we've been here before, recognize that all these challenges that we're having today, they probably aren't as rough as the challenges of history.
But what history does teach us is the only thing that works is to engage them, and now it's our turn. We're at our terminus of history, and the book tees up what are we going to do about that.
Thank you, Joanne, and thank you, everybody.
QUESTIONS
QUESTION: Ron Berenbeim.
Your anecdote about Babbage's railroad ride reminded me of another railroad ride in Great Britain. This one was taken by a gentleman named A. C. Pigou, who was a Cambridge economist. As he saw the railroad going high-speed, whatever it was then, he saw sparks flying off the track, and he said: "Suppose one of those sparks burns down a farmhouse." He said, "Clearly the price I paid for this railroad ticket doesn't represent the total cost to society of that ticket." Thus was born the idea of social costs as well as economic costs and the Pigovian tax.
I realize you've touched on a lot of these issues and so on, but how do you see us recognizing and developing metrics for the social costs of the Internet or whatever it is you want to call it and for taxing the negative outcomes?
TOM WHEELER: Wow, what a great question. How much time do we have, Joanne?
JOANNE MYERS: As long as it takes you to answer.
TOM WHEELER: That actually is what I'm working on right now. As Joanne said, I'm at Brookings. I'm also up at the Kennedy School at Harvard, where I'm doing research and writing materials on this issue.
It seems to me that, first of all, your example is spot-on. I talk in the book about how, when the railroad went through farmland, it was the cinders coming out of the smokestack that set fire to houses and barns and hayricks and that sort of thing. And who had responsibility? An English common law concept was applied, one coming literally out of feudalism, called the duty of care. The duty of care says that you have the responsibility to anticipate and mitigate things that are going to happen.
So the solution to the smokestack was to put a screen across the top to stop the cinders. Where's the "screen" in the information economy that exists today?
I ran a software company at one point in time. I can tell you that the sheer excitement of, "Hey, can we build—" and "Oh, my god, we got it to work!" totally overpowers "And what's the impact going to be?"
One of our challenges is how do we institute the duty of care concept in this new environment? I think we have to do it legally. As I say, it is the basis for negligence in the law, it is the basis for fiduciary responsibilities in the law.
One of the things that I had to skim over is that the folks who are the pioneers have always made the rules. I don't care whether it's our friend Mr. Carnegie or Mark Zuckerberg. They made the rules until such time as those rules began to infringe on the rights of individuals and the public interest, and then the people stepped forward. I think we're at that point now.
One of the concepts that we have to have in the new rules for the new digital era is the point that you raise about a duty of care.
QUESTION: Anthony Faillace.
I want you to refine that a little bit and ask you, if you're advising the president of the United States, and he asks you, "What are the two or three changes to this regulatory architecture we should make?" what do you tell him?
We've seen in the process—Elizabeth Warren has said let's break up tech companies; Mark Zuckerberg has even said, "Well, maybe we do need regulation." So the dialogue is moving in this direction. What are the concrete changes we make?
TOM WHEELER: I'm reminded of H. L. Mencken's old observation. He said, "For every problem, there's a quick, easy, beautiful, simple, and wrong solution." So what we don't want to do is rush after the quick, easy, and wrong solution.
God bless Elizabeth Warren and her suggestion. I have a piece coming out today at Brookings that says that there's a difficulty with anti-trust enforcement in this era, which basically is that it takes so long—the AT&T breakup was a 10-year process—that by the time you get to the end of it the issue you're dealing with is no longer an issue; technology and time have passed it by.
Second is that, thanks to the Federalist Society, the Chicago school dominates anti-trust law and the judiciary, and that approach is based on consumer harm measured by price, not on marketplace harm, the Brandeisian concept. So we probably couldn't get there even after the long 10-year fight.
Therefore, I'm suggesting that what we need to do is not break up but break open. What is it that gives these companies this power in the marketplace? It's their hoarding of this data. So how do we open that up so that competitors have access to that data, for compensation? Which goes back to the second point.
You asked what would I say to the president. There are three things: One, a duty of care; two, a duty to deal, another common law concept. It's what I tried to put in place with the open Internet rules, net neutrality. If we go back again to the roots of common law, the guy running the ferry across the river had a duty to take everybody, not for free, but a duty to take everybody. The guy running the tavern had a duty to provide shelter and food for everybody.
When the telegraph came along, the Congress in 1860 in the Pacific Telegraph Act, Section 3, said that it would be non-discriminatory. There was a duty to deal, that everybody had to get on. The telephone had the same kind of concept. Everybody had to have a right to get on.
When the Internet comes along, the companies say: "Oh, no. Digital is different." We said, "No." We imposed the same concept, the concept of net neutrality, on the Internet service providers, and the Trump administration has taken that away. So the second thing would be a duty to deal.
The third thing that I would recommend is that we need to rethink how we do government, how we do this oversight.
I'll give you a specific example insofar as net neutrality is concerned. We had a concept in net neutrality: I tried to put regulatory flexibility into the net neutrality rules because we didn't know what was coming. So I said: "Here are the four corners of the rules. It's based on just and reasonable activities in the marketplace, and we will empower the FCC to be a referee on the field, to throw a flag when new developments don't meet that test." And the companies came in to me and said: "Oh! That's terrible. That's regulatory uncertainty. We don't know what to expect."
Of course, the day before they had been in saying, "Oh, you can't have strict rules because that thwarts innovation." Well, you can't have both sides of that argument.
So we need to have an agile form of government. As I said, I ran a software company at one point in time. You used to build software like you built cars at the Ford assembly plant: You did this and did this and did this, and it rolled off the assembly line, and it was called the "waterfall" because it went along and fell off like a waterfall.
Now software is never done. You're constantly getting updates to your iPhone. You're always getting updates to your operating system on your computer. It always has to change to reflect the realities of what's going on.
Government needs to do that as well. I tried it three specific times—net neutrality, privacy, and cyber, all of which have been repealed by the Trump administration because of regulatory uncertainty.
But we shouldn't be surprised, and I'll shut up here once I—this is the third thing, and it's crucial, and it's some of the work that I'm doing now. Our government was structured in the industrial era on industrial management concepts. What were management concepts in the industrial era? You had a guy on the floor who followed a set of rules, who was overseen by a supervisor, who had a set of rules to make sure he was following the rules, who was overseen by a manager, who had a set of rules to make sure that these guys—and we're surprised that we have a rules-based bureaucratic structure in our government? Because that's the way everything got managed, except that's not the way the economy gets managed today.
The economy gets managed through agile management, and we need to bring that kind of concept into how we oversee that economy.
So those are the three things that I would say, and, by the way, have said.
QUESTION: Susan Gitelson. Thank you for being so provocative. Fantastic.
TOM WHEELER: Thank you, Susan.
QUESTIONER [Ms. Gitelson]: At the outset you said that we're living in an age of great inequality, more than in a hundred years or so. Using the history of our future, how did we get here, and what do you expect to be in the future?
TOM WHEELER: We got here because the pioneers get to make the rules, and we got here because there's a difference between the experience in the industrial era and the experience today. If your buggy whip factory was put out of business in the industrial era, you didn't really need a new set of skills to go down the street and go to work for Henry Ford.
Today there are two types of skill sets, those who can deal with technology and those who can't. The first get paid well; the second don't. Those who have invested in and developed the first kind, the technology-based companies, themselves get rewarded.
We need to come to grips with how we are going to make sure that in a rapidly changing world we have people who are capable of moving from one side to the other. One of the things I talk about in the book is that Randall Stephenson, the CEO of AT&T, told his people that they have to spend five or ten hours a week—a week—in training in order to keep their jobs. This is in addition to their 40-hour week. But things are changing so fast that if you don't keep up with the technology you will not be employable.
So you ask yourself the question, and I ask this in the book: We went from the 60-hour week to the 40-hour week. Is it time to ask the question should we think about a 30-hour week onto which is added 10 hours of training so people are continually trained? And by the way, that other 10 hours ends up becoming an opportunity for others to have jobs.
We have to think our way through this process, and we have to think anew. The challenge is you don't solve new problems with old solutions.
The danger in this book, the thing that I worry about in this book and try to bend over backward to avoid is "Well, apply the lessons of history." You can't. "We did this in the industrial era, therefore, we do this now" doesn't work.
But just as we thought creatively about how to meet our challenges in the industrial era, we need to think creatively about how to meet our challenges in this era, and we need to have that as a public debate—and back to your point, sir, Elizabeth Warren and others fortunately are starting to foment that discussion.
QUESTION: Bob Frye. Thank you very much.
You've said a lot of things today which are obviously very important to hear. What would you consider to be—because I think your book is a warning, a clear warning, dealing with tomorrow, with the future, in a way I think is analogous. Your book is like the cowcatcher.
TOM WHEELER: I'll take that, Bob. Thank you.
QUESTIONER [Mr. Frye]: So where do you go with that?
TOM WHEELER: Bob, this is actually what I lay in bed last night thinking about: the kinds of things that I talked about at the end of my presentation, the kinds of things we just talked about here. I've got to do another book that addresses that question, but this is what I'm working on at Brookings and Harvard, and this is what's exciting.
I don't want to become too Pollyannaish about this, folks, but I talk about how we're at the terminus of history. How lucky we are. My god, look at the challenges we get to deal with. And I quote in the book the wonderful line from Hamilton where Lin-Manuel Miranda has Elizabeth Schuyler sing, "Look around, look around, how lucky we are to be alive right now . . . revolution."
Okay. It's a question of what you are going to do. And Bob, I hope to be back at the Carnegie Council in the relatively near future to address that.
QUESTIONER [Mr. Frye]: I was going to ask what keeps you up at night.
TOM WHEELER: Now you know.
QUESTION: Thanks. My name is Teke Wiggin.
There seem to be two fears about automation these days, two consequences in particular: mass job displacement, but also permanent mass unemployment, which are different things. Do you think those fears are overblown?
It seems like there are periodic waves of fears about automation producing these two effects, and they've proven to be overblown in the past. Could you review briefly some of those peak periods where everyone was freaking out about automation?
TOM WHEELER: Thank you. Good question.
Very real, but we're in control. These things will only happen if we sit on our hands.
Quoted in the book is a speech that JFK made in the 1960s about the threat that computers were posing to workers. I had a meeting with the Argentinian secretary of communications in which his opening line was, "Well, what about the 47 percent of the people that an Oxford study says are going to be put out of work as a result of automation?"
My response was, again, what I said a minute ago: "It's only if we sit on our hands."
How many remember 2001: A Space Odyssey? Remember that movie? And the computer, HAL—by the way, how did it get the name "HAL"?
JOANNE MYERS: IBM.
TOM WHEELER: Bingo! It is the letters immediately preceding IBM.
And the computer HAL is going to take over this space mission. And there's one astronaut who is up and not put into hibernation, and that's Dave, and Dave and the computer HAL start going back and forth, and Dave finally does what? He unplugs HAL. We're still running this.
I don't know how many of you have read Sapiens: A Brief History of Humankind. Terrific book in which [Harari] observes that there are only two animals that have been successfully domesticated by man. One is the horse, and the other is the dog. And the difference between the horse and the dog is that the dog has also domesticated mankind. When my dog hops up on my bed at night, and I go, "Oh, you're the world's best dog," I'm saying to myself: "Now let's see. Who has domesticated whom here?" [Editor's note: see Yuval Harari's Carnegie Council talk about his sequel to Sapiens—Homo Deus: A Brief History of Tomorrow.]
It holds. We have to domesticate AI. We will if we are vigilant, if we don't sit on our hands, and that's our challenge.
JOANNE MYERS: I thank you so much for bringing the challenge to us.
TOM WHEELER: Thank you, everybody, very much.
JOANNE MYERS: And I just want to remind everyone your book is available. Thank you.