The Inevitable


  These organizing verbs represent the metachanges in our culture for the foreseeable near future. They are broad strokes already operating in the world today. I make no attempt to predict which specific products will prevail next year or the next decade, let alone which companies will triumph. These specifics are decided by fads, fashion, and commerce, and are wholly unpredictable. But the general trends of the products and services in 30 years are currently visible. Their basic forms are rooted in directions generated by emerging technologies now on their way to ubiquity. This wide, fast-moving system of technology bends the culture subtly, but steadily, so it amplifies the following forces: Becoming, Cognifying, Flowing, Screening, Accessing, Sharing, Filtering, Remixing, Interacting, Tracking, Questioning, and then Beginning.

  While I devote a chapter to each motion, they are not discrete verbs operating in solo. Rather they are highly overlapping forces, each codependent upon and mutually accelerating the others. It becomes difficult to speak of one without referring to the others at the same time. Increased sharing both encourages increased flowing and depends upon it. Cognifying requires tracking. Screening is inseparable from interacting. The verbs themselves are remixed, and all of these actions are variations on the process of becoming. They are a unified field of motion.

  These forces are trajectories, not destinies. They offer no predictions of where we end up. They tell us simply that in the near future we are headed inevitably in these directions.

  1

  BECOMING

  It’s taken me 60 years, but I had an epiphany recently: Everything, without exception, requires additional energy and order to maintain itself. I knew this in the abstract as the famous second law of thermodynamics, which states that everything is falling apart slowly. This realization is not just the lament of a person getting older. Long ago I learned that even the most inanimate things we know of—stone, iron columns, copper pipes, gravel roads, a piece of paper—won’t last very long without attention and fixing and the loan of additional order. Existence, it seems, is chiefly maintenance.

  What has surprised me recently is how unstable even the intangible is. Keeping a website or a software program afloat is like keeping a yacht afloat. It is a black hole for attention. I can understand why a mechanical device like a pump would break down after a while—moisture rusts metal, or the air oxidizes membranes, or lubricants evaporate, all of which require repair. But I wasn’t thinking that the nonmaterial world of bits would also degrade. What’s to break? Apparently everything.

  Brand-new computers will ossify. Apps weaken with use. Code corrodes. Fresh software just released will immediately begin to fray. On their own—nothing you did. The more complex the gear, the more (not less) attention it will require. The natural inclination toward change is inescapable, even for the most abstract entities we know of: bits.

  And then there is the assault of the changing digital landscape. When everything around you is upgrading, this puts pressure on your digital system and necessitates maintenance. You may not want to upgrade, but you must because everyone else is. It’s an upgrade arms race.

  I used to upgrade my gear begrudgingly (why upgrade if it still works?) and at the last possible moment. You know how it goes: Upgrade this and suddenly you need to upgrade that, which triggers upgrades everywhere. I would put it off for years because I had the experience of one “tiny” upgrade of a minor part disrupting my entire working life. But as our personal technology becomes more complex, more codependent upon peripherals, more like a living ecosystem, delaying upgrading is even more disruptive. If you neglect ongoing minor upgrades, the change backs up so much that the eventual big upgrade reaches traumatic proportions. So I now see upgrading as a type of hygiene: You do it regularly to keep your tech healthy. Continual upgrades are so critical for technological systems that they are now automatic for the major personal computer operating systems and some software apps. Behind the scenes, the machines will upgrade themselves, slowly changing their features over time. This happens gradually, so we don’t notice they are “becoming.”

  We take this evolution as normal.

  Technological life in the future will be a series of endless upgrades. And the rate of upgrades is accelerating. Features shift, defaults disappear, menus morph. I’ll open up a software package I don’t use every day expecting certain choices, and whole menus will have disappeared.

  No matter how long you have been using a tool, endless upgrades make you into a newbie—the new user often seen as clueless. In this era of “becoming,” everyone becomes a newbie. Worse, we will be newbies forever. That should keep us humble.

  That bears repeating. All of us—every one of us—will be endless newbies in the future simply trying to keep up. Here’s why: First, most of the important technologies that will dominate life 30 years from now have not yet been invented, so naturally you’ll be a newbie to them. Second, because the new technology requires endless upgrades, you will remain in the newbie state. Third, because the cycle of obsolescence is accelerating (the average lifespan of a phone app is a mere 30 days!), you won’t have time to master anything before it is displaced, so you will remain in the newbie mode forever. Endless Newbie is the new default for everyone, no matter your age or experience.

  * * *

  If we are honest, we must admit that one aspect of the ceaseless upgrades and eternal becoming of the technium is to make holes in our heart. One day not too long ago we (all of us) decided that we could not live another day unless we had a smartphone; a dozen years earlier this need would have dumbfounded us. Now we get angry if the network is slow, but before, when we were innocent, we had no thoughts of the network at all. We keep inventing new things that make new longings, new holes that must be filled.

  Some people are furious that our hearts are pierced this way by the things we make. They see this ever-neediness as a debasement, a lowering of human nobility, the source of our continual discontentment. I agree that technology is the source. The momentum of technologies pushes us to chase the newest, which are always disappearing beneath the advent of the next newer thing, so satisfaction continues to recede from our grasp.

  But I celebrate the never-ending discontentment that technology brings. We are different from our animal ancestors in that we are not content to merely survive, but have been incredibly busy making up new itches that we have to scratch, creating new desires we’ve never had before. This discontent is the trigger for our ingenuity and growth.

  We cannot expand our self, and our collective self, without making holes in our heart. We are stretching our boundaries and widening the small container that holds our identity. It can be painful. Of course, there will be rips and tears. Late-night infomercials and endless web pages of about-to-be-obsolete gizmos are hardly uplifting techniques, but the path to our enlargement is very prosaic, humdrum, and everyday. When we imagine a better future, we should factor in this constant discomfort.

  * * *

  A world without discomfort is utopia. But it is also stagnant. A world perfectly fair in some dimensions would be horribly unfair in others. A utopia has no problems to solve, but therefore no opportunities either.

  None of us has to worry about these utopia paradoxes, because utopias never work. Every utopian scenario contains self-corrupting flaws. My aversion to utopias goes even deeper. I have not met a speculative utopia I would want to live in. I’d be bored in utopia. Dystopias, their dark opposites, are a lot more entertaining. They are also much easier to envision. Who can’t imagine an apocalyptic last-person-on-earth finale, or a world run by robot overlords, or a megacity planet slowly disintegrating into slums, or, easiest of all, a simple nuclear Armageddon? There are endless possibilities for how modern civilization collapses. But just because dystopias are cinematic and dramatic, and much easier to imagine, that does not make them likely.

  The flaw in most dystopian narratives is that they are not sustainable. Shutting down civilization is actually hard. The fiercer the disaster, the faster the chaos burns out. The outlaws and underworlds that seem so exciting at “first demise” are soon taken over by organized crime and militants, so that lawlessness quickly becomes racketeering and, even quicker, racketeering becomes a type of corrupted government—all to maximize the income of the bandits. In a sense, greed cures anarchy. Real dystopias are more like the old Soviet Union than like Mad Max: They are stiflingly bureaucratic rather than lawless. Ruled by fear, their society is hobbled except for the benefit of a few, but, as with the sea pirates two centuries ago, there is far more law and order than appears. In fact, in real broken societies, the outrageous outlawry we associate with dystopias is not permitted. The big bandits keep the small bandits and dystopian chaos to a minimum.

  However, neither dystopia nor utopia is our destination. Rather, technology is taking us to protopia. More accurately, we have already arrived in protopia.

  Protopia is a state of becoming, rather than a destination. It is a process. In the protopian mode, things are better today than they were yesterday, although only a little better. It is incremental improvement or mild progress. The “pro” in protopian stems from the notions of process and progress. This subtle progress is not dramatic, not exciting. It is easy to miss because a protopia generates almost as many new problems as new benefits. The problems of today were caused by yesterday’s technological successes, and the technological solutions to today’s problems will cause the problems of tomorrow. This circular expansion of both problems and solutions hides a steady accumulation of small net benefits over time. Ever since the Enlightenment and the invention of science, we’ve managed to create a tiny bit more than we’ve destroyed each year. But that few percent positive difference is compounded over decades into what we might call civilization. Its benefits never star in movies.

  Protopia is hard to see because it is a becoming. It is a process that is constantly changing how other things change, and, changing itself, is mutating and growing. It’s difficult to cheer for a soft process that is shape-shifting. But it is important to see it.

  Today we’ve become so aware of the downsides of innovations, and so disappointed with the promises of past utopias, that we find it hard to believe even in a mild protopian future—one in which tomorrow will be a little better than today. We find it very difficult to imagine any kind of future at all that we desire. Can you name a single science fiction future on this planet that is both plausible and desirable? (Star Trek doesn’t count; it’s in space.)

  There is no happy flying-car future beckoning us any longer. Unlike the last century, nobody wants to move to the distant future. Many dread it. That makes it hard to take the future seriously. So we’re stuck in the short now, a present without a generational perspective. Some have adopted the perspective of believers in a Singularity who claim that imagining the future in 100 years is technically impossible. That makes us future-blind. This future-blindness may simply be the inescapable affliction of our modern world. Perhaps at this stage in civilization and technological advance, we enter into a permanent and ceaseless present, without past or future. Utopia, dystopia, and protopia all disappear. There is only the Blind Now.

  The other alternative is to embrace the future and its becoming. The future we are aimed at is the product of a process—a becoming—that we can see right now. We can embrace the current emerging shifts that will become the future.

  The problem with constant becoming (especially in a protopian crawl) is that unceasing change can blind us to its incremental changes. In constant motion we no longer notice the motion. Becoming is thus a self-cloaking action often seen only in retrospect. More important, we tend to see new things from the frame of the old. We extend our current perspective to the future, which in fact distorts the new to fit into what we already know. That is why the first movies were filmed like theatrical plays and the first VRs shot like movies. This shoehorning is not always bad. Storytellers exploit this human reflex in order to relate the new to the old, but when we are trying to discern what will happen in front of us, this habit can fool us. We have great difficulty perceiving change that is happening right now. Sometimes its apparent trajectory seems impossible, implausible, or ridiculous, so we dismiss it. We are constantly surprised by things that have been happening for 20 years or longer.

  I am not immune from this distraction. I was deeply involved in the birth of the online world 30 years ago, and a decade later the arrival of the web. Yet at every stage, what was becoming was hard to see in the moment. Often it was hard to believe. Sometimes we didn’t see what was becoming because we didn’t want it to happen that way.

  We don’t need to be blind to this continuous process. The rate of change in recent times has been unprecedented, which caught us off guard. But now we know: We are, and will remain, perpetual newbies. We need to believe in improbable things more often. Everything is in flux, and the new forms will be an uncomfortable remix of the old. With effort and imagination we can learn to discern what’s ahead more clearly, without blinders.

  Let me give you an example of what we can learn about our future from the very recent history of the web. Before the graphic Netscape browser illuminated the web in 1994, the text-only internet did not exist for most people. It was hard to use. You needed to type code. There were no pictures. Who wanted to waste time on something so boring? If it was acknowledged at all in the 1980s, the internet was dismissed as either corporate email (as exciting as a necktie) or a clubhouse for teenage boys. Although it did exist, the internet was totally ignored.

  Any promising new invention will have its naysayers, and the bigger the promises, the louder the nays. It’s not hard to find smart people saying stupid things about the web/internet on the morning of its birth. In late 1994, Time magazine explained why the internet would never go mainstream: “It was not designed for doing commerce, and it does not gracefully accommodate new arrivals.” Wow! Newsweek put the doubts more bluntly in a February 1995 headline: “The Internet? Bah!” The article was written by an astrophysicist and network expert, Cliff Stoll, who argued that online shopping and online communities were an unrealistic fantasy that betrayed common sense. “The truth is no online database will replace your newspaper,” he claimed. “Yet Nicholas Negroponte, director of the MIT Media Lab, predicts that we’ll soon buy books and newspapers straight over the Internet. Uh, sure.” Stoll captured the prevailing skepticism of a digital world full of “interacting libraries, virtual communities, and electronic commerce” with one word: “baloney.”

  This dismissive attitude pervaded a meeting I had with the top leaders of ABC in 1989. I was there to make a presentation to the corner-office crowd about this “Internet Stuff.” To their credit, the executives of ABC realized something was happening. ABC was one of the top three mightiest television networks in the world; the internet at that time was a mere mosquito in comparison. But people living on the internet (like me) were saying it could disrupt their business. Still, nothing I could tell them would convince them that the internet was not marginal, not just typing, and, most emphatically, not just teenage boys. But all the sharing, all the free stuff seemed too impossible to business executives. Stephen Weiswasser, a senior VP at ABC, delivered the ultimate put-down: “The Internet will be the CB radio of the ’90s,” he told me, a charge he later repeated to the press. Weiswasser summed up ABC’s argument for ignoring the new medium: “You aren’t going to turn passive consumers into active trollers on the internet.”

  I was shown the door. But I offered one tip before I left. “Look,” I said. “I happen to know that the address abc.com has not been registered. Go down to your basement, find your most technical computer geek, and have him register abc.com immediately. Don’t even think about it. It will be a good thing to do.” They thanked me vacantly. I checked a week later. The domain was still unregistered.

  While it is easy to smile at the sleepwalkers in TV land, they were not the only ones who had trouble imagining an alternative to couch potatoes. Wired magazine did too. I was a cofounding editor of Wired, and when I recently reexamined issues of Wired from the early 1990s (issues that I’d proudly edited), I was surprised to see them touting a future of high production-value content—5,000 always-on channels and virtual reality, with a sprinkling of bits of the Library of Congress. In fact, Wired offered a vision nearly identical to that of internet wannabes in the broadcast, publishing, software, and movie industries, like ABC. In this official future, the web was basically TV that worked. With a few clicks you could choose any of 5,000 channels of relevant material to browse, study, or watch, instead of the TV era’s five channels. You could jack into any channel you wanted from “all sports all the time” to the saltwater aquarium channel. The only uncertainty was, who would program it all? Wired looked forward to a constellation of new media upstarts like Nintendo and Yahoo! creating the content, not old-media dinosaurs like ABC.