We all know that technological innovations have the power to influence geopolitics, but we tend to forget how quickly their returns diminish. The internal combustion engine revolutionized transportation and transformed society. Though the engine itself dates to the late 19th century, its transformative moment came in the 1910s, when Henry Ford paired it with another revolutionary technology, the moving assembly line, and a revolutionary business model, the auto dealership. These innovations transformed not just the United States but much of the world and drove human productivity for half a century.

By about 1965, the internal combustion engine, the automobile, the assembly line and the dealership model had matured. While still indispensable, they were no longer cutting-edge. Most important, they were not generating the vast increase in productivity or the transformation of culture that they once had. The basic architecture of the automobile was complete, from automatic transmissions to air conditioning. The rest was marketing. Neurotic, narcissistic and brilliant, Ford and his generation of revolutionary industry founders were replaced by a generation of managers tasked not with transforming the world but with managing the machine that had transformed it. Process replaced disruption.

The car became a commodity. Commodities do not produce the transformation on which American life is based. Over time, the U.S. auto industry had to deal with competing commodities produced by countries that thrive on them. German Volkswagens and Japanese Toyotas poured onto the U.S. market, while U.S. auto industry managers remained frozen in their processes, assuming their product was eternal. By the 1970s, Chrysler was crumbling. It is in this context that we consider Apple Inc., the tech giant that reported disappointing sales last quarter, causing its stock price to drop.

The microchip, a core technology that has been at least as transformative as the combustion engine, is commercially about 50 years old. (I held a Hewlett-Packard microchip-driven calculator in my hand in 1972.) The surge in American productivity driven by the microchip ended in the last decade. The products that flowed from microchip technology – the computer and the cellphone among them – remain indispensable to society, but they have already transformed our lives.

In the early years of the auto industry, when new models rolled out of Detroit, everyone was transfixed. Now, a model year means almost nothing. When cellphones first emerged and when Apple created devices that received email, took photographs and provided navigation, in addition to taking calls, consumers greeted each new generation with genuine, almost hysterical anticipation. Yet the microchip, the cellphone and the smartphone also have become commodities. Countries that excel in commodity production have taken over. Most of Apple’s products are manufactured in those places.

It’s been a while since Apple or any of its peers has introduced a cellphone feature that has revolutionized the way we live. Jobs, Gates and Grove – the original tech disruptors – are gone or have moved on. Computing and cellphones are no edgier today than an Oldsmobile was in 1965. The technology is simply part of our lives; the launch of Apple’s $1,000 cellphone met with the same tepid response that greeted the Edsel in 1958. The Edsel was just a car. New iPhones are just phones, and these days nobody lines up at the Apple Store to buy them.

The endless, merciless search for the next big thing that will change everything is what drives the United States. When it’s found, it does just that. But America is easily bored and takes miracles as commonplace. It has no pity for the old, and the old cannot grasp that their time has passed. This is what is happening to the U.S. computer industry, like the auto industry before it. The latter saw itself as the center of American culture and was shocked when the culture moved on. It couldn’t adjust. Apple is now feeling that same pressure and can’t imagine that it will soon be as uninteresting as a minivan. Today, we hear from the computer industry the first echoes of the appeals the auto industry made in the 1970s: The federal government must do something, since without our industry, America is not America. The microchip economy will demand that federal policy focus on its needs. It might work for a while, but the need even to ask is the first sign of commoditization.

This progression – disruption, maturity, decline – has played out in industries from railroads to electricity. Once revolutionary, each became a commodity, no longer the source of a tenfold return on investment. The U.S. rapidly absorbs core technologies, and they become quotidian.

There is an interesting quirk to this cycle. We can sense that something new is coming, but we all assume it will be more of the same. The first half of the 20th century centered on transportation – cars, planes and rockets. In the 1960s, visions for the future included rockets flying between cities and to the moon, cars that were also boats or planes, personal helicopters, or personal jetpacks reminiscent of “The Jetsons.” Imagination is constrained by the moment. The most radical thought we have is often that the future will be like the present, only more so. It never is.

Silicon Valley and its observers are convinced that artificial intelligence will sustain microchip-driven industry. But the one thing we don’t understand is intelligence, and creating an analog to something we don’t understand is a dubious venture. That humans will create enormously more powerful computers is almost certain. Like intercity rockets, though, this endeavor raises the question of cost. How much will it cost to save hours of airplane flight? What is the cost of replacing a human’s mind with a simulation? I could be wrong – AI may live up to all the promises. If that’s the case, it will revolutionize not only how we live but the very rhythm of technology.

When a radical breakthrough reaches maturity, we await its next act – but there usually isn’t one. I remember holding that HP calculator without the slightest idea of how it would transform the way we live. I looked to the transportation industry to whisk me to Paris for a night and back home for finals. I wish it had.

Apple is living out an old story in American society, but it’s only at the beginning. We will expect more disruption from the company and its industry. There’s much money yet to be made. But the next extraordinary moment will not come from the tech industry. The managers must maintain stability. The next wave will surprise us, and we will not recognize its importance until years after it emerges.

The United States constantly reinvents its past and longs for the future. That is the foundation of its power. It has no mercy on what is passing away and worships that which is coming to be. American nostalgia has many shapes; for me, it is the BlackBerry. Its successor, the iPhone, won the battle (if not my heart). The iPhone now begins its slow transformation from totem to automatic transmission. It will be something that was once exciting, something that still exists but that no one really cares about.

George Friedman
George Friedman is an internationally recognized geopolitical forecaster and strategist on international affairs and the founder and chairman of Geopolitical Futures. Dr. Friedman is a New York Times bestselling author, and his most popular book, The Next 100 Years, is kept alive by the prescience of its predictions. Other best-selling books include Flashpoints: The Emerging Crisis in Europe, The Next Decade, America’s Secret War, The Future of War and The Intelligence Edge. His books have been translated into more than 20 languages. Dr. Friedman has briefed numerous military and government organizations in the United States and overseas and appears regularly as an expert on international affairs, foreign policy and intelligence in major media. For almost 20 years before resigning in May 2015, Dr. Friedman was CEO and then chairman of Stratfor, a company he founded in 1996. Friedman received his bachelor’s degree from the City College of the City University of New York and holds a doctorate in government from Cornell University.