Getting Ahead; White Collars Turn Blue

SYNOPSIS: Writing as if from the year 2096, Krugman looks back on how the American economy changed over the 21st century -- this article is kind of creepy if you ask me

When looking backward, you must always be prepared to make allowances: it is unfair to blame late-20th-century observers for their failure to foresee everything about the century to come. Long-term social forecasting is an inexact science even now, and in 1996 the founders of modern nonlinear socioeconomics were obscure graduate students. Still, many people understood that the major forces driving economic change would be the continuing advance of digital technology and the spread of economic development throughout the world; in that sense, there were no big surprises. The puzzle is why the pundits of the time completely misjudged the consequences of those changes.

Perhaps the best way to describe the flawed vision of fin de siècle futurists is to say that, with few exceptions, they expected the coming of an "immaculate" economy -- one in which people would be largely emancipated from any grubby involvement with the physical world. The future, everyone insisted, would bring an "information economy" that would mainly produce intangibles. The good jobs would go to "symbolic analysts," who would push icons around on computer screens; knowledge, rather than traditional resources like oil or land, would become the primary source of wealth and power. But even in 1996 it should have been obvious that this was silly. First, for all the talk about information, ultimately an economy must serve consumers -- and consumers want tangible goods. The billions of third-world families that finally began to have some purchasing power when the 20th century ended did not want to watch pretty graphics on the Internet. They wanted to live in nice houses, drive cars and eat meat.

Second, the Information Revolution of the late 20th century was a spectacular but only partial success. Simple information processing became faster and cheaper than anyone had imagined, but the once-confident artificial intelligence movement went from defeat to defeat. As Marvin Minsky, one of the movement's founders, despairingly remarked, "What people vaguely call common sense is actually more intricate than most of the technical expertise we admire." And it takes common sense to deal with the physical world -- which is why, even at the end of the 21st century, there are still no robot plumbers.

Most important of all, the long-ago prophets of the information age seemed to have forgotten basic economics. When something becomes abundant, it also becomes cheap. A world awash in information is one in which information has very little market value. In general, when the economy becomes extremely good at doing something, that activity becomes less, rather than more, important. Late-20th-century America was supremely efficient at growing food; that was why it had hardly any farmers. Late-21st-century America is supremely efficient at processing routine information; that is why traditional white-collar workers have virtually disappeared.

These, then, were the underlying misconceptions of late-20th-century futurists -- misconceptions that blinded them to the five great economic trends that observers in 1996 should have foreseen.

Soaring Resource Prices

The first half of the 1990's was an era of extraordinarily low prices for raw materials. In retrospect, it is hard to see why anyone thought that situation would last. When two billion Asians began to aspire to Western levels of consumption, it was inevitable that they would set off a scramble for limited supplies of minerals, fossil fuels and even food.

In fact, there were danger signs as early as 1996. A surge in gasoline prices during the spring of that year was prompted by an unusually cold winter and miscalculations about Middle East oil supplies. Although prices soon subsided, the episode should have reminded people that industrial nations were once again vulnerable to disruptions of oil supplies. But the warning was ignored.

Quite soon, however, it became clear that natural resources, far from becoming irrelevant, had become more crucial. In the 19th century, great fortunes were made in heavy industry; in the late 20th, they were made in technology; today's super-rich are, more frequently, those who own prime land or mineral rights.

The Environment as Property

In the 20th century, people used some quaint expressions -- "free as air," "spending money like water" -- as if the supplies of air and water were unlimited. But in a world where billions of people can afford cars, vacations and food in plastic packages, the limited carrying capacity of the environment has become perhaps the single most important constraint on the standard of living.

By 1996, it was obvious that one way to cope with environmental limits was to use market mechanisms. In the early 1990's, the Government began to allow electric utilities to buy and sell rights to emit certain kinds of pollution; the principle was extended in 1995 when the Government began auctioning rights to the electromagnetic spectrum. Today, of course, practically every environmentally harmful activity carries a hefty price tag. It is hard to believe that as late as 1995, an ordinary family could fill up a Winnebago with $1-a-gallon gasoline, then pay only $5 for admission to Yosemite. Today, that trip would cost about 15 times as much, even after adjusting for inflation.

Once governments got serious about making people pay for pollution and congestion, income from environmental licenses soared. License fees now account for more than 30 percent of the gross domestic product, and have become the main source of Government revenue; after repeated reductions, the Federal income tax was finally abolished in 2043.

The Rebirth of the Big City

During the second half of the 20th century, the densely populated, high-rise city seemed to be in unstoppable decline. Modern telecommunications eliminated much of the need for physical proximity in routine office work, leading more and more companies to shift back-office operations to suburban office parks. It seemed as if cities would vanish and be replaced with a low-rise sprawl punctuated by an occasional cluster of 10-story office towers.

But this proved transitory. For one thing, high gasoline prices and large fees for environmental licenses made a one-person, one-car commuting pattern impractical. Today, the roads belong mostly to hordes of share-a-ride minivans efficiently routed by computers. Moreover, the jobs that had temporarily flourished in the suburbs -- mainly office work -- were eliminated in vast numbers beginning in the mid-90's. Some white-collar jobs migrated to low-wage countries; others were taken over by computers. The jobs that could not be shipped abroad or be handled by machines were those that required a human touch -- face-to-face interaction, or hands-on work with physical materials. In short, they were jobs done best in dense urban areas, places served by what is still the most effective mass-transit system yet devised: the elevator.

Here again, there were straws in the wind. At the beginning of the 1990's, there was speculation about which region would become the center of the ballooning multimedia industry. Would it be Silicon Valley? Los Angeles? By 1996, the answer was clear. The winner was -- Manhattan, whose urban density favored the personal interaction that turned out to be essential. Today, of course, Manhattan boasts almost as many 200-story buildings as St. Petersburg or Bangalore.

The Devaluation of Higher Education

In the 1990's, everyone believed that education was the key to economic success. A college degree, even a postgraduate degree, was essential for anyone who wanted a good job as one of those "symbolic analysts."

But computers are proficient at analyzing symbols; it is the messiness of the real world that they have trouble with. Furthermore, symbols can be transmitted easily to Asmara or La Paz and analyzed there for a fraction of the cost in Boston. Therefore, many of the jobs that once required a college degree have been eliminated. The others can be done by any intelligent person, whether or not she has studied world literature.

This trend should have been obvious in 1996. Even then, America's richest man was Bill Gates, a college dropout who did not need a lot of formal education to build the world's most powerful information technology company.

Or consider the panic over "downsizing" that gripped America in 1996. As economists quickly pointed out, the rate at which Americans were losing jobs in the 90's was not especially high by historical standards. Downsizing suddenly became news because, for the first time, white-collar, college-educated workers were being fired in large numbers, even while skilled machinists and other blue-collar workers were in demand. This should have signaled that the days of ever-rising wage premiums for people with higher education were over. Somehow, nobody noticed.

Eventually, the eroding payoff of higher education created a crisis in education itself. Why should a student put herself through four years of college and several years of postgraduate work to acquire academic credentials with little monetary value? These days, jobs that require only 6 or 12 months of vocational training -- paranursing, carpentry, household maintenance and so on -- pay nearly as much as, if not more than, jobs that require a master's degree, and more than those requiring a Ph.D.

So enrollment in colleges and universities has dropped almost two-thirds since its peak at the turn of the century. The prestigious universities coped by reverting to an older role. Today a place like Harvard is, as it was in the 19th century, more of a social institution than a scholarly one -- a place for children of the wealthy to refine their social graces and befriend others of their class.

The Celebrity Economy

This century's last great trend was noted by acute observers in 1996, yet most people failed to appreciate it. While business gurus were proclaiming the new dominance of creativity and innovation over mere production, the growing ease with which information was transmitted and reproduced made it harder for creators to profit from their creations. Nowadays, if you develop a marvelous piece of software, everyone will have downloaded a free copy from the Net the next day. If you record a magnificent concert, bootleg CD's will be sold in Shanghai next week. If you produce a wonderful film, high-quality videos will be available in Mexico City next month.

How, then, could creativity be made to pay? The answer was already becoming apparent a century ago: creations must make money indirectly by promoting sales of something else. Just as auto makers used to sponsor grand prix racers to spice up the image of their cars, computer manufacturers now sponsor hotshot software designers to build brand recognition for their hardware. The same is true for individuals. The royalties that the Four Sopranos earn from their recordings are surprisingly small; the recordings mainly serve as advertisements for their concerts. The fans attend these concerts not to appreciate the music (they can do that far better at home), but for the experience of seeing their idols in person. In short, instead of becoming a knowledge economy we became a celebrity economy.

Luckily, the same technology that has made it so hard to capitalize directly on knowledge has also created many more opportunities for celebrity. The 500-channel world is a place of many subcultures, each with its own heroes. Still, the celebrity economy has been hard on some people -- especially those with a scholarly bent. A century ago, it was actually possible to make a living as a more or less pure scholar. Now if you want to devote yourself to scholarship, there are only three choices. Like Charles Darwin, you can be born rich. Like Alfred Wallace, the less-fortunate co-discoverer of evolution, you can make your living doing something else and pursue research as a hobby. Or, like many 19th-century scientists, you can try to cash in on a scholarly reputation by going on the lecture circuit.

But celebrity, though more common, still does not come easily. That is why writing this article is such an opportunity. I actually don't mind my day job in the veterinary clinic, but I have always wanted to be a full-time economist; an article like this may be just what I need to make my dream come true.

Originally published in The New York Times, September 29, 1996.