Technology Makes Us Richer: The Paper-Bag Revolution

SYNOPSIS: Technological change is essential to prosperity, yet it arrives not in grand leaps but a little bit at a time.

Envision, if you can, a technology that sharply improves the efficiency with which goods can be delivered to the consumer; that, in the view of one prominent economist, is the "most effective innovation during the preceding decade in speeding up American retail sales"; that within only a few years of its introduction becomes a pervasive feature of American life.

Such a technology, according to the historian Daniel J. Boorstin, was the square-bottomed paper bag, invented circa 1870 by one Luther Childs Crowell. The modern paper bag was one of the thousands of inventions, from the sewing machine to the cotton gin to condensed milk, that made us the land of Yankee ingenuity. Other nations might have great scholars and scientists, but America was uniquely endowed with practical visionaries, people who could translate ideas into things that worked. Is America still that place? Do we still have the knack of making productive innovations?

This may seem an odd time to ask that question. Right now, America is feeling supremely confident about its technological prowess. After all, we invented the microprocessor, the personal computer and the Internet -- which President Jacques Chirac of France has glumly characterized as an "Anglo-Saxon" network. The world's second richest man (after the Sultan of Brunei) is a self-made American technology mogul, and 20-something Silicon Valley millionaires are our modern Horatio Alger stories. Surely the spirit of Luther Childs Crowell is with us still.

Yet only a few years ago, American technology seemed to have lost its sparkle. Growth in business productivity, the necessary condition for rising living standards, had slowed to a crawl in the early 1970's and showed no signs of reviving. Innovative products, from VCR's to numerically controlled machine tools, all seemed to be manufactured someplace else. Among those who studied American innovation, gloom was the rule. "Companies evidently find it difficult to design simple, reliable, mass-producible products." So declared "Made in America," the influential 1989 report of M.I.T.'s Commission on Industrial Productivity.

In retrospect, it seems clear that we were too hard on ourselves. But just as we were unduly pessimistic a few years ago, we may be giving ourselves too much credit right now.

There is no question that the prophets of the new technological age have a lot of big ideas. Some of them have also been making big bucks, as business has scrambled to buy the latest and snazziest technology (and as investors, believing in those ideas, have paid huge prices for the right to share in hypothetical future profits). But when it comes to the true bottom line -- the ability of technology to make us more productive and thereby to raise our standard of living -- it's usually not the big ideas but the way they are implemented that really makes the difference.

Remember that 19th-century America had no monopoly on the big technological ideas of its time -- steam power and electricity, gears and camshafts. What propelled us to the world's highest standard of living was the relentless ingenuity with which our inventors and entrepreneurs applied those technologies to small things, things that made workers more productive or domestic chores easier. Their innovations ranged from Crowell's paper bags (and Robert Gair's prefabricated cardboard box) to Frederick Taylor's discovery that steelworkers could do four times more work if you gave them the right shovels. American inventors understood that the road to progress lay not merely in pursuing grand visions but in making better mousetraps -- that is, mundane technologies for ordinary use.

The technologies that get all the attention these days, however, lack that down-to-earth sensibility. Indeed, in the technology that dominates the news -- personal computing -- some observers seriously argue that we have gone past the point of diminishing returns, that the pursuit of high tech for its own sake sometimes comes at the expense of actual utility.

Anyone who has been using personal computers for a number of years knows what these cynics mean. I got my first PC in 1984; it displayed text and numbers in any color you wanted, as long as it was green. Today I work on a computer that provides me with 50 times the power and 100 times the memory, offers me dazzling color and hundreds of nifty features, and in most respects is no more useful than my good old I.B.M. In fact, in some ways I'm worse off: I used to be able to start work in seconds, while now I have to wait several minutes until the machine announces its readiness with an annoying "ta-da!" And software that requires fancy screens and constantly spins the disk is purely counterproductive when you are working on battery power; like many experienced business travelers, I have carefully saved an old DOS-based word processor to use when airborne.

True, Microsoft and Intel have made a lot of money selling us these dubious improvements. But then the high-tech industry has raised the art of planned obsolescence to a level Detroit never dreamed possible. How many people were happy with DOS but had to move to Windows, and then to Windows 95 -- simply because they could no longer get software or maintenance on their perfectly functional old system?

Put this down to personal crotchetiness, if you like; but I am not alone in my sense that something has gone wrong. A recent Scientific American article quotes a Microsoft executive, Nathan Myhrvold, as declaring: "Software is a gas. It expands to fill its container. . . . After all, if we hadn't brought your processor to its knees, why else would you get a new one?" The article then offers evidence that making programs like word processors ever more feature-laden often makes them harder, not easier, to learn and use.

Many of those features are like the "fluted columns, ornamental arches, entablatures, curlicues and encrusted scrolls" that, according to Boorstin, cluttered up 19th-century British machinery -- and that Americans used to have the good sense to leave out. Meanwhile, we seem to have developed a sort of aristocratic disdain for the old-fashioned skills needed to make technology serve the needs of ordinary people. Michael Dertouzos, who heads M.I.T.'s Laboratory for Computer Science (and was one of the authors of "Made in America"), is a notable and courageous advocate of smaller, simpler, more useful machines. Yet in his forecast-cum-manifesto, "What Will Be," he consistently seems to brush off as a minor technicality -- a task for mere "electromechanical gadgets" -- the really hard business of making information technology do real-world work. My favorite example is his "auto-cook," in which a computer directs machinery to select foods from "specially fitted bins," then to "cut, mix and process" that food and finally to move the food into the right receptacles, heat them, perhaps stir, and so on. He finds all of this plausible and is surprised it hasn't happened yet. We may assume that Dertouzos doesn't cook, but even so, his surprise suggests either that he lives on a diet of unusually homogeneous shape and texture -- let them eat poundcake! -- or that he has forgotten just how much hard work is required to create gadgets that produce desired results, not in cyberspace but out here in the material world. From an economic point of view, gadgets are what it's all about.

Luckily, though they don't get much glory, the gadgets keep on coming. Taken one at a time, they don't amount to all that much. When a recent New York Times article described how new technology has helped produce rapid productivity growth in the railroad industry, the technology in question turned out to be a collection of little, unglamorous things: better wheels, better suspensions, better brakes, better motors. About the highest-tech item was a transponder-scanner combination not much more elaborate than the bar-code/scanner technology in use on every supermarket checkout line. It's hard to work up a millennial fervor over such nitty-gritty stuff, or even to get interested in it. But to the extent that we have actually experienced economic progress in recent years, the main reason is the cumulative impact of such mundane innovations -- not the largely useless profusion of features on PC's, let alone the still hypothetical wonders of the Internet.

So what will the future high-tech economy look like? Will the average worker put on virtual-reality goggles and maneuver through 3-D simulations? Probably not: even aside from the vertigo issue, evidence suggests that 3-D environments are confusing and should be avoided where possible. But we probably will live in a world where everyone has E-mail, a 30-year-old technology that was a plaything for nerds until people like Steve Dorner -- who developed the E-mail software called Eudora in classic American lone-inventor style -- made it easy for ordinary people to use. Will we call up our voice-recognition home computer and tell it to prepare beef Wellington for dinner? Not a chance. But when we stop at the supermarket on the way home from work, the little chip in the cereal box may be able to tell the smart cash register to bill our credit card as we head, without stopping, for the exit. And it won't even seem like a big deal, because taken by itself, it isn't.

In a way, the technological prophets are right: technology will transform the world and make us all richer in the process. But it will do it the old-fashioned way: a little bit at a time.

Originally published in The New York Times, September 28, 1997