DECEMBER 4, 2000 VOL. 156 NO. 22
The problem is that novelty, by definition, must be new. Innovation most often occurs when ideas or things are brought together in a way that never happened before, and when such juxtaposition occurs, the result is greater than the sum of the parts. One and one make three. A late 19th century engineer, Wilhelm Maybach, working for Daimler, puts together the newly invented perfume spray with the newly discovered gasoline and comes up with the carburetor. In 1823 Scottish chemist Charles Macintosh, working with a throwaway coal tar by-product, naphtha (used to clean out dyeing vats), stumbles across the fact that it will liquefy rubber. So he spreads the rubber between layers of cloth and invents the raincoat.
Accidents happen: aniline dye falls into a 19th century German researcher's petri dish that contains a bacterial culture, revealing that it preferentially stains and kills certain bacteria. The discovery eventually makes chemotherapy possible.
Serendipity intervenes: in the London summer of 1928, an open window in a hospital lab lets in a spore that settles on a staphylococcus-culture dish left unwashed. A mold grows and contaminates the staphylococcus. The lab user returns. Because he's bacteriologist Alexander Fleming, and because his lab has not been cleaned, penicillin is discovered.
Sometimes innovators don't even recognize the true import of their findings. In 1660s Germany, Magdeburg Mayor Otto von Guericke tries to solve the riddle of a compass needle that doesn't always point (as people thought it should) at the Pole Star. He rubs a model of the earth made of sulfur in order to attract his experimental compass needle. The rubbing produces a noise and a spark (which Guericke mentions in a casual footnote) that turns out to have been electricity.
This every-which-way process has been at work ever since the first prehistoric flint pebble was knapped into a butchering tool. Until about 500 years ago, however, innovation in a world moving at the speed of agriculture came infrequently, giving time for accommodation and a complacent sense of establishment. Then Columbus rediscovered America, and suddenly the rug was pulled out from under every form of Western authority. America had figured neither in the Bible nor in Aristotle, so what was it doing there? Then, within a few decades, returning with explorers from east and west came a flood of new plant and animal species, all of them also absent from the canonical lists. Help! If Holy Writ and the Big A were wrong, which way was up? For more than a century, things taxonomic went to hell in a handbasket while the European intelligentsia behaved like Chicken Little.
Life settled down to better than chaos only in the early 17th century, when French noodler René Descartes saved the day with a trick for thinking things through without screwing up: doubt what isn't self-evident, and reduce every problem to its simplest components. It is these twin tools of methodical doubt and reductionism that allow the editors of Time to produce this special section on invention. Because what Descartes began may now be coming to its final flowering.
Reductionism encouraged the fragmentation of knowledge under the rubric "Learn more and more about less and less." And as the drive to subdivide the natural world into smaller and smaller bits brought the development of the kind of tools needed for this enterprise (microscopes, telescopes, thermometers, dividing engines and all the other instrumentation required for measuring the new data), disciplines split into subdisciplines and sub-subdisciplines. As a result, isolated in their intellectual silos, scientists and their technological sidekicks literally "reduced" human knowledge to myriad, mutually incomprehensible pinpoints of niche expertise. No matter how esoteric a matter might be today, somebody, somewhere has spent years getting a Ph.D. in it.
Let's not knock this. Specialist invention has given millions of us the highest standard of living in history. And not just at the gizmo level. Now and again, deep in the epistemological woodwork, mind-numbingly arcane fields mix and mingle to produce cosmic upheaval with startling new realms of crossbred knowledge: astrophysics, biogeography, psychopharmacology, neurochemistry, paleobotany. This is to be expected. There are more scientists and technologists alive today than in the whole of previous history, and they have as much right to a happy and productive life as the rest of us.
If only we knew what they were going to do next. Because it doesn't always come up roses. James Watt's steam engine generated an Industrial Revolution that gave us a democracy of possessions and overpopulation, acid rain, the Love Canal, disappearing forests, tattered ozone layers and Scud missiles. Medical miracles lengthen our lives and take national welfare provision to the edge of crisis. Electronic globalization moves jobs elsewhere.
Up to now, this has been the way you were obliged to play the game. From a social point of view, innovation has been a case of having to take the occasionally rough with the generally smooth, partly because of the interactive nature of invention described above, and partly because for centuries we have lived in a directionless culture of scarcity. At no time were there either the need or the resources available to share the intellectual wealth beyond a select few. There was no point, for instance, in teaching literacy to the masses without an adequate number of printing presses to provide them with texts. In any case, prior to the Age of Exploration and the Industrial Revolution, there would have been little a literate majority could have done job-wise. Innovation has, all along, been an elite-driven, top-down thing. For much of history, the task of the individual innovator, working for some entity, state or private, and with privileged access to the contemporary store of facts, has been to satisfy the planning requirements of a king or CEO or politburo. The rest of us were not consulted.
To a large extent this is still so today. For any invention to succeed in the marketplace, it has to be some kind of surprise. Most of all for the competitor. And then for the consumer. As a result, invention (with Descartes's help) has given us a world nobody could have forecast and few can understand because of the esoteric process of innovation and the fact that it has never been subject to general social audit.
This may soon become a thing of the past. Information technology may be on the verge of removing the nitty-gritty, time-wasting, reductionist noodling from the human diary, leaving us freer to indulge in what our connective brains are best at: using the information webs to run connective scenarios based on what options for change present themselves at any given time, deciding what direction we want to go in and leaving it to the reductionist programs of our machines to get on with it.
One consequence might be that we finally recognize history's Great Innovators for what they were: the products of a culture of scarcity that taught us to regard them and their talents as rare. That they were (and are) specially talented is not in doubt here. But semi-intelligent information technology and the transition of a culture from one of information scarcity to one of abundance may for the first time allow all the rest of us into the invention game. Then technology will provide the nuts-and-bolts backup to give shape to whatever any imaginative brain can conceive. And there are more than 6 billion imaginative brains out there across the planet waiting for the opportunity. As this special report amply illustrates, it is already happening.
James Burke is the author of Circles and other books