James A. Glazier and Gemunu H. Gunaratne (review date February 1988)
Last Updated on June 7, 2022, by eNotes Editorial. Word Count: 1099
SOURCE: “The Fascinating Physics of Everyday Complexity, Beautifully Portrayed,” in Physics Today, Vol. 41, No. 2, February, 1988, pp. 79–80.
[In the following review, Glazier and Gunaratne offer a positive assessment of Chaos, which they praise as energetic and skillfully written.]
About 20 years ago, researchers in a variety of fields, ranging from economics and biology to mathematics and physics, began to question the assumption that complex behavior springs from random and essentially inexplicable causes. Often they were working on unfashionable or interdisciplinary topics and their results were studiously ignored by more mainstream researchers. However, starting about 10 years ago, and largely due to the efforts of mathematicians and theoretical physicists, these disparate groups began to recognize that they were all pursuing fundamentally similar problems describable in terms of the same set of unifying concepts. James Gleick's fascinating book [Chaos] is a history of the development of these ideas, now known generically as chaos.
Gleick makes a convincing case that the theory of chaos has begun a revolution in our understanding of nature comparable to that engendered by quantum mechanics, with “sensitive dependence on initial conditions” playing the role of the uncertainty principle. However, though the initial inspiration for the two movements came at roughly the same time, they developed at different rates. At the turn of the century, the mathematician Henri Poincaré provided the basic formalism necessary to understand complexity. The biologist D'Arcy Thompson in his great book On Growth and Form (1917) proposed that, independent of detailed causes, systems tend to group themselves into broad classes whose behaviors should be explicable in terms of simple mathematical models. Alan Turing in 1952 claimed that morphogenesis and the formation of complex structures were legitimate subjects for physical inquiry. Nevertheless, for some 60 years mainstream physics focused on a reductionist understanding of the basic constituents of matter rather than on an understanding of complexity. Only recently, with the success of the theory of critical phenomena, has the more holistic approach become widely acceptable.
In Thompson's view the understanding of form is visual and metaphoric, often closer to the process by which an artist imposes meaning on nature than to traditional analytic techniques. Researchers on chaos have followed both approaches, some studying the behavior of bifurcations of differential equations, others turning to computational modeling and simulation. The legitimation of computer models as a vehicle for understanding is one of the hallmarks of chaos research.
The theory of chaos has had many successes in treating specific systems, but its most revolutionary contribution has been to effect a change in attitude. It has vastly increased the scope of subjects deemed suitable for investigation. One no longer dismisses complex and irregular behaviors as uninteresting or noisy; instead, one seeks simple explanations. In this view the future of physics lies in the exploration of complex phenomena: the formation of patterns, the origin of turbulence, the functions of the brain and more. More traditional physicists may find the message threatening, but chaos presents the first real possibility that physics can be extended to the nonequilibrium complexity of the everyday world. One may draw an analogy between the development of chaos research and the birth of physics itself, in which seemingly disparate facts from a variety of fields were united to create a new science.
Gleick, a science writer for The New York Times, understands his subject and has a gift for simple, nonmathematical explanations that are accurate and insightful. Chaos is a scholarly book. Gleick's explanations of phase space and Poincaré sections should be read by all physicists. He has interviewed essentially everyone in the US who contributed to the development of the field, and he switches from discipline to discipline with apparent effortlessness. Even specialists may find that he has extended their knowledge of the boundaries of their fields.
Chaos is also a lively book, and one that is beautifully written. In fact, like any good novel, it is almost impossible to put down. Each chapter focuses on one or a few researchers and discusses their work as part of their lives. Some are fine musicians, some writers, some polymaths, some mountain-climbers. They all seem to share a love for art and culture that has allowed them to escape the restrictive patterns of thought that characterize their disciplines. Among this great range of personalities, as expected, some are attractive and some not. Gleick presents them fairly, without romanticization but always with respect. The portraits will ring true to those who know the subjects. The book is full of vignettes and stories that make both the scientists and the process of science come alive. The frustrations and setbacks are presented in as much detail as the triumphs. Particularly attractive is the chapter devoted to Edward Lorenz, the significance of whose work was not recognized for 15 years because of its extraordinary novelty.
To write a history of current events is a difficult task. It calls for evaluating the importance of individual pieces of research when their long-term significance is still unclear. This sort of judgment is very personal, and we agree with most of Gleick's choices. But we do have a few reservations. We would have liked to see a chapter on Hamiltonian chaos and on the contributions of the great Russian theorists Vladimir I. Arnol'd, Andrei N. Kolmogorov and R. L. Stratonovich. Though Gleick continually refers to the importance of Russian research, a sketch of one of these men and his work, even at second hand, would have rounded out the presentation. Neither the discoverers of fractals—Georg Cantor, Felix Hausdorff, Gaston Julia and Pierre Fatou—nor the important topic of pattern formation is sufficiently discussed. An experimentalist will search in vain for a mention of Geoffrey Taylor, whose work inspired much of the current experimental research on hydrodynamic chaos. Overall, however, these are minor quibbles and it is the breadth and evenhandedness of Gleick's presentation that make a lasting impression.
Science books that are accurate, nontechnical and exciting are extremely rare. Gleick's book is worthwhile reading for every physicist, chaos specialist or not. It cogently addresses many of the problems, both scientific and philosophical, that face contemporary physics. Since it is eminently readable and uses no mathematics it would be an excellent text to use in a history or philosophy of science class to introduce nonspecialists to the excitement of physics. The illustrations are familiar but well chosen, and Gleick provides information allowing one to duplicate many of the basic computations on a home computer. One could scarcely ask for a better popularization of contemporary research. It will be interesting to look back in ten years to see how Gleick's judgments have fared.
Pat Coyne (review date 27 May 1988)
SOURCE: “Whirling around in Whorls,” in New Statesman, May 27, 1988, pp. 31–32.
[In the following review of Chaos, Coyne finds Gleick's book adequate for lay readers, but notes shortcomings in Gleick's incomplete grasp of the topic and in his newspaper-style prose.]
Odd how the vocabulary of a newly credulous age seems to be invading even the best guarded of territories. Who would have thought that the words “catastrophe” and “chaos” could have become part of the common currency of that most self-consciously rational of disciplines, mathematics? Perhaps there is more to come? Could other current American predilections, like the musings of Nostradamus, be quantified, codified and find themselves in the textbooks?
Don't rule it out. James Gleick's book [Chaos] is an object lesson in how, beneath the best explored of surfaces, there can lurk not the occult but—much worse, as far as science is concerned—the unpredictable. Moreover we are not talking of some obscure branch of algebra. Gleick's claims for the mathematics of chaos resemble those of Douglas Adams for the number 42. “Life, the Universe and Everything” is there as Gleick asserts that chaos sheds light on stock markets, heart attacks, earthquakes, the rise and fall of animal populations, the mechanics of star clusters and much, much more.
From where does chaos spring? In a phrase: from non-linear equations. Gleick rightly identifies the tendency among scientists to try and reduce everything to the linear—i.e. formulae without awkward exponents such as x² or x³—which even a computer can solve. They share too an almost religious belief that in the real universe the natural state is equilibrium. Left to themselves things will more or less stay where they are.
On the whole, science has done pretty well by both tenets, but, as ever, complacency sets in and there is a tendency to think that they are universally applicable. Just as a (very) little thing like the precession of the orbit of Mercury (43 seconds of arc a century for anyone interested) prompted Einstein to formulate the theory of General Relativity, so seemingly trivial problems in apparently unrelated disciplines have stimulated the development of revolutionary techniques.
Take, for example, weather forecasting. It is commonplace that forecasts are accurate for, at best, no more than a few days, but until recently the general view has been that this is due to the complexity of the problem and the huge number of variables involved. Given a big enough computer and sufficient good data, ran the argument, accurate longer-range forecasts should be possible. Maybe not. As long ago as 1963, Edward Lorenz, a meteorologist at the Massachusetts Institute of Technology, had developed a very simple model of the atmosphere comprising no more than a dozen non-linear equations (compared to hundreds of the linear variety in conventional models), which seemed to mimic in essential details the behaviour of the weather. But the most intriguing aspect was the model's extraordinary sensitivity to its initial conditions. A difference in the fourth or fifth decimal place of, say, a temperature fed into it would lead to wildly different predictions for weather only days ahead. Mocking the scientific faith in equilibrium, it seemed that the tiniest cause could have massive effects. “It was as if,” said one mathematician, “the flap of a butterfly's wing in Brazil could set off a tornado in Texas.”
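The sensitivity the reviewer describes is easy to reproduce. The sketch below integrates the famous three-variable system Lorenz later distilled from his weather model, with his classic parameter values (the review mentions only the dozen-equation version; the three-equation system and the step size here are this editor's substitution for illustration). Two runs start one part in 100,000 apart:

```python
# Lorenz's three-equation convection model, integrated with fixed-step
# fourth-order Runge-Kutta. Classic parameters: sigma=10, rho=28, beta=8/3.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the state (x, y, z) by one RK4 step of size dt."""
    def deriv(s):
        x, y, z = s
        return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

    def nudge(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))

    k1 = deriv(state)
    k2 = deriv(nudge(state, k1, dt / 2))
    k3 = deriv(nudge(state, k2, dt / 2))
    k4 = deriv(nudge(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def separation(steps):
    """Distance between two runs whose initial x differs by 1e-5."""
    a = (1.0, 1.0, 1.0)
    b = (1.0 + 1e-5, 1.0, 1.0)
    for _ in range(steps):
        a, b = lorenz_step(a), lorenz_step(b)
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

early, late = separation(100), separation(2000)
```

After one time unit (100 steps) the runs are still practically indistinguishable; after twenty they bear no relation to each other, just as in Lorenz's print-outs.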
Little notice was taken of Lorenz for 15 years until suddenly, in the late 1970s, a whole host of problems started yielding to the techniques he had pioneered. The seemingly random fluctuations of fish numbers in a pond, the shape of snowflakes, the spread of epidemics, the movement of cotton prices over a century, the deadly propensity of the human heart to go into fibrillation rather than maintain a steady beat: all turned out to have aspects in common. That common factor was chaos; the potential to go suddenly from seeming equilibrium to wild fluctuation. Yet that chaos contained a curious sort of order.
Pictured graphically, the solutions to the chaotic equations resembled whorls, never repeating themselves but seemingly centred around a point or points, eventually dubbed “strange attractors.” In solving them, the mathematicians had to resort to techniques which had been regarded as the purest of pure mathematics. Fractional dimensions, for example, turned out to have physical meaning. Fractional dimensions? They lie between the point (dimension 0) and the line (dimension 1), or between the line and the three-dimensional solid. What, for example, are the dimensions of a ball of string? The one dimension of the (linear) string or the three of the ball? Mathematically, the answer turns out to be somewhere in between.
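The ball-of-string picture is informal, but for exactly self-similar shapes the fractional dimension has a precise value: log N / log(1/r) for a set built from N copies of itself, each scaled down by a factor r. A small illustration (the example sets are standard ones, not drawn from the review):

```python
import math

# Similarity dimension d = log(N) / log(1/r) for a set made of N copies
# of itself, each scaled by r. The results fall strictly between the
# integer dimensions of point, line, surface and solid.

def similarity_dimension(copies, scale):
    return math.log(copies) / math.log(1.0 / scale)

cantor = similarity_dimension(2, 1 / 3)       # middle-third Cantor set
koch = similarity_dimension(4, 1 / 3)         # one edge of the Koch snowflake
sierpinski = similarity_dimension(3, 1 / 2)   # Sierpinski triangle

# cantor     ≈ 0.6309  (between a point and a line)
# koch       ≈ 1.2619  (between a line and a surface)
# sierpinski ≈ 1.5850
```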
One result of these researches has been the generation of the Mandelbrot set. Arguably the most complex as well as the most beautiful object in mathematics, its infinitely detailed whorling shapes contain forms of size down to the sub-sub-microscopic, never repeating, yet revealing ever more variations on the basic shape. The Mandelbrot set has become a public emblem of chaos, featuring on book covers, brochures and even spawning a travelling exhibition of its own.
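For all its visual complexity, the set is defined by a one-line iteration: a complex number c belongs if z → z² + c, started from zero, never escapes to infinity. A minimal escape-time sketch (the viewing window, iteration cap and escape bound are arbitrary choices, not anything Gleick specifies):

```python
# Escape-time membership test for the Mandelbrot set. A point c is kept
# if the orbit of z -> z*z + c from z = 0 stays within |z| <= 2 for
# max_iter steps; real renderers colour points by how fast they escape.

def in_mandelbrot(c, max_iter=200, bound=2.0):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > bound:
            return False
    return True

def render(width=60, height=24):
    """A tiny ASCII rendering of the cardioid-and-bulb silhouette."""
    rows = []
    for j in range(height):
        y = 1.2 - 2.4 * j / (height - 1)
        row = ""
        for i in range(width):
            x = -2.0 + 2.8 * i / (width - 1)
            row += "*" if in_mandelbrot(complex(x, y)) else " "
        rows.append(row)
    return "\n".join(rows)
```

Printing `render()` already shows, in crude form, the whorled boundary the exhibitions display in full colour.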
So does Gleick bring order to chaos? Up to a point. A certain breathlessness infects his prose:
A picture of reality built up over the years in Benoit Mandelbrot's mind. In 1960 it was a ghost of an idea, a faint unfocused image. But Mandelbrot recognised it when he saw it and there it was on the blackboard of Hendrik Houthakker's office.
One puts it down to the baleful influence of the New York Times, for which he works. He gives, too, the impression of not quite understanding a lot of what he describes, though, to be fair, the same could be said of a good few scientists in the field. On the plus side he has talked to virtually everybody of note on the subject and makes a very creditable attempt at rendering it intelligible to the lay person. One thing missing is a sense of irony. Just 60 years ago the search for the causes of atomic spectra ended with quantum mechanics and the abandoning of causality, so now the quest for more predictable weather forecasts leads to chaos. Disorder and unpredictability, it seems, lie at the heart of things.
Brian Pippard (review date 22–28 July 1988)
SOURCE: “Nowhere Twice,” in Times Literary Supplement, July 22–28, 1988, p. 800.
[In the following review, Pippard discusses chaos theory and offers a favorable assessment of Gleick's treatment of the subject in Chaos.]
Haydn's Creation opens with a Representation of Chaos which, while adhering to the rules of musical grammar, confounds at every turn the listener's expectations. Chaos is for Haydn no illogical riot but a paradoxical coexistence of logic and unpredictability, and it is in this sense that the word was imported into science in 1975 to describe what has since been recognized as a pervasive mode of behaviour. The technical literature is voluminous—there is already a new journal dedicated to its study—and very heavy reading it makes, even when not presented in the impenetrable shorthand of pure mathematics.
To appreciate what is involved, consider a tennis-player repeatedly bouncing a ball into the air off his racket. Of course he times his movements to keep the ball in play, but let us replace him by a simple automaton that jiggles the racket up and down perfectly regularly; will the ball lock into synchronization with the racket so that it bounces to the same height every time? Yes, if the amplitude of the racket's oscillation is about right. But let the amplitude be increased steadily, and at a certain moment the even spacing of the bounces will give way to alternating longer and shorter intervals; the motion is still regular, but now with a repetition cycle of two bounces, one higher and one lower, over and over again. Further increase of the racket's amplitude leads to a close succession of period doublings; the cycle switches from two to four different heights of bounce, and then from four to eight, and so on until a critical amplitude is reached when one has to wait indefinitely for the pattern to repeat itself. This is the chaotic response of the ball to the perfectly regular motion of the racket.
The seemingly random response is not, as with many other random processes, the consequence of extraneous disturbance; exactly this sequence of period doubling occurs when the process is modelled mathematically and the rules for the ball's trajectory are laid down unambiguously. This is an example of an iterative program of the sort that computers thrive on. Given the position and speed of the ball at one moment, the computer will rapidly calculate its position and speed one period of the racket's oscillation later. And it will apply the same procedure to find position and speed one further period later, and so on until it's told to stop. The pair of values, position and speed at a given moment, can be represented by a point on a graph, and the whole history of the ball's progress from each sampling to the next shown as a series of points. In the simplest case the points may converge on to a single position, or they may jump between two, or four, or higher powers of two. In the chaotic state no point is touched twice, but discrete regions of the graph are spattered with points at a density only limited by the time the computer is allowed to run. Yet, however many points there are, their distribution always shows a pattern within the occupied regions, no matter how fine the scale on which they are examined. This is an interesting distinction between chaotic and random processes, for the latter always lead eventually to an even coverage of the available area.
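The bouncing-ball model is exactly such an iterative program. The sketch below uses the logistic map x → rx(1−x) as a stand-in (a substitution for illustration; the review gives no equations), with the parameter r playing the role of the racket's amplitude, and detects the period the orbit settles into:

```python
# Period detection for an iterated map. Run past the transient, sample
# the orbit, and return the shortest power-of-two period (up to 32) that
# fits every sampled point; return the sample length if none does.

def attractor_cycle(r, transients=2000, sample=64, tol=1e-6):
    x = 0.5
    for _ in range(transients):
        x = r * x * (1 - x)        # logistic map: the "racket" iteration
    orbit = []
    for _ in range(sample):
        orbit.append(x)
        x = r * x * (1 - x)
    for period in (1, 2, 4, 8, 16, 32):
        if all(abs(orbit[i] - orbit[i + period]) < tol
               for i in range(sample - period)):
            return period
    return sample                  # no short cycle: the chaotic regime
```

Sweeping r upward reproduces the doubling cascade described above: period 1 at r = 2.8, period 2 at 3.2, period 4 at 3.5, and no short cycle at all by r = 3.9.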
To have carried through programs of this enormously repetitive character before the development of adequately powerful computers was not feasible, but now that they are readily available many different iterative programs have been studied, and have shown that period-doubling sequences leading to chaos are remarkably prevalent. Moreover they bear a great family likeness to one another, in spite of being generated by quite different sets of equations. It is the universality of their features that attracts mathematicians and physicists, who are too busy disentangling the underlying reasons to spare time explaining the basic ideas in an assimilable style.
James Gleick thinks it a pity that the fascination of chaos should be withheld from a wider audience [in Chaos], since no great command of mathematics is needed to appreciate the general ideas. One of the most entertaining aspects is still only imperfectly understood at a deep level, the connection between chaos and the rather new topic of fractal geometry. Everyone has seen a picture of a man carrying a picture of a man carrying a picture of … This is an example of one aspect of fractals—structures possessing no inherent scale, so that a small portion when magnified appears identical to the initial structure. The patterned distribution of points described above has this characteristic fractal structure. Looked at from another point of view, the infinite regress of pictures is the outcome of an iterative process, and it is from the study of iterations that there have arisen computer-drawings of marvellous complexity, some of which Gleick reproduces. Among them is a mass of curlicues like sea-horses’ tails, but tails that are covered with spines, each of which when magnified is seen to be another sea-horse with a spiny tail, etc. The resulting picture is neither stiff nor trivial, but as entrancing as florid, yet perfectly disciplined, calligraphy.
There is, however, much more to chaos than computer programs which, besides their intrinsic interest, alert one to the possibility of similar behaviour occurring elsewhere and in real life. The motions of fluids are governed by seemingly innocent equations which nevertheless can generate those chaotically fluctuating flow patterns that have long been studied under the name of turbulence. Two particles in the fluid may begin close together, but after a while move progressively further apart until their paths bear no relation to one another. This habit of divergence, leading ultimately to uncorrelated behaviour of elements that were initially hardly distinguishable, is characteristic of all chaotic systems. However well you know the initial state there will come a time, usually a rather well-defined “horizon of predictability,” beyond which no useful statement can be made about details of behaviour. This is bad news for weather-forecasters, for the atmosphere is almost certainly a fluid in a state of chaos. To be sure, there are gross features of climate for which prediction may be possible over months or even years; but if one is interested in details of the weather (will rain spoil the next Test match?) there is no point in blaming the forecasters for their mistakes. Nor will a better knowledge of the physics, or more weather stations, or a larger computer do much to extend the range of reliability. Long-range forecasting is likely to remain in the province of sorcery rather than science.
This may also be bad news for those physicists who are persuaded that their prime task is to discover fundamental principles, the application of those principles to particular systems being a somewhat derivative activity. What the study of chaos shows is that there are countless processes which will never be brought to a state of computer-predictability, however well the basic laws are known, and which will have to be studied in their own terms if anything useful is to be said. The physicist can console himself, however, with the knowledge that there are also many systems, like the planetary orbits, which are not chaotic—indeed it is from examining these that physics sprang into being. It was possible to make progress by ignoring for the time being the awkward, apparently lawless, phenomena; returning to them, as now with chaos, in the confidence of great achievements. Is there a lesson here for the social scientist, or is the chaotic state of the social organism such that the only regularities discernible are those that no one really wants to know about, while all the answers to interesting questions lie beyond the horizon of predictability?
The study of chaos is surely more than a mathematical recreation, and may well prove an important aspect of all sciences. James Gleick has made a very praiseworthy job, in Chaos: Making a New Science, of assimilating what is known so far and presenting it in digestible doses. His achievement is the more remarkable in that he is by calling a writer rather than a scientist. But he has talked at length with many of the leaders, mostly still young, and has put together an admirably accurate story. The journalistic tone of his character-sketches I find incongruous in an intellectual adventure. But if this is the way to attract readers, and he is probably a better judge of this than I, it's a small price to pay for a book that few would have been enterprising enough to attempt.
Peter Campbell (review date 4 August 1988)
SOURCE: “Thinking,” in London Review of Books, August 4, 1988, pp. 14, 16.
[In the following excerpt, Campbell offers a positive assessment of Chaos.]
… James Gleick's Chaos tells an exhilarating tale. It starts a quarter of a century ago with work on weather forecasting by Edward Lorenz and finishes with an account of the penetration of ‘chaos’ research into sciences as different as epidemiology and astronomy. Science has traditionally turned a blind eye when a graph fluttered unmanageably, so that the future could not be plotted on a straight line or a smooth curve. It is such non-linear phenomena that chaos research investigates.
‘Chaos’, as it is used in this context, is confusing. It is not, for instance, a synonym for ‘random’. Weather is chaotic: you never know exactly when the next cyclone will come across the Atlantic—but it is not random. The behaviour of warm damp air, on any scale from a cupful to a cyclone, follows the general laws of expansion, contraction and movement which apply in more predictable systems. In terms of averages and general patterns, it is stable and describable. Science up to now has looked to deal with phenomena which settle to regular patterns. It has assumed that in chaotic systems like the weather the problem was a scarcity of data, a confusion of superimposed patterns and rhythms, all interacting with each other. If only you could get all the detail right, the rest would look after itself. How can a process be both part of a world which obeys physical laws and also impossible to predict in principle?
Lorenz's discovery of the meaning which can be given to this proposition occurred when he was trying to model world weather. He had set up a highly unrealistic, very simple computer program in which variables (temperature, air pressure and so on) produced a circulation pattern. The picture the program gave of swirling air masses bore a decent resemblance to real maps of world weather. But the interaction of the equations was complex. Lorenz could not predict what was coming up next. At one point he wanted to examine one part of a computer run at greater length. As a shortcut he keyed in the values as they stood at the starting-point of the section he was interested in, rather than re-run the whole thing from the beginning. At first the output chugged out of the printer in the same pattern as it had on the first run, but it quickly began to diverge, and soon bore no resemblance to the original run. Eventually Lorenz worked out what he had done: the computer worked to six decimal places, the print-out only showed three, and in keying in the values he had assumed that the difference—one part in a thousand—was inconsequential. It was not. The evidence that equations which gave a good approximation of weather were so sensitive to small changes in initial conditions was bad news for the future of long-range weather forecasting. Even if your sampling grid was six feet (and not sixty miles, which is the scale used at present), the effect of small sampling errors would spread through the system very fast; and no matter how fine you made your grid, the amount of information needed for perfect predictions would always be a step further away. As you moved towards the infinitely small, the impossibility of knowing the effect of an even finer dimension would pursue you. Small errors, Lorenz showed, can have large effects. It is not surprising that the huge computing power put into weather forecasting has had only fair results.
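Lorenz's accident can be re-enacted with any chaotic system. The toy below (using the logistic map rather than his atmospheric model, purely for illustration) restarts a run from a value rounded to three decimal places, as on his print-out, and watches the two versions part company:

```python
# Re-enacting the print-out accident: restart a chaotic run once from
# the stored full-precision state, once from the same state rounded to
# three decimal places, and compare the two continuations step by step.

def trajectory(x, steps, r=3.9):
    """Iterate the logistic map, returning the whole orbit."""
    out = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        out.append(x)
    return out

x0 = trajectory(0.3712, 100)[-1]          # state partway through a "run"
full = trajectory(x0, 60)                 # restart from the exact value
printed = trajectory(round(x0, 3), 60)    # restart from the print-out value
gaps = [abs(a - b) for a, b in zip(full, printed)]
# gaps begin around the rounding error (~1e-4) and grow to order 1:
# within a few dozen steps the runs bear no resemblance to each other.
```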
Inherent in this discovery was rather shocking news. The assumption that a good approximation will always produce results in due proportion had been proved wrong. The figures had only to be a little bit out for some predictions to be way out, and this limit applied even in systems like damped pendulums which seemed, on the face of it, unlikely to behave chaotically. Chaotic effects can be found in very simple systems. A waterwheel fed by a regular stream will behave chaotically at some rates of flow, and not at others. Regions of instability may be surrounded by stable regions. At some rates of increase animal populations fluctuate randomly from year to year, at others they stabilise, or follow regular cycles of increase and decrease.
Computers were the tool which made the mapping of chaos possible. When millions of computations are made, patterns begin to appear. A few hundred dots may make no pattern, but as they build up (apparently randomly), shapes begin to appear. For many researchers, the experience of watching patterns arise on a screen as unpredictable results arrive seems to have been crucial. The topological transformations which drew the patterns out of the data opened new windows. When these phenomena were investigated, Mandelbrot's fractal geometry began to come into the picture. His discoveries dealt with patterns which repeated at smaller and larger scales. A coastline has gulfs which have bays which have smaller bays which have ragged rocks which have jagged edges, and so on. The mathematics of patterns which repeat themselves in this way at different scales was relevant to chaotic systems. One example is provided by the theoretical orbits of stars in galaxies which, when examined at higher and higher energies, turn out to behave with patterned unpredictability. The ability to reveal an infinity of detail, to be complex but patterned, suggests a better analogy for brain function and biological morphogenesis than models based on linear descriptions.
To the layman the ideas of chaos are wonderfully attractive. They are difficult but not opaque, and they seem to correspond to experience. To some (Kuhn amongst them) they constitute a paradigm shift, a new view which the mismatch between the evidence of experience and an existing theoretical structure forces on a sceptical scientific community.
Gleick's book is enthralling. It is hard for the innumerate, like me, to hold back the thought that the world is chaotic in this precise way. The illustrations he gives of images generated on computer screens are just such combinations of the mechanical and the unpredictable as instinct would suggest underlie the natural world. The astonishing thing is that it is possible to simulate them. The exploration of the universe of numbers has become possible. It turns out to be more diverse than anyone guessed, and this diversity seems to have a strong connection with diversity in the physical universe. Perhaps that is why God gets seven mentions in the index. Chaos theory seems to resolve at least the more mechanistic anxieties about determinism, and among the answers to Einstein's remark about God not playing dice is one from Joseph Ford: ‘he does, but they are loaded.’
Gleick and [Ed] Regis [in Who Got Einstein's Office?] both tell their stories using the common science writer's mix of biographical sketch, exposition and quotation. They are lucky to have such stories, and readers are lucky that they tell them so well. …
John Franks (review date Winter 1989)
SOURCE: A review of Chaos, in Mathematical Intelligencer, Vol. 11, No. 1, Winter, 1989, pp. 65–69.
[In the following unfavorable review of Chaos, Franks disparages the notion of a “chaos revolution” and objects to Gleick's misrepresentation of chaos theory, fractal geometry, and mathematical methodology.]
[Chaos] is a book about new ways in which mathematics is used to model phenomena in the real world. It is intended for a general audience. The author is James Gleick, formerly a science reporter for the New York Times. He does a good job explaining what constitutes a mathematical model (by which he means a differential equation or a difference equation) and what it does. The theme of the book is that even rather simple non-linear models can, and typically do, exhibit extremely sensitive dependence on initial conditions. What this means is that two solutions of a non-linear ordinary differential equation can start with very close initial conditions but then diverge rapidly from one another while remaining in a bounded region. The effect is that after a moderate amount of time the position of a solution can appear to be a random function of the initial condition. This seems quite paradoxical because the differential equation certainly generates a deterministic system.
Only recently have scientists begun to realize the extent to which simple dynamical systems produce this random-seeming, complex, “chaotic” behavior. The story that this book recounts is the sometimes painful process of acquiring this understanding and the histories of the scientists involved. The result, in Gleick's view, has been a revolution in the way science views nature—a “paradigm shift.” The book is subtitled “Making a New Science.” We are told, “Where chaos begins classical science stops” and “Chaos poses problems that defy accepted ways of working in science.” This revolution, Gleick says, was carried out in the face of numerous obstacles by “a few free-thinkers, working alone, unable to explain where they are heading.” These stories of individual achievement, in diverse disciplines but with a common theme, are the heart of his book.
We are presented with an exciting and romantic story, but I suspect that Gleick has greatly overestimated the achievements of chaos theorists. Only time will tell. Also I think he missed the obvious explanation of the sudden interest in chaos in many branches of science. Thirty years ago there was very little a scientist could do with a typical non-linear model. It is generally impossible to solve non-linear differential equations. And it is not even clear what a scientist could do with a complicated, explicit, closed-form solution if one could be found. It would probably be no more enlightening than the original differential equation. As a result researchers have emphasized linear systems, which are much more tractable but greatly limited in the range of phenomena they can model. This may even have reached the point, as Gleick suggests, of scientists training themselves not to see non-linearity in nature.
The necessity of this oversimplification changed with the advent of inexpensive, easy-to-use digital computers. Now one can numerically approximate the solutions of non-linear differential or difference equations and, equally important, graphically display the results. The computer is a viewing instrument for mathematical models that will, in the long run, be more significant than the microscope to a biologist or the telescope to an astronomer. Nearly all of the scientists whose work is discussed in this book made heavy use of computers and were among the first to do so in their discipline. It is no more surprising that numerous types of complex dynamical phenomena have been discovered in the last twenty years than would be the discovery of numerous kinds of bacteria if thousands of biologists were, for the first time in history, given microscopes.
Most researchers now understand that seemingly random behavior is an inherent element of non-linear dynamical systems and not just the result of experimental error or “noise.” This is an important insight. I would suggest that its widespread acceptance has more to do with readily accessible computers than the brilliance or courage of Gleick's protagonists. What is surprising and fascinating is the resistance to accepting it that is documented in this book. Indeed, it was well known to Henri Poincaré and George David Birkhoff, but only when computers made it impossible to ignore did it gain acceptance.
We are probably nowhere near the end of the exciting new insights into nature that can be gained by computer viewing of mathematical models. All of the models discussed in Chaos: Making a New Science are relatively simple ordinary differential equations or difference equations. As computers become more powerful and supercomputers become widely accessible and easy to use, we can expect similar insights into non-linear partial differential equations. If non-linear ordinary differential equations are intractable without computers, the typical non-linear partial differential equation is hopeless. The breakthroughs came first in simple ordinary differential equations because a personal computer (in some cases even a hand calculator, as we learn in this book) is adequate to experiment with them. Much greater computational power is necessary for partial differential equations, because their numerical solution requires many more arithmetical steps.
Despite these advances, I would speculate that the real evolution in scientific modeling and the greatest impact of computers on science is probably yet to come. Computers have made it easy to view the solutions of non-linear differential equations. But now that we have computers it is no longer clear that differential equations are the best models to use, especially in areas like biology or the social sciences. One can envisage a wholly new mathematical construct, better suited to digital computers, which could provide a new type of model. The emergence of such a construct capable of accurately modeling a variety of phenomena would indeed be a true paradigm shift. Nothing this dramatic appears to have happened yet.
There is indeed a revolution in progress, but it is not, as this book suggests, the “Chaos revolution.” Instead it is the computer revolution, and the current popularity of chaos is only a corollary. This revolution may still be in its infancy, but computers have already taught us one remarkable fact. There is mounting evidence that sensitive dependence on initial conditions may well be closer to the rule than the exception for non-linear dynamical systems. This is a surprising fact, well documented in this book. It surprises us because it was invisible before the computer, but with computers it is easy to see, even hard to avoid.
I would have liked to see more information on the mathematics behind chaos. Much of this mathematics can be readily described at the same technical level as the rest of the book. For example, a discrete-time dynamical system discovered by Stephen Smale and commonly called the “Smale horseshoe” is described, but its significance is never discussed.1 It was one of the first examples of sensitive dependence on initial conditions to be completely and rigorously understood. But more important, a theorem of Smale shows that the horseshoe or its analogue for continuous time systems occurs as a subset of almost any system that displays chaotic behavior.
An even more serious omission is the absence of any discussion of the close connection of the Smale horseshoe with the doubling map on the interval. The doubling map is the function on the unit interval whose value at x is defined to be the fractional part of 2x. An understanding of this simple dynamical system goes a long way toward demystifying the paradox of deterministic randomness. If we think of x as written as a binary number with a leading “decimal point” then this function simply shifts the decimal point to the right one place and deletes the first digit. Clearly, if an initial x is chosen whose value is significant to 16 binary places, then after 16 iterates its value will appear to be completely random and independent of the starting value. This is essentially the same mechanism that underlies the complex behavior of most chaotic dynamical systems. The opportunity Gleick missed here was to give the non-specialist reader a real insight into the nature of chaos. The doubling map is readily understandable and closely related to the dynamics of the Smale horseshoe. The Smale horseshoe, in turn, is an essential ingredient of almost all chaotic dynamics!
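The doubling map is easy to experiment with directly. The following is a minimal Python sketch (the function names and the particular perturbation are illustrative, not from the review); it shows the mechanism Franks describes: a discrepancy in the fortieth binary digit becomes a completely different orbit within about forty iterations.

```python
# Doubling map on [0, 1): x -> fractional part of 2x.
# In binary, each step shifts the expansion of x one place to the left
# and drops the leading digit, so information about the starting value
# is consumed at a rate of one binary digit per iteration.  (A Python
# float carries only about 52 binary digits, so runs must stay short.)

def doubling_map(x: float) -> float:
    """One step of the doubling map."""
    return (2.0 * x) % 1.0

def orbit(x0: float, n: int) -> list[float]:
    """The first n iterates of x0 under the doubling map."""
    xs = [x0]
    for _ in range(n):
        xs.append(doubling_map(xs[-1]))
    return xs

a = orbit(0.3, 40)
b = orbit(0.3 + 1e-12, 40)   # a perturbation near the 40th binary place

print(abs(a[1] - b[1]))    # still tiny: the orbits agree closely
print(abs(a[36] - b[36]))  # the orbits have completely decorrelated
```

After 36 doublings the initial discrepancy has been magnified by a factor of 2^36, so the two final values appear unrelated even though every step was deterministic.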
I was also surprised to note that, despite a lengthy discussion of the Lorenz attractor, there was no mention of the work of John Guckenheimer and Robert Williams concerning this remarkable example. It is a great achievement to be the first to observe an important phenomenon in nature, but it is equally important to be the first to understand and explain it. Edward Lorenz wrote a system of three first-order ordinary differential equations as a simplified model of atmospheric convection. When he studied the system with a computer he observed all the hallmarks of chaos: very complicated trajectories of solutions, sensitive dependence on initial conditions, seemingly random behavior in a deterministic system, etc. His paper2 on this example was published in the Journal of Atmospheric Science in 1963. Gleick tells of the extraordinary influence that the work of Lorenz, and this example in particular, has had in the development of non-linear dynamics. However, there is no mention of the work of mathematicians who explained the remarkable things Lorenz had discovered and catalogued. This work by Guckenheimer and Williams3 represents one of the outstanding achievements of mathematics in the understanding of so-called strange attractors. It reduces the dynamics of a system of three non-linear differential equations to the study of a class of functions from the interval to itself, in fact, one not unlike the doubling map described above. This omission is especially surprising because one of Gleick's main points is that nature seems to have a penchant for one-dimensional dynamics, but the one-dimensionality is often well disguised. Guckenheimer and Williams certainly showed this to be the case for the Lorenz attractor.
The sections on fractal geometry and its inventor, Benoit Mandelbrot, are the only parts of the book that have little to do with chaotic dynamics. Fractal geometry is a static theory only marginally related to any kind of dynamics. Mathematicians may find it disconcerting to learn that the term fractal geometry does not refer to a body of mathematics, as, for example, the terms projective geometry or algebraic geometry do. In fact, there appears to be no formal mathematical definition of the word fractal. This book mentions no theorems in fractal geometry. A highly regarded mathematical text4 with the title The Geometry of Fractal Sets deals primarily with mathematics developed prior to 1950, which is certainly not what Gleick is describing. There may be recent mathematical results in fractal geometry, but, if so, they clearly play a secondary role. Instead, fractal geometry is viewed by its proponents as a new framework and set of tools for describing nature. Gleick tells us, “In the end, the word fractal came to stand for a way of describing, calculating, and thinking about shapes that are irregular and fragmented, jagged and broken-up.”
The principal tenet of this descriptive framework is that nature is extremely irregular and the degree of irregularity remains constant when viewed on different scales. That is, natural phenomena exhibit a self-similarity of complexity across scales. “Above all fractal meant self-similar,” according to Gleick. The most important tool for fractal geometry is Hausdorff dimension. This is a numerical invariant of metric spaces that is defined as a limit of numbers obtained from covers of the space by smaller and smaller balls. For manifolds it equals the topological dimension, but in general it is not an integer.
The connection of Hausdorff dimension with the ordinary concept of dimension is somewhat tenuous, but to the uninitiated the idea that an object can have a dimension of 1.2618 has a kind of science-fiction-like appeal. It would have been better if the word “dimension” had never been attached to this number. Often in mathematics or physics a common word will assume a special or technical meaning sometimes quite different from its usual sense. The use of the word dimension here is one example, and I am told that “universal” is used by physicists in a different sense than its common use. Any exposition of mathematics or physics for non-specialists should take pains to explain which words are being used in an unusual way. Unfortunately, this has not been done by Gleick in this book nor in general by the adherents of fractal geometry. On the contrary, I feel that the fact that Hausdorff dimension assumes fractional values may have been emphasized to glamorize the concept. This sort of mystification should be avoided in science, rather than catered to. Incidentally, Gleick suggests that mathematicians prefer the term Hausdorff dimension to fractal dimension or fractional dimension because they spitefully want to deny credit to Mandelbrot. This is both false and insulting.
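Hausdorff dimension itself is awkward to compute, but for self-similar sets it agrees with the more tractable box-counting dimension: count the boxes of side eps needed to cover the set and take the limit of log N(eps) / log(1/eps). A minimal sketch, using the middle-thirds Cantor set as the example (my choice, not one discussed at this point in the review), recovers a non-integer value of the kind Franks describes.

```python
import math

def cantor_cells(depth: int) -> list[int]:
    """Surviving cells of width 3**-depth in the middle-thirds Cantor
    construction; cell m covers [m / 3**depth, (m + 1) / 3**depth].
    Integer indices keep the box count exact (no float rounding)."""
    cells = [0]                      # start with the whole interval [0, 1]
    for _ in range(depth):
        # Split each cell into thirds and keep the two outer thirds.
        cells = [3 * m for m in cells] + [3 * m + 2 for m in cells]
    return cells

depth = 8
n_boxes = len(set(cantor_cells(depth)))          # boxes of side 3**-depth
dim = math.log(n_boxes) / math.log(3 ** depth)   # log N(eps) / log(1/eps)
print(round(dim, 4))   # 0.6309, i.e., log 2 / log 3: not an integer
```

Each refinement doubles the number of boxes while shrinking their side by a factor of three, which is exactly why the ratio of logarithms comes out to log 2 / log 3 rather than a whole number.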
One of the principles of fractal geometry holds that Hausdorff dimension is an important measure for physical objects.
As Mandelbrot himself acknowledged, his program described better than it explained. He could list elements of nature along with their fractal dimensions—seacoasts, river networks, tree bark, galaxies—and scientists could use those numbers to make predictions.
The mathematical definition of Hausdorff dimension, involving limits as size goes to zero, may not make sense for a physical object, but one can make similar philosophical objections to measuring length. In any case, Gleick tells us,
Mandelbrot specified ways of calculating the fractional dimension of real objects, given some technique of constructing a shape or given some data, and he allowed his geometry to make a claim about the irregular patterns he had studied in nature. The claim was that the degree of irregularity remains constant over different scales. Surprisingly often, the claim turns out to be true. Over and over again, the world displays a regular irregularity.
Like fractal geometry this book describes better than it explains. The reader may be entertained, but is unlikely to gain new insights into nature. My greatest disappointment was the way in which mathematics is portrayed. One could read this book and come away with the view that mathematical proofs are an obstacle to the pursuit of truth—a sort of self-imposed mental straitjacket worn by stodgy old pedants. This probably overstates Gleick's view somewhat, but he definitely feels that mathematics has greatly suffered from an interest in rigor. A reader for whom the concepts of proof and rigor are vague at best could easily conclude they are better avoided. I had hoped for a more sympathetic view of the discipline.
Mathematics has a methodology unique among all the sciences. It is the only discipline in which deductive logic is the sole arbiter of truth. As a result mathematical truths established at the time of Euclid are still held valid today and are still taught. No other science can make this claim. The phrase “mathematical certainty” is commonly used in general discourse to represent the highest standard of truth. A theorem proven today may be forgotten in the future because it is uninteresting but it will never cease to be true (barring drastic changes in our standards of rigor). I would contend that an important criterion for judging a scientific discipline is the half-life of its truths. Mathematics does extremely well by this measure and mathematicians are justifiably proud that their standards of truth are higher than those of other sciences.
The use of deductive logic rather than empiricism is taken for granted by mathematicians to such an extent that they rarely contemplate alternatives. Scientists in other disciplines seldom appreciate this methodology and in some cases even disdain it. With few exceptions, Gleick's treatment of mathematics echoes this attitude. Rigor is blamed for increased narrow specialization in mathematics and for decreased contact between mathematicians and other disciplines. This seems to me to be implausible; in particular I note no lack of narrow specialization in empirical disciplines and no greater tendency toward interdisciplinary research.
In fairness, these views may only be reflections of the attitudes of some of the scientists interviewed by Gleick. We are told, for example, that Mandelbrot felt it necessary to emigrate from France because of the stifling influence of Bourbaki. Curiously, this book contains a fair amount of Bourbaki bashing—not because of Bourbaki's pedagogical style, where, in my view, it might be deserved, but for their rigor.
The methodology of mathematics does raise valid philosophical questions that need more attention. There is an apocryphal story of a meeting between a youthful Albert Einstein and an aging Henri Poincaré at which Einstein said, “I considered taking up mathematics, but I decided against it because there is often no connection with the real world in mathematics and it is impossible to tell what is important.” To which Poincaré replied, “Well, in my youth I considered becoming a physicist, but I decided against it because in physics it is impossible to tell what is true.” Clearly there is a tradeoff between high standards of truth and applicability. Sadly, this issue attracts little attention from either side.
This book contains considerable discussion of the dynamics of complex functions. I believe that, as far as we know today, this is pure mathematics that models nothing in nature, yet has great appeal for even the most down-to-earth experimentalist. It is difficult to view pictures of Julia sets or Mandelbrot sets without becoming something of a Platonist. These sets seem to have a reality of their own even if they don't reflect some part of what we glibly refer to as the real world.
What constitutes a good mathematical model? This question needs to be paid much more attention by all scientists. How do we judge if a given piece of mathematics accurately reflects some aspect of nature? This is not a mathematical question; at least, it is not a question whose answers are subject to proof. Historically the ability to make non-trivial quantitative predictions has been an important test for models. My own view is that a good model must explain, at least to some extent. For example, one could videotape some phenomenon, digitize the result, and use that to produce a “mathematical model” capable of reliably describing the process. But such a model would add nothing new to our understanding. Similarly, it is not enough to find a mathematical system that exhibits similar behavior to a physical experiment, but has no apparent connection with it.
I fear that many of the models of chaos may be faulted for this deficiency. In reading this book I was repeatedly struck by the parallels between chaos and catastrophe theory. Gleick quotes extravagant claims of chaos proponents, such as “twentieth-century science will be remembered for just three things: relativity, quantum mechanics, and chaos.” Fifteen years ago similar claims were made for catastrophe theory. I must admit that I was one who at least entertained the possibility that those claims were correct and that catastrophe theory represented a new “paradigm shift.” Now such a view seems extremely improbable.
Gleick himself yields to the temptation to engage in hyperbole on behalf of chaos. After a brief account of the inadequacy of pre-chaos science in dealing with complex natural phenomena, he says,
Now all that has changed. In the intervening twenty years, physicists, mathematicians, biologists, and astronomers have created an alternative set of ideas. Simple systems give rise to complex behavior. Complex systems give rise to simple behavior. And most important, the laws of complexity hold universally, caring not at all for the details of a system's constituent atoms.
Despite having carefully read this book, I do not know to what he refers when he speaks of universal laws of complexity.
Like chaos, catastrophe theory was highly interdisciplinary, claimed breakthroughs in numerous areas, was warmly received by researchers in several disciplines, and received a great deal of media attention. Both chaos and catastrophe theory are based on some interesting and beautiful mathematics, dynamical systems and singularity theory, respectively. But both have been rather phenomenological in their modeling, finding a similarity in a mathematical model and a physical phenomenon to be an adequate basis for adopting that model.
In discussions with colleagues I have encountered considerable objection to the comparison of chaos to catastrophe theory. Some feel it is unfair to chaos while others think it is unfair to catastrophe theory. Probably both groups are right. Comparing catastrophe theory to fractal geometry may be unfair to catastrophe theory since it has a great deal more substantial mathematics behind it than fractal geometry. On the other hand a comparison between catastrophe theory and non-linear dynamical systems may under-rate the impact of non-linear dynamics. The insight that seemingly random, complex behavior is inherent in non-linear dynamics and not a result of error or noise is an important one. For me it is difficult to think of it as a recent breakthrough or a revolutionary idea since it was taught to me in a routine fashion over twenty years ago when I was a graduate student. But the full impact of this idea on the conduct of scientific research is only now being felt. Catastrophe theory has not had a comparable impact.
Researchers in numerous disciplines are now obliged to study non-linear dynamics if they hope to provide good mathematical models. They have to know about disorder if they are going to deal with it. This new attitude is a very good thing—I believe that non-linear dynamics is the place most researchers should be looking for models. Unfortunately, our knowledge of chaotic systems, beyond the fact that they exist in profusion, is still extremely limited. This is true in both a theoretical and a practical sense. I am concerned that extravagant claims like those quoted above raise unrealistic expectations that have no chance of being met in the near future. Despite its portrayal in this book, chaos is not a new tool that can solve the problems of every discipline. It is difficult for professional scientists, much less the general public, to distinguish excessive hype from solid scientific advances. Chaos has given us both.
Whether in the long run chaos enjoys a greater success than catastrophe theory in providing useful mathematical models remains to be seen. This judgment will not be made by mathematicians or by popular science writers. It will be made by the practitioners of the various disciplines to which the techniques of chaos are being applied. And the criterion by which chaos will be judged is the quality of the models it provides for physical phenomena.
Notes
1. S. Smale, Diffeomorphisms with many periodic points, Differential and Combinatorial Topology (S. S. Cairns, ed.), Princeton: Princeton University Press (1963), 63–80.
2. E. N. Lorenz, Deterministic non-periodic flow, Jour. Atmos. Sci. 20 (1963), 130–141.
3. J. Guckenheimer and R. F. Williams, Structural Stability of Lorenz Attractors, Publ. Math. IHES 50 (1979), 59–72.
4. K. J. Falconer, The Geometry of Fractal Sets, Cambridge Tracts in Math. #85, Cambridge University Press (1985).
Tony Osman (review date 6 May 1989)
SOURCE: “Patterns of Order in Disorder,” in Spectator, May 6, 1989, pp. 32–33.
[In the following review, Osman discusses the development of chaos theory and offers praise for Gleick's treatment of the subject in Chaos.]
We could well be in at the beginning of a new science, as important as those of Newton and Darwin. Like the science of those revolutionaries, the new science of chaos, marvellously described in James Gleick's book, [Chaos,] affects the way we see the world. Newton gave us the universe as a celestial machine: Darwin described a world in which the forms of life evolved by chance and survived by competition. The theorists of chaos guide us through the real world around us, show why conventional science fails so badly to predict ‘real’ events, and show how their new science makes the complexities of reality comprehensible.
Chaos theory shows why, for example, we cannot hope for accurate long-range weather forecasts. It shows how the relatively simple message that is all that our genes can carry can prescribe the barely imaginable complexity of the air passages in our lungs or the branching of our blood vessels. And the chaos theorists show that there is order, and predictability, in events whose courses seem completely baffling—Stock Exchange crashes and disease epidemics, to take two examples where we have no working theory at the moment.
What scientists do, in their highest moments, is to look for rules of cause and effect (‘Laws of Science’) that apply to a wide range of occurrences, and use these rules to make predictions. Surprisingly frequently, science fails with real events. Weather forecasting is notoriously erratic, but science cannot even deal with relatively simple events. It cannot predict the annual incidence of disease after an inoculation programme: it cannot predict the changes in animal population that will occur when fertile animals are left in an isolated area.
Science works well in predicting the movement of the planets, say, and badly with these others because the events steadily change the conditions that the rules are supposed to apply to. As the fish in a pond multiply, they fill the space and reduce food supplies. As a drop of water falls from a tap, it leaves a residue bouncing up and down as the water for the next drop accumulates. And so on. The events are what scientists call ‘non-linear'—doubling one element in the ‘cause’ side does not multiply in a simple way the corresponding element labelled ‘effect’.
The unpredictability—the lack of any apparent pattern—is aptly described as chaos: chaos theory brings order to the seemingly confused events of the real world. This book shows in clear and fascinating detail how the chaos researchers found patterns of order in disorder (and a disorder in apparent order).
Chaos theory is a revolution only made possible by the computer, because a computer can painlessly and quickly run through what computerists call iteration—seeing what happens when a process runs repeatedly in cycles, each cycle acting on the results of the previous one. (This is like, but normally more complex than, compound interest, where your principal for year two includes the interest earned in year one.)
The results of iteration can be very surprising. In the situation of the fish in a pond, or elephants in a reservation, the progeny of the first cycle are there to breed for the second: and the available food is diminished by what the first generation ate before the food plants had seeded. Commonsense says that the population will rise to some number that balances the production of food, and settle there in equilibrium. But if, on a computer, you write an equation for the change in population, putting in constants representing breeding rate and ‘eating rate', you find that there are several possible outcomes, depending on the constants you use. There is the possibility of equilibrium: but another set of values leads to a population that soars above equilibrium and then ‘crashes’. This has actually happened in the elephant reservations in Tanzania, to the surprise and alarm of those who thought that the reservations would conserve the elephants. Chaos theory explains their error.
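The population model Osman paraphrases is, in essence, the logistic map familiar from the chaos literature. A minimal sketch, assuming the standard form x → rx(1 − x) as a stand-in for his fish-in-a-pond equation (the constants chosen here are illustrative), shows both outcomes he mentions: one growth constant settles to equilibrium, another booms and crashes indefinitely.

```python
def logistic_step(r: float, x: float) -> float:
    """One breeding cycle: growth rate r, crowding penalty (1 - x),
    with x the population as a fraction of the maximum the pond holds."""
    return r * x * (1.0 - x)

def simulate(r: float, x0: float = 0.1, n: int = 200) -> list[float]:
    """Population fraction over n cycles, starting from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(logistic_step(r, xs[-1]))
    return xs

calm = simulate(2.5)   # settles to the equilibrium 1 - 1/r = 0.6
wild = simulate(3.9)   # never settles: perpetual boom and crash

print(calm[-2], calm[-1])                # both sit at the equilibrium
print(max(wild[-50:]), min(wild[-50:]))  # still swinging wildly
```

The equation never changes; only the constant r does. That a single parameter separates commonsense equilibrium from the boom-and-crash behavior seen in the Tanzanian reservations is exactly the surprise of iteration.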
It also explains the complete failure of all long-range weather forecasting by what it called the ‘butterfly effect’. Meteorologists believe that with enough information, they really could get the weather forecasts right. But Gleick's book [Chaos] tells how Dr Edward Lorenz of the Massachusetts Institute of Technology ran the elements of climate iteratively through a computer and found that even a tiny change—the effect of the flight of a butterfly, for example—would be so multiplied by repetition that prediction would be impossible.
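Lorenz's convection model can be reproduced in a few lines. The sketch below uses his classic 1963 parameters with a simple fixed-step Euler integrator (the step size, initial point, and perturbation are my illustrative choices, not from the book); a disturbance of one part in a hundred million, the numerical butterfly, grows until the two forecasts bear no resemblance to each other.

```python
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of Lorenz's 1963 convection equations."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def trajectory(state, steps: int):
    """The trajectory from `state`, recorded at every step."""
    out = [state]
    for _ in range(steps):
        out.append(lorenz_step(out[-1]))
    return out

def dist(p, q) -> float:
    """Euclidean distance between two states."""
    return sum((u - v) ** 2 for u, v in zip(p, q)) ** 0.5

a = trajectory((1.0, 1.0, 1.0), 3000)
b = trajectory((1.0, 1.0, 1.0 + 1e-8), 3000)   # the "butterfly"

print(dist(a[100], b[100]))    # early on: still minuscule
print(dist(a[3000], b[3000]))  # later: typically the size of the attractor
```

The separation grows roughly exponentially until it saturates at the diameter of the attractor, which is why refining the initial measurement buys only a little extra forecasting time.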
Benoit Mandelbrot's discoveries, which started while he was in the pure research department of IBM, showed how chaos theory could reveal the simplicity in apparent complexity. His discoveries are connected with what mathematicians call ‘fractals'—shapes with a fractional number of dimensions. It isn't easy to explain this idea, but it is easy to see the effects of the manipulations that produce fractal figures. If you draw an equilateral triangle, and then, a third of the way along each side, start the drawing of another triangle, each side one third the length of the original one, and then on each of the two exposed sides of the three new triangles, draw triangles one third the size of the second batch, and so on, repeatedly, you get a shape of striking beauty formed by a very long line that encloses an area barely larger than the original triangle. The DNA of our genes must contain some instruction of the type ‘travel one third of the original distance and then split into two: repeat this a very large number of times’ to generate the labyrinth of our blood vessels or of the air passages in our lungs.
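The construction Osman describes is the Koch snowflake, and its perimeter and area are easy to track by iteration. In the minimal sketch below (the function name is illustrative), each refinement replaces every side with four sides one third as long and sprouts one new small triangle per side; the perimeter grows without bound while the enclosed area converges.

```python
import math

def koch_stats(n: int, side: float = 1.0):
    """Perimeter and enclosed area of the Koch snowflake after n
    refinements of an equilateral triangle with the given side."""
    num_sides, length = 3, side
    area = math.sqrt(3.0) / 4.0 * side ** 2       # the original triangle
    for _ in range(n):
        # Each side sprouts a new triangle one third its size ...
        area += num_sides * math.sqrt(3.0) / 4.0 * (length / 3.0) ** 2
        # ... and is replaced by four sides one third as long.
        num_sides *= 4
        length /= 3.0
    return num_sides * length, area

for n in (0, 3, 6, 12):
    perimeter, area = koch_stats(n)
    print(n, round(perimeter, 3), round(area, 4))
# The perimeter grows without bound as 3 * (4/3)**n, while the area
# converges to 8/5 of the original triangle's: a very long line
# enclosing a modest region, just as the passage above says.
```

The boundary's dimension, log 4 / log 3 ≈ 1.2619, is presumably the kind of value Franks has in mind when he mentions an object of dimension 1.2618 in his review.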
Good books explaining scientific research are rare. Gleick's is outstanding because he is a dramatic explainer and because, as a very long list of references shows, he has assimilated enough research to be able to give a panoramic picture.
John Franks (essay date Summer 1989)
SOURCE: “Comments on the Responses to My Review of Chaos,” in Mathematical Intelligencer, Vol. 11, No. 3, Summer, 1989, pp. 12–13.
[In the following essay, Franks objects to Gleick's portrayal of mathematicians and the goals of mathematics in Chaos, and asserts that Gleick misses an opportunity to introduce the public to the rewarding creative aspects of mathematical research.]
The several responses to my review [of James Gleick's Chaos] raise some interesting questions. What does “doing mathematics” mean? Is it possible or desirable to give an honest explanation of its meaning to a general audience? How important is the role of theorem-proving in doing mathematics?
It is wrong to try, as James Gleick does in these pages, to make a dichotomy between discovery and proof in mathematics. Usually the discovery is the proof, or at least it is inextricably tied to it. Very rarely is a proof a historical afterthought. Almost always it is a proof, or the process of finding it, that creates new mathematical knowledge. A proof is not some kind of super spelling checker that merely validates mathematical facts. Most mathematicians would consider proofs to be the central content of mathematical knowledge. Who would be satisfied if God were to announce that the Riemann Hypothesis is true, but deny us the proof?
Let me paraphrase an old joke about money. In “doing mathematics” proving theorems isn't everything, but it's way ahead of whatever is in second place. Proving a theorem is one of the most creative acts of which the human mind is capable. Most mathematicians find it an exhilarating experience. It is often exciting, sometimes even thrilling. There is also pain and disappointment when a putative theorem falls through. Several years ago a well-known mathematician caused a minor flap by comparing mathematics to sex (in the pages of the Bulletin of the American Mathematical Society). The propriety of his simile may have been questionable, but I prefer it to the comparison in Professor Devlin's letter in response to my review, which likens proving a theorem to a mechanic working on a Buick.
Mr. Gleick is probably right when he suggests that many mathematicians consider the concept of a “mathematician, … ostentatiously not proving much” as something of a self-contradiction. A view commonly held by mathematicians is that a mathematician is someone who creates mathematics and that, for the most part, creating mathematics means discovering and proving theorems. There is no intent here to denigrate those who use mathematics but prove no theorems, especially those who use it in creative ways. But there is a difference between a composer and a musician. And mathematicians take great pride in their craft—even to the extent of believing the theorem may outlive the application, as the musical composition outlives the performer. Mr. Gleick considers this a myopic view of the history of science; I acknowledge it is not an unbiased one. I hope, at least, he realizes it is not a petty jealousy directed at a single individual.
By and large, mathematicians have done a terrible job of communicating to the general public what we are doing (or what we think we are doing). Professor Devlin says, “Gleick very wisely steered well clear of any whiff of real ‘mathematics’ as it is perceived by most people.” I saw, instead, a missed opportunity. Mr. Gleick is a rare find for the scientific community—an exceptionally talented writer with an interest in science. I do not believe he is an antagonist of mathematics, quite the contrary. But I wish he had chosen to give his readers “a whiff of real mathematics” as it is perceived by mathematicians. Obviously this does not mean citing theorems and proofs; and it certainly does not mean reciting a list of who did what when. It does mean explaining the mathematician's perspective on mathematical creativity as a process by which new knowledge is attained. Professor [Morris] Hirsch does not suggest this process is the only legitimate topic in a history of chaos, but he is correct in saying it played a major role, which Gleick neglected in his book.
Mr. Gleick tells us that Lanford's proof wasn't needed to validate the discoveries of Feigenbaum. But, by the same token, applications to DNA weren't needed to validate the theorems of knot theorists. It is a wonderful thing when a branch of mathematics suddenly becomes relevant to new discoveries in another science; both fields benefit enormously. But many (maybe most) areas of mathematics will never be so fortunate. Yet most mathematicians feel their work is no less valid and no less important than mathematics that has found utility in other sciences. For them it is enough to experience and share the beauty of a new theorem. New mathematical knowledge, like knowledge of subatomic particles or knowledge of Mars, is an end in itself! This is part of the “whiff of real mathematics” we need to communicate to the general public.
Morris W. Hirsch (essay date Summer 1989)
SOURCE: “Chaos, Rigor, and Hype,” in Mathematical Intelligencer, Vol. 11, No. 3, Summer, 1989, pp. 6–8.
[In the following essay, Hirsch objects to Gleick's misrepresentation of chaos theory in Chaos and his failure to focus on the contributions of mathematicians, particularly Stephen Smale, toward a scientific understanding of chaos.]
Gleick's book Chaos [reviewed by John Franks, Mathematical Intelligencer, vol. 11, no. 1, 1989] captures vividly and faithfully the personalities of the researchers, the atmosphere they worked in, and the spirit of the times (as far as I can tell—I don't know all the people in the book), as well as skillfully expounding many fascinating themes of current research.
But to my mind the book has one great defect: It doesn't do justice to the rigorous mathematics underlying a great deal of the research in nonlinear dynamical systems. In fact a nonmathematician or even a mathematician unfamiliar with the material would find it hard to tell that rigorous mathematical proof—as contrasted with conjecture, heuristic, experiment, and computer simulations—played a vital part in this research.
This book severely underrates the importance of rigorous mathematics and its influence on the understanding of chaotic dynamics. It is not easy to learn from it that many important chaotic systems, including the earliest and the most influential examples, were first identified and explored not by computer simulation, and not by physical experiment, but by mathematical proof (Poincaré, Birkhoff, Levinson, Smale, Anosov, Kolmogorov, Arnold, Moser …).
These and many other mathematicians achieved by rigorous mathematical analysis crucial insights into what is now called chaos. It is difficult to imagine that what they discovered could have been found through any kind of experimentation any more than the existence of irrational numbers—which was even more astonishing when it occurred—could have been discovered by computation. By ignoring this, Gleick missed an opportunity to attack the public's profound ignorance about the role and nature of mathematics; he has also misrepresented a chapter in the history of mathematics. Let me illustrate what I mean with Smale's work on horseshoes, which is far more important than the book suggests.
In discussing the horseshoe Gleick writes of Smale's “intuition,” claiming that he “turned his ideas about visualizing global behavior into a new kind of model”; he “put his horseshoe through an assortment of topological paces,” and “the horseshoe provided a neat visual analogue of the sensitive dependence on initial conditions that Lorenz would discover in the atmosphere a few years later”; “Smale's horseshoe stood as the first of many new geometrical shapes that gave mathematicians and physicists a new intuition about the possibilities of motion. In some ways, it was too artificial to be useful. …”
But the only mention of mathematics in this discussion is “mathematics aside”! Thus the reader learns that Smale had a new “model” (of what?), but it was merely an “analogue” of what Lorenz would actually “discover.” The horseshoe is “an enduring image,” but it was “artificial”; the only reason it is famous is that it caused a “paradigm shift.”
This is profoundly misleading. It conveys little of what Smale accomplished, what its importance is, why it has been so influential. The horseshoe is important because Smale proved very interesting things about it, and these theorems tell us important facts about many other dynamical systems, facts that could not have been found by intuition, simulation, or experiment. In these theorems and facts lies the importance of Smale's work. The “paradigm shift” was due chiefly to them, not merely to a “new kind of model,” “a neat visual analogue,” or a “new geometrical shape” that was “too artificial to be useful.”
Smale proved the following things about the horseshoe map:
1. He proved it is chaotic. For example, arbitrarily near any point there are periodic points with arbitrarily high periods, and there is also a point whose orbit comes arbitrarily close to every other point—a precise and extreme form of sensitive dependence of long-term behavior on initial conditions.
2. He proved the horseshoe is structurally stable. This is not something that can be discovered or verified from calculations or simulations.
3. He proved that any system having a “transverse homoclinic orbit” (a concept due to Poincaré) must contain a horseshoe as a subsystem; such systems are thereby proved chaotic. This is a profound result because in many systems coming from physics, biology, etc., it is comparatively easy to demonstrate the existence of such orbits. This result has been used many times to prove that horseshoes are embedded in many particular systems, thus revealing their chaotic nature.
4. He proved that the chaotic dynamics of the horseshoe is isomorphic to the dynamics of the shift map in the space of bi-infinite sequences of zeroes and ones (so-called “symbolic dynamics”). Now, the dynamics of the shift map are quite transparent; for example, it is easy to see that periodic orbits are dense. Smale's isomorphism immediately opened up the possibility of similarly analyzing other chaotic systems. A great deal of research has since been done on this, greatly enlarging our understanding of chaotic dynamics.
5. He proved horseshoes exist in all dimensions greater than or equal to 2. Therefore, structurally stable chaotic systems exist in great abundance in those dimensions (for flows one more dimension is needed). This was new and striking information. Earlier examples of chaotic systems, such as Birkhoff's, were strictly 3-dimensional.
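The symbolic dynamics of item 4 can be made concrete in a few lines of code. The sketch below is a toy illustration, not Smale's construction: a periodic bi-infinite sequence of 0s and 1s is represented by its repeating block, and one application of the horseshoe map corresponds to shifting the sequence one place.

```python
# Toy illustration of the shift map of item 4 (not Smale's proof):
# a periodic symbol sequence is represented by its repeating block,
# and one step of the dynamics rotates the block one place.

def shift(block):
    """One application of the shift map on a periodic symbol sequence."""
    return block[1:] + block[:1]

# A primitive block of length n encodes a periodic orbit of period n.
# Such blocks exist for every n, so the horseshoe has periodic orbits
# of arbitrarily high period, as item 1 asserts.
block = "0110100"        # an arbitrary period-7 symbol block
orbit = [block]
s = shift(block)
while s != block:
    orbit.append(s)
    s = shift(s)
print(len(orbit))        # 7: the orbit closes up after seven shifts
```

Density of periodic points, also asserted in item 1, is just as transparent in symbols: any finite window of a sequence can be extended to a periodic sequence agreeing with it on that window.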
These discoveries and the subsequent work they inspired gave strong impetus to the now prevalent belief that chaos is a common phenomenon. They are different in kind from the simulations of Lorenz and others, have had a different kind of influence, and have led to different insights. Today finding horseshoes is one of the chief ways of analyzing the dynamics of a chaotic system and one of the very few ways of rigorously demonstrating chaos. Far from being merely an artificial image, the horseshoe is a natural source of chaos and a basic tool for investigating it in all fields of applied dynamics.
Many of the new insights into nonlinear dynamics arising from the work of many mathematicians would probably not have been discovered (or appreciated, if they had been discovered) without their rigorous mathematical foundations. Moreover, these discoveries are not isolated facts; many of them are parts of extensive theories that have been shown to apply to many fields of science. In fact, they have also been used in reverse to obtain new theorems about nonchaotic systems. In contrast, the insight provided by a simulation such as Lorenz's, no matter how striking, does not extend very far to other systems.
I do not mean to denigrate the importance and influence of simulations and intuitions such as that of Lorenz and many others. The chaotic phenomena they have discovered are important, seemingly quite different from horseshoes, and apparently less tractable to rigorous analysis.
Curiously, the horseshoe has sharpened appreciation of the Lorenz system, because the latter seems highly resistant to a rigorous analysis. I believe that only recently has it been proved that it has periodic orbits of arbitrarily high period. The horseshoe is a tame kind of chaos; compared to it the Lorenz system seems quite wild.
I've discussed Smale's work because I know it well and it's extremely important. But there are many other examples of rigorous mathematics leading to insights about chaos. For example, Birkhoff proved what Poincaré had conjectured: the existence in the restricted planar 3-body problem of infinitely many periodic orbits. Gleick mentions many names, but again he fails to distinguish between computer simulation (e.g., by Ulam) and rigorous mathematics (e.g., by Kolmogorov).
Both Poincaré and Birkhoff deserve much more space than Gleick gives them. And why isn't poor Levinson's name revealed on page 48? After all, he sent Smale the letter with the “robust and strange” example, with “stability and chaos together”; following which, Smale's disbelief “slowly melted away.” The discoverer of such an important example deserves credit. He's mentioned without explanation on page 182, but he is not indexed.
Another difficulty with this influential book, as Franks emphasized, is that Gleick hyperbolizes chaos into “a way of doing science,” “a method, a canon of beliefs” (p. 38), something with “universal laws” (p. 304). What does this mean? What are these laws?
A remark of René Thom is pertinent here: “It is to be expected that after the present initial period of wordplay, people will realize that the term ‘chaos’ has in itself very little explanatory power, as the invariants associated with the present theory—Lyapunov exponents, Hausdorff dimension, Kolmogorov-Sinai entropy—show little robustness in the presence of noise” (Behavioral and Brain Sciences, 10 (2) (June 1987), 182).
Thom was criticizing the hypothesis of chaotic dynamics in a biological model. Indeed, I have observed many biologists looking to chaos as a magic key to unlock nature's secrets. Inquiry revealed that in fact they had a superficial knowledge of nonlinear dynamics and a respect for its difficulties. They had the impression, however, that there is a “theory of chaos” that would somehow solve their mathematical problems. I told them that a chaotic system is really bad news—if your system seems chaotic, perhaps it is, but you probably can neither prove nor disprove this, nor can you accurately simulate its long-term behavior. In any case, to call a system chaotic is not to say much more than that it is nonlinear and complicated.
This was not a welcome message. After all, they had heard wonderful things about the wonderful world of chaos. Now Gleick tells them that chaos is indeed a “method,” with “universal laws.” Similar hype prejudiced a generation of biologists against the truly original ideas in Thom's catastrophe theory. Nonlinear dynamics is in no danger of a similar fate; but exaggerated claims may send some scientists on a wild goose chase, while discouraging others from seeking the valuable insights that dynamics can offer.
The foregoing complaints are of interest only to mathematicians. The defects I ascribe to the book will not interfere with anyone else's pleasure in reading it. Let me emphasize that, despite my criticisms, I think Gleick's book is really first rate. Anyone who has tried to explain mathematical concepts to nonscientists will appreciate how remarkable is its exposition of difficult ideas. But I wish he had given less publicity to the nonexistent science of chaos and more to rigorous mathematics.
Gale H. Carrithers Jr. (review date Winter 1990)
SOURCE: A review of Chaos, in Southern Humanities Review, Vol. XXIV, No. 1, Winter, 1990, pp. 75–77.
[In the following review, Carrithers offers a positive assessment of Chaos.]
In the Prologue to this fascinating account, [Chaos,] Gleick attributes to Joseph Ford this characteristic claim of the “chaos movement”: “Relativity eliminated the Newtonian illusion of absolute space and time; quantum theory eliminated the Newtonian dream of a controllable measurement process; and chaos eliminates the Laplacian fantasy of deterministic predictability.” James Gleick adds that “of the three, the revolution in chaos applies to the universe we see and touch, to objects at human scale.” Gleick, since 1978 an editor and reporter at the New York Times, chronicles the story.
The first chapter features three papers by Edward N. Lorenz published in 1963–64. It was he who showed mathematically and graphically that long-range weather prediction is inherently impossible (rather than merely difficult), and who later gave us in 1979 “the butterfly effect,” as in “Predictability: Does the Flap of a Butterfly's Wings in Brazil Set Off a Tornado in Texas?” Yes; at least it may; though it might not.
The following ten chapters are in several ways a cautionary tale—of scholarly investigators in many lines of research historically and institutionally defined as disparate, needing to know about one another's works, and only very slowly (one may feel) making those longer-range connections, while meeting discouragement at the short range. This might be said to be the account of a meta-science coming into being. That meta-science defies “accepted ways of working in science,” which Gleick construes as focussing on linearity, and repeatable results within what some chaos scientists deem reductive limits; and it “makes strong claims about the universal behavior of complexity,” the “global nature of systems” featuring “orderly disorder created by simple processes.”
For example, Stephen Smale, a topologist, developed the “horseshoe” transformation for mathematically understanding the chaotic properties of dynamic systems as stretching, squeezing, folding—like taffy on the arms of a mechanical taffy-puller. Apart from contributing to the rapprochement in the 1960s of mathematicians and physicists (who had “simply despised each other” for three decades, according to Ralph Abraham), this line of work fostered computer modeling. A development of that: a graphic by Philip Marcus effectively modeled and explained the dynamic stability of the weather system which we have known as the Great Red Spot of Jupiter.
Gleick describes the confluence of working modes of ecological biologists with the making of the new science of chaos, and interprets it in a proportion worthy of a metaphysical poet: “The hitherto received mathematics of ecology is to the mathematics of Steve Smale what the Ten Commandments are to the Talmud.” The story is partly of a change of heart: from the belief that mathematically stable models were the interesting ones, to fascination with the relevance of mathematically unstable models. The story has something for everyone, including the textual editor. Smale circulated second-generation photocopies of Lorenz's paper on “Deterministic Nonperiodic Flow” sent to him by James Yorke in 1972, with the latter's return address on it. Yorke, rather than Smale, became known as the promulgator of Lorenz. But of course the crucial point was the growing recognition of the pervasiveness, in a large sense the regularity in nature, of sensitive and non-linear dependence on initial conditions. It meant that “nature” must be, in Haldane's words, not only queerer than we suppose but queerer than we can suppose.
Gleick might well have used Haldane's words, or the Russian formalist term defamiliarization, in describing a certain movie. Frank Hoppensteadt made a movie by plotting the nonlinear logistic equation (x → rx[1 − x]) for each of a thousand different values of its parameter, on the graphic display screen of a very powerful computer. As the parameter increases, chaotic unpredictability develops, giving way to “fleeting bits of periodic behavior,” which on the movie of the computer screen resemble “flying through an alien landscape.”
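What the camera recorded can be imitated numerically. The sketch below is an illustrative reconstruction, not Hoppensteadt's program; the parameter values 3.2, 3.5, and 3.9 are my own choices, picked to land in the periodic and chaotic regimes of the logistic map.

```python
# Illustrative sketch (not Hoppensteadt's program): iterate the logistic
# map x -> r*x*(1 - x) past its transient, then collect the distinct
# states the orbit settles into.

def attractor(r, x0=0.5, transient=1000, keep=64):
    """Return the distinct long-run states of the logistic map at parameter r."""
    x = x0
    for _ in range(transient):       # discard the transient
        x = r * x * (1 - x)
    seen = set()
    for _ in range(keep):            # sample the settled behavior
        x = r * x * (1 - x)
        seen.add(round(x, 6))
    return sorted(seen)

print(len(attractor(3.2)))   # 2: a stable period-2 cycle
print(len(attractor(3.5)))   # 4: period-4, one doubling later
print(len(attractor(3.9)))   # dozens of values: the chaotic regime
```

As r climbs through the period-doubling cascade toward chaos, the count of distinct states doubles and then explodes, which is the “alien landscape” the movie flies through.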
The landscape of chaos science's nature invites celebration by a Gerard Manley Hopkins. It is more than dappled, “a geometry of the pitted, pocked, and broken up, the twisted, tangled, and intertwined,” in which, for example, the degrees of roughness or irregularity of a computer-generated coastline may look the same no matter how much the image is magnified. Such is the fractal geometry pioneered by Benoit Mandelbrot, and illustrated in the eerily beautiful picture section. Such is the geometry of “strange attractors,” that can look like stylized owl eyes or butterfly wings, “an infinitely long line in a finite area,” modelling a certain turbulence as a trajectory in phase space.
“The Ice Ages,” we learn, “may simply be a byproduct of chaos.” Sensitive dependence on initial conditions, and multiple scaling patterns recursively dependent, or, as one might uneasily say, self-referential, could well mean that climate, like nature, is not only a culture-bound notion but a very parochial one. That is a little more than Gleick says, but he describes Mitchell Feigenbaum developing a “universal theory” of different real-world systems moving from orderly to turbulent in ways not just analogously but measurably the same. And his breakthrough, “so original and so unexpected,” was rejected by journals for two years. Yet the theory gained dissemination anyway in “the way most science is now disseminated—through lectures and pre-prints,” for him in 1976 and after.
Gleick's book, then, is a story of science itself as more complexly a matter of process than the myth of taxonomy has held, and of nature and very numbers as more a process than mythology has held (at least complex numbers). “When a geometer iterates an equation instead of solving it, the equation becomes a process instead of a description, dynamic instead of static.”
Gleick notes that “To see an image on a [computer] graphics screen does not guarantee its existence in the language of theorem and proof.” But, he adds in the kind of observation which is one of his strengths, “the numerical power of computation and the visual cues of intuition would suggest promising avenues and spare the mathematician blind alleys” in ecological shifts, in heart-beat fibrillation, and such. Intuition, play, and the spirit of competitive gaming (especially games played against the self or in some sense against the “house”) have all contributed enormously to the bizarre flowering of chaos science, itself an exotic design with its exoticism seeming to repeat at every scale. Is it mathematical or poetic justice that computer-generated fractal landscapes have proven “phenomenally realistic … in special effects for movies”? Norman Packard is quoted as remarking that “the phenomenon of chaos could have been discovered long, long ago. It wasn't, in part because this huge body of work on the dynamics of regular motion didn't lead in that direction. But if you just look, there it is.”
Look playfully? To repeat, this is in more than one sense a cautionary tale, not only of the need for communities of connection, but (overlapping that) the need for playfully imaginative viewing. “Only the most naïve scientist believes that the perfect model is the one that perfectly represents reality. Such a model would have the same drawbacks as a map as large and detailed as the city it represents. …” This book about chaos “in the new sense: orderly disorder created by simple processes,” is likewise a book about some scientists, and sectors of their disciplines, accommodating the Kuhnian notion that “You don't see something until you have the right metaphor to let you perceive it.” Is that accommodation, that boundary of interaction, a psycho-perceptual fractal boundary, with similar interfoldings at every scale? Is fractal boundary the best metaphor for perceiving interaction in a small organization between, say, professionalism and cronyism?
The question is meant to suggest something of the teasing suggestiveness of Gleick's story. Certainly he insists on a great deal. He summarizes by sketching an analytic, always already at least incipiently reductive old science in which “Simple systems behave in simple ways. … Complex behavior implies complex causes. … Different systems behave differently.” And this he contrasts with the nature, chaos-scientific, wherein “Simple systems give rise to complex behavior. Complex systems give rise to simple behavior. And most important, the laws of complexity hold universally, caring not at all for the details of a system's constituent atoms.”
Strong words. Do I hear Haldane whisper “queerer than we can suppose”? As at large, so occasionally and inevitably in small: we find assertions or predications which may seem unearned. There is a slippery use of Gustave Mahler at one point, a skewed remark about the eighteenth century by John Fowles uncritically invoked at another. To say the heart's rhythms “so precisely [measure] the difference between life and death,” to say fibrillation is a disorder “just as mental disorders … are disorders,” to speak of “ideas, emotions, and all the other artifacts of consciousness” (my emphases) seems to reduce or even beg some questions. But those questions deserve their own books. This book is delectable good news for anyone who rejoices at the mind extending its claims on the unintelligible, and anyone who believes that the most important thing coming out of the investigation arena is the investigator.
Marek Kohn (review date 30 October 1992)
SOURCE: “Wild at Heart,” in New Statesman & Society, October 30, 1992, p. 39.
[In the following review, Kohn praises Genius as “a formidable work of scientific biography,” but notes that Gleick's “guardedness” inhibits his ability to humanize the portrayal of Richard Feynman.]
After winning his Nobel Prize, Richard Feynman was dogged by the fact that he did not get it for something readily identifiable, like inventing the transistor or discovering penicillin. He was grateful to the reporter who suggested he tell inquirers, “Listen, buddy, if I could tell you in a minute what I did, it wouldn't be worth the Nobel Prize.” It was a line tailor-made for the maverick physicist; demotic, mandarin and abrupt.
In the absence of any neat association with a device or a theory, Feynman gathered renown simply as a genius. His way of thinking and speaking, scornful of protocol, won him an entranced following. But despite the celebrated bongo-playing and the long grey locks of his later years, he was a man neither of the counterculture nor of the people. He voted for Eisenhower in the 1950s and steered clear of political issues in subsequent decades. Although magnificently charismatic, he offered the lay public no glib visions, scientific or mystical.
Feynman did, however, seduce it with a mythology of himself. No scientist ever embraced anecdote so wholeheartedly, nor used it so skilfully. Fundamental to the canon were those concerning the wisdom imparted to him by his father, a struggling small businessman with an incandescent passion for scientific thought. The lesson of one was that you could know the name of a bird in half a dozen languages, but you would know nothing about the bird. A secondary function of this story was as a dig at Feynman's rival, Murray Gell-Mann, a keen bird-watcher.
In another tale, the young Richard asked his father why, when he pulled his toy wagon forward, the ball in it rolled to the back. Nobody knew, Feynman senior replied. Although the phenomenon had been named inertia and the laws governing it determined, nobody really knows why a body tends to resist a change of motion. As a scientist, Richard Feynman retained a sense of radical uncertainty and of the provisional nature of all theory. He never looked forward to a grand theory of everything.
Nor did his own career tie up into a coherent whole. There was quantum electrodynamics; there were his dizzyingly brilliant physics lectures at Caltech; there was even an excursion into genetics. And there was Los Alamos, where he was one of the cadre that made the Bomb. There, as James Gleick observes, they did science by the seats of their pants. Pragmatism and working approximations resulted in the Trinity mushroom over New Mexico. In the long run, approximations to safety may conceivably have led to Feynman's two rare cancers, the second of which killed him.
Gleick's narrative [Genius], consistently measured and elegant, is a formidable work of scientific biography. His very guardedness, however, keeps him at a distance from his human subject. Having analysed both the nature of Feynman's self-mythologising and the general mythology of genius—yes, he unceremoniously deconstructs his own title—Gleick seems reluctant to risk turning his subject into a character. Feynman's faults are indicated, most notably in his callous attitude to womanising, and so are his charms, but the synthesis of a personality from these details is far more rudimentary than that of a scientific presence.
It was apt that Feynman's swan-song was Newtonian. As a member of the commission investigating the Challenger disaster, during his final illness, he demonstrated the fatal weakness of the space shuttle's O-rings by dunking a sample of the rubber in iced water. To the last, he reminded the world that it is governed by ordinary physics.
Gleick links Feynman to the theme of his previous book, Chaos, by emphasising Feynman's scepticism about invoking mysterious quantum effects to explain complicated phenomena like the weather or the mind. Feynman realised that tiny ordinary events could have massive consequences. He believed that nature is not weird, just wild.
Christopher Potter (review date 31 October 1992)
SOURCE: “The Pig Who Abolished the Future,” in Spectator, October 31, 1992, pp. 36–37.
[In the following review, Potter offers a positive assessment of Genius.]
Despite the recent success of perhaps half a dozen popular science books which might be said to fulfil the expectations of C. P. Snow's projected ‘Third Culture,’ it is probable that most readers even of The Spectator will have only the haziest notion of who Richard Feynman (pronounced Fineman) was. And yet he was undoubtedly one of the greatest physicists (or is it mathematicians?—modern physics is so theoretical it is hard to distinguish between the two) of the second half of the 20th century and, excepting Einstein, certainly a match for the rich trove of physicists of the first half: Heisenberg, Dirac, Pauli, Bohr et al. But what immediately separates the genius of Feynman from that of his peers is his obvious lack of cultural roundedness. He managed to get into Princeton with a perfect result in physics and the lowest score ever in history and English. He was also, as many intelligent and sensitive children are, desperately unsporty—which he compensated for by assuming an unpleasantly exaggerated machismo. In the late 1960s, when Feynman was in his early fifties, he was the subject of a protest march, one placard suggesting his middle initial P stood for Pig. Feynman was certainly promiscuous and may well have been psychologically damaged by the death from tuberculosis of his young first wife Arline; but—perhaps it comes as a relief—the psychological make-up of the man is not much explored in this biography.
Like many an odd fish before him, he survived his early years by playing the fool. It was a role he was to refine throughout his life. Even on his deathbed he couldn't resist a joke. The Los Angeles Times sent him a proof of his obituary which he returned with a note saying that he had not read it because he did not want to spoil the surprise. He encouraged and exaggerated stories about himself, though it is true that his second wife divorced him on the grounds that he played the bongo-drums incessantly and did calculus in bed. His polished anecdotes were eventually turned into a highly idiosyncratic autobiography, Surely You're Joking, Mr Feynman!, which, to his frustration, was to overshadow any account of his real achievements as a scientist. James Gleick (pronounced Glick) has made an excellent job [in Genius] of redressing the balance and redirecting us to the products of Feynman's genius.
Publicly, Feynman is best known for the part he played in the Challenger disaster enquiry. The expected whitewash and affirmation of NASA's continuing programme of research was not forthcoming thanks mainly to Feynman's memorable and public demonstration of the failure of the rocket's sealing rings, using nothing more than a piece of rubber and a glass of iced water. As one of Feynman's colleagues put it:
The public saw with their own eyes how science is done, how a great scientist thinks with his hands, how nature gives a clear answer when a scientist asks a clear question.
In Feynman's view the great discoveries in science tend to be made by the young because they don't know enough. Too much learning is a dangerous thing. Know too much and every new idea can be too easily dismissed. He only read enough of the work of other scientists to understand the problem; he would then derive his own solution. This process of keeping his mind unclogged by the fashionable enabled him to continue to be innovatory throughout his life, long after most of his contemporaries had done their best work. His other great ability was really to get inside a problem: ‘If I were an electron …’ was a typical Feynman locution.
Feynman won the Nobel Prize for, in his own modest view, sweeping a longstanding problem in quantum physics under the carpet. He found a way of getting the quantum physical equations to give the ‘right’ answers by ‘ignoring’ the bits of the equations that gave rise to troublesome infinities. However, the so-called ‘Feynman diagrams’ may prove to be his greatest contribution to science, a computational device particularly useful to quantum physicists. Feynman made the simple but imaginative leap of considering all of a particle's history at once. In the quantum world it could no longer be said that the future exists and the past has passed.
Whether or not this is the way the world is in reality was not the sort of question Feynman was prepared to answer. For him, the laws of nature were to be constructed, not discovered. As science becomes more theoretical, reality necessarily becomes shadowier. He was not convinced by scientists like Stephen Hawking who believe in the ultimate simplicity of the universe. His universe was like an infinitely layered onion. The more layers that are peeled off the more interesting the universe seems. Our understanding goes deeper, but ultimately the universe, like God, remains unknowable and mysterious. ‘I have approximate answers and possible beliefs and different degrees of certainty about different things’ was as close as he would get to a credo.
Freeman Dyson (review date November 1992)
SOURCE: “Doubt as the Essence of Knowing: The Genius of Richard Feynman,” in Physics Today, Vol. 45, No. 11, November, 1992, p. 87.
[In the following review, Dyson offers a positive assessment of Genius.]
Six years ago Richard Rhodes published his historical study, The Making of the Atomic Bomb. Like most of my friends, I thought the last thing the world needed was another fat book about the atomic bomb. But it turned out that Rhodes had done his homework and gone back to primary sources; he discovered a wealth of new facts that the earlier books had missed. I was forced to reverse my initial judgment. After all, the world did need a comprehensive and reliable history of the atomic bomb, and Rhodes had supplied it.
My initial reaction to James Gleick's new book [Genius] was the same. After three books by Ralph Leighton and all the other published reminiscences of Feynman's life and work, who needs another book about Feynman? And again, after reading the book, I changed my mind. Like Rhodes, Gleick has made an extraordinarily thorough investigation of primary sources. He has interviewed a multitude of people who were involved with Feynman at every period of his life from beginning to end, including family, childhood friends, colleagues, students, government officials and medical doctors. He has had access to early personal notes and correspondence that amplify and sometimes correct the stories Feynman remembered many years later. Gleick has assembled out of this material a portrait of Feynman far more complete and authentic than any of the earlier accounts. Although I am a longtime friend and admirer of Feynman, I feel that I know him better after reading this book than I did before.
In a brief review I can mention only two aspects of the book that I found particularly illuminating. First, the picture of Feynman's family background and childhood in Far Rockaway is grittier and bleaker than the idyllic stories that Feynman told later. Feynman remembered his father as a kindly philosopher whose profound words shaped Feynman's own way of understanding the natural world around him. In reality, Feynman's father was a harassed and unsuccessful businessman who was forced to travel to earn a living and had little time left over for his children. The fact that Feynman could create a legend of the philosopher-father out of such a meagre reality is an important clue to understanding his character. One of the most valuable lessons I learned as a student of Feynman was a rule that he applied whenever a question about priority of discovery threatened to arise. Feynman's rule was to “always give them more credit than they deserve.” Feynman consistently applied this rule throughout his life, and it saved him from many time-consuming irritations. It now appears that, consciously or unconsciously, he applied the same rule to his father.
The other part of the book that I found most novel and informative was the discussion of Feynman's view of scientific explanation. This view appears in a section with the title “The Explorers and the Tourists.” In Feynman's view, scientists are explorers and philosophers are tourists. The tourists like to find everything tidy; the explorers take Nature as they find her. Feynman did not agree with the view prevalent among philosophers that the purpose of science is to reduce natural phenomena to a few fundamental laws. He believed that all natural phenomena are worth exploring and explaining, whether or not the explanation turns out to be fundamental. Gleick sums up Feynman's scientific credo in one sentence, the clearest statement I have seen of the true spirit of science: “He believed in the primacy of doubt, not as a blemish upon our ability to know but as the essence of knowing.”
Thomas A. Bass (review date 1 November 1992)
SOURCE: “Casanova of the Mind,” in Los Angeles Times Book Review, November 1, 1992, pp. 1, 8.
[In the following review, Bass offers a positive assessment of Genius.]
Save for the beatified Einstein, few physicists have become famous. Robert Oppenheimer or Werner Heisenberg might be exceptions. But how many other physicists can we pick out of the serried ranks?
Maybe one other—Richard Feynman, who won the requisite Nobel Prize and taught for many years at Cal Tech before becoming famous, first as a popular author and then as a member of the panel examining the Challenger space shuttle disaster in 1986.
Feynman was famous, but perhaps for the wrong reasons, as James Gleick asserts in his new biography of the scientist. Before his death from cancer in 1988, Feynman's fame had become the mask of a carefully cultivated persona, argues Gleick, a persona that actually obscured the man's real accomplishments.
It is Gleick's aspiration to use Feynman's life as a window into the history of modern physics, our “modern secular religion,” as Gleick calls it. He has performed a monumental task of sifting through Feynman's papers and interviewing many of the important figures in his life. The book [Genius] is ambitious and thorough, but Gleick has a tough assignment when he follows Feynman in retelling stories that the scientist himself had already narrated. Feynman—or actually Feynman and his collaborators, since he never really wrote any of the books that bear his name—did a brilliant job of bringing his raucous Broadway patter to the page.
The Feynman myth began expanding outward to become part of our national heritage in 1985, when he published a collection of autobiographical squibs called Surely You're Joking, Mr. Feynman! Assembled by the son of one of his colleagues, this as-told-to book became a bestseller. It was followed after Feynman's death by another grab bag of stories titled What Do You Care What Other People Think?, which also became a bestseller.
Not since James Watson's Double Helix—a book greatly admired by Feynman—had a scientist exposed himself so brazenly to the public. While Watson admitted that scientists have egos, Feynman confessed that they also have ids. This “poetic Casanova,” as Gleick calls him, put science next to sex, where it belongs in alphabetical order. His books are full of brainy pranks and skirt-chasing honed to a science of its own.
The Feynman myth charts the following trajectory: A gifted Jewish kid from Far Rockaway—a genius, some might say—after a meteoric ascent through MIT and Princeton, arrives at Los Alamos as one of the youngest in a select group of bomb builders. In an age before machines could do this kind of work, he was the fastest computer in the West.
He learned how to break into safes, pick up showgirls in Las Vegas and play the conga drums in Brazil, while computing his way to the forefront of theoretical physics, where his intuitive, informal style and the course of studies gathered by his friends into The Feynman Lectures became the model for younger physicists to emulate.
“Most physicists are nerds,” says a practicing friend of mine, “but Feynman was not a nerd. He was articulate, charismatic, funny, heroic. He was the person we all wanted to be.”
Aside from his books, it was television that made Feynman a national folk hero. When President Reagan appointed a NASA-friendly panel of experts to examine the Challenger disaster, the only uncompromised member of the group was Richard Feynman. He would become famous as the man who demonstrated the cause for the disaster, and in the process revealed the back-scratching chain of command and mess of old-boy contracts that made the shuttle a technological kludge ripe for disaster.
Feynman's experiment was replayed for weeks on the nightly news: The professor pries a rubber gasket out of a model of the shuttle, squeezes it with a C clamp, and dunks the apparatus into a glass of ice water. In a few seconds he has found the “rat” the nation was looking for, an inelastic O-ring with no tolerance for cold temperatures. On prime-time TV, 250 million people were being instructed on what physics is good for: tinkering up the solution to a problem; reproducing with known variables a previously unexplained event.
“I have tried not to lean on them too heavily,” Gleick says of these and other stories narrated in Feynman's books—stories that Gleick claims are “mostly accurate, but strongly filtered.” So what did Feynman leave out in the telling? Behind the safecracker, conga drummer and practitioner of what he called “aggressive dopiness” lay a scientist troubled by his engagement in building the atomic bomb and someone whose life had more shadows and conflict than he ever let on.
The most poignant example of this darker Feynman is a letter he wrote to his first wife two years after she died of tuberculosis. “You, dead, are so much better than anyone else alive,” he wrote to his beloved childhood companion, before ending with, “P.S. Please excuse my not mailing this—but I don't know your new address.”
“That he had written such a letter to a woman he loved, two years after her death, could never become part of the iconography of Feynman, the collection of stories and images that was already beginning to follow him about,” writes Gleick. “The Feynman who could be wracked by strong emotion, the man stung by shyness, insecurity, anger, worry, or grief—no one got close enough anymore to see him.” Instead, his life got transposed into stories “in which Feynman was an inadvertent boy hero” mastering the world “by virtue of his naivete … commonsense cleverness … and emperor's-new-clothes honesty.”
Feynman is the perfect subject, and Gleick's biography succeeds in opening a grand perspective onto the history of quantum mechanics and particle physics during the half-century in which they came to dominate science in the United States. As the bomb builders gathered at Los Alamos, writes Gleick, “a final valedictory was being written to the Protestant, gentlemanly, leisure class structure of American science,” which would come to be dominated in the postwar years by increasingly expensive accelerators and other experimental toys.
Today the big-ticket items are harder to finance, and many of the best graduate students, including Feynman's son, have gone into computers or molecular biology or new areas in physics, such as chaos theory, which was the subject of Gleick's last book. In this context—and in spite of his biographer's claims—Feynman was more a marker at the end of an era than a signpost leading into the next.
He got “the Swedish prize,” as his colleague and arch-rival Murray Gell-Mann puckishly calls it, for independently inventing in the 1940s, with Schwinger and Tomonaga, the theory of quantum electrodynamics. This was the last big development in the field, a grand finale to 20 years of discoveries in quantum physics. But perhaps the final word on the subject is Gell-Mann's, whose quarks are the last subatomic particles—or metaphors—to be squeezed into the model.
As its title indicates, Genius spends a lot of time discussing the subject of genius and speculating on the reasons for our current shortfall in producing geniuses. One suspects the problem lies not in the diminished capacity of our age, but in the fact that this 19th-Century concept doesn't quite fit the specialized, speedy, cooperative nature of science today.
Feynman himself debunked any claims to genius. He barely squeaked into graduate school at Princeton with some of the lowest GREs in English that the school had ever admitted. His second wife thought she was married to “an uncultured man with a Ph.D.” “We are not that much smarter than each other,” was his own evaluation of the subject.
After Feynman, the next modern scientist to grasp fame's golden ring is Stephen Hawking. But where Hawking's ideas partake of a kind of Church of England domesticity, Feynman's are infused with tonic doubt. “We may now be near the end of the search for the ultimate laws of nature,” intones Hawking. Feynman's rejoinder (borrowed from another occasion): “I've had a lifetime of people who believe that the answer is just around the corner.”
“I don't have to know an answer,” said the prophet from Pasadena. “I don't feel frightened by not knowing things, by being lost in a mysterious universe without any purpose, which is the way it really is as far as I can tell.”
“Feynman was a larger-than-life character, a mixture of hero and pain-in-the-ass,” said one of his colleagues. I found myself moved by the final pages of Gleick's book, sorry to see this vital life flicker out in a bon mot. “I'd hate to die twice,” were Feynman's last words. “It's so boring.” For all Feynman's flaws—no, because of his flaws and time-bound thoughts—this is a life worth reading.
S. S. Schweber (review date 26 November 1992)
SOURCE: “From Thought to Expression,” in Nature, November 26, 1992, pp. 375–76.
[In the following review, Schweber offers a positive assessment of Genius.]
Scientific communities usually seek to convey the importance and the history of their discipline through the lives of their outstanding practitioners—individuals whose very creativity renders them unlikely to be the best representatives of their worlds. Physicists have principally chosen theorists as their heroic figures (Newton, Maxwell, Einstein, Planck, Bohr, Dirac, Pauli) or experimenters who left their mark in both experimentation and theory (J. J. Thomson, Rutherford, Fermi), thereby emphasizing the ties of the discipline to the tradition of natural philosophy. The cult idols are the reification of the aspirations and myths of the community and the embodiments of its cherished values. (And just as what is not said can be as revealing as what is, those left out of the hall of fame disclose much about the community.)
Einstein and Bohr, and to a lesser degree Planck and Rutherford, are the heroes from the period before the Second World War. They have come to stand for the advances ushered in by relativity and quantum mechanics. Dirac embodies the ‘genius’ of the quantum-mechanical revolution and is the prophet who pointed the way in the synthesis of relativity and quantum mechanics. And Landau and Feynman have become the heroes of the post-quantum-mechanical era. Both were enormously creative and innovative and made lasting conceptual contributions (a necessary condition for admission to the pantheon). Both encompassed all of physics, and could therefore be taken as representing the unity of the enterprise, resisting the fragmentation of the discipline and the attendant specialization. Although at various times they set and dominated the intellectual agenda of the community, they were often also the main challengers of the ruling orthodoxies. They were thus simultaneously insiders and outsiders. Similarly, both were unusual in their social behaviour—and the community was willing to condone in them conduct that would not be acceptable in society at large. And after the Second World War, during an epoch shaped by the Cold War, polarized by concerns about national security and dominated by military-industrial complexes, both men consciously distanced themselves from the conflict. They could therefore be seen as ‘pure scientists’ transcending national rivalries.
There is so far no satisfactory intellectual and psychological portrait of Landau or an adequate account of his life. But with James Gleick's Genius we now have a superb biography of Feynman. The book is a moving, beautifully written, literate and perceptive account of Feynman's life, which Gleick sets sensitively in the context of the physics community, of the larger culture and of the times.
We get a feel for life in the Jewish community of Far Rockaway during the 1920s. We learn what it was like to be an undergraduate at the Massachusetts Institute of Technology during the 1930s. We become privy to the extensive anti-semitism in the élite American universities at that time; to life at Los Alamos during the Second World War; to the politics of universities after the war; to the dynamics of selecting and receiving Nobel prizes; to the world of NASA stripped of its public-relations gloss, and the behind-the-scenes activities of the presidential commission investigating the Challenger disaster.
As readers of Gleick's earlier book Chaos know, this author has a gift for capturing the atmosphere of a community and for providing sharply focused, perceptive sketches of its principal characters. We get to know Schwinger, Bethe, Weisskopf, Dyson, Gell-Mann and some of the other leading theorists, and, while doing so, we learn a great deal about the sociology of the theoretical physics community of the time and the competitive dynamics that animated it. The book also contains lucid accounts of many of the advances in theoretical physics since the middle of this century—Feynman diagrams, renormalization theory, the explanation of the superfluidity of liquid helium, partons and quarks, the standard model—advances to which Feynman made important, fundamental contributions. Gleick's accomplishment in this area is uncommon because he is not a scientist by training, yet his presentation of the science is accurate and readily accessible to the general reader.
Gleick never met Feynman. His knowledge of his subject stems from interviews with members of Feynman's family, with colleagues and former students, and with friends dating back to Feynman's high school and college days. He read and listened to taped interviews with Feynman that various historians had made earlier, in particular the extensive and remarkable interviews that Charles Weiner recorded in the late 1960s and mid-1970s. He studied videotapes of Feynman's lectures and television programmes, and pored over some 25 cartons of papers, notes and correspondence that Feynman had deposited in the archives of the California Institute of Technology. It is clear that Feynman was unusually gifted, highly original and inventive. But the vivid portrait that Gleick draws would not have been possible without access to the personal letters that Feynman had kept in his house. Gweneth, Feynman's wife, allowed Gleick to read the correspondence between Feynman and his parents, between Feynman and his first wife, Arline, and letters to Feynman from lovers and personal friends.
Feynman's relationship with Arline—his first girlfriend, his first and perhaps his only true love, and his first wife—shaped his subsequent life. Arline was diagnosed as having lymphatic tuberculosis while Feynman was at MIT. Despite the strenuous objections of his parents, the two got married before Feynman went to Los Alamos early in 1943. Throughout his stay there, Arline was in a hospital in Albuquerque and she died shortly before Trinity in 1945. Feynman never recovered from the loss, seemingly arriving at the conviction that he would never again be fulfilled in love. Two years after Arline died, Feynman—the supreme rationalist—wrote her a heart-rending love letter. His subsequent interactions with women were always affected by the scars incurred by the loss—until his marriage to Gweneth, the emptiness and yearning were filled by one-night stands and tempestuous, destructive love affairs. Similarly, in physics he came to accept human limitations: unification and, in particular, a theory of everything were, in his view, fantasies with which the community deluded itself. Physics consisted of a set of algorithms that answered with a high degree of certainty questions of the form: ‘If I do this, what will happen?’ Feynman searched for such algorithms with extraordinary courage and unshakeable integrity.
Gleick is captivated by Feynman's “genius.” (Who wouldn't be?) At times he seems obsessed by the notion of genius and by the search for what constitutes its possession. He fails to find the holy grail, but does make clear that to understand Feynman one has to apprehend his peculiar connection to his social world and his distinctive intellectual relationship to the scientific community in general and the physics community in particular. In intellectual matters Feynman was able to straddle the gulf between self and community. He could adhere to many of the tenets, assumptions, forms of thought and styles of reasoning that characterized the theoretical physics of his day while also transcending their limitations. One aspect of Feynman's genius was that he could make clear what was obscure to most of his contemporaries. His doctoral dissertation and his 1947 Reviews of Modern Physics article that presented his path-integral formulation of nonrelativistic quantum mechanics helped to clarify in a striking manner the assumptions that underlie the usual quantum mechanical description of the dynamics of microscopic entities. And he did this in the very act of transcending the usual formulation with a startling innovation. It may well be that his reformulation of quantum mechanics and his ‘integral over paths’ will turn out to be his most profound and enduring contributions. They have deepened considerably our understanding of quantum mechanics and have greatly extended the systems that can be quantized. And judging from the work of Michael Atiyah and Ed Witten, Feynman's path integral has already substantially enriched mathematics and provided new insights into spaces of infinite dimensions.
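For readers curious about the formulation Schweber singles out, the path-integral prescription can be stated compactly in its standard textbook form (a sketch for orientation, not drawn from the review itself): the quantum amplitude for a particle to travel between two spacetime points is a sum over every conceivable path, each path weighted by a phase determined by its classical action.

```latex
% Propagator from (x_a, t_a) to (x_b, t_b) as a sum over all paths x(t),
% each weighted by the classical action S[x(t)]:
K(x_b, t_b;\, x_a, t_a) \;=\; \int \mathcal{D}x(t)\;
  \exp\!\left(\frac{i}{\hbar}\, S[x(t)]\right),
\qquad
S[x(t)] \;=\; \int_{t_a}^{t_b} L\big(x, \dot{x}, t\big)\, dt
```

Paths far from the classical trajectory tend to cancel by destructive interference, which is why classical mechanics emerges in the limit where the action is large compared with Planck's constant.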
Early on, Feynman also learned to walk the tightrope between his own psychological needs and the requirements of belonging to a community. He came to accept and appreciate the fact that the act of creation was for him also an act of consummate isolation.
The creative act depends on private visions and solitary constructions and always draws on the legacy and the resources of the community, be it in the arts, literature, technology or the sciences. We call people geniuses when their ability to synthesize these communal resources overwhelms us; and if the synthesis happens to result in a startling outcome, as in the case of Feynman, we are amazed and awe-struck. But by creating a category of people we label as geniuses, we are on a slippery slope that may lead us to believe that there is an innate quality attached to the attribution, when instead we should focus on its social and cultural dimensions. Rather than Genius, I would have called this impressive book The Remarkable Life and Science of Richard Feynman.
Alan Lightman (review date 17 December 1992)
SOURCE: “The One and Only,” in New York Review of Books, December 17, 1992, pp. 34–37.
[In the following positive review of Genius, Lightman provides an overview of physicist Richard Feynman's life and career.]
Richard Feynman was the Michael Jordan of physics. His intellectual leaps, seemingly weightless, defied explanation. One of his teammates on the high school math team in Far Rockaway, Long Island, recalls that Feynman “would get this unstudied insight while the problem was still being read out, and his opponents, before they could begin to compute, would see him ostentatiously write down a single number and draw a circle around it. Then he would let out a loud sigh.” At twenty-three, he amazed a Princeton colleague when he worked out at the blackboard a proof of an important proposition of physics that had been only loosely conjectured eight years earlier by the Nobel Prize winner Paul Dirac. In 1960, in his early forties, restless and unable to find a physics problem worth working on, Feynman taught himself enough biology to make an original discovery of how mutations work in genes.
Feynman rarely read the scientific literature. When he did, he would read only far enough into an article to see what the problem was, fold up the journal, and then derive the results on his own. When a colleague, after perhaps months of calculations, walked into Feynman's office with a new result, he would often discover that Feynman already knew not only that result, but a more sweeping one, which he had kept in his file drawer and regarded as not worth publishing. The mathematician Mark Kac has said that “there are two kinds of geniuses, the ordinary and the magicians. An ordinary genius is a fellow that you and I would be just as good as, if we were only many times better.” But for the second kind, “even after we understand what they have done, the process by which they have done it is completely dark. …” He called Feynman “a magician of the highest caliber.”
Scientific genius alone would not have explained Feynman's legend. It was also his style. He was stubborn, irreverent, unrefined, uncultured, proud, playful, intensely curious, and highly original in practically everything he did. He had a mystique. There are hundreds of “Feynman stories,” some told by Feynman himself in his popular book Surely You're Joking, Mr. Feynman, and others passed along by word of mouth from one physicist to another, like beheld visitations passed from one disciple to another. As a graduate student at Princeton, for example, Feynman would spend long afternoons leading ants to a box of sugar suspended by a string, in an attempt to learn how ants communicate. When Feynman noticed that his Ph.D. thesis adviser, John Wheeler, pointedly placed his pocket watch down on a table during their first meeting, Feynman came to their second meeting with a cheap pocket watch of his own and placed it on the table next to Wheeler's. At Los Alamos, when he was working on the Manhattan Project, the young Feynman continually alarmed other scientists and the military brass by cracking their safes, which were filled with atomic secrets.
When he was preparing to accept the Nobel Prize in the presence of the king of Sweden, Feynman worried that it was forbidden to turn one's back on a king; he might, he was told, have to back up a flight of stairs. He then practiced jumping up steps backward, using both feet at once. Feynman hated pomp and authority of all kinds. After being elected to the prestigious and highly selective National Academy of Sciences, he withdrew from the organization, saying that its main function was only to elevate people to its exalted ranks.
There was something almost uncanny about the way Feynman could get to the heart of a question. On February 10, 1986, during the public hearings on the Challenger shuttle disaster, as a member of the committee of inquiry, he performed an experiment of deadly simplicity. He dropped one of the shuttle's O-ring seals into a glass of ice water, the temperature of the air on the day of the launch, and showed that the rubber when squeezed did not stretch back under such cold. I cannot resist telling my own story about Feynman, one of the three professors who conducted the oral examination for my Ph.D. in physics at the California Institute of Technology. Wearing, as usual, a white shirt without a tie, he began the examination by asking me two questions. The first question I answered without much trouble. The second I struggled with. His two questions had precisely marked the limits of my knowledge, like artillery shells fired at a small boat, one landing just short, one long. During the following three hours, he asked no further questions.
James Gleick, the author of the widely read book Chaos, has taken on the difficult task of writing about Feynman [in Genius] as both a scientist and a human being. Gleick never met Feynman. But he has interviewed over a hundred people, including Feynman's family and many of the world's leading physicists; he has read unpublished letters and notes by Feynman and others, talked to a number of Feynman's girlfriends (who remain discreetly unidentified in the book), reviewed documents about Feynman obtained from the FBI and CIA under the Freedom of Information Act as well as hundreds of pages of unpublished interviews by the science historian Charles Weiner. The result is a thorough and masterful portrait of one of the great minds of the century. In describing not only Feynman but the physicists around him, Gleick also succeeds in giving us a rare insight into the scientific community, its values, and its mentality.
Feynman was born in New York on May 11, 1918. His father, Melville, a Jewish immigrant from Minsk, Byelorussia, had a practical, vivid appreciation of science; he once explained to his son Richard that a dinosaur twenty-five feet high with a head six feet across, if standing in the front yard, would almost be able to get his head through the second-floor window. Melville Feynman sold police uniforms and automobile polish, among other ventures, without notable success. Gleick tells us that Feynman's mother, Lucille, had a gift for humor and a love of storytelling.
As a child in Far Rockaway, Feynman tinkered with radio sets, gathering spare parts from around the neighborhood. Many theoretical physicists like Feynman have spent their childhoods building things, but Feynman retained throughout his life an immediate, tactile sense of physical phenomena. Even his mathematical calculations have a certain unfussy and muscular style. Feynman rigged a motor to rock his sister's crib, freeing himself to read the Encyclopaedia Britannica. But he was intimidated by athletics, by stronger boys, and by girls, and was afraid that he would be regarded as an intellectual sissy. Like so many other socially fragile, budding scientists he sought refuge in an intense concentration on math and science, but he was particularly interested in their practical side. His manliness, he saw, lay in his ability to do things with his hands.
Correspondingly, he avoided all pursuits that seemed to him “delicate,” such as poetry, drawing, literature, and music. In fact, Feynman had little respect for the humanities, which he regarded as slippery and inferior to science, and even less respect for humanists. When he was in his early thirties, he wrote that “the theoretical broadening which comes from having many humanities subjects on the campus is offset by the general dopiness of the people who study these things.” Yet Feynman had an appreciation of the workings of human psychology in science. In his brilliant little book The Character of Physical Law he places great value on seeking different formulations of the same physical law, even if they are exactly equivalent mathematically, because different versions bring to mind different mental pictures and thus help in making discoveries. “Psychologically they are different because they are completely unequivalent when you are trying to guess new laws.”
In the fall of 1935, Feynman entered MIT, where he found that virtually everyone else was socially and athletically inept, and obsessed by science. He easily skipped first-year calculus and taught himself quantum mechanics before his sophomore year. He joined a fraternity, one of the two that took in Jews. He met another precocious physics student, T. A. Welton, and together they rederived for their own satisfaction the basic results of quantum physics, which they wrote down in a notebook that they passed back and forth to each other. Feynman briefly read Descartes and decided that philosophy was soft and that philosophers were incompetent logicians. It was in his junior year at MIT that he became engaged to Arline Greenbaum, whom he had met a few years earlier in Far Rockaway, and who became, besides physics, the love of his life.
Where MIT was working-class in tone and unbuttoned in manner, Princeton was patrician and genteel. The afternoon Feynman arrived there as a graduate student, in the fall of 1939, he was invited to a tea with Dean Eisenhart. As he stood uneasily in the suit he hardly ever wore, the dean's wife, a lioness of Princeton society, said to him, “Would you like cream or lemon in your tea, sir?” “Both please,” Feynman blurted out. “Surely you're joking, Mr. Feynman,” said Mrs. Eisenhart, thus supplying the title for the memoir Feynman published fifty years later. Feynman hated people who, he felt, used manners and culture to make him feel small. He became aggressively unrefined.
Feynman and his thesis adviser at Princeton, John Wheeler, worked on the nature of time. Einstein had already shown, at the beginning of the century, that time is not absolute, that the rate at which clocks tick depends on the motion of the observer. But what determines the direction of time? Why is the future so distinct from the past? It was well known that, at the microscopic level, the laws of physics were indifferent to the direction of time; they gave the same results whether time flowed forward or backward. Feynman and Wheeler solved a difficult problem concerning electricity by assuming that, in the way electrons emit radiation, time flows both forward and backward. It seemed a crazy idea, but it was the kind of deep and important crazy idea that caused physicists to skip meals and stay at the blackboard. By the time the young Feynman presented his calculations to a departmental seminar in early 1941, his audience had come to include the great mathematician John von Neumann, the physicist Wolfgang Pauli, who was soon to win the Nobel Prize and was visiting from Zurich, and the sixty-two-year-old Einstein, who seldom came to colloquia. After listening to Feynman's talk, Einstein commented, in his soft voice, that the theory seemed possible.
Around this time, Arline, who had been suffering from fevers and fatigue, was diagnosed as having tuberculosis. She was to spend much of the rest of her short life in sanatoriums. Against his parents’ strong objections, Feynman married her. The only witnesses to the wedding, in a city office on Staten Island, were two strangers.
In 1942, many of the physicists at Princeton began fanning out to work on military projects at, among other places, MIT's Radiation Laboratory (the “Rad Lab”), the University of Chicago, Berkeley, and Oak Ridge, Tennessee. While still at Princeton, Feynman collaborated with Paul Olum and Robert Wilson on a device for culling the fissionable form of uranium from the nonfissionable. This was the beginning of the Manhattan Project. In March of 1943, Feynman and Arline took the train to Los Alamos. Arline entered Presbyterian Sanatorium in Santa Fe while Feynman lived in the barracks at Los Alamos, driving the twenty-five miles over rutted roads to see her every weekend.
At Los Alamos, Hans Bethe, the great nuclear physicist from Cornell, was in charge of all theoretical work. Where Bethe was calm, careful, and professorial, Feynman was quick, fearless, intuitive, and irreverent. Feynman was just what Bethe was looking for. He made the twenty-five-year-old Feynman a group leader, promoting him over older and more senior physicists. Feynman was able to solve a critical problem on how neutrons bounce around among uranium atoms and start a chain reaction.
Arline died in the summer of 1945. Two years later, when he was at Cornell during a frustrating impasse in his theoretical work, the twenty-nine-year-old Feynman wrote a letter to his dead wife, placed it in a box, and never read it again. After Feynman's death, Gleick discovered the letter, which reads in part:
D'Arline,
I adore you, sweetheart.
It is such a terribly long time since I last wrote to you—almost two years but I know you'll excuse me because you understand how I am, stubborn and realistic; & I thought there was no sense to writing.
But now I know my darling wife that it is right to do what I have delayed in doing. … I want to tell you I love you. I want to love you. I always will love you.
I find it hard to understand in my mind what it means to love you after you are dead—but I still want to comfort and take care of you—and I want you to love me and care for me. I want to have problems to discuss with you. …
P.S. Please excuse my not mailing this—but I don't know your new address.
Arline's death was the great tragedy of Feynman's life. Gleick suggests that he never let anyone get close to him again, although he had many affairs, a second brief and unpleasant marriage, and a third, apparently satisfying, one to Gweneth Howarth, with whom he had two children. Gleick also suggests that Feynman may have treated many women as sex objects because he felt no one measured up to his first wife. For the rest of his life, Feynman pursued only beautiful women, some of them the wives of his friends and colleagues, but he had no interest in their intellectual companionship. His attitude toward women is suggested by the conclusion of his Nobel address in 1965:
So what happened to the old theory that I fell in love with as a youth? Well, I would say it's become an old lady, that has very little attractive left in her and the young today will not have their hearts pound when they look at her anymore. But, we can say the best we can for any old woman, that she has been a good mother and she has given birth to some very good children.
Beginning in his late twenties, Feynman started to be followed around by Feynman stories. He heard the stories, polished and embellished them, and retold them. He relished his image as a rough-hewn, philistine hero. Gleick writes that after Arline's death, “The Feynman who could be wracked by strong emotion, the man stung by shyness, insecurity, anger, worry or grief—no one got close enough any more to see him.”
Feynman eventually emerged from his depression at Cornell and, in the late 1940s, did the work that won him the Nobel Prize, showing how electrons interact with electromagnetic radiation—e.g., radio waves—and other charged particles. His theory, called quantum electrodynamics, has been confirmed by experiments to greater accuracy than any other theory of nature. (Quantum electrodynamics predicts that the magnetic strength of the electron is 1.00115965246; the measured value is 1.00115965221.) Quantum electrodynamics explains all electrical and magnetic phenomena, which include everything we experience in daily life except gravity.
Feynman shared his prize with Shin'ichirō Tomonaga of Japan and with Julian Schwinger of the US, who had both independently derived their own formulations of quantum electrodynamics. These alternative formulations were, however, much harder to work with than Feynman's. Schwinger was in many ways the antiparticle of Feynman. He dressed expensively and meticulously, drove a black Cadillac, spoke elegantly in long sentences with subordinate clauses, lectured without notes, and prided himself on arriving at the end of complex mathematical calculations with no dust on his shoes from taking an occasional blind alley.
Feynman made two other major contributions to physics, both worthy of a Nobel Prize. He developed a theoretical explanation for superfluids—fluids that are totally frictionless and that will spontaneously glide over the walls of a beaker and will pass through holes so tiny that even gas could not get through. He also worked out a theory for the weak nuclear force, one of the two kinds of nuclear forces. Both theories were developed at the California Institute of Technology, where he spent the second half of his life. Murray Gell-Mann, Feynman's rival at Caltech, had independently arrived at the weak-force theory, and the department chairman judiciously arranged for both Feynman and Gell-Mann to publish their important work in a joint paper. Like Schwinger, Gell-Mann was very different from Feynman. His interests in science were narrow, but he had broad interests outside science, while Feynman was engrossed with virtually all of science, but with almost nothing outside it.
At Caltech, Feynman became more concerned with education, although he did not have the patience to supervise students preparing theses. In 1961, Caltech decided to revise its physics curriculum and asked Feynman to help. Lecturing at the blackboard to freshmen and later to sophomores, he began with atoms, moving up to larger phenomena like clouds and colors on ponds and down to the smaller, like electrons and the quantum world. Without consulting books, he slowly built up the entire edifice of physics as he understood it, the physical world as he saw it. Soon graduate students and other professors came to listen. Feynman's Caltech lectures eventually became the three-volume Feynman Lectures on Physics, which can be found on the bookshelves of almost every professional physicist in the world. The lectures ultimately failed to accomplish their intended purpose. Apparently simple on the surface, they were in fact deeply sophisticated. But they are a triumph of human thought, and deserve a place in the history of Western culture, along with Aristotle's collected works, Descartes's Principles of Philosophy, and Newton's Principia.
Feynman won the Nobel Prize for his work in quantum electrodynamics, the quantum theory of how electrons interact with radiation and other electrically charged particles. Electrons are the simplest electrical particles. Normally found in the outer parts of atoms, they produce light and other forms of electromagnetic radiation as well as most of the interactions between atoms and molecules. Quantum physics, one of the two pillars of twentieth-century physics along with Einstein's relativity, is the physics of the subatomic world. The theoretical foundations of quantum physics were laid in the 1920s, principally by Erwin Schrödinger, Werner Heisenberg, and Paul Dirac. A basic idea of quantum physics is that particles of matter sometimes behave as if they were in several places at once. This uncertainty about the location of things is negligible for macroscopic objects like people, but it is extremely important for subatomic particles like electrons, where the phenomenon has been repeatedly observed and has immense consequences. Another important idea, also derived from experiment, is that physical quantities like energy are not indefinitely divisible into smaller amounts, but instead have a smallest, indivisible unit, called the quantum (as US currency has a smallest unit, the penny). Both ideas run counter not only to intuition but to the Newtonian theoretical conception of the world before 1900. In order to describe these two basic ideas mathematically, the theories of Schrödinger, Heisenberg, and Dirac had to represent matter and energy not by certainties but by probabilities (or, technically speaking, amplitudes, which are closely related to probabilities). Thus, while in the Newtonian scheme a physical law would show how a particle moves from A to B under the action of a force, in the quantum scheme a physical law would show how the probabilities for a particle to be at various places evolve under the action of a force.
The quantum theory of the 1920s gave a good description of isolated particles, but it did not accurately describe the interactions of particles. Experiments began to turn up small discrepancies in particle behavior. For example, in the strange quantum world, subatomic particles are constantly appearing out of nothing and then disappearing again. Each particle, such as the electron, surrounds itself with a cloud of other, ghost-like particles, called “virtual particles,” which fleetingly come into existence and then slip away into oblivion. Electrons interact with the ghost-like particles around them, and those interactions alter the properties of the electron, such as its mass and electrical charge. In reality, the physicists found, there are no isolated electrons. The quantum ghosts are everywhere. Their shadows have been seen in experiments. When physicists in the late 1930s and early 1940s tried to modify the quantum theory of Schrödinger, Heisenberg, and Dirac so as to accurately describe particle interactions, they ran into technical difficulties with the ghosts. Once the ghosts began popping up in the mathematics, the equations couldn't be solved.
One of the triumphs of Feynman's quantum electrodynamics was that it provided a method for dealing with the ghosts. Roughly speaking, the method involves treating the ghosts as part of the electron. Experiments on electrons do not penetrate inside the cloud of ghost-particles around them; we never observe the “bare” electron at the center of the cloud. What we observe is the electron and its cloud. When the thing we call an electron is redefined to include the virtual particles around it, the technical difficulties go away.
Other scientists in addition to Feynman contributed to this redefinition of the electron (and its subatomic cousins). However, Feynman's own version of quantum electrodynamics had two further, and unique, features. First, it made use of mathematical methods that were much easier to work with than the methods of other versions, particularly the version of Schwinger. This was Feynman at his practical best. Second, Feynman's quantum electrodynamics provided a new picture of the world. In other descriptions of quantum physics, even after certainties are replaced by probabilities, a particle advances from A to B in tiny increments, with forces acting to move the particle (or the probability of the particle) from one increment to the next. But Feynman's mathematical description of quantum electrodynamics is global, not incremental. It considers every possible route from A to B, assigns a single number to the entire route, then adds up the numbers from all the different routes to arrive at the probability of getting from A to B.
The descriptions of other physicists could be compared to observing how a car speeds up and slows every few feet along a highway from New York to Los Angeles, whereas Feynman's description looked only at the total gas consumption for the trip. Furthermore, in Feynman's description, the car travels simultaneously on all routes from New York to Los Angeles, even on such crazy but possible routes as New York to Chicago to Miami to Los Angeles. Such a description leads to a strange picture of the world, where all the different ways in which something can happen are happening, at the same time. What we human beings, grossly insensitive, macroscopic objects that we are, conceive of as a single reality is actually a tapestry of many simultaneous realities. It is ironic that Feynman, who considered philosophy a waste of time, should have come up with ideas philosophically so rich. But all deep theories of nature since Lucretius's atomism have had broad philosophical implications.
As with his previous book, Chaos, which described the new science of nonlinear physical phenomena and the people involved in it, James Gleick brings to Genius high intelligence, a strong sense of narrative, a commitment to thoroughness in ascertaining facts, and excellent prose. Many of his analogies and metaphors are memorable. For example, in describing Feynman's concept of least-time trajectories, he makes an analogy to a lifeguard trying to reach a drowning swimmer. The lifeguard, who begins his rescue mission on the beach, travels faster on land than in water. Therefore, the fastest path to the swimmer is not necessarily along a straight line, which might include a short stretch on land but a long stretch in water. Guided by one of Feynman's articles in the Physical Review, Gleick gives a beautiful and vivid description of how Feynman visualized superfluids while lying in bed one night. What Gleick does most brilliantly is to tell us with honesty and insight who Feynman was.
Where Genius falls short, in my opinion, is in its presentation of Feynman's science. There are too many untranslated technical words, like “matrices,” “spin,” “momentum variables,” and “imaginary numbers.” (“One prescription was to take all the momentum variables and replace them with certain more complicated expressions.”) Perhaps more importantly, some of the scientific concepts are not clearly explained—for example, quantum physics, handedness, the difference between numerical and analytical solutions, space-time diagrams. To be sure, a great deal of science is presented skillfully, but many of the scientific explanations are facile and vague. Reading the science portions, one has the sensation of trying to see through a veil.
Many nonscientists are fascinated and puzzled by the scientific mentality. How does it differ from the thinking of musicians or writers? How do scientists make discoveries? What do scientists mean when they say that a theory or an equation is aesthetically attractive? In my view, few books explain this mentality better than Feynman's own book The Character of Physical Law and the mathematician G. H. Hardy's A Mathematician's Apology. But, of course, the scientific mentality is only a part of the scientist. It does not include his personality, the life he leads, his world. Some of the scientific biographies of recent years try to bring together the mentality and the life; in doing so the scientific biographer, perhaps more than others, faces a formidable challenge, requiring not only the skills of a scholar and writer but also a technical grasp of the science.
How much of the actual scientific work of a great scientist do we have to understand in a scientific biography? The answer is only partly a matter of taste. We cannot understand a genius like Feynman, who spent sixteen hours a day thinking physics, if we do not also appreciate some of his work. Yet this is never enough. Two other recent scientific biographies, Abraham Pais's “Subtle is the Lord,” about Einstein, and Walter Moore's Schrödinger, both splendid books in their own ways, give accurate technical descriptions, but they are not accessible to the general reader, and the accounts of the lives of their subjects lack the richness and power of Genius: The Life and Science of Richard Feynman. Gleick has written a monumental work, a lasting scientific biography. Even the book's shortcomings make one appreciate both the difficulties of the genre and the extent of Gleick's accomplishment.
In February 1988, after a gruesome series of illnesses and complications from cancer, Feynman entered the UCLA Medical Center for the last time. He was sixty-nine. Across the city, on a corner of his blackboard, he had written in chalk, “What I cannot create I do not understand.” As he lay in his hospital bed, with his strength ebbing, Feynman whispered his last words: “I'd hate to die twice. It's so boring.”
Nicholas Wade (review date 7 January 1993)
SOURCE: “Mental Arithmetic,” in London Review of Books, January 7, 1993, p. 17.
[In the following review of Genius, Wade commends Gleick's portrayal of Richard Feynman's character and life, but concludes that the biography fails to illustrate the reasons why Feynman is considered to be a genius.]
Richard Feynman was one of the elite group of American and British physicists who developed atomic weapons with the Manhattan project in the Second World War. He flashed back into the public eye in 1965, when he won a share of the Nobel physics prize, and again two decades later when his formidable presence on the committee inquiring into the crash of the Challenger space shuttle forced the cause of the disaster into the open.
Genius is the attempt by a skilled and elegant science writer, James Gleick, to present the facts of Feynman's life and achievements. Unfortunately, the latter are quite elusive, which is surprising given the mystique that has long surrounded Feynman. His fellow physicists held him in an awe that seems to have transcended his actual achievements, at least to judge from the evidence of this book. That evidence is not easy to weigh, however, because Genius is short on scientific explanation and technical detail.
Feynman burst onto the high-energy physics scene as an enfant terrible, a role he continued to play at all ages. He so impressed his teachers at MIT and Princeton that he was able to transcend all barriers, including the anti-semitism that tainted American universities and industry in the Thirties. ‘Is Feynman Jewish?’ the head of the Princeton physics department wrote in 1939 to the MIT professor who had recommended him as a graduate student. ‘We have no definite rule against Jews but have to keep their proportion in our department reasonably small because of the difficulty in placing them.’
Feynman soon joined the emigration of the best and brightest physicists to General Groves's boot camp at Los Alamos. His chief duty there was to run the computer division at a time before computers were available. His tools were a roomful of Babbage-style mechanical calculators, later supplemented with primitive punched-card machines. As a principal assistant to Hans Bethe, chief of the theoretical physics division, Feynman also worked on many aspects of the bomb's physics. Together they developed an important formula, known as the Bethe-Feynman equation, for calculating the efficiency of a nuclear weapon.
The intense and heady business of group bomb-building was accompanied for Feynman by private tragedy, the slow death of his first wife from tuberculosis. On the days Feynman couldn't visit her in the sanatorium where she was staying in nearby Albuquerque, she wrote him love letters in code designed to baffle the Los Alamos censors. Years later, a rediscovered photo of Arline would move him to speechlessness, and the many women who passed through his life thereafter never assuaged his inner grief.
Feynman left Los Alamos at the age of 27, prematurely wise but riding high on his achievements and the esteem of his peers. ‘He is by all odds the most brilliant young physicist here, and everyone knows this,’ Robert Oppenheimer, the director of the Los Alamos project, wrote to a colleague. Another eminent elder physicist, Eugene Wigner, described him as ‘a second Dirac, only this time human’. The rest of Feynman's career was spent in academic physics, first at Cornell under Bethe and then at the California Institute of Technology. His principal scientific achievement was a contribution to the theory of quantum electrodynamics. Gleick doesn't make clear whether Feynman restated the physical ideas of others in his own mathematical formalism, the widely-known Feynman diagrams, or whether he also helped invent the physics of the theory. It was this work for which he received the Nobel Prize together with Julian Schwinger and Shin'ichiro Tomonaga.
Feynman seems to have spent much of his academic life as a prisoner of his own limitations. He had the admirable habit of working out any physics problem from first principles, but this formidable ability was allied with a profound aversion both to reading the physics literature and to listening to his colleagues. The arrogance that everything worth knowing was ascertainable from his own mind cut Feynman off from developments in his own field for long periods at a time. ‘Unlike many of his colleagues, educated scientists in a cultivated European tradition, Feynman did not look at paintings, did not listen to music, did not read books, even scientific books,’ Gleick writes. He couldn't be bothered to read the scientific literature, even the classic papers of Bohr and Dirac. ‘He refused to let other scientists explain anything to him in detail, often to their immense frustration. He learned anyway.’ The problem was that physics after the Second World War soon developed into a broader body of knowledge than one mind could intuit. Feynman had ‘disregarded so much of the decade's high-energy physics,’ Gleick notes of his work in the Sixties, that it was a long-term project just to catch up. ‘He tried, as always, to read papers only until he understood the issue and then to work out the problem for himself. “I've always taken an attitude that I have only to explain the regularities of nature—I don't have to explain the methods of my friends,” he told a historian during those years.’
His extraordinary intuition about physics, allied to a prodigious ease with the mathematics, should have provided the basis for a long and productive academic career. Yet Feynman seems often to have been at a loss for a suitable problem to work on. He was a lousy teacher, according to Gleick. His Lectures on Physics, intended for undergraduates and an undoubted succès d'estime, were in fact so difficult that most courses quickly dropped them. A similar fate befell Feynman's own classes. One by one the students dropped out, although the audience size remained roughly constant because the faculty was fascinated. His published books, including the Lectures on Physics and the popular Surely You're Joking, Mr Feynman!, were assembled by others on the basis of lecture notes or taped conversations. He wrote articles for the scientific literature only with reluctance. He left no cadre of graduate students because he lacked the patience to train any. As Gleick cryptically notes, Feynman ‘developed a stature among physicists that transcended any raw sum of actual contributions to the field’. On what, then, did his reputation among other physicists depend?
Strangely, this is not a subject that Gleick addresses directly, almost as if having decided to write a book entitled Genius he was reluctant to parade any systematic flaws in his subject. Feynman seems to have impressed his peers by personal intellectual dominance. He was a ‘genius’ at mental arithmetic. At lunch at Los Alamos one day he challenged the table to a competition. He would mentally solve in 60 seconds, to within 10 per cent accuracy, any problem that could be stated in ten seconds. He cracked such puzzlers as ‘Find the tenth coefficient in the expansion of the binomial (1+x) to the 20th power,’ and ran out of time only when a friend who had played the game with him previously posed a question that required calculating π to 101 digits. His skill at mental arithmetic, Gleick notes, ‘did much to establish Feynman's legend’.
So did Feynman himself. He promoted a number of stories designed to show him coming out smarter than anyone else—for example, that he was able to lift atomic secrets by picking the most secure safes at Los Alamos. In fact, he had developed what were essentially magician's tricks for opening safes whose owners had chosen too obvious a combination number. ‘He surrounded himself with a cloud of myth, and he spent a great deal of time and energy generating anecdotes about himself,’ his colleague Murray Gell-Mann announced at Feynman's memorial service, firmly flouting the nil nisi bonum convention. Feynman was also, among other things, a jerk. At his father's funeral, he refused to recite the Kaddish, making his mother break down and cry. He seduced and discarded many women, including the wives of his friends. Presiding over physics colloquia at Caltech, he would reduce visiting speakers to tears. He behaved as if he had never been taught how to.
Why then does Gleick present Feynman as a genius in a book that purports to shed light on the nature of genius? It's hard to escape the suspicion that Gleick is too good and honest a reporter to suppress the many flaws he found in Feynman's character, yet hesitates to let them seriously interfere with the paean to intellectual splendour he set out to write.
Feynman's best and noblest side came out at the very end of his life, when he had survived one kind of cancer and was afflicted with another. In 1986, as a last-minute choice, he was made a member of the committee inquiring into the cause of the crash of the Challenger space shuttle. The chairman, former Secretary of State William Rogers, seemed to place foremost priority on restoring public confidence in Nasa. Feynman considered getting at the truth more important. Leaving the other members of the committee behind in Washington, he dashed round the country, interviewing engineers and from their data piecing together a sombre conclusion: the colder the surrounding temperature at launch, the more likely it became that hot gases could burst past the rubber seals between the sections of the space shuttle's auxiliary rockets.
The truth, Feynman realised, had to be established on camera. During a break in the proceedings one day, Chairman Rogers was overheard to remark while standing beside Neil Armstrong in the men's room: ‘Feynman is becoming a real pain in the ass.’ On their return, they found he had prepared a simple demonstration. Dunking a length of the rubber seal in a glass of ice water, Feynman showed that it lost its resiliency. Since the seal had to bounce back to do its job, and since the temperature on the day of launch was that of ice water, Feynman had visibly put his finger on the reason for the catastrophe. As his friend Freeman Dyson later remarked, ‘the public saw with their own eyes how science is done, how a great scientist thinks with his hands, how nature gives a clear answer when a scientist asks her a clear question.’ In a dissenting report, which witheringly contrasted the knowledge of Nasa's engineers with the actions of its administrators, Feynman wrote simply: ‘For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.’ Then, desperately ill, he returned to California to die.
Despite the handicap of never having met his subject, Gleick has sharply drawn Feynman's character in all its strengths and weaknesses. Genius is beautifully written, and particularly effective in describing the emotional high points of Feynman's life—his relationship with Arline, with his parents, and with his mentors John Archibald Wheeler and Hans Bethe. But if Feynman was a genius, Genius fails to prove the case.
Tom Regan (review date 26 August 1999)
SOURCE: “Walk—Do Not Run—to Read this Book,” in Christian Science Monitor, August 26, 1999, p. 19.
[In the following review, Regan offers a positive assessment of Faster.]
If you want to understand how time has accelerated, you need only wait for a green light at any street corner in Boston. Because no one else will. Instead, other pedestrians will scan for the smallest sliver of a break in traffic and dodge across the street so they can get to work, lunch, a haircut, etc. about 30 seconds faster than if they had waited for the light to change.
Meanwhile, as you wait for the walk signal, you can't escape the nagging feeling that you're doing something wrong—that by obeying the law, you're an oddball, that you're not making the best use of your time, even if it means endangering your life in a city where drivers slow for a pedestrian about as often as the Red Sox win the World Series.
In a way, reading James Gleick's Faster: The Acceleration of Just About Everything is like waiting for “Walk” while everyone else rushes across the street. Perhaps you do feel like a bit of an oddball, and maybe you could do something more “useful.” Yet, there is something vaguely comforting in the sensation of stepping outside the stream of time that everyone else swims in. And pausing.
Because Faster is a book that demands your attention. As you follow his lead into the labyrinth of “time” and his musings on why life is so much faster these days, Gleick forces you to take a step back and slow down. In a masterly (and somewhat mischievous) analysis, he examines the successive technologies that have pushed us into the fast lane—the watch, the typewriter, the phone, the TV, and of course, the computer. You'll need time to digest and enjoy the wonderful ironies he uncovers about the ways these “time-saving” devices have influenced our world, or indeed, about how they have created the opposite effect of what was intended.
But, alas, it's hard to ignore the irony of the book itself. After all, as well-written and enjoyable as Faster is, not many people will read it because, well, they just don't have time for this kind of book. (It's fun to imagine fans of the “One-Minute Manager” books mistakenly thinking Faster is the latest guide to improving efficiency.)
Unfortunately, the book's overall effect is not helped by an undercurrent of Luddism. While Gleick makes sure to give each technology its due, you sense he finds these examples of “progress” more disturbing than empowering, but in a way that is more nostalgic than insightful.
Then again, Gleick probably knows this nostalgia is misplaced. As he points out, there never has been a “Golden Age” for time. Acceleration seems to be the choice of humanity. After all, has there ever been a generation that by choice moved at a slower pace than the generation before it?
Perhaps, if anything, Faster will help us think about the way we “construct” time—for it is surely a construction. And help us recognize that “neither technology nor efficiency can acquire more time for you, because time is not a thing you have lost. It's not a thing you ever had. It's what you live in. You can drift or you can swim, and it will carry you along either way.”
So why not drift a little, and take the time to read the delightful Faster? You might actually find yourself waiting for more green lights.
James Gleick with Douglas Starr (interview date 30 August 1999)
SOURCE: “James Gleick: Speeding Toward the Millennium,” in Publishers Weekly, August 30, 1999, p. 44.
[In the following interview, Starr provides an overview of Gleick's career and discusses Gleick's comments on his life and work upon the publication of Faster.]
“I really don't think of myself as a science writer,” says James Gleick, one of the nation's preeminent practitioners in the field. Gleick, a former New York Times science reporter, columnist for the Times Sunday Magazine and author of two classic science books, is sitting on the deck of his house overlooking the Hudson River. The setting is pastoral, but Gleick seems more tentative than relaxed. He worries that readers will consign him to a single category—science—while he sees his own work as much broader than that. “Granted, I'm more interested in technology than most people, and less interested in politics than most. But I don't like to think about categories. I really see myself as a general nonfiction writer.”
Maybe so. But few people have made as much of a mark in general interest science books in such a short time. Gleick was a relative newcomer to the field in 1987, when Viking published Chaos, a narrative about the scientists developing new theories to explain disorder in the universe, which became an instant hit. His next book, Genius, a biography of the eclectic physicist Richard Feynman, also met with strong sales and laudatory reviews.
Now, Gleick has written a genuinely genre-defying book—Faster: The Acceleration of Just About Everything, out next month from Pantheon. Bringing together science, technology, social commentary, psychology, philosophy and even self-help, if you read it the right way, Faster presents a vision of a society hurtling in no particular direction, whose members “multi-task” to create leisure time they never get to enjoy.
Faster shares certain elements with his earlier works—compelling ideas presented at an exhilarating pace through examples and analogies, all supported by copious research. These are the qualities Gleick most prizes in nonfiction reporting, and he attributes them to his role models, who include J. Anthony Lukas, Tom Wolfe and Gay Talese. “They all seemed to have the same values,” he says. “Telling the truth, going to the place and hanging out.” That sense of hanging out, or being there, infuses Gleick's reporting. In Faster, for example, he takes readers to such diverse locales as an air traffic controllers center in Dallas, an action-movie set north of San Francisco and a telephone directory-assistance office in Manhattan. “I'd always considered that to be one of the world's nightmare jobs—nonstop. But it soon became clear to me that the people who work there like what they do. They compete with one another, but at the same time are able to let their minds wander.” It's an insight he never could have gotten had he not spent time on location with people.
Gleick was jumping categories from the very beginning. After excelling in mathematics in high school in New York, he majored in English and linguistics at Harvard, where he “spent a lot of energy trying to wriggle out of the few science requirements they had.” He moved to Minneapolis, where he founded a short-lived alternative weekly; a year later he returned to New York, where he secured a copyediting job at the New York Times. He broke into reporting by writing a magazine profile of a famously multidisciplinary thinker, the linguist-mathematician Douglas Hofstadter. Later Gleick profiled two mathematicians who were stretching the boundaries of their own particular disciplines—Mitchell Feigenbaum, who studied how simple systems can produce wildly unpredictable results, and Benoit Mandelbrot, who in discovering fractals showed how a pattern can repeat itself endlessly in nature, whether in the branches of a tree or the shape of a coastline.
Gleick thought these and other like-minded scientists were creating a new understanding of the natural world, one that could explain the behavior of previously disparate, complex phenomena, such as weather, populations or even cotton prices. Publishers were entranced by Gleick's synthesizing vision. Several approached him to write a book on the subject, including Dan Frank of Viking. “We had known each other at age 11 in summer camp,” Gleick recalls. The two conferred over lunch and later made a deal. (Gleick has been with Frank, now at Pantheon, ever since.) Gleick, by then a science reporter for the Times, took four months off to do the reporting for his book and wrote mornings, evenings and weekends to deliver the manuscript.
Chaos: Making a New Science appeared in a midsize printing of 20,000 copies. No one anticipated the stir it would cause. A National Book Award nomination came almost immediately, and copies flew off the shelves faster than the publisher could supply them. Eventually, half a million copies sold in hardcover and paperback. His cross-disciplinary approach struck a chord. Letters came in from readers as diverse as legal scholars and literary researchers telling him that the phenomenon he described in the physical and biological worlds applied equally to theirs.
The success of Chaos helped Gleick earn a sizable advance to write about Feynman, another hero of multidisciplinary thinking. (Gleick left the Times to pursue his reporting, although he continued to write columns and articles.) Five years in the making, Genius is a rich evocation of the Nobel Prize-winning physicist, whose accomplishments ranged from helping to build the A-bomb to uncovering the cause of the space shuttle disaster.
By 1992, when Genius was published, Gleick had become fascinated by the Internet—“right from its earliest, geeky beginnings,” he says. “I'd been running around interviewing scientists and they all had e-mail, and I wanted e-mail.” But when he set himself up with it, Gleick found the network less than satisfactory, especially when pursuing one of his passions, contract bridge, on the computer. Hampered by the need to type arcane Unix commands, he contacted a programmer he knew and suggested they design a user-friendly interface. After months of intensive work, he and his partner came out with Pipeline, a precursor to the Net browsers of today. In 1995 they sold the venture for a reported $10 million. Gleick, while declining to specify his earnings, admits he made “some millions” of dollars.
That windfall, plus the revenues from his books, enabled Gleick and his wife—Wall Street Journal reporter Cynthia Crossen—to build a retreat about an hour's drive north of Manhattan. The house is a celebration of flagstone and cedar, each level offering stunning views of the river below and the hawks soaring above. Gleick takes us upstairs to his office, a turret-like room with views so panoramic that he has to draw the shades in order to work. The only hint in the building of anything less than ideal is that in order to ascend to the room he must take an elevator.
A SHATTERING EVENT
In December of 1997, Gleick was flying his airplane, a canard-winged Long EZ, with Harry, his eight-year-old adopted son. Gleick, a pilot with many years’ experience, was bringing his plane in for a landing when it crashed short of the runway, killing the boy. Gleick, who was pulled from the wreckage in critical condition, lost one leg and almost lost the other. He declines to talk about the accident other than to say that it “shattered my life. If you have children you can only imagine … no, you can't even imagine,” he says, and trails off.
Gleick spent months in the hospital and many more in painful rehabilitation as he learned to use his artificial leg. He had been writing a column for the Times magazine, called “Fast Forward,” which offered thoughts about our social and technological future. During his hospital stay, at the urging of his editor, he began writing again.
The work proved therapeutic—it had always been a place where he could give rein to his startling, quirky, challenging perceptions. In one column, for example, he muses about the evolution of technology and society by watching a rerun of the TV show Lost in Space. One scene, supposed to take place in the then-distant future, shows photographers taking pictures of a rocket launch, popping flashbulbs. “Yes, flashbulbs. Remember them?”
“And what,” the column continues, “are those shiny round disks resting on the desks of this advanced, high-tech, space-mission control room? Ashtrays … no one guessed, a generation ago, that a 1997 control room would be a no-smoking zone.” This is the future that Gleick wants us to consider—“not shiny and gleaming … [but] all mixed up like a junkyard, the old and the new jumbled together.”
Many of the ideas Gleick first sowed in his column eventually came to fruition in Faster. Gleick opens the book with a description of the Directorate of Time, the military installation where atomic clocks disassemble each second into billions of segments, in order to produce an accurate standard. He then discusses how more stuff has been crammed into less time in our current society—shades of the junkyard of his earlier column. Gleick parades before us remote controls, computers, 500-channel TVs, “door close” buttons on elevators whose real purpose is to keep riders from becoming impatient, and athletic competitions decided by a hundredth of a second. He looks at the tightening nets of speed and efficiency coiled around complicated systems, such as airline schedules, in which a mishap in one location can cause a cascade of errors in many others (remember Chaos?). He portrays a sense of individual hurry that we experience but may not notice in our daily lives—as in the grad student who flips on his computer each morning, brushes his teeth during the boot sequence, and fires up his Net browser as he gobbles down breakfast in order to save “two or three minutes a day”—and a feeling of universal rush as we all connect to each other over the Internet.
“We live in a buzz,” he explains in the book, and adds during our interview that that's not necessarily a bad thing. In fact, Gleick worries that some readers may fault him for not decrying the accelerating state of the world. Yes, there is an obvious increase in “Hurry Sickness,” as he calls it—the modern equivalent of what we used to call Type A behavior—but dealing with constant stimuli may be something human beings can adapt to.
“The typical reaction to this is, woe is us,” he says, “but the truth is, we're all multi-tasking. I think we're evolving—not in any biological way, but in the sense that our brains are being challenged by our culture in a way they were not challenged in the past, except, perhaps, in times of war.”
An admitted multi-tasker, Gleick concedes that he gets distracted at times, but says he generally finds that he can productively sit at his computer and alternately write, read his e-mail, do some programming and play games.
He adds that rather than merely decry modern communications, people should learn to use them effectively. He pointedly complains that the publishing industry has been slow to occupy cyberspace. “Publishers should be creating a Web site for every one of their authors,” he says. Gleick maintains his own site (www.around.com), with excerpts from his columns and books, a hot button for reader feedback and a purchasing link to Amazon.com. He believes every author should become part of the online community and participate in copyright and sales. “Authors,” he says, “need to take more control.”
The need for control may indeed be a key issue. People might see themselves as victimized by technology and speed, but they don't have to be if they exercise sufficient discipline and assertiveness, he suggests. It's a strategy that seems to apply not only to the world that Gleick describes in Faster but to the author who rebels at being confined to a genre.
“We shouldn't think of ourselves as victims,” says Gleick. “We make choices in our lives. Sometimes we have to remind ourselves that there are a bunch of goodies in front of us and the one that looks the brightest or tastiest this instant is not the one that's going to leave us satisfied.”
Todd Gitlin (review date 12 September 1999)
SOURCE: “Overload,” in Los Angeles Times Book Review, September 12, 1999, p. 6.
[In the following review of Faster, Gitlin concludes that Gleick's compilation of illustrative examples, although interesting, fails to provide substantive analysis of complex historical and societal issues.]
Reader, you may be reading these words while nibbling on your morning Pop-Tart, or sipping a cup of microwaved coffee, or looking up from the Sunday morning cartoons or tennis or “Meet the Press.” You may have learned to speed-read, thereby saving valuable seconds that you may spend dipping into a game of Doom or, on your portable phone, hitting the redial button to try Ticketmaster again—or, much better, setting your Power-Dialer to try up to 25 times a minute—because, in pursuit of the latest hot ticket, which just went on sale and is already tying up tens of thousands of long-distance lines, you keep getting a busy signal. Or you may be reading online, skimming from highlighted keyword to subhead, stopping at bulleted lists, one mouse click away from a classified ad, or a streaming Pamela Anderson Lee video, or an incoming instant message via AOL, or an offshore Internet bet, or a quick toss on the Singapore stock exchange, or the chance to order the book under review or an inbox of e-mails, each clamoring for reply (you will be thought rude if you take longer than overnight) while ads flash and wiggle across the margins.
We are surely an MTV culture, poised uneasily between mania and boredom, relentlessly churning out new goods and new means to access them, our standard of living widely believed to rest on the success of marketers in foisting their wares before a glutted populace and at least momentarily arresting our attention, converting our time to their money. Consider that, at this writing, 93% of American households possess at least one underestimated little gadget of this civilization: the remote-control device. “Every television programmer,” James Gleick writes in Faster, “works in the shadow of the awareness that the audience is armed.” The producer of “Meet the Press” wants to paralyze your itchy finger, lest you click your remote control device over to the competition. Toward that end, the networks have figured out how to pare fractions of a second out of the dead space between show and commercial. When Gleick goes on his book tour, he will be asked again and again by interviewers reading from bulleted lists on press releases just what is his thesis, what is his point, what is the bottom line, for his drive-time listeners have their fingers poised over the SCAN button, his TV watchers are remote-control-ready and his publisher has a lot at stake in getting you into the store before you can say Faster to pick up his book, because the time will come soon when the chains will have to return unsold copies to open up shelf space for the next cycle's candidates for bestseller status.
In his engaging but breathless new book, Gleick, a former science and technology reporter for the New York Times and author of Chaos: Making a New Science and Genius: The Life and Science of Richard Feynman, has compiled many hundreds of facts, demonstrating conclusively, if you had any doubt, that for some number of Americans and others, life gets faster with every passing moment, gets busier, more cluttered with channels that bring us what we fancy to be information. If you have any doubt of the human toll, read Gleick's sampling of the lives of telephone operators—a less common topic than the driven entrepreneurs, those who travel the fast lane, fearing that if they're not up to speed, they'll end up as road kill. Whatever Ben Franklin thought, time isn't really money—you can swap money for things but not for an extra hour a day—though with time you can make money, and if, like many a company in charge of an 800 number, you can get your customers to touch-tone from one voice menu to another before intercepting an actual human operator in real time, you are getting them to work for you.
The daily speed cycle apes the investment and innovation cycles. Not so long ago, Gleick tells us, the automobile development cycle was five years, but Toyota's development cycle is down to 18 months, aiming for 14. In his forthcoming book “High Stakes, No Prisoners: A Winner's Tale of Greed and Glory in the Internet Wars,” Internet entrepreneur Charles H. Ferguson writes that “the ultrafast development cycle that characterizes the entire Internet industry is in large part a consequence of using the Web to distribute and receive information and technology about the Web. …” In other words, speed causes speed—a point worth exploring. Microchips double their power every year and a half; modems and fiber optics do their equivalent things, and sufficient numbers of us are persuaded to upgrade our laptops, our modems and phone lines to keep up with the frenzy of progress.
Keeping pace, Gleick's sentences zip along. Trouble is, he hyperventilates. “For example” is not an argument, and neither are hundreds of examples dropped end to end in chapters—each of which is about the length of this review—not long enough to build up a case or consider contrary evidence or ask (let alone answer) difficult questions. Gleick sprinkles fascinating tidbits around, but presto! he's promptly off to another subject. Here is one tidbit that drops in just before the end: “Depressants like alcohol slow time, because the brain receives fewer inputs per second.” In the next sentence, Gleick is off to the sense of time accelerating as one grows old. But wouldn't it be worth pausing to consider the difference between alcohol and marijuana, on the one hand, and methamphetamines and cocaine on the other? What might the bifurcation of drugs tell us about the fact that some people resist MTV culture? (There is, Gleick reports, an international Slow Food Movement, though by 1998 it had gone online with a “Virtual World Guide to Slow Places.”) He doesn't much consider the interesting relation of the fast and the slow. Speed, after all, is relative; so is slowness.
About the existence of a mania for speed there can be no dispute, but though Gleick's examples are sometimes surprising and fun, they tend to pile up rather than accumulate into an argument. The unasked questions pile up as well: To what degree is the speed mania driven by the cycle of innovation, investment and production, to what extent by the itchy fingers of consumers? Or is the egg running just as hard as the chicken? In his breakneck rundown, Gleick doesn't pause often to ask difficult questions. For one thing, although he has some interesting etymological research on the origin of the word “speed,” he doesn't consider other languages. Nor is he very interested in how Americans’ fascination with speed resembles, or doesn't, that of other people. His account is historically thin. Just how new is the breakneck pursuit of speed, and for whom? “Our nature consists in motion,” wrote Blaise Pascal in 1660; “complete rest is death. … Nothing is so insufferable to man as to be completely at rest, without passions, without business, without diversion, without study. … Men so much love noise and stir.” Here is Thomas Carlyle in the London of 1831: “How men are hurried here; how they are hunted and terrifically chased into double-quick speed; so that in self-defence they must not stay to look at one another!” Gleick could make the case that communication devices like the Internet and cell phones are diffusing throughout the population faster than did phones, radios, cars, rail and air travel, but he doesn't. He is—yes!—in too much of a hurry.
Finally, what is to be done? Gleick ends as a self-help counselor, advising that you “serve as your own director of your own time directorate.” There's nothing wrong with personal slowdowns—better than co-dependency with the speed freaks in charge of the frenzied economy—but his denouement is, well, a quickie and somehow complacent. What, after all, are the dangers? Should we be reconciled with the pace of air traffic control? Do more financial bubbles take place nowadays, and do they burst faster? How to get out of the speed trap, if it is a trap? Why the pharmacological boom in attention deficit disorder? Centuries of economic, military, technical and psychic change brought us to the current mania, the boredom panic. When considering what to do about a culture on steroids, it's best not to hurry.
Benjamin Kline Hunnicutt (review date 19 September 1999)
SOURCE: “A Fast-Paced Look at the Whirl and Flux of Modern Life,” in Chicago Tribune Books, September 19, 1999, p. 1.
[In the following review of Faster, Hunnicutt takes issue with Gleick's fast-paced analysis of social change and his acceptance of the acceleration of contemporary life.]
So much to do! So little time!
Fragments of old hippie hymns ring in the ear: “Slow down, you move too fast, / You got to make the morning last”; “No time for a gentle rain. … No time left for you.”
Now there is precious little time left for anyone. All is speed and rush, whirl and flux. Our lives race ahead, the pace ever more breathless. And the race gets longer. We become long-distance runners, plunging headlong at a sprinter's pace.
What happened? Where are we going, and why so fast?
In Faster: The Acceleration of Just About Everything, James Gleick gives us a delightful yet troubling book that describes the seemingly irresistible quickening of modern life. A virtuoso of the popular-science genre, Gleick has been a frequent contributor to the New York Times and has written two National Book Award finalists—Genius: The Life and Science of Richard Feynman and Chaos: Making a New Science.
Gleick's book imitates what he describes: It is supercharged, fast-paced, scattered, restless. Topics are covered in short chapters, some no more than four pages long; examples cascade. The next chapter begins, the focus shifts; another flood of examples showers the reader. There is little time for reflection, analysis, or development. The book hastens forward.
Type A personality? He is in such a hurry he is killing himself. “Can our bodies take the strain?” Elevators! Too slow. Anger boils if the wait is longer than 15 seconds. Time deforms. Heaven help us if we wait 2 minutes; it seems like 10. The “close door” button disintegrates under assault from frenzied jabbing. As time distorts, “time-madness” and mania follow.
TV? Too slow. Offered a choice of hundreds of channels, we (men at least) channel surf. We don't care what's on TV. We want to know what else is on TV. With the VCR we fast forward movies, skipping credits and trailers, condensing the action. The networks feed the mania, providing instant surveys and real-time programming, reinforcing a “sound-bite mentality” and helping create a mercurial public opinion that changes hour by hour. Producers shave seconds from transitional black screens, producing a rapid-fire effect. “MTV Zooms By.”
Nature? We try to hurry compost, for goodness’ sake. We sleep less, 20 percent less over the past century. Our biological clocks are reset to dash. Time is compressed. We “multi-task,” squeezing several activities into the same crowded moment.
Computers? The embodiment of faster. We are “On Internet Time.” Information whizzes by, bullet-like. Attention spans contract further.
Our choice of drugs? Speed of course, or at least its respectable cousin, caffeine—one of “The New Accelerators.” We are overworked—academics quibble, but we know it. More is required of us at home, commuting and certainly at work. We are never through. The reward of success? More work! “Every office an Augean stable.”
Assessing the quickening of life, Gleick maintains the book's tempo. He introduces, hastily, critical analyses of the building stress, overwork, the intrusion of the machines into our lives. Compressing, excruciatingly, the work of scholars such as Juliet Schor and Sebastian de Grazia, he acknowledges that faster has a downside.
The price we pay is alienation—arguably the dominant complaint of the critics of technology for over a century. Capitalism and the machine separate humans from themselves, their very essence; exiling moderns to a stunted and restless artificiality and transforming all into what Herbert Marcuse called “One Dimensional Man.”
Having to go so fast to keep up, we miss stuff—our existence is truncated. Some things simply cannot be done going full speed: love, sex, conversation, food, family, friends, nature. In the whirl, we are less capable of appreciation, enjoyment, sustained concentration, sorrow, memory.
Most music is no longer accessible. Our attention deficits are such that we cannot listen to the likes of J.S. Bach—we even have a hard time watching old movies. In the rush we forget joy. Gleick quotes Randall Jarrell:
you needn't mind.
The soul has no assignments, neither cooks
Nor referees: it wastes its time.
It wastes its time.
The universe of things done for their own sakes—whose primary purpose is to waste its time, not to get it over as soon as possible—has fallen victim to speed. What is all the speed for? Gleick's answer: Speed is for more speed. Now, speed is the thing-for-itself.
In short, Gleick acquiesces. Alienation is inevitable. There is no going back—get over it. There are consolations: the exhilaration, the sense of being connected, in touch immediately with everything. Indeed, according to Gleick we are not so much alienated by the pace of life as advanced in evolution. “We are different creatures, psychologically speaking, from what we were a generation ago.” We expect, no, need, the stimulation, the rush of acceleration. The alternative is boredom.
Those who are not true believers in technology may, like this reviewer, squirm at Gleick's tepid defense of speed, wishing for a more robust critique. We are losing chunks of our humanity, for crying out loud. Shall we not rail in protest?
I am most uncomfortable with Gleick's speed-the-antidote-to-boredom prescription. Perhaps boredom is symptomatic of our loss and alienation, of something deeper, and “faster” makes things worse.
I am reminded of Thomas Pynchon's splendid essay on sloth, “Nearer, My Couch, to Thee.” For centuries, sloth (one of the seven deadly sins) had two forms: idleness/boredom, and the opposite, the desperate attempt to escape in manic activity (going “faster”). Thomas Aquinas called this “rushing after various things without rhyme or reason.” According to Soren Kierkegaard, the root of sloth is “a despairing refusal to give consent to one's own being.”
Pynchon concluded that “unless the state of our souls becomes once more a subject of serious concern” we will forget, and perhaps embrace sloth. I would add that we will continue to wallow in “faster,” mindless of our lost humanity, ignorant of the past and dumb before the future, always in the cascading present. Like Benjy Compson in the Faulkner novel, doomed to the present, we will continue to tell each other, in ever more excited and manic fashion, the same tale of whirl and flux, full of sound and fury, signifying nothing.