Grammatical Man

Regardless of what the jacket blurb says, Grammatical Man is not the first book to tell the story of information theory from its beginnings during World War II. It is, however, the first book to try to tell the story in layman’s language without requiring any mathematical knowledge of the reader: no small feat, considering that information science had its basis in a complex set of theorems first published by Claude E. Shannon in a 1948 Bell System Technical Journal paper entitled “A Mathematical Theory of Communication.” Probably the first history of information theory was Symbols, Signals and Noise: The Nature and Process of Communication (1961) by J. R. Pierce, an electrical engineer who knew Shannon and who began his own book by admitting that although his account was less mathematical than Shannon’s, it could not be nonmathematical. Because it leaves out the mathematics, Jeremy Campbell’s book gains in readability and currency over Pierce’s 1961 history; its concomitant disadvantage is that it was written by a journalist rather than a specialist. What Campbell, a Washington correspondent for the London Standard, offers here is a popularized history of the new science of information, one that also attempts to show how information theory has synthesized research and theory across most of the hard and soft sciences.

The first part of Grammatical Man focuses on the background of information theory in the nineteenth-century study of probability and the principle of entropy. Campbell charts the progress of scientists and mathematicians as they attempted to come to terms with the crucial tension between the predictable and the unpredictable in any given system. System, or structure, as researchers in the so-called “soft sciences” of psychology and sociology would call it, is the crucial word in information theory, for information is defined by its relationship to other possibilities within a system rather than atomistically, in isolation. Once it was established that the relationship between the predictable and the unpredictable could be determined by the laws of mathematical probability, the next step for twentieth-century scientists was to discover the rules that govern a system. The history of this search in various areas of study dominates the remainder of Grammatical Man.

The basic premise on which Campbell proceeds, and which he continually repeats, is that the metaphor for all life processes is the sentence of a language, itself a model of the complex mix of predictability and unpredictability that makes information possible. Metaphor is another key concept in Campbell’s discussion, for he often suggests that certain processes are “like” a language, or that “in a metaphoric sense” certain processes must be understood as one understands language. For example, he notes that, in a metaphoric sense, a thermodynamic system in a state of low entropy is like a message, whereas one in a state of high entropy is like noise. Moreover, in order to avoid mathematical notation, Campbell must also make use of his own metaphors and analogies, such as illustrating the principle of entropy by discussing the arrangement of molecules in a glass of ice water or by postulating the arrangement of books in a hypothetical library.
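For readers who want the notation Campbell deliberately avoids, his low-entropy/high-entropy contrast can be stated exactly with Shannon’s entropy formula, H = −Σ p·log₂(p). The sketch below is only an illustration of that formula; the helper name shannon_entropy and the probability values are invented for the example and do not come from the book.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# An ordered (low-entropy) source: one symbol dominates, so each
# new symbol is highly predictable -- in Campbell's terms, "like a message".
low = shannon_entropy([0.97, 0.01, 0.01, 0.01])

# A disordered (high-entropy) source: all symbols equally likely, so each
# new symbol is maximally unpredictable -- "like noise".
high = shannon_entropy([0.25, 0.25, 0.25, 0.25])

print(f"low-entropy source:  {low:.2f} bits/symbol")   # ~0.24
print(f"high-entropy source: {high:.2f} bits/symbol")  # 2.00
```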

Grammatical Man is filled with references to essays, books, and monographs by the primary researchers in the field, but the documentation is unobtrusively tucked away at the end of the book with page references, so that the text is not cluttered with notes. Campbell summarizes with admirable clarity the salient points of the work of Claude Shannon, the originator of information theory; Norbert Wiener, the founder of cybernetics; Noam Chomsky, the great innovative linguist; and many others. Moreover, he charts the development of the basic principles of these thinkers, whose discoveries have become household words. For example, few people are now unaware that the “bit,” which Shannon postulated as the basic measure of information, is short for “binary digit,” the unit that, arranged in remarkable series of yes/no patterns, underlies the high-speed word processing and number-crunching of countless 8-bit home computers across the country.
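The link between bits and yes/no patterns can be made concrete. As an aside not drawn from Campbell’s text: n binary digits distinguish 2ⁿ equally likely possibilities, so identifying one item among N carries log₂(N) bits. A minimal sketch (the function name bits_needed is made up for illustration):

```python
import math

# One bit resolves one yes/no question, so n bits distinguish 2**n
# equally likely possibilities; conversely, picking one item out of
# N possibilities carries log2(N) bits of information.
def bits_needed(possibilities):
    return math.log2(possibilities)

print(bits_needed(2))    # 1.0  -- a single yes/no answer
print(bits_needed(256))  # 8.0  -- one byte, e.g. a character code
print(bits_needed(52))   # ~5.7 -- naming one card from a full deck
```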

Shannon’s work began with the basic problem of finding ways to separate message from noise (or order from disorder) in a communication system. Noise always adds itself to messages, randomizing and distorting them and thereby making them less reliable. Thus, an extra ration of predictability must be built into messages to counteract the randomizing; this extra ration is called redundancy, one of the most important concepts in information theory. Rules...
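How redundancy buys back reliability can be seen in the simplest error-correcting scheme, a repetition code. The sketch below is an illustration of the general idea, not Shannon’s own construction; the channel model, the 10% error rate, and the helper names transmit and send_with_redundancy are all assumptions made for the example.

```python
import random

def transmit(bit, error_rate=0.1):
    """A noisy channel: flip the bit with the given probability."""
    return bit ^ 1 if random.random() < error_rate else bit

def send_with_redundancy(bit, copies=3, error_rate=0.1):
    """Repetition code: send several copies, decode by majority vote."""
    received = [transmit(bit, error_rate) for _ in range(copies)]
    return 1 if sum(received) > copies / 2 else 0

# With a 10% error rate, a single transmitted bit is wrong 10% of the
# time, but a majority vote over three copies fails only when at least
# two copies flip: 3*(0.1**2)*0.9 + 0.1**3 = about 2.8%.
random.seed(0)
trials = 10_000
errors = sum(send_with_redundancy(1) != 1 for _ in range(trials))
print(f"error rate with threefold redundancy: {errors / trials:.3%}")
```

The price of the lower error rate is that every bit is sent three times; redundancy trades channel capacity for predictability, which is exactly the bargain the paragraph above describes.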

