The Mathematical Theory of Communication

Highlights

  • The word communication will be used here in a very broad sense to include all of the procedures by which one mind may affect another. (Location 77)
  • LEVEL A. How accurately can the symbols of communication be transmitted? (The technical problem.) LEVEL B. How precisely do the transmitted symbols convey the desired meaning? (The semantic problem.) LEVEL C. How effectively does the received meaning affect conduct in the desired way? (The effectiveness problem.) (Location 85)
  • One essential complication is illustrated by the remark that if Mr. X is suspected not to understand what Mr. Y says, then it is theoretically not possible, by having Mr. Y do nothing but talk further with Mr. X, completely to clarify this situation in any finite time. If Mr. Y says “Do you now understand me?” and Mr. X says “Certainly, I do,” this is not necessarily a certification that understanding has been achieved. (Location 96)
    • Tags: orange
    • Note: surely the problem here is also that the receiver may well think they have understood the concept but have not actually (for whatever reason) fully comprehended it.
  • This unit of information is called a “bit,” this word, first suggested by John W. Tukey, being a condensation of “binary digit.” (Location 170)
  • Zero and one may be taken symbolically to represent any two choices, as noted above; so that “binary digit” or “bit” is natural to associate with the two-choice situation which has unit information. (Location 173)
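The two highlights above can be illustrated numerically: a choice among N equally likely alternatives carries log2(N) bits, so the two-choice situation carries exactly one bit. A minimal sketch in Python (the function name is my own, not from the book):

```python
import math

def bits_of_choice(num_alternatives: int) -> float:
    """Information, in bits, of one choice among equally likely alternatives."""
    return math.log2(num_alternatives)

print(bits_of_choice(2))  # the two-choice situation: 1.0 bit
print(bits_of_choice(8))  # a choice among eight: 3.0 bits (three binary digits)
```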
  • A system which produces a sequence of symbols (which may, of course, be letters or musical notes, say, rather than words) according to certain probabilities is called a stochastic process, and the special case of a stochastic process in which the probabilities depend on the previous events, is called a Markoff process or a Markoff chain. (Location 203)
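A Markoff chain of the kind described above can be sketched as a symbol generator whose next-symbol probabilities depend on the previous symbol. The two-symbol alphabet and the transition probabilities below are invented for illustration, not taken from the book:

```python
import random

# Transition probabilities of a two-symbol Markoff chain (invented values):
# from each current symbol, the probability of each next symbol.
TRANSITIONS = {
    "A": {"A": 0.6, "B": 0.4},
    "B": {"A": 0.1, "B": 0.9},
}

def generate(start: str, length: int, seed: int = 0) -> str:
    """Generate a symbol sequence where each step depends on the previous symbol."""
    rng = random.Random(seed)
    symbol, out = start, [start]
    for _ in range(length - 1):
        nxt = rng.choices(list(TRANSITIONS[symbol]),
                          weights=list(TRANSITIONS[symbol].values()))[0]
        out.append(nxt)
        symbol = nxt
    return "".join(out)

print(generate("A", 20))
```

With these weights, runs of "B" tend to be long (a "B" is followed by another "B" 90% of the time), which is the sense in which the probabilities depend on previous events.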
  • ergodic processes. (Location 206)
  • Having calculated the entropy (or the information, or the freedom of choice) (Location 233)
  • One minus the relative entropy is called the redundancy. This is the fraction of the structure of the message which is determined not by the free choice of the sender, but rather by the accepted statistical rules governing the use of the symbols in question. (Location 237)
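The definition in the highlight above can be made concrete: for symbol probabilities p_i, the entropy is H = -Σ p_i log2(p_i), the relative entropy is H divided by its maximum log2(N) for an N-symbol alphabet, and the redundancy is one minus that ratio. A sketch under those definitions (the four-symbol probabilities are invented for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs, alphabet_size):
    """One minus the relative entropy (H divided by its maximum, log2 N)."""
    return 1 - entropy(probs) / math.log2(alphabet_size)

# A skewed four-symbol source (probabilities invented for illustration):
probs = [0.7, 0.1, 0.1, 0.1]
print(round(entropy(probs), 3))        # less than the 2-bit maximum
print(round(redundancy(probs, 4), 3))  # the statistically determined fraction
```

A uniform source has zero redundancy (every symbol is a free choice); the more skewed the probabilities, the closer the redundancy gets to one.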
  • It is most interesting to note that the redundancy of English is just about 50 per cent, so that about half of the letters or words we choose in writing or speaking are under our free choice, and about half (although we are not ordinarily aware of it) are really controlled by the statistical structure of the language. Apart from more serious implications, which again we will postpone to our final discussion, it is interesting to note that a language must have at least 50 per cent of real freedom (or relative entropy) in the choice of letters if one is to be able to construct satisfactory crossword puzzles. (Location 241)
  • If it has complete freedom, then every array of letters is a crossword puzzle. (Location 245)
  • This sort of consideration leads at once to the necessity of characterizing the statistical nature of the whole ensemble of messages which a given kind of source can and will produce. And information, as used in communication theory, does just this. (Location 254)
  • “When Pfungst (1911) demonstrated that the horses of Elberfeld, who were showing marvelous linguistic and mathematical ability, were merely reacting to movements of the trainer’s head, Mr. Krall (1911), their owner, met the criticism in the most direct manner. He asked the horses whether they could see such small movements and in answer they spelled out an emphatic ‘No.’ Unfortunately we cannot all be so sure that our questions are understood or obtain such clear answers.” (Location 514)