Cybernetics by Norbert Wiener, 2nd edition

From back when people thought public intellectuals should actually understand something...

On the other hand, under certain conditions of delay, etc., a feedback that is too brusque will make the rudder overshoot, and will be followed by a feedback in the other direction, which makes the rudder overshoot still more, until the steering mechanism goes into a wild oscillation or hunting, and breaks down completely.
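This is easy to see numerically. Below is a minimal sketch (mine, not Wiener's): a discrete-time steering loop where the correction applied now is proportional to the error observed a few steps ago. The gain and delay values are illustrative choices; with a gentle gain the error dies out, while a brusque gain with the same delay produces the growing oscillation, or hunting, that the passage describes.

```python
def simulate(gain, delay, steps=400):
    """Heading error under delayed proportional feedback.

    Dynamics: e[t+1] = e[t] - gain * e[t - delay]
    (the helmsman corrects based on stale information).
    """
    errors = [0.0] * delay + [1.0]  # start one unit off course
    for _ in range(steps):
        correction = gain * errors[-1 - delay]  # reacts to old error
        errors.append(errors[-1] - correction)
    return errors

gentle = simulate(gain=0.2, delay=3)   # settles back on course
brusque = simulate(gain=1.5, delay=3)  # overshoots, hunts, diverges
```

The same delay is present in both runs; only the abruptness of the correction differs, which is exactly Wiener's point.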

This time we decided to take a nervous problem directly from the topic of feedback and to see what we could do with it experimentally... We worked chiefly with cats, first decerebrated under ether anesthesia and later made spinal by a thoracic transection of the cord...

These we tried to analyze as we should analyze a mechanical or electrical system exhibiting the same pattern of hunting.

Bergson emphasized the difference between the reversible time of physics, in which nothing new happens, and the irreversible time of evolution and biology, in which there is always something new.

The key idea of Gibbs is this: in Newton's dynamics, in its original form, we are concerned with an individual system, with given initial velocities and momenta, undergoing changes according to a certain system of forces under the Newtonian laws which link force and acceleration. In the vast majority of practical cases, however, we are far from knowing all the initial velocities and momenta. If we assume a certain initial distribution of the incompletely known positions and momenta of the system, this will determine in a completely Newtonian way the distribution of the momenta and positions for any future time.
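Gibbs's picture can be sketched in a few lines. The system below is my own choice for illustration, a unit-mass, unit-frequency harmonic oscillator: each member of the ensemble evolves by the same exact Newtonian solution, and only our knowledge of *which* member is the real system is statistical. The distribution at a later time is determined completely by the initial distribution.

```python
import math
import random

# An ensemble of incompletely known initial conditions (x, p),
# here drawn from a standard Gaussian in each coordinate.
random.seed(0)
ensemble = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(10_000)]

def evolve(state, t):
    """Exact Newtonian solution of x'' = -x: a rotation in phase space."""
    x, p = state
    return (x * math.cos(t) + p * math.sin(t),
            -x * math.sin(t) + p * math.cos(t))

# Each point moves deterministically; the cloud as a whole is what evolves.
later = [evolve(s, t=1.7) for s in ensemble]
```

For this particular initial distribution the phase-space rotation leaves the statistics unchanged, a small preview of the equilibrium ensembles Gibbs was after.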

(entropy) is primarily a property of regions in phase space and expresses the logarithm of their probability measure. For example, let us consider the dynamics of n particles in a bottle, divided into two parts, A and B. If m particles are in A, and n-m in B, we have characterized a region in phase space, and it will have a certain probability measure. The logarithm is the entropy of the distribution: m particles in A, n-m in B. The system will spend most of its time in a state near that of greatest entropy, in the sense that for most of the time, nearly m₁ particles will be in A, nearly n-m₁ particles in B, where the probability of the combination m₁ in A, n-m₁ in B is a maximum. For systems with a large number of particles and states within the limits of practical discrimination, this means that if we take a state of other than maximum entropy and observe what happens to it, the entropy almost always increases.
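The bottle example can be checked directly for a modest n. A sketch under the usual assumption that each particle is equally likely to be in either half: the coarse state "m in A" has probability C(n, m)/2ⁿ, its entropy is the log of that measure, and the maximum-entropy state is the even split.

```python
from math import comb, log

n = 100  # particles in the bottle

def entropy(m):
    """Log of the probability measure of the region 'm particles in A'."""
    return log(comb(n, m) / 2 ** n)

# The state of greatest entropy is the most probable split.
most_likely = max(range(n + 1), key=entropy)
# most_likely == n // 2: the even division of particles between A and B
```

A lopsided state such as 30-in-A has strictly lower entropy than the even split, which is why a system starting there almost always drifts toward larger entropy.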

The mongoose begins with a feint, which provokes the snake to strike. The mongoose dodges and makes another such feint, so that we have a rhythmical pattern of activity on the part of the two animals. However, this dance is not static but develops progressively. As it goes on, the feints of the mongoose come earlier and earlier in phase with respect to the darts of the cobra, until finally the mongoose attacks when the cobra is extended and not in a position to move rapidly. This time the mongoose's attack is not a feint but a deadly accurate bite through the cobra's brain.