481.
Entropy and knowledge
Entropy is the number of alternative compositions of components that a system with given properties can have. Think of a mechanism with different components, like a motor with its parts. The mathematical formula for the entropy E of a system of n elements, each with probability pi, is E = -Σ pi log2 pi. For a die there are 6 possible outcomes, and its entropy is log2 6, which is also the amount of information one has when one of the compositions materializes. For a system of 2 units of equal probability ½, E = 1, called a bit. For a system of four elements of equal probability, E = 2, or two bits. For a system with 8 elements of equal probability, E = 3, or three bits.
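The formula and the examples above can be checked with a small sketch; the function name `entropy` is mine, not from the post:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: E = -sum(p_i * log2(p_i))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair die: six outcomes of probability 1/6 each, entropy log2(6).
print(entropy([1/6] * 6))   # ≈ 2.585 bits

# Two, four and eight equally likely outcomes give 1, 2 and 3 bits.
print(entropy([1/2] * 2))   # 1.0
print(entropy([1/4] * 4))   # 2.0
print(entropy([1/8] * 8))   # 3.0
```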
The second law of thermodynamics says that the entropy of a closed system can only increase, as in the cooling off of a container of hot water in a cool environment. An organism can survive and stay alive only because it is not a closed system: it combats the process of increasing entropy by taking in energy in the form of food. Increasing entropy has also been seen as loss of order, as when a body decays when it is no longer fed.
Another item to be looked at is the number of direct connections between components, which is a measure of the possible combinations C, and thereby of the potential for novelty by interaction: C = n(n-1)/2. Its derivative, a measure of its increase, is n - ½, which beyond the minimum of n = 2 is greater than the increase of entropy log n, whose derivative is 1/n. Thus, innovation potential increases faster than entropy, the loss of order. Chaos gives opportunities.
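The comparison of the two growth rates can be illustrated with a short sketch; the function names are mine, chosen for this example:

```python
def connections(n):
    """Number of direct pairwise connections among n components: n(n-1)/2."""
    return n * (n - 1) // 2

# Marginal gains per added component:
#   d/dn [n(n-1)/2] = n - 1/2   (connections)
#   d/dn [log n]    = 1/n       (entropy)
for n in range(2, 7):
    marginal_connections = n - 0.5
    marginal_entropy = 1 / n
    print(n, marginal_connections, marginal_entropy)
# Beyond n = 2, connections grow faster than entropy:
# at n = 2 the marginal gains are 1.5 vs 0.5, and the gap only widens.
```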
Perhaps this is a way to look at the difference between democracy and authoritarianism. In the latter, order is greater, but opportunities for renewal are smaller. The price for the order is more rigidity.
In law, case law has greater entropy than adjudication based on legal codes, but it also yields greater inventiveness.
However, perhaps the model should be further refined. In other research, reported elsewhere in this blog, I proposed ‘optimal cognitive distance’. Higher cognitive distance increases misunderstanding, but at the same time increases the potential for innovative ‘novel combinations’. The conclusion is that for innovation one should seek an ‘optimal’ distance: large enough to yield innovative potential, but not too large to realise it, due to lack of understanding. Productive outcome is then a quadratic, inverted-U shaped function of distance, with the highest production at a certain intermediate ‘optimal’ distance.
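A minimal sketch of such an inverted-U can make the shape concrete; the function name and the parameter values (optimum at distance 1.0, peak outcome 1.0) are illustrative assumptions, not values from the research:

```python
def productive_outcome(distance, optimum=1.0, peak=1.0):
    """Hypothetical inverted-U: productive outcome as a quadratic
    function of cognitive distance, peaking at the optimal distance.
    The parameters are illustrative, not empirical."""
    return peak - (distance - optimum) ** 2

# Outcome rises toward the optimum and falls beyond it.
distances = [d / 4 for d in range(9)]       # 0.0, 0.25, ..., 2.0
best = max(distances, key=productive_outcome)
print(best)  # 1.0, the assumed optimal distance
```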
If we take this into account, an increased number of potential combinations comes at too high a distance. In a society, diversity is productive, but in a fragmented society of people thinking too differently, innovative potential does not increase, and democracy will not realise its potential.