Saturday, May 12, 2018

370. Entropy

Carlo Rovelli[i] proposed that much of the wondrous complexity of quantum physics can be captured with information theory (which isn’t really a theory, but a method). Here I consider whether that ‘theory’ might also help to understand humanity and society, and add to the dynamic ontology that I developed in preceding items in this blog. I begin with a brief explanation of information theory and the corresponding notion of ‘entropy’.

Information theory derives from thermodynamics, whose second law states that, left alone, a system can only dissipate its energy: a hot thing dissipates its heat into its environment and cannot by itself become hotter. The relevance to ‘my’ ontology is that this might contribute to understanding the central issue of how objects interact with their environment. States of nature decay from ‘order’, in the form of some specific, distinct configuration of elements, into the ‘chaos’ of an undifferentiated mass. In information theory, that chaos is measured as ‘entropy’. In those terms, isolated systems increase in entropy.

In biology, ‘life’ is characterized, defined even, as a system that resists the increase of entropy, maintaining its negative, ‘negentropy’, and thereby itself, by ingesting new energy in the form of food. Organs need to be maintained within a narrow range of states, within bounds of variation, in homeostasis. In death everything decays, dissipating into an undifferentiated mass.

Entropy is measured according to the formula

E = − Σᵢ₌₁ⁿ pᵢ · log₂ pᵢ

where E is the entropy, n the number of possible states, pᵢ the probability of state i, and log₂ the logarithm to base 2.

From this it follows that entropy can increase in two ways: an increase in the number of possible states (n) and an equalization of their probabilities (pᵢ). See the examples below:

n = 2: p₁ = 1/2, p₂ = 1/2 → E = 1;                                p₁ = 3/4, p₂ = 1/4 → E ≈ 0.8
n = 3: p₁ = 1/3, p₂ = 1/3, p₃ = 1/3 → E ≈ 1.6;                p₁ = 1/2, p₂ = 1/4, p₃ = 1/4 → E = 1.5
n = 4: p₁ = 1/4, p₂ = 1/4, p₃ = 1/4, p₄ = 1/4 → E = 2;    p₁ = 1/2, p₂ = 1/6, p₃ = 1/6, p₄ = 1/6 → E ≈ 1.8

Here, entropy (E) increases from top to bottom, with more states (n), and decreases from left to right, with more unequal probabilities.
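As a quick check, here is a minimal Python sketch of the formula (the helper name `entropy` is my own); it reproduces the values in the table above:

```python
from math import log2

def entropy(probs):
    """Shannon entropy E = -sum(p_i * log2(p_i)), in bits.
    States with probability 0 contribute nothing (the p*log p term vanishes)."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([1/2, 1/2]))            # 1.0
print(entropy([3/4, 1/4]))            # ~0.81
print(entropy([1/3, 1/3, 1/3]))       # ~1.58
print(entropy([1/2, 1/4, 1/4]))       # 1.5
print(entropy([1/4] * 4))             # 2.0
print(entropy([1/2, 1/6, 1/6, 1/6]))  # ~1.79
```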

In quantum physics, discussed in the preceding item in this blog, in the Copenhagen interpretation a cloud of probabilities for the location of an elementary particle collapses into a specific location, in interaction with something in its environment. Then entropy is reduced to 0: one state with probability 1.
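In terms of the formula: before the interaction there are many possible states with spread-out probabilities, so E > 0; after the collapse there is one state with p = 1, so E = −1 · log₂ 1 = 0. With the entropy helper sketched above (repeated here so the snippet runs on its own):

```python
from math import log2

def entropy(probs):
    # Same helper as above: Shannon entropy in bits.
    return -sum(p * log2(p) for p in probs if p > 0)

cloud = [1/4, 1/4, 1/4, 1/4]  # four equally likely locations
collapsed = [1.0]             # one location, certainty

print(entropy(cloud))      # 2.0 bits of uncertainty
print(entropy(collapsed))  # 0.0: the cloud has collapsed
```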

In the preceding item in this blog, I proposed something similar for language: a cloud of possible denotations of a universal (such as ‘chair’) collapses into a specific particular chair, in interaction with the other words in a sentence. An uncertain denotation collapses into a certain one.
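A toy illustration, entirely my own construction: give the word ‘chair’ a hypothetical prior distribution over possible denotations, let the sentence context rule some of them out, renormalize, and the entropy falls:

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

# Hypothetical prior over denotations of 'chair'
senses = {"kitchen chair": 0.4, "armchair": 0.3,
          "chairperson": 0.2, "professorship": 0.1}

# Suppose the sentence context ("she sat down on the ...")
# rules out the non-furniture senses; renormalize the rest.
compatible = {s: p for s, p in senses.items()
              if s in ("kitchen chair", "armchair")}
total = sum(compatible.values())
posterior = {s: p / total for s, p in compatible.items()}

print(entropy(senses.values()))     # ~1.85 bits before context
print(entropy(posterior.values()))  # ~0.99 bits after context
```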

In this blog I have proposed that the understanding of language, and of corresponding thought, suffers from an ‘object bias’: the strong inclination to see universals in analogy to objects in Newtonian time and space, similar to the problem in understanding quantum theory. What does not fit in that perspective ‘is not real’.

In philosophy, this bias drove Plato to claim that a universal has an unambiguous denotation in the form of an ‘ideal’ object, as the ‘real reality’: some unobservable that lies behind the shadows, the imperfect manifestations, that we can observe (in his famous metaphor of the cave).

So language use, according to the rules of grammar and syntax, narrows down possible meanings, reduces entropy, and is therefore a sign of life.

But what about poetry, then? It adds meanings to clouds, and connects clouds, in metaphor, seeing something in terms of something else, and thus appears to increase rather than decrease entropy, disturbing order rather than creating it. Is poetry then not to be seen as a ‘form of life’ (as Wittgenstein called language)?

Remember that there are two ways to increase negentropy: reduce the number of possible states, here meanings, or increase the inequality of their probabilities, making some meaning salient, ‘sticking out’, precisely because it does not satisfy the existing order. And that is what poetry does. Bureaucratic, ‘normal’ language levels meanings down to some norm, flattening their distinctions, and thereby increases entropy.
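In the terms of the formula, salience is skew: if one of four meanings becomes far more probable than the rest, entropy drops, even though no meaning is removed. A small check, again with the helper from above:

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

flat = [1/4, 1/4, 1/4, 1/4]     # 'bureaucratic' levelling: all meanings alike
salient = [0.7, 0.1, 0.1, 0.1]  # poetry: one meaning sticks out

print(entropy(flat))     # 2.0
print(entropy(salient))  # ~1.36: lower entropy, higher negentropy
```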



[i] Carlo Rovelli, 2016, Reality is not what it seems, Penguin.  
