The General Theory of Capital: Self-Reproduction of Humans Through Increasing Meanings
“According to this way of measuring information, it is not intrinsic to the received communication itself; rather, it is a function of its relationship to something absent—the vast ensemble of other possible communications that could have been sent, but weren’t. Without reference to this absent background of possible alternatives, the amount of potential information of a message cannot be measured. In other words, the background of unchosen signals is a critical determinant of what makes the received signals capable of conveying information. No alternatives = no uncertainty = no information. Thus Shannon measured the information received in terms of the uncertainty that it removed with respect to what could have been sent” (Deacon 2013, p. 379).
Thus, the average amount of information H a culture-society contains can be measured by the number of (counter)facts it generates and the probability of their occurrence. The Shannon entropy H is an indicator of the complexity of a culture-society as a whole. If we look at the history of human cultures-societies, we see that their complexity has consistently grown: from a meager set of primitive meanings (tribal community, elementary language, simple stone tools, causal mini-models, animism and fetishism) to a complex arsenal of meanings characteristic of agrarian societies (fields and livestock, agricultural and craft tools, city-states and empires, writing and literature, ancient and Arabic science, world religions).
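In its standard form, Shannon's entropy for a source with n possible alternatives occurring with probabilities p_1, ..., p_n is

$$H = -\sum_{i=1}^{n} p_i \log_2 p_i,$$

so H grows both with the number of possible (counter)facts and with how evenly their probabilities are spread: the larger and the less predictable the repertoire of a culture-society, the higher its entropy.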
Cultural evolution has increased the complexity of entire cultures-societies as well as individual meanings. As mentioned earlier, the complexity of a meaning is determined by the minimum number of figurae required to reproduce it. Suppose we have a meaning s that can be represented as a string with a certain number of figurae. The length of this string is L(s). In this case, the complexity of a meaning s is defined by the length of the shortest program s* that can describe this meaning. The length of the program s* is called the algorithmic entropy of s or Kolmogorov complexity K(s):
“The key concept of algorithmic information theory is that of the entropy of an individual object, also called the (Kolmogorov) complexity of the object. The intuitive meaning of this concept is the minimum amount of information needed to reconstruct a given object” (Vinogradov et al. 1977-1985, vol. 1, p. 220).
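In standard notation, if U is a fixed universal machine (the usual convention of algorithmic information theory) and ℓ(p) denotes the length of a program p, then

$$K(s) = \min\{\, \ell(p) : U(p) = s \,\},$$

and K(s) never exceeds L(s) by more than a constant, since any string can be described by the trivial program "print s".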
We call the program s* the least or minimal action necessary to reproduce the meaning s. The complexity of the meaning s depends on the length (size) of the minimal action required to reproduce it. For example, the string asdfghjkl can be described only by itself; its length is 9 non-repeating figurae. However, if a string s has a pattern, even a non-obvious one, it can be described by a minimal action s* that is much shorter than s itself. For example, the string afjkafjkafjkafjk can be described by the much shorter string afjk together with the instruction to repeat it as many times as necessary.
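A minimal sketch of this contrast in Python, using zlib compression as a crude, computable stand-in for the length of the shortest program (Kolmogorov complexity itself is uncomputable, and the variable names below are illustrative only):

```python
import random
import zlib

def description_length(s: str) -> int:
    """Length of the zlib-compressed form of s: a rough, computable
    proxy for the length of the shortest program that reproduces s."""
    return len(zlib.compress(s.encode("ascii"), 9))

# A patterned string: the text's example "afjk" repeated over and over.
# Its minimal action is essentially "repeat afjk N times", so its
# compressed description stays tiny however long the string grows.
patterned = "afjk" * 1000  # 4000 characters

# A string without such a pattern, built from the figurae of "asdfghjkl":
# no short rule captures it, so its description stays close to its own length.
random.seed(0)
patternless = "".join(random.choice("asdfghjkl") for _ in range(4000))

print(len(patterned), description_length(patterned))      # long string, tiny description
print(len(patternless), description_length(patternless))  # long string, long description
```

On a typical run the patterned string compresses to a few dozen bytes, while the patternless one remains many times longer, mirroring the difference between K(s) being far smaller than L(s) and K(s) being close to L(s).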
“The distinction between simplicity and complexity raises considerable philosophical difficulties when applied to statements. But there seems to exist a fairly easy and adequate way to measure the degree of complexity of different kinds of abstract patterns. The minimum number of elements of which an instance of the pattern must consist in order to exhibit all the characteristic attributes of the class of patterns in question appears to provide an unambiguous criterion” (Hayek 1988-2022, vol. 15, p. 260).