1. Anthropocene, exosomatization and negentropy

Maël Montévil, Bernard Stiegler, Giuseppe Longo, Ana Soto and Carlos Sonnenschein

Bernard Stiegler, The Final Warning, Serpentine Galleries (September 2018)

The industrial economy took shape between the late eighteenth century and the nineteenth century, initially in Western Europe and then in North America. Besides technical production, it involves technological production – the integration of the sciences in order to produce industrial goods – to the strict extent that, as Marx showed, capitalism makes knowledge and its economic valorization its primary element.

Newton’s physics and the metaphysics that goes with it provided the epistemic (in Michel Foucault’s sense) and epistemological (in Gaston Bachelard’s sense) framework of this great transformation, in which otium (productive leisure time) submits to negotium (worldly affairs, business). All along, mathematics has been applied through ever more powerful and performative calculating machines.

Following precursors such as Nicholas Georgescu-Roegen, himself inspired by Alfred Lotka, we maintain that political economy in what is now called the Anthropocene (whose features were delineated by Vladimir Vernadsky in 1926) is a challenge that requires a fundamental reconsideration of these epistemic and epistemological frameworks. With Darwin, living beings became part of a historical process of becoming. In humans, knowledge is a performative part of this process, shaping and reshaping lifestyles in order to tame the impact of technical novelties.

*

A brief historical introduction: knowledge and technics

The intellectual context of the industrial revolution is the idea that science and economy, especially trade, would become the new basis of legitimacy, security, justice, and peace. For example, Hume argued that the gold standard spontaneously adjusts the balance of payments between states. The underlying scientific paradigm is Newtonian, where deterministic mathematical laws are the ultimate embodiment of knowledge. In this perspective, equilibrium and optimization follow from the relations between the parts of a system. Studies describe spontaneous, optimal equilibria and therefore promote the withdrawal of rational supervision once the intended dynamic takes place; further intervention would break the balance of these equilibria. Along these lines, scientific and technological developments yield progress through the optimization of processes and the providence of spontaneous balances. However, by construction, such analyses neglect the context of a situation even when this context is the condition of possibility of this very situation. Moreover, following the same rationale, both in science and in industry, complicated situations are reduced to a combination of simple elements that can be known and controlled. Then, for example, the production of a single craftsman can be decomposed into simple tasks performed by several specialized workers and eventually by machines. This method entails the progressive loss of workers’ knowledge through its transfer to the technological apparatus; this was first described by Adam Smith and later by Karl Marx, who named this trend proletarianization. This loss of knowledge is a critical component of a more general process of denoetization, that is, the loss of the ability to think (noesis). Technics has become technology, and like technics, technology is a pharmakon: like drugs, it can lead to both positive and toxic outcomes.

At the same time that these events took place, major new scientific ideas emerged. Darwin’s views on biological evolution provided a historical framework for understanding living beings. Darwin’s framework has been interpreted by some as another instantiation of the Newtonian model of science, while others emphasized the originality of historical reasoning in natural science. In this Darwinian framework, the living world is no longer a static manifestation of divine order. Instead, current life forms stem from a process of historical becoming. This change of perspective led to questions about the becoming of humankind and the role played by human intelligence in this process; in particular, eugenics and social Darwinism emerged – against Darwin’s view, which embraced the singularity of human societies.

Another scientific framework appeared on the scene. With the industrial revolution, heat engines were developed, raising theoretical questions that gave birth to thermodynamics. Physicists developed the concept of entropy and showed that entropy can only increase in isolated systems. In physics, energy is conserved by principle, but the increase of entropy means that it becomes less usable to perform macroscopic tasks. In a nutshell, the increase of entropy in a physical system is the process of going from less probable to more probable macroscopic states. It follows that the increase of entropy is the disappearance of improbable initial features and their replacement with more probable features. This means the erasing of the past. This notion departed from the reversibility of classical mechanics – the latter lacks an objectivized arrow of time – and brought about the cosmological perspective of the heat death of the universe. This concept goes hand in hand with the discovery of chaotic dynamics by Poincaré and the refutation of Laplace’s view that mathematical determinism entails predictability, thus undermining, in principle, the notion of mathematical predictability and control of natural phenomena. In particular, Poincaré’s work applies to the solar system, whose stability cannot be ascertained. These scientific developments provide a precarious view of the cosmos.
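To recall the standard formalism behind these statements (a textbook formulation, not specific to this article), Boltzmann's relation ties the entropy of a macroscopic state to the number of microscopic configurations that realize it, and the second principle states that the entropy of an isolated system cannot decrease:

\[ S = k_B \ln W, \qquad \Delta S \geq 0 \quad \text{(isolated system)}, \]

where W is the number of microstates compatible with the macrostate and k_B is Boltzmann's constant. More probable macrostates correspond to larger W, hence higher entropy, which is why the spontaneous increase of entropy amounts to a drift toward more probable configurations.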

Nevertheless, in the XXth century, determinism sensu Laplace found a second wind with mathematical logic and the computer sciences that followed. These developments took place when industrial production shifted to consumer capitalism, a framework driven by mass consumption. Mass media are designed to trigger standard responses from consumers. As a result, the trend of denoetization extends to consumers as such – for example, processed foods led to a loss of folk cooking knowledge and contributed to the pandemic of non-communicable diseases such as obesity.

In this context, the lax notion of information became central. Shannon coined a precise concept of information in order to understand the transmission of written or audio messages in noisy channels of communication. A very different concept was proposed by Kolmogorov to describe how hard the generation of a given sequence of characters is for a computer program. Specifically, Shannon’s theory states that information means improbability. This idea becomes absurd when used to assess meaning rather than to face transmission difficulties (noise), which was Shannon’s original motivation. For example, a constant binary sequence has maximum information sensu Shannon, while a random sequence has maximal information sensu Kolmogorov (i.e., elaboration of information), and both limit cases have more information in their respective sense than a play by Shakespeare of the same length. Despite the incompatibility of these frameworks and their limits, the received view in current cognitive sciences – themselves dominating representations in digital capitalism – is that intelligence is information processing, that is to say, a computation. Similarly, information plays a central role in molecular biology in spite of the failure to characterize it theoretically. Last, ignoring early criticism by authors such as Poincaré, the economy has been conceptualized as a process of spontaneous, mathematical optimization by “rational” agents, with possibly biased information processing due to “imperfect” cognitive processing.
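A minimal sketch can illustrate that neither measure tracks meaning; it is not part of the article's argument, and it uses the empirical symbol-frequency entropy as a stand-in for Shannon's measure and the zlib-compressed size as a crude upper-bound proxy for Kolmogorov complexity (which is itself uncomputable):

```python
import math
import os
import zlib
from collections import Counter

def frequency_entropy_bits(data: bytes) -> float:
    """Shannon entropy (bits per symbol) of the empirical byte distribution."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compressed_size(data: bytes) -> int:
    """zlib-compressed size: a rough, computable upper bound related to Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

meaningful = b"To be, or not to be, that is the question." * 8  # meaningful English text
random_seq = os.urandom(len(meaningful))                        # random bytes of the same length

for label, data in [("English text", meaningful), ("random bytes", random_seq)]:
    print(f"{label}: {frequency_entropy_bits(data):.2f} bits/symbol, "
          f"compressed to {compressed_size(data)} of {len(data)} bytes")

# Both measures assign more "information" to the random sequence than to the
# meaningful text: they quantify improbability or incompressibility, not meaning.
```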

At the beginning of the XXIst century, computer use has spread in diverse forms (such as personal computers, smartphones, and tablets). Their connection in networks has deepened and transformed the role of media. Private interests started competing to catch and retain the attention of users. With these technologies, the services provided to users depend on users’ data, and at the same time, service providers use these data to capture the users’ attention. These transformations led to a further wave of automatization. Algorithms like those used in social networks formalize and automatize activities that were foreign to the formal economy. These changes lead to further losses of knowledge and to a denoetization in which attention itself is disrupted. Since the received view in cognitive sciences is that intelligence is information processing, several scientists consider the algorithms used to be artificial intelligence and neglect the conditions of possibility of human intelligence, such as attention. At the same time, management, as well as commercial platforms, decomposes humans into tables of skills, interests, and behaviors that feed algorithms, drive targeted political and commercial marketing, and shape training and recruitment policies.

The same trend occurs in the sciences: knowledge tends to be balkanized into ever more specialized fields of investigation, and scientific investigations tend to be reduced to the deployment of new observation apparatuses and of new information processing on the data obtained. By contrast, theorization is a necessary process for science, and it is a synthetic activity that reevaluates the concepts and history of a field, empirical observations, and the insights of other fields. With the emergence of data mining, Chris Anderson advocated the end of theory. This perspective has been accurately criticized; however, the decline of theorization in the sciences seems to come mostly from another path. Following society’s general trend, it comes as the indirect result of institutional restructurings and the increasing weight of scientific marketing, both in publications and in funding decisions. It also comes from an insufficient critical assessment of digital technologies and their consequences for scientific activities; it follows that the academic appropriation of these technologies to mitigate their toxic consequences and push forward scientific aims is lacking (except for purely mathematical questions).

Now, at the beginning of the XXIst century, we are also witnessing a rising awareness of the consequences of human activities on the rest of the planet, leading to the definition of a new era: the Anthropocene. The Anthropocene is characterized by human activities that tend to destroy their own conditions of possibility – including both biological organizations (organisms, ecosystems) and the ability to think (noesis). In this context, the ability to generate knowledge to mitigate the toxicity of technological innovations is deeply weakened, to the extent that the problem of this toxicity is seldom raised as such by governments and societies.

Entropies and the Anthropocene

Energy or mineral resources, such as metals, are conserved quantities from the perspective of physics; however, there is some truth in saying that these resources are becoming scarce. A crucial concept for understanding these situations is that of entropy. Entropy describes configurations and is directly related to our ability to use such resources. For example, ore deposits are at an improbably high concentration – generated by far-from-equilibrium geological and atmospheric processes – and human activities concentrate them further by the use of free energy. For these resources, the critical concepts are the dispersion and, conversely, the concentration of matter; that is, the entropy of their distribution on Earth. However, a straightforward accounting of entropy is not conceptually accurate, and it is necessary to provide a finer-grained discussion of the articulation of entropy and the living, including the special case of human societies.
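To make the link between dispersion and entropy explicit (a standard result of equilibrium thermodynamics, given here only as a reference point), the entropy of mixing of an ideal mixture of n moles with mole fractions x_i reads

\[ \Delta S_{\mathrm{mix}} = -nR \sum_i x_i \ln x_i \geq 0, \]

so dispersing a metal at low concentration through the crust or the oceans increases entropy, while re-concentrating it requires an expenditure of free energy at least equal to the corresponding T\Delta S.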

From the perspective of thermodynamics, biological situations are not at maximum entropy and do not tend towards maximum entropy. The low and sometimes decreasing entropy of biological objects seems to “contradict” the second principle of thermodynamics, which states that entropy cannot decrease in an isolated system. However, biological situations, including the biosphere as a whole, are not isolated systems. Biological situations are open; they use flows of energy, matter, and entropy. At the level of the biosphere, the sun is the primary provider of the free energy used by photosynthetic organisms. Therefore, biological situations do not contradict the second principle. A consequence is that biological organizations and, by extension, social organizations are necessarily local and depend on their coupling with their surroundings. In organisms, the relationship between the inside and the outside is materialized and organized by semi-permeable membranes.
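In the standard notation of non-equilibrium thermodynamics (a textbook formulation rather than a result of this article), the entropy balance of an open system separates exchanges with the surroundings from internal production:

\[ \frac{dS}{dt} = \frac{d_e S}{dt} + \frac{d_i S}{dt}, \qquad \frac{d_i S}{dt} \geq 0, \]

where d_eS/dt is the entropy flow exchanged with the surroundings (which can be negative) and d_iS/dt is the internal entropy production required by the second principle. An organism can keep its entropy low, or even lower it, only insofar as the export term compensates its internal production.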

How can we move forward in understanding biological situations and their articulation with thermodynamics? Predicting requires theoretically singling out a situation among many others: typically, the state that the changes of the object will bring about. Entropy maximization singles out a macroscopic state: the one that maximizes entropy. Functions performing this role in physics are called potentials. There is a diversity of potentials in the field of equilibrium thermodynamics; they are different variants of free energy, they involve entropy, and their relevance depends on the coupling between the system studied and its surroundings. However, in the case of systems far from thermodynamic equilibrium – situations that require flows with the surroundings in order to last, like organisms –, there is no consensus on the theoretical existence of such a function or family of functions. For example, Prigogine’s fundamental idea is that the rate of entropy production (i.e., the rate of energy dissipation) could play the theoretical role of a potential; however, this idea is valid only in particular open systems. It follows that the ability to understand general far-from-equilibrium systems by calculus is not theoretically justified. From a less technical perspective, Schrödinger introduced the idea that the problem in biology is not to understand order from disorder, as in many physical situations, but instead to understand order from order. To capture this idea, he proposed to look into negative entropy, an idea which was later elaborated by Brillouin, who named the corresponding negative entropy “negentropy.”
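For reference, the equilibrium potentials alluded to here take a standard form (textbook thermodynamics, not specific to this article). For a closed system held at fixed temperature and volume, the Helmholtz free energy

\[ F = U - TS \]

is minimized at equilibrium, whereas at fixed temperature and pressure it is the Gibbs free energy G = U + pV - TS that is minimized: which potential singles out the final state thus depends on the coupling with the surroundings. Prigogine’s theorem of minimum entropy production provides an analogous criterion only in the linear regime near equilibrium, which is why it does not extend to organisms and other systems far from equilibrium.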

However, negative entropy does not precisely reflect biological organizations. Entropy can be lowered just by decreasing temperatures, while biological organizations remain as such only within a range of temperatures. A major glaciation would decrease entropy, but it would also destroy biological organizations. Moreover, functional parts of biological organizations often involve a local increase of entropy in order to be functional. For example, the diffusion of a compound from its production location to the rest of the cell is a process of physical entropy production. Nevertheless, this process leads the said compound to reach locations where it can play a functional role. It follows that an articulation between entropy and biological organizations requires a careful analysis. In a nutshell, biological organizations maintain themselves far from maximum entropy configurations thanks to fluxes from their surroundings. At a given time, they actively sustain this situation by the interaction between their parts and these fluxes. The necessary coupling between organisms and their surroundings takes place in ecosystems that are themselves embedded in larger levels, up to the biosphere. The viability of living situations stems from the systemic properties of these various levels and, at the same time, from the underlying history that originated organizations in their respective past contexts. More generally, the way biological organizations sustain themselves is fundamentally historical, i.e., they stem from natural history. This historicity implies a particular vulnerability to fast anthropogenic changes that disrupt biological organizations at various levels simultaneously. Examples of such changes are climate change at the level of ecosystems, or endocrine disruptors at the level of organisms. Moreover, life forms continue to change over time by generating new structures and functions. More than individual species, biologists emphasize the conservation of biodiversity and of the branching process of evolution that we may call biodiversification. This process is itself the object of anthropogenic disruptions. In short, biological organizations are precarious because the existence and the nature of their parts are fundamentally contingent and these parts need to be actively sustained. Organizations sustain themselves in ways that stem from past contexts and can reorganize given sufficient time; however, both processes are disrupted by anthropogenic changes. This argument is well accepted in state-of-the-art biological knowledge, and at the same time, these matters are insufficiently theorized.
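As a reference point for the diffusion example (standard transport theory under the assumptions of an isothermal, ideal dilute solution, not a claim of the article), the concentration field obeys Fick's second law, and the associated entropy production density is non-negative:

\[ \frac{\partial c}{\partial t} = D \nabla^2 c, \qquad \sigma = k_B D \frac{|\nabla c|^2}{c} \geq 0, \]

so the homogenization of the compound produces entropy even while delivering it to the sites where it plays a functional role.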

A possible strategy to go further in this analysis is to propose a concept complementary to that of entropy (and to its mathematical opposite, negentropy). Bailly, Longo, and Montévil proposed such a new concept, called anti-entropy, which refers to biological organizations (organs, functions …). In contrast to (digital) information, which is a one-dimensional notion (Shannon’s and Kolmogorov’s alphanumeric strings), the geometry and dimensions of anti-entropy do matter. A living organism produces entropy by transforming energy, sustains its anti-entropy by setting up and renewing its organization continually, and produces anti-entropy by generating organizational novelties.

Anti-entropy aims to accommodate biological organizations in their historicity. Current life forms sustain themselves by the use of functional novelties that appeared in the past (anti-entropy) and by the production of functional novelties (anti-entropy production). These novelties are unpredictable and unprestatable a priori (i.e., their nature cannot be stated in advance). At the same time, they are not generic random outcomes. They are specific because they contribute to the ability of biological objects to last over time by contributing to their organization in a given context (a context that this organization may impact). Entropy depends on the coupling of a system with its surroundings. Similarly, anti-entropy is relative to an organization, and not all objects are organized. For example, considered alone, a heart has no function; it is only at the level of the organism that it is endowed with a function. As a result, all discussions of anti-entropy are relative to an intended organized object, that is to say, to a specific locality.

As pointed out by Lotka, a specificity of human societies is the importance of inorganic objects in their organizations, such as tools, written texts, or computers. These objects are shaped and maintained by human activities. The constitution of objects theoretically analogous to organs outside organic bodies is called exosomatization by Lotka, and this process underlies how humans’ ways of living evolve.

In order to enable these inorganic objects to have a functional role and to limit the destabilization they introduce, evolution and developmental and physiological plasticity play a role in the process of exosomatization. For example, reading recruits the plasticity of several brain areas in ways that depend on the writing system. However, these purely biological responses are insufficient, and noetic activities are required to complete the process of exosomatization. For example, philosophy can be interpreted as a reaction to writing and to its use by the sophists, with possibly catastrophic consequences for the polis. In contemporary terms, for a technic, finding a market by the use of marketing is far from sufficient to make it desirable. It is also required to find variations and uses that mitigate the toxicity of these technics – especially in the perspectives of climate change, the decline of biodiversity, and denoetization. In other words, more work is required to single out exosomatic novelties (i.e., technics and technologies) that would be compatible with a desirable future for humankind. In this perspective, knowledge in all its forms plays a special role. Knowledge prescribes variants and uses for the novelties introduced by exosomatization and is tied to ethics.

Computers participate in this process and can be defined as automatic rewriting systems. With the increase of their speed and of their inputs (data), computers’ ability to process information and perform categorization increases dramatically. However, the tasks that they can perform are not equivalent to the novelties produced by human work. In the latter, meanings are produced that are neither in the initial data nor in their combinations by algorithmic methods. For example, the principle of inertia describes a very exotic situation on Earth, where no forces are exerted on an object (e.g., no friction and no gravitation): it cannot be derived from data, but was posed by Galileo as an asymptotic principle, a way to “make sense” of all movements at once and to analyze what may affect them, that is, friction and gravitation. Similarly, equal rights between citizens and gender equality are political principles that trigger a departure from former situations and reshape social organizations; they cannot be deduced from the former situations. These examples are historically significant in their respective domains; however, such processes are, in a sense, ordinary in human activities. They define work by contrast with labor: the former is also the permanent “invention of a new configuration of sense.” The current trend, however, is unfortunately not to develop work in this sense; instead, it is a convergence between algorithms and human activities. This convergence means a sterilization of work through its standardization – its transformation into generic information processing.

The scientific consensus is that the current path of civilization leads to its destruction, in particular by identifying anti-entropy, extended to social organizations, with information – a one-dimensional flattening. Work invents new tools and uses, and thus constructs new configurations and sense for human and ecosystemic interactions. It thereby departs from alphanumeric combinatorics within a pre-given set of possibilities (computational data processing), and it is required at all levels of society to face the current crisis.