Date: 16-05-1995
URL: http://igw.tuwien.ac.at/igw/Personen/fleissner/papers/Entropy/Entropy.html


Entropy and Its Implications for Sustainability

Peter Fleissner and Wolfgang Hofkirchner[1]

  Abstract
  1. The Roots of Entropy in Thermodynamics
  2. Entropy Becomes a Measure of Structure
  3. The Background of the Unending Controversies
  4. Entropy and Techno-economic Processes
  5. Running out of Free Energy is no Actual Threat to the Economy

    Abstract

    The paper deals with the concept of entropy in different contexts. The mathematical description is the common denominator for entropy in all of these contexts: in physics entropy is a measure of the usability of energy (Clausius' macro-concept as well as Boltzmann's combined micro-macro approach); in communication theory (Shannon) entropy is a measure of the degree of surprise or novelty of a message; in biology and sociology entropy is closely connected with the concept of order and structure. Thus, it is important to clarify how the concept used depends on the context of application. We have to analyze how the above concepts can be linked to economics and the environmental sciences. Although it is evident that all economic processes of production, distribution and consumption necessarily transform free energy into dissipated heat, it still must be asked whether the Second Law of Thermodynamics really restricts economic activities, and thus whether Georgescu-Roegen's fourth law holds.

    The paradigm of self-organization shows how the evolution of dynamic systems can be explained in a scientific manner in spite of the law of entropy. Guided by this idea, the feasibility of a sustainable society has to be investigated.

    1. The Roots of Entropy in Thermodynamics

    Since the times of Clausius (1822-1888) the concept of entropy has been the subject of intensive debates. These discussions were to a great extent prompted by a radical change in the world view of physics. While classical mechanics as well as quantum mechanics had perceived the world as being reversible (the result is the same if one solves the basic equations towards the future or the past), entropy focuses on the irreversibility of physical processes. It represents a quantitative measure for the tendency of a closed system towards thermodynamic equilibrium. Thermal gradients are leveled out, while entropy is growing. Consequently one had to accept that virtually all real processes are connected with an overall increase in entropy. Clausius' analysis of the Carnot (1796-1832) process indicated that reversibility may still be true for imaginary processes, but any real activity has to be regarded as being irreversible. Hence his statement of the Second Law of Thermodynamics: The entropy of a closed system can only increase, but never decrease. To state it in the economic word game: "There is no such thing as a free lunch". Any activity degrades the energy intake to a certain degree and is necessarily linked to waste heat. In the long term perspective, this means the "heat death of the universe".

    Schrödinger[2] and later on Prigogine[3] and his school qualified the Second Law of Thermodynamics by stressing that its range of application is true for closed systems only. They proved the opposite tendency (an entropy decrease over time) for an open system which eats up energy of low entropy and dissipates energy of higher entropy to its environment. The overall system does not violate the Second Law. Nevertheless, more unlikely material structures can arise locally. They are the basis for the development of any higher forms of evolution which counteract the overall tendency of entropy increase. In addition to Schrödinger, who had formulated a more static view, Ilya Prigogine was able to present self-organizing mechanisms which definitely produce higher-ordered macro structures. An element of uncertainty still remains which seems to be essential for evolutionary processes: while the occurrence of the macro structure can be predicted with certainty, its precise location in space and time remains fuzzy.

    The next step of the debate started with Boltzmann (1844-1906) who, by analyzing heat as a statistical phenomenon, was able to base the (macro-) concept of entropy on the possible (micro-) states of the particles in a thermodynamic system. His well known function for entropy

    S = k log W,

    where k = 1.38 × 10^-23 J/K is the Boltzmann constant, and W - roughly speaking - is the number of possible micro-states which correspond to one macro-state, reaches a maximum if all the micro-states are equally distributed. Micro-states in this respect refer to particles which are in a particular state of energy or carry a particular momentum (mass times velocity). They are energy-related. The investigation of the consequences of the different distributions of the micro-states in the phase space is one of the main goals of statistical thermodynamics[4]. Each of these states can be interpreted as being representative of a "region" in an abstract "space". Through his definition of entropy Boltzmann established a link between the heat- (and thus energy-) related notion of entropy and a measure of structure of abstract micro-states.
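    Boltzmann's formula can be evaluated directly in a few lines. The following sketch is our own illustration; the function name and the toy values of W are invented for the example, not taken from the text:

    ```python
    import math

    k = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(W):
        """Entropy S = k log W of a macro-state realized by W equally likely micro-states."""
        return k * math.log(W)

    # A macro-state realized by more micro-states has higher entropy:
    assert boltzmann_entropy(100) > boltzmann_entropy(10)
    ```

    The monotone growth of S with W is what makes the uniform distribution of micro-states the entropy maximum mentioned above.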

    2. Entropy Becomes a Measure of Structure

    In their famous booklet "The Mathematical Theory of Communication" Shannon and Weaver[5] exploited the second meaning of entropy by using the identical mathematical formulation to characterize the average information of a message transmitted from a source to a sink through a channel. This gave rise to a source of confusion which has not yet come to an end. Entropy in this context reaches its maximum if each of the single pieces of information can occur with equal probability, but there is no connection to physical reality. The "micro-states" no longer represent momenta or energy levels. The only things in common are the name and the mathematical formula.
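    The Shannon measure and its maximum at the uniform distribution can be sketched numerically. The probability vectors below are invented for illustration; only the formula itself comes from the text:

    ```python
    import math

    def shannon_entropy(probs):
        """Average information (in bits) of a source emitting symbols with the given probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    uniform = [0.25, 0.25, 0.25, 0.25]
    skewed  = [0.70, 0.10, 0.10, 0.10]

    # Entropy is maximal (log2 4 = 2 bits) when all symbols are equally probable:
    assert abs(shannon_entropy(uniform) - 2.0) < 1e-12
    assert shannon_entropy(skewed) < shannon_entropy(uniform)
    ```

    Note that nothing in this computation refers to heat, energy, or physical micro-states - which is precisely the point of the paragraph above.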

    It may be characteristic for the contemporary situation that the choice of the term entropy was suggested by John von Neumann, who gave Claude Shannon the advice to use this concept. The reason: this term would promote discussion of his theory "because nobody would understand its meaning"[6].

    Consequently the US physicist Jeffrey Wicken stated: "As a result of independent developments in thermodynamics and in information theory, today there exist two 'entropies' in science, this is one too many"[7].

    Since then, an inflation of the concept of entropy can be found in different branches of science and social science. We shall give but a few examples: in mathematical statistics and econometrics the formula of entropy in its logarithmic form is used to define a goal function for maximum likelihood estimators[8], in sociology it is used to measure social equality, in political science to describe voting behavior[9], in economics to characterize market activities[10], in the theory of neural networks it is applied as a tool for finding the global minimum instead of local minima by simulating an entropy function using the method of "simulated annealing"[11], and in biology entropy is used as a measure of complexity[12] of organisms (by reversing the sign).

    Georgescu-Roegen has used the entropy concept to construct a Fourth Law of Thermodynamics in which he extended the entropy concept to matter and arrived at a very pessimistic conclusion[13]: there is no possibility of a complete recycling of matter once it is dispersed. He states that in a system like the earth (nearly no exchange of matter with the environment) mechanical work cannot proceed at a constant rate forever, or, there is a law of increasing material entropy. This means that it is not possible to get back all the dissipated matter of, for instance, tires worn out by friction.

    Such a statement could of course not pass unchallenged. The inherent contradiction of the Fourth Law and the Second Law was recently revealed[14]. We shall return to Georgescu's Law later on.

    Following Boltzmann's correct step of linking the energy-oriented measure of entropy to an energy-structural measure of micro-states, many epigones applied the entropy concept to other dimensions of reality without any further thought as to whether linking the two aspects in the new field of application remains correct as well. Order of any material structure on the macroscopic level was more and more identified with negentropy, and its connection to available energy was taken for granted. Thus it is no surprise that Werner Ebeling, the "Prigogine of the East", comes to the following conclusion: "It has to be mentioned that for economic-technological processes the quantification of flows of entropy has not as yet been solved"[15]. This can be illustrated by a recent paper by T. H. Dung[16]. In describing consumption and production he separated entropy completely from energy or heat. The term "energy" is not used even once throughout his paper. The link to micro-states has been severed as well. Entropy in Dung's context means some state of disorder of macro-structures. The question remains open, then, as to why he still believes in the applicability of the Second Law of Thermodynamics to his concept of entropy.

    3. The Background of the Unending Controversies

    While it seems correct that the evolution of any ordered material structure requires free energy and creates waste heat, the reverse statement is not true. It is not possible to regain the energy used during the construction of a house, although an ordered material structure was formed. On the contrary, dismantling the house once again requires free energy (to blow it up or to break it down into bricks and other parts). The order produced (falsely identified with low thermodynamic entropy) does not give rise to any amount of free energy. The same holds for the familiar Shannon entropy. There is no longer any connection to thermodynamics, but merely pure structures of signals, devoid of any material basis.

    One link nevertheless remains: if one wants to realize a physical structure which carries information (let's say 1 bit), the minimum expenditure of energy can be computed by means of the well-known Boltzmann entropy[17]

    S(1 bit) = k log 2.

    Since entropy on a thermodynamic level can be described by Clausius' formula

    S = Q/T,

    where Q represents a heat difference which can be expressed in energy units, and T represents temperature measured in kelvin, one ends up with the amount of energy needed to create the smallest possible measurable difference in matter:

    E(1 bit) = k T log 2.

    As one can see from the formula, the lower the temperature, the lower the level of energy needed. Thermal noise has to be overcome by such an amount of energy that the microstate of the particle to be coded remains fixed. It is even more speculative to extend the formula further to the amount of mass necessarily connected with one bit of information. One could apply Einstein's energy equivalent E = mc^2 to the formula above such that

    m(1 bit) = (k T log 2)/c^2

    represents the necessary mass equivalent to encode 1 bit in a material structure.
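    The three formulae can be chained numerically. The sketch below is our own; the room-temperature value T = 300 K is an illustrative assumption, not from the text:

    ```python
    import math

    k = 1.380649e-23   # Boltzmann constant, J/K
    c = 2.99792458e8   # speed of light in vacuum, m/s
    T = 300.0          # assumed room temperature, K

    S_bit = k * math.log(2)       # S(1 bit) = k log 2
    E_bit = k * T * math.log(2)   # E(1 bit) = k T log 2, via Clausius' S = Q/T
    m_bit = E_bit / c**2          # m(1 bit) = (k T log 2)/c^2

    print(E_bit)  # on the order of 3e-21 J
    print(m_bit)  # on the order of 3e-38 kg
    ```

    At room temperature the energy bound per bit is thus some twenty orders of magnitude below everyday energy scales, which motivates the remark below that the formulae are not yet binding for real chip technology.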

    It may be that these formulae will become real restrictions for chip technologies if they shrink towards orders of magnitude of the size of elementary particles. At the contemporary level of technology they are simply not yet applicable. Current memory chips for computers, e.g. the 64 MBit chip on an area of about 1 cm^2, carry flip-flops of an average length of about 1/1,000,000 m = 10^-6 m. To compare this figure with atomic dimensions: the Bohr diameter of the oxygen atom is of the order of 10^-10 m.

    4. Entropy and Techno-economic Processes

    In order to describe all our actual economic activities (production and consumption of consumer goods and services and/or the production and use of means of production like machinery, construction and intermediary goods) it seems sufficient to measure the activities by using the concept of available or free energy and waste heat. The majority of economic processes use all the energy intake for the production of the desired commodity (or material structure) or service (a material process) and, in the end, transform the energy into waste heat. The exceptions are the production processes in food and agriculture, the conversion processes in the energy sector, and, quantitatively less important, in chemistry. In the above cases the output of production can be used as an energy source again, either for consumption (where chemical energy is used to maintain a temperature gradient between the body and the environment of many mammals and human beings, although finally that gradient is transformed into waste heat) or for starting new production activities. If enough energy is available, no restrictions to the production process will apply directly through the Laws of Thermodynamics.

    In fact Georgescu-Roegen did not base his Fourth Law on theoretical grounds, but on empirical data which weaken the persuasive power of his argument. He connected his Fourth Law to the limited availability of energy to humankind. Since humankind cannot rely on nonrenewable energy resources in the long run, the energy base has to be reconstructed towards renewable sources, namely solar energy. In the quoted article he repeats the findings by SOLAREX Corporation that the energy input for the construction of photovoltaic cells is higher than their output over their lifetime. His main evidence consists of the rather outdated information by SOLAREX, whose final report on "The Energy Requirement for the Production of Silicon Solar Arrays" stated that "the harnessed energy is insufficient for reproducing the array even if all the materials necessary are supplied gratis".

    Implicitly Georgescu-Roegen expects humankind to run short of energy and therefore matter cannot be recycled.

    More recent sources unanimously tell a different story: photovoltaic devices need 3 to 7 years to pay back the energy consumed in their production, while their lifetime is about 20 years[18]; thermal collectors of solar energy have an energy pay-back period of only 2 to 5 years (the lifetime is, again, about 20 years)[19]. The pay-back period depends on the amount of recycled aluminum used for the device.
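    These pay-back figures imply a simple energy-return ratio. The helper below is a hypothetical illustration of that arithmetic, using the round numbers cited above:

    ```python
    def energy_return_factor(lifetime_years, payback_years):
        """How many times a device returns its embodied energy over its lifetime."""
        return lifetime_years / payback_years

    # Photovoltaics: ~20-year lifetime, 3- to 7-year pay-back period.
    pv_best  = energy_return_factor(20, 3)
    pv_worst = energy_return_factor(20, 7)

    # Thermal collectors: ~20-year lifetime, 2- to 5-year pay-back period.
    th_best = energy_return_factor(20, 2)

    # Even the worst case returns more energy than was invested:
    assert pv_worst > 1
    ```

    A factor above 1 is exactly what Georgescu-Roegen's SOLAREX source denied; the contrast between the two data sets is the empirical core of the argument here.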

    What seems to be true for the energy surplus seems to be true for the economic surplus as well. Recently the US company United Solar Systems, Virginia, announced a breakthrough in the production of photovoltaics. While the energy produced by earlier technologies cost approx. 50 cents per kilowatt-hour, the recent technology brings costs down to less than 15 cents. Nicholas Lenssen, energy expert of the Worldwatch Institute in Washington, D. C., indicates that: "For the fabrication of these photovoltaic cells by a kind of sputtering of the metal onto glass, fewer resources are needed and less waste is produced ... The efficiency of these cells will probably be further improved ... Prices will decline and there will be more funds for research."[20]

    The photovoltaic use of solar energy is not the only option. Alongside it we find the solar-chemical and the solar-thermal options. Solar-chemical processes would either allow for a hydrogen-based economy which exploits the photolytic effect[21] (breaking up water molecules into hydrogen and oxygen by photons - in a manner analogous to electrolysis), or biomass production for renewable energy resources. It can be shown that a mere 5% of the biomass would be sufficient to satisfy the world's energy demand. At the moment about 1.5% of the biomass is used for human and animal food, and 2% is consumed as fibers or wood. While the photolytic effect does not yet have any applications in our economy, the solar-thermal option locally plays an increasing role. Particularly for low (central heating and hot water) and for medium temperatures (tower concepts based on light-concentrating mirrors, e.g. in France, Italy, or in Barstow, California, or Ocean Thermal Energy Conversion = OTEC) practical solutions are available[22]. All the solar options share an additional advantage: they will not increase carbon dioxide in the atmosphere.

    From the above the following understanding can be derived[23]:

    1. The sun is the long term source which can deliver us plenty of free energy for billions of years;

    2. The amount of free energy available to humankind depends on the level of technological development;

    3. In theory technological principles are known which could be applied to use solar energy more efficiently; and

    4. These technologies will be put into practice when the energy price is high enough to make them economically feasible.

    5. Running out of Free Energy is no Actual Threat to the Economy

    Such empirical findings would dispense with Georgescu-Roegen's Fourth Law as well as the pessimistic mood about the use of available energy. As far as we can see, production on earth now and in the foreseeable future is not restricted by the laws of thermodynamics. Nevertheless, difficulties may arise for other reasons, for example by changing the major feedback loops of our biosphere, or by poisoning the atmosphere, etc., but there is no limit with respect to thermodynamic entropy.

    Methodologically speaking, the manifold laws determining the scope of human action compose a hierarchy of encapsulated cones of possible developments, the cosmic evolution being a chain of intertwined cycles of ever more sophisticated, qualitatively different self-organizing systems. The laws of physics apply to the entire universe and cannot be disregarded. They define the ultimate boundaries that cannot be transcended by any feasible development. However, they may be - and have been - prerequisites for realizing possibilities of further developments. These new developments constitute only a fraction of the range of physically possible developments. The phenomena belonging to the new cone are subject to laws that are specific to these phenomena exclusively and are not applicable to phenomena outside of this cone. Though these phenomena are restricted to the existence of well-defined conditions essential for this segment they increase the variety in which reality appears.

    Biotic phenomena are governed by biotic laws that are far more specific to them than laws ruling both inorganic and living matter. Societal developments, in turn, are made possible by biotic laws, but not fully determined by them. They are subordinate to more specific laws, and the range over which they are valid strictly covers the cone of cultural phenomena. So the boundaries of each cone are far more restrictive than those in which the cones are embedded. The more general the laws are, the less specific they are in explaining any particular phenomenon.

    The laws concerning the degradation of energy in a physical sense are applicable to every open and dynamic system with regard to the physical aspects of the system. But the laws do not determine the specific way in which a dissipative structure, a living system or a human society obeys them. If the Law of Entropy holds for the universe as a whole, it is by far the most distant boundary mankind will ever come close to reaching.

    There are urgent challenges in much greater proximity. The human race has so far not yet learned to control its interference in the biosphere, thereby causing a range of environmental problems such as a poisoned atmosphere, a change of natural cycles of matter and energy caused by man-made substances, the possible depletion of the ozone layer, the dying of the forests, the greenhouse effect, land degradation, reduced biodiversity and so forth. Most of these problems arise from the fact that humans overtax the buffer capacity beyond which given natural global cycles and feedback loops threaten to stop working. These problems are in no way entropic, but bio-geo-chemical. Biotic laws give necessary conditions to be fulfilled if metabolism is to take place but they do not define in particular how it has to be done and which strategy the organisms will pursue. Likewise, there are no definite ways in which mankind can meet the challenges. Instead there are a variety of ways to solve environmental problems, depending on the state of historically accumulated knowledge. The solution of environmental problems therefore cannot be reduced to physical or biological considerations - it remains a societal, that is mainly an economic and political task. The question arises how production can be designed so that it is in harmony with our natural environment - and this is a political question.

    Thus we arrive at the understanding that in principle plenty of usable energy is available in our environment. There is no need to believe that the laws of thermodynamics will bring humankind to an end. All these effects point to the fact that it is our task to restructure our economy towards a sustainable one. The question remains as to whether the social decision process can be directed towards making energy resources available to everybody all over the world.


    Peter Fleissner
    Department for Design and Assessment of Technology/Social Cybernetics
    Möllwaldplatz 5/187
    A-1040 Vienna
    e-mail:peter@iguwnext.tuwien.ac.at
