The Way of Entropy: from Lagrangian Modelling to Thermal Engineering

This article discusses the concept of entropy from an alternative thermodynamic viewpoint, demonstrating dialectically that the reversibility illustrated in common laboratory practice is only a local technical effect resulting from anthropic processes that slow down the irreversible advance of disorder. Negative entropy, then, is only a fiction stemming from imaginationist idealism. The Lagrangian formalism is applied through the introduction of the idea of temporal confinement of thermal energy states, with time interpreted as the basis of an evolutionary variable. The acceleration of entropy is formally presented independently of statistical mechanics.

The thermodynamic nature of the bioeconomic discourse cannot be ignored in view of the vast evidence of environmental degradation caused by mass industrialization, with clearly irreversible consequences.
There are many ways to observe the entropic character of our existence as a species capable of developing technological culture, starting with the belief that we are the supreme result of evolution. The subjective theories that interpret probabilistic models and the arrow of time as artefacts of ignorance, rather than as representative schemes of an objective reality, are merely distorted echoes of participatory or strong versions of the anthropic principle. We are fascinated by the idea that intelligence sits at the top of the order, something very doubtful. Apparently, intelligence has been far more disruptive of order than constructive, far more entropic than any force of nature; complex, no doubt, but dangerous. Particular emphasis is placed here on the fact that complexity, as an evolutionary aspect, does not necessarily mean progress (in terms of increasing order).
The depreciation of language by technological means of communication has also been a marked factor in increasing entropy with respect to social behaviour and the quality of what is produced culturally. From the point of view of the intelligentsia, it is a fact that the number of genuinely great minds qualified to lead transformations in knowledge is decreasing, or they are falling silent (whether this is just a phase in a wider cycle, only time will tell; the outstanding fact is that never in human history has there been so much technology available for consumption as today). This is clearly seen in the quality of literature in general, in music and in the visual arts. One factor that certainly contributes to this phenomenon is that technology has occupied a large part of humanity's lifetime; we even dare to say that it tends definitely to obstruct society's access to science, if not to replace science itself; technological dazzle is making us less and less able to reflect constructively. Noise, bad taste, corruption and wickedness have prevailed in our daily lives alongside unrestrained consumerism, the social and environmental cost of which we would not risk estimating. All of this converges in the same direction as time and, therefore, as entropy.
The scraps that humanity leaves wherever it goes are the clearest material evidence of the progressive degradation of the terrestrial system, debris of a civilization that bet all its chips on an overwhelming industrialism headed towards the depletion of planetary natural resources. The old adage of economics that "man will always be in a position to find new sources of energy and to invent new ways to control them, for his benefit" is becoming more and more overtly the myth that it has always been. In addition, we have long realized that the strength of those who seek to achieve and defend the common good and protect the greatness of our Pale Blue Dot, as Sagan poetically called planet Earth [24], is much less than the strength of the predatory organizations that control the destiny of the world.
We thus see that the most immediate laboratory for verifying the generalization of the growth of entropy with time is the planet Earth. On the progress of entropy over time, Rifkin was quite emphatic: "[...] time as we experience it is irreversible. Time only goes in one direction, and that is forward. That forward direction, in turn, is a function of the change in entropy. Time reflects the change in energy from concentration to diffusion or from order to increasing disorder. If the entropy process could be reversed, then everything that has been done could be undone." [22] Disagreements about the concept of entropy are well known. The reader can find an interesting discussion of these divergences in Swendsen [29]. Given the great diversity of views about the very meaning of entropy, we decided to re-discuss entropy based on three statements: 1. There is no isolated system; 2. There is no negative entropy; 3. There is no absolute reversibility. With respect to the first proposition, it is sufficient to refer the reader to the arguments of Borel [5], who proved that no finite physical system can be considered closed. The last proposition is the result of a finding beautifully presented by Planck: "A process which could in no way be completely reversed I called a 'natural process'. The term that has become universal usage to express this idea is today: irreversible." [18] As for what we call a "reversible process", there is no better definition than Norton's: "The label 'thermodynamically reversible process' denotes a set of irreversible processes in a thermal system, delimited by the set of equilibrium states."
[17] Under the hypothesis of controllability, the basis of all human action, we can obtain the second proposition and summarize everything that has been said in the three statements as follows: outside the classic mechanistic approach, there is only technical reversibility, that is, reversibility produced anthropically against the natural course of events, at the expense of a considerable amount of energy, giving the impression of an entropic reversal between two states of equilibrium.
So, we think that a good way to phrase the heart of the discussion raised is to make a small correction to an observation by Aldous Huxley: "We think of time as [...] something moving irreversibly in one direction. The whole idea is expressed in the scientific notion of increasing entropy: we move continuously in one direction and life is a temporary cancellation of entropy within a larger system." [10], rewriting it as: "We think of time as [...] something moving irreversibly in one direction. The whole idea is expressed in the scientific notion of increasing entropy: we move continuously in one direction and life is a temporary deceleration of entropy within a larger system." Then we can say that, according to the second law of thermodynamics slightly modified, in every natural process the sum of all accelerations and decelerations of entropy, in all bodies involved in the process, always points to a global acceleration. Starting with an infinitesimal moment immediately before the Big-Bang, everything was just radiation "contained" in a kind of micro-limbo that we may call "abspace" (ab = separated + space), the "first cause", without which there is no evolution, and in which no energy was dissipated. The Big-Bang was, as it were, the beginning of the advance of entropy, the beginning of dissipation and, consequently, of degradation. This was a phase of very low entropy, the lower limit that marked the initial state of our universe, something similar to a hermetically sealed box containing a corpuscular gas in which all the particles were arranged along a single wall of the box.
So, the second law of thermodynamics bequeathed us the possibility of starting from a phenomenal evolutionary perspective more complete and more consistent with the current model of the universe supported by modern cosmology. Leaving aside the old imaginationist idealism, present in the debates of a pre-Einsteinian physics still unaware of the ideas of cosmic expansion and the Big-Bang, we can free entropy from the obscure notions that surround it. One of these obscure notions, purely idealistic and absolutely unnecessary, is that of negentropy. It was born under the strong influence of mechanics, in the subjective propositional context that assumes local order to be the result of an inversion of the general trend of degradation, in such a way that, with more local systems of positive entropy than of negative entropy, the global balance would always be of positive and increasing entropy.
Our first thoughts on thermodynamics had the main objective of detaching it from the recurrent attempts to make it more mechanistic. In fact, such attempts reflected, from modernism onward, the idea of an industrious and orderly world in which mechanics was the discipline of order par excellence. Such a glorious and progressive world could not accept a science that described the degeneration of this ephemeral order as something inexorable; not after Victor Hugo's lyrical speeches about progress and the wonderful times to come. Even Poincaré was enthusiastic about industry: "If I take industrial development as a good thing, it is not just because it offers a good argument to the defenders of science; it is mainly because it gives the scientist faith in himself, and also because it provides an immense body of experience [...]." [19] CALIBRE Vol.5, Suplemento Dezembro, 2020.
The world built and centered on the laws of mechanics is the culmination of a true anthropic manifesto, the extension of the mechanisms that characterize our own most trivial everyday actions. In an industrialist society, mechanisms not only enchant people, but serve as references for progress. As Bachelard said so well, "It is customary to imagine that this general reference to the mechanism stems from the fact that we are a center that produces forces and that we can thus awaken, dynamize the geometry of movements that, without it, would be a useless spectacle." [2] In the words of Rifkin, "The mechanical paradigm proved to be irresistible. It was simple, it was predictable, and above all it worked. Here, it appeared, was the long-sought-for explanation of how the universe functioned. There was an order to things, and that order could be ascertained by mathematical formulas and scientific observation." [22] In aid of the perfect anthropic order model, statistical mechanics introduced attenuating arguments showing that all was not lost.
Certainly, for complex systems the Boltzmann-Gibbs entropic form for non-equilibrium cannot be generalized, especially in the presence of long-range memory. In such cases, we may start by establishing the probabilistic view of entropy production over time, given in the usual way by dS/dt = Π − Φ, where Π represents the rate of entropy production in an irreversible process and Φ denotes the flow of entropy from the system to the environment per unit of time; thereafter, we proceed, for instance, to the analysis of the evolution of probability distributions with the nonlinear Fokker-Planck formalism, for which it is convenient to relate the production of entropy to a probability current J(x, t). Thus, from the nonlinear Fokker-Planck equation it is possible to obtain, for a defined boundary, an expression in which λ denotes a characteristic distance of the system called the "London penetration length". An analytical deepening along this line is outside the scope of the present work, since we are not interested in dealing with a statistical entropy at this moment. For more details, we recommend, for instance, Balescu (1975) [3]. The reader must understand that the theory we defend does not intend to substantiate a critique of probabilistic models. We recognize that Statistical Mechanics has contributed considerably to the understanding of non-equilibrium phenomena. However, as Sproviero has well observed regarding the law of entropy, "... there is an attempt to 'mitigate' it in two ways: either by trying to deprive it of universality, by means of a new science, Statistical Mechanics (Ludwig Boltzmann), in which (in the sub-atomic world) there could be exceptions; or by trying to recognize practical meaning in it only for long cycles such as the 'apagón' of the sun, predicted for cosmic periods of time." [28]. Thus, we do not want to mitigate the role of entropy in the understanding of reality (quite the opposite!), nor to give way to imaginationist idealism.
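As an aside not in the original text, the balance between the production rate Π and the outflow Φ can be illustrated with a minimal numerical sketch of ours, using hypothetical parameters: a body cooling toward its environment by Newtonian cooling, for which the integrated entropy production is provably non-negative.

```python
# Illustrative sketch (not from the paper): entropy bookkeeping dS/dt = Pi - Phi
# for a body cooling toward an environment. Parameters are arbitrary.

def simulate_cooling(T0=400.0, T_env=300.0, C=1.0, k=0.1, dt=0.01, steps=5000):
    """Body of heat capacity C cools toward T_env; returns the total entropy
    produced (Pi integrated over time), which must be non-negative."""
    T = T0
    S_produced = 0.0
    for _ in range(steps):
        q = k * (T - T_env)                 # heat flow, body -> environment
        # entropy balance: the body loses q/T, the environment gains q/T_env
        Pi = q * (1.0 / T_env - 1.0 / T)    # production rate, always >= 0
        S_produced += Pi * dt
        T -= q * dt / C                     # body temperature update
    return S_produced

S = simulate_cooling()
assert S >= 0.0   # second law: entropy production is never negative
```

With no temperature difference the production vanishes; any finite difference yields strictly positive production, the elementary content of the Π term above.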
Our debate is aimed at clarifying some points regarding the understanding of the role of entropy and its relationship with the emergence of complexities.

Thermal energy in a Lagrangian perspective
In former works, our classical Lagrangian representation of thermal energy was inspired by Louis de Broglie's discussions on relativistic thermodynamics [6], starting with the expression by which he represented a particle as a small heat reservoir in a reference system within which it moves at speed βc. The idea of small reservoirs of thermal energy remained, however, not as particles but as small intervals of time. Subsequently, the introduction of a caloric field generalized the initial idea [26], starting to address the interactions between matter and thermal energy at different levels. In particular, the theory of caloric fields, as we have called it, was applied in the description of thermodynamic processes of condensed matter transformations [26]. Since then, relevant questions have been raised by collaborators about the association between entropy and the time arrow, a subject that still generates great confusion. Now, it is deduced from the second law that order can increase locally, as long as disorder in the neighbourhood increases. Since the increase in disorder reflects the increase in entropy, the increase in order reflects its decrease. But this creates a problem; if entropy follows the time arrow, its decrease creates a contradiction, since a local inversion of the time arrow is not experienced. In the universe we live in, time always goes from past to future. Because of this, it sounds more logical to assume that entropy always increases under any circumstances, locally or globally; what varies is its acceleration. Order is simply a temporary correlate of the way neighbouring systems with different equilibrium boundaries (and, additionally, different accelerations of entropy) interact.

II: TIME AND ENTROPY

Lagrangian forms and topology
In a previous work [26], the present approach was associated with the engineering of thermodynamic systems from the perspective of entropy control, though without the analytical discussion carried out now. Back then, a direct analysis of a non-zero Lagrangian functional was sufficient. Since time is understood here as an evolutionary variable in a certain abstract configuration space, it seemed reasonable to us to adopt an analytical procedure in which time is the physical basis of the generalized coordinates of that configuration space (in a sense, we could speak of a "thermal time", even though this sounds very abstract!).
We defend that, in reality, entropy never decreases. What happens is that irreversibility creates, in certain circumstances, interesting situations of organization associated with states of equilibrium that constitute attractors. In other words, equilibrium states are just boundaries of irreversible sequences [17]. Thus, we can think that there are irreversible processes occurring in neighbouring regions whose respective time intervals establish different equilibrium boundary states. It is the interaction between these states that gives rise to those situations where a certain degree of order appears despite the inexorable advance of entropy.
While the entropy of a dynamic system always tends to increase in time, it is acceptable to assume that technical control actions only slow the progress of entropy. Thus, it is appropriate to speak of the variation of entropy, even if we initially think in terms of the fiction of an isolated system. Looking at some thermal engine, as in reference [26], the total variation in the generation of entropy can be written accordingly. However, if what matters is the rate of change of the entropy, and if the entropy has the same direction as the time arrow, then it is useful to establish a Lagrange functional in which τ_ref is the characteristic transition time interval of the system, called the "reference time", and f(H) is a generalized coordinate given by the Heaviside function of the time interval, which can be translated into Macaulay kets. The Macaulay functions, associated with the Heaviside step function, are used here to start a polynomial thermal loading at some time interval in the entropic evolution of the system. It is important to stress that this analytical technique, first introduced by the German mathematician Alfred Clebsch (1833-1872), did not receive much attention until the works developed by the English physicist and electrical engineer Oliver Heaviside (1850-1925), and further applications due to the 1933 Nobel Laureate physicist Paul Dirac. Even so, there are currently few works that apply it.
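For readers unfamiliar with the notation, here is a minimal sketch of ours (not the paper's code) of the Macaulay bracket ⟨t − t₀⟩ⁿ: zero before t₀ and a polynomial loading afterwards, with the Heaviside step as the n = 0 case.

```python
# Minimal illustration of the Macaulay bracket <t - t0>^n used in the text:
# zero before t0, polynomial loading after. Names are ours, not the paper's.

def macaulay(t, t0, n):
    """<t - t0>^n: 0 for t < t0, (t - t0)**n for t >= t0 (n >= 0),
    with the convention H(0) = 1 for the step case n = 0."""
    if t < t0:
        return 0.0
    return float((t - t0) ** n)

# The Heaviside step is the n = 0 case: H(t - t0) = <t - t0>^0
assert macaulay(0.5, 1.0, 0) == 0.0   # before t0: no loading
assert macaulay(2.0, 1.0, 0) == 1.0   # after t0: unit step
assert macaulay(3.0, 1.0, 2) == 4.0   # quadratic ramp: (3 - 1)^2
```

The function switches on a polynomial "thermal loading" only after the trigger instant t₀, which is exactly the role the kets play in the ergodetic Lagrangian.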
The Lagrangian form then assumes expression (3). One of the advantages of the Lagrangian formalism is that generalized coordinates can be conveniently chosen to exhibit the symmetries of the system or restrictive topological features. In the present approach, the use of kets symbolizes the discontinuities of state between one time interval and the following one. This Lagrangian was called "ergodetic", which means that it refers to thermal energy evolving over short intervals. The second term refers to the effects of the external supply of thermal energy in a given time interval over a reference period. For τ = τ₀, the Lagrangian is not defined according to the rules for kets. In other words, because of this singularity, there is no thermal energy passing through the past, which means that entropy always points to the future. Now, let us take the Euler-Lagrange differential equation for a non-dissipative situation, d/dτ(∂L/∂ḟ) − ∂L/∂f = 0, which leads to equilibrium expression (6). In addition, we can interpret Q̇_int as the heat-transfer interaction on the interior side of a border at temperature T₂, so that the entropy loaded in the variation of the phase path of the interaction follows. An equivalent ergodetic Lagrangian form (8) would be obtained from the Dirac delta function. From here we can think about extending the analysis by applying the Hamiltonian formalism. Let us derive the Hamiltonian functional (9) for Lagrangian (3). Performing time differentiation on Lagrangian (3) and Hamiltonian (9), we can write an expression which is, in fact, Lagrangian (8) minus ∂H₁/∂τ. We can repeat this sequence of calculations indefinitely, where m is the ket exponent for the internal flow and n is the ket exponent for the external flow. So, there is a family of Lagrangian functionals, derived from Lagrangian (3), describing different energy confinements, which are related in pairs through expression (11).
Thus, if we have a Lagrangian (m, n), we find the corresponding Lagrangian through expression (12). Two things are easy to notice: 1) the subsequent Lagrangians, from the first (3), are always the time derivatives (taken only with respect to the kets) of the respective immediately preceding functionals; 2) the equilibrium expression (6) maintains its form for any functional up to a constant, which can even be negative. As interesting as this may be from a mathematical point of view, physically there is a severe constraint: entropy cannot be negative! Therefore, only the first two representations have physical significance. However, we can move in the opposite direction. Instead of deriving one functional after another from the first, we can integrate it successively with respect to the kets to obtain, disregarding integration constants, a family of functionals perfectly in accordance with expression (6) up to a constant arising from the exponents of the kets. For instance, let us take the ket-double-integral of Lagrangian (3). Applying equation (11), we obtain precisely the first ket-integral of Lagrangian (3). So, we can generalize equilibrium expression (6) with β as the power constant. For Lagrangian L(2,3), entropy now depends on a ket power. Clearly, β = 1, 1/n. Note carefully that the method employed combines intrinsic and extrinsic analytical operations; the partial derivatives in the Lagrange equation and the integrals are taken with respect to the generalized coordinates (the kets and their temporal derivatives), while the partial derivatives of the Hamilton and Lagrange functions are taken directly with respect to time, regarding the rules valid for kets.
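The generating rule behind this family, differentiation and integration taken only with respect to the kets, can be checked numerically on the active region τ > τ₀, where a ket reduces to an ordinary power. The sketch below is our illustration under that assumption, not the paper's formalism.

```python
# Numeric check (illustrative) of the ket rule d/dtau <tau - tau0>^n
# = n <tau - tau0>^(n-1), valid on the active region tau > tau0.

def ket(t, t0, n):
    """Macaulay ket <t - t0>^n: zero before t0, power law after."""
    return (t - t0) ** n if t >= t0 else 0.0

t0, n = 1.0, 3
t, h = 2.5, 1e-6                      # sample point well inside t > t0
numeric = (ket(t + h, t0, n) - ket(t - h, t0, n)) / (2.0 * h)
exact = n * ket(t, t0, n - 1)         # the next member of the family
assert abs(numeric - exact) < 1e-4    # rule holds away from the singularity
```

Away from τ = τ₀ the kets differentiate and integrate like ordinary polynomials, which is why the family of functionals can be generated in either direction; at τ = τ₀ itself the rule fails, in line with the singularity discussed in the text.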

First thoughts about all this
We understand that science is, ultimately, the search for universal constants that enable the perception and description of the relatively stable aspects of the world of external things. This is why it seems interesting to assume β as the equilibrium constant for different situations of time confinement of thermal energy.
In principle, our model serves to reopen the discussion on possible metrics (not quite in the conventional sense of the word "metric") in a thermodynamic manifold, since we can speak of an extensive variable, the "distance" (duration) in a figurative temporal manifold. Furthermore, the multiplier equilibrium constant β appearing in expression (6) is inherited from the power of the time interval in the kets, in such a way that we could, so to speak, associate entropy with a "metric index" derived from the time confinement of thermal energy established by the Lagrangian functional. As Lavenda said, "A holy grail of thermodynamics has been the search of a metric which would allow one to determine distances on a thermodynamic, or primitive, surface." [14] And he continues: "It still remains that the theory of curvature which relies on a metric which specifies a line element is nonexistent in Gibbs space." [14] Also, Lavenda recalls Tisza's statements about the impossibility of "a metric based on length and/or the orthogonality of the basis vectors" in Gibbs' space [14]. We believe that we have a new starting point here to fill this void, as long as we consider entropy a native quantity of a temporal manifold (see figures 1 and 2), since it does not depend on the spatial trajectory. To reassert: it is not a "metric" in the common sense, not least because the reference is only temporal; it is a metric index because it comes from the exponents of the kets. Indeed, the different states of confinement of thermal energy, connected by expression (11) and associated with different accelerations of entropy, boost the emergence of various systemic configurations.
The introduction of the function H as a generalized coordinate aims to establish small time intervals along which the variation of the entropy appears with an evolutionary signature; moreover, the Lagrangian functional is not defined for a continuous deformation τ → τ₀, since for τ = τ₀ we have ⟨τ − τ₀⟩⁻¹ → ∞ and ⟨τ − τ₀⟩⁰ → ∞. Thus entropy, seen as a quantity associated with the energy "failure" of the system, must be linked, by its very definition, to the time arrow, never pointing to an instant before the instant of observation. In its unidirectional temporal link with energy, it is a trace of the evolution of the system. Consequently, the proposed ergodetic Lagrangian avoids any idealistic abstractionism that seeks to symmetrize entropy as a classic mechanistic concept.
There is a saying from Weinberg that fits this discussion well. Regarding the standard model, he states that it "... cannot be deduced only from mathematics. Nor does it directly result from the observation of nature". And further on he continues: "It results from conjectures, and is guided by aesthetic judgment and validated by the success of many of their predictions." [31]. We think this applies to entropy, in whose study we have an example of where the limits between pure mathematics and the mathematical representation of physical phenomena appear. Going through Figures 1 and 2, mathematically, at τ = τ₀, a "hole" is formed in the corresponding manifold that does not impose any real consequences for the analysis in the region defined by τ < τ₀, where ⟨τ − τ₀⟩⁰ = 0 and ⟨τ − τ₀⟩⁻¹ = 0; there is no associated physical constraint. On the contrary, physically, for t to be less than t₀, the necessary continuous deformation of state γ passing through state α (or of α passing through γ) makes it impossible to overcome the "hole", since when γ and α coincide the quantities involved lose their meaning in one true entropic "explosion". There would be no way to reverse or avoid this event, since γ and α are on the same causal line. Here is an extreme instance of physical irreversibility. In the realm of human industrial action, considering a thermodynamic process E₀ → E₁ as a manipulation of the control variables to change a system from the state E₀ to the state E₁, each interval ⟨t − t₀⟩^µ comprises an irreversible industrial process between two equilibrium states.

Elementary relativistic considerations
From approaches taken by Tolman long ago on the application of ordinary thermodynamics to an infinitesimal region [30], it is possible to draw some interesting conclusions, including the introduction of time variation in a generalized function. Strictly speaking, in terms of increasing entropy conjoint with the progression of time, in flat spacetime the line element using Galilean coordinates can be written as ds² = −dx² − dy² − dz² + c²dt². To consider a variation in an infinitesimal volume of spacetime, let us assume the four-dimensional element δx δy δz δ⟨τ − τ₀⟩⁻¹. Placed in this way, the infinitesimal volume element reflects the impossibility of freely navigating time back and forth, showing the distinct nature of the temporal component associated with entropy by specifying a rigid temporal topology (no matter the metric assumed). Note, however, that this is a purely Galilean, particular semantic constraint imposed by the very entropic nature of the universe in which we live; generalized functions can be applied under different circumstances in general relativity to analyze the expansion dynamics of the spacetime geodesic arc element (for light-like or space-like geodesics) [25].
If we take into account the essence of the second law, the increase in entropy for the material content of the volume δx δy δz during δ⟨τ − τ₀⟩⁻¹ is greater than or equal to the entropy that flows from the outside into the infinitesimal volume, whether it comes from the transfer of heat or of matter. According to Tolman, considering our temporal representation, this allows us to write inequality (17), where ℘ is the density of entropy, u, v and w are the macroscopic entropy flow velocities in the region considered, and δQ/T is the entropy entering the volume from outside during the time δ⟨τ − τ₀⟩⁻¹, with T being the temperature at the border of the volume (note that the singular function in time appears only as a variational quantity, reflecting the hypothesis that time, represented as an evolutionary magnitude, must remain under an orientation constraint). Also, Tolman emphasized that since entropy is invariant under Lorentz transformation, entropy density is affected by the Lorentz-FitzGerald contraction, allowing us, after some manipulations and substitutions, to rewrite equation (17) as equation (18), which is consistent with expression (7) ("greater than" for irreversible processes). The left-hand side of equation (18) poses no difficulty; but what about δ⟨τ − τ₀⟩⁻¹? As we saw earlier, according to Figures 1 and 2, in deforming the lines τ and τ₀ continuously there will be a point at which τ = τ₀, and then, by the rules for kets, ⟨τ − τ₀⟩⁻¹ → ∞, as well as ⟨τ − τ₀⟩⁰ → ∞. Both the Lagrangian functional and the relativistic expression would no longer be defined. The energy required for such a topological distortion, if it were possible, would be unimaginable, corresponding to heat dissipated to such a degree that the increase in entropy would reach catastrophic levels.
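Tolman's remark can be made concrete with a small numerical sketch of ours, using arbitrary values: the total entropy is invariant while the density is rescaled by the Lorentz factor, exactly compensating the contraction of the volume.

```python
# Illustrative check of Tolman's point: total entropy S is Lorentz invariant,
# but the volume it occupies contracts, so the entropy density grows by the
# Lorentz factor. Numbers below are arbitrary, not from the paper.
import math

def entropy_density_moving(rho0, beta):
    """rho0: proper (rest-frame) entropy density; beta = v/c."""
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return rho0 * gamma            # contracted volume -> higher density

rho0 = 2.0                          # arbitrary rest-frame entropy density
S_rest = rho0 * 1.0                 # rest-frame volume V0 = 1
for beta in (0.0, 0.6, 0.9):
    V = math.sqrt(1.0 - beta ** 2)              # contracted volume
    S_moving = entropy_density_moving(rho0, beta) * V
    assert abs(S_moving - S_rest) < 1e-12       # entropy itself is invariant
```

The rescaling of ℘ is purely kinematic bookkeeping; no entropy is created or destroyed by the change of frame, which is what allows inequality (17) to be rewritten covariantly.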

The meaning of the expression δQ = T dS
The industrialist view of thermodynamics, represented by the numerous illustrations of thermal machines and graphics of the Carnot cycle, restricted classical school teaching to the mechanistic view of the 19th century, emphasizing a pseudo-reversibility that survives only in the superficiality of an anachronistic approach to the subject. This view tends to obscure the general meaning of the expression δQ = T dS, associating it with the entropic process itself, and not with the process boundaries.
To illustrate how far we can go from the expression δQ = T dS, which is nothing more than an equilibrium relation, let us consider a more complex example starting from studies by Jacobson [12] [13]. In general relativity, we can take heat as an energy flow through the area of some causal Rindler horizon, setting a proportionality between entropy and that horizon area, and presuming the thermodynamic equilibrium relation δQ = T dS. To satisfy this relation, having in mind the energy flux across the area of the local Rindler horizon, Jacobson states that "...the gravitational lensing by matter energy must distort the causal structure of spacetime in a way that the Einstein equation holds." [12]. For a Rindler horizon, say R, we consider an accelerated observer and identify T as the Unruh temperature, with the heat flow equal to the energy flux measured by that observer. From the horizon generator k^a and its affine parameter λ, we may define the expression dX^a = k^a dλ dA, (19) where dA is the area element. As the entropy is assumed to be proportional to the area, we have dS = η δA, δA being the area variation of a cross section of a bundle of R-generators, with θ as the expansion of the horizon generators. So, to assume that δQ = T dS ∝ δA is to say that the energy flux is associated with a convergence of the horizon generators on the area element.
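For a sense of scale, the Unruh temperature identified with T here is the standard T = ħa/(2πck_B); the short sanity check below (our illustration, not part of Jacobson's derivation) shows how fantastically small it is for everyday accelerations.

```python
# Illustrative: the Unruh temperature T = hbar * a / (2 * pi * c * kB) that
# the accelerated Rindler observer attributes to the horizon. Standard
# formula; the numeric check is just for scale.
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 2.99792458e8         # speed of light, m/s
KB = 1.380649e-23        # Boltzmann constant, J/K

def unruh_temperature(a):
    """Temperature seen by an observer with proper acceleration a (m/s^2)."""
    return HBAR * a / (2.0 * math.pi * C * KB)

# Even at Earth-gravity acceleration the effect is fantastically small:
T_g = unruh_temperature(9.81)
assert T_g < 1e-19   # on the order of 4e-20 K
```

The smallness of T for laboratory accelerations underlines why the equilibrium relation δQ = T dS at a Rindler horizon is a statement about local causal structure rather than about any measurable heating.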
The rate of convergence of the generators is given by Raychaudhuri's equation, which under the stationary conditions imposed simplifies to dθ/dλ = −R_ab k^a k^b. (20) Integrating this equation over a small interval of λ, we then set the heat flow through the stress-energy tensor T_ab, so that, with the aid of equation (19), we can satisfy the equilibrium relation δQ = T dS = T η δA, as in (23), provided condition (24) holds. This is how we impose the local equilibrium condition δQ = T dS in such a context; as in general the horizon will be expanding, contracting or shearing, we need to match these quantities to zero in the vicinity of the horizon. So, we are describing thermal interactions between regions by means of energy flows, with the equilibrium state of the system being a boundary interleaving two irreversible processes. From equation (24), Raychaudhuri's equation can be rewritten in a thermodynamic form.

Reverse time: in short, fiction and mess

We live in a reality that, like it or not, evolves from less probable states to more probable ones. Our existence works this way; we struggle to stay healthy as long as we can, knowing that in the end we will cease to exist, at least in the way we understand a highly organized biological system. Hence, only in a fictional manner could we conceive a reversal of such a reality. Our technological culture certainly offers us the means to simulate reversals through engineering, simulations that are illusions of ascending order dragged through time. Figures 1 and 2 illustrate the reasoning; Figure 1 represents topologies 1 and 2; Figure 2, topologies 3 and 4. Let two events α and γ in a causal chain be positioned respectively in time bands τ₀ and τ, which define the edges along which the phenomena advance together. In our Universe we have to assume as an unavoidable premise the orientation "time arrow-cum-entropy", from past to future, as so clearly explained by Hawking [9].
Living here under this strong constraint, to technically "reverse" entropy we would have to, locally, deform the time band τ₀ in order to make the event γ prior to α (Figure 1). Other equivalent possibilities would be to retard γ, making it prior to α, or to combine the two deformations (Figure 2). As we have already stated, however redundant it may appear, even assuming this were feasible, the energy cost of such a topological prowess would be unimaginable, not to say unattainable! The heat dissipated would be of such magnitude that the increase in entropy would reach truly catastrophic levels, with unpredictable consequences for reality.

III: FIELD THEORY
The acceleration of entropy

"While entropy tells us the direction of time, it does not tell us the speed. The fact is, the entropy process is constantly changing speed. With every occurrence in the world, entropy increases, but sometimes slower, sometimes faster."

Jeremy Rifkin
There are two ways of dealing with entropy: either its convergence with time is given up, assuming it can be negative, or it becomes the very manifestation of time. The difficulty seems to stem from a certain inability of the understanding to combine statistics with recurrence/irreversibility. Whitehead revealed, to some extent, this difficulty: "The real question is that this exact recurrence of a state of nature merely seems improbable, whereas the recurrence of an instant of time entirely violates our concept of the order of time. The instants of time that have passed are past; they can never be again." [32] The argument from the high improbability of a given phenomenon, from which the logical impossibility of its occurrence in a causal natural chain is inferred, is more like a metaphysical recourse introduced ad hoc. Much of the introductory probabilistic analysis in thermodynamics invariably starts under the influence of Poincaré's ideas about recurrence and with the abstraction of the isolated system, something that does not exist in practice. However, for some initial questioning, the image is acceptable as a provisional conjecture. Reichenbach made an enlightening discussion of irreversibility from Boltzmann's theory [21]. We will try to summarize the central idea in order to treat Gibbs's objection, which permeates the entire discussion. Imagine an isolated system constituted by a gas in which we randomly assign positive and negative signs to the velocities of the constituent particles. Let the state B of this gas be characterized by the absence of mixing (positive velocities are separated from negative ones). To be as realistic as possible, there must be a mixed state A preceding and originating B. Also, let us assume a mixed state after B, say, C. According to Boltzmann's theory, in terms of relative probability, B → C (C given B) is much more likely than A → B (B given A).
However, according to Boltzmann himself, the probability of a given state does not depend on the assigned sign. To avoid any contradiction and Gibbs's objection, we must say that, in terms of absolute probability, B → C is as frequent as A → B (or C → B), a fact that does not contradict Boltzmann's premise. It so happens that, in this way, the perspective of the relevance of temporal direction is completely lost, since going from B to C is as likely as going from C to B. This is the end; the utility of the isolated-system model finishes here. That is the reason why we must abandon the abstraction of closed systems; rather, when we assume that state B has an external origin, it is no longer conditioned to the highly improbable path A → B, and time direction returns to the scene, with B certainly preceding C. The most idealistic may object that, at the macroscopic extreme, the universe could be a closed system, but such an assumption is absolutely speculative; there is no indication that would lead us to assume an isolated universe. Tush, it is the perception of the succession of facts that allows us to assimilate time! And, as we have seen, it is the fact that the system is open that allows us to identify the order of that succession! Perhaps time is so poorly understood because it has remained for centuries a mere counter in mechanical expressions, arbitrated by the hands of beautiful gold and silver watches. Its association with entropy, however, gives it a clear transforming role in nature. Entropy is the ensign of time's irreversibility. Although there are different ways of treating time, whether in cosmology or quantum physics 3, whenever evolution is discussed, entropy is present.
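The asymmetry between relative and absolute probability in the mixing argument can be made tangible numerically. The toy model below is our own illustration (not from the original text; the function names are ours): it estimates how rare a fully separated configuration B is among all random arrangements of sign-labelled particles, which is why, starting from B, the step to a mixed state C is overwhelmingly more likely than the reverse.

```python
import random
from math import comb

def is_separated(signs):
    """True if all + velocities occupy one half of the box (the unmixed state B)."""
    half = len(signs) // 2
    left = signs[:half]
    return all(s > 0 for s in left) or all(s < 0 for s in left)

def estimate_separated_fraction(n_particles=20, trials=200_000, seed=1):
    """Monte Carlo estimate of the fraction of random arrangements
    of n_particles (half +, half -) that are fully separated."""
    rng = random.Random(seed)
    signs = [+1] * (n_particles // 2) + [-1] * (n_particles // 2)
    hits = 0
    for _ in range(trials):
        rng.shuffle(signs)
        hits += is_separated(signs)
    return hits / trials

# Exact probability that a random arrangement is separated:
# the + signs must fill exactly one of the two halves.
exact = 2 / comb(20, 10)
print(f"exact probability of state B: {exact:.2e}")
print(f"Monte Carlo estimate        : {estimate_separated_fraction():.2e}")
```

Even for a mere 20 particles, the separated state occurs in roughly one arrangement in 90,000; for a macroscopic number of molecules the figure becomes astronomically small, which is the quantitative core of Boltzmann's relative-probability claim.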
In short, natural phenomena are irreversible, a fact that reflects the arrow of time, being reflected by the latter. Entropy expresses irreversibility as systems evolve from order to disorder. There is no rationality in talking about negentropy, since entropy is a concept linked to irreversibility. Apart from the anthropic nature of technical interventions 4, which simulate an inversion of entropy (when, in fact, they locally slow down the advance of entropy at the cost of much energy dissipation), what makes sense to consider is the so-called spontaneous state of "zero-mark entropy", a low-entropy state at the starting point (zero mark), like the Big Bang. In this context, the emergence of life is also inserted. Certainly, the evolutionary trajectory of a system from a zero-mark state shall be influenced by neighbouring systems according to their different equilibrium boundary states (assigned by the constant β) and, perhaps, by the entropy variation rates of these systems (as previously mentioned, the interaction of different acceleration states would contribute to induce the emergence of new complexities). Formal aspects of the representation of the entropy rate will be dealt with in the next subsection with the help of the so-called Caloric Field Theory developed in [26].

3 When it comes to quantum entanglement, however, the subject becomes challenging, as it is known that the space-time relationships between two entangled particles are surprisingly different from the relationships between macroscopic objects. This, however, is a topic for another place.

4 For all practical purposes, an isolated system is understood as one that does not receive any anthropic input.

Generalized caloric field theory

According to the Caloric Field Theory presented elsewhere, there is a scalar field equation with an entropy term, equation (26), with γ a constant.
Depending on the form of the field, the entropy term can be slightly modified, so that the field entropy in generalized coordinates q is given by

S = -2\gamma^{2} \int |\xi|^{2} \ln|\xi|\, dq. \quad (27)

Note that this approach concerns the shape of the caloric field (the function that characterizes thermal energy itself) and the mathematical law of its propagation, including the entropic trail left by the diffusion process. In the previous Lagrangian approach, on the contrary, we established relationships between the heat flows through a given region and the entropy productions involved. As we stated, it is mainly the interaction between regions governed by different extremes of equilibrium (slightly different attractors) that determines the appearance of complexities. It is almost certain (but we are still not absolutely sure) that differences in entropy acceleration between those regions also play a role in this outbreak of order.
So, entropy acceleration is defined as the second derivative of the field entropy with respect to the evolutionary variable τ,

\ddot{S} = \frac{d^{2}S}{d\tau^{2}}. \quad (28)

Now, we first apply integration by parts to join both results under a single integration, letting the generalized coordinate take the form q = f(τ). For a particular field of the form ξ = e^{iaq-ϑ} (a and ϑ real numbers), we have |ξ| = e^{-ϑ}, the first (boundary) term of the integration vanishes, and equation (27) reduces to

S = 2\gamma^{2}\vartheta\, e^{-2\vartheta}\, q. \quad (29)

With q = e^{κτ},

S(\tau) = 2\gamma^{2}\vartheta\, e^{-2\vartheta}\, e^{\kappa\tau}, \qquad \ddot{S} = \kappa^{2} S. \quad (30)

Combining the last equation with equation (14), we gain the thermodynamic expression for the entropy acceleration. The physical meaning of the quantities γ, ϑ and κ depends on the context in which the formalism is applied. We will see an example in the next section, consolidating the idea of ur-entropy.
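A quick numerical sanity check of the relation above can be sketched as follows. This is our own verification script, assuming the reconstructed closed form S(τ) = 2γ²ϑe^{-2ϑ}e^{κτ}; the parameter values are arbitrary illustrative choices. A central finite difference of S(τ) should reproduce κ²S to high accuracy.

```python
import math

# Illustrative values for the caloric-field constants (assumptions)
GAMMA, VARTHETA, KAPPA = 1.3, 0.7, 0.5

def entropy(tau):
    """S(tau) = 2*gamma^2*vartheta*exp(-2*vartheta)*exp(kappa*tau),
    the field entropy for xi = exp(i*a*q - vartheta) with q = exp(kappa*tau)."""
    return 2 * GAMMA**2 * VARTHETA * math.exp(-2 * VARTHETA) * math.exp(KAPPA * tau)

def second_derivative(f, x, h=1e-3):
    """Central finite-difference estimate of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

tau = 2.0
lhs = second_derivative(entropy, tau)   # entropy acceleration, d^2 S / d tau^2
rhs = KAPPA**2 * entropy(tau)           # kappa^2 * S
print(f"S''(tau) = {lhs:.6f},  kappa^2 * S = {rhs:.6f}")
```

The two printed values agree to several decimal places, consistent with entropy acceleration being proportional to the entropy itself, with κ² as the proportionality constant.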

IV: HUMAN LIFE WITH THE UR-ENTROPY
Energy to retard entropy growth: a pragmatic approach

The ur-entropy is the eternal and primordial entropy of the universe. The last part of this article seeks to make a final pragmatic condensation of the philosophical discussion that has permeated the whole of this work, so that the reader perceives the reach of the theory explained and how ur-entropy is present in our lives.
In terms of the survival of the human species, according to Rubbia [23], two alternatives for large-scale energy production present themselves: solar energy and new nuclear energy. The second option presupposes nuclear non-proliferation, with no use of U-235; thorium fission and D-T fusion would be the prime candidates. However, this alternative comes with two major problems: 1) a high number of irreversible processes generating nuclear waste, and 2) the inextricable connection between peaceful and military applications of atomic energy [23]. Conversely, the first option has numerous advantages, starting with the high availability of sunlight in tropical, sub-Saharan and desert regions. In fact, mainly after the Fukushima Daiichi man-made catastrophe, it seems that we are moving faster in this last direction, resisting, as always, the pressures of oil companies and hydroelectric plants. Like it or not, it is a global trend with no turning back. We urgently need solar energy, not only for environmental preservation, but also because we will soon have no other option. Given the current rates of devastation of natural resources and the drying of the planet, it is not surprising that so many countries are now investing in materials research to improve the performance of photovoltaic cells, as well as in the engineering of panels and components. Not only that: we have ample possibilities ahead for the realization of large-scale solar plants for recycling solid waste [26].
The large cities and the entropic idealistic intellectualism

Large cities are examples of dissipative places where ur-entropy accelerates considerably due to anthropogenic activities and interactions with their neighbourhoods. Prigogine pointed out this fact in his discussion of non-equilibrium structures: "The simplest example of dissipative structure that we can evoke by analogy is the city. A city is different from the countryside around it; the roots of this individualization are in the relations it maintains with the surrounding countryside: if those were suppressed, the city would disappear." [20] Worse still, as if the very nature of such daily activities were not enough, a true idealistic entropic intellectualism emerges, very common among urban planners. Jacobs was emphatic about this in her seminal criticism of modernist arrogance in current urban planning [11].
The major problem of planners is that they derive their ideas from futuristic daydreams and not from a keen societal perception through which the real desires and needs of a population would be captured. Very few would be able to conduct a structural anthropological analysis reflecting the habits and interpersonal relationships typical of a given urban nucleus, perhaps because they have been consumed by the empty culture of postmodernism, or by the excess of technological appeals. The most common projects accelerate urban entropy, creating areas of abandonment with no use that deteriorate into oblivion. Full of zoning and prohibited spaces, these projects impose disruptive concepts on the gregarious nature of the human species, resulting in isolation, monotony and limited mobility.
Heat islands, loss and solar technology

CALIBRE Vol. 5, Suplemento, Dezembro 2020.

If, on one hand, cities accumulate heat islands, on the other hand they are great islands of marked acceleration of ur-entropy. When describing the thermal energy dissipated through the asphalt and concrete of heat islands in large cities, which could broadly be converted into useful energy by solar panels and concentrators for countless devices and supply plants, the variables and parameters in field equation (26) assume quite specific macroscopic roles: γ, the opacity of the medium, refers to the so-called "luminothermic capacity"; ϑ is the refractive index of the medium; and κ is the average Sky View Factor (SVF) of the city [27].
Measuring the field in urban environments, we can evaluate the entropic trail it leaves, in order to establish the amount of irreversible processes everywhere in the city. As observed in reference [27], the refractive index and the opacity of the medium can vary under anthropogenic influence, interfering with the local entropy rate.
In order to make the most of the sunlight freely available in the city environment and its vicinity, solar panels could take advantage of much of the thermal energy lost, especially on building roof structures, parking lots, airport areas and rural areas. Figure 5 shows a schematic plant of a solar thermal system with storage capacity for use in periods without sunlight, adapted from Rubbia's work [23].

New perspectives in solar thermal systems
It is not our intention to discuss technical aspects of the manufacture of photovoltaic cells, nor the types of existing cells; the reader can find excellent details in references [8] and [15]. We are interested in emphasizing the limits imposed by the Second Law on this technological field. However much we dope silicon, there will always be an efficiency limit and a related dissipation of unusable thermal energy.
The photovoltaic effect is the phenomenon resulting from the incidence of light on the surface of a semiconductor material, creating charge-carrying electron-hole pairs and producing electric current; solar cells convert sunlight into electrical current using this effect. The use of solar energy is certainly increasing worldwide, even in countries with little annual availability of sunlight compared to Brazil. However, there is still a considerable technological route to travel, since the efficiency currently achieved by available photovoltaic cells requires large panels for a satisfactory supply of electricity. This route includes the search for new materials, derived from existing ones, that enable better performance, in addition to the improvement of storage methods.
About 89% of photovoltaic cells are made with silicon. Silicon is not a good conductor of electricity; doped with phosphorus, it gains a free electron in the outer layer (N-type); doped with boron 5, it leaves a hole to be filled by an electron (P-type). The P-N junction generates an electric field; photon incidence creates an electron flow, producing a potential difference and a current. The theoretical thermodynamic limit for the efficiency of converting light energy into electricity with a single P-N junction cell is about 32.9%. Photons with energies below the band-gap (forbidden band energy) are not absorbed, whereas photons with energies above the band-gap induce quantum tunnelling (see Figure 4) and have part of their energy dissipated, mainly in the form of heat.
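The two loss channels just described, sub-gap transmission and above-gap dissipation, can be quantified with a simple "ultimate efficiency" calculation for a blackbody sun, in the spirit of Shockley and Queisser. The sketch below is our own illustrative computation, not taken from the original text; it assumes every photon above the gap delivers exactly the gap energy and every photon below it is lost, yielding roughly 44% for silicon's 1.12 eV gap, a looser upper bound than the ~32.9% detailed-balance limit quoted above because it ignores the cell's own radiative emission.

```python
import math

K_B = 8.617333e-5   # Boltzmann constant, eV/K
T_SUN = 5778.0      # effective blackbody temperature of the Sun, K

def photon_flux_density(E):
    """Blackbody photon number density per unit energy (unnormalized),
    n(E) ~ E^2 / (exp(E/kT) - 1)."""
    return E**2 / math.expm1(E / (K_B * T_SUN))

def ultimate_efficiency(e_gap, e_max=10.0, n=20000):
    """Fraction of incident blackbody power recoverable if every photon
    with E >= e_gap delivers exactly e_gap (thermalization/dissipation loss)
    and every photon with E < e_gap is lost (transmission loss)."""
    dE = e_max / n
    absorbed = sum(photon_flux_density(e_gap + i * dE) * dE
                   for i in range(n) if e_gap + i * dE <= e_max)
    total_power = sum((i + 0.5) * dE * photon_flux_density((i + 0.5) * dE) * dE
                      for i in range(n))
    return e_gap * absorbed / total_power

eta = ultimate_efficiency(1.12)   # crystalline silicon band-gap, eV
print(f"ultimate efficiency for E_g = 1.12 eV: {eta:.1%}")
```

Raising the gap cuts thermalization losses but discards more sub-gap photons; lowering it does the opposite, which is why a single-junction optimum exists at all.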
A new bet in solar thermal systems engineering is the black silicon (b-Si) photovoltaic cell. As reported, this cell exceeds the theoretical limit of 100% external quantum efficiency by 30%. The external quantum efficiency of a device is 100% when each incoming photon generates one electron for the external circuit; thus, a quantum efficiency of 130% means that an incoming photon generates approximately 1.3 electrons! Another very promising technology in progress is the sensitization of photovoltaic cells by quantum dots. Quantum dots (QDs) are semiconductors on the nanometre scale; they are nanocrystals whose nanostructure (shape and size) determines potential wells, confining electrons in discrete energy states (Figure 4). More precisely, a quantum dot contains a small number of conduction-band electrons and valence-band holes (quasi-particles, or excitons). These nanocrystalline structures constitute the so-called "quantum dot solar cells", or quantum dot-sensitized solar cells, of great interest in the field of solar energy; for now they reach an efficiency of 16.6% with negligible hysteresis, according to Hao et al. in a recent study on the application of the mixed caesium and formamidinium lead triiodide perovskite system (Cs1-xFAxPbI3) in the form of QDs [8]. It is important to remember that QDs are microparticles contained in liquid solutions, which allows their application in the form of paints, a very useful feature for manufacturing solar panels by printing systems on flexible substrates at low cost.
QDs are also called "artificial atoms", although the scales are quite different (QDs ≈ 100 nm against atoms ≈ 0.1 nm); in atoms, the attractive forces are exerted by the nucleus, while in QDs they are exerted by the background charges of the nanocrystalline structure. A relevant feature of quantum dot technology is that the band-gaps are adjustable across a wide range of energy levels by changing the size of the quantum dot, in contrast to conventional materials, where the band-gaps are fixed. Lastly, in addition to this feature, some extra optimizations can be implemented; there is research on the incorporation of silicon quantum dots (Si-QDs) onto b-Si as a hybrid nanostructure, resulting in reflectance reduction over a wide spectral range (300-1000 nm) [16].
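The size-tunability of the band-gap can be estimated with the standard Brus effective-mass model. The sketch below is our own illustration (not from the original text); the material parameters are textbook values for CdSe and should be treated as assumptions. The gap grows as the dot shrinks, through the confinement term, and approaches the bulk value for large radii.

```python
# Brus effective-mass estimate of a quantum dot's size-dependent band-gap:
# E(r) = E_bulk + quantum confinement term - electron-hole Coulomb attraction.

HBAR2_PI2_OVER_2M0 = 0.376   # hbar^2 * pi^2 / (2*m0), in eV*nm^2
COULOMB = 1.44               # e^2 / (4*pi*eps0), in eV*nm

E_BULK = 1.74                # CdSe bulk band-gap, eV (assumed material)
M_E, M_H = 0.13, 0.45        # electron/hole effective masses, in units of m0
EPS_R = 10.6                 # relative dielectric constant of CdSe

def brus_gap(radius_nm):
    """Band-gap (eV) of a spherical quantum dot of the given radius (nm)."""
    confinement = HBAR2_PI2_OVER_2M0 / radius_nm**2 * (1 / M_E + 1 / M_H)
    coulomb = 1.8 * COULOMB / (EPS_R * radius_nm)
    return E_BULK + confinement - coulomb

for r in (1.5, 2.0, 3.0, 5.0):
    print(f"radius {r:.1f} nm -> gap ~ {brus_gap(r):.2f} eV")
```

Halving the dot radius roughly quadruples the confinement term, which is the design lever that fixed-gap bulk materials lack.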
Given this overview, it is easy to see that we deal with two main lines of entropic production in a fully functioning photovoltaic QD system (hybrid or not): the natural thermal losses in the photon absorption process, as in any transmission system, and the dust that settles on the panels, notably urban dust made up of a mix of particles of different materials. The accumulation of dust decreases the efficiency of the panel, so that correct maintenance, together with the robustness of modern solar kits, will guarantee a life cycle of around 25 years, at the end of which the system as a whole definitively degrades.

Ur

Henrique de Gand
In view of the knowledge we now have about the Universe, the idea that, if we wait long enough, thousands of simians haphazardly touching computer keyboards will eventually write Homer's Iliad does not go beyond an imaginationist idealistic delirium. Probably the mechanistic view at the time of Boltzmann, still dominant today, forced thermodynamic reasoning towards a solution that foresaw the possibility of restoring order. But today it would not be justified to maintain idealistic constructions if the concept of entropy were widely discussed from basic scientific education onwards. As observed by Deutscher, "In elementary and even high school education, a great deal of time is spent teaching students the basic notions of force, energy and power, but the word 'entropy' is often not even pronounced. It is only those who specialize in the sciences who will become familiar with it. This is a dramatic shortcoming of our education system, as without some understanding of what entropy means it is essentially impossible to comprehend what is going on in the environment and to make the right decisions for its defense." [7] The problem with humanity is that we are especially talented at developing ways to accelerate entropy under the pretence of progress (one of the worst legacies we have left in recent times was the extinction of the old Oxiana Palus, better known as the Aral Sea). Why, it is the very sense of entropy advancing over time that should guide us wisely to mediate between our needs and what is really possible. Every order has an entropic cost. In the case of the civilizing process, this cost is very high. The emergence of humanity has unsettled the environment at a terrible residual rate per entropic activity. Understanding ur-entropy, we believe, will make it possible to accept the fact that certain damages we cause to ourselves and to the world are definitive.
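The improbability of the typing-simians image invoked at the start of this section can be made concrete with a back-of-the-envelope calculation. The numbers below are our own illustrative assumptions (a 30-key keyboard, a text of about one million characters); the conclusion is insensitive to them.

```python
import math

ALPHABET = 30             # keys a simian can hit (illustrative assumption)
TEXT_LENGTH = 1_000_000   # rough character count of the Iliad (assumption)

# Probability that one random run of TEXT_LENGTH keystrokes reproduces
# the text exactly: (1/ALPHABET)^TEXT_LENGTH, expressed as a power of ten.
log10_p = -TEXT_LENGTH * math.log10(ALPHABET)
print(f"P(exact text in one attempt) = 10^{log10_p:.0f}")

# Even granting 10^80 typists (about the number of atoms in the observable
# universe), each making 10^9 attempts per second for 10^18 seconds (far
# longer than the age of the universe), the expected number of successes is:
log10_expected = 80 + 9 + 18 + log10_p
print(f"expected successes ~ 10^{log10_expected:.0f}")
```

The exponent is on the order of minus a million; no physically conceivable budget of attempts comes remotely close to compensating it, which is the quantitative sense in which the image is a delirium rather than a mere long wait.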
It remains to be seen whether there will be room for the flowering of the ideas discussed here, and whether we are willing, as a species, to rationally accept the limits that nature imposes on us.