This page is about the thermodynamics of organic systems.
The mechanical view assumes closed and static worlds. Therefore – dichotomously
– organicism must start out with the opposite assumption.
Which leads us to an “unboxed” model of thermodynamics.
We will have a hierarchically-organised system that has a middle ground exhibiting the signature property of scale symmetry. Or in thermodynamic terms, a system that is energetically open and smoothly expanding. In organicism, in contradictory fashion, it is inflation or growth that comes for free. It is a fundamental attribute of a "world". By contrast with the usual mechanical view, it is stillness or conservation that would cost extra.
Of course, it is not usual to talk about thermodynamics as being part of “logic”. But what is thermodynamics? It is about the behaviour of large scale systems made up of many small scale or atomistic parts. If you get enough bits of stuff in interaction, then eventually they will form a system that settles down towards some average or ambient state. The whole they create will develop a typical internal statistical picture.
And logic is about predictable regularity - the way things have to happen. The laws of thermodynamics allow us to say that given starting conditions A, you can be absolutely certain you will arrive sooner or later at final conditions B. So even if thermodynamic outcomes appear to be based on randomness and averaging - a broader statistical view - they are still logical and we ought to include them as part of any general causal model of reality.
Thermodynamics is only not part of classical models of logic because classical models are mechanical and so desire to dispense with the systems-level view. The mechanical approach says systems behaviour is not intrinsic, but subsequent. It emerges as a result rather than being a necessary part of the causal process. But again, here we are taking a holistic stance in which both systems and their parts are fundamental, and so thermodynamics does become an intrinsic part of causal modelling – its middle ground as I have said.
And the theme of the discussion here is that there are two quite different thermodynamic pictures that will seem fundamental depending on whether you employ mechanical or organic logic.
Briefly - deep breath - the mechanical view of any thermodynamic system is essentially boxed. That is, static and closed. This produces a characteristic statistical signature. The disorder, entropy or randomness of the system will seem Gaussian. It will have a single (smallest) scale. But the organic view is un-boxed. It is the view of a system that is open and dynamic or freely expanding. And now its statistics, its description of the “disordered” middle ground, will be fractal or scale-free.
This organic approach to thermodynamics obviously connects to the extended models of entropy proposed by Renyi, Tsallis and others. Where classical Boltzmann entropy has a single (smallest) scale of interactions, these new entropies have fractal scale – correlations over all possible scales. But let’s not rush into the technicalities. For the moment I just want to paint a broad-brush philosophical picture of the issues.
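Since we will come back to these technicalities later, here is a minimal Python sketch of the family resemblance. The shannon, renyi and tsallis functions below are toy implementations of the textbook formulas (the function names and the example distributions are my own choices, not any library's); the point is simply that all three rank disorder the same way, while the generalised entropies recover the classical form as their parameter approaches 1.

```python
import math

def shannon(p):
    """Classical Boltzmann-Shannon entropy: -sum p_i * log(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    """Renyi entropy of order alpha (alpha != 1)."""
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis(p, q):
    """Tsallis entropy of index q (q != 1)."""
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximum disorder over four states
skewed = [0.7, 0.1, 0.1, 0.1]        # more ordered, less spread out

# All three measures rank the uniform distribution as more disordered,
# but they weight rare and common states differently.
print(shannon(uniform), shannon(skewed))
print(renyi(uniform, 2.0), renyi(skewed, 2.0))
print(tsallis(uniform, 2.0), tsallis(skewed, 2.0))
```

Tuning alpha or q towards 1 brings both generalised formulas back to the Shannon value – one way of seeing the classical entropy as a single point in a wider family.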
Boltzmann's gas in a box
We can begin by thinking about the standard mechanical image of a
thermodynamic system – a bunch of gas particles trapped in a
box. This is the classical Maxwell-Boltzmann story of an ideal gas or
the canonical ensemble.
The raw ingredients are some freely moving particles. There are no complicated interactions between the particles as you might have with water molecules, like dragging electrostatic bonds. The particles are Newtonian billiard balls travelling inertially from one elastic collision to the next. It is the simplest kind of system possible – or the one that maximises the causal property of locality (the L in RAMML).
The particles are then trapped inside a rigid container. There is a box that sets fixed or static global limits. There is no need to call on some emergent global effect like surface tension to organise the limits of the system. Following Boltzmann, we can simply take this container to exist.
And note something critical here. There is space (and time) inside the box. So the interior of the box is a void – an a-causal backdrop that neither contributes to nor hampers the action taking place within it. And the walls likewise do nothing – except for one thing. They reflect any straying particles back into the heart of the system. The walls of the box serve to keep its contents all together in the one space (and all at the same time).
This is of course pure atomism. Except instead of the void being infinite in its extent, someone has taken the trouble to enclose a portion of it. And if you are thinking the existence of the box now sounds rather artificial, well it is.
Anyway, let’s talk about order and disorder in such a world. The idea of order and disorder – tidiness and messiness - is central to thermodynamics. On the one hand we have organisation, information, crispness. On the other (dichotomously!) we have chaos, noise, chance, randomness.
The first law of thermodynamics is the conservation principle. In any closed system, the contents are going to be preserved. Change is in a sense illusory because there is only ever a process of rearrangement or mixing.
This of course assumes that the “stuff” of the system, its substance, has existence (an inertial quality) rather than merely persistence (a contextual quality). Unlike say a pattern of ripples or whorls in water, a bunch of gas particles can be granted this kind of substantial presence. They won’t be slipping into and out of existence of their own accord. And if the ingredients inside a box are stable, then obviously the total amount is going to be conserved. The gas particles can bang about as much as they want, but like angry bees buzzing in a jar, they can never escape.
The second law of thermodynamics then states that order must always slide down the slippery slope towards disorder. Given freely behaving particles, ones that do bang about blindly, then the insides of a closed box must become more and more messy until eventually it reaches some equilibrium limit of messiness. With time, the contents of the box will become scrambled so that any further changes result in no apparent change. And once scrambled, the contents will have no way of unscrambling themselves.
The second law is often likened to a teenager’s bedroom. The room may start spic and span. But lock a teenager in it for a weekend and it will follow an inexorable path towards maximum entropy or disorder. All order will be degraded by random acts until it reaches a state of disorder where the further shuffling around of books, clothes and CDs can do nothing to deepen the prevailing sense of chaos. And because books, clothes and CDs are helpless to put themselves back in drawers or cupboards, there is no way back up the slope to the initial state of order.
The first and second laws of thermodynamics are of course “paradoxically” dichotomous in many ways. So while the first law is time-symmetric – the world remains essentially the same, unchanged in its overall contents, as it passes through time – the second law is time-asymmetric. It has an arrow of time in that the world looks different at the end than it does at the start. Once order (or its synonym, energy) is spent, it cannot be recovered.
Now back to the gas particles trapped in a box. A point often missed in discussions of thermodynamics is that there are two possible extremes of order in this situation, two ways of being perfectly neat and tidy.
First we can imagine all the gas particles starting clumped together in one corner of the box. This unlikely situation might be created by an experimenter using a syringe to inject a little squirt of gas. Anyway we can see how the gas would soon spread out to fill the box. The initial state of order would be disordered, the position of the gas particles all messed up by their random wanderings.
According to Newtonian mechanics, it is possible that through the chaotic activity of the particles, they might one day all happen to bounce back to the same corner and regroup. The original state of high order might be recreated. But we also know that this would be a vanishingly unlikely occurrence. It would be like water suddenly deciding to flow up hill.
Each possible configuration of the particles in the box is termed a microstate. And there are countless gazillions of microstates that have much the same look – that old messy spread-outness. Only a few would resemble the initial clumping. So if each microstate occurs at random, then the chances of the clumped look reappearing become a very long shot, while a messy looking microstate is pretty much guaranteed.
It is simple statistics. Microstates are effectively pulled out of the hat. There are gazillions of messy microstates holding a ticket and only a very few tidy ones available for selection.
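This lottery arithmetic is easy to make concrete. In the Python sketch below (the particle count is an arbitrary illustrative choice of mine), we track only which half of the box each particle occupies, so there are 2^N equally likely microstates; the binomial coefficients count how many of them look clumped versus messily even.

```python
from math import comb

N = 100  # number of gas particles - an arbitrary illustrative choice

# Track only which half of the box each particle occupies: that gives
# 2**N equally likely microstates, comb(N, k) of them with k on the left.
total = 2 ** N
clumped = 2                                          # all left, or all right
near_even = sum(comb(N, k) for k in range(45, 56))   # a roughly even spread

print(clumped / total)     # astronomically unlikely
print(near_even / total)   # roughly three-quarters of all microstates
```

Even with a mere hundred particles, the clumped microstates are outnumbered by a factor of around 10^29 – and a real box holds more like 10^23 particles.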
Now the image of all the particles starting out clumped in one corner is the conventional way of representing an initial state of order in such a system. But there is - surprise, surprise - a second asymmetric and dichotomous alternative. Instead of being all clumped in one corner, the gas particles might instead find themselves starting out exactly evenly spaced, as if painstakingly arranged by some fiendish experimenter on a lattice-like grid.
So we have thermodynamic order dichotomised into spread~clumped. And released from either of these two extremes of orderliness, gas particles would immediately start to wander off, becoming messy and disordered.
But while we have two possible states of order, there is only one resulting state of disorder. There is only one stable outcome, one equilibrium balance, that can form the emergent middle ground of a hierarchy. What is its statistical profile?
Well we can see that the messiness of an ideal gas at equilibrium looks more spread out than clumped. As a state, the disorder of the particles in the box lies quite close to a perfectly regular or lattice-like arrangement. There is a randomness, but it is rather compressed. Indeed it seems squashed to a single scale – a smallest possible scale.
Randomness with a single scale is called a “normal”
or Gaussian distribution. It yields the famous bell curve and also the
s-curve of its cumulative distribution function - two ways of graphing
essentially the same thing.
The bell curve has a single mean, a central average. The probability of an event or occurrence then tails away rapidly towards the extremes. There is an asymptotic approach to a limit (and where have we heard that one before?).
If we measure a group of people for height, IQ or many other characteristics, we will get (roughly) this distribution. A lot of people in the middle between five foot and six foot. Then a very few people at four foot and seven foot. Maybe a few in a billion at three foot and eight foot. And even here, syndromes such as dwarfism or pituitary gland tumours usually come into the explanation.
We can get an intuitive sense of how Gaussian distributions are created. It is as if the system in question - like the genome that builds the typical human - is aiming at a target and then there is a bit of error in the results. Point a gun at a bulls-eye and there will be a grouping with a degree of scattering. Most shots will be near the target and then a few may miss by quite a bit. The randomness is due to a process which has a target outcome but which is constrained rather than controlled. The results will always be close but never exact.
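This target-plus-error picture can be simulated directly. In the hypothetical Python sketch below, each shot's error is the sum of many small independent disturbances, and the familiar bell curve assembles itself – the central limit theorem at work.

```python
import random
import statistics

random.seed(1)  # fixed seed so the sketch is repeatable

def shot():
    """One shot at a bulls-eye at position 0: the aim is constant, but
    fifty small independent disturbances each add a little error."""
    return sum(random.uniform(-0.1, 0.1) for _ in range(50))

shots = [shot() for _ in range(10_000)]

mean = statistics.mean(shots)
sd = statistics.stdev(shots)
within_2sd = sum(abs(s - mean) < 2 * sd for s in shots) / len(shots)

print(mean)        # very close to the target at 0
print(within_2sd)  # about 0.95, as a Gaussian predicts
```

No single disturbance aims the shots at the bulls-eye; the constraint is in the summing, which is why so many unrelated processes end up looking Gaussian.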
Another version of this kind of directed randomness is the Poisson distribution. This is like the Gaussian or normal distribution, but skewed or compressed to one side. Typically we see this pattern emerge with a time-based event like radioactive decay.
Now radioactive decay is in fact very mysterious, being a quantum process. But as it will be useful later, let’s describe how it is an example of a Poisson distribution. Take a lump of plutonium-241, an isotope with a half-life of 14.4 years. This half-life figure means that in 14.4 years, half of the atoms will have decayed. After 28.8 years, a further half of the remaining atoms – or a quarter of the original – will also have decayed. And so on.
We can see that the tail of this Poisson distribution thus stretches out towards infinity. Or at least some very large numbers. The halving of the remaining half can keep going for quite a time, which is why trace residues of radioactive materials can linger so long.
It works the other way too, which is why the first half of the Poisson curve is squashed up. In half the half-life, or 7.2 years, nearly 30 per cent of the atoms will already have decayed. And in half that time again, or 3.6 years, about 16 per cent. So while the halving gets stretched out in one direction, in the other it gets compressed. The Poisson curve is crisply bounded to one side (the moment, for example, when we first start to count decay events), yet essentially unbounded to the other.
Where this all gets quantum-curious is that individually every plutonium atom acts as if it is independent and memoryless. The chance of some atom decaying at any particular moment is unaffected by how long it has been sitting about. It is not like a time-bomb with a fizzing fuse. The fact that it did not go off a moment ago, or the moment before that, has no impact on its likelihood of going off right now. It might or it might not in quite random fashion. Yet collectively, globally, we see a statistical pattern emerge. Half the atoms in the lump are likely to go off after 14.4 years.
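We can watch collective regularity emerge from individual indifference in a simulation. In the Python sketch below (the population size and time step are arbitrary choices of mine), every surviving atom faces the same tiny per-step decay probability, with no memory of how long it has waited; yet after one half-life, close to half the population is gone.

```python
import random

random.seed(2)

HALF_LIFE = 14.4   # years, as for plutonium-241
STEP = 0.1         # years per simulation step - an arbitrary resolution
# Per-step decay chance chosen so that half survive one half-life:
p_decay = 1 - 0.5 ** (STEP / HALF_LIFE)

atoms = 10_000
t = 0.0
while t < HALF_LIFE:
    # Memoryless: every atom faces the same odds at every step,
    # no matter how long it has been sitting about.
    atoms = sum(1 for _ in range(atoms) if random.random() > p_decay)
    t += STEP

print(atoms)  # close to 5,000 - half the original population
```

No atom is told when to go; the half-life is purely a global, statistical fact about the ensemble.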
One way of making sense of this paradox is that each atom of a certain size has a break-up threshold – a point at which nuclear forces are no longer strong enough to bind it together. The atom trembles the whole time due to quantum uncertainty, and at some point a fluctuation is large enough to carry it over the threshold. But still the fluctuation itself seems causeless.
So the skew is about leaping a threshold or making a phase transition. The decay is in a sense held back because smaller fluctuations are ignored and then critical size fluctuations are dramatised as "an event". The size of the particular threshold for the system is consequently what sets the mean - the half-life time.
A Poisson distribution is a special case because it overlays a
yes/no binary decision about the occurrence of an event over a deeper
level of randomness - a spread of fluctuations which may or may not
bust a particular threshold. And does this deeper level of statistics
fit some profile? Well if it is not Gaussian, it must be scale-free.
But anyway, for the moment, let’s continue building up a
sense of what a Gaussian
or normal random distribution is all about.
Another classic example is coin flipping. With a fair toss, the chances of heads or tails is 50/50 with every go. The process is memoryless – each toss is independent of any historical trends – and so the odds always remain 50/50. Yet it is also equally certain that enough coin tosses will produce excursions from the mean. You will get a surprising run of heads or tails. Indeed, to be properly random, such runs must occur with predictable regularity. We could indeed say that something like a run of seven heads has a half-life of around 250 throws. That is, in roughly half the trials of that length, it will happen at least once.
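That back-of-envelope claim can be checked by brute force. The Python sketch below (the trial count is an arbitrary choice of mine) tosses a fair coin 250 times and asks whether a run of seven heads appeared anywhere; repeated over many trials, the run turns up in somewhere around half to two-thirds of them – the right ballpark for a "half-life" of 250 throws.

```python
import random

random.seed(3)

def has_run(n_tosses, run_len=7):
    """Toss a fair coin n_tosses times; report whether a run of
    run_len consecutive heads appears anywhere in the sequence."""
    streak = 0
    for _ in range(n_tosses):
        streak = streak + 1 if random.random() < 0.5 else 0
        if streak == run_len:
            return True
    return False

trials = 2000
hits = sum(has_run(250) for _ in range(trials))
print(hits / trials)  # a majority of 250-toss trials contain the run
```

A sequence with no such runs would actually be suspicious – the predictable regularity of surprising streaks is part of what being random means.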
Coin-tosses, roulette wheels, radioactive decay and other games of chance have no memory. So let’s consider quickly the difference made by a random process with a memory, such as Brownian motion or a random walk.
Imagine tracking a jittering particle that with every step can jump in one of four directions. It can go left, right, forwards or backwards. Well a quarter of the time it will go back the way it just came, erasing its last step. And at other times it may repeat its last jump, even making runs of jumps in the same direction like a sequence of heads.
So the particle is behaving randomly. Yet now this system (a random walk – with the Lévy flight as its wilder, heavy-tailed cousin) has a memory because each new step takes off from wherever the particle last landed. It is like a drunk staggering around an infinitely large field. The actions are random but carry a history of the past with them.
When we look at the statistics of such a random walk, it is fractal. Scale-free. Patchy instead of smooth. The particle can spend some time wandering one small corner of the space before lurching away. It is all about fits and starts as the particle appears to be battling two opposed tendencies, one to cluster, the other to flee. And because this dichotomous impulse is freely expressed over spatiotemporal scale, the resulting wanderings become fractal or scale-free.
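The walk's memory leaves a measurable fingerprint. In the Python sketch below (step counts and trial numbers are arbitrary choices of mine), a particle jumps one lattice unit at a time; because every jump departs from wherever the last one landed, the typical distance wandered grows as the square root of the number of steps – the same scaling law at whatever scale of observation we choose.

```python
import math
import random

random.seed(4)

def walk(steps):
    """A 2-D lattice random walk: each step moves one unit left, right,
    forward or back from wherever the particle last landed."""
    x = y = 0
    for _ in range(steps):
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = x + dx, y + dy
    return x, y

def rms_distance(steps, trials=500):
    """Root-mean-square distance from the origin after a given number
    of steps, averaged over many independent walks."""
    total = sum(x * x + y * y for x, y in (walk(steps) for _ in range(trials)))
    return math.sqrt(total / trials)

# Quadrupling the steps only doubles the typical distance: sqrt scaling.
print(rms_distance(100))   # about 10
print(rms_distance(400))   # about 20
print(rms_distance(1600))  # about 40
```

A memoryless coin toss has no such spatial signature; it is the carried-forward history that produces the scale-repeating wandering.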
disorder fills the middle ground
Now let’s return to the statistical picture created by gas
particles allowed to mess about in a sealed box. As we said, there are
two possible initial states of order but only one disorderly outcome
for an ideal gas trapped in a container. This disorder has a single scale.
And it is a scale centred around the system’s lower or local
bound of orderliness. The disorder looks a long way from clumped order
but ends up looking very close to smooth order.
Why is this so? The particles released from either kind of order – clumped or grid-like – will start to bang about like angry bees. They will go off in any which direction and randomise their positions.
They will also randomise their velocities. It would not matter if we released all the particles with a uniform speed or with a wide variety of speeds. Through their elastic collisions they would bump about in ways that had the effect of creating a homogeneous distribution of speeds.
If the initial state was a mix of very hot and very cold
particles, then each would rub off against the other in a way that
moved both towards some joint average temperature.
Equally, if all the particles were set
off with exactly the
same temperature, the same kinetic energy, then the randomness
of their collisions would give some particles a little bit of an extra kick while leaving others slightly slowed down. From one single value, there would grow a Gaussian spread of values. So
from either pole of greater order, the population of particles
would slide its way to the same equilibrium balance of disorder.
And the box walls would play a crucial part in this. Remember
that the gas particles are flying about freely. Without a box
to contain them, they would simply wander away from each
other. But the box walls reflect every straying particle immediately
back into the
fray. The particles are forced to keep mixing and keep randomising.
In fact we can see that it is the global boundary – the walls of the box – that imposes the actual scale of the disorder. The number of particles in the box is constant. None escapes. And the void inside the box is also held at a constant size. So these two factors combine to set the scale of the lower boundary of order.
Lower bound order is defined as an exactly grid-like distribution of particles. For some particular boxed system, this grid spacing is then simply the quantity of empty space divided by the number of available particles. It spells out how the box should be filled for the lower boundary to become completely smooth.
But of course the box walls can only constrain the wanderings of the particles towards this desired smoothness. The particles have enough remaining freedom, enough unconstrained variety, to express a randomness around the lower bound mean value. The particles can be tamed almost to a grid-like smoothness, but not quite. There must remain a smallest scale Gaussian jitter.
So disorder fills the middle ground of this system, the space between its two extremes of order - the global box and the local grid. But this disorder, being rigidly boxed, does get compressed to a single scale - and the smallest scale. Again what seems a pretty simple situation - a bunch of free particles trapped inside an empty box - harbours quite a metaphysical complexity. It certainly no longer seems like the ontologically most simple possible situation. Surely we can do better?
now lift the lid of the box
Thermodynamics offers a statistical description of a system. It is how
things pan out over the long run. And the organic argument is that a
system is always a hierarchy. There will be the triadic structure of an
orderly upper limit, an orderly lower limit, and then a middle ground
of disorderliness where things freely mix, completely expressing their local freedoms within the prevailing global context of constraints. In short, between the levels of the container and the particle we find thermodynamics.
Or at least this is the mechanical model of an organic hierarchy. The classical Maxwell-Boltzmann system of an ideal gas trapped in a closed box illustrates the basic logic of our holistic story. Yet being mechanical, it has to assume a few key things.
It assumes for example that the particles can fly free while they are travelling through a void. So axiomatically, any meaningful intermediate levels of scale are banished from the start. The system collapses towards its lower boundary of order – the Gaussian jitter of particles about a grid – because all larger scales of existence in the box have no memory. A vacuum is seen as an a-causal nothingness, an empty realm which has spatiotemporal scale yet is not capable of being marked by the events taking place within it.
So in Boltzmann’s box, causality is not just dichotomised – that is, separated and then allowed to mix over all scales. It is broken apart or dualised. You have the local freedoms of the particles making one scale of causal action. Then a long stretch of nothingness and quite suddenly at a certain distance, or spatiotemporal scale, the particles run into a solid reflective barrier. The box walls eventually overwhelm the free flight of the particles, forcing them to homogenise towards a smallest grain of disorder.
Now how are we going to change this mental image around to create something more like a properly organic model of an organic hierarchy? Well we can start by simply taking away the walls of the box.
What happens if we begin with an orderly clump of particles released into an infinite void, a place without walls to confine them? Or equivalently, if they are released into a box that is expanding at precisely the same rate as the particles can spread (much as the spacetime of our universe is said to expand at the rate of its material contents)?
Well now we will get a different statistical picture of their distribution. Instead of being constrained to smoothness, they will move out and start to become fractally patchy. Like the drunkard’s walk, we will switch from single scale Gaussian statistics to fractal, scale-free, long-tail and power-law statistics. Or what some have called Mandelbrotian statistics.
The signature of this kind of statistics is that it enjoys scale symmetry. Every scale is present yet none is privileged. You have small scale randomness, large scale randomness, and all the scales in-between. It is symmetrical in that moving up or down the scale of observation leaves the world looking the same. A fractal world is flat in a direction – an axis or dimension - that we rarely notice even can exist.
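A toy comparison makes the contrast vivid. In the Python sketch below (the sample sizes and the Pareto exponent are arbitrary choices of mine), a Gaussian sample stands for boxed, single-scale randomness while a Pareto sample stands in for scale-free, Mandelbrotian randomness; asking what share of the total is carried by the biggest one per cent of values tells the two worlds apart at a glance.

```python
import random

random.seed(5)

n = 100_000
gaussian = [abs(random.gauss(0.0, 1.0)) for _ in range(n)]
# Pareto (power-law) values as a stand-in for scale-free randomness.
power_law = [random.paretovariate(1.5) for _ in range(n)]

def top_share(xs, frac=0.01):
    """Share of the grand total carried by the largest frac of values."""
    xs = sorted(xs)
    k = int(len(xs) * (1 - frac))
    return sum(xs[k:]) / sum(xs)

# Single-scale randomness: the biggest 1% carry only a sliver.
print(top_share(gaussian))
# Scale-free randomness: a handful of huge excursions dominate.
print(top_share(power_law))
```

In the Gaussian world the extremes are a rounding error; in the power-law world they are the story – which is what it means for no scale to be privileged.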
And recognising that the middle of our triadic hierarchy has scale symmetry neatly ties up our story of organic logic. We said an organic system is formed by the dichotomous separation towards asymmetric limits of scale. Dichotomisation literally opens up a space. But then as this gap yawns, it is also getting back-filled by a mixing – a disordering – of the asymmetric order being created. And this act of disordering results in an internal axis of symmetry.
So we end up with a dichotomy of asymmetry and symmetry. Remember that vagueness, the starting point for our causal story, is itself best modelled as an unbroken symmetry. Then vagueness gets broken - dichotomised towards the local and the global - to create an asymmetric realm of scale. Completing this tale of organic development we finally have a disordering of what has been divided. And as this takes place across all possible scales, we again return to the "unmarked flatness" of a symmetry. Except instead of being a vague symmetry, now it is a crisply developed symmetry.
So as it should, the causal picture all ties back together. But before returning to organic hierarchies, we need to do a little more mathematics. To have an exact description of the disordered middle grounds of organic hierarchies, we ought to consider various examples of fractal systems.