“I always think that the best way to know God is to love many things.”[1]
(Vincent van Gogh)
“For you are dust, and you will return to dust.”
(Genesis 3:19)
All life is a wonderful counterpoint to death. Only an immature, disorderly mind sees life and death as being in opposition; life and death journey together and give full expression and meaning to each other. Both are useful.
“Death is as sure for that which is born, as birth is for that which is dead. Therefore grieve not for what is inevitable.”[3]
(Bhagavad Gita)
Life may have been given by the breath of the gods or at its deepest, grainiest level, life may have manifested as animated matter from the random expression of energetic fluctuations in space-time. Wherever it came from, life energy continues its universal journey relying upon unusualness (deviancy) and diversity of life-forms and species and their ability to do useful activities.
In its simplest form, usefulness or utility might be considered as having the same meaning as ‘value’ – but this displaces the discussion to the question of what is ‘value’. Alternatively, taking a Taoist approach, we could define value as that which arises from what is present and utility from the gaps in-between (negative capability again).[4]
In economic terms, utility is expressed as relative preference within a range of options. There are other, domain-specific definitions of utility. However, the most scientific description of utility is to be found in physics, but to really understand utility we shall need to grapple with ‘entropy’ – that most slippery of scientific leviathans.[5]
“Entropy is the loss of energy available to do work. Another form of the second law of thermodynamics states that the total entropy of a system either increases or remains constant; it never decreases. Entropy is zero in a reversible process; it increases in an irreversible process.”[6]
Utility is a negative-entropic concept, being the capacity to do non-random (potentially useful) work with available free energy.
For the purposes of a purely practical discussion, we can also leave aside (for now) potential judgements about the ethical value of doing any specific work. We should be able to agree on an objective utilitarian preference, absent other factors, for conditions which provide the capacity to do any useful work and create a more conducive environment for more of the same. We shall return to ethics and ethical action once we have set the scene for the ability of life-forms to undertake any useful action at all.
Life exists by the use of usable energy and this use is offset by a corresponding export of entropy by life-forms into the surrounding environment. The most general description of entropy appears to be as the lack or loss of useful energy, or useful information about the state of things, with which to do work.
The word ‘entropy’ comes from the Greek word for ‘transformation’, which is very apt given the way in which entropy and energy are so closely intertwined, and that energy changes or transforms in its movements but is not usually destroyed.[7] Many readers who have considered this concept before are likely (as I am) to be perplexed by the concept of entropy.
Why is the notion of entropy so hard to explain and understand, and so prone to misunderstanding? Is entropy a description of physical disorder and deconstruction? Are we really measuring indeterminacy and randomness? Maybe it is something else that we can’t quite explain?
“I refuse to answer that question
on the grounds that I don’t know the answer”[8]
(Douglas Adams)
In the case of the Earth, we are told that this life process involves all life-forms (and all matter within the biosphere used to build living organisms) taking in energy from the Sun at a higher frequency and releasing the same amount of energy back into the environment and the universe at a lower frequency.
“The structure of life on this planet would run rapidly down were it not for a powerful low-entropy source, upon which almost all life on Earth depends, namely the Sun.”[9]
(Sir Roger Penrose)
The usefulness of energy is therefore related to the usable gradients, or deltas, between different types of energy. In other words, where heat energy is concerned, differentiation is life and uniformity is death.
Natural processes are subject to the second law of thermodynamics (which is either a law or a statistically accurate description):
“every natural process has in some sense a preferred direction of action. For example, the flow of heat occurs naturally from hotter to colder bodies…”[11]
Most of the natural processes we are aware of are not reversible (e.g., the dead do not come back to life and broken eggs stay broken).[12] In respect of life, the applicability of thermodynamic theory can be summarised as follows.
“There are two principal ways of formulating thermodynamics, (a) through passages from one state of thermodynamic equilibrium to another, and (b) through cyclic processes, by which the system is left unchanged, while the total entropy of the surroundings is increased. These two ways help to understand the processes of life.”[13]
Sir Roger Penrose has written extensively about entropy, more recently in his quest to ascertain where our universe comes from, where it may go to and the possibility it has a cyclical (natural) quality.
“Usually when we think of a ‘law of physics’ we think of some assertion of equality between two different things … However, the Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system – which is a measure of the system’s disorder, or ‘randomness’ – is greater (or at least not smaller) at later times than it was at earlier times.”[14]
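Penrose’s point can be stated compactly. For an isolated system, the Second Law is the following inequality – a standard textbook formulation, added here for reference rather than quoted from any of the works above:

```latex
S(t_2) \;\ge\; S(t_1) \quad \text{for all } t_2 > t_1,
\qquad \text{equivalently} \qquad
\frac{dS}{dt} \;\ge\; 0 .
```

It is precisely because this is an inequality, not an equality, that the law singles out a preferred direction of time.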
(Sir Roger Penrose)
Despite what Schrödinger has said about the simplicity of entropy, as Penrose points out, the more one looks, the more that it appears there is something tricksy about the second law of thermodynamics.
“Going along with this apparent weakness of statement, we shall find that there is also certain vagueness or subjectivity about the very definition of the entropy of a general system.”[15]
Entropy has an elusive will-o’-the-wisp nature which makes it very hard to pin down. This may be because of its asymmetric nature in time, or something to do with a certain arbitrariness in our definitions or application. Perhaps the universe does not have increasing entropy everywhere, and so it is not an invariant law but a property of our local space-time (galaxy, super cluster etc.). Alternatively, and seemingly more likely, it could be entirely a matter of perspective as to whether initial low entropy is a particularly relevant property of the universe.
We are informed that the Sun is negatively entropic, and the Earth is positively entropic. However, we must be careful with these terms. It appears that the states and processes of high, low and negative entropy act as flows and re-cycles (albeit, overall, the process is running down towards maximum entropy and warm bodies do not get warmer when in contact with colder bodies).
The Sun’s useful energy is finite and will run out eventually, and so the overall flow of entropy is still increasing. The general and greater flow is still from the lower-entropy and higher-energy state (the smaller, concentrated, hot body – the Sun) to the higher-entropy and lower-energy state (the colder, more distributed background of space). Energy is largely conserved throughout the whole process[16] but entropy increases as energy is ultimately becoming less useful for life-forms as the flow of solar energy continues its journey in space-time away from the Sun.
The Sun is considered a source of negative entropy, from the perspective of life-forms on Earth, even though it creates entropy in its formation and fusion processes. The process of fusing atoms at the heart of each star converts atoms – matter – into other types of heavier matter and the difference is released as uniformly radiated heat and light, i.e., pure energy.
Through the process of fusion, the Sun creates an extraordinary level of energy that is significantly different to the background cosmic temperature and local space-time temperature. The difference, or delta, between the energy from the Sun and the energy arriving on Earth from the rest of the universe gives rise to usable energy on Earth for life-forms (which ultimately leaks out into the universe, thus preventing the overheating of Earth).
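The energy bookkeeping behind this delta can be made concrete with photon energies. A minimal sketch follows, using the Planck relation E = hc/λ; the two wavelengths (500 nm for incoming sunlight, 10 µm for Earth’s outgoing infrared) are illustrative round numbers, not figures from the text:

```python
# Energy per photon: E = h*c / wavelength (Planck relation).
# The Earth absorbs sunlight (short-wavelength, high-energy photons)
# and re-radiates the same total energy as infrared (long-wavelength,
# low-energy photons) -- so it must emit many more photons than it absorbs.

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s

lambda_sun = 500e-9   # ~visible sunlight, 500 nm (illustrative)
lambda_ir = 10e-6     # ~Earth's thermal infrared, 10 micrometres (illustrative)

e_sun = h * c / lambda_sun   # joules per incoming photon
e_ir = h * c / lambda_ir     # joules per outgoing photon

photons_out_per_in = e_sun / e_ir
print(f"Incoming photon energy: {e_sun:.2e} J")
print(f"Outgoing photon energy: {e_ir:.2e} J")
print(f"Photons emitted per photon absorbed: {photons_out_per_in:.0f}")
```

More outgoing photons means many more ways of arranging the same total energy: the energy leaving the Earth carries more entropy than the energy arriving, which is the sense in which the Sun is a low-entropy source.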
When we speak about life processes being negatively entropic, we do not mean in exactly the same way that a star is. We are referring to the capacity of life-forms to consume and recycle the free energy from a star to create unique information, to do work, to recycle matter and recycle death.
Looked at from a higher dimension or systems level, life-forms are really agents of entropy in their use of free energy to create even more potential information, matter and energy microstates within their environment and beyond their planets. Life is entropic in that it tends towards a maximum variety of life-forms (the ecological configuration space populated by different species).
Life-forms use information and information structures to help energy flow and dissipate faster; they speed up the approach to chemical equilibrium.[17] Life-forms literally help carry and disperse the cosmic fire.[18]
“There is a common idea that a living organism is a sort of fight against entropy: it keeps entropy locally low. I think that this common idea is wrong and misleading. Rather, a living organism is a place where entropy grows particularly fast, like a burning fire. Life is a channel for entropy to grow, not a way to keep it low.”[19]
(Carlo Rovelli)
“Those on Earth come from your hand as you made them
When you have dawned they live
When you [have gone] they die
You yourself are lifetime, one lives by you
All eyes are on your beauty until you set
All labour ceases when you rest in the west”[21]
Like the Sun in its burning, life processes create entropy, but the entropy created is more like a waste product to life-forms. So, life processes borrow usable energy from the Sun for some time and use it within life-forms that require it to stay alive and to perpetuate the lifecycle (to create new life-forms that carry the fire). Pausing there for a moment, the astute reader may have noticed that stars, like the Sun, appear to be acting contrary to the second law of thermodynamics.
If it is improbable for sand on a beach to spontaneously rearrange itself into a sandcastle (or a sandy Taj Mahal), how can randomly distributed hydrogen atoms in a vast cloud in space arrange themselves into such tightly structured and useful stars?
“The probability that the sand pile would blow away and land as a sand castle, the least likely random arrangement, is almost incalculably low. But not impossible. It’s just overwhelmingly more likely that the low entropy sand castle turns into a high entropy sand pile. This is why entropy is always increasing because it’s just more likely that it will”
(Rich Mazzola) [22]
“This histogram illustrates an important principle: As you add more states to your system, it becomes more likely that the arrangement you observe is the most likely arrangement. For example with just 50 states, the odds of observing the least likely arrangement is 1 in 132,931,168,175. Now imagine the sand castle has hundreds of millions[23] of grains of sand.”[24]
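The statistical reasoning in these quotations can be sketched with a toy model: n “grains”, each independently in the left or right half of a box. The model is mine, not Mazzola’s, and it does not reproduce his quoted 1-in-132,931,168,175 figure (his underlying histogram is not shown here); it simply shows why extreme arrangements are overwhelmingly outnumbered:

```python
from math import comb

# Toy model: n "grains", each independently in the left or right half
# of a box. A macrostate "k grains on the left" contains comb(n, k)
# equally likely microstates. The extreme arrangement (all grains on
# one side -- the "sandcastle") has exactly 1 microstate, while the
# even split (the "sand pile") has comb(n, n//2).

n = 50
total = 2 ** n                # total number of equally likely microstates
w_extreme = comb(n, 0)        # all on one side: exactly 1 way
w_even = comb(n, n // 2)      # even split: vastly more ways

print(f"Microstates, all-on-one-side : {w_extreme}")
print(f"Microstates, even split      : {w_even:,}")
print(f"Odds of the extreme macrostate: 1 in {total // w_extreme:,}")
```

Entropy “always increases” only in this statistical sense: the system wanders among microstates, and almost all of them belong to the high-multiplicity macrostates.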
If entropy tends always to increase, I wondered whether stars were an example of an eddy in the entropy flow (a temporary reversal), where entropy decreases for a short time even though it is still ultimately tending to maximum entropy.
My initial understanding was that the gravitational forces involved in the creation of stars, in forcing atoms together, are greater than the energy required to force fusion reactions (more than overwhelming the energy consumed in the star-formation process). They are.
I also thought that when all of these different energies were combined, the total entropy in the star formation process would have temporarily decreased, as the end result is a configuration of matter that is more concentrated and structured. Apparently, this is not how entropy works.
“This is another common source of confusion, because a condensed cloud seems more ‘ordered’ than a dispersed one. It isn’t, because the speed of the molecules of a dispersed cloud are all small (in an ordered manner), while, when the cloud condenses, the speeds of the molecules increase and spread in phase space. The molecules concentrate in physical space but disperse in phase space, which is the relevant one.”[25]
(Carlo Rovelli)
Where did I go wrong?
Low entropy is not necessarily about the increased complexity or structure of a configuration of matter. Entropy is apparently lowest, even when matter is uniformly distributed, if it is in a very small phase space (e.g., conditions near the Big Bang) and it is highest when matter is massively dispersed in phase space (for example, an older universe with stars, solar systems, galaxies, clusters and black holes with lots of ‘empty’ space in between – the one we live in).[26]
Entropy is a description of the number of possible microstates the matter may be in and the difficulty in ascertaining that information. In the example of the star versus the dispersed hydrogen cloud, the momentum of the matter in the star (being its mass multiplied by its velocity) gives rise to more possible microstates for the matter.
The definition of phase space incorporates these additional values in momentum, and so the possible microstates for the arrangements of atoms increases in phase space (the denser space-time).[27] In addition, the Sun creates new matter, like helium and heavier atoms, and unlocks energy trapped in hydrogen and other atoms in the process. The unlocked energy further interacts with the existing matter in the star (heating it up). This all adds to the complexity of the number of possible microstates of all matter within a star system compared to a colder and widely distributed but more leisurely cloud of hydrogen atoms.
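The counting just described is captured by Boltzmann’s statistical definition of entropy – a standard formula, stated here for reference rather than quoted from the sources above:

```latex
S = k_B \ln W
```

where \(W\) is the number of microstates (positions *and* momenta, i.e. points in phase space) compatible with the observed macrostate, and \(k_B\) is Boltzmann’s constant. The condensing cloud concentrates in physical space but \(W\), counted over phase space, grows.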
For these reasons, it has been suggested that black holes are sources of maximum entropy since there appears to be no way of knowing or estimating the number of possible microstates of any matter that falls through the event horizon, nor is there any way to get useful energy back from it.[28] There appears to be no further extractable specific or probabilistic information available about that matter.[29] In order to maintain the cosmic accounting balance, black holes must be sources of great entropy, at least as perceived from this universe.
“Going to see the river man
Going to tell him all I can
About the ban
On feeling free.
If he tells me all he knows
About the way his river flows
I don’t suppose
It’s meant for me.”[30]
(Nick Drake)
Despite strenuous efforts, the science and descriptions of entropy continue to be very difficult for non-scientists (like me) to understand – yet ensuring that we squarely understand what entropy is will be helpful, given my attempt to create a more objective universal ethics of life.[31] It is some comfort that many eminent scientists acknowledge the difficulties faced in making sense of ever-elusive entropy. Perhaps it is because entropy is not a thing or a force: it is a statistical way of looking at the universe and the exchanges between actions and information.
Taking account of all the previous information about entropy and energy, let us put the concept of entropy into an analogous and practical framework to ensure we are comfortable with it. The entropic process is in some ways similar to the way in which energy changes as it interacts with space-time and matter, or how a river changes speed, meanders and even turns back upstream when it eddies on its way to the sea.
So, we shall say that entropy is like the journeys of flowing rivers, where the higher water ‘wants’ to fall (due to gravity) to a lower place, which happens by way of conversion of that (potential)[32] gravitational energy into kinetic energy. We wish to track (using mathematical guesswork) the likely specific movement of ten molecules of H2O from a starting point at the highest point of the river to the lowest point. The flow of water is always tending towards the sea or other lower places (let’s call this flow tendency ‘maximum entropy’) but the process is not instantaneous and is not at a uniform rate.
The impact of time and differences in flow rates give us areas or fields (such as pools, floodplains and estuaries) where the water flow slows or stops. Water can even temporarily flow upstream a little in river eddies. Some of the flow eventually reaches the sea or ocean.
We do not wish to know the exact location of each molecule but we do wish to know the probability of each molecule being in a specific location on Earth (microscopic states, or microstates). We divide the Earth into 1 m² parcels, in which we seek the probability of finding our specific water molecules. When the molecules started their journey, they were in a state in which it was relatively easy for us to guess their location. The number of molecules at the source of all rivers and higher-water-catchment areas is much smaller than the amount of water elsewhere. The geographic spread of locations where rain falls the most is also much smaller than the spread of locations where water finishes its descent. So, we have fewer molecules (in which our ten can be ‘hidden’) and a smaller number of parcels at the beginning of our probability estimate (less dispersed).
Very interesting things are possible with life-forms within certain parts of the flow,[34] notwithstanding that the flow is still tending towards the higher-entropy state. This flow gives life-forms useful energy, even though, like all natural processes, it is not truly reversible. We can describe the movement of those ten molecules from the start to the end of the falling down process as one of shedding usable (useful) energy, which is why life-forms can take advantage of this flow.[35] In the process of doing so, the molecules become more difficult to locate. The measure of entropy here is a mathematical equation that seeks to specify the probability of each molecule being in any microstate at different times.
Using this analogy, it becomes obvious that – in the absence of perfect, continuous and free[37] knowledge about the location of every molecule of H2O on Earth – the ability to guess the number of possible locations (the microstates) for each H2O molecule becomes increasingly more difficult (costly) as they shed flow energy and disperse. The calculation work required is also a use of energy that creates entropy (heat from computing).
Entropy here is the probable number of microstates of the molecules at each stage of the process.[38] It will therefore increase as the energy of each H2O molecule is transformed and shared (or shed) with other matter. As the water disperses, it becomes increasingly difficult to know or guess what is happening to our ten H2O molecules.
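The growing difficulty of locating the ten molecules can be put in numbers using Shannon entropy (introduced more formally later in this chapter). This is a sketch of my river analogy only; the parcel counts at each stage are invented for illustration:

```python
from math import log2

# Ten water molecules, each equally likely to be in any one of N parcels.
# Shannon entropy of one molecule's location, uniform over N parcels:
#   H = log2(N) bits.  Ten independent molecules: 10 * log2(N) bits.
# As the water disperses, N grows, and so does our uncertainty.

stages = {
    "mountain spring": 10,             # few parcels to hide in (illustrative)
    "river and pools": 10_000,         # (illustrative)
    "ocean": 1_000_000_000,            # (illustrative)
}

for stage, n_parcels in stages.items():
    h_total = 10 * log2(n_parcels)     # bits needed to pin down all ten
    print(f"{stage:>16}: {n_parcels:>13,} parcels -> {h_total:7.1f} bits")
```

The number of bits needed to pin the molecules down is the “cost” of the lost information: it rises steadily as the water sheds usable energy and spreads out.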
For the purposes of water and other matter, the Earth largely acts as a closed system (exchanges with the surrounding space are primarily energy exchanges) and so the process can give rise to a recycling of water and the molecules can take up a new higher-energy (lower-entropy) configuration and start the process again. The water cycle process is not truly reversible (i.e., it is not the same water molecules spontaneously flowing from the sea up to the mountains and then into the sky), yet the power of the Sun gives us something very close to reversibility in practice: the natural water cycle. The additional energy required to move the water back up (e.g., from the ocean, sea or lake) to the higher altitude point to restart the flow is a product of the Sun’s continuous supply of free useful energy.
We must now return briefly to the example of a deviant pack of cards. The probability of having 26 red cards dealt in consecutive order from a well-shuffled pack is extraordinarily low. Note now what happens after red card No 26 is dealt – all of that initial unlikeliness is now balanced by a near-absolute certainty (in probability terms) that the next 26 cards will be black.[39] The total number of microstates (if the colour black is the only quality in question) for the remaining cards is now just 1 (i.e., there is no information uncertainty). The initial improbability of the system is mirrored by the extremely unusual amount of information those initial conditions give us about the nature or microstates of the remaining cards.
Is the extreme improbability of 26 red cards being dealt first just another demonstration of the universal law that makes it near impossible to know the future?
The universe appears to go to great lengths in its laws to avoid absolute pre-determinacy at all scales. In the example of the first 26 cards being red, whatever configuration the next 26 cards are in, we know – without doing any more work – that each card dealt will be black. There is zero entropy, in information terms, within the remaining pack in respect of the quality or value ‘black’. This move from very high initial uncertainty to absolute certainty of the colour of the cards as they are dealt gives a simple general sense of entropy and negative entropy in information terms.
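The card example is easy to check directly. A short sketch (assuming a fair shuffle, so all red/black colour patterns are equally likely):

```python
from math import comb

# Probability that a well-shuffled 52-card deck deals its 26 red cards
# first: exactly one of the comb(52, 26) equally likely colour patterns.
p_all_red_first = 1 / comb(52, 26)
print(f"P(first 26 cards all red) = {p_all_red_first:.3e}")

# Once the 26th red card is dealt, every remaining card must be black:
# only one colour arrangement is left, so zero bits of colour uncertainty.
remaining_colour_arrangements = 1
print(f"Colour arrangements left for the deck: {remaining_colour_arrangements}")
```

The near-impossible opening (odds of roughly 1 in 5 × 10¹⁴) buys absolute certainty about everything that follows – the mirror-image of improbability and information described above.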
The concept of information entropy was pioneered in a 1948 paper by Claude Shannon.[40] Shannon realised that entropy can also be described in information terms and not just in respect of heat dynamics. Indeed, he showed that information and entropy are directly correlated and can be said to be different descriptions of the same thing. The certainty we have about any event (the numbers in a lottery draw, the colour of balls being pulled from a jar or the colour of cards to be dealt from a pack) is inversely proportional to the entropy (since certainty about the actual microstates within a macrostate equals maximum information available).
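Shannon’s measure fits in a few lines. This is his standard formula, H = −Σ p·log₂(p), applied to some simple distributions:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits; outcomes with p == 0 contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: maximum uncertainty, 1 bit
print(shannon_entropy([1.0]))        # certain outcome: zero entropy
print(shannon_entropy([0.9, 0.1]))   # biased coin: somewhere in between
```

Maximum certainty gives zero entropy; maximum uncertainty (all outcomes equally likely) gives maximum entropy – the inverse relationship described above.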
“Memory and agency utilise the information stored in low entropy and translate it into information readable in the macroscopic world … Traces of the past and decisions by agents … are major sources of everything we call information. In both cases, information is created, … at the expenses of low entropy, in accordance with the second principle. In a fully thermalised situation, there is no space for memories or for agents.”[41]
(Carlo Rovelli)
The ability to undertake unusual activity requires information and consciousness (including memory and culture), or coded storage mechanisms, that give life-forms a relative perpetuation, procreation or survival advantage in their uniquely challenging environments. The reason for the information correspondence is that entropy is a probability analysis; it is not an attempt to define an actual state of play. Certainty is a situation of no probabilities. Uncertainty can be defined as a probabilistic assessment of our lack of confidence about what will happen next in any event. In addition, the generation of information about past and potential future events is a consequence of ever-increasing entropy.
Why might this be relevant to freedom of action and the ethics of life for life-forms?
“David Layzer … made it clear that in an expanding universe the entropy would increase, as required by the second law of thermodynamics, but that the maximum possible entropy of the universe might increase faster than the actual entropy increase. This would leave room for an increase of order or information at the same time the entropy is increasing!”[42]
Life is an emergent complex property that can arise, within a window of opportunity, due to the interplay between different types and locations of energy and matter. Life-forms live in the delta between the difficulty of knowing or doing anything useful and an increase in available information we have about this condition. At first this seems counter-intuitive since the aggregate information entropy must also increase with general cosmic entropy (thermodynamic equilibrium) but, as with life-forms’ use of usable energy in this overall process, information about the past is useful and we can make measurements to guess at or even mould likely futures.
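Layzer’s argument can be illustrated with a deliberately crude toy model, entirely of my own making, in which both entropies grow but the maximum possible entropy grows faster; the numbers mean nothing beyond their relative slopes:

```python
# Toy illustration of David Layzer's argument (numbers purely illustrative):
# in an expanding universe the maximum POSSIBLE entropy can grow faster
# than the ACTUAL entropy. The widening gap is "room" for order and
# information, even while actual entropy obeys the second law and rises.

for t in range(5):
    s_max = 100 * t          # maximum possible entropy (grows fast)
    s_actual = 40 * t        # actual entropy (still increasing, as required)
    gap = s_max - s_actual   # available room for order and information
    print(f"t={t}: S_max={s_max:4d}  S_actual={s_actual:3d}  room={gap:4d}")
```

The delta that life-forms live in, described above, is this gap: both quantities increase, but so does the space between them.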
‘Reality’ unfolds with increasing potential information and memory about what has happened and increased uncertainty about what is happening or going to happen (an increase in aggregate microstatic uncertainty).[43] The current success of life-forms is despite the fact that the universe appears to ultimately be headed towards a condition of maximum entropy (death to any life-forms), where there is likely to be no more useful matter, energy or information from the perspective of any life-forms that are still alive in this universe through the next 10^106 years or so.[44]
“But entropy, heat, past and future are qualities that belong not to the fundamental grammar of the world but to our superficial observation of it. ‘If I observe the microscopic state of things,’ writes Rovelli, ‘then the difference between past and future vanishes … in the elementary grammar of things, there is no distinction between “cause” and “effect”’”[45]
Whilst we must be very careful to avoid bringing this tricksy second law into potentially subjective definitions of order and structure, in respect of doing useful work and creating the conditions for greater freedom of action we should be on somewhat safer ground.[46] By living and evolving through variation, life-forms create useful information and this gives rise to other complex life-forms. This is the simplest summary of how all life-forms on Earth evolved from a simple shared ancestor. All DNA-based life-forms[47] on Earth show this evolution (variation) from simplicity in code in having a common sequence of DNA:[48]
GTGCCAGCAGCCGCGGTAATTCCAGCTCCAATAGCGTATATTAAAGTTGCTGCAGTTAAAAAG
It seems that we can never move straight or true.
The distance between you and me, destined always to be travelled concentrically.
It appears that I digress, moving at 45 degrees from my heart’s desire and your fragility.
If light must snake and oscillate in its eternal trajectory then why should we expect to travel differently?
You and I are writ in letters: GATC
They survive by twinning, traversing fecund spaces pregnant with possibility.
Our lives together are writ the same.
The direct path always and ever curved.
And that last departure?
Was just me returning back to you straight and true, the only way I knew.[50]
Entropy, like evolution, is unusual in not being an equality. This also tells us something fundamental about the nature of nature. Natural evolution (as a life process on Earth) and entropy are strongly twinned together.
“Natural Selection helps to select species which are most effective in survival and can efficiently utilize energy and negative entropy.”[51]
(Marek Roland-Mieszkowski)
When James Lovelock (proponent of the Gaia theory)[52] was asked how he would look for life on Mars, he replied:
“I’d look for an entropy reduction, since this must be a general characteristic of life.”[53]
(James Lovelock)
The simple, non-contentious view, first put forward over 100 years ago, is that life-forms are the most extraordinary known natural useful engines in the universe.
“Thus a living organism continually increases its entropy – or, as you may say, produces positive entropy – and thus tends to approach the dangerous state of maximum entropy, which is of death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy – which is something very positive as we shall immediately see.”[54]
(Erwin Schrödinger)
As such, life always has some potentially measurable impact on its environment and some value, even using a strictly utilitarian approach for all living things (whether sapient, sentient or otherwise). It may be that our understanding of entropy is still nascent and, given the equivalence of matter and energy by E = mc², the differences between the entropy of matter in an expanding, ‘cooling’ universe are cancelled out eventually by a decrease in gravitational entropy of the whole system as it collapses or gravity ceases to apply (there being no matter with which to bend space-time and conceptualise any space between things). Perhaps material and energy entropy are just the two faces of an underlying unity. They may always sum to zero entropy again when all matter and energy recombine – such as at a singularity, whether at the event horizon of each black hole, the pre-Big Bang singularity or at the end of this current universal epoch.[55]
“Now as I was young and easy under the apple boughs
About the lilting house and happy as the grass was green,
… Nothing I cared, in the lamb white days, that time would take me
Up to the swallow thronged loft by the shadow of my hand,
In the moon that is always rising,
Nor that riding to sleep
I should hear him fly with the high fields
And wake to the farm forever fled from the childless land.
Oh as I was young and easy in the mercy of his means,
Time held me green and dying
Though I sang in my chains like the sea.”[56]
(Dylan Thomas)
“Living entities, especially the ones we are familiar with, like a horse or an olive tree, can be thought of as replicating objects in space that have gathered, over considerable evolutionary time, immense amounts of information on how to solve a multitude of problems they might face in their lifetime.”[57]
(Ioannis Tamvakis)
Ioannis Tamvakis takes an interesting, potentially more objective and astro-biologically useful approach with his quantification of life theory. Such an approach can be substrate-neutral, given the potential for life-forms to take many different chemical forms. He suggests it may be better to attempt to quantify life, rather than to attempt to define it, referring to a modified approach to information theory.
Alternatively, some scientists put forward the theory that life is a natural consequence of higher-frequency energy that, in effect, started as a neat innovation of atoms or molecules (amino acids) to dissipate excess energy into the environment.
“Jeremy England … has derived a mathematical formula that he believes explains this capacity. The formula, based on established physics, indicates that when a group of atoms is driven by an external source of energy … and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate increasingly more energy … under certain conditions, matter inexorably acquires the key physical attribute associated with life.”[58]
Jeremy England’s theories are a development of the abiogenesis theory, which states that life started from inanimate non-living matter, such as organic compounds, which over very many stages[59] evolved into life-forms, due to the beneficial conditions that existed at the relevant time and most particularly the access to free usable energy.
The line or leap between pre-organic matter to aliveness may be something that we should expect to be replicated elsewhere in the universe, or perhaps it was one incident of inexplicable and improbable good luck. It would be wise for us to act as if it is a case of universal good luck.
“A genome, then, is at least in part a record of the useful knowledge that has enabled an organism’s ancestors – right back to the distant past – to survive on our planet … Looked at this way, life can be considered as a computation that aims to optimize the storage and use of meaningful information. And life turns out to be extremely good at it.”[60]
(Philip Ball)
There is a strong correspondence between thermodynamics and information theory. James Clerk Maxwell explored this correspondence with a thought experiment now known as ‘Maxwell’s demon’.
“In the thought experiment, a demon controls a small door between two compartments of gas. As individual gas molecules approach the door, the demon quickly opens and shuts the door so that only fast molecules are passed into one of the chambers, while only slow molecules are passed into the other. Because faster molecules are hotter, the demon’s behaviour causes one chamber to warm up and the other to cool down, thereby decreasing entropy and violating the second law of thermodynamics.”[61]
A number of theorists, including Leo Szilárd, Léon Brillouin and, later, Rolf Landauer, looked at the total impact on the system and showed that the thermodynamic cost for an observer (demon) to remember its intention, take action and record information must be included in the total calculation of the entropy of the system. Information theory shows that even if the demon can do its work without increasing the energy entropy of the system, this will give rise to an information entropy as eventually the demon must wipe its memory clean. There is always an energy or information cost of any agent acting to change any thermodynamic system.
This reminds one of the concept in quantum mechanics that making an observation will always have an impact on the event being observed.[62] There is simply no separate world or universe outside of us to be interacted with, observed or measured – “for here there is no place that does not see you”.[63]
“evolution can be understood as a kind of random computational walk through software space.”[64]
(Gregory Chaitin)
The evolution of life-forms has been dictated by the use and development of a number of extraordinary biological algorithms. Life-forms must keep the energy costs of storing and transmitting survival and genesis information to the lowest possible level and as close as possible to the efficiency threshold (known as the ‘Landauer bound’), given the increased costs in energy and entropy of ‘creating’ and retaining such information.
“We use ideas from the new algorithmic formulation of information theory, in which one considers individual objects and the amount of information in bits needed to compute, construct, describe, generate or produce them, as opposed to the classical formulation of information theory in which one considers an ensemble of possibilities and the uncertainty as to which of them is actually the case.”[65]
(Gregory J. Chaitin)
Life-forms are demons that direct the flow of energy and matter and try to externalise the costs of staying alive by dumping waste into the surroundings; they can only do this if they develop means of storing, compressing, decoding and discarding information about what is helpful in the struggle for survival and perpetuation. Encoding information in DNA, RNA and culture are ways in which life-forms seek to maintain an advantage in this continuous struggle. They are much more efficient than the best supercomputers in doing so.
“the cost of computation in supercomputers is about eight orders of magnitude worse than the Landauer bound … which is about six orders of magnitude less efficient than biological translation when both are compared to the appropriate Landauer bound. Biology is beating our current engineered computational thermodynamic efficiencies by an astonishing degree.”[66]
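The Landauer bound itself is easy to make concrete: erasing one bit of information at temperature T costs at least k_B·T·ln 2 of free energy. A back-of-envelope sketch in Python (the eight-orders-of-magnitude scaling is taken from the quotation above; the rest is standard physics):

```python
import math

# Landauer bound: minimum energy to erase one bit at temperature T,
# E = k_B * T * ln(2)
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # roughly room temperature, K

landauer_joules_per_bit = k_B * T * math.log(2)
print(f"Landauer bound at 300 K: {landauer_joules_per_bit:.3e} J/bit")

# The quotation puts supercomputers ~8 orders of magnitude above this
# floor, so their cost per bit is roughly:
supercomputer_joules_per_bit = landauer_joules_per_bit * 1e8
print(f"Approx. supercomputer cost: {supercomputer_joules_per_bit:.3e} J/bit")
```

Biological translation, sitting only a couple of orders of magnitude above the same floor, is by this measure staggeringly efficient.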
Life-forms are engaged in a constant sequence of ‘if-then’ assessments to maintain their identity; persistence requires ‘good enough’ predictions about the future. The costs of these continuous calculations and the accumulation of information and errors in the replication of information (including in living cells) eventually lead to individual death as the most efficient means to continue the perpetuation of life in newer forms.
“A code for generating the first 15,000 digits of pi in the programming language C, for instance, can be as short as 133 characters.”[67]
(Jordana Cepelewicz)
Algorithmic information theory (AIT) is being used in advanced AI, biotech and other applications to find efficient short-cuts that generate useful results. Scientists realise that we need to learn more from the natural systems around us. A simple way of thinking about AIT is to consider that it takes much less information to provide the process (steps or algorithm) that generates π than it does to store the digits themselves.
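The π example is easy to demonstrate. The sketch below (my own illustration, using Machin's formula rather than the 133-character C program mentioned above) generates arbitrarily many digits of π from a few lines of code – the algorithm is vastly shorter than its output:

```python
def arctan_inv(x, scale):
    """scale * arctan(1/x) via the alternating Taylor series,
    done entirely in integer arithmetic."""
    total, power, k = 0, scale // x, 0
    while power:
        term = power // (2 * k + 1)
        total += term if k % 2 == 0 else -term
        power //= x * x
        k += 1
    return total

def pi_digits(n):
    """First n+1 digits of pi (including the leading 3) using
    Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    scale = 10 ** (n + 10)  # 10 guard digits against rounding drift
    pi = 16 * arctan_inv(5, scale) - 4 * arctan_inv(239, scale)
    return str(pi)[: n + 1]

print(pi_digits(50))
```

A dozen lines of source code describe an infinite, apparently patternless stream of digits – the essence of algorithmic compression.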
“Creationists love to insist that evolution had to assemble upward of 300 amino acids in the right order to create just one medium-size human protein. With 20 possible amino acids to occupy each of those positions, there would seemingly have been more than 20^300 possibilities to sift through, a quantity that renders the number of atoms in the observable universe inconsequential … it would have been wildly improbable for evolution to have stumbled onto the correct combination through random mutations within even billions of years. The fatal flaw in their argument is that evolution didn’t just test sequences randomly: The process of natural selection winnowed the field.”[68]
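The winnowing effect of selection can be demonstrated with a toy. The sketch below is a version of Richard Dawkins' well-known 'weasel' program (my illustration, not a model of real biochemistry): cumulative selection finds a 28-character target in hundreds of generations, where blind random search would face 27^28 possibilities:

```python
import random

random.seed(42)  # fixed seed for reproducibility
TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(s):
    """Number of characters matching the target."""
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(parent, rate=0.05):
    """Copy the parent, randomising each character with small probability."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

def evolve(pop_size=100):
    parent = "".join(random.choice(ALPHABET) for _ in TARGET)
    generations = 0
    while parent != TARGET:
        # selection keeps the fittest of the parent plus its mutated offspring
        candidates = [mutate(parent) for _ in range(pop_size)] + [parent]
        parent = max(candidates, key=fitness)
        generations += 1
    return generations

print(f"Target matched in {evolve()} generations")
```

The point is not that evolution has a target in mind (it does not), but that retaining partial successes collapses an astronomically large search space into a tractable one.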
In the blockchain space, we also see tools that take advantage of algorithmic simplicity to create data complexity – for example, elliptic curve digital signature algorithms (ECDSA), which produce digital signatures that anyone can validate easily but that are extraordinarily difficult to forge, and from which it is computationally infeasible to recover the underlying private keys.
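Python's standard library has no ECDSA implementation, so the sketch below illustrates the underlying asymmetry with modular exponentiation instead – the same easy-forward, hard-backward structure, with toy parameters that offer no real security:

```python
# Toy illustration of the one-way asymmetry behind schemes like ECDSA
# (not real cryptography): modular exponentiation is cheap to compute,
# while the inverse (the discrete logarithm) requires brute-force search.
p = 2**61 - 1          # a Mersenne prime as the modulus (toy parameter)
g = 3                  # base (assumed adequate for this demonstration)
private_key = 123456789
public_key = pow(g, private_key, p)   # fast: built-in modular exponentiation

# Verifying the forward direction is easy for anyone:
assert pow(g, private_key, p) == public_key

def brute_force_dlog(target, limit):
    """Try to recover the exponent by exhaustive search up to `limit`."""
    acc = 1
    for k in range(1, limit):
        acc = acc * g % p
        if acc == target:
            return k
    return None  # not found within the search budget

# With a tiny budget the search is overwhelmingly likely to fail:
print(brute_force_dlog(public_key, 10_000))
```

With realistic key sizes the search budget required becomes astronomical, which is exactly the gap between easy validation and infeasible forgery that the text describes.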
Natural systems (even non-living ones, such as solar systems) tend towards maximum simplicity and stability whilst generating entropy in doing so. This ties in with Jeremy England’s work on how unconscious matter is likely to naturally self-organise into life-forms in order to better dissipate environmental energy.
“Every species of living thing can make a copy of itself by exchanging energy and matter with its surroundings. One feature common to all such examples of spontaneous ‘self-replication’ is their statistical irreversibility: clearly, it is much more likely that one bacterium should turn into two than that two should somehow spontaneously revert back into one”[69]
(Jeremy England)
Given the various attractive effects between bodies, smaller bodies such as atoms are also likely to form collective patterns and arrive naturally at more stable energy forms. The transformation of atoms into life-forms is a very efficient way to dissipate free energy. The beautiful appearance of complexity and order seen in fractals is an example of algorithmic efficiency.
“Life resembles a fractal … The ‘fractals’ of life are cells, arrangements of cells, many-celled organisms, communities of organisms, and ecosystems of communities. Repeated millions of times over thousands of millions of years, the processes of life have led to the wonderful three-dimensional patterns seen in organisms, hives, cities and planetary life as a whole.”[70]
(Lynn Margulis and Dorion Sagan)
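The algorithmic efficiency behind fractal patterns can be shown in very few lines. Below, a one-line rule (my illustration) draws the Sierpiński triangle – endless self-similar structure generated from almost no code:

```python
def sierpinski(rows):
    """Rows of Pascal's triangle mod 2 drawn as text: cell (x, y) is
    filled iff the bits of x are a subset of the bits of y, which is
    exactly when the binomial coefficient C(y, x) is odd."""
    return ["".join("#" if (x & y) == x else " " for x in range(y + 1))
            for y in range(rows)]

for line in sierpinski(16):
    print(line)
```

In the first 2^n rows exactly 3^n cells are filled: a tiny generating rule, self-similarity at every scale, and the same compression of apparent complexity into a short description that Chaitin describes.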
It should not be surprising that living nature has taken advantage of algorithmic simplicity to find short-cuts to survival solutions. The walk may be random, but the work becomes iterative and also gives rise to variations on a theme, when success is chanced upon. Chaitin contends that the walk and work done by nature and life-forms is not entirely random, instead following a distribution based on Kolmogorov complexity. The Kolmogorov complexity of something is the length of the shortest algorithm or code that produces the result.
This gives rise to accelerated evolutionary solutions to problems that are not possible in an entirely random distribution.[71] Randomness under this approach is the extent to which any string of data cannot be algorithmically compressed.[72]
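Kolmogorov complexity is uncomputable in general, but compression gives a practical upper bound on it, and hence a workable proxy for randomness. A rough sketch using Python's zlib, where compressed length stands in for 'shortest description':

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of zlib-compressed data: a crude, computable stand-in for
    Kolmogorov complexity (always an upper bound, never exact)."""
    return len(zlib.compress(data, level=9))

random.seed(0)
structured = b"AB" * 5000  # highly regular: admits a very short description
random_ish = bytes(random.randrange(256) for _ in range(10_000))  # incompressible

print(compressed_size(structured))  # tiny: the repetition compresses away
print(compressed_size(random_ish))  # near 10,000: no shorter description found
```

Under this view, the random-looking string really is 'more random' precisely because no algorithm much shorter than the string itself can reproduce it.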
“We argue that the proper framework [for consciousness] is provided by AIT and the concept of algorithmic (Kolmogorov) complexity. AIT brings together information theory (Shannon) and computation theory (Turing) in a unified way and provides a foundation for a powerful probabilistic inference framework (Solomonoff). These three elements, together with Darwinian mechanisms, are crucial to our theory, which places information-driven modeling in agents at its core.”[73]
(Giulio Ruffini)
The ability of living organisms to think or make decisions arises from the need of life-forms to model changes in the surrounding environment, including the impact of the actions of the life-form itself. Reality is the best model each life-form believes in, or works within, to make the simplest sense of its environment given the overwhelming and non-computable quantity of data in everyday life.
“the level of consciousness can be estimated from data generated by brains, by comparing its apparent and algorithmic complexities. Sequences with high apparent but low algorithmic complexity are extremely infrequent, and we may call them ‘rare sequences.’ Healthy, conscious brains should produce such data.”[74]
(Giulio Ruffini)
Nature and life (like science and logic) love and reward efficiency. We demand that our theories, axioms and laws are much simpler than the reality they seek to represent since otherwise they are of no use. Such theories are like computer programs on which we can operate algorithms. Indeed, the most ground-breaking theories are often extraordinary in their apparent simplicity (think E = mc²) — they are also a type of algorithmic efficiency.
Life-forms are a process for simplifying and solving complex problems, where the solution has to work in reality and the consequences of failure are often final. In addition to the algorithmic efficiency that is needed to work more intelligently (to conserve energy), nature has an in-built drive for maximal diversity.
Diversity is the strategy that best ensures that some life-forms (and therefore life itself) always perpetuate. This drive for diversity is most on show after periods of biospheric destruction or catastrophe, as witnessed by the many periods of great speciation in the Earth’s past (including the Cambrian explosion).
“Contrary to the tendency of optimization algorithms to converge over time to a single ‘best’ solution, natural evolution instead exhibits a remarkable tendency toward divergence – continually accumulating myriad different ways of being. This observation is the crux of an alternative perspective in evolutionary computation (EC) that has been gaining momentum in recent years: evolution as a machine for diversification rather than optimization.”[75]
Sapients are therefore required to take account of the utility value of different species (perhaps partly quantified using algorithmic information theory) and the scientific and sacred value of diversity of species in their decision-making, or they risk – at best – an avoidably impoverished life and loss of valuable information. Failure to do so may even invite destruction by others (such as aliens or AI) that might base their ethical treatment of us on our current behaviours and laws.
Our current ethical frameworks would make the prospect of meeting alien life terrifying. If we think that our ethical foundations are sufficient, then we should expect aliens to act towards us as we currently do towards each other and, more importantly, to other species with less intelligence or power – in which case extraordinary levels of human death, slavery and suffering will be our reward (if not extinction).
It seems more likely that any advanced galaxy-faring civilisation will have had to acquire an ethical framework that incorporates the value of life and diversity of life-forms and species in its rule-making. This is necessary in order to avoid self-destruction (either directly through unchecked violence with greater technological capability or by destruction of a viable biosphere or similar).[76]
Such a wiser alien species would be capable of conduct that expressed inter-species respect, so they would likely leave us alone until we had reached a sufficiently mature level to interact with – though they may be here already, watching us develop.
A suitable universal ethical framework cannot be built on necessarily subjective or local foundations. Any beings from anywhere in the universe making ethical decisions should be able to reach similar conclusions as to their primary ethical axioms. We must also leave some room for the risks of dealing with those purely self-centred life-forms that may yet be highly technologically advanced or resourceful. It is of limited value to be right when someone kills you in breach of logic and law.
Our natural world is full of problem-solving engines and strategies that have been developed over billions of years. It would be wise to seek to understand the successes of nature and the nature of success. To study and protect every species we can, rather than destroying them before we even know what they do and how they do it. Before we have understood the way of the world.
It seems that even this universe must die eventually in order for matter and time to be reborn. Perhaps such cycles are the only in-built natural safety valves against permanent death, darkness or eternal wrongdoing. Dreams of individual immortality, whether AI or otherwise, are in fact immature nightmares – as a moment’s reflection makes clear. Endless life for life-forms is the true death.
“To those who can hear me, I say – do not despair. The misery that is now upon us is but the passing of greed – The bitterness of men who fear the way of human progress.
The hate of men will pass, and dictators die, and the power they took from the people will return to the people. And so long as men die, liberty will never perish …
You, the people, have the power to make this life free and beautiful. To make this life a wonderful adventure. Let us use that power – let us all unite.”[77]
(Sir Charles Spencer Chaplin)
Sir Roger Penrose has developed a conformal theory of time, with cyclical expansion leading to new singularities after aeons of time. The cooling, expanding universe[78] is eventually left (after extraordinarily long periods) with only massless particles (such as photons) and other forms of energy radiation. These particles without rest-mass have no interaction with space-time.
In a universe with no massive particles, as there is little that can interact with anything else directly, the concepts of time and space become effectively meaningless. In that sense, there is no meaningful difference (other than potentially in scale) between the conditions from which our current universe arose and the conditions that inhabitants of a future universe would perceive when looking at the start of their own universe (i.e., towards the other side of the end of our universe).[79]
Other theories[80] suggest our universe’s initial singularity may be the inside of a supermassive black hole in another universe. Our universe can then be seen as just one small part of a larger self-replicating natural process giving rise to a multiplicity of new universes.
All these theories are intellectually tempting. In addition to the coincidence of singularities, there is something very interesting about the idea of the greatest areas of entropy as perceived in our universe (black holes) being the source of the lowest-entropy states in other universes; it fits with the tricksy perspectival or relative quality of the second law of thermodynamics and allows for a truly cyclical quality of entropy. It has a wonderful fractal quality too – a self-similarity at different scales such as we see elsewhere in nature.
We should also be mindful of ergodicity, the law of large numbers and effectively timeless phase space. Poincaré’s recurrence theorem states that, despite the statistical likelihood of entropy increasing, an isolated system confined to a finite phase space must – given enough throws of the dice – eventually return arbitrarily close to its initial arrangement, no matter how unlikely that arrangement is.
Whilst Poincaré’s recurrence timespan is expected to be longer than the lifespan of our universe, it perhaps also points to a way in which, if we assume time is not an objective reality, every possible universe (no matter how unlikely its starting conditions) exists. Freedom of action at the cosmological scale.
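Poincaré recurrence can be watched in miniature. Arnold's cat map scrambles a grid of points into apparent randomness, yet because it merely permutes a finite set of states, iterating it must eventually restore the original arrangement exactly (a toy illustration, not a cosmological model):

```python
def cat_map_period(n):
    """Arnold's cat map (x, y) -> (2x + y, x + y) mod n permutes the
    n*n grid points, so repeated application must eventually return
    every point to where it started: a miniature Poincaré recurrence."""
    def step(state):
        return [((2 * x + y) % n, (x + y) % n) for (x, y) in state]

    start = [(x, y) for x in range(n) for y in range(n)]
    state, steps = step(start), 1
    while state != start:
        state = step(state)
        steps += 1
    return steps

for n in (2, 5, 10, 50):
    print(f"{n}x{n} grid recurs after {cat_map_period(n)} steps")
```

The intermediate states look thoroughly disordered, and the recurrence time grows with the size of the grid – at cosmological scale, far beyond any meaningful timespan, which is precisely the caveat in the text.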
“Murderers are easy to understand. But this:
that one can contain death, the whole of death,
even before life has begun,
can hold it to one’s heart gently,
and not refuse to go on living,
is inexpressible.”[81]
(Rainer Maria Rilke)
“The invisible Spirit (Atman) is eternal, and the visible physical body is transitory …
The Spirit by whom this entire universe is pervaded is indestructible.
No one can destroy the imperishable Spirit.
The physical bodies of the eternal, immutable, and incomprehensible Spirit are perishable.”[82]
Life is universal energy manifesting in extraordinary utility machines; life-forms are really useful engines.
We do not know how widespread life is or whether it is confined to Earth, though the sheer vastness of the universe[83] makes it plausible that life is already experimenting in other parts of the Milky Way or other galaxies. Though, as Enrico Fermi asked: “Where is everybody?”[84]
Life-forms are a natural universal means of recycling energy, death and entropy, and of creating potentially useful and unique information. Intelligence, consciousness and other qualities of sentient and sapient life-forms are emergent properties of this continuing experiment with free energy. This interplay of energy and matter, through a process of cooperation and competition, gives rise to all the extraordinary biological complexity on Earth and all the knowledge and wisdom that humans have managed to achieve. All species must therefore have value as manifest useful engines.
The most problematic relative questions will always remain in assessing the value of other species:
What is the quantity of usefulness? How do we measure and value utility? What is the ethical impact of competition for the utilisation of resources?
These will inevitably tend to lead us back to entirely subjective measurements and values that undermine the enterprise of creating a universal ethical framework. We will need to keep using scientific methods and non-dualistic thinking to seek to identify and mitigate human-centricity and weaknesses. We will need to use, evolve and discard scientific axioms where proof of ‘truth’ is not possible[85] and axioms of wisdom where experience shows us that we are wrong.
We could also create a non-anthropocentric ‘Aliveness, Feeling, Intelligence and Wisdom’ behind-the-veil test that in principle could be structured to significantly reduce our subjective tendencies when measuring various qualities of life – though until we have another sapient species to communicate with on such matters it is impossible to remove our dominant-species prejudices.
Ultimately any system of valuation of different species and diversity must always be grounded in our lack of foresight about what the future holds for all life-forms and species – our Promethean limits.
Footnotes:
[1] “Letter to Theo”, 1880 CE. Theodorus van Gogh was Vincent’s ever-loving, supportive younger brother. Image: Vincent van Gogh, “Tree Roots and Trunks”, 1890 CE, Wikipedia, public domain.
[2] Peter Pink-Howitt, “For you are dust”, AI-art, 2022 CE.
[3] 2:27.
[4] “Thirty spokes are joined in the wheel’s hub. The hole in the middle makes it useful. Mold clay into a bowl. The empty space makes it useful. Cut out doors and windows for the house. The holes make it useful. Therefore, the value comes from what is there, but the use comes from what is not there”, Lao Tzu, Tao Te Ching, c. 400 BCE.
[5] Image: Gustav Doré, “Destruction of Leviathan”, 1865 CE, Wikipedia, public domain.
[6] Lumen Learning: “Entropy and the Second Law of Thermodynamics: Disorder and the Unavailability of Energy”, n.d.
[7] In fact, this is a description of the law of conservation of energy, of which the first law of thermodynamics is a variation. Some contend that it may not hold true once general relativity is included.
[8] BBC Radio 4, “42 Douglas Adams quotes to live by”, n.d.
[9] Cycles of Time, 2010 CE.
[10] Marek Roland-Mieszkowski, “Life on Earth: flow of Energy and Entropy”, Digital Recordings, 1994 CE.
[11] Massachusetts Institute of Technology, “Axiomatic Statements of the Laws of Thermodynamics”.
[12] However, animals like the immortal jellyfish do something interesting in resetting their biological clocks.
[13] Wikipedia, “Second law of thermodynamics”.
[14] Sir Roger Penrose, Cycles of Time, 2010 CE.
[15] Ibid.
[16] Leaving aside the potential impact of general relativity and an expanding universe, see, e.g., Sean Carroll, “Energy Is Not Conserved”, Discover, 22 February 2010 CE.
[17] Ville R.I Kaila and Arto Annila, “Natural selection for least action”, The Royal Society, 22 July 2008 CE.
[18] Suggested reading: Jeremy England, Every Life is on Fire: How Thermodynamics Explains the Origins of Living Things, 2020 CE.
[19] “Carlo Rovelli: ‘Time travel is just what we do every day…’”, The Guardian, 31 March 2019 CE. See also Carlo Rovelli, The Order of Time, 2017 CE and Carlo Rovelli, “[1812.03578] Where was past low-entropy?”, Cornell University, 9 December 2018 CE.
[20] Richard Mortel, “Akhenaten, Nefertiti and three daughters beneath the Aten”, c. 1345 BCE, photograph 2018 CE, CC BY-SA 2.0.
[21] Miriam Lichtheim, translation from Great Hymn to the Aten, circa 1350 BCE, “Ancient Egyptian Literature: Volume II: The New Kingdom” 2006 CE. Possibly authored by the ‘heretic king’ Pharaoh Akhenaten, “Great Hymn to the Aten”, c. 1350 BCE.
[22]
[23] To hurt your head with large numbers, each grain of sand consists of approximately 2 × 10^19 (20,000,000,000,000,000,000) atoms.
[24] “The probability that the sand pile would blow away and land as a sand castle, the least likely random arrangement, is almost incalculably low. But not impossible. It’s just overwhelmingly more likely that the low entropy sand castle turns into a high entropy sand pile. This is why entropy is always increasing because it’s just more likely that it will” (Rich Mazzola, “What is entropy? An exploration of life, time, and immortality”, Medium, 9 May 2020 CE).
[25] The Order of Time, 2017 CE.
[26] See Sir Roger Penrose, Cycles of Time, 2010 CE.
[27] It is interesting that Albert Einstein also proved that the photons of light are effectively timeless in that relativity is only an attribute of the universe for matter (compressed energy having mass). For a photon (if it had consciousness), the distance and time between any two locations in the universe are zero. Time and distance are therefore only a restriction or property that is relevant to compressed energy – matter like us and all life-forms. It is in this sense that time is an ‘illusion’ for living things. At the deepest, simplest level of the universe, time simply does not exist.
[28] On Bekenstein-Hawking entropy, see Wikipedia, “Black hole thermodynamics”.
[29] Although the holographic principle suggests that all the information may be encoded in a smaller dimension on the event horizon; Wikipedia, “Holographic principle”. See also: Wikipedia, “Black hole information paradox”.
[30] “River Man”, Five Leaves Left, 1969 CE.
[31] In fact, this section has been the hardest to write due to the many technical problems I have had as a layman, without easy access to physicists. The risks of getting it wrong technically in this section are therefore the highest (though perhaps I am being kind to myself about the other sections!). I was sorely tempted to regurgitate words by real scientists and hope for the best! Much of the pleasure in writing this is in explaining concepts to myself.
[32] Potential because, as you will know, some water does not make it down, e.g., it stops in a tarn at the top.
[33] Image: John Evans and Howard Periman, “The Water Cycle”, USGS, 2013 CE, public domain.
[34] Transition zones between the river and the sea are very useful places for life as all that energy must continue its flow and transformation. See, for example, Rocky Geyer, “Where the rivers meet the sea: the transition from salt to fresh water is turbulent, vulnerable, and incredibly bountiful”, Oceanus, 2005 CE, 43(1).
[35] Indeed, humans also tap some of this energy directly, e.g., with mills and hydro-electric installations.
[36] Peter Pink-Howitt, "Going to see the river man", AI-art.
[37] That is, without work being required to ascertain the same.
[38] The actual entropy number is proportional to a logarithm of the microstates.
[39] Image: Caravaggio, “The Cardsharps”, 1594 CE, Wikipedia, public domain.
[40] “A Mathematical Theory of Communication”, 1948 CE.
[41] “Agency in Physics”, https://arxiv.org/pdf/2007.05300, 2020 CE.
[42] “It is perhaps easier for us to see the increasing complexity and order of information structures on the earth than it is to notice the increase in chaos that comes with increasing entropy, since the entropy is radiated away from the earth into the night sky, then away to the cosmic microwave background sink of deep space … if the equilibration rate of the matter (the speed with which matter redistributes itself randomly among all the possible states) was slower than the rate of expansion, then the ‘negative entropy’ or ‘order’ … would also increase” (David Layzer, “Arrow of Time”, The Information Philosopher).
[43] See, e.g., Carlo Rovelli, “Memory and information”, 2020 CE.
[44] The number of years in which it is expected that supermassive black holes will lose their energy due to Hawking radiation, after which space may no longer exist (as there may be no matter in it to differentiate between locations or structure the field for such interactions) and the universe may go back to a singularity and then a new Big Bang… or something else!
[45] Charlotte Higgins, “There is no such thing as past or future”, The Guardian, 14 April 2018 CE – an interview with Carlo Rovelli.
[46] It is therefore objective and invariant for all life-forms, though it may be intersubjective when looked at from the perspective of inanimate non-living matter and energy (if it has any perspective!).
[47] Aparna Vidyasagar, “What Are Viruses?”, LiveScience, c. 2016 CE.
[48] Michael Le Page, “A Brief History of the Human Genome”, New Scientist, 12 September 2012 CE.
[49] Peter Pink-Howitt, AI-art, 2024 CE.
[50] Peter Pink-Howitt, “Coriolis Effect”, 2011 CE.
[51] Marek Roland-Mieszkowski, “Life on Earth: flow of Energy and Entropy”, Digital Recordings, 1994 CE.
[52] James Lovelock, Gaia: A new look at life on Earth, 1979 CE.
[53] Wikipedia, “Entropy and life”. We should however be very careful to be clearer about ways in which life-forms are low entropy structures and agents – with unusual agency and different from the surrounding (more random) environment – but also how they increase entropy in their surroundings. After all, life-forms appear to have arisen as the most efficient way to make use of free energy to speed up chemical equilibrium in the atmosphere.
[54] What is life?, 1944 CE.
[55] Wikipedia, “The ultimate fate of the Universe”.
[56] “Fern Hill”, 1945 CE. Image: Peter Pink-Howitt, “Time held me green and dying”, AI-art, 2024 CE.
[57] Quantifying Life, 2019 CE.
[58] Natalie Wolchover, “A New Physics Theory of Life”, Quanta Magazine, 22 January 2014 CE.
[59] Wikipedia, “Abiogenesis”.
[60] “How Life (and Death) Spring From Disorder”, Quanta Magazine, 26 January 2017 CE.
[61] Wikipedia, “Maxwell’s demon”. Image: Htkym, “Increasing Disorder”, Wikipedia, CC BY 2.5.
[62] Wikipedia, “Double-slit experiment”.
[63] Rainer Maria Rilke, “Archaic Torso of Apollo”, 1918 CE.
[64] Jordana Cepelewicz, “Mathematical Simplicity May Drive Evolution’s Speed”, Quanta Magazine, 29 November 2018.
[65] Gregory Chaitin, Toward a Mathematical Definition of Life, 1979 CE.
[66] Christopher P. Kempes, David Wolpert, Zachary Cohen and Juan Pérez-Mercader, “The thermodynamic efficiency of computations made in cells across the range of life”, Philosophical Transactions of the Royal Society, 2017 CE.
[67] Jordana Cepelewicz, “Mathematical Simplicity May Drive Evolution’s Speed”, Quanta Magazine, 29 November 2018 CE. Image from the same source created by Lucy Reading-Ikkanda, fair use assertion.
[68] Ibid.
[69] “Statistical physics of self-replication”, Journal of Chemical Physics, 2013 CE.
[70] What is Life?, 1995 CE.
[71] Santiago Hernández-Orozco, Narsis A. Kiani and Hector Zenil, “Algorithmically probable mutations reproduce aspects of evolution such as convergence rate, genetic memory, and modularity”, 2018 CE. See also: Matheus Sant’ Ana Lima, “Algorithmic-Information Theory interpretation to the Traveling Salesman Problem”, 2019 CE; Hector Zenil and James A. R. Marshall, “Ubiquity symposium: evolutionary computation and the processes of life some computational aspects of essential properties of evolution and life”, Ubiquity, 2013 CE, pp. 1–16; Luís F. Seoane and Ricard V. Solé, “Information theory, predictability and the emergence of complex life”, Royal Society Open Science, 2018 CE, 5(2): 10.31224/osf.io/gmzn5.
[72] Wikipedia, “Algorithmically random sequence”.
[73] “An algorithmic information theory of consciousness”, Neuroscience of Consciousness, Issue 1, nix019, 2017 CE.
[74] Ibid.
[75] Justin K. Pugh, Lisa B. Soros and Kenneth O. Stanley, “Quality Diversity: A New Frontier for Evolutionary Computation”, Frontiers in Robotics and AI, July 2016 CE.
[76] See Adam Frank, David Grinspoon and Sara Walker, “Intelligence as a planetary scale process”, 2022 CE.
[77] The Great Dictator, directed by Charlie Chaplin, 1940 CE, sampled in Paolo Nutini, “Iron Sky”, Caustic Love, 2014 CE. In spirit, the socialist and the religious impulse are the same: a desire to see and use the collective power we have if we work together and believe in common goals.
[78] Image: NASA, ESA and the Hubble Heritage Team (STScI/AURA), “Pillars of Creation”, 2014 CE, Wikipedia, public domain: “The towering pillars are about 5 light-years tall.”
[79] Roger Penrose, Cycles of Time, 2010 CE. Image: ibid. fair use assertion.
[80] Such as those pioneered by John Wheeler and Bryce DeWitt. See also, for example, Tim Anderson, “The Big Bang may be a black hole inside another universe”, Medium, 27 July 2020 CE and Clara Moskowitz, “Do Black Holes Create New Universes? Q&A With Physicist Lee Smolin”, SPACE.com, 2013 CE.
[81] “The Fourth Elegy”, Duino Elegies, 1923 CE, trans. by Stephen Mitchell, Duino Elegies and the Sonnets to Orpheus, 2009 CE.
[82] Laura Getty and Kyounghye Kwon, “3.1: The Bhagavad Gita”, Humanities Libretexts, 18 May 2020 CE.
[83] Life is extraordinary but given the number of years the universe has been in existence (estimated at 14–15 billion) and the number of stars (estimated to be at least 1,000,000,000,000,000,000,000,000), the likelihood that our star system is the only place with life seems low. On the other hand, maybe life is as rare as a beach spontaneously taking the shape of a sandcastle. All the more reason to take care of it.
[84] Wikipedia, “Fermi paradox”.
[85] See Gregory Chaitin, “The Limits of Reason”, Scientific American, 2006 CE.
Peter Howitt