The Simulacrum is True:

Structuralism and the Semantic Universe

Vahid Houston Ranjbar
29 min read · Apr 30, 2022
Photo by Lisa Fotios from Pexels: https://www.pexels.com/photo/green-leafed-tree-under-a-calm-blue-sky-3050970/

Structuralism and Semantics

In the early part of the 20th century the lecture notes of the linguist Ferdinand de Saussure were published under the unassuming title “Course in General Linguistics”. The ideas in this book would profoundly affect the course of philosophy and sociology, giving rise to what is known as Structuralism. Saussure saw linguistics as a subset of the broader subject of semiotics, the science of symbols or signs. He observed that the process of communication involves two components, the signifier and the signified. Yet this relationship ultimately arises out of the signifier’s relationship to other signifiers in a given lexicon. Thus, his key insight was that the meaning of a word or symbol arises out of its relationship with all the other symbols or words in a language. This contradicted the long-standing assumption that words were grounded in actual objects: that they were just names which arose historically when certain vocalizations became attached to certain things.

“…in language there are only differences. Even more important: a difference generally implies positive terms between which the difference is set up; but in language there are only differences without positive terms. Whether we take the signified or the signifier, language has neither ideas nor sounds that existed before the linguistic system, but only conceptual and phonic differences that have issued from the system. The idea or phonic substance that a sign contains is of less importance than the other signs that surround it. Proof of this is that the value of a term may be modified without either its meaning or its sound being affected, solely because a neighboring term has been modified.” (Saussure n.d.)

The result of this thinking was the realization that language’s pretense of referencing some objective and stable reality is an illusion: meanings change as the whole structure of relationships between words changes. This seismic shift in how language was viewed would, years later, help give birth to Post-Structuralism and Post-Modernism.

At around the same time as Saussure, Charles Sanders Peirce took semiotics in a different direction, one which appears to be more in line with the practice of the physical sciences. Unlike Saussure’s binary model of signifier and signified, Peirce claimed that there were three components involved in the communicative process: the thing being referenced, the sign used to indicate the thing, and the interpretant. Thus, he decomposed Saussure’s signified into the referent and the interpretant. He connected this inherent structure of communication to the very form of philosophical logic: “a sign is something, A, which brings something, B, its interpretant sign determined or created by it, into the same sort of correspondence with something, C, its object, as that in which itself stands to C” (Peirce 1902). Thus he viewed “logic as formal semiotic”. He took this idea further and linked semiotics to the very process of thought. This view expanded thought beyond residing only in ‘brains’; as he observed, “Thought is not necessarily connected with a brain. It appears in the work of bees, of crystals, and throughout the purely physical world; and one can no more deny that it is really there, than that the colors, the shapes, etc., of objects are really there” (Peirce 1902).

Photo by David Hablützel from Pexels: https://www.pexels.com/photo/bee-and-beehive-928978/

The American Pragmatist school of philosophy would grow out of the work of Peirce, embodied in his pragmatic maxim: “Consider the practical effects of the objects of your conception. Then, your conception of those effects is the whole of your conception of the object”. The development of American Pragmatism would help lead to an instrumentalist view of science, which saw theories and ideas as tools or instruments to be judged by how well they explain phenomena and make good predictions.

Many of the major practitioners of the physical sciences had for quite a long time effectively jettisoned any pretense of accessing the reality of physical things in and of themselves. James C. Maxwell, the father of electromagnetic theory, mused during his 1873 lecture “Discourse on Molecules”: “…It is only when we contemplate, not matter in itself, but the form in which it actually exists, that our mind finds something on which it can lay hold.” (Maxwell 1873) Boltzmann, the father of statistical mechanics, had himself introduced the concept of theoretical pluralism within the context of semiotics, which appears to have foreshadowed and influenced the development of the philosophical understandings of quantum mechanics and effective field theory. Boltzmann explains:

“Hertz makes physicists properly aware of something philosophers had no doubt long since stated, namely that no theory can be objective, actually coinciding with nature, but rather that each theory is only a mental picture of phenomena, related to them as sign is to designatum. From this it follows that it cannot be our task to find an absolutely correct theory but rather a picture that is as simple as possible and that represents phenomena as accurately as possible. One might even conceive of two quite different theories both equally simple and equally congruent with phenomena, which therefore in spite of their difference are equally correct. The assertion that a given theory is the only correct one can only express our subjective conviction that there could not be another equally simple and fitting image.” (Boltzmann 1899)

Thus, even before the end of the 19th century, there was some questioning among physical scientists as to whether accessing some stable and singular truth about the physical world was achievable. This can in part be understood by considering the very physics of a measurement of some phenomenon. The process requires what Kant would term an ‘a priori form’, that is, a prior structure of knowledge to be filled: in the case of a digital measuring device, for example, a set of digital switches with a particular bit resolution onto which the phenomenon can imprint itself. This bit resolution propagates through the whole production of knowledge, from the measurement to the analytical and numerical output, constrained by the achievable precision of a given calculational algorithm. Thus, whatever stable truth such a process yields is predicated on the resolution and on the questions, or structure, of the initial bits: the organization of the containers which a given measurement will fill.
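
A minimal sketch of this idea in Python, using a hypothetical 3-bit digitizer (the device, ranges and values are illustrative assumptions, not drawn from any particular instrument): whatever the underlying signal ‘really’ is, the record that enters the rest of the analysis chain is only one of the 2³ = 8 available containers.

```python
import numpy as np

def digitize(signal, n_bits=3, v_min=0.0, v_max=1.0):
    """Map a continuous signal onto the 2**n_bits discrete 'containers'
    of an idealized analog-to-digital converter."""
    levels = 2 ** n_bits
    step = (v_max - v_min) / levels
    codes = np.clip(((signal - v_min) / step).astype(int), 0, levels - 1)
    return codes, codes * step + v_min + step / 2  # container index and reconstructed value

# Two different 'true' signals that land in the same container are
# indistinguishable to everything downstream of the measurement.
x1, x2 = 0.501, 0.562
print(digitize(np.array([x1, x2])))  # same 3-bit code for both
```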

Photo by Tim Mossholder from Pexels: https://www.pexels.com/photo/eight-electrical-metric-meters-942316/

The advent of quantum mechanics made this situation even worse, since the act of observation changes the very nature of the objects measured, including the observer. Thus, the very idea of a disconnected object and subject became an impossibility. Considering this, most practicing physicists long ago retreated from the sweeping claims of knowledge about an objective reality that many imagined the sciences were engaged in. Instead, a view more in line with American Pragmatism and Instrumentalism prevailed. That is, they realized that what is meaningful is the building of self-consistent models that can predict measured events at a given resolution and scale. Further, it was understood that these predictions had intrinsic probabilities associated with them and thus could never be totally deterministic. This view is captured on some level by American pragmatists and instrumentalists like Peirce, who saw the pursuit of ‘truth’ as meaningful only in terms of its ‘usefulness’. Theories could be provisional and not encompass pure objective truth, yet still be ‘useful’.

This is different from Popper’s view, where the demarcation between science and non-science is based on falsifiability. That is, one can never be certain that a given theory is true via pure induction, since there is always the possibility of new data that could falsify it; Hume’s black swan, for example. One can only have confidence in a theory to the degree that it can be subjected to possible falsification and survive. Thus, intellectual endeavors which cannot produce falsifiable claims should not be considered science in Popper’s view. However, Popper’s view doesn’t appear to leave enough room for theories which have been falsified but nevertheless produce predictive and useful results at a given scale or regime. In fact, the whole of effective field theory would have trouble living within Popper’s definition of science. Additionally, it doesn’t provide a mechanism to distinguish probable or fruitful theories from unlikely ones. For example, I may claim that there is a fruit tree on Venus, a claim which is possible yet difficult to falsify. But of course, such a theory probably should not be placed on the same footing as an untested prediction of the Standard Model. There is a big difference which falsifiability alone doesn’t capture.

Perhaps a more charitable reading of Popper’s falsifiability might be to suggest that it implies the pragmatists’ usefulness criterion. That is, a theory is useless if it is unable to make predictions and, in the process, reduce the number of variables or free parameters in the theory. For example, one could construct and fit a massive multivariate polynomial function that merely reproduces all relevant observations yet is useless for extrapolating or predicting unmeasured and untested phenomena. Such a construction could not be considered a scientific theory or model.
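
A toy illustration of that last point (a sketch with invented data, not a real measurement): a many-parameter polynomial can be fit to pass through every observation, yet it extrapolates far worse than the simple two-parameter law that actually generated the data.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 5, 12)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, x.size)   # 'observations' from a simple law

simple = np.polyfit(x, y, 1)     # two free parameters
flexible = np.polyfit(x, y, 9)   # many free parameters: hugs the data points

x_new = 7.0  # outside the measured range
print(np.polyval(simple, x_new))    # close to the 'true' value of 15
print(np.polyval(flexible, x_new))  # generally far from it: reproduces data, predicts poorly
```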

Information Theory

The physical sciences would, decades after Saussure, produce their own theory of information with the work of Claude Shannon. His approach appeared to completely discard the semiotic aspect, or ‘meaning’, of a datum. He made this distinction clear when he formulated his theories: the physical, entropic sort of information is termed syntactic, while that which carries abstract meaning is termed semantic (Shannon 1948). Shannon explained that his theory dealt only with the syntactic, and it was his opinion that, for his purposes, the semantic was irrelevant to the engineering problems he was addressing. However, Shannon also introduced the concept of mutual information, which was necessary for his theory of communication. Mutual information represents the quantity of information gathered about one random variable by measuring another random variable. Mutual information in this context is semantic in that it has value relative to some other thing and is not just a metric of potential quantity of information or storage capacity. In Peirce’s understanding, this is the degree to which the interpretant corresponds to the thing being referenced. On this basis one can argue that Shannon’s work was both syntactic and semantic, and that at its root semantics implies information relative to something.
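
A minimal numerical sketch of mutual information (the probability tables are invented for illustration): a ‘channel’ that reports a fair binary source correctly 90% of the time carries about 0.53 bits about the source per symbol, while one that is right only 50% of the time carries none.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits for a joint probability table p(x, y)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Fair binary source observed through a channel that is right 90% of the time...
good = [[0.45, 0.05],
        [0.05, 0.45]]
# ...and through one that is right only 50% of the time (pure noise).
useless = [[0.25, 0.25],
           [0.25, 0.25]]

print(mutual_information(good))     # ~0.531 bits
print(mutual_information(useless))  # 0.0 bits
```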

Claude Shannon

Shannon’s approach linked information back to thermodynamics via their mutual basis in the symmetry, and the breaking of symmetry, of a given physical system. He equated the statistical-mechanics definition of entropy with information. Shannon defined information in terms of bits, binary variables that can be either 1 or 0. For example, a switch or a coin which can be in a heads (1) or tails (0) position can be said to possess one bit of data, whose value is determined by whether it is in the heads or tails position. Shannon showed that an arbitrary set of information or data can be represented as a string of 0’s and 1’s. He realized that the probability of possible states given in statistical mechanics could be re-expressed as the probability of an ensemble of bits being in a 1 or 0 state, or, to use the coin analogy, the probability that a collection of coins would land in a heads or tails position when thrown.

The information storage capacity of a given physical system is given by the equivalent number of bits which can be held in that system. This capacity Shannon equated with its Boltzmann entropy. So, for example, a coin can hold only one bit of data and from an entropy point of view mathematically equals the logarithm of 2, while a six-sided die has an entropy of the logarithm of six. This new formulation of entropy in terms of information physicalized the heretofore abstract concept of information, since all information could now be represented as a binary series of 1’s and 0’s, or physical switches.
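
In formulas (the standard textbook statement, using base-2 logarithms so the result comes out in bits): for a system whose states occur with probabilities $p_i$,

$$H = -\sum_i p_i \log_2 p_i, \qquad H_{\text{coin}} = \log_2 2 = 1 \text{ bit}, \qquad H_{\text{die}} = \log_2 6 \approx 2.58 \text{ bits},$$

so the fair coin and the fair six-sided die hold one bit and roughly two and a half bits respectively.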

Somewhat prior to Shannon, Ronald Fisher, working in the field of mathematical statistics and genetics, popularized and analyzed an approach to estimating the most probable parameter values for an assumed mathematical description of an observed phenomenon. This is known as the maximum likelihood estimation method; in developing it he also created much of the basis for modern statistics as we know it (Fisher 1922). The development of maximum likelihood estimation also introduced another way to define semantic information, known as Fisher information. Fisher information represents the amount of information that an observable carries about an unknown parameter, a concept similar to Shannon’s mutual information. Indeed, both Fisher and mutual information are semantic, and it has since been shown how the two concepts are related, though derived from differing assumptions (Wei 2016).
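
In symbols (the standard modern definition, stated here for orientation rather than in Fisher’s original notation): for data $X$ drawn from a distribution $f(x;\theta)$ with unknown parameter $\theta$, the Fisher information is the expected squared sensitivity of the log-likelihood to $\theta$,

$$\mathcal{I}(\theta) = \mathbb{E}\left[\left(\frac{\partial}{\partial\theta}\ln f(X;\theta)\right)^{2}\right].$$

For a single flip of a coin with unknown bias $p$, for instance, $\mathcal{I}(p) = 1/\big(p(1-p)\big)$: each observation carries the least information about $p$ near $p = 1/2$ and the most when the coin is strongly biased.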

Ronald Fisher

Fisher or mutual information can be illustrated by the hands of a watch, which if correctly set can give information about when the sun will set. Thus it possesses some part of the ‘information’ about the rotation of the earth relative to the sun, or can be said to hold semantic information about the earth-sun system. Science can be understood as the process of incorporating increasing amounts of semantic information about the universe into mathematical models requiring fewer and fewer variables. Through this lens, the Copernican scientific revolution really entailed a change of coordinates that simplified the model, reducing the number of variables and increasing the semantic information about the dynamics of the planets and sun. It wasn’t that an earth-centered universe was incorrect, which in truth is a meaningless statement since one is free to choose any coordinate system. It is that using a heliocentric coordinate system made predictions much easier and more accurate and reduced the number of variables one needed to contend with. Thus, its use dramatically increased the net ‘Fisher’ information of the relevant variables one needs to account for. This is a good way to encapsulate the somewhat ill-defined notion of ‘usefulness’. This view of science is of course counterposed to that of Thomas Kuhn (Kuhn 1962), who argued that scientific discovery was driven more by epistemic shifts in the paradigms employed by the consensus of practicing scientists than by a linear progression of objective knowledge. While psychology and sociology influence the directions science takes, its actual progress is quantifiable in terms of the growth of semantic information about the universe held in our models.

Boltzmann thought that “all salvation for philosophy may be expected to come from Darwin’s theory” (Boltzmann 1974), and perhaps there is some truth to this sentiment, since the production of ‘Fisher’ information might even be what lies under the ‘hood’ of the evolutionary machine. Steven A. Frank, an evolutionary biologist from the University of California, has proposed that the process of natural selection maximizes the Fisher information resident in the genome (Frank 2009). A similar idea was echoed by Christoph Adami, a professor of Microbiology and Molecular Genetics as well as of Physics and Astronomy at Michigan State University. In his article “What is Information?” he claims that what Shannon called syntactic information and equated with entropy is really a measure of uncertainty. Adami proposes a new definition of information as “anything, which can give one the ability to predict an outcome better than chance” (Adami 2016). He goes on to calculate it using Shannon’s mutual information, which is also a form of Fisher information, since the unknown that Fisher information measures might concern an existing, past, or even future state. This sort of predictive or Fisher information, in contrast to mere ‘noise’, appears to be central to the process of structure formation and evolution.

Interestingly, Fisher information might have an even deeper connection to what organizes our universe. Studies within string theory of the AdS/CFT toy-model universe, which relates gravitation in a static, negatively curved spacetime to a highly symmetric version of quantum field theory living on its lower-dimensional boundary, have shown that the calculation of what is termed ‘canonical energy’, an energy related to changes in the gravitational field, is actually a form of Fisher information in the quantum field theory (Lashkari and Van Raamsdonk 2016). This suggests that the very structure of space and time can be cast as a type of semantic information.

Order, Causation and Mutual Information

The relationship between the creation of order, or symmetry breaking, and Fisher information can be illustrated by considering the Maxwell’s Demon paradox, devised by James Clerk Maxwell, the father of electromagnetic theory (Knott 1911). He imagined a thought experiment that challenged the ideas enshrined in the second law of thermodynamics, specifically the idea that for a system of particles in thermal equilibrium, where all the fast- and slow-moving particles are completely mixed, no more work or energy can be extracted.

Maxwell’s Demon thought experiment visualized. A mixed system of moving black and white particles (top) is sorted by a Demon opening and closing a door (bottom).

Maxwell imagined a box containing this distribution with a wall dividing it into two sides. In the wall there is a door, controlled by some demon that opens it only for fast-moving particles and keeps it shut for the slow ones. In this way, over time, all the fast particles come to reside on one side of the box, leaving the slow particles on the other. In this situation, a heat engine could be run from the differential in temperature, extracting work in apparent violation of the second law of thermodynamics. For many years Maxwell’s Demon challenged the understanding of entropy and the second law. At first some thought the energy used by the demon to open the door and sort the particles would resolve the problem, yet it wasn’t until the merging of the new field of information theory with thermodynamics in the mid-20th century that the paradox was finally resolved, and Maxwell’s Demon was understood to be what is known as an ‘information’ engine: a machine which creates work by creating information. This solution required understanding the physical nature of ‘information’, manifested as the symmetry breaking of some physical system.

It was Leo Szilard who first recognized that the solution to this problem lay in accounting for the measurement and recording of the information about a particle’s speed and trajectory which permitted Maxwell’s Demon to sort hot from cold particles (Szilard 1929). To make a measurement presupposes recording it on some medium, creating a memory, however temporary it may be. Later, Landauer quantified the minimal entropy cost of the memory storage required for a given measurement. Here, the information gathered by the demon regarding the velocity of each particle represents a rise in entropy, because this information needs to be stored on some physical medium whose initial entropic state has to be considered. So, for example, a magnetic tape, which stores bits of information as zeros and ones, must first be initialized so that all the bits are in either a one or a zero state. To store a new bit of information, a bit is then flipped, or not, to register a 0 or a 1. The initialization places the tape into a lower entropic state, which is then given up as information is recorded. In the end, the work required to reset this memory consumes more energy than was extracted, thus preserving the second law. This is known as Landauer’s erasure principle (Landauer 1961).
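
The accounting can be made quantitative (these are the standard bounds associated with Szilard and Landauer, stated here rather than derived): a one-particle Szilard engine driven by one bit of measurement can extract at most $k_B T \ln 2$ of work per cycle, while resetting (erasing) that one bit of memory dissipates at least $k_B T \ln 2$ of heat, so over a full cycle the ledger never comes out ahead:

$$W_{\text{extracted}} \;\le\; k_B T \ln 2 \;\le\; Q_{\text{erasure}}.$$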

Maxwell’s Demon must record a memory on some device in order to sort. Creating a memory, even a temporary one, requires breaking the symmetry of the memory device.

The Maxwell Demon heat engine is now understood to be a type of information engine, which generates information in the process of doing work. Fundamentally it can be understood as the trading of symmetry between two systems: the symmetry of the spatial distribution of fast and slow particles is exchanged for the symmetry of the bit registers of a memory device. To achieve the broken symmetry of having most of the fast particles on one side and the slow on the other, a memory device loses its broken symmetry, with all its bits now quasi-randomly flipped to a zero or a one. Yet the nature of this information is not purely syntactic but semantic, in that the bits in the register carry information relative to the particles’ positions and velocities. To understand this better, consider a poorly functioning demon whose measurements don’t distinguish between the fast- and slow-moving particles, so that it can’t sort effectively. It would generate information, yet this information would have little to do with the distribution of particle velocities and positions. Clearly, semantic or mutual information between the demon’s memory and the particles must be developed for symmetry breaking, or order, to arise in the enclosed gas. This semantic information is ultimately rooted in the quality of the measurement.
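
A toy simulation of this poorly functioning demon (a sketch with invented numbers, not a proper thermodynamic model): particles are tagged fast or slow, the demon’s sensor reports the tag correctly with probability q, and the quality of the final sorting tracks how much the demon’s memory actually says about the true speeds.

```python
import numpy as np

rng = np.random.default_rng(1)

def run_demon(q, n=100_000):
    """Sort n particles using a sensor that reads 'fast'/'slow' correctly with probability q."""
    fast = rng.random(n) < 0.5                  # true tags: half fast, half slow
    correct = rng.random(n) < q
    reading = np.where(correct, fast, ~fast)    # the demon's memory bits
    # The demon opens the door only for particles it *reads* as fast,
    # so the 'hot' side ends up holding the particles with reading == True.
    return np.mean(fast[reading])               # fraction of genuinely fast particles there

for q in (1.0, 0.9, 0.5):
    print(q, run_demon(q))
# q=1.0 -> purity 1.0 (perfect sorting); q=0.9 -> ~0.9; q=0.5 -> ~0.5 (no sorting at all)
```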

Shannon’s mutual information has more recently been applied to neurological models to address the question of mental causation or ‘Will’. In an article entitled “When the Map Is Better Than the Territory” (Hoel 2017), Erik Hoel pays what seems to be an interesting homage to Baudrillard’s meditation on Borges’ tale of cartographers who render a map as large as the Empire itself. In this paper he argues that the progression of emergence from the micro-scale to the macro can be given new understanding using the framework of information theory.

Hoel uses Shannon’s analysis of information transmission to develop a measure called ‘effective information’, based on mutual information. It essentially quantifies how much knowledge of each state in a multi-state system, such as a set of neurons, when combined in some manner (i.e. averaged), reduces the uncertainty about the future of that system. The paper claims that this re-scaled information acquires predictive abilities which exceed those of the individual parts for certain systems. Casting causal structure as a type of communication channel, he shows that the macro scale can possess more ‘effective information’ than the micro-scale and thus more causal power. Here causal power is gained via a process identical to how reliable information can be transmitted over a noisy channel using error-correcting codes. This brings us back to a theory of information, which Ferdinand de Saussure’s work in semiotics really represents. Yet Hoel now shows how the acquisition of predictive power by a given collective network is what bestows agency on that system. This insight can be applied to linguistics to provide yet another metric of power for a given system of symbols and for judging a translation.
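
A minimal sketch of the effective-information calculation (the transition tables below are invented for illustration, in the spirit of Hoel’s examples): micro states 0, 1 and 2 wander noisily among themselves while state 3 maps to itself, and grouping {0, 1, 2} into a single macro state makes the dynamics deterministic, raising the effective information from about 0.81 bits to 1 bit.

```python
import numpy as np

def effective_information(tpm):
    """EI of a transition-probability matrix: the mutual information between a
    maximum-entropy (uniform) intervention on the current state and the next state."""
    tpm = np.asarray(tpm, dtype=float)
    effect = tpm.mean(axis=0)                 # next-state distribution under uniform interventions
    nz = tpm > 0
    kl_terms = np.zeros_like(tpm)
    kl_terms[nz] = tpm[nz] * np.log2(tpm[nz] / np.broadcast_to(effect, tpm.shape)[nz])
    return kl_terms.sum(axis=1).mean()        # average KL divergence of each row from the effect

micro = [[1/3, 1/3, 1/3, 0],   # states 0,1,2 hop uniformly among {0,1,2}
         [1/3, 1/3, 1/3, 0],
         [1/3, 1/3, 1/3, 0],
         [0,   0,   0,   1]]   # state 3 is fixed

macro = [[1, 0],               # macro state A = {0,1,2} -> A
         [0, 1]]               # macro state B = {3}     -> B

print(effective_information(micro))  # ~0.81 bits
print(effective_information(macro))  # 1.0 bit: the coarse description has more causal 'grip'
```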

Structuralism in Physics and Platonic Forms

Photo by Serinus from Pexels: https://www.pexels.com/photo/arched-glass-windows-on-cylindrical-concrete-wall-5838948/

Much as Saussure observed in the context of language, in science one only has access to the measurable relationships between things and not to the things themselves. Thus, if something cannot be measured, directly or indirectly, relative to something else, then by definition it doesn’t exist in our universe, because we have no way of interacting with it. It is not in our ‘lexicon’. For example, when we speak about substances in the physical world, we can only reference their attributes in relation to other things’ attributes. We can never ‘know’ what an electron is; rather, we can only establish its measurable quantities, charge, mass and spin (intrinsic angular momentum), relative to some other object’s quantities.

Many, at this point in the argument, despair that true ‘reality’ is beyond our reach, when in fact Plato’s eternal forms are staring us in the face. These relationships are the ‘forms’ and are the ‘true’ reality. They are mathematical and thus synonymous with what one could call Form; further, by the admission of the structuralist thesis, they are what make our reality. It is due to this fact that some theoretical physicists, in an apparent return to Pythagorean philosophy, entertain a much more radical view of math: that it somehow represents the true nature of reality. Thus, the physical universe is not just described by mathematics, but is mathematics.

To recognize this, imagine the following thought experiment: if tomorrow all the objects in the universe were replaced with other objects, yet exactly the same mathematical relationships between them were maintained, there would be no way we could tell the difference; no measurement or experiment would allow us to distinguish the two universes. In fact, one could maintain that they are the same and, thus, that what “makes” our universe what it is, is given by the math. Hence it is these mathematical relationships which determine what things are, not any reference to a primal or elementary substance. They ultimately represent semantic information. This view is called Structural Realism, in particular Ontic Structural Realism (Ladyman 2020). I would link these structures directly to Platonic forms; thus, ultimately, my view represents a version of Platonism.

This view of math and reality obviates the need for a classical Cartesian dualism between matter and Forms, since it suggests that what we think of as “real” actually is relational information, or really exists in the realm of Forms. The apparent or imagined dividing line between Forms and matter concerns the ability to perceive or measure these relationships and so deduce the Forms. So far, the forms that we ordinarily observe manifest the relationships inherent in composition and decomposition. However, with the advent of modern physics there is evidence for the existence of fundamental particles, things which are not composite: atoms in the classical Greek understanding of that word. Thus, for example, in the case of an electron, physicists believe it is fundamental and not composed of even smaller units. While individual instances of an electron can pop in and out of existence, they are all exactly the same ontological thing. Modern field theory postulates that there is an ever-present “sea” of these fundamental particles everywhere in space, electrons, photons, quarks, and so on, which can at any point in time be summoned up, in the same way that the number 5 is the same ontological thing every time I write it down. Thus, the electron is an eternal form without the usual properties of composition and decomposition to which we are accustomed. Heisenberg, one of the co-inventors of quantum mechanics, argued from the outset that the fundamental particles described by the new quantum mechanics should properly be identified with Plato’s forms and were not a true ‘material’ reality. He described elementary particles as “comparable to the regular bodies of Plato’s Timaeus. They are the original models, the ideas of matter” (Heisenberg 1971).

I believe we should take Heisenberg’s proposition very seriously. It offers a resolution to the problem concerning the ontological status of the quantum state function that is consistent with our current mathematical treatment. The state function is explicitly non-physical by virtue of the fact that it is, in mathematical parlance, complex or imaginary. Further, it only generates physically measurable observables, like momentum or position, after it is integrated, in a mathematical process very akin to how a shadow, or projection, of a higher-dimensional object onto a lower dimension might be calculated. When the shadow of an object on a two-dimensional surface is calculated, one must likewise integrate, or ‘add up’, over all the light rays crossing the object. In fact, the mathematical process of integration in certain contexts is referred to as a ‘projection’ operation. Thus, the relationship between the state function and the physically measurable quantities is almost literally analogous to how Plato imagined the shadows of the idealized forms were cast to comprise the observed physical world.
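
Concretely (standard quantum mechanics rather than anything specific to Heisenberg’s remark): the complex-valued state function $\psi(x)$ yields a real, measurable number only after it is summed over, for example the average position

$$\langle x \rangle = \int \psi^{*}(x)\, x\, \psi(x)\, dx, \qquad \text{and more generally} \qquad \langle \hat{A} \rangle = \langle \psi | \hat{A} | \psi \rangle,$$

an operation formally similar to projecting a higher-dimensional object down onto the plane of what can actually be observed.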

While there exist several interpretations of quantum mechanics that attempt to rescue the physical realism of the state function, in the final analysis their approaches are scarcely different from the understanding of a Platonic form. For example, though Bohmian mechanics postulates a physical realism for the state function, the non-local physical properties required by this theory do such violence to the very concept of ‘physicality’, of things existing in defined locations in space and time, that it is scarcely different from how a religious person might describe a ‘spirit’ as being apart from space and time (Goldstein n.d.). In fact, Bohmian mechanics was quickly picked up by followers of various branches of eastern mysticism to validate their views (Horgan n.d.).

This is true for every viable interpretation of the quantum state function. The many-worlds theory put forward by Hugh Everett postulates that at every measurement of the state function the universe bifurcates into multiple separate universes representing each probable outcome (Everett 1956), while the more prevalent Copenhagen-inspired interpretations treat the state function as only a mathematical reality. In all these schools the traditional notion of what we call a ‘physical’ thing is upended in various ways: by denying it properties of space and time, or properties of a singular identity, or by placing it solely in the realm of mathematical abstraction. Thus, the state function of a particle presents itself as a strong candidate for something which could be identified with Plato’s forms.

In 1996, Carlo Rovelli developed an interpretation of quantum mechanics known as relational quantum mechanics (RQM), which appears to have strong support in recent experimental and theoretical work rooted in the Wigner’s friend thought experiment. This interpretation fits extremely well with Ontic Structural Realism. In RQM the quantum state function is treated as observer-dependent. Thus quantum mechanics really entails a description of the relationships between physical systems, and these relative relationships provide a complete description, not requiring anything more. One of the strong supports for this view is a recent test (Proietti et al. 2019) of a no-go theorem for observer-independent facts developed by Časlav Brukner (Brukner 2018).

There is another line of evidence which supports the Platonic view. This is due to the fact that in quantum mechanics, fundamental physical interactions appear to be governed by probability and not pure determinism. If we assume that the universe is spatially infinite, this leads to a Level I multiverse according to the often-used multiverse classification scheme developed by Max Tegmark (Tegmark 2014). Such a multiverse is characterized by an infinitude of identical or ‘parallel’ worlds. This is because probabilistic physics operating over any kind of infinity will “almost surely” yield all outcomes that have a non-zero probability of occurring, even if that probability is vanishingly small. This means that, in a universe with infinite space, all these forms must exist with an infinite number of occurrences. Furthermore, if the universe is eternal, these forms will occur an infinite number of times; that is, all physical configurations are eternal.

One can understand it this way. If we imagine that each physical configuration represents the side of a die with an extremely large yet still finite number of sides, and we are free to roll that die an infinite number of times, then we will “almost surely” explore all the sides of that die, not just once but an infinite number of times. The key assumption here is that the number of sides, or distinct physical configurations, is finite. It is not clear whether this is in fact the case, though there is some evidence to suggest that it is: namely, that space-time might not be infinitely divisible and that there exists some finite division of space and time, placing a limit on the number of distinct physical configurations. The existence of eternally repeating forms suggests that, in fact, all possible physical configurations are eternal and exist outside of space and time; each represents a type of eternal Platonic Form.
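
The “almost surely” here is just elementary probability (the uniform die is an idealization of the argument above, not a physical claim): if a configuration is one of $N$ possible sides and each independent roll turns it up with probability $1/N$, the chance of never seeing it in $n$ rolls is

$$P(\text{never in } n \text{ rolls}) = \left(1 - \tfrac{1}{N}\right)^{n} \longrightarrow 0 \quad \text{as } n \to \infty,$$

and by the second Borel–Cantelli lemma each side in fact turns up infinitely often with probability one.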

Being as a Relationship

Photo by ROMAN ODINTSOV from Pexels: https://www.pexels.com/photo/monochrome-photo-of-ancient-pillars-on-ruins-6422091/

Another expression of the primacy of mathematical relationships can be found in the Ship of Theseus thought experiment, echoing Heraclitus and related in Plutarch’s Life of Theseus. Here a famous ship which was sailed by Theseus is preserved for over a century in the harbor by replacing wooden parts as they rot or break, until ultimately every part of the ship has been replaced. The question is whether this is now the ‘same’ ship which was sailed by Theseus. The correct view might be that indeed it is the same ship to the degree to which all the fundamental particles relate to each other in exactly the same way. So, if one imagines it is possible to maintain exactly the same relative positions of the carbon, hydrogen, iron and other elements, then it should be considered the same ship, because its identity is defined by these relative relationships.

This is, on some level, what defines a ‘living’ thing: a living thing is considered living insofar as it is able to maintain these relative relationships. So, for example, a living organism will swap out countless cells during its lifetime yet maintain its identity by preserving the overall relationships. Thus, our true being lies in these relationships, and it is to mathematics that one must turn to understand our physical origin and basis, since it is these relationships which determine the physical nature of existence.

The fungibility of matter with respect to measured physical attributes became explicit with the birth of renormalization group theory. Grappling with the limits and recursive nature of their new field theories, physicists by the early 1970s had managed to quantify the level of resolution at which the physical phenomena of a self-referential system (much like the one Saussure had imagined exists in language) become identical. Such a description was called an effective field theory, indicating that a given field theory is exact up to a certain well-defined resolution. Until then all field theories had suffered because they assumed an infinite resolution, which in turn introduced infinite values when one attempted to calculate the inherently recursive contributions of the vacuum fields. To solve this, one had to introduce a cut-off or resolution limit ad hoc (Ma 1973).

Wilson (Wilson 1971) and Kadanoff (Kadanoff 1966) went on to show how renormalization group theory could in fact explicate the dynamics and emergence in collective systems like liquids, superconductors, and magnets: how and why the very states and phases of matter appear, how their observed physical attributes change with scale, and how these systems change their emergent physical properties. They discovered that in several instances they do so in a universal manner, independent of the substance. Thus the phase transitions of liquid, superconducting, magnetic and many other systems behave in an identical and universal manner at these points. This, it would seem, might further relax the level of equivalence necessary to achieve a phenomenological identity between different types of matter at a given scale.
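
A cartoon of the Kadanoff block-spin idea behind these results (a sketch only; real renormalization-group calculations track how the couplings flow under this operation, which this toy does not): a grid of up/down spins is coarse-grained by majority vote over 2×2 blocks, producing a new, lower-resolution configuration of the same kind that can be coarse-grained again.

```python
import numpy as np

rng = np.random.default_rng(2)

def block_spin(spins):
    """Coarse-grain a 2D array of +/-1 spins by majority vote over 2x2 blocks."""
    n = spins.shape[0] // 2
    blocks = spins.reshape(n, 2, n, 2).sum(axis=(1, 3))  # sum over each 2x2 block
    coarse = np.sign(blocks)
    ties = coarse == 0
    coarse[ties] = rng.choice([-1, 1], size=np.count_nonzero(ties))  # break ties randomly
    return coarse

spins = rng.choice([-1, 1], size=(64, 64))   # a random, high-temperature-like configuration
print(spins.shape, block_spin(spins).shape, block_spin(block_spin(spins)).shape)
# (64, 64) (32, 32) (16, 16): the same kind of object, described at ever coarser resolution
```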

Further, they discovered that while multiple full theories were possible and identical above a given resolution, these theories required a certain form to be self-consistent and renormalizable. Thus, in fact, there are universal truths that are part of even a self-referential theory. This conclusion effectively falsifies the naïve but common conclusions of pure relativism: not all systems of knowledge are in fact valid. One cannot construct a coherent and predictive structure from just any collection of symbols with relative relationships; further, not every translation of a given language captures the predictive power and self-consistency of one language in another. In a 1977 interview hosted by Bryan Magee, Quine saw the need for “..the development of some conceptual scheme to take the place of the untenable old-fashioned theory of meaning, something new and something more acceptable in the way of theory of what goes into good translations” (Magee 1977). I would suggest that the assessment of the predictive capacity of a translation could supplant or re-define a theory of meaning in the context of Fisher or mutual information. Ultimately the efficacy of a given translation is judged against its predictive capacity, or how much semantic information it has captured. So, for example, if my translation doesn’t communicate the fact that a certain glass of water is full of poison, then I have failed in my translation, and this is not equivalent to a translation that captures this piece of ‘Fisher’ information. This is also true with regard to the lexicon, or map of words to words, that creates internal self-consistency. Meaning, in the context of mutual information and predictive capacity, also serves as a necessary component of the ordering process of nature, as we saw with the Maxwell Demon system. Thus it concretizes meaning, a heretofore ill-defined concept.

The Simulation

Photo by Pixabay: https://www.pexels.com/photo/blue-bright-lights-373543/

Baudrillard’s Simulacra and Simulation (Baudrillard 1983) would take the Structuralist premise and assert that society had, by the modern age, constructed knowledge systems that were far removed from any physical reality. Our ‘maps’ of reality had replaced ‘reality’; thus the average experience of even something as traumatic as ‘war’ was so far removed from the reality of war that it existed only in the CNN newsroom and the media as a construct. Modern society was so far removed from the experience of the ‘real’ that we were in danger of ‘disappearing’, of having our very personalities and identities become completely unreal the further we drifted from the ‘real’. Thus our wars, foods and very identities were simulations of simulations, in an ever more recursive movement away from reality which would ultimately end in our ‘disappearance’ (Baudrillard, Why Hasn’t Everything Already Disappeared, n.d.).

In reality, a digital simulation represents a sort of substitution of ‘matter’. A digital simulation of any physical phenomenon is an attempt to recreate the mathematical relationships of physical matter within the physics of a digital computer. It also should not escape notice in this connection that this is why the recently popularized simulation hypothesis put forward by the likes of Nick Bostrom in “Are We Living in a Computer Simulation?” (Bostrom n.d.) really amounts to a retelling of Plato’s cave analogy. Bostrom posits that our consciousness most probably belongs to agents in an advanced simulation, based on extrapolations of existing technology and probabilistic arguments. I find it curious that Baudrillard’s Simulacra, by inspiring the movie The Matrix, on some level helped create the cultural and intellectual ecology for much of the interest in the simulation hypothesis, despite the fact that Baudrillard hated the film and thought it misunderstood the book so completely as to become an agent of the very power he warned against. In an interview with Le Nouvel Observateur he comments, “The Matrix implies the present situation is the one of an all-powerful superpower and so effectively echoes its propagation. Ultimately, the spread of this takeover is indeed a very part of the movie. As McLuhan said: message is medium. The message of The Matrix is its very propagation, by relentlessly contaminating everything.” (Baudrillard n.d.)

Yet beyond the social and political implications of Baudrillard’s thesis, he himself misses the fact that his irony is not irony: the Simulacrum is indeed true, and this is the fundamental fact which evaded the postmodernists. In that same interview he misunderstands, or rather incompletely understands, Plato: “..the real nuisance in this movie is that the brand-new problem of the simulation is mistaken with the very classic problem of the illusion, already mentioned by Plato. Here lies the mistake.” And later: “The Matrix is in many respects an extravagant thing, both naïve and pervert, with no above and no beyond. The pseudo-Freud speaking at the end of the movie said it: sometime in the past, we have had to re-program the Matrix in order to integrate some anomalies in the equation. And you, the opponents, are part of it. So here we are, in my opinion, in a complete virtual circuit with no outdoor. I once again disagree!” (Baudrillard n.d.) Thus both Baudrillard’s and Bostrom’s views suffer, since neither takes the argument to its logical and seemingly obvious conclusion: that of infinitely recursive simulations with no real base reality. The Simulacrum is the Truth; the circuit is the ’great outdoors’. This is the ultimate primacy of ‘Form’ and the subversion of materialism which Plato’s allegory makes clear.

Bibliography

Adami, Christoph. 2016. “What is Information?” (Phil. Trans. R. Soc) 374 (2063).

Baudrillard, Jean. n.d. Interview with Le Nouvel Observateur on The Matrix. https://web.archive.org/web/20080113012028/http://www.empyree.org/divers/Matrix-Baudrillard_english.html.

— . 1983. Simulacra and Simulation. Semiotext(e).

— . n.d. Why Hasn’t Everything Already Disappeared.

Boltzmann, L. 1899. On the Development of the Methods of Theoretical Physics in Recent Times.

Born, Max. 1955. “Statistical Interpretation of Quantum Mechanics.” (in Science) 122: 675–79.

Bostrom, Nick. n.d. “Are We Living in a Computer Simulation?” (Philosophical Quarterly) 53 (211): 243–255.

Brukner, Časlav. 2018. “A No-Go Theorem for Observer-Independent Facts.” (Entropy) 20 (5).

Derrida, Jacques. 1976. Of Grammatology. Les Éditions de Minuit.

Everett, Hugh. 1956. The Many-Worlds Interpretation of Quantum Mechanics. Dissertation. Princeton University.

Proietti, Massimiliano, Alexander Pickston, Francesco Graffitti, Peter Barrow, Dmytro Kundys, Cyril Branciard, Martin Ringbauer, and Alessandro Fedrizzi. 2019. “Experimental test of local observer independence.” (Science Advances) 5 (9).

Fisher, R. A. 1922. “On the mathematical foundations of theoretical statistics.” (Philosophical Transactions of the Royal Society ) 222 (594–604).

Foucault, Michel. 1971. “ Nietzsche, Genealogy, History.” (Hommage a Jean Hyppolite).

Frank, Steven A. 2009. “Natural selection maximizes Fisher information.” (Journal of Evolutionary Biology).

Goldstein, Sheldon. n.d. Bohmian Mechanics. https://plato.stanford.edu/archives/sum2017/entries/qm-bohm. The Stanford Encyclopedia of Philosophy.


Heisenberg, W. 1971. Physics and Beyond: Encounters and Conversations. Harper & Row.

Hoel, Erik P. 2017. “When the Map Is Better Than the Territory.” (Entropy) 19 (188).

Horgan, John. n.d. David Bohm, Quantum Mechanics and Enlightenment. https://blogs.scientificamerican.com/cross-check/david-bohm-quantum-mechanics-and-enlightenment/. Scientific American .

Kadanoff, Leo P. 1966. “Scaling laws for ising models near Tc.” (Physics Physique Fizika) 2 (263).

Knott, Cargill Gilston. 1911. “Quote from Undated letter from Maxwell to Tait.” In Life and Scientific Work of Peter Guthrie Tait.

Ladyman, James. 2020. “Structural Realism.” The Stanford Encyclopedia of Philosophy (Winter 2020 Edition). https://plato.stanford.edu/archives/win2020/entries/structural-realism/.

Landauer, R. 1961. “Irreversibility and Heat generation in the computing process.” (IBM Journal of Research and Development.) 5 (3).

Lashkari, Nima, and Mark Van Raamsdonk. 2016. “Canonical energy is quantum Fisher information.” (Journal of High Energy Physics) 153: 1–25.

Boltzmann, Ludwig. 1974. Theoretical Physics and Philosophical Problems: Selected Writings. Edited by B. McGuinness.

Ma, Shang-keng. 1973. “Introduction to the Renormalization Group.” (Reviews of Modern Physics ) 45 (50).

Magee, Bryan. 1977. Ideas of Quine (interview with Willard Van Orman Quine). https://youtu.be/rVFR1qJAyf0.

Maxwell, James C. 1873. “Discourse on Molecules.”

Peirce, Charles S. 1902. “The New Elements of Mathematics.” 4: 20–70.

Saussure, Ferdinand de. n.d. COURSE IN GENERAL LINGUISTICS.

Shannon, Claude. 1948. “A Mathematical Theory of Communication.” (Bell System Technical Journal).

Szilard, Leo. 1929. “On the reduction of entropy in a thermodynamic system by the intervention of intelligent beings.” (Zeitschrift für Physik) 53 (11–12).

Tegmark, Max. 2014. Our Mathematical Universe: My Quest for the Ultimate Nature of Reality. Random House.

Wei, Xue-Xin. 2016. “Mutual Information, Fisher Information, and Efficient Coding.” (Neural Computation, MIT Press) 26 (2).

Wilson, Kenneth G. 1971. “Renormalization Group and Critical Phenomena. I. Renormalization Group and the Kadanoff Scaling Picture.” (Physical Review B) 4 (3174).

Vahid Houston Ranjbar

I am a research physicist working on beam and spin dynamics. I like to write about connections between science and religion.