THE VIRTUAL ANATOMY OF HYPERSTRUCTURES
(The philosopher must be) like a child begging for "both"; he must declare that reality or the sum of things is both at once: all that is unchangeable and all that is in change. Plato, The Sophist.
Ever since Anaximander speculated that the substance of the cosmos is the nonlimited, apeiron, our perception of reality has been determined, to a great extent, by the conceptual models we have of it. This was true in the past and is still true for us now. Reality is a concept that is limited by the nature of our conceptual models. Anaximander believed in a circular conception of time and the eternal recurrence of the same. The Judaeo-Christian legacy of linear time, however, has prevailed, at least for the history of mankind so far. Perhaps, from a science-fictional standpoint, the era of civilisations may be measured, someday, by the forms of global temporalisations that are every bit as spatial as they are eventual. Universal history, wrote Borges in his essay on the "Fearful Sphere of Pascal", may be the history of a handful of metaphors. It is a history of displacement and condensation that maps out a manifold trajectory of involution and evolution, of endophysics and exophysics, and, most significantly, of the collapse of the closed world towards the infinite universe. The projection of universal narratives, each with a claim to being absolute, is, no doubt, over. The 20th century will probably be known, among other things, as the century that suspended the quest for metaphysics as part of the fulfilment of the programme of the Enlightenment.
With only a few years left towards the end of the second millennium of the Christian era, there is a glimmer of realisation that metaphysics is a necessity if we are to make any sense of the ubiquitous state of affairs into which we are thrown. To say that metaphysics is nonsense, as we now realise, is either ludicrous, since there is no statement devoid of presuppositions that are fully accounted for, or itself a metaphysical statement. One of the sources of traditional metaphysics, claimed the logical positivists, is the misuse of language. Ironically, the attempt to demystify its use, to establish a clear and distinct specification of its structure and meaning, has invariably led to metalinguistics and genealogies that are, more often than not, deeply tainted with the speculative logic that characterises some of the best metaphysicians of the past. As Levinas stated, when commenting on Derrida: attempting to deconstruct metaphysics is more metaphysical than metaphysics itself.
The meaning of the prefix meta- points to a conjunction of two terms, about and beyond. Every theory of information carries with it this twin desideratum. Semantic holism states that the meaning of a word cannot be reduced solely to its atomic definition, but must be accessed as a function within the field of state-space semantics. By extension it is inferred that there is no discourse that is not already implicated within the conceptual space of some meta-discourse, even if the nature of that association is yet to be explored or established. Gregory Chaitin, the metamathematician from IBM, remarked during the "On Limits" conference that the theory of incompleteness and undecidability, as discovered by Gödel, developed by Turing and further extended by him into an effective theory of algorithmic complexity, is only the tip of an iceberg of an underlying mathematical reality.
The number of mathematical objects is much larger than the number of atoms in the universe, and the universe of mathematics is much more extensive than the physical universe with which physics is concerned. Stephen Smale, a chaos mathematician from Berkeley, facetiously pointed out during the "Chaos" conference that the physical universe is not large enough to hold all the fractals there are in fractal geometry. The tacit awareness of this mathematical state of affairs has led some mathematicians to assert the existence of an archaic mathematical reality that is, essentially, invisible but real. This is a principle of existential generalisation, the resolution of which, in the foundations of mathematics, is far from over.
"When God calculates and exercises his thought, the world is created", says a marginal annotation to the Dialogue on the Connection Between Things and Words of 1677 by Leibniz, one of the first rationalist philosophers to work out the properties of the binary number system, which of course has turned out to be fundamental for computer science. Unaware of the limits of computability at the time, but fully aware of the combinatorial exhaustion of knowledge in calculating the size of a book that would contain all true, false and meaningless propositions, Leibniz proposed a universal calculus in De Arte Combinatoria that could compute, or rather calculate, every set of relationships based on a system of combinations by means of a characteristica universalis. With regard to the architectonics of geometrical harmony, however, Leibniz relies on the principle of continuity, a principe de l'ordre général, which he developed as a calculus of indiscernibles. It requires that the lawfulness of phenomena be conceived as expressing a systematic integration of individual real elements beyond the level of empirical sequences. He transforms the method of calculus de maximis et minimis into a method de formis optimis applicable to the real world, a form of geometrical teleology that optimises and reveals the internal laws as sufficient reasons that regulate the harmony throughout nature.
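The binary arithmetic Leibniz investigated can be sketched in a few lines. The following is my own trivial illustration, not Leibniz's calculus: every natural number has a unique expansion in powers of two, the representation underlying all modern computing.

```python
def to_binary(n: int) -> str:
    """Base-two expansion of a natural number, least significant bit first."""
    if n == 0:
        return "0"
    bits = []
    while n:
        bits.append(str(n & 1))   # extract the lowest-order bit
        n >>= 1                   # shift down to the next power of two
    return "".join(reversed(bits))

assert to_binary(6) == "110"                # 6 = 4 + 2
assert to_binary(1677) == bin(1677)[2:]    # agrees with Python's built-in
```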
Cassirer pointed out that nothing characterises more the shift from the substance of things to the substance of relations than the calculus of indiscernibles proposed by Leibniz. Meanwhile, as John Wheeler, an American scientist, remarked, nothing so much distinguishes physics as conceived today from mathematics as the difference between the continuum-based formulations of the one and the discrete character of the other. In the article entitled "It from Bit", Wheeler also suggests that "it from bit" symbolises the idea that the physical world has an immaterial source and explanation that is information-theoretic in origin. Nothing characterises the implementation of discrete logic more than the determinate state transitions produced by the emergent computations of Cellular Automata (CA). Stephen Wolfram, a computational physicist, introduced a dynamical classification of CA behaviour and speculated that one of his four classes supports universal computation. Instead of relying on differential equations, a mathematics of continuity, to describe the behaviour of nature, Wolfram investigates the dynamics of CA, discrete state transitions that behave similarly to the dynamics of physical systems. The field of Artificial Life, triggered by the emergent properties of CA behaviour, has produced concepts of phase transitions that are computations at the edge of chaos. The guiding hypothesis is that life emerges at this periphery in a second-order phase transition, referred to as the liquid regime poised between the solid and the gaseous regimes.
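The determinate state transitions described above can be made concrete with a minimal one-dimensional cellular automaton. The sketch below is my own illustration, not Wolfram's code; Rule 110 is chosen because it belongs to his fourth class and was later proved to support universal computation.

```python
def step(cells, rule=110):
    """One synchronous update of an elementary CA (periodic boundary)."""
    n = len(cells)
    out = []
    for i in range(n):
        # Read the three-cell neighbourhood: left, centre, right.
        left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (centre << 1) | right  # neighbourhood as 0..7
        out.append((rule >> index) & 1)              # look up the rule's bit
    return out

def run(width=31, steps=15, rule=110):
    """Evolve a single live cell and print the discrete state transitions."""
    cells = [0] * width
    cells[width // 2] = 1
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

if __name__ == "__main__":
    run()
```

The rule number itself is the lookup table: bit k of 110 gives the next state of a cell whose three-cell neighbourhood reads k in binary, which is precisely Wolfram's numbering scheme.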
Every paradigm has a set of governing metaphors that compress and express its meaning, and the Information Paradigm, as an emergent phenomenon, is no exception. The emerging consensus is that nature/reality is a function of some form of computation, even though there is no evidence that nature computes algorithmically. The Universal Turing Machine (UTM), named after Alan Turing, its inventor, has become the de facto standard by which computability is measured. It is an abstract machine developed from the serial act of counting, and is looked upon as an anthropomorphic model of computation that is perfectly suited for a number theorist. Every modern computer is a technological embodiment of the UTM. According to the Church-Turing thesis, everything that is computable, in principle, is UTM computable. This is an extraordinary thesis that, if proven true, will have implications in every field of endeavour. There are, however, logical as well as physical limits to computation, as in the class of intractable problems known as NP-complete; the travelling salesman and graph-colouring problems are in this category. Beyond intractability, not all problems are even decidable: many, like the halting problem, admit no algorithmic solution at all.
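The UTM's serial, stepwise model of computation can be illustrated with a toy machine. The sketch below is a hypothetical example of mine, not drawn from the text: a two-symbol Turing machine whose transition table increments a binary number written on its tape.

```python
# Transition table: (state, symbol) -> (symbol to write, head move, next state)
RULES = {
    ("right", "0"): ("0", +1, "right"),   # scan right to the end of the number
    ("right", "1"): ("1", +1, "right"),
    ("right", "_"): ("_", -1, "carry"),   # hit the blank: begin carrying
    ("carry", "1"): ("0", -1, "carry"),   # 1 + carry = 0, propagate the carry
    ("carry", "0"): ("1", 0, "halt"),     # 0 + carry = 1, done
    ("carry", "_"): ("1", 0, "halt"),     # overflow: extend the number leftward
}

def increment(bits: str) -> str:
    """Run the machine on a binary string until it halts."""
    tape = dict(enumerate(bits))          # sparse tape, blank = "_"
    head, state = 0, "right"
    while state != "halt":
        symbol = tape.get(head, "_")
        write, move, state = RULES[(state, symbol)]
        tape[head] = write
        head += move
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, "_") for i in range(lo, hi + 1)).strip("_")

print(increment("1011"))  # binary 11 -> binary 12, prints "1100"
```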
A crucial development in the theory of computation is the complexity of the minimal string necessary to generate or solve a problem, as formulated by Chaitin. Algorithmic information theory states that compression is a function of recursion and is limited by the amount of random information present within any system. One of the profound insights discovered by Chaitin is that the field of arithmetic is random, that it is not compressible, and that there are mathematical truths that are true for no reason - a remark made during the "On Limits" conference. No amount of human reasoning will ever solve some of these mathematical problems, and Leibniz's principle of sufficient reason has proven to be inadequate. As grim as this may seem, fundamental insight in physics and mathematics does not involve yes-no answers to algorithms, but rather a search for structures and the relationships between them. This has led to research into new forms of computational models, such as CA-based dynamical systems. Some of the developments in emergent computations have shown that Byzantine complexity, as displayed by nature, contains archetypal features which surface in many disciplines in disguised forms - a reflection of the same phenomena in different mirrors. The configurations of these generic classes of self-organisation are, however, exponentially rare.
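The link between randomness and incompressibility can be demonstrated empirically. This is my own hedged illustration, not Chaitin's formalism: the size of a zlib-compressed string is a crude, computable upper bound on algorithmic (Kolmogorov) complexity, which is itself uncomputable.

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the DEFLATE-compressed data at maximum effort."""
    return len(zlib.compress(data, 9))

regular = b"01" * 5000                                       # highly patterned
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(10000))   # pseudo-random

# The patterned string collapses to a short program-like description;
# the random string barely shrinks at all.
assert compressed_size(regular) < 100
assert compressed_size(noisy) > 9000
```

The gap between the two sizes is the point: a compressible string has a short generating description, while a random one is, in Chaitin's sense, its own shortest description.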
Even before the discovery of these emergent phenomena, Ed Fredkin, a computer scientist, had proposed the provocative idea that the universe may be a form of cellular automaton: a computational system that computes itself into existence. If the laws themselves evolve and radically change over time, then there has to be a meta-space of competing laws that somehow engenders the various stages of evolutionary development. This metaphor of universal cellularity, however, is a falsification, offering an effective symbol that displaces the universal clockwork of mechanism and the Industrial Revolution.
These are issues not without implications for, or relationship to, architecture, yet architecture has always been slow to express the prevailing paradigms of knowledge and organisation. If there is a forgetfulness, an unequivocal suspension of the epistemic fields and hierarchies outside of the typographical language of architecture, it most probably originates from ignorance of the meaning of the actual term itself. The coupling of the two Greek terms, arche and techne, which establishes the conditions for the possibility of a worldly constructivism, is intrinsically metaphysical in orientation. Even in the most limiting of cases, as in naive realism, the definability and qualification of architecture can no longer simply be attributed to the empirical logic of buildability, but needs to be extended into the sphere of constructibility in modal space. The internal logic of modal constructivism would include the notion of complementarity, forms of computation, generative systems, self-organisations, ensemble theories, nonlinear dynamics, morphogenetic potentials, statistical models of configuration space at different regimes of reality, combinatorics, artificial life, complexity, mereology, the theory of limits, and category and set theory, at the very least.
It is not generally apparent that reality has a modal structure to it. Since much of the imperative of worldly affairs is driven by the obvious identification of the real with the actual, it is assumed that the counterfactual universe of modal space is nothing but a plausible speculation at best. The universe of modal space, which includes the domains of the possible and the actual, is much larger than the logic of implication derived from subjunctive conditionals, such as "if, then" situations, in modal semantics. Modal logic, as practised by philosophers, is based on two concepts, necessity and possibility. Modal constructivism, as a theory of architecture, would have to be conceptualised, along with the criteria of necessity and possibility, within the emerging framework of the so-called Information Paradigm, inclusive of morphogenetic principles of dependent co-origination. The possible, from the standpoint of modal constructivism, must be given a systematic logic of embodiment, and can only be effectively delineated by viable theories of morphogenesis. It is now obvious that the dynamics of information has overtaken the dynamics of energy in the modelling of physical systems. Therefore, it has become evident that the notion of buildability based on material systems is only a subset of the logic of constructibility within generative systems. In fact, it would not be unreasonable to suggest that the universe of mathematics is the counterpart of the universe of modal space. Without having to invoke the status of transworld identity and individuation as explored by some modal logicians - a modal version of monadology where the logic of beings is not identical to the logic of bodies - the conceptual efficacy of modal constructivism can be developed and applied as an extended form of architectural praxis.
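The two modal concepts named above have a standard possible-worlds reading that can be sketched mechanically. The model below is entirely my own hypothetical illustration (the worlds, accessibility relation and propositions are invented for the example): necessity means truth at every accessible world, possibility means truth at some.

```python
ACCESS = {                      # hypothetical accessibility relation
    "actual": {"w1", "w2"},
    "w1": {"w1"},
    "w2": {"actual"},
}
FACTS = {                       # which atomic propositions hold at which world
    "actual": {"built"},
    "w1": {"built", "constructible"},
    "w2": {"constructible"},
}

def necessarily(p: str, world: str) -> bool:
    """Box p: p holds at every world accessible from here."""
    return all(p in FACTS[w] for w in ACCESS[world])

def possibly(p: str, world: str) -> bool:
    """Diamond p: p holds at some world accessible from here."""
    return any(p in FACTS[w] for w in ACCESS[world])

# From the actual world, "constructible" holds in every accessible world,
# while "built" holds only in some: necessity versus mere possibility.
assert necessarily("constructible", "actual")
assert not necessarily("built", "actual")
assert possibly("built", "actual")
```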
With the emergence of cyberspace, we are witnessing the advent of a second-order phase transition in our global culture, unprecedented in its scope as well as in its transformative power. It will radically alter our perceptions of reality, and the terms of engagement will be unimaginably rich and treacherous. If we generalise the era of the first-order phase transition as spanning from the time of primitive forms of economy and exchange to the time of telepresence, the second-order phase transition appears with the emergence of virtual worlds - a parallel universe instantiated by massive clusters of abstract machines in the interactive dominion of cyberspace. We are, without exaggeration, on the verge of a possible world that we cannot even begin to imagine except through the emerging paradigms of artificial worlds. Virtual entities are, no doubt, present and embedded within the semiological systems of the first-order regimes; however, the radicality of the second-order regimes lies in their capacity for the co-evolution of hyperstructures - higher forms of self-organisation - in the virtual sphere of artificial ecologies. The separation of the imaginary and the real, the factual and the counterfactual, the actual and the potential can no longer be clearly demarcated in this profusion of virtual worlds. The significance of this lies not only in the representational power of simulation but also, and to a greater extent, in the interactive arena of self-organising systems that will have a reciprocal influence on the two levels of reality, the physical and the virtual.
Within the sphere of virtuality, the transaction of value will be tied to organisational depth and the cost necessary to generate self-reproducing systems. The political ecology of hyperstructures will be measured in relation to the cost entailed in the emergence of different levels of complexity. Entropy, formulated in terms of the second law of thermodynamics, is a mathematical expression of the amount of disorder in any system, and as such it is an inverse expression of the amount of organisation within the universe. The shift from energy to information is now conceptualised as the capacity for algorithmic compression relative to the amount of random information present within any system. Therefore, the production of artificial beings and entities has an information-theoretic cost that is as real as energy and material costs. Information is the currency of nature, and, as Seth Lloyd, a physicist from Caltech, suggested, its value depends not only on the amount of information but on how difficult that information was to produce. This transvaluation is most succinctly expressed, again, by Lloyd: "any species stumped by an intractable problem does not cease to compute, but it would cease to exist." Existence is an emergent form of computation in cybernetic space. The genetic make-up of a species registers all the exchanges and interactions along the tracks of the epigenetic landscape. The evolution of massive interaction over time within cyberspace will no doubt register a complex set of virtual histories and genealogies that will surely become an archaeological site for cryptographers and, most uncannily, artificial beings. It would be a virtual topography of the sublime and the tragic.
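The entropy-information link invoked above has a standard quantitative form. The sketch below is my own example, not from the text: Shannon entropy measures, in bits per symbol, how disordered a message's symbol distribution is, i.e. how incompressible it is on average.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Bits per symbol of the empirical symbol distribution of message."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

ordered = "aaaaaaaa"     # a single repeated symbol: zero disorder
balanced = "abababab"    # two equiprobable symbols: one bit per symbol

assert shannon_entropy(ordered) == 0.0
assert abs(shannon_entropy(balanced) - 1.0) < 1e-9
```

The perfectly ordered string carries no information per symbol and compresses totally; maximal entropy marks the incompressible, maximally disordered limit.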
What will architecture be in this sphere of virtuality? No one knows for sure; however, one thing is certain: traditional conceptions of territory, of dwelling, of identity, of the phenomenology of existence and being will no longer be the same. This domain will be the arena of complex adaptive systems at the global level of the mechanosphere, accommodating a collective co-evolution of models that converge towards the virtual anatomy of hyperstructures. It is very likely that some form of modal constructivism will emerge, allowing architecture to address a multitude of emergent phenomena at different levels of scalar and specification regimes, and opening up a universe of possibility for architectural invention. Shakespeare once remarked that we are such stuff as dreams are made on, and nothing characterises this more than the coming era of hyper-reality in modal space. This brave new world, a spectral fusion of neural-networks-in-action, filled with hope and danger, will be the future horizon that must be measured by the collective space of experience without falling into a massive state of amnesia. This will, no doubt, be one of many ethical challenges for life and architecture in virtual reality. Cyberspace, ultimately, may be the entry-level simulation of artificial worlds within modal space.
© Karl S. Chu (X Kavya).