“The settlement, the human city, is natural. To recognise it is a natural science. To tend it is an art, analogous to horticulture. This cannot exist without knowledge of the plants, soil and water involved. The art of urban development requires knowledge of all living organisms in nature, of non-living nature, the present state and the possibilities of technology.”
Frei Otto, Occupying and Connecting, Berthold Burkhardt (ed), Edition Axel Menges, Fellbach, 2009
The use of specifically designed apparatus of material computation to demonstrate and solve problems of urban morphogenesis is not new, and the authors have taken great inspiration from the work of the German architect Frei Otto.
Otto studied processes of occupation and connection of large territories by means of apparatus deploying the computational power of soap bubbles, sand, ink droplets and so on.
Material computation has the benefit of operating morphologically and in relation to a specific substratum or medium. Otto was able to compute, in an entirely analogue way, the emergence of path systems, and in particular to define a special category of path system, the so-called minimizing detour networks.
Minimizing detour networks are special in that they produce connections between points which are optimal in terms of the energy expenditure required to establish them. Otto discovered that the patterns emerging from his experiments appeared almost everywhere, in non-human as well as human systems, such as unplanned settlements and intercontinental road networks.
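The trade-off that such networks negotiate can be sketched numerically. The following toy computation (a hypothetical set of settlement points, not one of Otto's experiments) contrasts the two extremes between which minimizing detour networks sit: the direct network, which connects every pair of points and so eliminates detours at the cost of total length, and the minimal spanning tree, which minimizes total length at the cost of long detours.

```python
import math
from itertools import combinations

points = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 5)]  # hypothetical settlement sites

def dist(a, b):
    return math.dist(a, b)

# Total length of the direct network: every pair of points connected.
direct_length = sum(dist(points[i], points[j])
                    for i, j in combinations(range(len(points)), 2))

def mst(pts):
    """Prim's algorithm: grow a minimal spanning tree from point 0."""
    in_tree, edges = {0}, []
    while len(in_tree) < len(pts):
        i, j = min(((i, j) for i in in_tree
                    for j in range(len(pts)) if j not in in_tree),
                   key=lambda e: dist(pts[e[0]], pts[e[1]]))
        in_tree.add(j)
        edges.append((i, j))
    return edges

tree = mst(points)
tree_length = sum(dist(points[i], points[j]) for i, j in tree)

def network_dists(n, edges):
    """All-pairs distances along the network (Floyd-Warshall)."""
    INF = float("inf")
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j in edges:
        d[i][j] = d[j][i] = dist(points[i], points[j])
    for k in range(n):
        for a in range(n):
            for b in range(n):
                d[a][b] = min(d[a][b], d[a][k] + d[k][b])
    return d

# Detour factor: network distance divided by straight-line distance.
nd = network_dists(len(points), tree)
worst_detour = max(nd[i][j] / dist(points[i], points[j])
                   for i, j in combinations(range(len(points)), 2))

print(f"direct network length: {direct_length:.2f}, detour factor 1.0")
print(f"spanning tree length:  {tree_length:.2f}, worst detour {worst_detour:.2f}")
```

A minimizing detour network occupies the territory between these two figures: a little more material than the tree buys dramatically shorter detours.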
Otto contributed greatly to bringing this form of natural urbanism and bio-inspired design to the forefront, to the point that today it informs a large part of the digital design avant-garde; we could argue, however, that on closer inspection his stance is informed by a very specific vision of nature.
In his view natural systems operated as self-organizing networks whose behavior would, if left to evolve without restrictions, self-regulate and tend towards a stable balance. As a consequence, the apparatuses he built would abstract the design problem at hand by means of self-organizing networks tending towards a kind of optimal equilibrium, in which forces reciprocally neutralize one another, material distribution is optimized and paths are minimized. Such an equilibrium, once reached, is definitive and, we may say, absolute. In fact the study of this equilibrium and its morphological appearance was the main purpose of building analogue computational models and deploying their ability to solve complex computational problems morphologically.
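The kind of equilibrium described here, in which forces reciprocally neutralize, can be imitated numerically. The sketch below (our illustration, not Otto's method) uses the classic Weiszfeld iteration to find the point at which a free junction, tied by equally tensioned threads to three fixed anchors, comes to rest: the geometric median, which also minimizes total thread length, much as a soap film minimizes surface.

```python
import math

anchors = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)]  # hypothetical fixed anchor points

def residual_force(p):
    """Sum of the unit tension vectors pulling the junction toward each anchor;
    zero at equilibrium, where the pulls cancel out."""
    fx = fy = 0.0
    for ax, ay in anchors:
        d = math.dist(p, (ax, ay))
        fx += (ax - p[0]) / d
        fy += (ay - p[1]) / d
    return math.hypot(fx, fy)

# Weiszfeld update: move the junction to the average of the anchors,
# each weighted by the inverse of its current distance.
p = (sum(a[0] for a in anchors) / 3,
     sum(a[1] for a in anchors) / 3)  # start at the centroid
for _ in range(200):
    w = [1.0 / math.dist(p, a) for a in anchors]
    p = (sum(wi * a[0] for wi, a in zip(w, anchors)) / sum(w),
         sum(wi * a[1] for wi, a in zip(w, anchors)) / sum(w))

print(f"equilibrium junction: ({p[0]:.3f}, {p[1]:.3f}), "
      f"residual force {residual_force(p):.2e}")
```

The point at which the iteration settles is exactly the "absolute" equilibrium the text describes: once the residual force vanishes, no further revision of the configuration is possible.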
This ideal of a balanced nature was in fact not just Otto's credo but the mainstream reading of natural systems in science as well as in engineering well into the 1970s and 1980s. Inspired by the pioneering work of Eugene Odum and his seminal book Fundamentals of Ecology, this view was born out of the early computer models of nature and the related development of the discipline of cybernetics.
Cybernetics provided the mathematics to underpin a new interpretation of biology as a self-organizing network of interrelated feedback loops; biology adopted these ideals and remodeled nature as a collection of simplified, quasi-mechanical living organisms. The assumption was that if nature and the biosphere were left alone, they would stabilize and find an equilibrium. Such a view still underpins contemporary environmental ideology and constitutes a powerful conservative force that prevents true innovation from occurring within the disciplines of architecture and urban design.
However, nature does not tend towards a balanced state of equilibrium: dynamical systems fluctuate and, upon reaching specific critical thresholds, can drastically alter their behavior.
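A textbook illustration of such a regime shift (our example, not one drawn from Otto's work) is the logistic map: the same simple rule that settles into a stable equilibrium at low values of its parameter begins to oscillate, and finally turns chaotic, once the parameter crosses critical thresholds.

```python
def attractor(r, x=0.5, burn_in=2000, sample=64):
    """Iterate the logistic map x -> r*x*(1-x), discard the transient,
    and return the distinct long-run states (rounded for comparison)."""
    for _ in range(burn_in):
        x = r * x * (1 - x)
    states = set()
    for _ in range(sample):
        x = r * x * (1 - x)
        states.add(round(x, 6))
    return sorted(states)

print("r=2.8 ->", attractor(2.8))                      # one state: stable equilibrium
print("r=3.2 ->", attractor(3.2))                      # two states: oscillation
print("r=3.9 ->", len(attractor(3.9)), "states")       # many states: chaotic regime
```

Nothing in the rule itself changes between the three runs; only the parameter does, and with it the qualitative behavior of the whole system.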
The French philosopher Gilles Deleuze had a great influence on the early digital design avant-garde by extending the meaning of the term machine beyond the mechanical paradigm and towards the so-called "machinic" framework. For Deleuze the notion of machine exceeds the technical realm and includes abstract generative mechanisms that can be recognized in any organic or inorganic being and that operate across multiple realms and at multiple scales:
It makes no sense to think that an organism stands a chance of survival independently of the survival of its milieu; the milieu is a precondition for the organism’s development; […] If we frame the organism plus milieu as a unit as Bateson suggests, then it is impossible to define it neatly as having a clear form, or limit.
Andrew Ballantyne, Deleuze and Guattari, Routledge, Oxon, 2007, p. 85
A simple example used repeatedly by Deleuze and Guattari is the relationship between the wasp orchid and the thynnine wasp: the orchid flower has evolved parts that closely resemble the female wasp; the seduced male wasp tries to mate with the flower and in doing so pollinates the plant. The two have evolved so inseparably that even their appearance has become similar, despite one being an insect and the other a plant; the wasp is so inherently part of the plant that it becomes very hard to draw a frame around its identity. Resisting the urge to do so, however, allows us to conceptualize the pair as a larger ensemble or machine, and their coupling as the process of reproduction of the machine itself.
In a similar fashion, human beings can be defined as an assemblage of what Deleuze calls "desiring machines": thousands of mechanisms that, without our noticing, produce the desires that we do notice and that surface to the level of consciousness. A similar machinic framework can be recognized in the formation and evolution of inorganic assemblages such as dunes and deserts, where millions of sorting mechanisms create coherent patterns of sand distribution that travel in space and time until their final dissolution.
The most important aspect of Deleuze’s definition of machines, for our discourse, is that they cannot exist outside the "milieu", or environment; in other words, their definition is inextricably embedded in the environment within which they perform or are conceived: exactly like an animal and its habitat, they form an inextricable "unit of survival".
Two early examples of a pioneering application of this expanded notion of the algorithmic machine are provided by the engineer Robert Le Ricolais, who worked with models of structural analysis and behavioral simulations of structures observed during collapse, and by the cybernetician Gordon Pask, who famously developed a series of artefacts, conceived in conversation with the surrounding environment as well as with their users, as test beds for his theories on second-order cybernetics:
Le Ricolais suggests that matter, material, construction systems, structural configurations, space, and place comprise a continuous spectrum rather than isolated domains. Such an understanding provides a model for organizing forces and their effects that is communicative, reverberating across scales and regimes.
Reiser–Umemoto, Atlas of Novel Tectonics, Princeton Architectural Press, New York, 2006, p. 110
It seems to me that the notion of machine that was current in the course of the Industrial Revolution – and which we might have inherited – is a notion, essentially, of a machine without goal, it had no goal ‘of’, it had a goal ‘for’. And this gradually developed into the notion of machines with goals ‘of’, like thermostats. Now we’ve got the notion of a machine with an underspecified goal, the system that evolves. This is a new notion, nothing like the notion of machines that was current in the Industrial Revolution, absolutely nothing like it. It is, if you like, a much more biological notion; maybe I’m wrong to call such a thing a machine; I gave that label to it because I like to realize things as artifacts, but you might not call the system a machine, you might call it something else.
Gordon Pask quoted in Mary Catherine Bateson, Our Own Metaphor: A Personal Account of a Conference on the Effects of Conscious Purpose on Human Adaptation, Alfred A Knopf, New York, 1972.
On the basis of these developments in philosophy and cybernetics, one of the transformations that took place in architecture’s avant-garde in the ‘90s was the abandonment of the ideology of form and the advance towards what we could call ‘the ideology of process’, based on the exploration of potential patterns of use of new diagrammatic representations supported by emerging digital design technologies.
Before the introduction of such techniques, the architectural system of production was based on the illusion that it determined the final form of what was imagined during the expressive phase of activity in the studio, or in the visionary mind of the architect. As work was carried out with a repertory of forms totally dependent on Euclidean geometry, it was easy to believe this illusion: each mathematical expression represented a single constant form that, when embodied in a design, was easily associated with the real form of the final product.
Algorithmic design exposed this illusion; Greg Lynn’s work on animate form can be considered the first manifesto of a new knowledge regime of production, one declaring the end of the era of predictable forms that began with the Renaissance. This approach informed the exceptional work carried out in the 2000s at the Architectural Association in London by a talented group of tutors and students under the chairmanship of Mohsen Mostafavi.
The limits of our design language are the limits of our design thinking. The medium of representation delimits the domain of architecture and implicitly defines what architecture is. How we represent architecture determines how we anticipate (design) architecture.
Patrik Schumacher, The Autopoiesis of Architecture, Wiley, London, 2010, p. 330
Algorithmic design techniques made it possible to conceive of form as an emergent effect of a design process: the initial design actions or sketches merely define starting points within a process of creation or production and cannot determine what the final result will be like. What emerge are architectural practices implemented through a continuous movement back and forth within the process, which we call ‘systemic design practices’. The old hierarchy of expertise is being replaced by a new production process in which the efficiency of the design solution is built up through revisions that progressively incorporate intelligence and performance. This is a process in which the initial decision about form can be repeatedly re-described by another decision taken at another level of the design algorithm; it is a process in which all practices are indivisible parts of an emergent whole.
The truly innovative architect designs swarm architecture for an open source in real-time. Building components are potential senders and receivers of information in real-time […]. People communicate. Buildings communicate. People communicate with people. People communicate with buildings. Buildings communicate with buildings. Building components communicate with other building components; all are members of the swarm, members of the hive.
Kas Oosterhuis, Hyperbodies, Birkhäuser, Basel, 2003, pp. 5-6
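The iterative re-description at the heart of systemic design practices can be reduced to a schematic loop: an initial formal decision is proposed, evaluated at another level of the algorithm, and revised whenever the evaluation improves. The sketch below is a deliberately minimal caricature; the 'facade depth' parameter and its shading optimum are hypothetical stand-ins for any performance criterion.

```python
import random

random.seed(7)  # deterministic run for reproducibility

def performance(depth):
    """Hypothetical evaluator at a 'different level' of the design algorithm:
    how close a facade depth (in metres) comes to an assumed shading optimum."""
    return -abs(depth - 0.8)  # assumed optimum: 0.8 m

depth = 0.2            # the architect's initial 'sketch' decision
best = performance(depth)
for _ in range(500):   # revisions: propose a variation, keep it if it performs better
    candidate = depth + random.uniform(-0.1, 0.1)
    score = performance(candidate)
    if score > best:
        depth, best = candidate, score

print(f"revised facade depth: {depth:.2f} m")
```

The final value is not designed in advance; it is built up through revision, exactly as the text describes: each iteration re-describes the initial decision in the light of feedback from another level of the process.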
Complex algorithmic procedures can still be developed in order to predict final form, but such prediction is not vital to their existence; they are virtual multiplicities held by a mathematical formula with infinite possible geometric solutions, which can be made to evolve and interact within the milieu and in real time. Form is then the product of an algorithmic procedure of material organization applied in a specific space and time.
Mies’s constraint of matter by ideal geometry is based on an essentialist notion: that matter is formless and geometry regulates it. […] When freed from such essentializing conception, matter proves to have its own capacities of self-organization. As an analogue computer, it can perform optimizing computations that have been shown to be trans-scalar; […] it becomes a model not only for dealing with structure but for dealing with the feedback that occurs between multiple forces at work on a building.
Reiser–Umemoto, Atlas of Novel Tectonics, Princeton Architectural Press, New York, 2006, p. 88
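The idea of a virtual multiplicity held by a single mathematical formula with infinite possible geometric solutions can be made concrete with a toy example of our own choosing (not one drawn from the sources above): the superellipse |x/a|^n + |y/b|^n = 1, a single expression whose actualisations range from rhombus to ellipse to near-rectangle as one exponent varies.

```python
import math

def superellipse_point(t, a=1.0, b=1.0, n=2.0):
    """Point on the superellipse |x/a|^n + |y/b|^n = 1 at angle parameter t."""
    c, s = math.cos(t), math.sin(t)
    x = a * math.copysign(abs(c) ** (2.0 / n), c)
    y = b * math.copysign(abs(s) ** (2.0 / n), s)
    return x, y

# The same formula actualised three ways by varying a single parameter.
for n, name in [(1.0, "rhombus"), (2.0, "ellipse"), (8.0, "near-rectangle")]:
    x, y = superellipse_point(math.pi / 4, n=n)
    print(f"n={n}: {name}, point at 45 degrees = ({x:.3f}, {y:.3f})")
```

No single one of these figures is "the" form held by the equation; each is one actualisation of a multiplicity, selected by the values the parameters take in a specific space and time.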
Geometry does not domesticate material; rather, it operates as a tympanum against which material properties resound. As novel spatial effects emerge, complex material behaviours evolve; this is the essence of what we call morphogenetic design. The emergence of performance is likewise transformed, as in the case of structural or environmental design logics. For instance, the empirical calculations representing a building’s structural behaviour on a two-dimensional plane, depicted within the context of formulas which are the universally valid results of previous experimentation, are no longer relevant. Morphogenetic design practices do not operate within such a strict polarity; there is no yes/no solution: a particular structural behaviour is always modelled in "real time", defining an area of efficient possibilities within an almost chaotic whole and generating design solutions which are articulated iteration after iteration.
Morphogenetic design practices suggest a different spatial logic, exemplified by the brief of TAB2017, titled bioTallinn. BioTallinn does not prescribe a final morphological solution; rather, it describes an informational protocol, an experimental "dérive" in the field of architectural prototyping and urban design. How wide a range of relevant possibilities can we create in the context of designing, calculating and building a small urban folly, or a large-scale urban wastewater treatment plant cum ornithological park? Certainly there is no guarantee that all these possibilities will become useful in the future; all the same it is worth trying, because what creates the future is the experiment itself, the practice of experimentation, which has no definitive end.
So that is another rule for the whole nature of architecture: it must create new appetites, new hungers – not solve problems, architecture is too slow to solve problems.
Cedric Price, Re: CP, Hans-Ulrich Obrist (ed), Birkhäuser, Basel, 2003, p. 57