Indeed, today, it has sadly become very fashionable to reject, in an obscurantist way, many more things in the subconscious than is necessary: it is much nicer to adore the ‘primitives’ and to note, happily, a ‘bankruptcy of the rationalist mind’ … than to realize, once and for all, without equivocation, that the age of physics, far from becoming extinct, is just beginning!
Arno Schmidt (1959), Calculus I
The scientific man is the ulterior development of the artistic man.
Friedrich Nietzsche (1994), Human, All Too Human
1. ‘Distributed’ Paradigm
During the past five years, most of my (post-) ‘critical-political time’ has been spent dealing with a new idea of collective intelligence that replaces the one implicitly defined, a century ago, by Gabriel Tarde (1890) or, explicitly, 10 years ago, by Pierre Lévy (1997). This intelligence first evolved from communication networks ranging from the newspaper to the telephone, the fax to the Internet. Today, it finds its fullest expression in grid computing, the distributed computing paradigm par excellence. Grid computing is a protocol for linking discrete but geographically dispersed machines into a distributed parallel processing network. Grid computation has given rise to a distributed computational intelligence that renders the classical concept of singular and autonomous intelligence obsolete.
As important as the technology itself are the consequences of this phenomenon, namely the rise of a new kind of people – geographically isolated scientific farmers who exchange their postpolitical concepts in symposia[1] – and the rise of a posturban environment that I call the ‘Ambient Factory’. What are the constituents of this factory? An early example, SETI@home, enlisted home computers in analysing radio telescope data for signs of extraterrestrial intelligence. And this experiment has recently been transformed into a new model of industrial production with projects such as Folding@Home, Evolutionary@Home, XPulsar@Home, Fightaids@Home, Genome@Home, Models@Home and HIWTNI (Home is where the network is).[2]
All these examples of grid computing use the downtime of geographically dispersed PCs (at the moment often running as screensavers, but there is no doubt that computing power per se will become a global market – take, for example, the polluting rights market and the recently created Powernext Carbon)[3] to process the immense amounts of data involved in investigating a scientific research problem. Through a home PC, which becomes a piece of e-laboratory equipment, any user thereby joins a community of scientists and nonscientists in producing knowledge. This effectively creates a carpet of autonomous but interconnected ‘farms’: computer farms, energy farms and so on. The term ‘farm’ may seem anachronistic, but it is appropriate because it connotes the sorts of new living practices these networks produce.
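The mechanism is simple enough to sketch. The fragment below is a schematic, hypothetical @Home-style client written in Python – the server, endpoints and analysis step are placeholders, not the actual SETI@home or BOINC protocol – showing the loop each donated PC runs: wait until idle, fetch a work unit, process it, and return the result.

```python
# Schematic volunteer-computing client in the spirit of the @Home projects.
# The coordinating server, the idle check and the 'analyse' step are all
# placeholders standing in for the real project infrastructure.

import time
import random
from dataclasses import dataclass

@dataclass
class WorkUnit:
    unit_id: int
    samples: list          # raw data to be analysed locally

def fetch_work_unit() -> WorkUnit:
    """Stand-in for a network request to the project's scheduling server."""
    return WorkUnit(unit_id=random.randint(0, 10**6),
                    samples=[random.gauss(0.0, 1.0) for _ in range(1024)])

def machine_is_idle() -> bool:
    """Stand-in for a real idle/screensaver check on the host PC."""
    return True

def analyse(unit: WorkUnit) -> float:
    """Placeholder analysis: report the strongest 'signal' in the unit."""
    return max(abs(x) for x in unit.samples)

def report_result(unit: WorkUnit, score: float) -> None:
    """Stand-in for uploading the result back to the coordinating server."""
    print(f"unit {unit.unit_id}: peak = {score:.3f}")

if __name__ == "__main__":
    for _ in range(3):                 # a real client would loop indefinitely
        if machine_is_idle():
            unit = fetch_work_unit()
            report_result(unit, analyse(unit))
        time.sleep(0.1)                # yield the processor back to the user
```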
The questions such projects raise for (a-)spatial and (a-)social organisations of production for the present are as significant as Ludwig Hilberseimer’s recognition of the electrical power grid in 1955 as ‘the real force toward [urban] decentralization’, since ‘even the smallest settlement can be supplied with water, electricity, heat and light’ (Hilberseimer, 1955).[4] Today, grid-computing networks not only allow the multitudes to communicate and to give an existence to the ‘world brain’, they allow computers to communicate in autonomous ways as a pure infrastructure that is at once global, abstract and standardised. This infrastructure allows a new kind of distributed computational and linguistic production, revealing what previously was called human production[5] as what it truly was all along: reproduction. Indeed, because of the growing complexity of all production (for example, in biotechnology, material science, pharmaceutics and computer sciences), all that is left for human labour is conceptual work. Everything else, in any field, is done by computers or numerically controlled machines.
2. Quantitative Concepts
Why use ideas of industrial production, post-human networks or disappearing cities in reference to bionetworks and the multitude? Contemporary production seems to be driven by scientific rather than social forces. If so, understanding science means understanding real characteristics of our civilisation driven by ‘quantitative concepts’ such as numbers, statistics, approximation tools and methods, and mathematical precision. Quantity sounds abstract if, in the field of architecture, we consider Hilberseimer’s diagram H Bomb on Chicago, dated 1946, but it also appears as a very practical concept if we listen to Max Planck’s comment on the same nuclear power problem (1947): ‘An appropriate calculation has shown that the quantity of energy liberated in this way in a cubic meter of uranium oxide pulverized in a hundredth of a second would be enough to raise a load of a billion metric tons to a height of some 18,000 meters. Such a quantity of energy could replace the combined production of all the world’s most powerful plants for a good many years’ (Planck, 1947).
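The scale Planck describes can be checked with elementary mechanics: lifting a mass m to a height h against gravity requires E = mgh. The short calculation below is a back-of-the-envelope check added here for illustration, not part of Planck’s text; it gives roughly 1.8 × 10^17 joules, or about 49 terawatt-hours.

```python
# Back-of-the-envelope check of Planck's figure: the potential energy needed
# to raise a billion metric tons to a height of 18,000 metres, E = m*g*h.

m = 1e9 * 1e3        # one billion metric tons, in kilograms
g = 9.81             # gravitational acceleration, in m/s^2
h = 18_000           # height, in metres

E = m * g * h                                     # energy in joules
print(f"E = {E:.2e} J = {E / 3.6e15:.0f} TWh")    # ~1.77e17 J, about 49 TWh
```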
Here, quantity is a concrete problem, as concrete as Robert Musil’s definition of the traditional newspaper – ‘filled with a measureless opacity’ that ‘goes far beyond the intellectual capacity of a Leibniz’ (Musil, 1930) – a definition often envisioned for our contemporary information overload. In this respect, thinking about collective intelligence means thinking about the way it works and the way problems are solved. Quantitative questions in fact pose their problems and their solutions simultaneously. Algorithms solve searching problems, peer-to-peer storage problems, open-source collaborative practices’ problems[6] and open technologies standardisation and communication problems. An answer to a technological problem is always a technological answer. Then, what is grid computing if not an appropriation, for industrial purposes,[7] of a new kind of productive paradigm? Is not SETI@home, which processes each year the equivalent of 400,000 years of computing time by a single processor, the new paradigm for industrial production, for ‘collaborative computational practice’ in any field, including architecture? It seems that corporations have already answered this in the affirmative with their employment of the SETI@home model,[8] and I believe that grid computing is the next step for collective intelligence – an infrastructure-based computational intelligence.
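To give the 400,000-year figure a more tangible shape: delivering that much single-processor time in one calendar year implies the continuous equivalent of about 400,000 processors; if each donated PC contributes only a fraction of its day, far more machines are involved. The snippet below makes that arithmetic explicit, with the 20 per cent duty cycle chosen purely as a hypothetical illustration.

```python
# Illustrative arithmetic on the figure cited above: 400,000 years of
# single-processor computing delivered in one calendar year. The duty cycle
# (the fraction of the day each PC donates) is a hypothetical assumption.

cpu_years_delivered = 400_000     # per calendar year, as cited in the text
duty_cycle = 0.20                 # assumed: each PC donates ~20% of its time

full_time_equivalent = cpu_years_delivered           # processors running 24/7
implied_participants = cpu_years_delivered / duty_cycle

print(f"full-time processor equivalent: {full_time_equivalent:,}")
print(f"implied participating PCs at a {duty_cycle:.0%} duty cycle: "
      f"{implied_participants:,.0f}")
```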
3. Tools and Concepts
The integration of concepts like distributed partial machine intelligence within the design process is an integral component of EZCT Architecture & Design Research’s work. However, the practice does not merely refer to ideas from technology and science in its analyses; it makes use of them. Architects should not metaphorically depict technology but use it, in a flat model, beyond any representation. Finally, because EZCT is part of the multitudes whose work concerns the ‘ultimate production of human imagination’ – that is, concepts – the practice also builds proofs for these concepts as design projects constructed through computers and programming languages.
For example, the practice has recently moved towards a grid model of design conceptualisation and production, extending its long-standing use of Mathematica, software normally oriented towards scientific communities, as a design tool, by using its grid-computing variant. GridMathematica leads to more efficient collaborative practice while alleviating the constraints of a single computer’s calculation power. Its use has allowed EZCT to reinforce its long-term collaboration with physicist Bruno Autin (author of the Geometrica software package, formerly of CERN) and mathematician Maryvonne Teissier (Paris VII University). Of course, GridMathematica is not the only way to achieve distributed computing. During the summer of 2004, EZCT led a project in which a series of chairs was computed with genetic algorithms for optimisation on a cluster of 12 computers at the École Polytechnique in Paris. The computers were controlled by Hatem Hamda, from a geographically distinct lab (INRIA), using a Linux platform and open-source libraries and software including Evolving Objects (an evolutionary computation library) and xd3d (a scientific visualisation tool). In this process, a human collaborative practice was evidently implied in the prior development of the open-source libraries and software; beyond that, a computational and ‘post-human’ collaborative practice became the paradigm, since a very limited number of people were able to appropriate a vast amount of computational resources.
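The overall pattern of that experiment – a population of candidate designs whose fitness evaluations are distributed across a pool of machines – can be sketched in a few lines. The following Python fragment is purely illustrative: it stands in for, and does not reproduce, EZCT’s actual chair-optimisation setup or the Evolving Objects C++ API; its ‘fitness’ function is a placeholder where a real run would call a structural analysis tool, and local worker processes stand in for the 12-machine cluster.

```python
# Minimal genetic algorithm with parallel fitness evaluation: candidate
# 'designs' are parameter vectors, scored concurrently by a pool of workers
# that plays the role of the cluster. Illustrative only.

import random
from multiprocessing import Pool

GENES = 16   # a candidate design reduced to a vector of parameters

def fitness(candidate):
    """Placeholder score; a real run would invoke an external analysis tool."""
    return -sum((x - 0.5) ** 2 for x in candidate)

def crossover(a, b):
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

def mutate(candidate, rate=0.05):
    return [random.random() if random.random() < rate else x for x in candidate]

def evolve(generations=40, pop_size=48, workers=12):
    population = [[random.random() for _ in range(GENES)] for _ in range(pop_size)]
    with Pool(workers) as pool:                        # the 'cluster'
        for _ in range(generations):
            scores = pool.map(fitness, population)     # distributed evaluation
            ranked = [c for _, c in sorted(zip(scores, population),
                                           key=lambda t: t[0], reverse=True)]
            parents = ranked[: pop_size // 2]          # selection
            children = [mutate(crossover(random.choice(parents),
                                         random.choice(parents)))
                        for _ in range(pop_size - len(parents))]
            population = parents + children
    return max(population, key=fitness)

if __name__ == "__main__":
    print(evolve())
```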
Thus we should not underestimate the fact that collaborative practice does not necessarily mean a human collaborative practice, and we should take into account newly emerging concepts – for example, that of productive autonomy.
Notes

1. Those who are not scientists, who are, by implication, in marketing or business, exchange their ideas in trade shows or technology conferences, places where, following Nietzsche’s sublime prediction, our whole civilisation affords ‘buying or selling as a luxury of our sensibility’ (Nietzsche, 1882).
2. I used this HIWTNI abbreviation, developed by the McKinsey Quarterly, in Morel (2002), which is an explicit theorisation of what I only evoke in the present article.
3. Due to the Kyoto Protocol, companies and countries now trade polluting rights on this dedicated marketplace (Powernext Carbon).
4. The Ambient Factory concept actualises not only Hilberseimer’s but also Mies’ parallel comment: ‘There are no cities, in fact, any more. It goes on like a forest. That is the reason why we cannot have the old cities anymore; that is gone forever, planned cities and so on. We should think about the means that we have to live in a jungle, and maybe we do well with that’ (van der Rohe, 2001).
5. The classical one theorised by Adam Smith and then by Karl Marx.
6. Open source is an answer from contemporary capitalism to itself: ‘I’ve worked for IBM in Linux for more than six years, and it has become big business for us, it’s a fundamental part of IBM’s business. We’re not into Linux and open source because it’s cool. It’s nice that it’s cool, but it’s good business. We’re making billions’ (Frye, 2005).
7. Keep in mind that everything is industrial – pharmaceutics, high-energy physics experiments, education, and so on – and that some distinctions between laboratories on one side and transnational corporations on the other do not really hold any more.
8. Arcelor or AstraZeneca, for example: ‘Large-scale clusters allow us to manage and share computing resources across the entire Discovery Function, accelerating drug discovery, design and time-to-market, and realize our investment in hardware. Platform LSF has more than proved itself so far and is now the preferred solution for managing compute farms in AstraZeneca’ (McLaughlin, 2002).