This article was published in BLOOM n°14 in 2012 (in Italian).

Wilfing Architecture has recently republished this article online, fixing a few typos and enriching the text with images for a more immediate understanding of design precedents and references.

English readers can find a translated version of the article below.

ENGINEERING ARCHITECTURE (English translation by Alberto Pugnale)

Over the last twenty years, the impact of the “digital” in architecture has grown exponentially, as demonstrated by Greg Lynn’s formless “BLOBs”[1] and by the NOX “free-forms”[2]. The adjective “free” here indicates the freedom to create architectural forms irrespective of any compositional, structural or constructional principle, and has been taken to an extreme in the purely virtual “trans-architecture” of Marcos Novak[3]. The computer now affects both the conceptual work of designers and the realization of their works. Through Computer Numerical Control (CNC) fabrication, the so-called “file-to-factory” process, the Objectile group has challenged the serial production of industrial design[4] – a single parametric digital model can now be materialized in a multitude of unique spatial variations without a significant increase in costs.

These few examples, probably the most well-known among those on show at the “Architectures non-standard” exhibition[5], held at the Centre Pompidou in Paris in 2003-2004, are enough to draw attention to the intrinsic difficulty of classifying the use of computer technologies in architecture, both theoretically and historically.

The heterogeneous nature of such experimental projects was labeled with the term “non-standard” by the curators Zeynep Mennan and Frédéric Migayrou. Originally coined by Bernard Cache of the Objectile group, with reference to the CNC fabrication of different elements at the same price as those which are commonly standardized[6], the term was here simply used to draw attention to the organic nature of the exhibited works[7]. However, “non-standard” can also be interpreted in a more specific manner, one which highlights the particular nature and diversity of the projects developed by means of computers, but which does not necessarily preclude a reading in continuity with architectural periods and movements of the past.

The approach of critics who instead coin new labels for the digital phenomenon at an incessant rhythm[8], such as generative, evolutionary and “performative design”, leads us to think of computerization as a heavy, homogeneous mass that has suddenly been catapulted onto architecture from nowhere. These are terms that seek distance from the past and ignore any possible interrelations.

The integration of computer science in the various design and building phases is rooted in architecture itself and passes through its specialisms. Research and innovation have gradually written its history. Structural engineering and industrial design are part of this complex plot, which also takes inspiration from fields that are apparently very distant from the world of construction, such as Artificial Intelligence (AI) and cognitivism.

Six brief stories can guide us in the endeavor to highlight the details of originally separate events, which are unfortunately more and more frequently lumped together under the generic term “digital”[9].

1

Computer-Aided Design (CAD) automates the patient and precise task of technical drawing. It does not, by its very nature, affect the conceptual phases of a project, but it facilitates and speeds up the workflow, and its use has spread like wildfire in architectural firms. Taken directly from the world of mechanical engineering, it is commonly, but erroneously, placed at the very beginning of computerization in architecture – some important steps, antecedent to the release of the main commercial CAD software of the 1980s and 1990s, have therefore been neglected.

Computers had already begun to populate the administrative sector of the Skidmore, Owings and Merrill (SOM) office in America back in the 1950s. Within a decade, they were also being used by the design group, which boasted of having acquired an IBM-1620 to perform complex structural and energy studies pertaining to buildings.

Those architects and engineers, faced with rudimentary computers without programs, knew how to develop a synergy with their new companions more freely than the designers of today.

With the aid of programmers and experts, including the partner Douglas Stoker, but also through agreements drawn up directly with IBM, the SOM group conceived and developed an experimental program in the 1980s that they called Building Optimization Procedure (BOP). The research was led by Bruce Graham together with the engineer Fazlur Khan and had the purpose of lowering the construction costs of tall buildings.

From a conceptual point of view, this is a rough predecessor of what is today known as Building Information Modeling (BIM)[10], which, through a single three-dimensional model that is knowledge-rich, gathers and manages not only geometrical data, but also structural, energy and construction information about a building. BIM then relates these different types of information and, in this way, improves the interaction and dialogue between the different figures involved in the design process.
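
As a purely illustrative sketch of this idea – and certainly not of how BOP or any commercial BIM platform is actually implemented – the following few lines of Python show elements that carry geometric, structural, energy and construction attributes within a single queryable model; every name and value here is invented.

```python
# Purely illustrative sketch (not any real BIM schema): every element of the model
# carries geometric data together with structural, energy and construction
# attributes, so that the different figures involved in the design can query one
# shared, "knowledge-rich" description of the building.
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    length: float          # geometrical data, metres
    height: float
    load_bearing: bool     # structural information
    u_value: float         # energy information, W/(m2K)
    unit_cost: float       # construction information, cost per m2 of surface

@dataclass
class BuildingModel:
    elements: list = field(default_factory=list)

    def structural_elements(self):
        return [e for e in self.elements if e.load_bearing]

    def estimated_cost(self):
        return sum(e.length * e.height * e.unit_cost for e in self.elements)

model = BuildingModel([Element("external wall", 5.0, 2.7, True, 0.35, 180.0),
                       Element("partition", 3.0, 2.7, False, 1.20, 60.0)])
print(len(model.structural_elements()), model.estimated_cost())
```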

Although light-years away from such a definition, the qualities of BOP were somewhat akin to the wonders of recent BIMs, as far as the principles and concepts are concerned, as they were inspired by and responded directly to specific design requirements.

However, the power of the large software companies did not concede a long life on the market to this and other SOM-developed programs. The inevitable sale to IBM did not stop John Zils, an associate partner who is currently in charge of the structural group of SOM Chicago, from reflecting on a key point of computerization in architecture: “We were used to being able to create our own software to respond to exactly what we wanted to do… Now we are relying on other people to do things for us and, of course, they’re not going to do them the way we want to do them. We’re always left trying to evaluate the different software that comes closest to what we want to do”[11].

2

The love/hate relationship with the computer can in part be attributed to the continuous presence of such a tension, i.e. to the natural distance between the ideal program, which is hypothetically tailored to designer needs, and the mass commercial software, which attempts to adapt itself to the user’s requirements as much as possible.

This is the dilemma at the center of Robert Aish’s research. Aish is a computer scientist by training, specialized in man-machine interaction; first in collaboration with ARUP, then at Bentley, and finally as part of the Autodesk Research group, he has conceived and developed software specifically for architectural design.

Aware that traditional CAD automates design data at too low a semantic level, relying on simple lines, arcs and circles to aid the conceptual work of an architect, Aish did not mean to cage potential creativity, nor to promote, in an equally ineffective manner, the opposite, i.e. the development of programs that already supply wall, door and window libraries.

He therefore asked himself whether invariants exist within the design process and looked for the recurring, general patterns that would unveil the potential of standardization in digital design.

The release of products such as GenerativeComponents and Autodesk Revit allowed him to assert that each design process is always based on the definition of elements and the relationships among them[12]. In other words, it is founded on the construction of topological rather than metrical spaces.

A brick wall can therefore be described through the basic properties of its components, i.e. the “parameters” of length, width and height of the bricks, but also through a series of equations that establish their geometrical interrelations, in this case their reciprocal spatial position. Computerization guarantees the integrity of this system, allowing the architect to concentrate on numerical modifications of its “variables” within continuous or discrete domains.
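
The following minimal Python sketch illustrates such a parametric description of a brick wall; the brick dimensions, joint width and bond rule are arbitrary values chosen only for the example.

```python
# Minimal sketch of a parametric brick wall: a few "parameters" (brick size, joint,
# counts) plus simple placement equations generate every brick position. Changing
# one variable regenerates the whole, consistent pattern. Dimensions are arbitrary.
def brick_wall(length=0.24, height=0.07, joint=0.01,
               courses=20, bricks_per_course=12, stagger=0.5):
    positions = []  # (x, y, z) of each brick's lower-left corner, in metres
    for j in range(courses):
        # running bond: every second course is shifted by half a module
        offset = stagger * (length + joint) if j % 2 else 0.0
        for i in range(bricks_per_course):
            x = offset + i * (length + joint)   # equation fixing the horizontal position
            z = j * (height + joint)            # equation fixing the vertical position
            positions.append((x, 0.0, z))
    return positions

wall = brick_wall(courses=10)
print(len(wall), wall[:3])
```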

This is the concept of “parametric design” on which Gramazio & Kohler[13], architects and researchers at ETH Zurich, based projects such as the façade of the Gantenbein winery in Switzerland.

In order to guarantee the ventilation of the internal spaces and protection from direct light, they reinvented the “treillage” arrangement of bricks – by means of the parametric model described above, they studied new sets of patterns which figuratively call to mind grapes. The façade was then built up of CNC-prefabricated modules assembled within a reinforced concrete structural frame.

Gramazio & Kohler then recycled the same parametric wall model to design the installation of the Swiss pavilion at the Venice Biennale in 2008, and again for the Pike Loop prototype, which was built and exhibited in the heart of Manhattan in 2009[14].

Through a single consistent system of elements and interrelations, they rapidly explored a multitude of spatial configurations – what Lars Spuybroek, of the NOX group, defined as “the architecture of variation” in his latest book[15].

A curious definition of “parametric architecture” can also be found in Forma come struttura, which was published in “Spazio” in 1957[16], and in Ricerca matematica in architettura e urbanistica, which was printed in “Moebius” in 1971[17]. Luigi Moretti, the author, identified as “parameters” all those design variables that the architect has to consider in order to satisfy program requirements.

Erring on the side of naivety, his intention was to understand, systematize and formalize the design process as much as possible. This is an endeavor that is almost impossible, but which deserves mention as it was one of the first attempts to contextualize architecture within a scientific research program. For this purpose, Moretti founded IRMOU, which stands for “Istituto per la Ricerca Matematica e Operativa applicata all’Urbanistica” (Institute for Mathematical and Operative Research applied to Town Planning).

3

However, such a meaning of the term “parametric” should not be mistaken for its purely geometric one. In three-dimensional CAD modelling, for instance in programs such as Rhinoceros or 3D Studio Max, “parametric” refers to those curves and surfaces that are used to accurately represent free, organic or particularly complex forms, i.e. forms which cannot be drawn, without approximation, using simple geometries. These curves and surfaces originated in the world of automotive design, as the result of research conducted at Citroën. They quickly became an indispensable aid for designers who, in this way, are now able to visualize and study the shapes of future car models virtually.

The first standard for parametric curves was introduced by the mathematician Paul de Casteljau in 1959, who defined their calculation algorithm on the basis of Bernstein polynomials. Pierre Étienne Bézier, a Renault engineer, was responsible for their diffusion over the next decade, and it was in fact his name that was finally given to the curves, i.e. “Bézier curves”[18]. By now obsolete for the three-dimensional modelling of free-forms, these curves are instead still used in the graphics sector and are still implemented in programs like Adobe Illustrator and CorelDraw.
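
In concrete terms, a Bézier curve is a combination of its control points weighted by the Bernstein polynomials, and de Casteljau’s algorithm evaluates it by repeated linear interpolation of those points, as in this short Python sketch (the four control points are arbitrary).

```python
# de Casteljau's algorithm: evaluate a Bezier curve at parameter t by repeatedly
# interpolating between consecutive control points until a single point remains.
def de_casteljau(control_points, t):
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# A cubic Bezier curve defined by four 2D control points (arbitrary values).
curve = [(0, 0), (1, 2), (3, 2), (4, 0)]
print([de_casteljau(curve, t / 4) for t in range(5)])
```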

The present standard for the representation of parametric curves and surfaces is called NURBS (Non-Uniform Rational B-Splines)[19]. It has spread in architecture through the Rhinoceros CAD software, and it has replaced its predecessors mainly because it gives the user better control over the created geometries – an essential characteristic for an instrument of conceptual design rather than a simple drafting tool.

NURBS surfaces are obtained through the interpolation of curves and can be classified, on the basis of the generative method, as skinned, proportional, spine, swept or of bidirectional interpolation. Massimiliano Ciammaichella, describing these five types in “Architettura in NURBS”[20], showed how the creation of such surfaces follows a logic similar to the way in which architects conceive the underlying spaces. NOX, for example, designs free-forms on the basis of the “skinned” criterion, i.e. by interpolating section curves lying on parallel planes. Zaha Hadid, instead, often represents the dynamic nature of flows through the use of “proportional” NURBS, that is to say surfaces obtained from generatrices converging in a point.
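
By way of illustration, the sketch below blends a set of section curves lying on parallel planes in the “skinned” manner; a real NURBS skinning would interpolate the sections with B-splines rather than linearly, so this is only a conceptual outline built on arbitrary circular sections.

```python
# Conceptual sketch of a "skinned" (lofted) surface: section curves lying on
# parallel planes are blended along a second parameter v. Real NURBS skinning
# interpolates the sections with B-splines; here a linear blend shows the idea.
import math

def section(z, radius):
    """A closed section curve (here simply a circle of given radius) at height z."""
    return [(radius * math.cos(2 * math.pi * k / 32),
             radius * math.sin(2 * math.pi * k / 32), z) for k in range(33)]

def skinned_surface(sections, v):
    """Point-wise blend between consecutive sections for a parameter v in [0, 1]."""
    i = min(int(v * (len(sections) - 1)), len(sections) - 2)
    t = v * (len(sections) - 1) - i
    return [tuple((1 - t) * a + t * b for a, b in zip(p, q))
            for p, q in zip(sections[i], sections[i + 1])]

profiles = [section(0.0, 2.0), section(3.0, 1.2), section(6.0, 2.5)]
mid_curve = skinned_surface(profiles, 0.5)   # intermediate section half-way up
print(mid_curve[0])
```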

The approach of architects and engineers after the Second World War was completely different. Works characterized by an elevated spatial complexity, such as the Kresge Auditorium by Saarinen, the BP service station on the Bern-Zurich motorway by Isler, or the bridge over the Basento River by Musmeci, were, in that period, the result of a creative-generative process that indissolubly welded the structural contribution to that of form exploration. At the beginning of the century, not even Gaudí could draw the steeples of his Sagrada Familia without first having studied their mechanical behavior – he had to simulate the basic properties of the stonework through the use of hanging models, thereby shifting the design to the resolution of a “form-finding” problem, which had the aim of searching for the structural optimum through catenary arches.

It was in fact impossible to separate the representative component of architecture from its conformative core.

The parametric nature of NURBS curves and surfaces also allows them to be used as a geometric reference for the study and design of more complex spatial configurations. In the specific case of surfaces, more often than not this means conducting a “paneling” operation, i.e. a discretization of the NURBS into a structural mesh and/or a series of components, these also being parametrically defined.
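
In its simplest form, a paneling operation can be outlined as the sampling of a parametric surface on a regular grid of (u, v) values and the collection of the resulting quadrilaterals; the sketch below uses an invented test surface in place of an actual NURBS.

```python
# Minimal sketch of "paneling": sample a parametric surface on a regular (u, v)
# grid and collect the resulting quadrilateral panels. Any surface function can
# be plugged in; here a simple doubly curved test surface stands in for a NURBS.
import math

def surface(u, v):
    """A stand-in doubly curved surface over the unit square (not a real NURBS)."""
    return (10 * u, 10 * v, 1.5 * math.sin(math.pi * u) * math.sin(math.pi * v))

def panelize(surf, nu=10, nv=10):
    grid = [[surf(i / nu, j / nv) for j in range(nv + 1)] for i in range(nu + 1)]
    panels = []
    for i in range(nu):
        for j in range(nv):
            panels.append((grid[i][j], grid[i + 1][j], grid[i + 1][j + 1], grid[i][j + 1]))
    return panels

panels = panelize(surface)
print(len(panels), panels[0])
```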

The grid-shell of the new trade fair in Milan and that of the MyZeil shopping centre in Frankfurt, both designed by Massimiliano Fuksas, are two recent examples of the paneling procedure. From pure and abstract NURBS surfaces, the structural consulting engineers, Schlaich Bergermann und Partner and Knippers Helbig, obtained, or rather designed and calculated, the structural grids, as well as the exact geometries of the glass cladding elements[21]. The strong aesthetic impact of the initial surfaces guided the designers in the search for sober patterns that would discretize the NURBS without adding any new decorative elements.

In many other projects, however, the starting geometry has been relatively simple and the efforts of the architect have instead been concentrated on the study of the paneling, in order to confer an organic nature or dynamism on the overall construction. This is the case of the Research Pavilion 2011 of the University of Stuttgart, the result of joint work between the Achim Menges and Jan Knippers groups[22]. The pavilion is in fact a relatively simple polyhedron with octagonal faces, whose morphology, modularity and structural behavior are roughly inspired by the shells of sea urchins. Here the core of the design was the definition of a pure plate structure, considering also the way in which the components are prefabricated and formally/constructively joined to each other.

4

It is difficult for architects to use the basic features of commercial CAD to perform and manage complex paneling procedures. It is even more difficult to use them to design parametric walls, such as those of Gramazio & Kohler. Software companies, being well aware of such limits, generally implement simple programming environments in their products, based, for example, on interpreted languages such as Visual Basic or Python. In other words, they invite the expert user to extend the built-in capabilities of the programs on their own and to conceive new functions through the development of short pieces of code known as “scripts”.

At the beginning of the 1990s, when the first versions of AutoCAD only implemented the cumbersome LISP language, the architect Neil Katz, one of the SOM associates, already boasted a considerable collection of scripts. His code formulated complex geometric patterns parametrically and more often than not inspired the design activity of the office – the envelope of the Lotte Tower in Seoul could be designed and calculated so quickly because it was defined as a parametric entity, and the same was true for the antenna on the Freedom Tower in New York[23].

From being a medium that was used passively, digital technology changed into a design resource that could be used to formulate the problems in a different way and construct resolution tools and strategies in an interactive manner.

This “tooling” phenomenon, although commonly labeled with the term “scripting”, is not conceptually equivalent to it but is rather its evolution[24]. It should in fact be recalled that scripting was first introduced in the 1960s with the mere purpose of automating long and repetitive operations that required periodic execution from the command line[25].

Rhinoceros, the CAD software made by McNeel, is overall the most commonly used program for the development of scripts in architecture. There are many reasons for this success. Version 3.0, released in 2003, already combined NURBS representation, ideal for creating and managing free-forms, with RhinoScript, a simple but complete programming environment based on the Visual Basic language. Since the introduction of version 4.0, even non-experts have been encouraged to tackle the development of code – through Grasshopper, a plug-in conceptually inspired by the flow-charts created in Simulink by MathWorks, users can now “model” their scripts through a purely graphical editor.

Grasshopper is based on simple, already compiled routines and functions which, without any previous programming skill, can be put together in different ways directly from the graphical interface in order to develop more complex algorithms. The user deals with several small black boxes which, when supplied by specific input data, execute a series of instructions and return new output data.
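
The following Python fragment mimics, in an extremely simplified and purely conceptual way, this dataflow logic – the “components” below are invented for illustration and have nothing to do with the plug-in’s actual ones.

```python
# Conceptual sketch of the dataflow idea behind visual editors such as Grasshopper:
# pre-compiled "black box" components receive inputs and return outputs, and a
# larger algorithm is obtained simply by wiring the boxes together. The component
# names below are invented for illustration and are not the plug-in's actual API.
def divide_interval(a, b, n):                # component 1: a range of parameters
    return [a + (b - a) * i / n for i in range(n + 1)]

def evaluate_curve(ts):                      # component 2: points on a curve
    return [(t, t * (1 - t)) for t in ts]

def move_points(points, dy):                 # component 3: a transformation
    return [(x, y + dy) for x, y in points]

# "Wiring": the output of one component feeds the input of the next.
result = move_points(evaluate_curve(divide_interval(0.0, 1.0, 10)), dy=2.0)
print(result[:3])
```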

The drawbacks of such an approach are evident and call to mind the unfortunate experience of “automatic programming” in the 1950s. Its supporters, among whom Grace Murray Hopper stands out – also famous for having been the main person responsible for the much feared “millennium bug” – suggested realizing what Ford had originally conceived for the production of cars: to arrange a system based on interchangeable parts so that new programs could be developed by simply writing code that links prearranged routines[26]. This was a good idea for the assembly line but, in the computer science field, it led to a stiffening of the programming procedure and to failure. The diagnosis is clear: premature standardization and an erroneous level of abstraction.

Grasshopper is flexible in some ways and can easily be used as a tool for the study of parametric models, which can then be translated into more complex traditional code. However, it is pure illusion to present it as a simplified version of RhinoScript – in fact, the difficulty of tooling is not that of learning a programming language, but rather that of knowing how to correctly formulate the problems that have to be resolved in a parametric manner.

5

Digital technologies are also radically modifying the work of civil engineers. Numerical calculation techniques like the Finite Element Method (FEM) are progressively replacing experimental structural design and analysis methods. In the same way, physical models for the form-finding of RC shells or tensile structures are no longer being used[27]. The way now is to use mathematical optimization which, on the basis of one or more chosen criteria, takes advantage of the computational power of the computer[28] to iteratively search for optimal solutions to a problem from among a series of possible candidates[29].

This change is relevant, from the architectural design point of view, for at least three reasons.

Unlike in classical form-finding, the topology of a structural system no longer needs to be fixed. It can therefore itself become the object of the optimization process, as in the case of the design of the new TAV station in Florence, developed by Isozaki and Sasaki on the occasion of an international competition in 2003[30]. An immense flat roof is here suspended in the sky by an organic structure, whose topology and final tree-like shape were both derived through the use of an extended version of the ESO (Evolutionary Structural Optimization) technique[31].

Given an initial spatial configuration, and calculating the Von Mises stresses through FEM analysis, this algorithm iteratively removes the inefficient parts of the structure, in this way minimizing the waste of material. In this case, the extended version was also able to add new parts where needed, thus guaranteeing an optimal mechanical behavior of the overall system.
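
Leaving aside the real FEM analysis, the basic, removal-only loop of such an evolutionary procedure can be sketched as follows; the “stress field” is a toy function over a grid of cells, invented purely for illustration.

```python
# Schematic sketch of the evolutionary loop described above: at each iteration the
# stress field is (re)computed and the least stressed elements are removed, so that
# material is progressively concentrated where it works hardest. The "FEM" here is
# a toy stress function over a grid of cells, standing in for a real analysis.
def toy_stress(cells, load=(9, 0), support=(0, 0)):
    """Placeholder stress field: cells near the straight load path score highest."""
    def s(c):
        (x, y), (lx, ly), (sx, sy) = c, load, support
        # distance-like measure of the cell from the support-to-load segment (toy)
        return 1.0 / (1.0 + abs((lx - sx) * (sy - y) - (sx - x) * (ly - sy)))
    return {c: s(c) for c in cells}

def eso(cells, rejection_ratio=0.3, iterations=20):
    active = set(cells)
    for _ in range(iterations):
        stresses = toy_stress(active)                       # re-analyse current topology
        threshold = rejection_ratio * max(stresses.values())
        inefficient = {c for c, s in stresses.items() if s < threshold}
        if not inefficient:
            break                                            # converged: nothing to remove
        active -= inefficient                                # evolve the topology
    return active

domain = {(i, j) for i in range(10) for j in range(10)}      # initial design domain
print(len(eso(domain)), "cells kept out of", len(domain))
```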

The façade of the Akutagawa West Side office building[32], designed by the architect Hiroyuki Futai and the Hiroshi Ohmori research group of Nagoya University, was conceived using ESO. The façade of the Sagrada Familia was also, curiously, re-designed in this manner as part of a research project coordinated by Jane Burry of RMIT University – this work had the aim of studying any possible analogies between the results of the topological optimization and the natural forms originally conceived by Gaudí with the hanging models[33].

Compared to the projects by Heinz Isler and Frei Otto, optimization also allows the original form-finding concept, literally aimed at the search for the optimal form, to be turned into what can be defined as “form-improvement”[34] – a process aimed at improving the performance of an already existing spatial configuration, which does not necessarily mean reaching the structural optimum.

For example, in the case of the Kagamigahara crematorium[35], no physical model used to obtain the inverse of the tension-only hanging membrane would have been able to translate the idea of the architect Toyo Ito into a structure[36]. Instead, through optimization, the floating RC roof, figuratively inspired by a cloud, was first freely modeled as if it were a sculpture and then structurally honed through a Sensitivity Analysis (SA)[37].

Adopting this technique, Mutsuro Sasaki iteratively modified the curvature of the roof shell in order to reduce its total strain energy. Based on gradient calculation, an SA automates the traditional “trial and error” design method – in this way, Sasaki avoided a slow and repetitive drawing/verification process of the form, which would have required several manual FEM analysis runs.
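
In outline, and with an invented objective function standing in for the strain energy that a real structural model would return, one iteration of such a gradient-based form-improvement can be sketched as follows.

```python
# Hedged sketch of gradient-based "form-improvement": a few shape variables are
# nudged along the numerical gradient of an objective, automating the draw/verify
# cycle. The objective below is an invented stand-in, not the strain energy of a
# real FEM model of the roof.
def objective(heights):
    """Stand-in for total strain energy: penalize flatness and abrupt height jumps."""
    smoothness = sum((a - b) ** 2 for a, b in zip(heights, heights[1:]))
    flatness = sum(1.0 / (0.1 + h * h) for h in heights)
    return smoothness + flatness

def sensitivity_step(heights, step=0.05, eps=1e-4):
    """One iteration: finite-difference sensitivities, then a small descent step."""
    base = objective(heights)
    grads = []
    for i in range(len(heights)):
        perturbed = heights[:]
        perturbed[i] += eps
        grads.append((objective(perturbed) - base) / eps)
    return [h - step * g for h, g in zip(heights, grads)]

shape = [0.2, 0.5, 0.3, 0.6, 0.2]           # initial free-form curvature profile
for _ in range(100):
    shape = sensitivity_step(shape)
print([round(h, 2) for h in shape], round(objective(shape), 3))
```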

This is the strategy that was also used for the Grin Grin Park in Fukuoka and the Kitagata Community Centre in Gifu[38] – another two cases in which the designer was able to consider free-form spatial configurations, sub-optimal from a structural point of view, only by means of an SA.

From simple solving instruments, these and other numerical optimization techniques thus become efficient “form-exploration” tools that support the conceptual phases of architectural design. For this reason, they are also often referred to in the scientific literature as “Computational Morphogenesis” strategies[39].

The research the author has conducted for some years, together with Mario Sassone and other colleagues of our group, can rightly be considered as part of this current[40]. The objective is to develop and apply optimization techniques for architectural design, studying to what extent, and according to what logics, they can be considered as tools of thought[41].

One always starts from a well-defined design problem, i.e. one which can clearly be formulated in a parametric way. For example, when we structurally re-designed the Kagamigahara crematorium in 2007, we represented the roof shell with a NURBS surface in Rhinoceros – with the control points located at the pillars constrained, the spatial coordinates of the remaining ones automatically became the design variables of the system[42].

An optimization strategy was then added. It took on the role of a guide in the study and evaluation of the architectural form: through scripting, a Genetic Algorithm (GA) was developed and linked to the parametric NURBS geometry[43]. GAs are meta-heuristic optimization techniques which, inspired by the principle of natural evolution, generate entire “populations” of design solutions (in this case, free-form spatial configurations), among which only the best are iteratively “selected” and “recombined” with each other. In our example, the GA metaphorically allowed those NURBS surfaces which presented a low mean value of vertical displacements to “survive”.
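
Purely as an outline of this logic – with a stand-in fitness function instead of the mean vertical displacement returned by an actual FEM analysis, and with arbitrary parameters – a genetic algorithm of this kind can be sketched in a few lines of Python.

```python
# Compact sketch of the genetic-algorithm logic described above: "individuals" are
# the heights of the free control points, the fitness is a stand-in for the mean
# vertical displacement returned by a structural analysis, and at each generation
# the best individuals survive, recombine and mutate. Everything here (fitness
# function, parameters) is illustrative, not the project's actual model.
import random

N_POINTS = 9            # free control points of the roof surface
POP, GENERATIONS = 30, 60

def fitness(heights):
    """Stand-in for mean vertical displacement: favour curved, non-flat shapes."""
    curvature = sum(abs(heights[i - 1] - 2 * heights[i] + heights[i + 1])
                    for i in range(1, len(heights) - 1))
    return 1.0 / (1e-6 + curvature)          # lower is better (less "displacement")

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.1):
    return [h + random.gauss(0, 0.1) if random.random() < rate else h for h in ind]

population = [[random.uniform(0.0, 1.0) for _ in range(N_POINTS)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness)                      # best (lowest) first
    survivors = population[:POP // 2]                 # selection
    children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                for _ in range(POP - len(survivors))] # recombination + mutation
    population = survivors + children
print(round(fitness(min(population, key=fitness)), 4))
```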

The last fundamental aspect of optimization is that it is not limited to resolving questions of a static nature, as is instead intrinsically the case for form-finding based on physical models. Techniques like GAs can be used in all those cases in which an architectural performance can be formulated through a mathematical function and, technically speaking, can therefore be “minimized”.

In our research group[44], Tomás Méndez Echenagucia optimizes the acoustics of concert halls in this manner[45], Dario Parigi studies the geometry and kinematic behavior of reciprocal structures[46] and Paolo Basso resolves cost and construction problems of free-form grid-shells[47].

The latter topic is of particular interest for engineering companies such as RFR in Paris, originally founded by Peter Rice in 1982. Free-form grid-shells with quadrilateral elements require the use of double-curved glass panels, which makes a project economically costly. This was the case of the TGV train station in Strasbourg, whose geometry was refined through optimization so that it could be realized with single-curved elements[48].

Entire research groups work on this problem, which, before the arrival of computers, could not otherwise be resolved[49]. The first quadrilateral-mesh grid-shells by Jörg Schlaich in the 1980s and 1990s were therefore anything but free-form: in order to guarantee construction with planar glass elements, he had to design by adopting rigid geometrical rules, i.e. only by means of the translation and scaling of generatrix curves[50].
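
The geometric reason is easy to verify: on a translational surface every grid point is the sum of a point of one generatrix and a point of the other, so each quadrilateral has two pairs of parallel edges and is therefore automatically planar. The sketch below checks this on two arbitrary generatrix polylines.

```python
# Sketch of the "translation" rule mentioned above: sweeping one generatrix curve
# along another gives grid points p(i, j) = a(i) + b(j), so each quad has two pairs
# of parallel edges and its four corners are coplanar - hence flat glass panels.
# The two generatrix polylines below are arbitrary examples.
def translational_grid(curve_a, curve_b):
    return [[tuple(x + y for x, y in zip(a, b)) for b in curve_b] for a in curve_a]

def is_planar(p0, p1, p2, p3, tol=1e-9):
    """Check coplanarity via the scalar triple product of three edge vectors."""
    def sub(u, v): return tuple(x - y for x, y in zip(u, v))
    u, v, w = sub(p1, p0), sub(p2, p0), sub(p3, p0)
    det = (u[0] * (v[1] * w[2] - v[2] * w[1])
           - u[1] * (v[0] * w[2] - v[2] * w[0])
           + u[2] * (v[0] * w[1] - v[1] * w[0]))
    return abs(det) < tol

curve_a = [(i, 0.0, 0.05 * i * i) for i in range(6)]          # generatrix 1 (parabola)
curve_b = [(0.0, j, 0.3 * (j % 2)) for j in range(6)]         # generatrix 2 (zig-zag)
grid = translational_grid(curve_a, curve_b)
quads_planar = all(is_planar(grid[i][j], grid[i + 1][j], grid[i + 1][j + 1], grid[i][j + 1])
                   for i in range(5) for j in range(5))
print(quads_planar)   # True: every quad panel of a translational surface is flat
```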

6

Parametrics and optimization have changed the way architecture is designed in its conceptual phases. CNC production has instead changed its construction techniques.

The NOX Son-O-House and Gramazio & Kohler’s parametric wall are two examples of how an extreme geometric complexity, manageable only by means of computers, can rationally be realized “file-to-factory”, in other words, by translating digital models directly into construction components using machines of industrial derivation[51].

The Zaha Hadid stations for the funicular railway in Innsbruck are instead a case in which, without resorting to optimization, the fluid roof forms could only be clad with expensive double-curved glass elements. In just a few years, however, the development of “dynamic moulds”, which would allow the CNC production of such transparent components, could reduce their construction costs. The small start-up ADAPA, founded in Aalborg by Christian Raun Jepsen, was established precisely for this purpose and has already tested a first prototype of a “dynamic mould” with cast gypsum and cast concrete[52].

After a long period of progressive separation, architecture and engineering are gradually coming closer together again thanks to digital technology. I have here simply made an attempt to introduce this phenomenon, which I have personally called “Engineering Architecture”.

[1] Lynn G., Folds, Bodies & Blobs: collected essays, La lettre volée, Brussels, 1998.

[2] Spuybroek L. (NOX), Nox, Thames & Hudson, London, 2004.

[3] The most significant publications of Marcos Novak are: Novak M., Next Babylon, soft Babylon, in “Architectural Design”, no.136, November 1998, pp.20-29; Novak M., Speciazione, trasvergenza, allogenesi: note sulla produzione dell’alien, in Sacchi L., Unali M. (Eds), “Architettura e cultura digitale”, Skira, Milan, 2003. Also in “Architectural Design”, no.157, May-June 2002, pp.64-71; Novak M., Transmitting architecture, in “Architectural Design”, no.118, October 1995, pp.42-47. Also available at the web address: http://www.mat.ucsb.edu/~marcos/Centrifuge_Site/MainFrameSet.html.

[4] The Objectile group is made up of architects Bernard Cache and Patrick Beaucé.

[5] Migayrou F. (Eds), Architectures non standard, Centre Pompidou, Paris, 2004.

[6] Cache B., Beaucé P. (Objectile), Vers une mode de production non-standard, in Migayrou F. (Eds), Architectures non standard, Centre Pompidou, Paris, 2004 (partially published). Published in full at the web address: http://architettura.supereva.com/extended/20040214/index.htm (ARCH’it, digital journal in Italian).

[7] Mennan Z., The question of non-standard form, in “METU Journal of the Faculty of Architecture”, Vol.25, no.2, 2008, pp.171-183.

[8] See, for example, the recent issue of “Architectural Design” edited by Rivka and Robert Oxman: “The New Structuralism: Design, Engineering and Architectural Technologies”, July 2010. For the term “performative design” see: Oxman R., Performance-based Design: Current Practices and Research Issues, in “International Journal of Architectural Computing”, Vol.6, no.1, January 2008, pp.1-17. For the term “digital tectonics” see: Oxman R., Morphogenesis in the theory and methodology of digital tectonics, in “Journal of the International Association for Shell and Spatial Structures”, Vol.51, no.165, pp.195-205.

[9] Franco Purini, for example, has attempted to make a classification of “digital” architecture and has identified three fields, which are penetrable and interchangeable with each other. The first is instrumental, not integrated into the design conception but purely of a service nature; the second is creative and complementary to the previous one; the last is utopian, i.e. of a purely virtual, experimental nature. However, such a generic subdivision does not help one understand the real logics of the phenomenon. The paper has been published in: Purini F., Digital Divide, in Sacchi L., Unali M. (Eds), “Architettura e cultura digitale”, Skira, Milan, 2003. A more specific text is instead: Picon A., Digital Culture in Architecture, Birkhäuser, 2010.

[10] For a complete guide on BIMs see: Eastman C., Teicholz P., Sacks R., Liston K., BIM Handbook: A Guide to Building Information Modeling for Owners, Managers, Designers, Engineers and Contractors, Wiley, 2008.

[11] The interview is cited fully by: Adams N., Skidmore, Owings & Merrill. SOM since 1936, Electa, Milan, 2007, pp.34-36. The use of computers within SOM has recently been discussed at a conference titled “Digital Design at SOM: The Past, the Present and the Future”, recorded and visible at the web address: http://vimeo.com/42786059.

[12] Aish R., Extensible computational design tools for exploratory architecture, in Kolarevic B. (Ed), “Architecture in the Digital Age: Design and Manufacturing”, Routledge, 2005, p.17. See also: Shea K., Aish R., Gourtovaia M., Towards integrated performance-driven generative design tools, in “Automation in Construction”, Vol.14, no.2, 2005, pp.253-264.

[13] The projects and research activities of Gramazio & Kohler are collected in: Gramazio F., Kohler M., Digital Materiality in Architecture, Lars Müller Publishers, 2008. See also: Converso S., Il progetto digitale per la costruzione: Cronache di un mutamento professionale, Maggioli Editore, 2010, pp.61-63, 82-87; and Yudina A., Matthias Kohler & Fabio Gramazio: Digital Empirics, in “Monitor”, no.56, 2009, pp.50-65.

[14] The parametric model of the “Pike Loop” prototype is clearly described in: Bärtschi R., Knauss M., Bonwetsch T., Gramazio F., Kohler M., Wiggled Brick Bond, in “Advances in Architectural Geometry 2010”, Springer, 2010, pp.137-147.

[15] Spuybroek L. (Ed), Research & Design: The Architecture of Variation, Thames & Hudson, 2009.

[16] Moretti L., Forma come struttura, in “Spazio” (Extracts), June-July 1957. Also in: Bucci F., Mulazzani M., Luigi Moretti: Opera e scritti, Electa, 2000.

[17] Moretti L., Ricerca matematica in architettura e urbanistica, in “Moebius”, no.1, 1971, pp.30-53. Also in: Bucci F., Mulazzani M., Luigi Moretti: Opera e scritti, Electa, 2000.

[18] A good historical introduction on various standards for parametric curves and surfaces can be found in: Rogers D.F., An introduction to NURBS: with historical perspective, 1st Edition, Morgan Kaufmann, 2001.

[19] Piegl L., Tiller W., The NURBS Book, 2nd Edition, Springer, 1997 (1995).

[20] Ciammaichella M., Architettura in NURBS: il disegno digitale della deformazione, Testo&Immagine, 2002.

[21] Details on the design of the grid-shell of the new trade fair in Milan have been published in: Schlaich J., Kurschner K., New Trade Fair in Milan – Grid Topology and Structural Behaviour of a Free-Formed Glass-Covered Surface, in “International Journal of Space Structures”, Vol.20, no.1, 2005, pp.1-14. The design of the MyZeil grid-shell in Frankfurt has instead been described in: Knippers J., Helbig T., The Frankfurt Zeil Grid Shell, in the “Proceedings of the IASS Symposium 2009: Evolution and Trends in Design, Analysis and Construction of Shell and Spatial Structures”, Valencia, Spain, 2009, pp.328-329. See also: Knippers J., Digital Technologies for Evolutionary Construction, in “Computational Design Modeling: The Proceedings of the DMSB 2011”, Springer, 2011.

[22] La Magna R., Waimer F., Knippers J., Nature-inspired generation scheme for shell structures, in “Proceedings of the IASS-APCS Symposium 2012: From Spatial Structures to Space Structures”, Seoul, South Korea, 2012.

[23] Aqtash A., Katz N., Computation and design of the antenna structure – Tower One, in “Proceedings of the 6th International Conference on Computation of Shell and Spatial structures IASS-IACM 2008: Spanning Nano to Mega”, Ithaca, NY, USA, 2008.

[24] It is not by chance that some of the recent publications on the development of scripts in architecture report the term “tooling” instead of “scripting”. See, for example, Aranda B., Lasch C., Pamphlet Architecture 27: Tooling, Princeton Architectural Press, 2005.

[25] More details are provided in: Ceruzzi P.E., A History of Modern Computing, 2nd Edition, The MIT Press, Cambridge, Massachusetts, USA, 2003.

[26] It is necessary to point out that, at that time, writing code involved punching cards rather than using digital text editors. For details on the history of automatic programming, it is possible to consult: Wilkes M., Wheeler D.J., Gill S., The Preparation of Programs for an Electronic Digital Computer, The MIT Press, 1984, pp.26-37; and Campbell-Kelly M., Programming the EDSAC: Early Programming Activity at the University of Cambridge, in “IEEE Annals of the History of Computing”, Vol.2, no.1, 1980, pp.7-36. See also: Ceruzzi P.E., A History of Modern Computing, 2nd Edition, The MIT Press, Cambridge, Massachusetts, USA, 2003.

[27] The state-of-the-art on classical form-finding can be found in: Otto F., Rasch B., Finding Form: Towards an Architecture of the Minimal, Axel Menges, 1996. It can also be found in: Hennicke J. et al., IL 10. Grid Shells, Institute for Lightweight Structures (IL), Stuttgart, 1974; and in: Isler H., New Shapes for Shells – Twenty Years After, in “Bulletin of the International Association for Shell Structures”, no.71, 1979.

[28] Computational power was identified by John Frazer as the most important characteristic of computers in his book “An Evolutionary Architecture”, edited by the Architectural Association Publications in 1995.

[29] A good introduction to the main optimization techniques can be found in: Della Croce F., Tadei R., Ricerca operativa e ottimizzazione, Esculapio, 2002.

[30] See: Cui C., Ohmori H., Sasaki M., Computational Morphogenesis of 3D Structures by Extended ESO Method, in “Journal of the International Association for Shell and Spatial Structures”, Vol.44, no.141, 2003, pp.51-61. The competition project for the new TAV station in Florence is also described in: Sasaki M., Flux Structure, TOTO, 2005.

[31] The ESO technique was originally developed by Xie and Steven, who published their results in: Xie Y.M., Steven G.P., Evolutionary Structural Optimization, Springer, 1997.

[32] See: Lee D., Shin S., Park S., Computational Morphogenesis Based Structural Design by Using Material Topology Optimization, in “Mechanics Based Design of Structures and Machines”, Vol.35, no.1, 2007, pp.39-58. See also: Ohmori H., Computational Morphogenesis: Its Current State and Possibility for the Future, in “International Journal of Space Structures”, Vol.25, no.2, 2010, pp.75-82.

[33] The results of this research were initially published in: Burry J., Felicetti P., Tang J., Burry M., Xie M., Dynamical structural modeling: A collaborative design exploration, in “International Journal of Architectural Computing”, Vol.3, no.1, 2005, pp.27-42. Also in: Burry J., Burry M., The New Mathematics of Architecture, Thames and Hudson, 2010.

[34] The term “form-improvement” was coined by the author for purely explanatory purposes. Therefore, it does not refer to any recognized or consolidated technique within the scientific community.

[35] The project for the Kagamigahara crematorium was published in: Casabella, no.752, February 2007, pp.30-37; Architectural Review, no.1326, August 2007, pp.74-77; Detail, Vol.48, no.7/8, July/August 2008, pp.786-790; The Plan, no.27, June/July 2008, pp.42-52.

[36] By inversion of the hanging membrane we mean the form-finding procedure in which an elastic surface without any bending stiffness is subjected to a gravitational load, so that a tension-only state is first obtained and then, by inversion, the corresponding compression-only structure is derived.

[37] The Sensitivity Analysis is explained briefly in: Sasaki M., Flux Structure, TOTO, 2005.

[38] Both projects are described in: Sasaki M., Flux Structure, TOTO, 2005.

[39] In December 2007, during an informal conversation with Makoto Katayama, professor at the Kanazawa Institute of Technology, it emerged that it was probably Yasuhiko Hangai, a former professor at the University of Tokyo, who first coined the expression “Computational Morphogenesis”. However, it is not clear whether, with this term, he wanted to underline the differences with respect to pure optimization or whether he simply wished to create a synonym. Still today it is used in an ambiguous manner in the scientific literature, as often as not with the mere meaning of computational form-finding. In other words, it is used when form-finding is not based on physical models but on computer simulations. This is the case of: Bletzinger K.-U., Form-finding and Morphogenesis, in Mungan I., Abel J.F. (Eds), “Fifty Years of Progress for Shell and Spatial Structures”, Multi-Science, 2011; or of: Ohmori H., Computational Morphogenesis: Its Current State and Possibility for the Future, in “International Journal of Space Structures”, Vol.25, no.2, 2010.

[40] For example, see: Pugnale A., Engineering Architecture: Advances of a technological practice, PhD thesis discussed at the Politecnico di Torino, April 2010.

[41] The relationship between technology and thought has been dealt with for example by Walter Ong to study the differences between oral and literate cultures. The results of this research have been published in: Ong W.J., Orality and Literacy: The Technologizing of the Word, 2nd Edition, Routledge, USA, 2002 (1982). With reference to digital technology, Donald Norman proposes some interesting readings. It is possible to mention, for example: Norman D., The invisible computer, The MIT Press, 1998.

[42] Pugnale A., Sassone M., Morphogenesis and Structural Optimization of Shell Structures with the Aid of a Genetic Algorithm, in “Journal of the International Association for Shell and Spatial Structures”, Vol.48, no.155, 2007.

[43] A good introduction to Genetic Algorithms (GAs) can be found in: Mitchell M., An Introduction to Genetic Algorithms, The MIT Press, Cambridge, 1998. A more complete technical book is: Goldberg D.E., Genetic Algorithms in Search, Optimization & Machine Learning, Addison-Wesley, Boston, 1989. I personally started my studies on GAs with the Italian book: Floreano D., Mattiussi C., Manuale sulle reti neurali, Il Mulino, Bologna, 2002 (1996).

[44] By “our research group” is meant the unofficial network of former students and PhD students who, under the guidance of Mario Sassone, started to develop a set of research projects on Computational Morphogenesis at the Politecnico di Torino. At present, most of the group members have moved to other universities, but regular professional collaborations are still maintained.

[45] See: Méndez Echenagucia T.I., Astolfi A., Jansen M., Sassone M., Architectural acoustic and structural form, in “Journal of the International Association for Shell and Spatial Structures”, Vol.49, no.159, 2008. See also: Sassone M., Méndez Echenagucia T.I., Pugnale A., On the interaction between architecture and engineering: the acoustic optimization of a RC roof shell, in “Sixth International Conference on Computation of Shell & Spatial Structures: Spanning Nano to Mega”, Ithaca, NY, USA, 2008, p.231.

[46] See: Parigi D., Kirkegaard P.H., Sassone M., Hybrid optimization in the design of reciprocal structures, in “Proceedings of the IASS Symposium 2012: From Spatial Structures to Space Structures”, Seoul, South Korea, 2012. See also: Parigi D., Kirkegaard P.H., Towards free-form kinetic structures, in “Proceedings of the IASS Symposium 2012: From Spatial Structures to Space Structures”, Seoul, South Korea, 2012. For the optimization of reciprocal structures, it is also possible to mention: Baverel O., Nooshin H., Kuroiwa Y., Configuration processing of nexorades using genetic algorithms, in “Journal of the International Association for Shell and Spatial Structures”, Vol.45, no.142, 2004, pp.99-108; and: Douthe C., Baverel O., Design of nexorades or reciprocal frame systems with the dynamic relaxation method, in “Computers and Structures”, Vol.87, no.21-22, 2009, pp.1296-1307.

[47] See: Basso P., Del Grosso A., Pugnale A., Sassone M., Computational morphogenesis in architecture: cost optimization of free form grid shells, in “Journal of the International Association for Shell and Spatial Structures”, Vol.50, no.162, 2009. A similar research was also published in: Sassone M., Pugnale A., On optimal design of glass grid shells with quadrilateral elements, in “International Journal of Space Structures”, Vol.25, no.2, 2010.

[48] See: Pottmann H., Schiftner A., Bo P., Schmeidhofer H., Wang W., Baldassini N., Wallner J., Freeform surfaces from single curved panels, in “ACM Transactions on Graphics (TOG) – Proceedings of the ACM SIGGRAPH 2008”, Vol. 27, no.3, 2008.

[49] See: Pottmann H., Asperl A., Hofer M., Kilian A., Architectural Geometry, Bentley Institute Press, 2007.

[50] See: Holgate A., The Art of Structural Engineering: The Work of Jörg Schlaich and his Team, Axel Menges, 1997. See also: Schlaich J., Schober H., Glass-covered Lightweight Spatial Structures, in Abel J.F., Leonard J.W., Penalba C.U. (Eds), “Spatial, lattice and tension structures: Proceedings of the IASS-ASCE International Symposium”, Atlanta, USA, 1994, pp.1-27.

[51] According to the Danish research network “Digital Crafting”, the use and development of CNC production could, in the future, also lead to a continuous reinvention of the use of materials and construction techniques.

[52] Different researchers and start-up companies are at present working on this topic. See, for example: Pronk A., Van Roody I., Schinkel P., Double-curved surfaces using a membrane mould, in “Proceedings of the IASS Symposium 2009: Evolution and Trends in Design, Analysis and Construction of Shell and Spatial Structures”, Valencia, Spain, 2009, pp.618-628; and also: Raun C., Kristensen M.K., Kirkegaard P.H., Dynamic Double Curvature Mould System, in “Computational Design Modeling: The Proceedings of the Design Modeling Symposium Berlin 2011”, Berlin, Germany, 2011.