Development and test application of the UrbanSOLve decision-support prototype for early-stage neighborhood design


Building and Environment 137 (2018) 58–72


Emilie Nault a,∗, Christoph Waibel b,c, Jan Carmeliet b,d, Marilyne Andersen a

a Laboratory of Integrated Performance In Design (LIPID), School of Architecture, Civil and Environmental Engineering (ENAC), Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland
b Chair of Building Physics, Department of Mechanical and Process Engineering, Swiss Federal Institute of Technology in Zurich (ETH Zurich), Stefano-Franscini-Platz 1, 8093 Zurich, Switzerland
c Urban Energy Systems Laboratory, Swiss Federal Laboratories for Materials Science and Technology (Empa), Überlandstrasse 129, 8600 Dübendorf, Switzerland
d Laboratory for Multi-Scale Studies in Building Physics, Swiss Federal Laboratories for Materials Science and Technology (Empa), Überlandstrasse 129, 8600 Dübendorf, Switzerland

Keywords: Active and passive solar potential; Building and urban design; Early design stage; Decision-support workflow; Optimization; Usability testing

Abstract

The need for adequate instruments to support practitioners toward achieving sustainable and energy-efficient architectural and urban design has long been acknowledged. Motivated by identified shortcomings of building performance assessment tools for conceptual neighborhood-scale design, this paper proposes a novel workflow that enables practitioners to efficiently explore a space of design alternatives and compare them in terms of their energy and daylight performance. The workflow includes a multi-criteria optimization algorithm coupled to a performance assessment engine based on predictive mathematical models. To gain insight into the potential added value for design and the usability of this approach in practice, the workflow has been implemented as a plug-in to an existing 3D modeling software and subsequently tested by practitioners during workshops, notably on real projects provided by the participants. Outcomes from the workshops, which include responses from the participants to a pre- and post-test survey, are presented. Results highlight the relevance of the proposed workflow for informing decisions about early-stage building massing on the basis of the considered performance criteria. Improvements envisioned for both the workflow and its implementation are highlighted and discussed.

1. Introduction

The increasing necessity for the building sector to comply with various normative frameworks and performance rating systems has led to the spread of design decision-support (DDS) methods and tools. Ranging from simple rules of thumb to advanced computer-based simulation programs, such tools aim at casting light on the future performance of a project [1,2]. In the domain of building performance simulation (BPS) software, development has been dominated by workflows supporting the evaluation of a detailed building design, whereas less has been done to provide design guidance [3,4], particularly for larger-scale projects such as neighborhoods [5]. Given the rise in urban densification strategies and urban renewal projects [6,7], most developments are located in an existing context and/or comprise more than one building. It is thus essential to move from an evaluation of a single building, considered in isolation, toward an assessment at a larger scale conducted simultaneously on multiple buildings, taking into consideration the impact they have on each other, notably in terms of shading [8]. The increased complexity of looking beyond a single building is a strong argument for the need for adequate DDS methods and tools, given that decisions based on intuition or simple guidelines are no longer sufficient [9–12]. This complexity is exacerbated when multiple performance criteria, possibly including conflicting ones, are considered simultaneously. In the design of a neighborhood, anticipating the interactions between new and existing buildings, and their impact on each performance criterion for each of the various designs envisioned, is far from a straightforward task. Yet such an investigation is essential, since early-stage decisions, for instance on building massing, can lead to significant differences in the performance of a project [13–15]. This paper describes the development, implementation, and test among practitioners of a DDS prototype aiming at providing multi-criteria performance-driven guidance during the design of new neighborhoods.

∗ Corresponding author. E-mail address: emilie.nault@epfl.ch (E. Nault).

https://doi.org/10.1016/j.buildenv.2018.03.033 Received 13 February 2018; Received in revised form 18 March 2018; Accepted 20 March 2018; Available online 31 March 2018. 0360-1323/© 2018 Elsevier Ltd. All rights reserved.

We first present in section 2 a review of DDS methods and tools supporting the design process, with a focus on BPS at the conceptual design stage. Barriers to the uptake of such tools by designers are highlighted, as well as tool features that could allow bypassing these barriers. In section 3, we describe the proposed workflow that attempts to respond to the identified needs. This workflow forms the basis for the DDS prototype whose development is detailed in section 3.4. The prototype, named UrbanSOLve (Urban SOLar Visual Explorer), is conceived for supporting decision-making over the massing of a neighborhood-scale design in the Swiss context, based on its energy (need and production) and daylight performance. Factors that come into play later in the design and operation of buildings and that can also alter performance (e.g., occupant behavior) are beyond the scope of this research. Section 4 describes a test application conducted among professionals to verify the potential of the prototype for fulfilling its purpose in a design process context. Our main findings and conclusions are summarized in sections 5 and 6.

2. State-of-the-Art

2.1. Decision support along the design process

The general domain of DDS methods and tools used along the urban planning, urban design, and architectural design process is practically unbounded and in continuous development. As illustrated in Fig. 1, based on a compilation of information from various sources [16–20], their adoption by practitioners is strongly linked to both the design scale (or spatial resolution, top) and the phase in the iterative design process (bottom). The stages depict the evolution from a conceptual (or early) to a more defined (or advanced/detailed) level of the project, a process that occurs whether at the urban planning stage or at the architectural building design phase. We are here concerned with the moment in the urban design process where there is a transition from 2D to 3D, with building massing, orientation, position, and the like being introduced as design parameters [21]. Given the complexities of urban design mentioned earlier, having recourse to appropriate computer-based software can present numerous benefits, especially building performance simulation (BPS) and visualization tools. These have the potential to support the iterative nature of the early design process by facilitating the comparison of different alternatives in terms of quantitative performance data for multiple criteria, and can possibly help designers develop an understanding of the implications of their design choices [3,21]. The latter is crucial, since buildings are typically built to last for a relatively long period (40–50 years [22]), which means that there is a strong lock-in effect in this sector [23]. Locking in suboptimal design choices, particularly early-stage decisions, most of which relate to features that cannot be modified later on (e.g., building shape), would seriously compromise the chance of reaching increasingly ambitious energy performance targets [23]. Indeed, although not sufficient on their own, strategic decisions regarding conceptual design features are essential for achieving low-energy buildings at minimal cost. They can decrease the reliance on active (i.e., energy-demanding) systems by lowering heating and cooling loads, while ensuring daylighting [13]. Despite the clear importance of fully exploring passive design strategies from a performance-driven perspective, and the potential for BPS tools to provide crucial support in this process at the urban design stage, we observe from Fig. 1 that the predominant methods and tools consist of documentation (including standards, guidelines, etc.) and modeling programs (i.e., computer-aided design (CAD) software) to produce drawings. In current practice, the quantitative simulation-based evaluation of a project's energy performance is typically done by an expert only at the more advanced building design stage, often for code-compliance verification [17,24,25]. Early-design-phase actors therefore make decisions with little consideration and/or limited prior knowledge of their impact on energy aspects [13].

2.2. Barriers to the uptake of tools

Limited use of BPS tools in practice has in fact continuously been reported in the literature [17,26]. Surveys conducted among practitioners have shown the main barriers to the uptake of such tools to include (Fig. 2): the complexity of the tools, judged as exceeding the competence domain of architects; the time requirement; and the lack of integration into computer-aided architectural design (CAAD) software and into the design workflow. These shortcomings can be summarized through the observation that few tools appear to respect and embrace the ill-defined nature of a project in its conceptual stage. In other words, most tools are evaluation-oriented rather than design-oriented [3,4,27]. They induce a linear generate-and-test process, where form is given priority over performance [28]. Depicted in Fig. 3,

Fig. 1. Main supporting instruments used along the iterative design process, in urban planning, urban design, and architectural building design. *Building performance simulation (BPS) is typically conducted at the advanced building design stage, often by an external consultant or engineer. Schema developed by intersecting and merging elements from various sources [16–20].


Fig. 2. Barriers to the use of tools according to answers from (a) 685 building professionals (multiple selections possible) across 14 countries [3], and (b) 629 Flemish architects [30].

Fig. 3. Linear ‘generate-and-test’ workflow induced by many existing performance assessment instruments.

this process involves translating the initial design idea into a format that can be evaluated by one or more specific performance assessment engines. This task can represent an important barrier from the practitioner's side, especially because it often requires information still unknown at the early design phase (e.g., materials). Interpretation of results is left to the user, as are the manual iterations necessary for a comparative exploration of design alternatives, typically desired at the conceptual design stage [29].

2.3. Requirements for informed early design

To address the above issues, specific requirements, listed in Table 1, can be defined for BPS tools to be used by designers during the conceptual design phase. This list was compiled by extracting the relevant information from various sources [3,4,21,25,30–33], although it should be noted that some sources refer to a more specific domain of tools, e.g., net zero energy buildings [4] or sustainable urban developments [33]. Requirements concern the interface(s), the input and output data, as well as the general workflow and functioning of a tool. Inputs demanded from the user should be limited to information that is not only available, but also of interest for the user to investigate in terms of the impact of its variation on performance. Requesting information about the detailed building design, such as materials and systems, is to be avoided [33]; reasonable underlying default values are preferred for those parameters. Highly visual elements are seen as key to facilitating the interpretation of results (through graphical representations and 3D visualizations), general navigation via the interface(s), and communication with stakeholders. Offered features should notably include multi-criteria analysis and the possibility to test and compare design alternatives. Combined with the wish for minimal usage time and real-time feedback, these requirements motivate resorting to multi-objective optimization techniques as the generation mechanism, and to efficient multi-criteria evaluation methods. Although optimization in the building sector is still more present in research than in practice, it is seen as a promising technique for energy-efficient and sustainable building design [2,34,35], where multiple, possibly conflicting objectives co-exist. In a recent survey [36] conducted among 165 architects, 78% responded that they would commonly optimize for multiple objectives, as opposed


Table 1. Main requirements for early-stage evaluation tools targeting (architectural/urban) designers as users. List compiled from various sources [3,4,21,25,30–33]. *Using the architect's/designer's language.

Interface: Intuitive*; flexible navigation; highly visual; clearly structured with restrained set of functions (simplicity).

Input data: Limited to available early-design information (complemented by adjustable default values); quick and simple to provide; intuitive*; easy to review/change (e.g., for generation of alternative designs/options); access to extensive library/database; graphical representation of building geometry, integrated into CAAD software; possibility to import 3D CAAD files.

Outputs: Easy interpretation*; graphical representation, display in 3D model; multi-criteria analysis; comparison with building codes and regulations; showing impact of decisions/parameters (uncertainty/sensitivity); simple but supportive information for design decisions; convincing output for communication with stakeholders; indicating problem area(s); benchmarking; reports generated for alternative designs/options; reliable results.

Workflow: Minimal interruption of design process; embracing the iterative feature intrinsic to the early stage by facilitating creation, testing, and comparison of alternatives (design space exploration) and allowing geometrical variations; real-time feedback on design decisions and changes; provision of guidelines; support for the evolution of the design.

General: Transparent; minimal time required to operate tool; adequate for local usage (in terms of units, materials, etc.); easy to use after a long time of non-use.

to only one, and 82% answered preferring to get a few high-quality options rather than a unique 'best' solution. Optimization should thus be employed in a way that respects these preferences of the designer community. The above requirements point to emerging design paradigms that have been proposed, such as 'performance-based' [37], 'performative' [28], and 'non-linear' [29] design, respectively advancing multidisciplinary and multi-criteria performance evaluation, intending to reverse the traditional design order by using performance goals as the form-generation mechanism, and promoting simultaneous generation and evaluation of multiple design alternatives. The workflow illustrated in Fig. 4 captures the essence of such a generative, performance-driven approach. The initial design idea is here expanded to a realm of possible solutions, by parameterizing variables that practitioners typically wish to explore at a given moment in the design process, and that are likely to influence performance. User-defined ranges and objectives are then used by an automated system to sample the solution space, generating and evaluating design variants in a way that supports decision-making and provides useful information (while reducing the need) for any subsequent manual iterations. The automated search allows exploring a more populated and wider solution space (i.e., a higher diversity of variants) in a more efficient way, all the more if it involves an optimization algorithm, as argued above.

2.4. Examples of existing approaches

Whether stand-alone or plug-ins to 3D CAD programs, many of the available simulation tools are front-ends to established simulation engines such as EnergyPlus [38] and Radiance [39]. These physics-based engines are intrinsically accurate to some degree, depending on the quality of the information provided and taken into account, but they are also likely to be computationally expensive and complex for non-expert users. Some tools however offer counter-balancing features to enhance design support. For instance, with Sefaira and OpenStudio [40,41], it is possible to conduct parametric analysis at the building scale in minimal time via cloud computing. UMI [42] facilitates the data-input process for urban-scale evaluation by automating, e.g., the window modeling and by providing templates with default values. CitySim also offers default values and the possibility to conduct an optimization [43]. Another way of addressing the usage-time issue is to resort to techniques from the fields of computer science and statistics that allow predicting the performance at a cheaper computational cost. An early example is the LT (Lighting and Thermal) method, which uses a mathematical model derived from simulations over a room module, taking into consideration the different energy flows (e.g., solar gains), to predict the annual energy use for heating, cooling, lighting, and ventilation [44]. Recent examples include [45–47]. While a prediction-based evaluation method facilitates the consideration of multiple performance criteria, its combination with a generative approach can support the comparison of design alternatives for an efficient exploration of the possible solution space. Growing interest in parametric programming environments such as Grasshopper [48] and Dynamo [49] highlights a trend toward the development of custom workflows enabling such a generative design approach. As the generation mechanism, optimization techniques represent an efficient option that ensures performance improvement among the design alternatives. The authors of [50] combined a parametric modeling program with evolutionary computation software to optimize different functions based on building program, location, and density of an urban district.
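To make the generative workflow of Fig. 4 concrete, the following sketch (ours, not any of the cited tools' implementations; the parameter names, ranges, and the toy scoring function are invented for illustration) samples design variants from user-defined ranges, evaluates each one, and shortlists the best-scoring candidates:

```python
import random

# Hypothetical parameter ranges a designer might expose at the early stage
# (names and bounds are illustrative, not taken from the paper).
RANGES = {
    "width_m":  (10.0, 30.0),
    "depth_m":  (10.0, 25.0),
    "height_m": (9.0, 27.0),
}

def sample_variant(rng):
    """Draw one design variant uniformly from the user-defined ranges."""
    return {k: rng.uniform(lo, hi) for k, (lo, hi) in RANGES.items()}

def evaluate(variant):
    """Toy surrogate score (lower is better). A real workflow would call a
    metamodel or a simulation engine here instead of this placeholder."""
    compactness = variant["width_m"] * variant["depth_m"] / variant["height_m"]
    return 100.0 / (1.0 + compactness)

def explore(n_variants=200, keep=5, seed=42):
    """Generate-and-evaluate loop: sample, score, and keep the best designs."""
    rng = random.Random(seed)
    variants = [sample_variant(rng) for _ in range(n_variants)]
    return sorted(variants, key=evaluate)[:keep]

best = explore()
print(len(best))  # 5 shortlisted variants
```

In an actual tool, the random sampler would be replaced by an optimization algorithm (as argued above) so that successive generations improve rather than merely cover the solution space.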

Fig. 4. Non-linear 'generative' workflow as an alternative to the more traditional linear workflow of Fig. 3.


Similarly, [51] proposed a simulation-based design workflow for the optimization of the passive performance of a building, using Grasshopper and various of its plug-ins, including Octopus [52] for multi-objective optimization. The work presented in this paper attempts to tackle the identified lack of methods that can provide performance feedback to practitioners in a non-disruptive yet novel way during the exploratory process of defining the buildings' form and layout. Described in the next section, our proposed workflow follows the generative approach of Fig. 4. It involves the combination of a predictive performance assessment engine with a multi-criteria optimization algorithm for populating the design space. With respect to existing methods and tools, our approach is novel in particular for its neighborhood application scale, the techniques it integrates, and its multi-criteria evaluation that includes energy (need and production) and daylight performance. Moreover, it distinguishes itself by its emphasis on a seamless integration of performance quantification into the conceptual architectural and urban design process. The idea is to make building simulation an inherent element of architectural decision-making without requiring technical expertise of the underlying simulators, one of the identified barriers for such tools.

3. Proposed workflow and its implementation

We here describe the development of our proposed workflow, depicted in Fig. 5, and its implementation as a CAD plug-in. The idea is to gather user inputs that describe an initial design and that set the boundaries of the space within which design alternatives (or variants) can be generated. These designs are evaluated in terms of different performance criteria, to then be compared visually and numerically by the user. The user-inputs and the background generation and evaluation process are described below, while the outputs are detailed in the implementation section 3.4.

3.1. User-inputs

The amount and type of information requested from the user as inputs must be kept in line with the data available at the early stage of the project, yet be sufficient to define a starting base-case scenario. At the site level, this information thus includes the definition of the 3D context geometry (if existent) and of the ground surface area (plot) on which the new design is to be located. In addition, the user can define density constraints in terms of minimum and maximum site coverage and floor area ratio (FAR), for which target values are typically found in masterplan guidelines or project briefs [53,54]. At the individual building level, the inputs include the building typology (or shape), function, window-to-wall ratio, position, orientation, and an initial number and interval for each dimension. These intervals delimit, along with the density limits, the boundaries of the solution space to be explored. Building dimensions, which also affect the distance between buildings, are chosen as the variables since they correspond to what designers commonly play with at the early design phase and can have an impact on the energy and daylight performance [44,55–57]. Code-compliant default values are assumed for the more detailed information, such as that related to the building construction (e.g., U-values). To embrace the iterative nature of the early design process and ease the comparison of alternatives, an automated generation and evaluation engine is devised to populate the solution space delimited by the user-inputs.
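The density constraints above reduce to simple arithmetic checks. The sketch below (illustrative only; the coverage and FAR bounds are invented, not taken from any masterplan) tests whether a candidate massing respects them:

```python
def site_coverage(footprints_m2, plot_m2):
    """Fraction of the plot area covered by building footprints."""
    return sum(footprints_m2) / plot_m2

def floor_area_ratio(footprints_m2, n_floors, plot_m2):
    """FAR: total gross floor area divided by plot area."""
    gfa = sum(f * n for f, n in zip(footprints_m2, n_floors))
    return gfa / plot_m2

def respects_density(footprints_m2, n_floors, plot_m2,
                     cov_bounds=(0.2, 0.5), far_bounds=(1.0, 3.0)):
    """True if the variant lies within the (illustrative) density bounds."""
    cov = site_coverage(footprints_m2, plot_m2)
    far = floor_area_ratio(footprints_m2, n_floors, plot_m2)
    return cov_bounds[0] <= cov <= cov_bounds[1] and far_bounds[0] <= far <= far_bounds[1]

# Three buildings on a 10,000 m2 plot: coverage 0.24, FAR 1.24
print(respects_density([800, 600, 1000], [5, 4, 6], 10000))  # True
```

Such a check can be applied to prune infeasible variants before the more expensive performance evaluation is run.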

3.2. Evaluation of variants

For the performance evaluation, a predictive approach is adopted instead of the more traditional physics-based method, to enable (nearly) real-time feedback. The approach consists in using a mathematical function (or metamodel) that takes as inputs parameters easily computable for a conceptual design, to provide an estimate or prediction of a given performance metric. The choice of these parameters thus depends on their relationship with the performance metrics of interest, here the energy need and daylight potential. In this context, relevant and available parameters consist of descriptors of the morphology and solar exposure level of a building, the latter also used to account for the shading from surrounding buildings. These parameters simultaneously correspond to values that are expected to vary between the design alternatives based on the user-inputs. Two multiple linear regression functions with linear (i.e., β_i·x_i) and interaction (i.e., β_ij·x_i·x_j) terms are here used to predict (i) f_EN(x): the annual energy need for heating and cooling per unit floor area (kWh/m²), and (ii) f_DA(x): the daylit area (%) of each building on the plot, corresponding to the spatial Daylight Autonomy for a 300 lux and 50% occupancy threshold [60]. These metamodels were previously developed based on datasets containing simulated energy need and daylit area values, taken as reference (or ground truth), of a series of hypothetical neighborhood design variants in the climate of Geneva, Switzerland. The variants consisted of standard building shapes (cubic, rectangular, tower, L-shape, and courtyard) in different layouts on a site varying between 10,000 and 22,500 m². These characteristics of the dataset, notably the buildings' shape and volume, their layout on the site, and the climate, impose intrinsic limitations on the applicability of the metamodels to new designs. This subject is further addressed in sections 4 and 5.
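To illustrate the general form of such a regression metamodel (the actual predictors and coefficients are not reproduced here; all values below are invented for the sketch), a function with linear and interaction terms can be evaluated as:

```python
def predict(x, beta0, beta_lin, beta_int):
    """Evaluate f(x) = beta0 + sum_i beta_i*x_i + sum_{i<j} beta_ij*x_i*x_j.
    beta_int maps index pairs (i, j) to interaction coefficients."""
    y = beta0
    y += sum(b * xi for b, xi in zip(beta_lin, x))          # linear terms
    y += sum(b * x[i] * x[j] for (i, j), b in beta_int.items())  # interactions
    return y

# Invented example: two predictors (say, a morphology descriptor and a
# solar-exposure descriptor) with a single interaction term.
x = [1.2, 0.8]
beta0, beta_lin = 50.0, [-10.0, -5.0]
beta_int = {(0, 1): 2.0}
print(predict(x, beta0, beta_lin, beta_int))
```

Once the coefficients are fitted to a reference simulation dataset, evaluating such a function for a new variant costs only a handful of multiplications, which is what makes near-real-time feedback possible.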
A third function, f_EP(x), computes the energy production by photovoltaic (PV) systems installed on the roof and facades, not through a metamodel but directly from the solar exposure data. To obtain this data, which is also required by the metamodels, an irradiation simulation on exposed surfaces is necessary. However, depending on the size of the scene, conducting this type of simulation on geometry with a low level of detail is relatively fast (in the order of seconds [58,61]). Since applying the three evaluation functions is practically immediate, the irradiation simulation dictates the duration of the whole evaluation, which takes

Fig. 5. Main phases of the proposed workflow, following the generative approach of Fig. 4.


implementation in C#. The open-source code of the solver can be accessed online [66]. A performance comparison of SPEA2 to the more popular Non-dominated Sorting Genetic Algorithm 2 (NSGA2) by Ref. [67] is given in Ref. [68], where it was shown that SPEA-2 tends to achieve a better (i.e., more uniform) distribution. In Ref. [69] the two algorithms were applied to a window placement problem and it was concluded that NSGA2 is slightly superior - however in order to make generalized performance statements more applied problems need to be evaluated. SPEA2 is a population-based evolutionary algorithm, where the ‘quality’ of each individual i (i.e., design variant) in the population is quantified by the degree of Pareto dominance R (i) and a density estimator D (i) which measures the proximity of an individual to the population in the objective space. The first metric ensures the preference of Pareto optimal solutions, whereas the latter metric avoids that individuals are clumped in the objective space and hence enforces better distribution. In contrast to [65], we suggest calculating D (i) with

up to about one minute for a neighborhood design of nine buildings. This method thus allows considerable time-savings compared to running energy and daylight simulations, whose durations are respectively in the order of minutes and hours.1 For more information on our general approach for developing these evaluation functions, we refer the reader to Ref. [58,59]; for their actual application in the present paper, these functions have been further refined, notably as far as the reference dataset and the scale of application (building vs. neighborhood) are concerned (cf. Ref [81], under review). 3.3. Optimization-based search The general purpose of optimization is the identification of design parameters that maximize the benefits and minimize the costs (monetary, ecological, social, etc.). In the first version of the proposed workflow [59,62], a random search was incorporated. However this is known to be an inefficient sampling strategy for performance improvement. For example [63], has shown that with random search using a Sobol (LPT) sequence, the best solution found after 15,000 samples was still worse than the best solution found with optimization algorithms after 500 samples. We have therefore decided to implement an optimization algorithm to increase search efficiency. Formally, we can define design parameters as a variable vector x ∈ n with n being the number of parameters. The ‘quality’ of a design is expressed via a cost function f : X →  . Consequently, the optimal design parameters x (*) can be identified by minimizing that cost function:

x (*) = argmin f (x ) x∈X

D (i) =

(1)

3.4. Implementation The workflow described in the previous section was implemented as a plug-in for the 3D modeling software Rhinoceros (Rhino) [71]. This structure provides a CAD-integrated tool and facilitates the acquisition of certain user-inputs and the integration of 3D visualization features. Coded in C#, the plug-in represents the latest version of the Urban SOLar Visual Explorer (UrbanSOLve) prototype,2 initially developed in 2015 [59]. It consists of three interfaces linked with underlying code that also includes the optimization and metamodel algorithms and the Radiance/Daysim [39,64] commands for the irradiation simulation. The interfaces can be seen in Fig. 6, which graphically shows the unfolding of the workflow through an example case. The main interface (‘Main’) is where users define the optimization parameters and density constraints, specify the Rhino layer names corresponding to the plot and context geometries, and proceed to defining the base case design in a second window (‘Edit buildings’). As explained earlier, each building to be positioned on the plot is parametrized. The three building shapes currently available - simple volume, courtyard, and L-shaped - correspond to basic yet commonly encountered typologies at this stage of the design [72–74] and were fixed so as to facilitate the development of the metamodels. Story height, climate, and code reference (for underlying assumptions e.g., on U-values) are currently fixed to the values used in the development of the metamodels [81]. Offering choices to the user for those parameters would therefore imply further work on the evaluation engine. The input data is then used in the generation and evaluation

m x∈X

∑ wj f j (x ) j=1

(2)

where X is the search domain, and where a minimization problem can always be transformed into a maximization problem with min f(x) = max −f(x). Since we are evaluating design variants according to their energy and daylight performance, the problem becomes a multi-objective optimization (MOO), where a solution vector is evaluated for m objectives, f : X → ℝ^m. Every MOO can be formulated as a single-objective optimization problem by weighting each objective f_j with a coefficient w_j ∈ ℝ:

x* = argmin_{x ∈ X} Σ_{j=1..m} w_j f_j(x)    (2)

This would require us to have a priori knowledge of the users' preferences with respect to each objective. In practice, the preferences are often progressively defined by different stakeholders, by weighing the pros and cons of different design variants. Therefore, we chose to implement a multi-objective optimization algorithm that returns a set of Pareto-optimal design variants, i.e., variants for which no objective can be improved without deteriorating another. Our MOO problem can then be expressed as:

min f(x) = {f_1(x), …, f_m(x)}    (3)

Given the evaluation functions introduced previously, this translates to:

min f(x) = {f_EN(x), −f_DA(x), −f_EP(x)}    (4)

We solve eq. (4) using an evolutionary black-box (heuristic) solver, which only requires the objective function values f(x) for its search. This is because we use a proprietary simulation program (Daysim [64], as described later) in the evaluation of a variant, which prevents us from using more efficient methods, e.g., gradient-based solvers. We have decided to use the Strength Pareto Evolutionary Algorithm 2 (SPEA2) as described in Ref. [65] and have developed a custom implementation.¹ The density D(i) of an individual i is computed as:

D(i) = 1 / (Σ_{k ∈ N+N̄} σ_{i,k} + 1)    (5)

where N + N̄ is the multiset union of the population and the archive, and σ_{i,k} is the normalized distance of individual i to k. The density thus describes the inverse mean Euclidean distance to all solutions in the objective space. In the original algorithm, only the inverse k-th distance is considered; our formulation is hence presumably a better representation of an individual's density, although its impact on the algorithm's performance still needs to be rigorously tested. A value of one is added in the denominator to ensure D(i) < 1. To initialize the search, we use a uniformly random population. As sampling operators, we use intermediate recombination with hypercube extension, binary tournament selection for mating, and a mutation of real variables as described in Ref. [70]; for integer variables, we use binary single-point crossover and uniform random single-point mutation.
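Read together, eqs. (3)-(5) amount to extracting the non-dominated set and estimating how crowded each individual is in objective space. A minimal Python sketch of both operations (our illustration, not the actual UrbanSOLve/SPEA2 code; normalizing σ to [0, 1] by the maximum pairwise distance is our assumption):

```python
import numpy as np

def pareto_mask(F):
    """Boolean mask of Pareto-optimal rows of F (all objectives minimized).

    F is an (n, m) array: n design variants, m objectives each, e.g. the
    columns [f_EN, -f_DA, -f_EP] of eq. (4)."""
    n = len(F)
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        # i is dominated if some j is no worse in every objective and
        # strictly better in at least one
        dominated = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        mask[i] = not dominated.any()
    return mask

def density(F):
    """Density estimate per individual in the spirit of eq. (5):
    D(i) = 1 / (sum_k sigma_ik + 1), sigma_ik being a normalized pairwise
    distance in objective space; the +1 keeps every D(i) below 1."""
    dists = np.linalg.norm(F[:, None, :] - F[None, :, :], axis=2)
    sigma = dists / (dists.max() + 1e-12)  # normalize distances to [0, 1]
    return 1.0 / (sigma.sum(axis=1) + 1.0)
```

In an SPEA2-style loop, `pareto_mask` would feed the archive update, while `density` breaks ties among equally ranked individuals.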

² In its original form, UrbanSOLve-2015 was programmed as a plug-in for Grasshopper [48], itself a plug-in for Rhino; it has since evolved into the version described in this paper.

On an Intel Quad Core 3.70 GHz computer with 16 GB of RAM.


Building and Environment 137 (2018) 58–72

E. Nault et al.

Fig. 6. Steps in the use of the UrbanSOLve prototype (Rhino plug-in) with screen captures of its three interfaces (Main, Edit buildings, and Show charts). Best viewed in color. (For interpretation of the references to color in this figure legend, the reader is referred to the Web version of this article.)


In the case that a generated variant violates the constraints (e.g., the geometry is physically unrealistic due to intersections), the irradiation simulation is not executed and bad performance values (e.g., zero for the energy production) are assigned, to make sure this variant is penalized in the optimization process. Given the number of optimization generations (Ng) and variants per generation (Nv/g) specified by the user, the total number of design alternatives will be somewhere below (if variants were discarded) or equal to Ng × Nv/g. Moreover, a seed for the random number generator can be specified to allow reproducibility of results given the same input data; in practice, however, a different seed should be chosen for every run. To facilitate interpretation of the results, and in response to the requirements of Table 1, different graphical representations are included as outputs. Users can browse through the generated variants using the number pickers for the optimization generation and variant (within the selected generation) in the main interface. The selected variant appears in 3D in the Rhino window with a false-color irradiation map. Performance results can be viewed in a third window ('Show charts') through scatter plots and a table of numerical results. Each plot shows the performance of the variants from the currently selected generation, for a different pair of the three performance metrics. For example, the graph shown in Fig. 6 displays the daylit area on the y-axis versus the electricity production on the x-axis. Each point corresponds to one neighborhood variant, with the yellow dot matching the design displayed in the Rhino window. The right-side panel contains the numerical data for the initial and currently selected variant, at the neighborhood level as well as for each building.

Other functionalities are included to facilitate the subsequent usage of the results. Based on the user's selection, any of the variants, including the initial design, can be transformed into individual Rhino geometry layers for further processing. Input and output data for all generated variants can also be exported as text and tabular (csv) files, respectively. Although the capabilities of the prototype do not extend further than described above in terms of results exploration and visualization, an additional element was developed in view of the workshops described in the next section, to further facilitate the analysis of the results. Shown at the bottom of Fig. 6, this addition is an Excel-based template that imports a results file previously exported from UrbanSOLve, lists and graphically displays the performance results, and identifies the best variant for each performance criterion.

4. Application through workshops with practitioners

4.1. Workshop objectives and scope

To gain insight into the applicability of the prototype within the design process, workshops were organized to test its usability among professionals likely to fall within the target audience. Following the development of the first version of the prototype, UrbanSOLve-2015 (mentioned in section 3.4), workshops had already been conducted to (i) assess its potential as a DDS tool, (ii) verify whether it could help bring new knowledge and improve the performance of a design, (iii) identify bugs to be solved and improvements to be made, and (iv) evaluate the predictive accuracy of the metamodels [62]. Material for the latter two points directly ensued from the workshop outcomes. Whereas meaningful results were obtained regarding point (i) through qualitative feedback gathered via questionnaires, acquiring evidence demonstrating an improvement in knowledge and design performance (point (ii)) proved much more difficult, particularly due to the short workshop time-frame of half a day. Building upon that experience and the lessons learned, our goals and testing protocol were adjusted for this second set of workshops. Our focus was placed on acquiring mainly qualitative information from a limited number of target users on the relevance of our proposed approach and the functionalities offered, in relation to the design phase and the questions or decisions for which they typically seek support.

Professionals working in energy and/or urban design services within different engineering, architecture, and municipal offices in the French-speaking part of Switzerland were invited to participate by email. Seven and five people attended the first and second workshop sessions, respectively, which took place in early summer 2017. Four participants were trained as architects, five as engineers (civil, environmental), and one as a biologist, with six holding an additional degree in urbanism, environmental sciences, or sustainable architecture. The total of 12 participants was considered sufficient for our scope; as [75] states, if no statistically valid results are required (as is the case here), a low number of participants, preferably at least eight, will already allow exposing usability problems.

Each workshop lasted one afternoon and included briefing sessions, two periods of answering an online questionnaire, and an exercise session for testing the prototype (Table 2). The online survey was structured in two parts, a pre- and a post-test section. The second part included the System Usability Scale (SUS), proposed by Ref. [76] and consisting of 10 statements with which respondents must specify their level of agreement on a 5-point Likert scale [77]. A score can then be computed over the 10 responses, reflecting a global subjective assessment of the usability of the evaluated product. In the context of building performance, the SUS was notably used by Ref. [4] for testing a simulation software prototype developed to support net zero-energy building design. Samples of the questionnaire, translated from French, are presented in the outcomes section below.

Table 2
Workshop schedule.

Introduction: Presentation of research context and project background
Questionnaire Part 1: Answering of online survey including questions on the participant's background, and experience with and expectations from design decision-support instruments
UrbanSOLve: Overview and quick demo of the prototype
Exercises: Testing of the prototype by the participants (Test 1: simple guided test; Test 2: independent test on own project)
Questionnaire Part 2: Completion of survey with questions for gathering the participant's feedback on the prototype and the experience during the test (incl. SUS)
Group discussion: Complementary feedback and exchanges

The exercises represented the core of the workshop in terms of time and participant engagement. For debugging and analysis purposes, this phase was video-captured using a screen recorder program. Participants were first guided in a step-by-step manner through a simple project to get familiar with the interfaces and functioning of the prototype (Test 1). They were then asked to use the tool on one of their own projects (Test 2), which they had sent to the organizers prior to the workshops for preparation (i.e., generation of a Rhino base design from the received file). Fig. 7 illustrates an example project in its original (left) and workshop-adapted (right) form, the latter annotated to help the participant when entering the user inputs in UrbanSOLve.

For some projects, it was necessary to simplify or modify the design, given the currently available features, e.g., the limited set of pre-defined building shapes. For example, in the case illustrated in Fig. 7, it was not possible to specify superposed volumes (i.e., a larger base with upper levels of a smaller footprint) or adjacent buildings (touching volumes of different heights). What limits superposed buildings is the impossibility of specifying such a configuration in UrbanSOLve; for adjacent buildings, it is the fact that the performance assessment engine is not built to handle adiabatic surfaces, which are in any case currently not detected. Also, for some projects, we decided to work on a subplot to ensure the participant would have enough time to obtain

Fig. 7. Project from one participant in its original form (left, sent before the workshop) and as an adapted Rhino base-case design, with guiding information for entering data in UrbanSOLve (right, for Test 2).
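The penalization rule, the Ng × Nv/g bound, and the role of the seed described above can be mirrored in a toy generation loop; names such as `run_optimization` and `is_feasible` are hypothetical and not the prototype's API:

```python
import random

def run_optimization(n_generations, n_variants_per_gen,
                     evaluate, is_feasible, seed=42):
    """Toy harness: infeasible variants skip evaluation and receive a
    worst-case objective (here 0.0, as for the energy production metric),
    so the result set never exceeds n_generations * n_variants_per_gen."""
    rng = random.Random(seed)  # fixed seed -> reproducible variant stream
    results = []
    for _ in range(n_generations):
        for _ in range(n_variants_per_gen):
            variant = rng.random()  # stand-in for a sampled design vector
            score = evaluate(variant) if is_feasible(variant) else 0.0
            results.append((variant, score))
    return results
```

Running twice with the same seed reproduces the exact same variants, which is why, outside of debugging, a different seed should normally be chosen for every run.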

Fig. 8. Example project requiring modifications from the initial plan (left) for Test 2 (center). After some trials in UrbanSOLve, results were obtained by defining the circled buildings as one L-shaped volume.

Table 3
Example design variants from participant projects with predicted (A) and simulated (B, reference) energy need values (kWh/m2) for each new building in the scene. Dark gray buildings represent the existing context.

Example variant 1
Building:      1     2     3     4     5     6     7     8
A. Predicted:  45.9  45.4  41.2  39.1  44.4  45.1  38.5  36.1
B. Simulated:  44.6  44.1  39.8  38.0  39.7  39.9  37.7  35.3
Error A-B:     1.3   1.3   1.4   1.1   4.7   5.2   0.8   0.8

Example variant 2
Building:      1     2     3     4     5
A. Predicted:  34.7  44.1  36.6  33.0  44.3
B. Simulated:  34.7  43.3  36.6  33.4  43.4
Error A-B:     0.0   0.8   0.0   -0.4  0.9

Example variant 3
Building:      1     2     3
A. Predicted:  41.6  38.3  38.2
B. Simulated:  41.0  36.4  35.9
Error A-B:     0.6   1.9   2.3
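The average error can be re-derived from the table's entries; the relative-error convention below is our assumption, since the averaging method is not spelled out in the text:

```python
# Building-level energy need values (kWh/m2) transcribed from Table 3.
predicted = [45.9, 45.4, 41.2, 39.1, 44.4, 45.1, 38.5, 36.1,   # variant 1
             34.7, 44.1, 36.6, 33.0, 44.3,                      # variant 2
             41.6, 38.3, 38.2]                                  # variant 3
simulated = [44.6, 44.1, 39.8, 38.0, 39.7, 39.9, 37.7, 35.3,
             34.7, 43.3, 36.6, 33.4, 43.4,
             41.0, 36.4, 35.9]

# Mean absolute error relative to the simulated reference, in percent.
mape = 100 * sum(abs(a - b) / b
                 for a, b in zip(predicted, simulated)) / len(simulated)
print(f"{mape:.1f}%")  # ~3.7% under this convention, near the reported 3.6%
```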

results, since the process takes longer the larger the neighborhood. Other modifications included the 'flattening' of any sloped ground surface, and the cutting of non-linear and non-orthogonal buildings into separate volumes, as shown in Fig. 8. An expected first conclusion could already be drawn: when confronted with real projects, particularly those in existing urban areas subject to site constraints, more flexibility in building design and layout is required from the prototype. The main outcomes from Test 2 and responses to the survey are presented below.

4.2. Outcomes from exercises

Out of the 12 participants, four were able to obtain results for their project at the first attempt and without further modification of their design. Three encountered problems - often crashes of the program that were difficult to diagnose 'on the spot' - and did not obtain results that were meaningful for their project (i.e., for geometry similar to their design). The remaining five were able to obtain results after some trials and slight adjustments to their design; the example cases presented in Figs. 7 and 8 fall into this category. The most successful cases were characterized by a less dense and more spaced-out urban form, with rather flexible constraints (e.g., a wide density range). The video recordings proved effective in identifying most of the reasons behind the crashes, which mainly stemmed from a lack of crash-safe coding. For instance, the tool did not warn the user when two instances of Rhino + UrbanSOLve were opened, a situation that caused the prototype to malfunction. Naturally, such issues hinder the usability of a tool, but they are common in prototypes and did not prevent valuable information from being extracted.

Fig. 9. Quest. part 1 - Answers to To which method(s) do you resort to obtain support during your decision-making related to one or more performance aspect(s) of a design?.

4.3. Prediction accuracy

Although not a goal of the workshops and outside the main scope of this paper, we briefly look at the prediction accuracy of UrbanSOLve's energy need metamodel when applied to three example variants of participant projects. Table 3 shows that the predictions are very close to the simulated values, with an average error of 3.6%. Results for the daylight metric are not shown here due to time and computational cost constraints; a lower prediction accuracy would likely have been observed, based on results obtained during the metamodel development.

4.4. Answers to questionnaire

We present here the answers to a selection of questions included in the pre- and post-test parts of the survey. Response rates were almost 100%; only in the second part did one person not complete the questionnaire (for unknown reasons). Figs. 9 and 10 show the distribution of answers to questions included in part 1 for gathering information on the typical methods and tools used by participants. Within the provided list of decision-support methods (Fig. 9), all were selected by a number of respondents as used during the conceptual design phase. At this stage, lessons learned from previous experience appears as the method most often used, whereas simulation is the least employed. At the detailed stage, resorting to an external consultant is the dominant method. Fig. 10 shows that while some participants stated having used specific modeling software, notably AutoCAD, during the conceptual and, to a lesser extent, detailed design phase, a majority did not have any prior experience with simulation tools. In this category, only the Swiss Lesosai program [78] is used, typically for code-compliance verification.

Fig. 10. Quest. part 1 - Answers to What is your experience with 3D modeling/simulation tools? *Added by participant (not in list of options).

Fig. 11 corresponds to a question that was present in both parts of the survey. In part 1, participants had to select (or add) what they thought a decision-support tool should offer at the early design phase. Their selected answers were then automatically piped into a similar question in part 2, where they had to specify the degree to which they thought the tested prototype fulfilled those identified criteria, the question being "According to you, does this prototype fulfill the criteria previously identified about what a tool should offer?". Results show that, for the most selected criteria - "allow comparing the performance of different variants of a project" and "allow quickly estimating, with

Building and Environment 137 (2018) 58–72

E. Nault et al.

Fig. 11. Quest. part 2 - Responses to According to you, does this prototype fulfill the criteria previously identified [in part 1] regarding what a [decision-support] tool should offer [at the conceptual design stage]?.

Fig. 12. Quest. part 2 - Answers to You are satisfied with respect to […].

acceptable precision, multiple performance aspects of a project simultaneously" - the prototype performs well, with a majority of positive responses (agree and strongly agree). Answers for the last criterion, on easing the definition of a 3D model, a task to which we come back later, are more mixed. Fig. 12 presents the level of satisfaction with respect to different elements, such as the relevance of the provided information and the interfaces of the prototype. A majority of answers fall on the agreement side, except regarding the relevance of the approach (second item) and the interface for data input (last item). We can speculate on the reasons behind these results based on the mixed success during the second test, as described earlier: participants who could not obtain meaningful results, despite repeatedly entering their inputs - particularly each building's characteristics - after an unexplained crash of the program, can be expected to have reservations about the relevance of the tool within the design process, as well as frustrations with the data-input interface.

Fig. 13 shows the distribution of answers to each of the 10 questions included in the SUS [76]. Fig. 14 presents the SUS score of each participant, computed according to the method described in Ref. [79], which also contains additional scales to help interpret the scores and judge the usability of a product in absolute terms. One such scale is displayed in Fig. 14, ranging from 'Not acceptable' to 'Acceptable'. While a majority of the scores fall within the 'Acceptable' band, the two lowest scores drag the average over all participants down to the lower limit of that band, at 70.45. To push the interpretation further, we have added other layers of information to the graph, to distinguish architects from engineers (dark versus light gray), and to identify the participants who encountered difficulties when testing the prototype on their own project (number between parentheses) as well as those who self-rated themselves as not part of the target audience (hatched bars). Reassuringly, we observe that this latter group corresponds to the three lowest SUS scores. Another positive observation is that participants who had to put up with unexplained crashes and start over still saw some potential in the prototype (except participant 10, part of the non-target group). As for the participants' background, no clear distinction can be made between scores from architects and engineers. In fact, according to a majority of the participants, potential users of the prototype are architects/designers as well as consultants/engineers working at the early design phase (Table 4), which corresponds to our targeted design stage. Responses to open-ended questions also included in the survey are presented in the next section, which summarizes the overall workshop outcomes.
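For reference, a per-participant score like those in Fig. 14 follows the standard SUS arithmetic: odd-numbered items contribute (response − 1), even-numbered items (5 − response), and the sum is scaled by 2.5 to a 0-100 range. The function below is our illustration, not the survey tooling used in the workshops:

```python
def sus_score(responses):
    """System Usability Scale score (0-100) from the ten 1-5 Likert responses."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses on a 1-5 scale")
    # index 0 is item 1 (odd-numbered), index 1 is item 2 (even-numbered), ...
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)
                     for i, r in enumerate(responses)]
    return 2.5 * sum(contributions)
```

For example, neutral answers throughout (all 3s) give exactly 50, the conventional midpoint against which acceptability bands are drawn.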

5. Discussion

The main strengths and limitations of the UrbanSOLve prototype are presented in Table 5, along with envisioned improvements. This table merges the participants' feedback - collected through the discussion and the closed- and open-ended survey questions - with our observations and interpretation of the results, also in relation to the requirements identified in section 2. This list and our overall experience lead us to formulate the following remarks:

• Whereas the general low level of prior experience with simulation software (Fig. 10) highlights the lack of usage of such DDS tools, the observed quick learning of the prototype brings down the perceived lack-of-expertise barrier and supports the idea that adequate BPS tools - even those integrating optimization - can be adopted at the conceptual design stage. This is further supported by the fact that participants indicated early design phase actors as potential users (Table 4).

• The major shortcomings do not concern the fundamental proposed workflow, but rather implementation-related issues (e.g., the data-entry process) that could be dealt with through further technical development. This would reduce the gap between the theoretical capabilities of the prototype and its actual functioning in a context closer to practice, i.e., with real projects and users.

• The currently restricted design freedom is a limitation intrinsic to using a metamodel-based evaluation method. To expand the application scope and enable the prototype to handle more diverse building shapes, functions, etc., the possibility of dynamic metamodeling from an automatically populated project-specific database could be investigated. This issue could also be addressed by conducting simulation-based evaluations, conditional on reducing the running time to enable near real-time results (e.g., through cloud computing).

• From the suggestions made by participants, we observed that at the targeted design scale and phase, the prototype should ideally offer flexibility in the type of inputs, given the diversity of actors involved. For instance, an urban designer expressed the wish for more 'urbanistic' and less building-specific inputs (e.g., average distance between buildings), while an engineer asked for facade-specific variables, such as specifying the type of PV system per orientation.

• Despite the challenges of subjecting a prototype to user-testing, we argue that continuous probing and development from a research perspective should be done in parallel with, and in response to the

Fig. 13. Quest. part 2 - Responses to the System Usability Scale (SUS) [76] questions.

Table 4
Quest. part 2 - Answers to By whom and in which context do you think this tool could be used? More than one selection possible. *Added by participant.

Who, when                                      Number of answers
Architect/designer, early design phase         11
Consultant/engineer, early design phase        9
Architect/designer, advanced design phase      3
Consultant/engineer, advanced design phase     1
Developer to evaluate competition projects*    1
I don't see the use/relevance of this tool     0

Fig. 14. Quest. part 2 - SUS score per participant [79]. (x): Participant encountered difficulties during Test 2 (on own project).


Table 5
Main strengths and limitations of the UrbanSOLve prototype, along with suggested improvements. Compilation of participants' feedback collected during the discussion and through the open-ended survey questions, and, in italics, our interpretation of the workshop outcomes (also in relation to Table 1).

Input data and related interface
Strengths: Quick to learn; no unknown information demanded (in line with target design stage); 3D visualization of initial design.
Limitations: Restrictions in terms of building design (no adjacency or complex forms) and variables; fastidious data-input process; impossibility to interactively modify a building once defined (must delete and re-enter).
Envisioned improvements: Add parameters (e.g., position of building on the site, as a variable); allow more freedom in the design and program options (e.g., mixed-use); ease the data-entry process by adding functionalities that also increase interactivity, e.g., allowing a building to be defined and modified through the CAD interface; allow ignoring certain building facades in the PV production calculation.

Outputs and related interface
Strengths: Allows better anticipating the influence of urban form on energy-related performance aspects; performance evaluation very interesting for neighborhood masterplanning; graphical and visual.
Limitations: No benchmarking; no sensitivity analysis.
Envisioned improvements: Add a graph containing all variants (all generations) to show the complete set of results.

Workflow/General
Strengths: Simplicity of use; rapidity in obtaining results; integration of energy aspects at the initial phase of the project; allows comparing simultaneously similar yet different variants, with control over many parameters.
Limitations: Unknown bugs and crashes of the program; unknown calculation time; obscure and rigid evaluation method; sits between the urban planning/design and architectural design stages (in terms of inputs and variables).
Envisioned improvements: Fix bugs through thorough testing and screen recordings; inform users about the evaluation method; make adaptable to the design phase; provide guidelines supporting the interpretation of results; provide an indicative order of magnitude for calculation time; adapt to other 3D software such as SketchUp; add the possibility of specifying conditions on performance criteria, e.g., prioritizing one over the other two (which would influence the optimization).

outcomes of, application tests with potential real users within a design-process setting. In that way, the interaction with practitioners can only enrich and guide the ongoing development towards the true needs of the designer community. This interaction can begin early on, using a mock-up when no prototype yet exists.

Additional future work should include thorough testing of the optimization algorithm, by examining the evolution of the variants' performance over multiple generations. This type of study was out of the scope of our workshops, where time was in any case too short for participants to specify a large enough number of generations to clearly see the performance improvement. By looking at the total number of possible design variable values and combinations, and at the behavior of the optimization for different cases, we could infer increasing levels of performance improvement and make recommendations to users about the number of generations and variants to specify, also according to the time they have available for conducting their study. The implemented solver could also be improved with different sampling operators, or by using function approximation techniques to improve convergence with low evaluation budgets [63,80]. For cases where adjustments had to be made to the initial design, as explained in section 4.1 (Figs. 7 and 8), the results obtained by applying the metamodels on these adapted designs should be compared to those from a simulation of the design in its original state, to verify the error induced by the geometrical simplifications. Additional developments could furthermore include extending the evaluation to the existing context, to see the impact a new design could have on its surroundings, and adapting the prototype to other climatic contexts. Finally, additional performance metrics could be included in the multi-criteria evaluation, such as global warming potential (in terms of CO2-equivalent emissions) and cost.

6. Conclusion

Acknowledging the importance of performance considerations from an urban-scale and early-phase design perspective, we have developed the Urban SOLar Visual Explorer prototype by implementing a semi-automated workflow that generates, from a user-defined neighborhood-scale initial design and ranges in terms of variables and constraints, a series of geometrically distinct design alternatives. Evaluated according to their energy and daylight performance by prediction functions using parameters capturing the morphology and solar exposure of buildings, the variants can be compared through different visual representations, including performance graphs and 3D irradiation maps. From these outputs, users can extract information potentially useful for their decision-making, for example the identification of designs that are geometrically similar yet have distinct performance values. Compared to more traditional tools that evaluate a single well-defined design, UrbanSOLve reduces usage time and complexity by requiring a limited amount of design information and avoiding full energy and daylight simulations, while moreover populating a space of design options for the user to explore. Insights collected on the potential of the prototype through application tests among practitioners serve not only to assess the prototype in its current state, but also to anticipate its expected added value given future refinements. Furthermore, our findings from the user responses may help other researchers and developers better understand the challenges and requirements DDS methods face in a real applied context. Ultimately, it is imperative to improve the decision-making process for buildings and neighborhoods, given their environmental impact and economic contribution. To achieve this, fast predictive models and optimization algorithms might be two of the missing keystones needed in the early design process, provided they are implemented in a framework that respects the ill-defined and comparative nature of the conceptual design stage.

Acknowledgement

This work was conducted at the Ecole polytechnique fédérale de Lausanne (EPFL) with additional support from the InnoSeed program of the School of Architecture, Civil and Environmental Engineering (ENAC) at the EPFL. This research has also been financially supported by the Swiss Competence Center for Energy and Mobility project SECURE (CCEM 914, E. Nault and C. Waibel) and by the Swiss Commission for Technology and Innovation within SCCER FEEB&D (CTI 1155002539, C. Waibel). The authors would like to thank Mélanie Huck for her essential contribution to the implementation of UrbanSOLve, all workshop participants, and Giuseppe Peronato and Sergi Aguacil for their help during the events.

10.1016/j.egypro.2012.11.125.
[33] J. Gil, J.P. Duarte, Tools for evaluating the sustainability of urban design: a review, Proceedings of the Institution of Civil Engineers - Urban Design and Planning 166 (6) (2013) 311–325, http://dx.doi.org/10.1680/udap.11.00048.
[34] R. Evins, A review of computational optimisation methods applied to sustainable building design, Renew. Sustain. Energy Rev. 22 (2013) 230–245, http://dx.doi.org/10.1016/j.rser.2013.02.004.
[35] A.-T. Nguyen, S. Reiter, P. Rigo, A review on simulation-based optimization methods applied to building performance analysis, Appl. Energy 113 (2014) 1043–1058, http://dx.doi.org/10.1016/j.apenergy.2013.08.061.
[36] J.M. Cichocka, W.N. Browne, E. Rodriguez, Optimization in the architectural practice. An international survey, in: P. Janssen, P. Loh, A. Raonic, M.A. Schnabel (Eds.), Protocols, Flows and Glitches. Proceedings of CAADRIA, 2017, pp. 387–397.
[37] Y.E. Kalay, Performance-based design, Autom. ConStruct. 8 (4) (1999) 395–409, http://dx.doi.org/10.1016/S0926-5805(98)00086-7.
[38] D.B. Crawley, C.O. Pedersen, L.K. Lawrie, F.C. Winkelmann, EnergyPlus: energy simulation program, ASHRAE J. 42 (2000) 49–56.
[39] G.W. Larson, R.A. Shakespeare, Rendering with Radiance - The Art and Science of Lighting Visualization, Morgan Kaufmann Publishers, 1998.
[40] Trimble, Sefaira, 2016. URL http://sefaira.com/sefaira-architecture/.
[41] A. Parker, K. Benne, L. Brackney, E. Hale, D. Macumber, M. Schott, E. Weaver, A Parametric Analysis Tool for Building Energy Design Workflows: Application to a Utility Design Assistance Incentive Program, Pacific Grove, CA, 2014.
[42] C. Reinhart, T. Dogan, J.A. Jakubiec, T. Rakha, A. Sang, UMI - an urban simulation environment for building energy use, daylighting and walkability, Proceedings of Building Simulation (IBPSA), Chambéry, France, 2013, pp. 476–483.
[43] D. Robinson, F. Haldi, J. Kämpf, P. Leroux, D. Perez, A. Rasheed, U. Wilke, CitySim: comprehensive micro-simulation of resource flows for sustainable urban planning, Proceedings of Building Simulation (IBPSA), Glasgow, Scotland, 2009.
[44] N. Baker, K. Steemers, Energy and Environment in Architecture: A Technical Design Guide, E&FN Spon, New York, 2000.
[45] B.B. Ekici, U.T. Aksoy, Prediction of building energy consumption by using artificial neural networks, Adv. Eng. Software 40 (5) (2009) 356–362, http://dx.doi.org/10.1016/j.advengsoft.2008.05.003.
[46] A. Foucquier, S. Robert, F. Suard, L. Stéphan, A. Jay, State of the art in building modelling and energy performances prediction: a review, Renew. Sustain. Energy Rev. 23 (2013) 272–288, http://dx.doi.org/10.1016/j.rser.2013.03.004.
[47] A. Tsanas, A. Xifara, Accurate quantitative estimation of energy performance of residential buildings using statistical machine learning tools, Energy Build. 49 (2012) 560–567, http://dx.doi.org/10.1016/j.enbuild.2012.03.003.
[48] McNeel, Grasshopper, 2015. URL http://www.grasshopper3d.com/.
[49] Autodesk, Dynamo, 2017. URL http://dynamoprimer.com/en/01_Introduction/12_what_is_dynamo.html.
[50] M. Bruno, K. Henderson, H.M. Kim, Multi-objective optimization in urban design, Proceedings of SimAUD'11, San Diego, CA, USA, 2011, pp. 102–109.
[51] K. Konis, A. Gamas, K. Kensek, Passive performance and building form: an optimization framework for early-stage design support, Sol. Energy 125 (2016) 161–179, http://dx.doi.org/10.1016/j.solener.2015.12.020.
[52] R. Vierlinger, C. Zimmel, Bollinger+Grohmann Engineers, Octopus, 2017. URL http://www.grasshopper3d.com/group/octopus.
[53] OECD, OECD Green Growth Studies. Compact City Policies: A Comparative Assessment, OECD Publishing, 2012.
[54] Urbaplan, Plan Directeur Localisé Intercommunal Lausanne-Vernand - Romanel-sur-Lausanne. Cahier 1 - Rapport d'aménagement, Tech. Rep. V.1.3, 2015.
[55] C. Hachem, A. Athienitis, P. Fazio, Design methodology of solar neighborhoods, Energy Procedia 30 (2012) 1284–1293, http://dx.doi.org/10.1016/j.egypro.2012.11.141.
[56] W. Pessenlehner, A. Mahdavi, Building morphology, transparence, and energy performance, Proceedings of Building Simulation (IBPSA), Eindhoven, Netherlands, 2003, pp. 1025–1032.
[57] H.M. Lechner, Heating, Cooling and Lighting, John Wiley and Sons Inc., New Jersey, USA.
[58] E. Nault, P. Moonen, E. Rey, M. Andersen, Predictive models for assessing the passive solar and daylight potential of neighborhood designs: a comparative proof-of-concept study, Build. Environ. 116 (2017), https://doi.org/10.1016/j.buildenv.2017.01.018.
[59] E. Nault, Solar Potential in Early Neighborhood Design: A Decision-Support Workflow Based on Predictive Models, PhD Thesis, Ecole polytechnique fédérale de Lausanne, Lausanne, Switzerland, 2016.
[60] IESNA, IES LM-83-12: IES Spatial Daylight Autonomy (sDA) and Annual Sunlight Exposure (ASE), Tech. Rep., New York, NY, USA, 2012.
[61] C. Waibel, R. Evins, J. Carmeliet, Efficient time-resolved 3D solar potential modelling, Sol. Energy 158 (2017) 960–976, http://dx.doi.org/10.1016/j.solener.2017.10.054.
[62] E. Nault, E. Rey, M. Andersen, Urban planning and solar potential: assessing users' interaction with a novel decision-support workflow for early-stage design, Proceedings of SBE16, Zurich, Switzerland, 2016.
[63] T. Wortmann, C. Waibel, G. Nannicini, R. Evins, T. Schroepfer, J. Carmeliet, Are genetic algorithms really the best choice for building energy optimization? Proceedings of SimAUD'17, Toronto, Canada, 2017.
[64] C.F. Reinhart, DAYSIM, 2012. URL http://daysim.ning.com.
[65] E. Zitzler, M. Laumanns, L. Thiele, SPEA2: Improving the Strength Pareto Evolutionary Algorithm, Tech. Rep., 2001.
[66] C. Waibel, BB-O: Black-Box Optimization Library, 2017. URL https://github.com/christophwaibel/BB-O.
[67] K. Deb, A. Pratap, S. Agarwal, T. Meyarivan, A fast and elitist multiobjective genetic

References [1] L. Weytjens, E. Verdonck, G. Verbeeck, Classification and use of design tools: the roles of tools in the architectural design process, Design Principles and Practices: Int. J. 3 (2009) 289–302. [2] X. Shi, Z. Tian, W. Chen, B. Si, X. Jin, A review on building energy efficient design optimization from the perspective of architects, Renew. Sustain. Energy Rev. 65 (2016) 872–884, http://dx.doi.org/10.1016/j.rser.2016.07.050. [3] J. Kanters, M. Horvat, M.-C. Dubois, Tools and methods used by architects for solar design, Energy Build. 68 (Part C) (2014) 721–731. [4] S. Attia, E. Gratia, A. De Herde, J.L. Hensen, Simulation-based decision support tool for early stages of zero-energy building design, Energy Build. 49 (2012) 2–15, http://dx.doi.org/10.1016/j.enbuild.2012.01.028. [5] C.F. Reinhart, C. Cerezo Davila, Urban building energy modeling – a review of a nascent field, Build. Environ. 97 (2016) 196–202, http://dx.doi.org/10.1016/j. buildenv.2015.12.001. [6] M.G. Riera Pérez, E. Rey, A multi-criteria approach to compare urban renewal scenarios for an existing neighborhood. Case study in Lausanne (Switzerland), Build. Environ. 65 (2013) 58–70 doi:https://doi.org/10.1016/j.buildenv.2013.03. 017. [7] E. Conticelli, S. Proli, S. Tondelli, Integrating energy efficiency and urban densification policies: two Italian case studies, Energy Build. 155 (2017) 308–323. [8] C. Hachem, A. Athienitis, P. Fazio, Solar optimized neighbourhood patterns: evaluation and guide-lines, Proceedings of ESim, Halifax, Nova Scotia, 2012. [9] B. Beckers, Solar Energy at Urban Scale, Wiley, 2013, http://dx.doi.org/10.1002/ 9781118562062. [10] K.S. Leung, K. Steemers, Exploring solar-responsive morphology for high-density housing in the tropics, Proceedings of CISBAT, Lausanne, Switzerland, 2009. [11] C.F. Reinhart, A simulation-based review of the ubiquitous window-head-height to daylit zone depth rule-of-thumb, Proceedings of Building Simulation (IBPSA), Montreal, Canada, 2005. 
[12] Tech. rep., London, LSE Cities, EIFER, CITIES and ENERGY - Urban Morphology and Heat Energy Demand, (2014). [13] L.D.D. Harvey, Recent advances in sustainable buildings: review of the energy and cost performance of the state-of-the-art best practices from around the world, Annu. Rev. Environ. Resour. 38 (1) (2013) 281–309, http://dx.doi.org/10.1146/annurevenviron-070312-101940. [14] P.A. Rickaby, An approach to the assessment of the energy effiicency of urban built form, in: D. Hawkes, J. Owers, P. Rickaby, P. Steadman (Eds.), Energy and Urban Built Form, 1987, pp. 43–61. [15] P.A. Sattrup, J. Strømann-Andersen, Building typologies in northern European cities: daylight, solar access, and building energy use, J. Architect. Plann. Res. 30 (1) (2013) 56. [16] A. Yezioro, A knowledge based CAAD system for passive solar architecture, Renew. Energy 34 (3) (2009) 769–779, http://dx.doi.org/10.1016/j.renene.2008.04.008. [17] J. Hensen, R. Lamberts (Eds.), Building Performance Simulation for Design and Operation, Spon Press, London ; New York, 2011. [18] J. Goodman-Deane, P. Langdon, J. Clarkson, Key influences on the user-centred design process, J. Eng. Des. 21 (2–3) (2010) 345–373, http://dx.doi.org/10.1080/ 09544820903364912. [19] E. Verdonck, W. Lieve, G. Verbeeck, H. Froyen, Design support tools in practice. The architects' perspective, Proceedings of CAAD Futures, Liège, Belgium, 2011. [20] Tech. rep. IEA, Types of Tools, Canada Mortgage and Housing Corporation, 2004. [21] K. Besserud, T. Hussey, Urban design, urban simulation, and the need for computational tools, IBM J. Res. Dev. 55 (1.2) (2011), http://dx.doi.org/10.1147/JRD. 2010.2097091 2:1–2:17. [22] SIA, SIA 480 Calcul de rentabilité pour les investissements dans le bâtiment, (2004). [23] D. Ürge Vorsatz, N. Eyre, P. Graham, D. Harvey, E. Hertwich, Y. Jiang, C. Kornevall, M. Majumdar, J.E. McMahon, S. Mirasgedis, S. Murakami, A. 
Novikova, Energy End-use: Buildings, in: Global Energy Assessment - toward a Sustainable Future, Cambridge University Press and the International Institute for Applied Systems Analysis, Cambridge UK, New York USA, Laxenburg Austria (2012). [24] T. Østergård, R.L. Jensen, S.E. Maagaard, Building simulations supporting decision making in early design – a review, Renew. Sustain. Energy Rev. 61 (2016) 187–201, http://dx.doi.org/10.1016/j.rser.2016.03.045. [25] L. Smith, K. Bernhardt, M. Jezyk, Automated energy model creation for conceptual design, Proceedings of SimAUD’11, San Diego, CA, USA, 2011, pp. 13–20. [26] S. Alsaadani, C.B. De Souza, The social component of building performance simulation: understanding architects, in: J. Wright, M. Cook (Eds.), Proceedings of BSO12, Loughborough, UK, 2012, pp. 332–339. [27] B. Beckers, D. Rodriguez, Helping architects to design their personal daylight, WSEAS Trans. Environ. Dev. 5 (7) (2009) 467–477. [28] R. Oxman, Performative design: a performance-based model of digital architectural design, Environ. Plann. Plann. Des. 36 (6) (2009) 1026–1037, http://dx.doi.org/10. 1068/B34149. [29] Y.J. Grobman, A. Yezioro, I.G. Capeluto, Non-linear architectural design process, Int. J. Architect. Comput. 8 (1) (2010) 41–54. [30] L. Weytjens, G. Verbeeck, Towards' architect-friendly’energy evaluation tools, Proceedings of SpringSim, San Diego, CA, USA, 2010, p. 179. [31] K. Branko, Computing the performative in architecture, Proceeding of ECAADe, Graz, Austria, 2003. [32] M. Horvat, M.-C. Dubois, Tools and methods for solar design–an overview of IEA SHC task 41, subtask B, Energy Procedia 30 (2012) 1120–1130, http://dx.doi.org/

71

Building and Environment 137 (2018) 58–72

E. Nault et al.

[68]

[69]

[70] [71] [72]

[73] [74]

algorithm: NSGA-II, IEEE Trans. Evol. Comput. 6 (2) (2002) 182–197, http://dx. doi.org/10.1109/4235.996017. K. Deb, L. Thiele, M. Laumanns, E. Zitzler, Scalable test problems for evolutionary multiobjective optimization, Evolutionary Multiobjective Optimization, Advanced Information and Knowledge Processing, Springer, London, 2005, pp. 105–145, , http://dx.doi.org/10.1007/1-84628-137-7\_6. A.E. Brownlee, J.A. Wright, M.M. Mourshed, A multi-objective window optimisation problem, Proceedings of the 13th Annual Conference Companion on Genetic and Evolutionary Computation, GECCO ’11, ACM, New York, NY, USA, 2011, pp. 89–90, , http://dx.doi.org/10.1145/2001858.2001910. H. Pohlheim, Evolutionäre Algorithmen - Verfahren, Operatoren und | Hartmut Pohlheim | Springer, Springer Berlin Heidelberg, Berlin, Heidelberg, 1999. McNeel, Rhinoceros, (2015) URL https://www.rhino3d.com/. J. Kanters, M. Wall, The impact of urban design decisions on net zero energy solar buildings in Sweden, Urban, Planning and Transport Research 2 (1) (2014) 312–332, http://dx.doi.org/10.1080/21650020.2014.939297. A. Okeil, A holistic approach to energy efficient building forms, Energy Build. 42 (9) (2010) 1437–1444, http://dx.doi.org/10.1016/j.enbuild.2010.03.013. R. SDOL, Gauthier, Malley Centre, Ouest Lausannois - Les coulisses de Malley,

[75] [76] [77] [78] [79] [80]

[81]

72

Concours d’urbanisme et d’espaces publics à un degré, Rapport du jury Bureau du Schéma directeur de l'Ouest lausannois, Renens, 2012. J. Rubin, D. Chisnell, Handbook of Usability Testing: Howto Plan, Design, and Conduct Effective Tests, John Wiley & Sons, 2008. J. Brooke, SUS-A quick and dirty usability scale, Usability evaluation in industry 189 (194) (1996) 4–7. R. Likert, A technique for the measurement of attitudes, Archives of psychology 22 (1932) 101–119. E4tech Software, Lesosai, (2017) URL http://www.lesosai.com/en/. A. Bangor, P. Kortum, J. Miller, Determining what individual SUS scores mean: adding an adjective rating scale, Journal of usability studies 4 (3) (2009) 114–123. A.E.I. Brownlee, J.A. Wright, Constrained, mixed-integer and multi-objective optimisation of building designs by NSGA-II with fitness approximation, Appl. Soft Comput. 33 (Supplement C) (2015) 114–126, http://dx.doi.org/10.1016/j.asoc. 2015.04.010. E. Nault and M. Andersen, On the Potential of Regression Models for Predicting the Energy and Daylight Performance of Buildings, Submitted to Journal of Building Performance Simulation (under review).