Computing in Civil Engineering (2013)
A Pedagogical Benchmark Experiment for Application of Multidisciplinary Design Optimization in Early Stage Building Design

Shih-Hsin Eve Lin1 and David Jason Gerber2

1 Ph.D. Candidate, School of Architecture, University of Southern California, Watt Hall, Suite 204, Los Angeles, CA 90089-0291; email: [email protected]

2 Assistant Professor, School of Architecture, University of Southern California, Watt Hall, Suite 204, Los Angeles, CA 90089-0291; email: [email protected]

ABSTRACT

This research is built upon a previously established early stage multidisciplinary design optimization (MDO) framework, entitled Evolutionary Energy Performance Feedback for Design (EEPFD), and proceeds with observing the impact of EEPFD on the early stages of design by conducting a pedagogical benchmark experiment. This experiment has two main observational interests. The first objective is to observe discrepancies between the human and automated decision-making processes and the resulting performance of the solution space from each process. The second objective is to understand students' ability to translate their design intent into a parametric model, as this is a crucial component in the implementation of EEPFD. By comparing EEPFD and the benchmark pedagogical process, this paper provides an initial assessment of the potential of EEPFD to reduce latency in decision making and to find superior performing design solutions compared to the benchmark process. At the completion of this experiment it was observed that EEPFD was able to deliver superior performing design solution spaces, but that students encountered difficulties in the translation of their design intent into a functioning parametric model.

INTRODUCTION + RESEARCH OBJECTIVE

Buildings consume nearly half (49%) of all energy used by the United States. Building Operations alone account for 43.5% of U.S. energy consumed today while construction and building materials account for an additional 5.5% (Architecture 2030 2011). However, the overall performance of the building is greatly impacted by design decisions made in the early stages of the design process, when design professionals are often unable to explore design alternatives and their impact on energy consumption (Schlueter and Thesseling 2009, Flager et al. 2009).

Research precedents have demonstrated the potential of adopting MDO to provide a performance feedback loop for supporting early design stage decision making (Flager et al. 2009, Welle, Haymaker, and Rogers 2011). However, precedents exploring MDO in the AEC field have typically employed simplified geometry (Flager et al. 2009, Welle, Haymaker, and Rogers 2011), while precedents involving more complex geometry have limited themselves to single domain optimization (Yi and Malkawi 2009). Where the energy performance domain has been included for optimization, the relationship between design form and energy performance has been largely excluded. Furthermore, how the approaches of these precedents apply to the overall design process remains largely unexplored.

In response to this gap in existing research a MDO design framework was developed that could incorporate both conceptual energy analysis and exploration of complex geometry for the purpose of providing early stage design performance feedback. Subsequently, the established MDO design framework can then be applied to the overall design process where its impact can then be observed.

The established MDO design framework, EEPFD ("Evolutionary Energy Performance Feedback for Design"), utilizes the prototype tool H.D.S. Beagle, which enables the coupling of parametric design with multi-objective optimization (Gerber and Lin 2012a, Gerber et al. 2012, Gerber and Lin 2012b). EEPFD specifically enables the exploration of varying degrees of geometric complexity with energy simulation feedback. Spatial programming compliance and financial performance are also provided for consideration in performance tradeoff studies. EEPFD has demonstrated the ability to reduce design cycle latency; to automate the design exploration, analysis, and evaluation process; and to provide design alternatives of improving performance. In addition, the established MDO framework has begun testing in pedagogical and practical design process settings (Gerber and Lin 2012a, Gerber et al. 2012, Gerber and Lin 2012b).

This paper presents the impact of EEPFD on the design process through a benchmark pedagogical experiment. The pedagogical experiment is conducted within a design computation tool course whose students are in the process of learning both simulation tools and their application to design. The first interest of this experiment set is to observe any measurable effects of introducing EEPFD without the automation enabled by the Beagle. The second primary observational interest is the students' ability to translate their design intent into a parametric model for further exploration.

INTRODUCTION OF EEPFD + H.D.S. BEAGLE

EEPFD, an MDO-based design framework, was developed in parallel to H.D.S. Beagle and can be considered a proposed means of applying the concepts driving the Beagle directly to the design process. Realizing EEPFD required selecting the platforms on which to implement the GA-based multi-objective optimization (MOO) algorithm. To this end a prototype tool (H.D.S. Beagle) was developed as a plugin for Autodesk® Revit® that integrates Autodesk® Green Building Studio® (GBS) and Microsoft® Excel® to generate the desired automation and optimization routine.
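As a simplified illustration of what such a GA-based routine does, the toy loop below evolves a population of candidate parameter values against a score function. This is an assumed sketch, not the Beagle's implementation: the real tool evaluates three objectives through Revit, GBS, and Excel, and ranks candidates by Pareto dominance rather than a single score.

```python
import random

def evolve(initial_pop, score, mutate, generations, seed=0):
    """Toy evolutionary loop: each generation, mutate every candidate,
    then keep the best-scoring half of parents plus children
    (higher score = better)."""
    rng = random.Random(seed)
    pop = list(initial_pop)
    for _ in range(generations):
        children = [mutate(parent, rng) for parent in pop]
        pop = sorted(pop + children, key=score, reverse=True)[:len(initial_pop)]
    return pop

# Toy usage: search for the x that maximizes -(x - 3)^2, i.e. x near 3.
best = evolve(
    initial_pop=[0.0, 10.0],
    score=lambda x: -(x - 3.0) ** 2,
    mutate=lambda x, rng: x + rng.uniform(-1.0, 1.0),
    generations=50,
)[0]
```

The elitist selection (parents compete with children) guarantees the best score never worsens between generations, which is the property that lets the Beagle's solution pool only improve with additional runtime.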

Autodesk® Revit® is a building information modeling platform with parametric capabilities enabling designers to define their geometry while providing a series of parameters that impact the development of varying geometric configurations. This platform also serves as an insertion point for the energy settings necessary for a conceptual energy analysis through Autodesk® Green Building Studio® (GBS). GBS is a web-based energy analysis service that serves as the energy simulation engine for the prototype. Microsoft® Excel® 2010 provides not only a means of containing the financial parameters and formulas, but also serves as a user interface proxy in which designers can set up design parameter ranges of interest, constraints, spatial program parameters, and the spatial programming compliance formula. The three objective functions can be expressed as follows:

Sobj = Max. SPC
Eobj = Min. EUI
Fobj = Max. NPV

where:
Sobj = Spatial Programming Compliance objective function
Eobj = Energy Performance objective function
Fobj = Financial Performance objective function
SPC = Spatial Programming Compliance score
EUI = Energy Use Intensity
NPV = Net Present Value
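To make the three objectives concrete, the sketch below scores a design alternative with simple closed-form stand-ins. These formulas are illustrative assumptions only: in EEPFD the EUI comes from a GBS conceptual energy analysis and the SPC and NPV scores from the Excel calculator, not from expressions this simple.

```python
def eui(annual_energy_kbtu, floor_area_sqft):
    """Energy Use Intensity in kBtu/sqft/yr (Eobj: minimize)."""
    return annual_energy_kbtu / floor_area_sqft

def spc(achieved_areas, target_areas):
    """Spatial Programming Compliance score (Sobj: maximize), assumed here
    as 100 minus the summed percentage deviation from the program targets."""
    deviation = sum(abs(a - t) / t for a, t in zip(achieved_areas, target_areas))
    return 100.0 * (1.0 - deviation)

def npv(cash_flows, discount_rate):
    """Net Present Value (Fobj: maximize); cash_flows[0] is the
    year-0 initial cost, usually negative."""
    return sum(cf / (1.0 + discount_rate) ** year
               for year, cf in enumerate(cash_flows))
```

Because the three objectives pull in different directions (a larger massing may improve SPC and NPV while worsening EUI), no single alternative generally optimizes all three, which is why tradeoff evaluation is needed.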

A more detailed description of the adopted method that drives EEPFD and the automated engine of H.D.S. Beagle can be found in previously published research (Gerber and Lin 2012a, Gerber et al. 2012, Gerber and Lin 2012b).

EXPERIMENT DESIGN & METHOD

This experiment was conducted by the authors through the course entitled Arch507: Theories of Computer Technology during the Spring Semester of 2012 at the School of Architecture of the University of Southern California (SoA USC). The course is offered to both undergraduate and graduate students in the architectural design, building science, structural engineering, and construction management disciplines. A total of 27 students participated in the experiment: 17 Master of Architecture candidates, 8 Master of Building Science candidates, 1 undergraduate student pursuing a Minor in Architecture, and 1 Master of Building Science graduate.

The pedagogical experiment is designed as a benchmark case for the evaluation of EEPFD. To this end the simulation process used by EEPFD requires adjustment to be suitable for manual use within the pedagogical experiment. Figure 1 illustrates both the simulation process used by EEPFD and the adjusted process designed for the pedagogical context. The significant difference in the simulation process stems from the fact that the pedagogical experiment is run in a classroom setting where students are not granted real-time or parallel access to the Beagle, in particular its cloud-based architecture and automation. Therefore, the last three steps must be performed manually by the participants rather than automated through the GA-based multi-objective optimization process provided through the Beagle.

After determination of the simulation process, the research organizes the experiment into three major activities: 1) course lecture and hands-on practice by students; 2) an assignment; and 3) obtaining data from students for analysis and cross comparison. All activities are designed to utilize the same platforms as EEPFD so students' exploration results can be considered comparable to results obtained through EEPFD. As a result, the contents of the lecture and assignment are divided into four major portions:

1) Parametric Modeling: the method by which to create a parametric model using Revit's conceptual massing environment.
2) Objective Function Calculation: the method by which to conduct a conceptual energy analysis, identify relevant information within the Revit model, and input this information into the Excel objective function calculator to calculate the three objective functions: EUI, SPC, and NPV.
3) Design Exploration: the process through which students can manipulate their parametric model and explore their design alternatives.
4) Design Evaluation: the method of evaluating design alternatives according to the Pareto Rank Evaluation method (Gerber et al. 2012).
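The Pareto rank evaluation over the three objectives can be sketched as non-dominated sorting. The implementation below is an assumed minimal version, not the published method: SPC and NPV are maximized and EUI minimized, so EUI is negated to put all axes on a larger-is-better footing.

```python
def dominates(a, b):
    """True if point a is at least as good as b on every objective
    and strictly better on at least one (all axes: larger is better)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_ranks(scores):
    """Rank each (SPC, EUI, NPV) alternative: rank 0 = Pareto optimal,
    rank 1 = optimal once rank-0 points are removed, and so on."""
    pts = [(s, -e, n) for s, e, n in scores]  # negate EUI: minimize -> maximize
    ranks = [None] * len(pts)
    remaining = set(range(len(pts)))
    rank = 0
    while remaining:
        # The current front: points dominated by no other remaining point.
        front = {i for i in remaining
                 if not any(dominates(pts[j], pts[i])
                            for j in remaining if j != i)}
        for i in front:
            ranks[i] = rank
        remaining -= front
        rank += 1
    return ranks
```

Alternatives that trade one objective against another (say, better EUI but worse NPV) end up in the same front, which is how the method surfaces tradeoffs rather than a single winner.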

Figure 1. Simulation process outline for EEPFD and the pedagogical experiment

Figure 2. The provided design requirements, parametric model, and ranges for Part A of the pedagogical benchmark experiment.

After the lecture, students are asked to complete an assignment following the adjusted simulation process as instructed. The assignment is divided into two primary parts: Part A and Part B. Both parts are given identical design requirements to ensure comparability of results to those generated by EEPFD. However, in Part A of the assignment students are asked to explore a given parametric design, as illustrated in Figure 2, with the objective of identifying a higher performing design alternative within the given parametric ranges. In Part B students are asked to generate their own design intent and then translate this intent into a parametric model. At this point students are instructed to utilize a total of five form-driving parameters. FloorNumber and SiteSetback are required, leaving three fully customized form-driving parameters unique to each student's design. This allows for observation of the translation of design intent into a parametric model, a critical component of EEPFD. For both Part A and Part B, the types of data recorded are described in Table 1.

Table 1. Summary of the recorded data for the overall pedagogical experiment

Background Questionnaire
1. Education background (Enumeration)
2. Experience using Revit (Number: time)
3. Experience using other parametric software (Y/N; Number: time; Enumeration)

Part A & Part B
1. Explored parametric values for each iteration (Number)
2. The three objective function scores of explored design alternatives (Number: EUI, SPC, NPV)
3. Time spent to obtain and calculate each objective function (Number: time)
4. Time spent to obtain the final design (Number: time)
5. Design exploration process (Image)

Additional Recorded Data in Part B
1. Three design intents prior to modeling their own parametric model (Description)
2. Design parameterization process map (Image)
3. Parametric model design (Revit file)
4. Created parameters (Enumeration)
5. Initial value and variation range of each parameter (Numbers)
6. Ability of parameters to represent desired design intent (Y/N; Percentage)

EXPERIMENT RESULTS & OBSERVATIONS

The following is a selected summary of the results extracted from the recorded data as relevant to the previously described objective of this paper. During Part A students recorded an average exploration time of 3 hours, 5 generated iterations, and approximately 16.4 minutes per iteration. Therefore, for qualitative comparison purposes, the Beagle was given the same time period in which to generate results for the same design problem. However, in the interest of exploring the capability of the Beagle when provided extended resources, the Beagle was also given a 7 hour runtime and asked to reach 6 generations in separate runs.

For comparison purposes there are two result sets of interest. The first compares the range of the student-generated solution space (SGSS) with the Beagle-generated solution space (BGSS) within the designated 3-hour time period. The second further includes the BGSS produced by the 7-hour and 6-generation runs. Table 2 provides a summary of the performance ranges of the solution pools generated by these result sets.

When comparing the solution pools generated by students and the Beagle under the same time constraints, the Beagle was able to provide a solution pool with a 26.8% increase in the measured NPV and a 13.7% reduction in the calculated EUI. However, the student solution pool was able to provide a 22.2% higher SPC score. When ranked according to all three objective performances, 36.7% of the design alternatives generated by the Beagle were designated Pareto optimal solutions, while only 26.9% of the student-generated design alternatives received this designation. Further improvement in the performance of the solution pool was observed when the Beagle was provided an extended period of time, reflective of the potential availability of increased computational affordances.
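Pool-to-pool percentages of the kind quoted above can be computed with simple best-value arithmetic. The helper below is an illustrative assumption about how such deltas are formed (best value per pool, direction set by whether the objective is maximized or minimized), not the paper's exact procedure.

```python
def best_improvement_pct(base_pool, new_pool, maximize=True):
    """Percent by which new_pool's best value improves on base_pool's best.
    For maximized objectives (NPV, SPC) 'best' is the max; for minimized
    objectives (EUI) it is the min, and a decrease counts as improvement."""
    if maximize:
        base, new = max(base_pool), max(new_pool)
        return 100.0 * (new - base) / abs(base)
    base, new = min(base_pool), min(new_pool)
    return 100.0 * (base - new) / abs(base)
```

For example, a pool whose best NPV rises from 100 to 150 shows a 50% improvement, while a pool whose best EUI drops from 60 to 50 shows roughly a 17% improvement.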

Table 2. Performance comparison of solution spaces generated by the pedagogical experiment and the Beagle after completed runtimes of 3 hours, 7 hours, and 6 generations.

SOLUTION SPACE RANGE            SGSS    BGSS 3 HRS   BGSS 7 HRS   BGSS 6th Gen
NPV (Million $)     MIN.         -48        -41          -41          -41
Initial: -41        MAX.         341        432          584          835
                    Improved     382        473          626          875
EUI (kBtu/sqft/yr)  MIN.          67         59           59           56
Initial: 174        MAX.         292        233          233          233
                    Improved   107.0      115.1        115.1        117.8
SPC                 MIN.        -151       -134         -266         -404
Initial: 10         MAX.          99         81           83           88
                    Improved      90         72           73           78
Pareto % (Students/Beagle,
3 OBJ)                         26.9/36.7   22.4/40.0    19.4/37.2

In Part B of the pedagogical experiment, data from 25 students was collected. Students spent an average of 5.5 hours on this part, with a range of 1 to 13 hours. Recorded time includes both initial model setup time and design exploration time through a minimum of four design iterations. At the completion of the experiment students were asked to gauge their success in translating their design intent into a parametric model. Overall, 86% of the students responded that they felt they had successfully translated their design intent into a parametric model. However, this rate of success was not reflected in the overall evaluation of the received models.

Table 3 provides a summary of the overall assessment of the parametric models received from students at the end of the experiment. Parameter Accuracy pertains to the quality and capability of all student-provided parameters, assessed by observing the correlation between each parameter and its actual impact on the design geometry. Model Robustness determines whether or not a student's design model is able to maintain its integrity throughout the design exploration process; this assessment is made by evaluating the parametric ranges and rules provided by each student for their individual design model. Driving Parameter Compliance evaluates whether students followed the given instructions regarding their form-driving parameters, ensuring a consistent, comparable quantity of explored parameters among the students' design models. Accuracy of Calculated Data validates student-generated results by confirming both proper model setup for simulation purposes and calculations based on simulated results. Finally, an overall evaluation was made to assess the quality of the student-provided models.


One key observation concerns the Accuracy of Calculated Data results, which revealed a significant degree of human error in the calculation process. In addition, only 40% of the students were able to generate a parametric model accurate enough to produce reliable objective scores for their designs. Furthermore, the question of how to ensure Model Robustness requires a direct response.

Table 3. Evaluation summary of students' parametric model performance

Evaluated Category                         Scale                    Summary
Parameter Accuracy                         Poor/Acceptable/Good     P: 36%  A: 24%  G: 40%
Model Robustness: Rule                     Y/N                      Y: 56%  N: 44%
Model Robustness: Range                    Y/N                      Y: 0%   N: 100%
Driving Parameter Compliance               Y/N                      Y: 48%  N: 52%
Accuracy of Calculated Data: Setup         Y/N                      Y: 56%  N: 44%
Accuracy of Calculated Data: Calculation   Y/N                      Y: 48%  N: 52%
Final Evaluation                           A-D                      A: 0%  B: 56%  C: 24%  D: 20%

CONCLUSION

During the comparative studies of the experiment, EEPFD demonstrated the ability to generate superior results under the same time constraints as human users, and further improved results within the designated performance objectives when given extended time. Given that time typically dominates early design exploration, it can be extrapolated that reducing the computation time necessary to generate desired results would further suit the framework to the early stage design process. In addition, it was observed that EEPFD, through the Beagle, is able to negate the human error observed in the manual calculation of the objective scores. It can be extrapolated that with increasingly complex design problems there is a greater potential for human error and therefore an increased benefit from the automated process available through EEPFD. This would enable more informed early stage design decision making for even more complicated design projects, which is a subject for future studies.

However, these advantages depend on the prerequisite of a functioning parametric model suitable for exploration through EEPFD. One initial observation regarding the translation of design intent into a parametric model was the disparity between the success perceived by students and actual functioning results compatible with EEPFD and H.D.S. Beagle. While students considered themselves successful in translating their design intent into a parametric model, the quality of the resulting parametric models and their ability to reflect the described design intent were typically found lacking in critical components. Increased experience in composing a functioning parametric model may close this disparity and is in need of further research.

One element that defies parametric translation is aesthetic preference, which differs widely between individuals. As a result, designers will typically expend resources on optimizing only those potential design solutions that first satisfy this preference. As the exploration process of EEPFD possesses no aesthetic preference, it equally possesses no aesthetic prejudice. While it may spend time analyzing solutions that will ultimately be dismissed by the designer, it equally analyzes solutions potentially overlooked by the designer. The result is a broader design solution pool with overall improved multi-objective performance levels, enabling more informed design decision making inclusive of a more expansive simulated aesthetic and formal range.

ACKNOWLEDGEMENT

The work was in part supported by funding from the USC School of Architecture Junior Faculty Research Fund and in part by Autodesk, Inc. The authors thank the USC Dean of Architecture Qingyun Ma and the junior faculty research grant program; Ms. Bei “Penny” Pan our initial lead software developer; Junwen Chen, Ke Lu, Shitian Shen, Yunshan Zhu for their continued software development; Prof. Kensek and her class for providing experiment materials; Laura Haymond for her participation and her review; and Autodesk Inc. for their generous support within the IDEA Studio program.

REFERENCES

Architecture 2030. 2011. Energy - Buildings consume more energy than any other sector [cited 18 April 2012]. Available from http://architecture2030.org/the_problem/problem_energy.

Flager, Forest, Benjamin Welle, Prasun Bansal, Grant Soremekun, and John Haymaker. 2009. "Multidisciplinary process integration and design optimization of a classroom building." Information Technology in Construction no. 14 (38):595-612.

Gerber, David Jason, and Shih-Hsin E. Lin. 2012a. Designing-in performance through parameterisation, automation, and evolutionary algorithms: ‘H.D.S. BEAGLE 1.0’. Paper read at CAADRIA 2012, 25-28 April 2012, at Chennai, India.

Gerber, David Jason, and Shih-Hsin Eve Lin. 2012b. Synthesizing design performance: An evolutionary approach to multidisciplinary design search. Paper read at ACADIA 2012 - Synthetic Digital Ecologies 18-21 October 2012, at San Francisco, California, USA.

Gerber, David Jason, Shih-Hsin Eve Lin, Bei Penny Pan, and Aslihan Senel Solmaz. 2012. Design optioneering: Multi-disciplinary design optimization through parameterization, domain integration and automation of a genetic algorithm. Paper read at SimAUD 2012, 26-30 March 2012, at Orlando, Florida, USA.

Schlueter, Arno, and Frank Thesseling. 2009. "Building information model based energy/exergy performance assessment in early design stages." Automation in Construction no. 18 (2):153-163. doi: 10.1016/j.autcon.2008.07.003.

Welle, Benjamin, John Haymaker, and Zack Rogers. 2011. "ThermalOpt: A methodology for automated BIM-based multidisciplinary thermal simulation for use in optimization environments." Building Simulation no. 4 (4):293-313. doi: 10.1007/s12273-011-0052-5.

Yi, Yun Kyu, and Ali M. Malkawi. 2009. "Optimizing building form for energy performance based on hierarchical geometry relation." Automation in Construction no. 18 (6):825-833. doi: 10.1016/j.autcon.2009.03.006.
