Building and Environment, Vol. 28, No. 4, pp. 419-427, 1993. Printed in Great Britain.
0360-1323/93 $6.00 + 0.00. © 1993 Pergamon Press Ltd.
Assessing Building Performance by Simulation

J. A. CLARKE*

This paper characterizes the state-of-the-art in the assessment of a building's performance by computer simulation. Theoretical and application aspects are outlined and current developments are described which will help to ensure that simulation can be effectively applied in practice by designers with limited knowledge of the technology. Remaining problems are identified.
INTRODUCTION
BUILDING designers have traditionally employed a wide range of methods to ensure that the performance characteristics of a building will be acceptable [1]. Typically, simple (and sometimes not so simple) calculations and rules of thumb are applied throughout the design process in an attempt to minimize heat loss, maximize the utilization of solar energy, prevent inefficient plant operation, ensure high comfort satisfaction, control problematic air movement and so on. In recent years researchers have stressed the futility of attempting to optimize, in such a piecemeal manner, a system which is inherently dynamic, in that many parameters change over time and at different rates; inherently non-linear, in that some parameters depend on the system state which, in turn, can only be assessed if these parameters are known; and inherently systemic, in that the different heat transfer mechanisms interact in a complex manner. In an attempt to address this complexity and provide effective design decision support, the subject of building simulation has received growing attention in recent years.

In theory, the use of a simulation model allows the designer to adopt a prototype and test approach so that information on system performance can be used in the selection of appropriate design options. In practice, the application of simulation is problematic because of technological shortcomings in the applications, especially at their interface, and because of the absence of any standards for problem definition, appraisal and evolution.

A simulation model will usually attempt to represent the spatial and temporal aspects of a building and its environmental control systems in an exacting manner, although some processes may be simplified because of state-of-the-art limitations or because they are unnecessary in terms of the intended application scope of the model.† Validation issues aside, as the integrity (relative to the reality) of the model improves so does its range of applicability. What is required is either a high integrity simulation model of general applicability or a mechanism to construct bespoke models (of appropriate detail) whose form matches the characteristics of the design problem in hand. The former approach is represented by the growing number of systems now reaching commercial maturity while the latter approach is the goal of emerging program building environments as described later.
THE BUILDING AS A COMPLEX SYSTEM
Consider Fig. 1, which summarizes the problem domain. Within a simulation model such a system is often thought of as a network of time varying resistances and capacitances subjected to varying temperature differences. Rooms and constructional elements are volumes of fluid and solid matter characterized by thermophysical properties such as conductance and capacitance, and variables of state such as temperature and pressure. Since these properties and state variables vary with time, the problem is a dynamic one: regions of differing time constants competing to capture, store and release energy at different rates. Superimposed on this network is a complex set of interacting energy impulses as caused by the physical processes associated with surface convection, longwave radiation exchange, shortwave flux, fluid flow, casual gains, mechanical plant, moisture transport and the range of other complexities associated with distributed control actions, passive solar devices and occupant interactions. In attempting to accommodate this complexity two issues have been the subject of intensive research in recent years: the form of the underlying mathematical models and the elaboration of a validation methodology.
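As an illustration of the resistance-capacitance view described above, the following sketch (not part of the original work; all parameter values and the function name are invented for illustration) steps a single zone, lumped into one capacitance with one resistance to outside, through an explicit time-marching scheme:

```python
# Minimal sketch: a one-node lumped-capacitance zone treated as an
# RC network, stepped with explicit Euler. Values are illustrative only.

def simulate_zone(T0, T_out, R, C, q_gain, dt, steps):
    """Return the zone air temperature trajectory (degrees C).

    R      : overall resistance to outside [K/W]
    C      : lumped zone capacitance [J/K]
    q_gain : constant casual/plant gain [W]
    """
    T = T0
    history = [T]
    for _ in range(steps):
        # Energy balance: C dT/dt = (T_out - T)/R + q_gain
        T += ((T_out - T) / R + q_gain) * dt / C
        history.append(T)
    return history

# Illustrative run: the zone relaxes towards the outside temperature,
# offset by the casual gain, at a rate set by the time constant RC.
traj = simulate_zone(T0=20.0, T_out=0.0, R=0.01, C=5e6,
                     q_gain=500.0, dt=60.0, steps=1440)
```

A real simulator replaces this single node with a large network of such balances, one per discretized region, solved simultaneously.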
*Energy Simulation Research Unit, Department of Mechanical Engineering, University of Strathclyde, Glasgow G1 1XJ, Scotland.
† Many contemporary building energy models are hybrids in that simulation techniques are mixed with decoupled, often simplified algorithms.
Mathematical models
At the core of all simulation models is a mathematical representation of the heat transfer processes occurring within the buildings/plant system. Typically, the approach is to reduce the real system to some discrete equivalent
and to represent each discretization* by equations which govern the conservation of a quantity of interest such as energy, mass and momentum. Stated simply, the problem then becomes one of estimating the parameters which comprise the coefficients of these equations, arranging for efficient mathematical models for each of the heat transfer subsystems--such as shortwave flux distribution, buoyancy driven surface convection, temperature dependent conductivity, etc.--and devising numerical schemes well adapted to the problems associated with the solution of large, sparse and 'stiff'† equation systems. Contemporary programs such as BLAST, DOE-2, ESP, HVACSIM+, SERI-RES, SPANK, TAS and TRNSYS (among many) are able, with varying degrees of accuracy, to represent constructional conduction, air movement, shortwave distribution and occupant effects. The state-of-the-art is fast approached, however, when such simulators are used to model problems involving real world scale and complexity. One reason for this is that most models have an incomplete 'library' of parts from which problems can be defined. Consequently a significant research effort is being devoted to the construction of 'neutral' models (of plant components and the like) which can be used by several programs [2].
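The conservation-equation approach can be sketched for the simplest case of one-dimensional wall conduction. The following is an illustrative scheme only (not that of any named program): one energy balance per control volume, advanced with an implicit backward-Euler step, which remains stable when neighbouring regions have very different time constants:

```python
# Sketch: backward-Euler discretization of 1-D conduction through a wall,
# one conservation equation per control volume, solved as a tridiagonal
# system. All values are illustrative assumptions.

def thomas(a, b, c, d):
    """Solve a tridiagonal system; a = sub-, b = main, c = super-diagonal."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def step_wall(T, alpha, dx, dt):
    """One implicit conduction step; end nodes held at fixed temperatures."""
    n = len(T)
    r = alpha * dt / dx ** 2
    a = [0.0] + [-r] * (n - 2) + [0.0]             # flux from left neighbour
    b = [1.0] + [1.0 + 2.0 * r] * (n - 2) + [1.0]  # storage + conduction
    c = [0.0] + [-r] * (n - 2) + [0.0]             # flux from right neighbour
    return thomas(a, b, c, list(T))

# Illustrative run: a 5-node wall relaxes towards a linear steady profile.
T = [20.0, 10.0, 10.0, 10.0, 0.0]
for _ in range(200):
    T = step_wall(T, alpha=1e-6, dx=0.05, dt=600.0)
```

In a whole-building model the matrix is far larger and sparse, and the equation sets for air, surfaces, constructions and plant are linked; the principle, however, is the same.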
Validation methodology

Given the growing sophistication of simulation programs, a most important issue facing effective application in practice is prediction accuracy. This issue has attracted growing attention in recent years and nowhere more so than within the Commission of the European Communities' PASSYS project [3], which built on the work of earlier workers (most notably [4, 5]) to produce a comprehensive validation methodology. The PASSYS methodology comprises four components, not all of which need be applied in a given context:

• Initial examination of a model's theory and a thorough inspection of the corresponding source code.
• Analytical verification involving a comparison of predictions with analytical solutions which apply to some well defined, usually simplified case.
• Inter-model testing involving a comparison of the target model with several other models which are usually better known to the validators or may have been subjected to a greater degree of previous testing.
• Empirical validation involving a comparison of predictions with measured data for the same problem.

Of these components perhaps the most important is empirical validation since it has the potential to both quantify prediction accuracy and indicate possible causes if poor. This component received most attention within the PASSYS project and has the following elements:

• Prior to any experiment, an estimate of the likely principal factors is obtained to ensure that the measured data's resolution is well matched to the program to be tested. This might be carried out by the target simulation program so that its inherent sensitivities become the focus of the experiment. The use of a simulation program will require:
--An accurate description of the experimental configuration (a test cell in the case of PASSYS) with measured parameters used where possible.
--A sensitivity analysis to assess the influence of the program's input parameters on predictions in order to determine sensor and accuracy requirements.
• Experimental implementation now follows and adheres to the requirements and constraints as identified in the preceding step. As the experiment proceeds, the recorded data will require careful logging, pre-processing, checking and documenting if the data-set is to approach high quality.
• Program/data comparisons can now proceed and will entail:
--An initial 'blind' run of the program using the carefully formed system model and the measured climatic data.
--Assessment of goodness-of-fit by means of parametric sensitivity analysis, used to estimate the uncertainty bands associated with the predicted time series.
--For cases in which dynamic aspects dominate and/or where more information is required on the cause of poor agreement, a statistically-based approach [6] can be employed which is based on an analysis of residuals (the difference between measurements and predictions). This entails estimating the autocorrelation function and power spectrum of the residuals and determining the cross-correlation functions between program inputs and these residuals in the time and frequency domains. Tests applied to these data can then yield information on which program inputs are mainly responsible for the residuals. In this way some insight may be obtained into which physical processes are not being well represented by the program. Where program deficiencies are exposed, remedial action can be taken.

* These discretizations may be small, as in the case of computational fluid dynamics and three dimensional numerical conduction models, or large, as in the case of a whole-construction response function approach.
† A stiff equation system is one in which the equations represent physical regions with dissimilar time constants differing by one or more orders of magnitude.
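The residual analysis step can be sketched as follows. This is an assumed, pure-Python illustration of the basic statistics involved (the actual PASSYS procedures also work in the frequency domain and at multiple lags):

```python
# Sketch of residual analysis: compare measured and predicted series, then
# correlate the residuals with a candidate driving input to suggest which
# physical process is poorly modelled. Function names are illustrative.

def residuals(measured, predicted):
    return [m - p for m, p in zip(measured, predicted)]

def autocorr(x, lag):
    """Autocorrelation of x at a given lag (biased estimator)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return cov / var

def crosscorr(x, y):
    """Zero-lag cross-correlation coefficient between input x and series y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((x[i] - mx) * (y[i] - my) for i in range(n))
    den = (sum((v - mx) ** 2 for v in x) *
           sum((v - my) ** 2 for v in y)) ** 0.5
    return num / den

res = residuals([21.0, 22.5], [20.0, 22.0])  # measured minus predicted
```

Where the residuals cross-correlate strongly with, say, the measured solar input, the program's shortwave processing becomes the prime suspect.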
Where deficiencies are not great, a pragmatic alternative is to employ the empirical data in 'program calibration' mode. By this means a program can be tuned to represent a system over a realistic range of operating conditions. The calibrated program can then be used, with caution, to extrapolate performance to other contexts by means of the PASSYS scaling and replication procedures [7].
As simulation techniques become more widely used as the basis of future design tools so the need for program accreditation will grow. Validation methodologies will be an essential part of accreditation, acting to ensure that, for a limited number of cases at least, the predictions from candidate programs are acceptable.

SIMULATION PRACTICE

Predicting the likely operational states of a building is a non-trivial task involving several complex issues as follows.
Problem abstraction

The starting point is usually to re-express the building design (real or imagined) in a manner suitable for simulation. This process will depend on whether the context is new-build or retrofit, on the availability of dimensional, constructional and operational data, and on the objectives of the analysis. Unlike tasks such as visual impact modelling--where the goal is to attain geometric realism--energy modelling can be carried out abstractly. For example, some complex shape may be simplified or rooms of similar temperature grouped together. Although there is no hard and fast rule, it is usually sufficient to preserve surface areas, aspects and contained volumes so that the heat conduction, thermal capacity and dominant spatial relationships are preserved. There is also a spectrum of approaches to the modelling of heating, cooling and ventilating systems. At one end of this spectrum a plant system might be modelled in terms of its effect on the building--for example by addressing only the time and space dependent flux inputs--while, at the other end, the energy exchanges within and between individual plant components might be tracked. As with the building side, the modelling approach taken will depend on the available data. At the present time the process of problem abstraction is ill-understood and no formal rules exist. With contemporary simulators a rule of thumb is to represent the problem as simply as possible in relation to the required outputs, but in a manner that can be extended and refined should the need arise. For example, this might mean that a building is reduced to a small number of representative rooms so that the possibility exists to model the whole building by simple replication.
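The grouping rule of thumb can be made concrete. In this sketch (data structure and field names are invented for illustration), rooms of similar temperature are merged into one representative zone while their exposed surface areas and contained volumes are summed, so that heat conduction and thermal capacity are preserved:

```python
# Sketch: merging rooms of similar temperature into one representative zone,
# preserving surface area (conduction) and volume (capacity).

def merge_rooms(rooms):
    """rooms: list of dicts with 'area' (m2, exposed surface) and 'volume' (m3)."""
    return {
        "area": sum(r["area"] for r in rooms),
        "volume": sum(r["volume"] for r in rooms),
    }

offices = [{"area": 12.0, "volume": 45.0},
           {"area": 9.5, "volume": 38.0}]
zone = merge_rooms(offices)  # one zone standing for both offices
```

Aspect (orientation) would, in practice, also be preserved by merging only surfaces of like orientation.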
Data model creation

The next stage is to organize the data defining the problem in the format dictated by the application. This is usually a time consuming task and one which is prone to error. For these reasons most applications will offer problem definition tools, usually in the form of a definition language or an interactive facility for data input. Even so, at the present time the prerequisite of efficient data entry is computer proficiency.
Boundary conditions

Before any modelling can commence, an essential task is to determine the types of climatic influence to which the building will be subjected. Typically, several annual collections of climate data will exist and the issue is one of typicality and severity. The starting point is usually to obtain statistics for the location in question and then to match these statistics to the climate data available. In this way it is possible to isolate design and typical summer and winter sequences which can be used to explore performance and identify improvement strategies. Once a robust scheme is arrived at, it is usually necessary to undertake longer term simulations in order to determine fuel consumption trends. This will require some means to select average data from the potentially large period of record. This process--the production of Test Reference Years--has already been undertaken for many countries and is relatively simple to repeat given long-term climate data.
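The selection of average data can be sketched as follows. The distance metric used here (absolute difference between a candidate month's mean and the long-term mean) is a deliberate simplification of the distribution-based statistics normally used in Test Reference Year production, and the data are invented:

```python
# Sketch: pick the most "typical" instance of a calendar month from a
# period of record, in the spirit of Test Reference Year construction.

def most_typical(monthly_means):
    """monthly_means: {year: mean temperature} for one calendar month."""
    longterm = sum(monthly_means.values()) / len(monthly_means)
    return min(monthly_means, key=lambda y: abs(monthly_means[y] - longterm))

# Illustrative record of January mean temperatures (degrees C).
januaries = {1988: 3.1, 1989: 5.4, 1990: 4.2, 1991: 2.0}
chosen = most_typical(januaries)
```

Repeating the selection for each month, and splicing the chosen months together, yields a composite reference year.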
A more problematic issue is the selection of data to characterize micro-climatic effects. In the absence of site specific data, perhaps the best approach is to incorporate an explicit representation of the local climatic modifiers in the problem definition. For example, surrounding trees may be defined as shading devices to mitigate solar gain and their position used as a factor in the selection of pressure coefficients to define the pressure effects of wind.
Performance assessment methodology

The next stage is to associate the chosen boundary conditions with the problem as described and to undertake the simulations. The subsequent results--in the form of time series profiles of temperature and heat flux throughout the system--then characterize system performance and, on analysis, can be made to yield strategies for intervention. This seemingly straightforward task is however fraught with difficulties. These stem from the size of the solution space and the fact that there exists a high number of causal interactions within energy systems. An obvious example relates to the window dimension parameter and the underlying trade-off between heat loss and solar gain: if an attempt is made to maximize the latter by increasing window size, then other design modifications will likely be required to effectively utilize the new-found resource while minimizing glare problems. A less obvious example might relate to a construction where insulation is added to reduce heat loss but which worsens load levelling by divorcing a zone's thermal access to capacity. Performance assessment methodologies (PAM) address these problems by laying down procedures by which good performance might best be pursued. In essence a PAM defines the procedural aspect of appraisal in which the modelling and decision making steps are placed in sequence. For example, in undertaking an overheating assessment of a building the following PAM might be used (actions in bold font, knowledge in italics):

1. Locate climate collection of appropriate severity.
2. Analyse collection against context-dependent warm day criteria to determine simulation period.
3. Undertake simulation using base-case design.
4. In terms of overheating, rank-order zones as measured by some suitable criterion.
5. Isolate worst offenders as a function of discomfort acceptability.
6. Analyse results to identify cause of discomfort.
7. Determine remedies by associating problem causes with design options.

The important point is that a PAM is model independent and, when used, can be attributed with alternative knowledge instances depending on the user's viewpoint or the (evolving) state-of-the-art. For example, one user may prefer to use predicted mean vote as a suitable criterion in step 4, while another may prefer an index such as standard effective temperature. Given a sufficient set of PAMs, it is entirely possible to determine, by simulation, the optimum combination of zone layout and constructional/control schemes which will provide the required level of environmental control at minimum fuel consumption. Several simulations are conducted to determine a zoning strategy which not only
satisfies the functional criteria, but also will accommodate sophisticated multi-zone control and the re-distribution of excess energy. Some simulations might focus on the choice of constructional materials and their relative positioning within the multi-layered constructions so that load and temperature levelling is maximized. And alternative facade fenestration and shading control features may be investigated in terms of comfort and cost criteria. Once a fundamentally sound design has emerged, well tested in terms of its performance under a range of anticipated operating conditions, a number of alternative control scenarios can be simulated. For example, basic control studies will lead to decisions on the potential of optimum start/stop control, appropriate set-back temperatures, the efficacy of weather anticipation, the location of sensors and the interrelation of thermal and visual comfort variables. Further analysis might focus on 'smart' control, where the system is designed to respond to occupancy levels or prevailing levels of luminance intensity. As the underlying relationships emerge, the designer is able to assess the benefits and the problems of any given course of action before it is implemented. Essentially the appraisal permutations are without limit. For example, a simulation model could be used to answer such questions as:

• What are the maximum demands for heating, cooling and illumination and where and when do they occur? What are the causal factors?
• What will be the effect of a particular design strategy, such as adopting 'super-insulation', specialist glazing systems or sophisticated control regimes?
• What is the optimum plant start time or the most effective algorithm for weather anticipation?
• How will thermal and visual comfort levels vary throughout the building under alternative lighting management schemes?
• How will infiltration or temperature stratification be affected by a particular management strategy and will condensation become a problem?
• What is the contribution to energy saving and comfort level of a particular passive solar feature?

The approach allows a designer to better understand the interrelation between design and performance parameters, to then identify potential problem areas, and so implement and test appropriate building, plant and control modifications. The resulting design is more energy conscious, with better comfort levels attained throughout.
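The model independence of a PAM--a fixed step sequence with swappable knowledge instances--can be sketched in code. All names and values here are hypothetical; the point is that the procedure is fixed while the ranking criterion is supplied by the user:

```python
# Sketch: steps 4 and 5 of the overheating PAM as a fixed procedure with a
# pluggable discomfort criterion (one user's predicted mean vote is another
# user's standard effective temperature). Data and names are illustrative.

def overheating_pam(zone_results, criterion, acceptability):
    """Rank zones by a discomfort criterion and isolate the worst offenders."""
    ranked = sorted(zone_results, key=criterion, reverse=True)
    return [z for z in ranked if criterion(z) > acceptability]

# Two criteria, one procedure.
def peak_temp(z):
    return z["peak_temp"]

def hours_over(z):
    return z["hours_over_27"]

zones = [
    {"name": "atrium", "peak_temp": 31.0, "hours_over_27": 14},
    {"name": "office", "peak_temp": 27.5, "hours_over_27": 2},
]
worst = overheating_pam(zones, peak_temp, acceptability=28.0)
```

Either criterion can be attributed to the same PAM without altering the procedure itself, which is precisely the model independence argued for above.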
Improving performance

While it is a relatively simple task to formulate PAMs, applying them in practice may well be problematic. This is because a simulation based approach to performance assessment will yield much data on a system's changing state over time, so that the analysis and design modification tasks become complex. For example, a results data-set of the order of 15,000 items might be generated in the case of a one zone, one day simulation. Decisions about acceptability, and about appropriate intervention, can easily become a process of random change. One solution to this problem is to develop a mechanism which ensures that the designer's attention is focused on
the issue(s) of greatest impact. Causal energy breakdown tracking is just such a mechanism. In the approach an energy balance is formulated for the overall building in terms of the consumption of the various systems and the supply to the various zones. From this, the points of greatest consumption and demand can be determined and new energy breakdowns formulated for them in turn. This process is then repeated until the required resolution is attained and an acceptable design modification selected. To illustrate the process, and with reference to Fig. 2, consider the following example.

• Assuming that a room was overheating, an energy breakdown for the air would yield information on convective plant flux (heating or cooling), shortwave flux absorption, zone coupled advection flux, surface convection flux and convective casual gain.
• Assuming that the surface convection flux was the dominant causal flowpath, an energy breakdown for the corresponding surface(s) would yield the relative contributions of constructional conduction, surface convection, radiant casual gain, shortwave flux, longwave flux and radiant plant.
• Assuming that the shortwave flux was now the dominant flowpath, an energy breakdown for the room windows would identify the worst offenders.
• A window energy breakdown would then identify the relative contributions of the incident direct and diffuse shortwave flux (at both external surfaces and at internal surfaces due to inter-reflections), the transmitted, reflected and absorbed components, the conductive and convective processes, the longwave radiation exchanges at internal and external surfaces, the radiant casual gains and any radiant plant interaction.
• At this level of resolution, informed decisions can be made on the most appropriate design change--for example the incorporation of a shading device or changing the glazing type--in order to lessen the impact of the offending shortwave gain flowpath(s).
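The descent just described can be sketched as a recursion over a tree of flowpaths. The nested-dictionary representation and all flux values below are illustrative assumptions, not the paper's data structures:

```python
# Sketch of causal energy breakdown tracking: at each level, find the
# dominant flowpath in the energy balance and descend into its breakdown.

def magnitude(node):
    """Total absolute flux represented by a breakdown node [W]."""
    if isinstance(node, dict):
        return sum(magnitude(v) for v in node.values())
    return abs(node)

def track(breakdown, path=()):
    """Follow the dominant flowpath until no finer breakdown exists."""
    if not isinstance(breakdown, dict):
        return path
    dominant = max(breakdown, key=lambda k: magnitude(breakdown[k]))
    return track(breakdown[dominant], path + (dominant,))

# Illustrative room-air energy balance for an overheating zone (W).
room_air = {
    "convective plant": 150.0,
    "surface convection": {
        "conduction": 80.0,
        "shortwave": {"window-1": 260.0, "window-2": 40.0},
        "longwave": 60.0,
    },
    "casual gains": 120.0,
}
worst_path = track(room_air)
```

Each element of the returned path names the breakdown to formulate next, terminating at the flowpath (here a particular window's shortwave gain) against which a design change can be targeted.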
More importantly, this change can be made in full knowledge that the impact will be beneficial. If necessary, further tracking of other components of the window energy breakdown will allow any other impacts of the proposed change to be assessed. In applying the technique it is important that the initial design hypothesis is not overly constrained by the incorporation of potentially conflicting energy saving devices. If this is achieved, and if the designer has a good appreciation of the spectrum of design and operational options, then the approach will ensure that the performance optimization will proceed in the direction of improvement. Of course it is left to the designer's judgement as to when improvements move out of the region of cost effectiveness.

Fig. 2. Causal energy breakdown tracking.

DESIGN PROCESS INTEGRATION

While the building profession's technology base is improving--in the form of simulation tools and IT products in general--most professionals are still constrained to traditional methods. This problem derives from the need for design tools to be powerful, comprehensive and first principle to adequately represent the real world complexity, while also being simple, straightforward and intuitive to facilitate user interaction. This conflict between power and ease of use is further exacerbated by the divergence of the conceptual outlook of the design orientated program users and the technically orientated program developers. To complete the confusion, there is the subtly different terminology of the various engineering professions. As with all new technologies the process of transferring application know-how into practice is an important step. There are two approaches to effective technology transfer. The first is to employ traditional means to inform designers about the capabilities and operational requirements of simulation. The second is to employ technology itself to engineer intelligent interfaces so that the design tools possess knowledge of the user and the user's tasks. These approaches are summarized in Fig. 3. It is in these two areas that major advances may be expected in the short to medium term through:

• The establishing of professional organizations and design advice schemes able to provide support material in the form of databases, modelling case studies, program documentation and the like.
• The elucidation of a conceptual framework for building modelling which enables disparate applications to co-exist and access a central data (product) model. • The development of more intuitive human-computer interfaces with embedded domain, application and user knowledge.
Fig. 3. Approaches to technology transfer.
• The adoption of new approaches to software engineering which allow a task sharing approach to application design and reduce the time to market.
Traditional approach to technology transfer

In essence this approach entails the provision of learning and application support by means of three well established routes.
Education and Training. Many higher education institutions (HEIs) are recognizing the need for new learning material to inform and train professionals in the emerging technologies. Further, with the improving availability of powerful workstations, there is a realization that the technology itself can be used as the means of delivery. New ideas are therefore emerging, centred on the notions of computer aided instruction (CAI) and computer based training (CBT), which attempt to enhance knowledge (facts and skills) transfer between tutor and student by allowing the latter to access courseware at the place of learning and within the work context. At the University of Strathclyde, for example, the ESP-r system has been configured in CBT mode so that students can learn about energy conscious building design as they acquire skills in simulation practice. Clearly the pedagogical potential of CAI/CBT is immense, not least because students are encouraged to develop their own learning skills and because the approach offers a mechanism for 'distance' learning, thus releasing tutors for other activities such as remedial work.

Design advice and consultancy. While the importance of education and training must not be underestimated, any real benefit will transpire only after HEIs have solved the problematic issues of courseware development, teaching practice review and technology provision. In the interim period there exists a need to make available the benefits of simulation without requiring designers to invest in an immature technology. One approach to this is to establish a simulation-based design advice service. This was the approach adopted by the Royal Incorporation of Architects in Scotland in the late 1980s when they established the Energy Design Advisory Service (EDAS) [12].
Following on from an initial (free) consultation, the service is empowered to provide a 50% subsidy to clients who progress to a detailed technical evaluation, undertaken by approved consultants and, in most cases, with the help of simulation techniques. In this way EDAS transfers knowledge of the benefits of simulation while not requiring designers to access the technology directly. Based on a cost benefit analysis, the EDAS scheme is now being extended to cover the entire U.K.

Professional associations. Recent years have seen the birth of an international association concerned with building simulation--the International Building Performance Simulation Association or IBPSA. The mission statement of IBPSA is:

... to advance and promote the science of building performance simulation in order to improve the design, construction, operation and maintenance of new and existing buildings world wide.
In the U.K. the BEPAC group (Building Environmental Performance Analysis Club) was formed to pursue similar objectives and as an organization is growing steadily.
Technological approach to technology transfer

This approach involves the engineering of intelligent interfaces and the adoption of approaches to the development of applications which encourage the involvement of the end users.
Intelligent front-ends. Simulation suffers from at least two fundamental problems: the quantity and nature of the data being manipulated and the expertise and conceptual outlook of the user. In recent years advances in intelligent knowledge based system and human-computer interface techniques offer some scope for alleviation of these problems. Using these techniques it is possible to construct a user interface which incorporates a significant level of knowledge in relation to the domain being addressed (here building design), the applications being used (here energy/environmental simulation) and the user's objectives and machine interaction preferences. Such an interface facilitates a more co-operative dialogue which, in turn, allows designers to more easily define their problem and initiate and control the required evaluations. Clearly an intelligent interface (II) must be able to deal with different user types. For example, a designer operating at an early design stage might require an application that can deal with abstract concepts, incremental and exploratory definition of the issues, lack of focus, tentative data, missing information and so on. Such a user normally requires an indication of performance, notification about potential problem areas and data on the appropriateness of design alternatives. Another designer, with specialist knowledge of a particular application package, and access to more detailed information on the design hypothesis, might only require the II to provide assistance with data preparation (through the provision of standard data and sensible defaults) and to support the evaluation definition. These two user types are quantitatively and qualitatively distinct, with interface construction becoming progressively more difficult as the user's concepts become less concrete. In addition to the user type issue, IIs must address at least three distinct facets of the modelling process: data input, application control and result interpretation.
Building performance applications require large quantities of data to describe the problem and define the necessary evaluations. Not only is gathering these data a time consuming task, but frequently the data are unavailable or uncertain. Also, due to the complex interrelationships, ensuring the integrity of the data can demand high levels of understanding of the application's theory and/or mode of operation. Generally, control of the application does not require sophisticated user interaction. The major difficulty is the identification of an appropriate performance assessment methodology and the selection of the computational parameters to produce a sufficient quality and quantity of output to allow a meaningful appraisal of performance. Result interpretation is the element which usually differentiates most between the requirements of the novice and the expert.
The expert will be trying to detect patterns in, and relationships between, the different building parameters, in an attempt to isolate the dominant causal factors. The novice merely requires a concise summary of performance, preferably in terms of those parameters which are most meaningful to the design team and client. To tackle these issues an II will involve the following elements:

• Dialogue handling to converse with the user in the appropriate terminology and by means of acceptable concepts.
• Knowledge handling to control the dialogue, verify user entries, make inferences and assess the level of help required.
• User handling to track the user's progress and ensure the system responds in an appropriate manner.
• Appraisal handling to coordinate the application programs in terms of meaningful performance assessment methodologies.
• Data modelling to create, from the information supplied by the user and inferred, the problem description as required by the application program.
• Control and traceability involving the collection, organization and storage of the session chronicle.
• User conceptualizing in relation to the different user types and levels of expertise.
• Human-computer interaction in a manner that can satisfy the user's interface preferences.
• Application control in terms of the package's syntax and inherent capabilities.
• Process concurrency to cope with the needs of all of the above.

Many examples of IIs, including those developed for use in the building modelling field, can be found in the literature [8].
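Two of these elements--knowledge handling (verifying entries and supplying sensible defaults) and user handling (tailoring the response to the user type)--can be sketched as follows. The rules, default values and field names are invented for illustration and do not come from any particular II:

```python
# Sketch of knowledge handling and user handling in an intelligent
# front-end: fill gaps with defaults, verify ranges, and adapt messages
# to the user type. All rules and values are illustrative assumptions.

DEFAULTS = {"glazing_u_value": 2.8, "infiltration_ach": 0.5}
LIMITS = {"glazing_u_value": (0.5, 6.0), "infiltration_ach": (0.0, 5.0)}

def complete_entry(entry, user_type):
    """Return a completed data model plus notes tailored to the user."""
    model, notes = dict(entry), []
    for field, default in DEFAULTS.items():
        if field not in model:
            model[field] = default          # knowledge handling: default
            if user_type == "novice":
                notes.append(f"Assumed a typical value for {field}.")
        lo, hi = LIMITS[field]
        if not lo <= model[field] <= hi:    # knowledge handling: verify
            notes.append(f"{field}={model[field]} is outside {lo}-{hi}.")
    return model, notes

model, notes = complete_entry({"glazing_u_value": 1.9}, user_type="novice")
```

An expert user would receive the same completed model but, under a fuller user-handling scheme, fewer or more technical notifications.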
Application integration. While the software technology now exists to construct IIs (in the form of knowledge and data manipulation techniques, blackboarding, graphical interface kits and the like), the form and content of an II is an active research area which is being addressed in projects such as COMBINE [9] and, more generally, STEP [13]. Within the COMBINE (Computer Models for the Building Industry in Europe) project the intention is to develop a computational framework for the integration of all functional and performance aspects of buildings in support of the modelling activity. The central task is to develop an integrated data model (IDM) to facilitate the exchange of design data between different design tools. Given the complexity of building design data and its high rate of change throughout the design process, such research is fraught with difficulties and so it can be expected that progress will be slow. Nevertheless, as IDMs become more robust, the pace of II development can be expected to quicken so that in future problem descriptions will be easier to achieve and evolve.

New software engineering paradigms. IIs are, however, only one part of the problem. Because different applications employ different mathematical models, or different software implementations of the same models, the
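The IDM idea can be caricatured in a few lines of C++: a neutral, versioned store of building attributes which any design tool reads and writes, so that tools exchange data through the shared model rather than pairwise. This sketch is illustrative only and implies nothing about the actual COMBINE data schema; the class and attribute names are invented.

```cpp
#include <map>
#include <string>

// Minimal caricature of an integrated data model: a neutral attribute
// store shared by design tools, with a revision counter so that tools
// can detect that the design has evolved since they last read it.
class IntegratedDataModel {
    std::map<std::string, double> attributes_;  // neutral building data
    int revision_ = 0;
public:
    void put(const std::string& name, double value) {
        attributes_[name] = value;
        ++revision_;                            // every change is versioned
    }
    double get(const std::string& name) const { return attributes_.at(name); }
    int revision() const { return revision_; }
};
```

A real IDM must additionally handle structured entities, units, partial descriptions and concurrent access, which is precisely why the research is difficult.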
[Figure: model hypothesis; compiled code libraries (*.c); class definitions (*.h); OODB; persistent objects; 3rd party systems; build and run.]
Fig. 4. The Energy Kernel System.
chances of output disagreement are high and the validation process is frustrated. A need therefore exists to enable a common approach to program construction and to find ways to incorporate validation within the process. This need has provided the impetus for several research projects involving program modularization: the Energy Kernel System (EKS) project in the U.K. [10], the ZOOM project in France [11], the MODSIM project in Sweden [2] and the SPANK project in the U.S. [14]. To characterize one possible future, consider the EKS, with which the author is familiar. The EKS is a model building environment which aims to support the construction of a spectrum of possible programs: just as COMBINE addresses standardization with respect to product description, the EKS addresses the need for standard procedures for constructing and accrediting the software tools that will manipulate these product models.*

* This issue of program accreditation is already implicit in projects with other, more pressing goals: for example, the development of simulation based standards within the European standards organization CEN [15].

The EKS project is an attempt to place on a rational basis the construction of advanced design tools for the building industry. It does this by providing a model construction and maintenance platform which dispenses with the need to work with source code. The expectation is that this platform will dramatically improve researcher productivity and serve to enable the task sharing evolution of future IT products. In the longer term practitioners themselves will be able to use the EKS to create bespoke applications for specific design problems. With reference to Fig. 4, the EKS comprises a set of applications organized around an object oriented database (OODB). This database contains objects, and has
access to classes,† which define the entities from which a range of models, from simplified calculators to state-of-the-art simulators, can be constructed.

† The EKS is based on the object oriented programming paradigm in which a problem domain (here building thermodynamic simulation) is broken down into a hierarchy of entities, termed classes, on the basis of programming concepts such as data encapsulation, behavioural inheritance and extensibility. An instance of a class is termed an object.

To characterize system operation, and assuming that the EKS user has a well formed model hypothesis:

EKScb: Is an application which assists in the definition of a program's context (this is equivalent to the creation of a main program in conventional programming). For example, it may be that one program might offer site and building modelling capabilities while another might offer site, multiple building and plant capabilities. Within the EKS two different contexts would be required, although the latter could be derived from the former. The output from this module is a Context class.

EKStb: Given a Context, this module allows the user to specify the capabilities of a program by selecting EKS class variants as required. As each class is selected an internal mechanism determines the dependent classes so that the process is automated. This allows program building to be carried out incrementally and with no need for specialist knowledge of the underlying algorithms. The output from this stage is a program template which defines the classes and class connections from which the program will be constructed. This template can either be stored in the OODB for later recall or held within a disk file.

EKSdd: This module takes a template as input and outputs the corresponding data requirements. If these data are unacceptable, in that they cannot be obtained from the intended end user type, then the previous applications can be revisited and the program architecture modified. At the end of this iterative process, the data requirements can be made known to some third party problem definition package such as a CAD system.

EKSdm: This module allows the definition of a given problem in terms understandable to the EKS generated program, held in the form of EKS X def objects.* These X defs can then be stored within the OODB.

EKSmb: Program construction can be placed under the control of the OODB, in which case the OODB-installed template is consulted and correct class use can be guaranteed. Alternatively, and for use in cases where an OODB is unavailable, module EKSmb can be used to build the required model from the template definition as held in file. This procedure is equivalent to the conventional link/load operation.

EKSrm: Finally, this module is used to associate the program with its data model as held in X def form. Again this operation can be placed under OODB control or invoked externally.
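The incremental, dependency-driven selection performed by EKStb might be sketched as follows. The class names and dependency table here are invented for illustration and do not reflect the real EKS class library; the point is only the mechanism by which selecting one class automatically pulls in the classes it depends on.

```cpp
#include <map>
#include <set>
#include <string>

// A program template is simply the set of class names it will be built from.
using Template = std::set<std::string>;

// Hypothetical dependency table: selecting a class requires its dependents.
static const std::map<std::string, std::set<std::string>> kDeps = {
    {"Building", {"Site"}},
    {"Plant",    {"Building"}},
};

// EKStb-like step: add a class and, recursively, everything it depends on,
// so the user needs no specialist knowledge of the underlying algorithms.
void selectClass(Template& t, const std::string& cls) {
    if (!t.insert(cls).second) return;        // already present: nothing to do
    auto it = kDeps.find(cls);
    if (it != kDeps.end())
        for (const auto& dep : it->second) selectClass(t, dep);
}
```

Selecting "Plant" alone would thus yield a template containing Plant, Building and Site, mirroring the paper's example of a site, building and plant program context.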
The intention is that the EKS will improve researcher efficiency by placing model development on a task sharing basis. By allowing programs of different architecture

* The term X def stands for 'something_definition', where 'something' might be a room, a material, a plant component, a site, etc. Typically an EKS generated program will require several X defs to define the site, building geometry and construction, plant layout and so on.
to be built from a common set of 'objects', program integrity should improve and the validation process should be fostered. In the longer term, environments such as the EKS open up the prospect of radical changes to the design process itself. For example, imagine a design support system which allowed the definition of a design hypothesis by the graphical selection of component parts representing walls, windows, radiators, shading devices, a sun and so on. If these components were related to EKS classes then the designer would effectively be constructing a model in real time. Given the functionality of the EKS classes, it is a relatively simple matter to arrange for these classes to start operating immediately on selection. By bringing together hypothesis manipulation and performance appraisal, a real-time computer-supported design environment is enabled.

CONCLUSIONS

This paper has outlined the issues surrounding the creation of simulation-based design tools for the assessment of energy systems behaviour. For the foreseeable future the major motivation of the research community will remain the improvement of the integrity of the mathematical models. This is being achieved by the use of more relevant theories in conjunction with a sustained validation effort; by the utilization of more robust software engineering techniques (such as the OOP paradigm) to foster code sharing and reliability of use; and by the pursuit of more appropriate strategies for technology transfer. It is against this background that, increasingly, intelligent interfaces are being interposed between the designer and the computer in an attempt to make the relationship more productive.
REFERENCES

1. K. J. Lomas, Thermal modelling of building envelopes: the state-of-the-art, Proc. Symp. on Energy Efficient Buildings, Kowloon, Hong Kong (Nov. 1992).
2. P. Sahlin and E. F. Sowell, A neutral format for building simulation models, Proc. Building Simulation '89, Vancouver, Canada (1989).
3. P. Wouters and L. Vandaele (eds), The PASSYS test cells, PASSYS Document: II4-89-PASSYS1NS-033, Commission of the European Communities, DGXII (1990).
4. R. Judkoff, D. Wortman, B. O'Doherty and J. Burch, A methodology for validating building energy analysis simulations, Draft Report TR-243-1508, Solar Energy Research Institute (1983).
5. N. T. Bowman and K. J. Lomas, Empirical validation of dynamic thermal computer models of buildings, Building Services Engineering Research and Technology 6, 153-162 (1985).
6. E. Palomo and H. Madsen, Methods to compare empirical measurements and simulations, Proc. of BS'91, Nice, France, Aug. 20-22 (1991).
7. P. Strachan and A. Guy, Modelling as an aid in the thermal performance assessment of passive solar components, Proc. of BEP'91, Canterbury, England (1991).
8. J. A. Clarke (Ed.), Intelligent front-ends, Artificial Intelligence in Engineering, Computational Mechanics Publications (1991).
9. G. Augenbroe, Integrated building performance evaluation in the early design stages, Bldg Envir. 27(2), 149-161 (1992).
10. P. Charlesworth, J. A. Clarke, G. Hammond, A. Irving, K. James, S. Lockley, D. MacRandal, D. Tang, T. J. Wiltshire and A. J. Wright, The energy kernel system, Proc. of BS'91, Nice, France, Aug. 20-22 (1991).
11. J. L. Bonin, Multimodel simulation: the TEF approach, Proc. European Simulation Conference, Rome (1984).
12. L. McElroy, The energy design advisory service as an aid toward a new working frame, Proc. 3rd European Conference on Architecture, Florence, May 17-21 (1993).
13. W. Gielingh, General AEC reference model (GARM), ISO TC184/SC4 Document 3-2-2-1 (Draft), TNO-IBBC (1988).
14. E. Sowell, W. Buhl, A. Erdem and F. A. Winkelmann, Prototype object-based system for HVAC simulation, Proc. Systems Simulation Conference, University of Liège (1986).
15. R. Van de Perre, S. O. Jensen, D. P. Bloomfield and L. Agnoletto, Simulation based environmental building performance standards, a case study: overheating risk assessment, Proc. Int. Workshop on Computers and Building Standards, VTT, Espoo, Finland (1991).