
Fire Safety Journal 40 (2005) 477–484 www.elsevier.com/locate/firesaf

Short communication

Requirements for acceptable model use

Alan N. Beard

Civil Engineering Section, School of the Built Environment, Heriot-Watt University, Riccarton, Edinburgh EH14 4AS, Scotland, UK

Received 4 February 2004; received in revised form 21 July 2004; accepted 13 October 2004. Available online 5 May 2005.

Abstract

Theoretical models, especially computer-based theoretical models, have the potential to assist greatly in fire safety decision-making. However, they also have a potential for harm and may lead to unacceptable decisions being made. A general framework is needed to help to ensure that models are employed in a way which is acceptable to all parties concerned, including the general public. This article sets out a set of requirements for such a framework and is intended to stimulate further discussion in this crucially important area.
© 2004 Elsevier Ltd. All rights reserved.

Keywords: Fire risk; Decision-making; Computer; Models; Design

1. Introduction

This article has been developed from a presentation given at the Workshop on the Development of Key Performance Indicators for the Use of Computer Models in Support of Fire Safety Engineering, held at FRS/Building Research Establishment, Garston, Watford, UK, on 22 September 2003. It is concise and to the point and is intended to help to generate discussion in this crucially important area. It relates to other articles which have been produced over the last 15 years, most notably Refs. [1–4]. Tentative guidelines have already started to emerge, for example the ISO standard on ‘Assessment and Verification of Fire Models’ [5] and the report to emerge from the project associated with the Workshop mentioned at the start of this article [6].

0379-7112/$ - see front matter © 2004 Elsevier Ltd. All rights reserved. doi:10.1016/j.firesaf.2004.10.003


There is a very long way to go, however; most, if not all, of the vital issues and problems which emerge remain unresolved at present. Readers are referred to these articles, and to other articles and documents referred to therein, for a fuller discussion of the issues. It is particularly appropriate that this short article help to spur discussion at this time, as the crucial question of identifying the conditions under which computer-based models may acceptably be employed as part of fire safety decision-making is coming to the fore both in the United Kingdom and worldwide.

Many computer-based models exist today; see, for example, the survey article by Olenick and Carpenter [7]. However, the use of models as part of fire safety decision-making has the potential to lead to unacceptable design. It is proposed here that, in order for computer-based models to become generally accepted as part of fire safety decision-making, a framework must be established which is encapsulated in the ‘Fire Design Triangle’ of Fig. 1. Creating a framework which is acceptable to society as a whole is very important. We cannot continue with a ‘Wild West’ situation, lacking any overall coherent and acceptable structure. If a disaster were to occur and it emerged later that a fire model had been used as an important part of the design, it would set back the use of models as part of fire safety decision-making for many years. It is in the interests of the international fire science and engineering community to create a rational framework which is acceptable to society as a whole.

FIRE MODEL: Which has the potential to be valuable; that is, which is capable of aiding decision making in a particular case.

METHODOLOGY OF USE: Which is generally acceptable and encourages a user to be explicit

KNOWLEDGEABLE USER: Who is capable of applying the methodology to a model which has the potential to be valuable in a particular case, in a comprehensive and explicit manner, and of interpreting results justifiably.

Fig. 1. Fire design triangle.


The ideas put forward here may be regarded by some as controversial. That is not a bad thing; it is important to stimulate debate at this very crucial period in the development of fire safety engineering. Readers are encouraged to take part in this debate through the pages of this journal and other media. Readers are also welcome to write letters directly to the author with any comments.

In Fig. 1, the corners of the triangle represent ‘fire model’, ‘methodology of use’ and ‘knowledgeable user’. The categories represented by the corners are interdependent; they will be described briefly and their implications pointed out.

2. Fire model: which has the potential to be valuable; that is, which is capable of aiding decision-making in a particular case

While the term ‘fire model’ is often used to imply a deterministic model, it should be seen generally to include deterministic, probabilistic, or other models. Further, it should be taken to include evacuation modelling as well as fire development. Probably most models have the potential to be valuable, in one way or another, in some cases; that is, they are capable of aiding decision-making. Sometimes, however, it is difficult to see how a given model may be of value. For example, it is difficult to see how a typical deterministic zone model could be of value in relation to irregular geometries. To take a particular case: while computational fluid dynamics (CFD) models were used in a valuable way [8,9] to model fire development up an escalator in the investigation after the King’s Cross Underground Station, London, fire of 1987, use of a typical zone model would have been inappropriate. As another, general, example, a model may be relatively ‘realistic’ in a qualitative way in a particular application (in the sense of indicating trends) but may be poor quantitatively.

This implies a need for independent assessment of each model, examining both qualitative and quantitative aspects; acceptable methodologies for assessment need to be devised. It further implies the need for one or more independent Model Assessment Groups (MAGs) which would produce written assessments of models. Independent assessments need to be carried out on a rational, comprehensive basis with justifiable conclusions. This would include consideration of the limitations and conditions of applicability. In this respect the assumptions in a model, per se, would need to be clearly differentiated from the assumptions which might be made in a particular application. These assessments would be updated and would act as a resource for regulatory bodies and ‘knowledgeable users’.

This also implies the need for an approvals body which would approve the use of each model for ‘real-world’ use under given conditions, including the use of an acceptable ‘methodology of use’ by a ‘knowledgeable user’. Such an approvals mechanism would operate both at a general level and at the level of a particular case. At the ‘particular case’ level it would be expected to interact strongly with local regulatory bodies such as building control offices and fire brigades in the UK. The Approvals Body would call upon the reports of MAGs and any other relevant information.


It is not adequate to rely solely on fragmented local reaction; regulators need help. All related documentation would, as above, act as a resource for regulators and ‘knowledgeable users’. Initiatives by specific organizations have already resulted in documents related to this area; for example, the documents of the Model Evaluation Group of the European Union [10] and the American Society for Testing and Materials (ASTM) standard for evaluating deterministic fire models [11], as well as the documents of Refs. [5,6] already cited.

It may be argued that the creation of independent MAGs would be difficult, and that is no doubt true. However, if independent assessments do not take place then the future for computer-based models in fire safety decision-making will be very bleak. It would mean that only ‘non-independent’ assessments could be expected to exist; by definition, only ‘biased’ assessments would be available. A non-independent or ‘biased’ assessment would not necessarily be of no value; indeed, non-independent assessments can provide very useful information. However, the value of such an assessment is severely circumscribed; independent assessments are needed in addition. People’s lives depend upon design, and if models are to become a significant part of design then there must be a place for independent assessment, based on a rational and comprehensive approach and with justifiable conclusions. A sketch of a possible generic procedure for carrying out an assessment of a deterministic model is given in Fig. 2. Specific procedures would be necessary for different kinds of models.

It is not possible to carry out a truly ‘independent’ assessment; there will always be partiality to some degree or other, since we all have our own views, knowledge, experience and interests. However, in this context ‘independent’ may be taken to mean that those conducting an assessment do not have an interest in seeing a particular model portrayed as either ‘good’ or ‘bad’. In that sense it is possible to conceive of an independent assessment, and such independent assessment of models is not only possible but essential. Assessment of a model by those who do have such an interest is of limited help in real-world applications. It is often not difficult to produce an assessment whose conclusions are favourable (or unfavourable) to a particular model, depending upon how the assessment is conducted. For example, the choice of parameter values and of experimental data for comparison, quite apart from other considerations, gives scope for finding a favourable or unfavourable comparison if that is, either consciously or unconsciously, desired. In an age of commercial computer packages this problem becomes even more acute.
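As a small, purely hypothetical illustration of this point (not drawn from the article), the Python sketch below compares an invented model prediction with an invented experimental record under two different, equally defensible choices of what to compare; the same prediction appears markedly more accurate under one choice than the other. All numbers and curves are assumptions made only for illustration.

import numpy as np

# Invented 'experimental' record and invented model prediction of an
# upper-layer temperature (deg C); both are placeholders for illustration only.
t = np.linspace(0.0, 600.0, 61)                   # time (s)
measured  = 20.0 + 280.0 * (1.0 - np.exp(-t / 150.0))
predicted = 20.0 + 240.0 * (1.0 - np.exp(-t / 120.0))

rel_err = np.abs(predicted - measured) / measured

# Assessor choice A: mean relative error over the early growth period only.
choice_a = rel_err[t <= 200.0].mean()
# Assessor choice B: relative error in the final, near-steady temperature.
choice_b = rel_err[-1]

print("Choice A (growth period, mean error): %.1f%%" % (100.0 * choice_a))
print("Choice B (final temperature error):   %.1f%%" % (100.0 * choice_b))
# Both comparisons use the same model output and the same data; only the
# assessor's choice of comparison differs, yet the apparent 'accuracy'
# changes by several-fold.

An explicit, agreed assessment procedure would require such choices to be declared, and justified, in advance.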

Sketch of a Generic Assessment Procedure for a Deterministic Fire Model

This outline is not intended as a definitive procedure. It is put forward in a tentative sense for the purpose of helping to stimulate further discussion. The procedure is iterative and continual, not closed. At particular stages reports might be produced; however, these should not be regarded as the ‘final word’ on any given model. Ultimately there would need to be different procedures for different types of models. In the spirit of this article, assessments should be conducted by people who are as independent as possible in relation to the model being examined. The word ‘examination’ is intended to be interpreted broadly, in both the qualitative and quantitative senses.

1. Description of the model and the state variables it is intended to predict.
2. Initial examination of the documentation provided by the producer of the model.
3. Identification of conditions of applicability for which the model is likely to have the potential to be valuable (i.e. building and occupant characteristics). As an initial consideration, the conditions of applicability as specified by the model developer may be taken; this may be altered during subsequent iterations of the procedure.
4. Examination of conceptual assumptions in relation to 1 and 2 above; e.g. the physics/chemistry etc.
5. Examination of numerical assumptions implicit within the model, in relation to 1 and 2; i.e. outwith those which a user may insert or alter.
6. Examination of the numerical solution techniques employed; conceptual and numerical aspects.
7. Examination of the source code of the program and the software as a whole; assessment of the likelihood of errors.
8. Assessment of the likelihood of hardware faults for the types of computers on which the program might be used.
9. Sensitivity study of the model.
10. Comparisons between theoretical predictions and empirical data:
10.1 Assessment of uncertainty/flexibility within available appropriate experimental data.
10.2 Comparisons with the results of replicated sets of experimental tests; that is, a priori, blind and open comparisons should be carried out [4].
10.3 Comparisons between theoretical predictions and other sources of empirical data, e.g. from a real fire.
10.4 Assessment of the ability of the model to predict quantitative results.
10.5 Assessment of the ability of the model to predict qualitative results (e.g. trends).
11. Assessment of the limitations of the model in the light of the foregoing considerations.
12. Identification of the conditions under which the model may have the potential to be valuable.
13. Assessment of the documentation in the light of the foregoing considerations.
14. Return to stage 1 and repeat the procedure until homeostasis has been reached, i.e. no new information, avenues for investigation or insights emerge.

Fig. 2. Sketch of a generic assessment procedure for a deterministic model.

3. Methodology of use: which is generally acceptable and encourages a user to be explicit

How a model is used as part of a given application is of considerable importance. It is very easy to use a model in a naïve and unjustifiable way. For example, the assumptions made in a particular application (as distinct from the assumptions made in the model per se) need to be clearly set down; that is, how the particular application is ‘idealized’ in the use of the model in a particular case certainly affects results and conclusions. How uncertainty in the input is considered, the choice of parameter values, and the degree and nature of any sensitivity considerations will all affect results and conclusions.
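To make the role of such sensitivity considerations concrete, the sketch below (Python) shows one simple way a user might record how a single model output changes as each input parameter is varied about its assumed value. The ‘model’ here is an invented stand-in function, and the parameter names, baseline values and ±20% range are assumptions made purely for illustration; in practice the call would go to the fire model actually being applied.

# Hypothetical one-at-a-time sensitivity sketch. The 'model' below is an
# invented stand-in function, not a real fire correlation.
def model(heat_release_kw, opening_area_m2, ceiling_height_m):
    # Invented algebraic relation standing in for a model-predicted
    # upper-layer temperature rise (deg C).
    return 0.2 * heat_release_kw / (opening_area_m2 * ceiling_height_m)

baseline = {"heat_release_kw": 1000.0, "opening_area_m2": 2.0, "ceiling_height_m": 2.4}
perturbation = 0.2   # examine +/-20% about each assumed input value

print("Baseline result: %.1f" % model(**baseline))

for name, value in baseline.items():
    results = []
    for factor in (1.0 - perturbation, 1.0 + perturbation):
        inputs = dict(baseline)          # copy the baseline inputs
        inputs[name] = value * factor    # perturb one parameter at a time
        results.append(model(**inputs))
    print("%-18s +/-%d%%: result ranges from %.1f to %.1f"
          % (name, int(100 * perturbation), min(results), max(results)))

Recording the outcome of such perturbations, rather than a single figure, is one way of being explicit about the flexibility implicit in the chosen input values.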


Methodologies are required, the stages of which encourage a user to be explicit about what has been assumed and done in the application. A generic ‘methodology of use’ for a deterministic model is suggested in Fig. 3.

Sketch of a Generic ‘Methodology of Use’ for a Deterministic Fire Model

This outline is not intended to be definitive. Its purpose is to help to stimulate discussion. Ultimately there would need to be different methodologies of use for different types of models. In any given case it would be appropriate to apply more than one model.

1. Examine the characteristics of the particular real-world case to which it is intended to apply a theoretical model.
2. Examine the limitations and conditions of applicability of the intended model.
3. Is this model likely to prove to be valuable in this case?
3.1 If ‘No’, consider the possibility of using one or more other models.
3.2 If ‘Yes’, continue.
4. Gather empirical data relevant to the case, including test results. Conduct empirical tests if necessary.
5. Re-examine the limitations and conditions of applicability of the model in the light of earlier stages. If the model is still considered to have potential value, continue.
6. Construct different conceptual assumptions for the case; that is, different ways in which the particular case may be represented within the structure of the model, i.e. different ‘idealizations’. Emphasise and examine the disjunctions between each idealization, with its associated conceptual assumptions, and the real-world case. Inter alia, different idealizations are intended to account for different sub-models.
7. Construct plausible numerical input for each idealization; assess the uncertainty/flexibility implicit in relation to the real-world case.
8. For each idealization conduct simulations using the model and carry out sensitivity analyses.
9. Infer qualitative and/or quantitative results.
10. Interpret the qualitative and quantitative results in the light of the limitations of the model.
11. Interpret results in the light of other fire knowledge and experience.
12. Conclusions.
13. Return to stage 1 and repeat the procedure until homeostasis is reached, i.e. no new information, avenues for investigation or insights emerge.

Fig. 3. A generic ‘methodology of use’.
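As a minimal, hypothetical illustration of stages 6–9 of this outline, the Python sketch below records two invented idealizations of a single case, runs an invented placeholder model over a small set of plausible input values for each, and retains the resulting spread of outcomes rather than a single figure. The idealizations, parameter names and placeholder relation are assumptions made only for illustration; they are not part of the methodology itself.

from dataclasses import dataclass
from itertools import product

@dataclass
class Idealization:
    name: str     # how the real-world case is represented within the model
    inputs: dict  # parameter name -> list of plausible values for this idealization

def placeholder_model(fire_growth, door_open_fraction):
    # Invented stand-in for a real fire/evacuation model; returns a notional
    # 'time to untenable conditions' in seconds. Not a real correlation.
    return 300.0 / (fire_growth * (1.0 + door_open_fraction))

idealizations = [
    Idealization("single compartment, door fully open",
                 {"fire_growth": [0.8, 1.0, 1.2], "door_open_fraction": [1.0]}),
    Idealization("single compartment, door partly open",
                 {"fire_growth": [0.8, 1.0, 1.2], "door_open_fraction": [0.25, 0.5]}),
]

for ideal in idealizations:
    names = list(ideal.inputs)
    combos = product(*(ideal.inputs[n] for n in names))
    outcomes = [placeholder_model(**dict(zip(names, combo))) for combo in combos]
    print("%s: %d runs, outcome range %.0f to %.0f s"
          % (ideal.name, len(outcomes), min(outcomes), max(outcomes)))

Keeping the results of each idealization separate, together with the assumptions behind it, supports the explicit interpretation called for in stages 10 and 11.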


Such methodologies need to be devised for different types of models. Other material relevant to this is included in Refs. [6,12]. A similar situation arose in the case of finite element models and structural analysis. This led to a project which resulted in the safety critical structural analysis (SAFESA) methodology [13,14]. That methodology, through its stages, encourages a user to be comprehensive and explicit about what they are doing and the assumptions made in a given application. Overall, the implication is: similar projects are needed to construct acceptable methodologies of use for the fire world. However, it would be quite possible for an acceptable methodology of use to be applied to a model which had the potential to be valuable, but in an unacceptable way. An acceptable methodology of use needs to be applied by a ‘knowledgeable user’ and this brings us to the next section.

4. Knowledgeable user: who is capable of applying an acceptable methodology of use to a model which has the potential to be valuable in a particular case, in a comprehensive and explicit manner, and of interpreting results justifiably

This implies creating people who have both a knowledge of fire science and a detailed knowledge of the model being used. It further implies having courses which can help to do this. The primary focus of the educator’s role, in this context, is to help in the creation of a sufficient number of ‘knowledgeable users’. However, ‘knowledgeable use’ depends on experience, not just ‘education’. This implies a curriculum which includes, inter alia, material covering:

- independent (q.v. MAG) assessments of models;
- limitations and conditions of applicability of models;
- methodologies of use.

That is, the ‘knowledgeable user’ corner of the fire design triangle is intimately linked to the other two corners; the triangle needs to be seen as a whole. Effectively, the preconditions for creating knowledgeable users do not exist at present: there is no generally accepted context for a knowledgeable user to ‘slot’ into, and the materials (methodologies, assessments, an approvals mechanism, etc.) which would form an essential part of a relevant curriculum do not exist. The number of fire safety engineering related courses (e.g. undergraduate and MSc/diploma level courses) has increased worldwide in recent years. However, while these courses may provide an understanding of fire science to some degree, they provide only a ‘broad brush’ understanding of fire models; they are, essentially, generalist courses. More courses are needed which provide a thorough grounding in fire science, an essential precondition for the acceptable use of any model. Then, once a potential ‘knowledgeable user’ has gained a thorough grounding in fire science and a ‘broad brush’ understanding of models, he/she would need to take one or more specialist courses.


Specialist courses in specific model types and specific models are needed; these would include specific limitations, conditions of use etc. Such courses should be provided by independent staff at independent institutions.

5. Summary

A suitable context for the use of computer-based models as part of fire safety decision-making does not exist at present. A framework needs to be created within which models may be employed in an acceptable way. The ‘corners’ of the Fire Design Triangle, and their implications, need to be addressed. Each of the corners needs to be seen in relation to the triangle as a whole.

References

[1] Beard AN. Risk assessment assumptions. Civil Eng Environ Syst 2004;21:19–31.
[2] Beard AN. Limitations of computer models. Fire Safety J 1992;18:375–91.
[3] Beard AN. Fire models & design. Fire Safety J 1997;28:117–38.
[4] Beard AN. On a priori, blind and open comparisons between theory and experiment. Fire Safety J 2000;35:63–6.
[5] BS ISO TR 13387-3. Fire safety engineering—part 3: assessment and verification of mathematical fire models. London: British Standards Institution; 1999.
[6] Kumar S. Computer modelling performance assessment scheme for performance-based fire safety engineering. FRS/Building Research Establishment, Garston, Watford, UK, 2004. Prepared for the Office of the Deputy Prime Minister.
[7] Olenick SM, Carpenter DJ. An updated international survey of computer models for fire and smoke. J Fire Prot Eng 2003;13:87–110.
[8] Simcox S, Wilkes NS, Jones IP. Fire at King’s Cross underground station, 18 November 1987: numerical simulation of the buoyant flow and heat transfer. Report AERE-G 4677. Harwell: UK Atomic Energy Authority.
[9] Cox G, Chitty R, Kumar S. Fire modelling and the King’s Cross fire investigation. Fire Safety J 1989;15:103–6.
[10] Model Evaluation Protocol. European Communities, Directorate-General 12: Scientific Research and Development, Brussels, 1994.
[11] Standard guide for evaluating the predictive capability of deterministic fire models. American Society for Testing and Materials (ASTM), E 1355-97, 1997.
[12] Kumar S, Cox G. Some guidance on ‘correct’ use of CFD models for fire applications with examples. Ninth International Conference on Fire Science and Engineering (INTERFLAM 2001), Edinburgh, 17–19 September 2001. p. 823–34.
[13] Morris AJ. Finite element safety critical software. In: Redmill F, Anderson T, editors. Directions in safety-critical systems: proceedings of the conference held at Bristol, 9–11 February 1993. London: Springer. p. 103–32. ISBN 3-540-19817-2.
[14] SAFESA Technical Manual. NAFEMS, Scottish Enterprise Technology Park, East Kilbride, Scotland G75 0QD.