The impact of a simulation game on operations management education


Computers & Education 57 (2011) 1240–1254


Federico Pasin*, Hélène Giroux
Logistics and Operations Management, HEC Montreal, 3000 Côte Sainte-Catherine, Montreal (QC), Canada H3T 2A7

Article history: Received 20 June 2010; Received in revised form 1 December 2010; Accepted 25 December 2010

Abstract

This study presents a new simulation game and analyzes its impact on operations management education. The proposed simulation was empirically tested by comparing the number of mistakes during the first and second halves of the game. Data were gathered from 100 teams of four or five undergraduate students in business administration, taking their first course in operations management. To assess learning, instead of relying solely on an overall performance measurement, as is usually done in the skill-based learning literature, we analyzed the evolution of different types of mistakes that were made by students in successive rounds of play. Our results show that although simple decision-making skills can be acquired with traditional teaching methods, simulation games are more effective when students have to develop decision-making abilities for managing complex and dynamic situations. © 2011 Elsevier Ltd. All rights reserved.

Keywords: Simulations; Interactive learning environment; Applications in operations management; Post-secondary education

1. Introduction

The present generation of college and university students has never experienced a world without personal computers (PCs). Many have spent much time playing computer games and are now very skilled at learning and applying complex sets of rules through game playing. According to Proserpio and Gioia (2007), the learning style of the new 'virtual generation' (V-gen) is very different from that of former generations: it is much more visual, interactive, and focused on problem-solving. While this could be seen as a threat to the traditional teaching style, based on verbal knowledge transfer and Socratic debates, it could also be seen as an opportunity to develop simulation games that build on V-gen skills and encourage the learning of management principles and practices.

Simulation games are but one way to acquire knowledge; we do not suggest that they can or should replace lectures, readings, case studies or other learning methods. Simulation games have also been around for many years, long before PCs were widely available. Nevertheless, now that a large proportion of students own powerful and interconnected laptops, it is easy to consider simulation games as an alternative to other types of problem-solving activities, one that can provide a complex and rich virtual environment conducive to deep learning.

First applied in training in the military and the aeronautics industry, simulation games are now used in the teaching of medicine, nursing, engineering, management, and several other fields. A growing body of literature describes new simulation games and measures their impact on student learning. This article begins with a review and integration of the existing literature, focusing particularly on the various methods used to assess learning. To contribute to the current literature, the article also presents a new simulation game and analyzes its impact on operations management education.
The article is organized as follows: Section 2 presents an overview and synthesis of current literature; Section 3 introduces the new simulation game; Section 4 presents the methodology used to evaluate its efficacy; and Section 5 details and analyzes the results. The article concludes with a brief discussion of the limitations of the research and the next steps to be considered.

2. Literature review

2.1. Defining simulation games

Clark (2009) asserts that "authors use different terminologies to define business simulation technologies that range from top management, flight simulators, business simulators, simulation games, macro-worlds/micro-worlds to learning laboratories." The confusion

* Corresponding author. Tel.: +1 514 340 6752; fax: +1 514 340 6834. E-mail address: [email protected] (F. Pasin).
0360-1315/$ – see front matter © 2011 Elsevier Ltd. All rights reserved. doi:10.1016/j.compedu.2010.12.006


Table 1
Examples of research measuring reaction outcomes (including perceived learning) of a simulation game.

Arias-Aranda and Bustinza-Sanchez (2009): Compared simulation games with traditional teaching method; undergraduate students in business management (n = 467); questionnaire with closed questions. Results: students reported a positive impact on personal control and self-esteem.

Battini et al. (2009): Asked participants to evaluate their learning experience after a logistics simulation game; engineering students (n = 252); questionnaire with closed questions. Results: students reported a high or very high degree of learning for all 9 items measured.

Betts and Knaus (2006): Compared simulation games with cases and exercises; undergraduate students in business management (n = 54); questionnaire with closed questions plus open discussions. Results: students thought the simulation game was superior to cases and exercises.

Cook and Swift (2006): Compared simulation games with textbook used in class; undergraduate students in sales management (n = 151); questionnaire with closed questions. Results: students rated the simulation game higher than the textbook for all 15 items measured.

Farrell (2005): Compared simulation games with traditional teaching method; undergraduate students in business management (n = 30); questionnaire with closed questions. Results: students perceived the simulation game as a more effective learning tool.

Lainema and Hilmola (2005): Asked participants to evaluate their learning experience after an operations management simulation game; students in operations management (n = 31); questionnaire with closed questions. Results: students evaluated the game very positively for four of the 14 items measured.

Li, Greenberg, and Nicholls (2007): Compared simulation games with a lecture-centered approach; MBA students (n = 588); questionnaire with closed questions. Results: students thought the simulation game was superior to a lecture-centered approach.

Romme (2003): Asked participants to evaluate their learning experience after a management simulation game; MBA and business administration undergraduate students (n = 252); standard post-course evaluation form and questionnaire with closed and open questions. Results: students perceived to have learned a lot during the simulation; in each population, the overall rating of the course was better than the program's average overall rating.

Tompson and Dass (2000): Compared two groups of undergraduate strategic management students, one learning with a simulation game (n = 126), the other with case studies and lectures (n = 126); questionnaire with closed questions administered at the beginning and end of the course. Results: students learning with the simulation game showed a higher increase in self-efficacy evaluations, regarding both knowledge and skills.

between games and simulations seems to have always been present (Ellington, Addinall, & Percival, 1981; Lewis & Maylor, 2007). Although there have been many attempts at clarification, it is still important to stress the differences between the two concepts and to define a simulation game.

Webster defines a simulation as "the representation of the behavior or characteristics of one system through the use of another system, esp. using a computer." In other words, simulation refers to the representation of an aspect of reality based on a simplified and abstracted model. The Oxford English Dictionary provides a definition that can readily be applied to a learning environment: "The technique of imitating the behavior of some situation or process (whether economic, military, mechanical, etc.) by means of a suitably analogous situation or apparatus, especially for the purpose of study or personnel training."

Bloomer (1973, as cited in Ellington et al., 1981, pp. 15–16) defines a game as "any contest (play) among adversaries (players) operating under constraints (rules) for an objective (winning, victory or pay-off)." A game is thus an opportunity to use one's skills and compete with others. The name also suggests a stimulating and enjoyable activity, even though in a pedagogical context games should not be used mainly for amusement; indeed, Abt (1970) refers to pedagogical games as "serious games."

A simulation is not necessarily a game. For instance, the simulations used by Holzinger, Kickmeier-Rust, Wassertheurer, and Hessinger (2009) in their empirical study of medical education consisted of interactive animated virtual representations of complex physiological models. The students had no personal decisions to make but could, for example, visualize the impact of different values of pressure gradient, radius and bifurcations on arterial blood flow.
In operations management research, simulations are also often used to anticipate the possible results of alternative designs or changes made to a complex system. Games also commonly exist outside of simulated situations (for example, hopscotch, hockey, solitaire). Ellington et al. (1981, p. 16) thus define a simulation game as "an exercise that possesses the essential characteristics of both games (competition and rules) and simulations (ongoing representation of real-life)."

Simulation games may be used for various purposes. van der Zee and Slomp (2009) assert that they can help workers find solutions to specific problems, or familiarize themselves with and ease their acceptance of new work methods or systems. Wolfe (1993) explores their application in laboratory research, where they can be used to evaluate human reactions in particular situations. The focus of this paper, however, is on the pedagogical use of simulation games. They are thus defined as challenging interactive pedagogical exercises, played within an artificial reproduction of a relevant reality, wherein learners must use their knowledge and skills to attain specific goals.

2.2. A brief history and overview

Simulations and games have long been used for training purposes. Wolfe (1993) traces their origins to war games that were used in ancient China. War games, mostly in the form of board games such as chess, have always been very popular. They were transformed into more serious, and more complex, games in Germany during the 17th century, and war games helped prepare and test tactical moves during World Wars I and II (Wolfe, 1993). Web-based versions, well suited to distance learning, are now used to train military strategists (Keh, Wang, Wei, Hui, & Wu, 2008). In a related field, flight simulators are almost as old as the first airplane, and were used extensively during World War II to train fighter pilots (Moroney & Moroney, 1999). More recent simulators reproduce the cockpits of commercial airplanes such as the Boeing 747 and simulate normal and extreme flying conditions.


Table 2
Examples of research measuring the learning outcomes of a simulation game.

Cognitive-based outcomes:

Fowler (2006): Compared two groups of undergraduate accounting students, one learning with cases and a simulation game (n = 52), the other with traditional lectures (n = 49); post-game theoretical exam measuring six dimensions: knowledge, analysis, synthesis, evaluation, application and comprehension. Results: simulation game and cases are significantly better than traditional lectures for comprehension; lectures are better for application; no significant difference for the other dimensions.

Pfahl et al. (2004): Three studies comparing two groups of IT students, one coupling traditional learning with a project simulation and role-play game (total n = 18), the other coupling traditional learning with the constructive cost model (total n = 16); pre-game and post-game exams measuring four dimensions. Results: simulation is significantly better for imparting knowledge about typical behavior patterns of software development projects; no significant difference for the other three dimensions.

Washbush and Gosen (1998): Before and after comparison of a total enterprise simulation game played by teams of undergraduate business students (n = unknown); pre- and post-game exams. Results: students improved their exam score after the simulation.

Skill-based outcomes:

Bakken et al. (1992): Measured progression of results between the first 40 and last 40 runs of a game; MBA students (n = 17 students) and professional MBAs (n = 16 teams); measures of profit and bankruptcy rates. Results: students' and professionals' performance improved between the first 40 and last 40 runs.

Chang, Chen, Yang, and Chao (2009): Measured progression of results for several runs of the simulation; undergraduate students in engineering, operations management and supply chain management (n = 56); measures of costs. Results: costs for the third round of the simulation were significantly lower than for the first two rounds.

Davidovitch et al. (2008): Measured progression of results for 7 runs of the simulation using different configurations of history recording mechanisms and break periods; engineering graduate students (n = 66); measures of profit and run duration. Results: run duration improved for all students between first and last runs; profit improved only for students who had access to history recording.

Langley and Morecroft (2004): Measured progression of results for six runs of the simulation using different configurations of task structure feedback; MBA students (n = 66); measures of profit/benchmark ratio, number of players who surpassed the benchmark, and run duration. Results: students' performance improved between first and last runs; students who had access to task structure feedback reached a steady-state performance faster.

Olhager and Persson (2006): Measured progression of results for several runs of the simulation; undergraduate students in operations management (n = 10 teams); measures of fill rate, inventory holding costs and total costs. Results: students' performance improved between first and last runs.

Parush et al. (2002): Measured progression of results for 5 runs of a manufacturing simulation using different configurations of history recording mechanisms; industrial engineering students (n = 45); measures of profit, run duration and respect of due dates. Results: run duration and respect of due dates improved for all students between first and last runs; students who had access to history recording had better due-date and profit performance and were the only ones to improve their profit significantly.

Affective-based outcomes:

Cousens et al. (2009): Before and after comparison of a new product development simulation game; MBA and executive program students; pre-game and post-game questionnaires (n = 265). Results: the perceived importance of a broad range of key factors in new product development significantly increased after experiencing the simulation.

In medicine, models have long been used for the study of anatomy; mainframe computer-based simulations were already used in the 1960s. Nowadays, high-fidelity human patient simulators are used by health practitioners to develop their skills, and computer-based virtual reality systems help train students in surgical techniques (Bradley, 2006). In nursing, simulation games such as The Ward (Stanley & Latimer, 2010) help students develop clinical skills as well as decision-making and teamwork abilities. Computer-based simulations have also long been used to teach science (Ellington et al., 1981) in high schools, colleges and universities. More recent applications include chemistry (e.g. Stieff & Wilensky, 2003), physics (e.g. Chang, Chen, Lin, & Sung, 2008), and hydrology (D'Artista & Hellweger, 2007).

Wells (1993) traces management simulations back to the 1950s, when ex-military managers transferred the experience they had acquired with war games. The Top Management Decision Simulation was developed in 1956 by the American Management Association (AMA) and by 1961, "more than 100 business games had been published in the US alone, and more than 30,000 executives had played them" (Wells, 1993, p. 4). In the 1970s, Cullingford, Mawdesley, and Davies (1979, p. 159) contended that management games based on computer simulations had "passed through the stage of acclaim and disillusionment to take up a significant but fairly minor place in the array of modern teaching techniques." Fifteen years later, however, Lane (1995, p. 604) asserted that "management games and simulations are in the news," and that their use was increasing. This assertion was confirmed by a 1998 survey showing that more than 60% of American businesses with more than 500 employees used simulation games in their training activities (Faria, 1998). In 2004, Faria and Wellington conducted another survey of business school professors in North America and found that more than 30% of the 1085 respondents used business simulations (Faria & Wellington, 2004).

In the last ten years alone, new simulation games have been developed to teach marketing (Shapiro, 2003), financial management (Bruce, 2008; Uhles, Weimer-Elder, & Lee, 2008), project management (Vanhoucke, Vereecke, & Gemmel, 2005), knowledge management (Chua, 2005; Leemkuil, de Jong, de Hoog, & Christoph, 2003), risk management (Barrese, Scordis, & Schelhorn, 2003) and microeconomics (Gold & Gold, 2010). According to Wolfe (1993) and Faria, Hutchison, Wellington, and Gold (2009), management simulation games belong to one of three types: top management games (i.e. games including all aspects of an organization and usually involving strategic decisions), functional games (i.e. simulation games focusing on a specific area of business), and concept simulations (i.e. simulations concentrating on a specific type of decision).


Table 3
Examples of research measuring various outcomes of a simulation game.

Reaction and learning outcomes, skill-based:

Léger (2006): Measured two types of outcomes of an ERP simulation game; MBA and business administration undergraduate students (n = 35); perceived learning questionnaire and standard post-course evaluation form; progression of financial results for several runs of the simulation game; SAP certification success rate of participants. Results: students' performance improved during the game; 93% of the students received their SAP certification; students were enthusiastic about their experience; after the simulation was implemented, both courses ranked in the first quintile of all the courses offered in the business school.

Weiss (2008): Measured two types of outcomes of an international business negotiation simulation game; MBA and business administration undergraduate students (n not specified); post-game student report, standard post-course evaluation form, pre-game and post-game evaluations by instructor. Results: students and instructor perceived that students had improved a wide set of basic and complex international business negotiation skills; students rated the dimension "learned a great deal" at a modal score of 7/7 in each of the last 10 years.

Reaction and learning outcomes, cognitive-based:

Mitchell (2004): Compared two teaching methods, simulation game plus some case studies (n = 64) and only case studies (n = 66), using two types of outcomes; business administration undergraduate students; perceived learning questionnaire with closed questions, post-game exam. Results: no significant difference for any of the outcome measures; however, when asked "What is your recommendation regarding using this simulation in the next course?", 77% of the students who participated in the simulation answered "definitely yes" and 18% "somewhat yes."

We focus on one kind of functional game, namely operations management simulation games. One of the best known and most enduring simulation games in operations management is probably the Beer Game, which was developed at the Sloan School of Management (at the Massachusetts Institute of Technology) in the 1960s (Goodwin & Franklin, 1994). The Beer Game could actually better be described as a concept game because its main purpose is to help learners experience and understand the bullwhip effect, a particular situation created when several participants in the same supply chain attempt to anticipate the future demand of their immediate client. First created as a board game, the Beer Game was offered in a computer-based version in the late 1990s (Coakley, Drexler, Larson, & Kircher, 1998) and, more recently, in an online version (http://beergame.mit.edu/). Other simulation games developed in the last ten years include another supply chain simulation called the Lean Leap Logistics Game (Holweg & Bicheno, 2002), a computer-based version of Goldratt's Game for teaching capacity utilization, bottlenecks and queues in production processes (Johnson & Drougas, 2002), a computer-based version of the Dice Game to teach synchronized manufacturing or Kanban systems (Baranauskas, Gomes Neto, & Borges, 2000), two web-based simulations on lean manufacturing named WebKanban and the Lean Typing Game (Wan, Chen, & Saygin, 2008), and the Logistic Game™ (Battini, Faccio, Persona, & Sgarbossa, 2009).

2.3. Simulation games and learning

Simulation games have mostly been characterized as a form of experiential learning, because the process of knowledge creation relies on the transformation of self-experience (Battini et al., 2009; Haapasalo & Hyvonen, 2001). Simulation games have also been associated with other learning theories (for a review, see Kebritchi & Hirumi, 2008).
For instance, Zantow, Knowlton, and Sharp (2005) argue that simulation games are conducive to generative learning, a concept borrowed from Wittrock, defined as: "(a) the process of generating relationships, or a structure, among the components, or parts, of the information one is trying to comprehend, and; (b) the process of generating relationships between one's knowledge and the information one is trying to comprehend" (Wittrock, 1985, p. 124, as cited in Zantow et al., 2005, p. 452).

Simulation games thus offer the benefits of both experiential and generative learning, and are said to provide an enhanced learning experience:

- They are superior to other teaching methods for helping students develop skills such as complex problem-solving, strategic decision making and behavioral skills, including teamwork and organizing (Salas, Wildman, & Piccolo, 2009; Tompson & Dass, 2000).
- They allow participants to develop a global perspective, to connect learning with real-world situations and to get close to the realities of a competitive business world (Faria & Dickinson, 1994; Haapasalo & Hyvonen, 2001; Hoberman & Mailick, 1992; Lainema & Hilmola, 2005).
- Because they are dynamic, simulation games allow "students to experience the impact of change over time" (Cook & Swift, 2006, p. 38). They are also particularly useful to help students understand systemic effects and unintended consequences (Machuca, 2000).
- Participants are active throughout the learning process (Faria & Dickinson, 1994) and have to make sense of and integrate a complex decision process (Zantow et al., 2005) instead of simply applying a set of rules or regurgitating theory; they are thus responsible for their own learning (Adobor & Daneshfar, 2006).


Fig. 1. Outline of HECOpSim game. (Flowchart; its content is summarized below.)

- At the start, students receive information on: products (bill of materials, production lead times); past sales for each product; suppliers (cost and delivery time of raw material); workforce (minimum and maximum number of employees, capacity per employee); labor costs (regular and overtime); inventory costs (for raw material, work-in-progress and finished products); stockout costs; and desired inventory at the end of the game.
- The educator presents the company and the simulation rules; students form teams and prepare for the simulation game.
- Each round, teams make decisions: forecast sales for each product; decide how much to fabricate; decide what has to be purchased; decide the number of employees.
- The educator receives the decisions from each team, updates the information (sales, stocks, costs) and provides new information: actual sales, actual delivery lead times, actual productivity of workers, and current profit and ranking of all teams.
- Possible mistakes at each round: planned production exceeds capacity; not enough material to assemble planned quantities; various calculation errors.
- Performance factors: poor forecast; poor choice of supplier; poor estimation of workers' capacity; poor trade-offs (e.g., buying subassemblies instead of working overtime); poor decisions regarding layoffs or hiring; not enough buffer; excess final inventory (last period of the game only).
- The loop repeats while the number of rounds is less than 10. At the end of the game, the educator analyses the results and identifies the winning team; students analyse their results, modify previous decisions and evaluate the impact; the session closes with feedback and discussion.
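The round-by-round loop outlined in Fig. 1 can be sketched in code. The sketch below is a hypothetical reconstruction: the game itself ran on spreadsheets, and every name, distribution and parameter here is an illustrative assumption, not the authors' implementation.

```python
import random

# Hypothetical sketch of the HECOpSim round loop shown in Fig. 1.
# All names, distributions and parameters are illustrative assumptions.

N_ROUNDS = 10  # the flowchart loops while "# rounds < 10"

def play_round(decision, rng):
    """Apply one team's decisions and draw the stochastic outcomes."""
    # Actual sales, delivery lead times and worker productivity are
    # stochastic; teams only learn them after committing their decisions.
    actual_sales = max(0, round(rng.gauss(decision["forecast"], 10)))
    actual_productivity = rng.uniform(0.9, 1.1)
    actual_lead_time = rng.choice([1, 2])
    return {"actual_sales": actual_sales,
            "actual_productivity": actual_productivity,
            "actual_lead_time": actual_lead_time}

def run_game(team_decisions, seed=0):
    """Run all rounds; between rounds the educator would publish updated
    sales, stocks, costs, and the current profit and ranking of all teams."""
    rng = random.Random(seed)
    history = []
    for rnd in range(N_ROUNDS):
        for team, decision in team_decisions.items():
            history.append((rnd, team, play_round(decision, rng)))
    return history

history = run_game({"team_A": {"forecast": 100},
                    "team_B": {"forecast": 120}})
```

A fixed seed makes the stochastic draws reproducible across teams, which mirrors the fact that all teams in a given game face the same realized demand and lead times.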

- Simulation games are said to be more engaging and motivating than other teaching strategies (Garris, Ahlers, & Driskell, 2002; Salas et al., 2009).

In addition to providing an enhanced learning experience, simulation games allow participants to learn complex skills in what can be characterized as an "enhanced reality":

- Simulation games are essentially "risk-free"; no one will suffer from mistakes or poor decisions made (Adobor & Daneshfar, 2006; Baker, Navarro, & van Der Hoek, 2005; Faria & Dickinson, 1994; Fripp, 1993; Salas et al., 2009).
- The absence of real risk allows participants to increase their confidence level in a less stressful but still stimulating environment (Alinier, 2003).


Fig. 2. The “decision” sheet.

Fig. 3. The "information" sheet: Actual data of each period (stochastic variables).


Table 4
Anatomy of each mistake.

MRP – type I
- Inputs of the related decision: number of purchased items in each sub-assembly item; number of sub-assembly items that have to be assembled during the period; actual stock of purchased items at the beginning of the period.
- Origin of the mistake: calculation error; last-minute increase of the number of sub-assembly items to assemble not taken into account; actual inventory of purchased items is lower than expected.
- Factors of complexity: the number of purchased items that is necessary depends on the production plan for sub-assembly items; some suppliers have a stochastic lead time.

MRP – type II
- Inputs of the related decision: number of sub-assembly items required for each finished product; number of finished products that have to be assembled during the period; actual stock of sub-assembly items at the beginning of the period.
- Origin of the mistake: calculation error; last-minute increase of the number of finished products to assemble not taken into account; actual inventory of sub-assembly items is lower than expected.
- Factors of complexity: the number of sub-assembly items that is necessary depends on the production plan for finished products; previously launched sub-assembly items may be unavailable because of a previous MRP type I mistake.

Capacity
- Inputs of the related decision: available theoretical capacity of each resource; quantity of available resources; anticipated capacity utilization; production load.
- Origin of the mistake: calculation error; variation of the quantity of resources not taken into account; last-minute increase of the number of sub-assembly items or finished products to assemble not taken into account; available capacity is lower than expected.
- Factors of complexity: previously launched sub-assembly items may be unavailable because of a previous capacity mistake; the total production load depends on the production plan for each item (sub-assembly or finished product); actual productivity of workers (thus, effective capacity) is stochastic.
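The three mistake types in Table 4 amount to feasibility checks on each team's plan. The sketch below illustrates them with a simplified flat product structure; the parameter names and data model are assumptions for this illustration, not the game's actual spreadsheet logic.

```python
# Illustrative feasibility checks for the mistake types in Table 4.
# Parameter names and the flat data model are assumptions for this sketch.

def mrp_type_1_ok(plan_subassemblies, items_per_subassembly, stock_purchased):
    """MRP type I: enough purchased items for the planned sub-assemblies?"""
    return plan_subassemblies * items_per_subassembly <= stock_purchased

def mrp_type_2_ok(plan_finished, subs_per_finished, stock_subassemblies):
    """MRP type II: enough sub-assemblies for the planned finished products?"""
    return plan_finished * subs_per_finished <= stock_subassemblies

def capacity_ok(production_load, n_employees, capacity_per_employee,
                utilization=1.0):
    """Capacity: does the planned load fit the effective capacity?

    In the game, actual worker productivity is stochastic, so a plan
    close to this limit still risks a capacity mistake.
    """
    return production_load <= n_employees * capacity_per_employee * utilization

# A plan that ignores how purchased-item needs depend on the
# sub-assembly plan triggers an MRP type I mistake:
assert mrp_type_1_ok(plan_subassemblies=50, items_per_subassembly=4,
                     stock_purchased=200)
assert not mrp_type_1_ok(plan_subassemblies=60, items_per_subassembly=4,
                         stock_purchased=200)
```

The chaining described in the "factors of complexity" column follows directly: an MRP type I failure reduces the stock of sub-assemblies available later, which can in turn cause an MRP type II failure even for a plan that looked feasible on paper.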

- Simulation games can present situations that rarely occur in real life yet require specific and critical skills (Baker et al., 2005; Salas et al., 2009). In the same vein, they allow for multiple repetitions of similar situations, thus accelerating learning (Baker et al., 2005; Faria & Dickinson, 1994; Salas et al., 2009).
- Contrary to real life, it is possible to stop the simulation game and take time to reflect, or do a partial re-run to evaluate the effects of alternative decisions (Cousens, Goffin, Mitchell, van der Hoven, & Szwejczewski, 2009; Parush, Hamm, & Shtub, 2002; Rosenørn & Kofoed, 1998). Participants can also quickly get clear and meaningful feedback on their actions, which is rarely possible in real life (Bakken, Gould, & Kim, 1992; Faria & Dickinson, 1994; Hoberman & Mailick, 1992).
- Overly complex situations can be simplified and made more manageable (Cook & Swift, 2006).

However, simulation games are also fraught with difficulties:

- They may be costly. Not only do development and testing require a substantial amount of time (Damron, 2008; Ziv, Small, & Wolpe, 2000), in a computer environment they also have to be adapted whenever new software versions are introduced (in our own experience, it was often necessary to adapt simulations running on Excel to reflect changes in how macro functions were defined and executed in new versions). Nevertheless, Salas et al. (2009) argue that simulation-based training is often little more expensive than other training methods.
- Teachers and trainers must be well prepared (Ziv et al., 2000), able to answer questions, and able to reassure students who become frustrated with technical difficulties (Doyle & Brown, 2000; Haapasalo & Hyvonen, 2001). This is particularly important when there are multiple sections of the same course, because all teachers must be equally comfortable with the simulation. This may be difficult where there is a high turnover of teaching personnel in a specific course.
- In the case of computerized simulations, there must be sufficient computers available for all students, if they do not possess their own.
- If simulation games are played during class time, they may seem to consume an excessive amount of time compared with other teaching strategies. However, if they are played outside of class time, teachers will have to devote a considerable amount of time to manage the simulation, answer questions and provide feedback.
- Students may be concerned with how heavily their grade will depend on the simulation (Betts & Knaus, 2006) and may become frustrated if they feel that they have done their best but still failed to "win the game." This frustration can be compounded by team management problems (free riders, divergent opinions on decisions that have to be made, etc.) (Adobor & Daneshfar, 2006; Anderson, 2005; Betts & Knaus, 2006). For teachers, it may not be easy to develop grading strategies that are fair yet do not consume too much of their time.

Table 5
MRP mistake – type I (purchased items) descriptive statistics.

                                          Mean    Standard deviation    Minimum    Maximum
Did not make any mistakes (N = 34)        0       0                     0          0
Made at least one mistake (N = 66)
  First half                              1.64    1.94                  0          10
  Second half                             1.73    1.93                  0          9
Global results (N = 100)
  First half                              1.08    1.76                  0          10
  Second half                             1.14    1.77                  0          9

Table 6. Trends in MRP mistakes – type I (purchased items).

                                            Frequency    Sum of ranks
Made at least one mistake
  Made fewer mistakes during second half    28           871.5
  Made more mistakes during second half     32           958.5
  Made the same number of mistakes          6
Made no mistakes                            34
Total                                       100
Improvement hypothesis - p values: sign test 0.699; Wilcoxon test 0.752 (87.0)

- If learners perceive a simulation to be unrealistic, they may not take it seriously or may lose the motivation to play (Adobor & Daneshfar, 2006). Nevertheless, from a learning standpoint, cognitive fidelity (i.e., how similar the decision processes deployed in the simulation game are to those of a real-life situation) is more important than "mundane physical fidelity" (Salas et al., 2009). Machuca (2000) and Lainema and Hilmola (2005) argue that in order to really help students' learning, simulations have to be 'transparent': students must be able to access the internal structure of the software and understand why they obtained particular results. Unfortunately, commercial simulation games are generally 'black boxes'; even teachers who develop their own simulation games may be protective of their ownership rights and mask the internal functioning of their software or spreadsheet.

2.4. Evaluating the efficacy of simulation games

Considering the costs of developing and managing simulation games, in terms of both time and money, it is only natural to question whether simulation games are really worth the effort. Indeed, most research articles on the topic provide some measure of their effectiveness. To better understand the various types of measurement used, we have classified them according to Salas et al.'s (2009) categories of outcomes. These authors build on Kirkpatrick (1976) and argue that there are four basic kinds of training outcome: reactions (how learners subjectively react to the training strategy), learning (how much has been learned using this strategy), behavior (how competent learners have become), and results (the extent to which learners perform better in real life and attain superior results). Whereas the first two outcomes can be measured immediately after the training activity, the last two can be assessed only on the job, well after the training has finished. Salas et al. (2009) further divide learning outcomes into three subgroups: cognitive-based outcomes (the amount and type of knowledge gained), skill-based outcomes (know-how), and affective-based outcomes (changes in attitude and motivation).

As Tables 1, 2 and 3 illustrate, most studies of management simulation games evaluate learners' reactions to the new learning tool, a conclusion similar to that reached by Gosen and Washbush (2004) in their review of the literature on experiential learning assessment. Attempts to measure other types of outcome are less common, particularly when simulation games are used for teaching management. We found several recent examples of research measuring cognitive-based outcomes in medicine and nursing education. However, these studies do not measure the impact of simulation games; they measure the learning outcomes of high-fidelity simulations (for instance, performing resuscitation techniques on electronic mannequins simulating human patients). Of the eight articles reviewed, three measured learning outcomes with both theoretical and clinical exams (Ackermann, 2009; Alinier, Hunt, & Gordon, 2004; Cherry, Williams, George, & Ali, 2007), two used theoretical exams only (Nguyen, Daniel-Underwood, Van Ginkel, Wong, Lee, Lucas et al., 2009; Smolle, Prause, & Smolle-Juttner, 2007), and three used clinical exams only (Nackman, Bermann, & Hammond, 2003; Tsai, Harasym, Nijssen-Jordan, & Jennett, 2006; Wyatt, Fallows, & Archer, 2004). In all but one of these studies, the use of a simulator had a significant and positive impact on both knowledge and skills. Cherry et al. (2007), however, could not find any significant difference in post-test exams between a group of medical students using a simulator and a group exposed to traditional teaching methods.

Tables 1 and 3 indicate a high degree of student appreciation of simulation games in management and related fields. The predominant perception is that simulation games are better learning tools than traditional lectures or even case studies. Whether or not this is actually the case, participants feel that they have learned something, which in itself is encouraging, because satisfied students may be motivated to learn more. Nevertheless, Anderson (2005) shows that while team cohesiveness and independence positively affect students' feelings toward the simulation, positive feelings do not necessarily result in strong performance. To our knowledge, only four studies measure cognitive-based outcomes for management simulation games (first part of Table 2, second part of Table 3). Only one showed unequivocally positive results (Washbush & Gosen, 1998), two showed mixed results (Fowler, 2006; Pfahl, Laitenberger, Ruhe, Dorsch, & Krivobokova, 2004), and one could not find any significant difference between simulation games and case studies (Mitchell, 2004). More studies comparing cognitive-based outcomes between test and control groups are needed.

Table 7. MRP mistakes – type II (sub-assemblies) descriptive statistics.

                                          Mean    Standard deviation    Minimum    Maximum
Did not make any mistakes (N = 31)        0       0                     0          0
Made at least one mistake (N = 69)
  First half                              2.04    1.86                  0          8
  Second half                             1.17    1.49                  0          6
Global results (N = 100)
  First half                              1.41    1.81                  0          8
  Second half                             0.81    1.35                  0          6

Table 8. Trends in MRP mistakes – type II (sub-assemblies).

                                            Frequency    Sum of ranks
Made at least one mistake
  Made fewer mistakes during second half    42           1326.5
  Made more mistakes during second half     18           503.5
  Made the same number of mistakes          9
Made no mistakes                            31
Total                                       100
Improvement hypothesis - p values: sign test 2.7e-03; Wilcoxon test 2.5e-03 (823.0)

Results for skill-based outcomes seem more conclusive (second part of Table 2, first part of Table 3). All studies indicate that playing simulation games improves performance. The three studies that measured the influence of feedback and history records show a strong correlation with performance. Parush et al. (2002) report that users of a simulator who had access to history recordings (series of past decisions and their outcomes) and an undo mechanism (to undo decisions and go back in time) during a preliminary phase performed better in terms of profit (p < 0.01) and due dates (p < 0.01) than users who did not have such access. When this access was removed, the better performance with respect to due dates (p < 0.05) remained, even with new game scenarios. Davidovitch, Parush, and Shtub (2008) studied the effect of history recording mechanisms. Their results reveal that using manual or automatic history recording mechanisms improved learning (evolution of profit in four successive rounds of play), reduced forgetting (evolution of profit after a break period) and improved relearning (evolution of profit in two successive new rounds of play, after a break). Their findings also indicate that when the break period is short (two weeks), a manual history-keeping mechanism is associated with better learning results than automatic history keeping, but that when it is long (four weeks), there is no significant difference. Langley and Morecroft (2004) show that users who have access to various task-structure feedbacks (causal map, decision recommendation, or both) perform significantly better in early trials than users who do not. However, in later trials (starting at round five) the performance of all groups reaches a similar plateau. The sole study measuring affective-based outcomes (Cousens et al., 2009) reports positive results.

Even though these results support the validity of simulation games, Clark (2009) contends that there continues to be "much debate about the legitimacy and the fact that the studies have not yet produced firm conclusions." The present study contributes to the recent literature by presenting a new simulation game developed by the first author (HECOpSim) and analyzing its impact on operations management education. The focus is on skill-based outcomes. Although there have been several studies measuring this type of outcome, they generally rely on the simulated firm's economic performance to measure learning, an approach that has been strongly criticized. For example, Gosen and Washbush (2004) point out that many empirical studies suggest that performance and learning do not covary, which leads them to conclude that "it is inappropriate to assess simulations using performance as a measure of learning." Thus, instead of evaluating a simulated firm's performance, we carefully analyzed each of the students' decisions to identify specific technically wrong decisions (mistakes) that the simulation game was intended to reduce. Measuring learning by tracking the number of mistakes has already been done in medicine (see Iacopini, Frontespezi, Vitale, Villotti, Bella, D'Alba et al., 2006). However, to the best of our knowledge, it has never been done in logistics or management.

3. Description of the new simulation game

Fig. 1 contains a flowchart that outlines the various steps involved in playing and managing HECOpSim. The simulation game, played on a set of linked spreadsheet documents, puts each team of students in charge of a single manufacturer, operated independently by that team. The simulation focuses on two finished products manufactured by the firm. Teams (of four to five students) are provided with the sales figures for the last five years. Before the game begins, they must establish the sales forecasts for each product for the next 10 periods.
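The article leaves the forecasting method entirely up to each team. As a purely hypothetical illustration (the sales figures below are invented), a team could fit a simple linear trend to the historical sales and project it over the coming periods:

```python
# Hypothetical sketch: fitting a least-squares linear trend to past sales
# and projecting demand ahead. Illustrative only -- HECOpSim does not
# prescribe a forecasting technique, and these figures are made up.

def linear_trend_forecast(history, horizon):
    """Fit a line through (1, y1)..(n, yn) and project 'horizon' steps ahead."""
    n = len(history)
    xs = range(1, n + 1)
    x_mean = sum(xs) / n
    y_mean = sum(history) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, history)) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return [intercept + slope * (n + k) for k in range(1, horizon + 1)]

# Made-up sales history for one product (units per year):
past_sales = [1000, 1080, 1150, 1260, 1330]
forecasts = linear_trend_forecast(past_sales, horizon=3)
```

In practice, teams would also have to account for the stochastic demand the game generates, so a point forecast like this would only be a starting point for their purchasing and assembly decisions.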
All teams receive the following information: the bills of materials and routings for each product and sub-assembly; the time required to complete each manufacturing activity; the regular working hours and maximum overtime constraints; and the delivery lead time for each supplier and each item, along with the starting inventory levels for each item. They are also given the various costs that they will have to consider: the prices offered by various suppliers for each raw material; storage costs for raw materials, work in progress, and finished goods; ordering costs and stockout costs; labor costs (regular rate and overtime rate); and hiring and layoff costs.

For each of the 10 periods of the game, the teams must decide on the amount of raw materials to purchase and the number of sub-assemblies and finished products to assemble. Every other period, each team must also submit a hiring and layoff plan for both the current and subsequent periods. Fig. 2 shows the "decision form" sheet that each team must complete during the game. For each production period, there is a deadline for making these decisions, after which all teams receive a new set of data that will influence the decisions made during the ensuing period. This data consists of the actual sales of each finished product during the period, the actual delivery lead times of raw materials purchased during the period, and the productivity of workers during the period (all these variables are stochastic). Fig. 3 shows the "info form" sheet on which this information is presented. Students are also informed of the consequences of their decisions in this stochastic environment. The document includes several spreadsheets that instantly calculate the cumulative profit and current ranking of all teams, stock levels of all items, detailed costs, and capacity utilization.

All information provided to the teams before or during the game (past sales, sales, costs, etc.) is randomly generated but remains the same for all teams in the same section. However, different sections of the same course receive different information. When the game is over, each team is given the opportunity to change five of the decisions made during the simulation (for instance, the quantity of a specific item launched at a specific period) to increase their profit. Both the original and modified profits are taken into account in the evaluation of the teams' performance. The teams must also write a report presenting the tools and documents they prepared to play the game, an analysis of their performance during the game, and an explanation of the five decisions they decided to modify. Following the simulation, the teacher discusses with the class the major errors made during the game, and the best teams are invited to describe how they prepared for and played the game.

The simulation game was first introduced, after a 15-week development phase, in the winter semester of 1997. Only one group participated in this experiment, which was very favorably received by the students. There were no specific questions about the simulation in the standard student evaluation form, but a large proportion of the class made positive comments about the game in the "comments" section; no negative opinions were expressed.

The first simulation highlighted the need to develop tools to facilitate the management of the simulation by teachers. Consequently, a website programmed in Perl was developed during the summer of 1997. This site contained a public zone where students could register their teams before the game and receive information during the game, a private zone where, after logging in, each team could enter its decisions during the game, and a private zone where teachers could initialize and start the game, and start or stop a period. In addition, an Excel spreadsheet was developed so that teachers could generate and manage the teams' passwords, define the parameters of a specific game, create a game scenario, load a scenario onto the server, download each team's decisions, identify mistakes, calculate the profit made, and post the teams' ranking on the server during the game. The improved version of the simulation, which required another 10 weeks of development, was used between 1998 and 2002 for all sections of the introductory operations management course in the Bachelor in Business Administration program at HEC Montreal. The simulation has also been used since 2008 in a graduate program in Logistics at HEC Montreal, and since 2007 in a graduate program in Supply Chain Management at ESSEC.

The Perl website was replaced by a Google Docs environment in 2009, after an additional development phase of 5 weeks. Google Docs makes the simulation game very easy to manage, and, as Faria et al. (2009) observe, students now expect this type of learning material to be accessible on the Internet. This new version of the simulation consists of a connected system of multiple team spreadsheets and one educator spreadsheet. Educators use their spreadsheet to set the values of the parameters before the game begins and to generate the information that will be gradually supplied to the students. This spreadsheet is also used during the game to launch each new period by providing an updated info form, to dynamically import the results of each team (i.e., the profit that their simulated firm has made so far), and to calculate their ranking. Each team has its own spreadsheet that enables students to dynamically import newly available information (updated info form and ranking of all teams), to enter their decisions for the next period, and to visualize their results.

Table 9. Capacity mistakes descriptive statistics.

                                          Mean    Standard deviation    Minimum    Maximum
Did not make any mistakes (N = 62)        0       0                     0          0
Made at least one mistake (N = 38)
  First half                              1.55    1.13                  0          4
  Second half                             0.39    0.79                  0          3
Global results (N = 100)
  First half                              0.59    1.03                  0          4
  Second half                             0.15    0.52                  0          3

Table 10. Trends in capacity mistakes.

                                            Frequency    Sum of ranks
Made at least one mistake
  Made fewer mistakes during second half    30           535.0
  Made more mistakes during second half     4            60.0
  Made the same number of mistakes          4
Made no mistakes                            62
Total                                       100
Improvement hypothesis - p values: sign test 6.2e-06; Wilcoxon test 5.1e-05 (475.0)

4. Methods

The data presented here were gathered from the 100 teams of undergraduate students that played the simulation game during the winter 2002 semester. The mean age of these students was 20.9 years, and women represented 50% of the total. Less than 2% of the participants had previous experience with the simulation; these were students who had failed the course in a previous semester, and they had played with another game scenario.

As shown in Fig. 1, the decisions that the students have to make during the game can lead to two types of problems: mistakes (planning production in excess of the available plant capacity, planning production without having enough sub-assemblies or purchased parts, etc.) and poor performance (poor forecasts, poor choice of suppliers, poor trade-offs, etc.). As indicated above, instead of evaluating performance, we assess the impact of the simulation on operations management education by focusing on the technical mistakes made during the game. Analyzing the evolution of a game variable between successive rounds of play to assess the pedagogical impact of a simulator is common in medicine; for instance, Iacopini et al. (2006) studied an error rate and the time needed to complete a task. However, the analysis of the evolution of learners' mistakes is far less common than the analysis of the progression of the simulated system's performance. To the best of our knowledge, this method has never been applied in the field of management.
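A minimal sketch of the mistake checks just listed (planning beyond plant capacity, planning without enough purchased parts or sub-assemblies) is given below. It is an illustration only: the actual simulation runs on linked spreadsheets, and the item names, one-level bill of materials, single-product launch and single-period capacity used here are all invented simplifications.

```python
# Hypothetical sketch of the mistake checks: the simulation's real rules
# (multiple products, overtime, stochastic productivity) are simplified away.

def check_launch(launch_qty, bom_purchased, bom_subassembly,
                 stock, capacity_hours, hours_per_unit):
    """Return the list of mistake types triggered by a planned production launch."""
    mistakes = []
    # MRP mistake type I: not enough raw materials / purchased items on hand.
    for item, qty_per_unit in bom_purchased.items():
        if launch_qty * qty_per_unit > stock.get(item, 0):
            mistakes.append(("MRP type I", item))
    # MRP mistake type II: not enough sub-assemblies on hand.
    for item, qty_per_unit in bom_subassembly.items():
        if launch_qty * qty_per_unit > stock.get(item, 0):
            mistakes.append(("MRP type II", item))
    # Capacity mistake: planned workload exceeds the period's labor capacity.
    if launch_qty * hours_per_unit > capacity_hours:
        mistakes.append(("capacity", None))
    return mistakes

# Example: plan to assemble 50 units of a finished product.
mistakes = check_launch(
    launch_qty=50,
    bom_purchased={"screw": 4, "frame": 1},        # quantities per finished unit
    bom_subassembly={"motor_subassembly": 1},
    stock={"screw": 400, "frame": 60, "motor_subassembly": 40},
    capacity_hours=120.0,
    hours_per_unit=2.0,
)
# 50 units need 200 screws (ok), 50 frames (ok), 50 motor sub-assemblies
# (only 40 on hand -> MRP type II) and 100 h of labor (ok).
```

The point of the sketch is that each mistake type is a mechanical comparison between a planned launch and the current state of the system, which is exactly what makes the mistakes countable round by round.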
We studied three different types of mistakes. We considered two material requirements planning (MRP) mistakes, which consisted of launching the production of an assembly item without having enough raw materials and purchased items (MRP mistake type I) or sub-assemblies (MRP mistake type II). We also studied the overestimation of the overall capacity of the plant; this third mistake consisted of launching the production of a set of assembly items that could not all be produced during the same period because of a lack of capacity. Table 4 presents the types of mistakes and their origin, the inputs of the decisions to which they are related, and the factors that make these decisions more complex.

In addition to these three types of mistakes, we considered the performance result of lost sales stemming from stockouts, which occur when the firm does not have enough finished products to fully satisfy the demand for a period. Because the game allows players to ask suppliers for an express delivery of the two finished products (albeit at a very high price), lost sales stemming from stockouts are treated as a mistake rather than as poor performance.

The simulation was held two or three weeks after the mid-term exam. The subjects covered in the simulation, notably MRP, production planning and inventory management, constituted an important part of that exam; the students had thus already covered the concepts required to master the simulation. One week before the mid-term exam, students had completed a homework assignment covering the same subjects. Furthermore, the students received an annotated copy of their homework a few days before the exam, and an annotated copy of their exam at least one week prior to the start of the simulation. In line with the terminology presented earlier, this research is a skill-based learning outcome study that analyzes the progression of learners' mistakes and of a simulated firm's performance, in an environment where standard lectures and problem-solving exercises were also present.

Table 11. Lost sales (stemming from stockouts) descriptive statistics.

                                       Mean      Standard deviation    Minimum    Maximum
Had no lost sales (N = 4)              0         0                     0          0
Had at least one lost sale (N = 96)
  First half                           434.88    681.48                0          3643
  Second half                          360.07    853.73                0          4997
Global results (N = 100)
  First half                           417.48    673.04                0          3643
  Second half                          345.67    839.31                0          4997

5. Results

In this section, we present the lost sales and the mistakes made by the undergraduate teams during the first and second halves of the game. Because the variables we measured did not follow a Gaussian bell-shaped distribution, we opted for non-parametric tests (the sign test and the Wilcoxon signed-ranks test) rather than Student's t-test to calculate the p values.

Table 5 shows that the mean and the standard deviation of the number of MRP mistakes (type I - purchased items) made by the teams were stable during the game.
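The two paired tests used for these comparisons can be sketched with the standard library alone. The per-team counts are not published in the article, so the worked example below uses made-up mistake counts; the count-based sign test, however, can be checked against the aggregate counts reported for the type I MRP mistake (28 teams improved, 32 worsened), which reproduces the reported p value of 0.699.

```python
# Sketch of the sign test (exact) and the Wilcoxon signed-ranks test
# (normal approximation with midranks for ties). The team-level data
# below are invented for illustration.
import math

def sign_test_p(n_plus, n_minus):
    """Two-sided exact sign test; teams with no change (ties) are discarded."""
    n = n_plus + n_minus
    k = min(n_plus, n_minus)
    p = 2 * sum(math.comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(p, 1.0)

def wilcoxon_signed_rank_p(first_half, second_half):
    """Two-sided Wilcoxon signed-ranks test; assumes at least one non-tied pair."""
    diffs = [a - b for a, b in zip(first_half, second_half) if a != b]
    n = len(diffs)
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:                      # assign average ranks to tied |differences|
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    w_plus = sum(ranks[i] for i in range(n) if diffs[i] > 0)
    w = min(w_plus, n * (n + 1) / 2 - w_plus)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    return min(1.0, math.erfc((mean - w) / sd / math.sqrt(2)))  # 2 * Phi(z)

# Made-up mistake counts for eight teams, first vs. second half of the game:
first_half = [3, 2, 0, 4, 1, 2, 5, 0]
second_half = [1, 2, 0, 2, 0, 1, 3, 1]
p_sign = sign_test_p(n_plus=sum(a > b for a, b in zip(first_half, second_half)),
                     n_minus=sum(a < b for a, b in zip(first_half, second_half)))
p_wilcoxon = wilcoxon_signed_rank_p(first_half, second_half)
```

Note that the article's slightly different Wilcoxon p values may come from exact computation or tie corrections; the normal approximation above is only one common variant.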
Table 6 indicates that 28 teams reduced the number of times this MRP mistake was made during the game, whereas 32 teams made it more often during the second half than during the first. According to paired non-parametric tests with a p level of 5%, this slight difference is not statistically significant (sign test p value = 0.699; Wilcoxon signed-ranks test p value = 0.752). Thirty-four teams made no such mistakes during the entire game. Taking into account the lectures, homework and exam, it appears that the students had already reached a steady-state skill level, and that the added value of the simulation was not significant with regard to the calculation of raw materials and purchased items.

Traditional teaching strategies also seem to have been quite effective with regard to the other type of MRP mistake. As Table 7 indicates, 31 teams made no MRP mistakes (type II - sub-assemblies) during the game. However, Tables 7 and 8 show that learning improved significantly for those who had not mastered this topic prior to the game. Both the mean and the standard deviation of the number of mistakes decreased sharply during the game. Furthermore, Table 8 shows that 42 teams reduced the number of MRP mistakes (type II - sub-assemblies) made during the game, whereas only 18 teams made more mistakes during the second half than during the first. According to paired non-parametric tests with a p level of 1%, this improvement is statistically significant (sign test p value = 2.7e-03; Wilcoxon signed-ranks test p value = 2.5e-03).

Traditional teaching strategies also appeared quite effective with regard to capacity mistakes: 62 of the 100 teams did not make that sort of mistake during the game. However, for the teams that did make capacity mistakes, there was a significant improvement during the game. Table 9 shows that the mean and standard deviation of the number of capacity mistakes decreased dramatically during the game. Furthermore, Table 10 shows that 30 teams reduced the number of capacity mistakes made during the game, whereas only four teams increased it. According to paired non-parametric tests with a p level of 1%, the difference between the mistakes made during the first and second halves of the game is statistically significant (sign test p value = 6.2e-06; Wilcoxon signed-ranks test p value = 5.1e-05). In this specific case, the simulation thus helped some students learn skills that they had not been able to fully master with the traditional education trio (lectures, exercises, and exams).

Table 11 indicates that the mean of the lost sales stemming from stockouts decreased during the game. Table 12 shows that 58 teams reduced their lost sales, while 38 teams had more lost sales during the final part of the game than during the first part. According to paired non-parametric tests with a p level of 10%, the difference between lost sales during the first and second halves is statistically significant (sign test p value = 0.052; Wilcoxon signed-ranks test p value = 0.032). Adding the simulation was thus useful for helping students who had learned inventory management through traditional education methods. However, as shown in Table 12, the simulation game was not a panacea: some teams displayed very poor performance during the final part of the game (see the increase in the maximum and standard deviation).

Table 12. Trends in lost sales (stemming from stockouts).

                                             Frequency    Sum of ranks
Had at least one lost sale
  Had fewer lost sales during second half    58           1739.5
  Had more lost sales during second half     38           2916.5
  Had the same quantity of lost sales        0
Had no lost sales                            4
Total                                        100
Improvement hypothesis - p values: sign test 0.052; Wilcoxon test 0.032 (1177.0)

6. Discussion

As highlighted in Fig. 4, the results show that while traditional teaching methods appear effective in operations management education (few teams made many mistakes during the entire game), adding a simulation provides significant help to those who did not master all of the topics presented during lectures.

Fig. 4. The learning outcomes of traditional teaching methods and HECOpSim simulation game.

Our results also highlight the unique advantages of simulation games for teaching certain skills. Adding the simulation proved to be of limited value for MRP type I mistakes, but it significantly decreased the other types of mistakes and improved performance results. Although some of the purchased items had random lead times, avoiding MRP type I mistakes simply consisted in calculating the right amounts to buy; this straightforward skill can easily be taught using standard teaching methods. However, avoiding MRP type II mistakes and lost sales required more than merely calculating the right quantities. Students had to take into account that mistakes made during previous periods could have led to the manufacture of fewer than the planned number of sub-assemblies, making the end-of-period stock lower than expected. The simulation game helped students understand that MRP is an ongoing activity and that the plan must continually be monitored and adjusted in response to current events and last-minute changes. While this kind of knowledge can be well understood theoretically, through lectures and static problem-solving, for many students it cannot be fully grasped until it has been experienced. Our findings are thus consistent with Machuca's (2000) statement that simulation games are useful tools for helping students understand systemic effects and unintended consequences, and with Zantow et al.'s (2005) conclusion that they help to make sense of, and integrate, complex decision-making processes.

6.1. Research limitations

This study has a few limitations. The data sample, which appears large relative to other research, is nonetheless limited. Furthermore, the nature of the data (decisions made during the simulation) does not distinguish between improvements due to better operations management skills and improvements due to a better understanding of the simulator.
Nevertheless, precautions were taken to ensure that a better understanding of the simulator would not have a major impact on our results. First, all teams had the support of teachers and teaching assistants before and during the simulation to help them use and understand the functioning of the simulator. In addition, the teams faced only minimal time constraints during the first periods of the game (students had a few days to make their first decision, an hour to make the second decision, 45 min for the third, 30 min for periods 4 to 6, and 20 min for all other periods). Finally, our analysis focused on mistakes and lost sales without considering the variables most likely to be influenced by a better understanding of the simulator, for example, the time required to complete a task.

Given the methodological issues raised by Gröbler (2004) concerning the use of simulators in teaching and experimentation, two more limitations can be noted. Gröbler (2004) recommends obtaining background information from respondents, such as learning styles, IQ, expertise in working with computers, and past experience with the kind of tasks performed during the game, whereas the background information we requested was more limited. Furthermore, although the fidelity (a good representation of the relevant reality) of the simulation game was attested to by the professors and lecturers who used it in class, some of whom had professional experience in production planning and inventory management, the external validity of the simulation (the transferability of the knowledge and insights acquired during the simulation experience to reality, and vice versa) was not tested.

The results of our research should thus be interpreted with caution. The lack of significant improvement in MRP type I mistakes during the simulation game does not invalidate the effectiveness of this teaching method; it simply implies that the knowledge necessary for students to master this kind of decision-making process can be acquired effectively through the traditional teaching trio (lectures, exercises, and exams). Furthermore, learning may have occurred during the preparation period of the simulation, which could have reduced the amount of learning observed during the game itself. Indeed, during the preparation period teams had access to the simulator; even though they did not have access to the real game scenario, they could create their own scenarios and prepare for the regular rounds of play. To ascertain which teaching method is more effective or efficient, a comparative study would be required between groups using simulation games as a primary learning method and control groups not using the simulation.

6.2. Conclusions

The increasing availability of new technologies allows the development of tools that can simulate rich and dynamic learning environments. The fact that the health sector, which has many similarities with management (the importance of knowledge, judgment, and adaptability), is eager to develop high-fidelity simulators and assess their educational impacts should encourage operations management researchers to intensify efforts in their own field.

Our research has introduced a new method for measuring skill-based learning. Instead of relying solely on an overall performance measurement, as is usually done in the skill-based learning literature, we analyzed the individual types of mistakes students tend to make. We could consequently better ascertain the effectiveness of using simulation games to acquire simple or more complex decision-making skills. Our study shows that, in the case of simple decision-making skills (e.g., deciding the right number of purchased items to buy), adding a simulation game after using traditional teaching methods may be of limited value. The results also show that traditional teaching methods may, for some students, suffice to foster more complex decision-making skills. However, adding a simulation game is of significant value for learners who have not mastered complex decision-making skills after traditional teaching. Indeed, more complex decisions, such as the quantities to assemble, which must constantly be monitored and adjusted in response to ongoing events, prior decisions (e.g., past depletions of stock and the number of purchased items procured) and recent mistakes (e.g., the planned production of the previous period could not be met entirely because of a lack of capacity), may not be fully understood without being experienced.

Our research, in line with Machuca (2000) and Zantow et al. (2005), highlights that simulation games are valuable tools for helping students master complex decision-making skills. Nevertheless, much work remains to determine the contexts best suited to the astute use of simulation games as teaching tools. The next step would be to characterize different learning contexts (types of learners, types of tasks, time constraints, objectives, etc.) and to study the relations between these contexts and the effectiveness of different learning methods. Another interesting step would be to develop cognitive scales to measure various operations management and management skills. Because affective-based learning has not yet attracted much attention, it would also be useful to develop scales that could measure students' self-confidence regarding particular sets of tasks, or their attitude towards teamwork, a skill that is crucial in today's workplace.


References

Abt, C. C. (1970). Serious games. New York: Viking Press.
Ackermann, A. D. (2009). Investigation of learning outcomes for the acquisition and retention of CPR knowledge and skills learned with the use of high-fidelity simulation. Clinical Simulation in Nursing, 5, e213–e222.
Adobor, H., & Daneshfar, A. (2006). Management simulations: determining their effectiveness. Journal of Management Development, 25(2), 151–168.
Alinier, G. (2003). Nursing students' and lecturers' perspectives of objective structured clinical examination incorporating simulation. Nurse Education Today, 23(6), 419–426.
Alinier, G., Hunt, W. B., & Gordon, R. (2004). Determining the value of simulation in nurse education: study design and initial results. Nurse Education in Practice, 4(3), 200–207.
Anderson, J. R. (2005). The relationship between student perception of team dynamics and simulation game outcomes: an individual-level analysis. Journal of Education for Business, 81(2), 85–90.
Arias-Aranda, D., & Bustinza-Sanchez, O. (2009). Entrepreneurial attitude and conflict management through business simulations. Industrial Management & Data Systems, 109(8), 1101–1117.
Baker, A., Navarro, E. O., & van der Hoek, A. (2005). An experimental card game for teaching software engineering processes. Journal of Systems and Software, 75(1/2), 3–16.
Bakken, B., Gould, J., & Kim, D. (1992). Experimentation in learning organizations: a management flight simulator approach. European Journal of Operational Research, 59, 167–182.
Baranauskas, M. C., Gomes Neto, N. G., & Borges, M. A. F. (2000). Gaming at work: a learning environment for synchronized manufacturing. Computer Applications in Engineering Education, 8(3–4), 162–169.
Barrese, J., Scordis, N., & Schelhorn, C. (2003). Teaching introductory concepts of insurance company management: a simulation game. Review of Business, 24(1), 43–49.
Battini, B., Faccio, M., Persona, A., & Sgarbossa, F. (2009). Logistic Game™: learning by doing and knowledge-sharing. Production Planning & Control, 20(8), 724–736.
Betts, S. C., & Knaus, R. (2006). Student perceptions of the teaching effectiveness of a management simulation in a business policy and strategy course. Proceedings of the Academy of Educational Leadership, 11(1), 3–6.
Bloomer, J. (1973). What have simulations and gaming got to do with programmed learning and educational technology? Programmed Learning & Educational Technology, 10(4), 224–234.
Bradley, P. (2006). The history of simulation in medical education and possible future directions. Medical Education, 40(3), 254–262.
Bruce, R. (2008). Business simulator takes off. Financial Director, April, 16.
Chang, K.-E., Chen, Y.-L., Lin, H.-Y., & Sung, Y.-T. (2008). Effects of learning support in simulation-based physics learning. Computers & Education, 51, 1486–1498.
Chang, Y.-C., Chen, W.-C., Yang, Y.-N., & Chao, H.-C. (2009). A flexible web-based simulation game for production and logistics management courses. Simulation Modelling Practice and Theory, 17, 1241–1253.
Cherry, R. A., Williams, J., George, J., & Ali, J. (2007). The effectiveness of a human patient simulator in the ATLS shock skills station. Journal of Surgical Research, 139, 229–235.
Chua, A. Y. K. (2005). The design and implementation of a simulation game for teaching knowledge management. Journal of the American Society for Information Science and Technology, 56(11), 1207–1216.
Clark, E. (2009). Learning outcomes from business simulation exercises – challenges for the implementation of learning technologies. Education + Training, 51(5/6), 448–459.
Coakley, J. R., Drexler, J. A., Larson, E. W., & Kircher, A. E. (1998). Using a computer-based version of the beer game: lessons learned. Journal of Management Education, 22(3), 416–424.
Cook, R. W., & Swift, C. O. (2006). The pedagogical efficacy of a sales management simulation. Marketing Education Review, 16(3), 37–46.
Cousens, A., Goffin, K., Mitchell, R., van der Hoven, C., & Szwejczewski, M. (2009). Teaching new product development using the 'CityCar' simulation. Creativity and Innovation Management, 18(3), 176–189.
Cullingford, G., Mawdesley, M. J., & Davies, P. (1979). Some experiences with computer based games in civil engineering teaching. Computers & Education, 3, 159–164.
Damron, R. L. (2008). The life of a simulation: programmatic promises and pitfalls. Simulation & Gaming, 39(1), 126–136.
D'Artista, B. R., & Hellweger, F. (2007). Urban hydrology in a computer game? Environmental Modelling and Software, 22, 1679–1684.
Davidovitch, L., Parush, A., & Shtub, A. (2008). Simulation-based learning: the learning–forgetting–relearning process and impact of learning history. Computers & Education, 50, 866–880.
Doyle, D., & Brown, F. W. (2000). Using a business simulation to teach applied skills – the benefits and the challenges of using student teams from multiple countries. Journal of European Industrial Training, 24(6), 330–336.
Ellington, H., Addinall, E., & Percival, F. (1981). Games and simulations in science education. London: Kogan Page Limited.
Faria, A. J. (1998). Business simulation games: current usage levels – an update. Simulation & Gaming, 29, 295–308.
Faria, A. J., & Dickinson, J. R. (1994). Simulation gaming for sales management training. Journal of Management Development, 13(1), 47–59.
Faria, A. J., Hutchison, D., Wellington, W. J., & Gold, S. (2009). Developments in business gaming: a review of the past 40 years. Simulation & Gaming, 40, 464–487.
Faria, A. J., & Wellington, W. J. (2004). A survey of simulation game users, former-users, and never-users. Simulation & Gaming, 35, 178–207.
Farrell, C. (2005). Perceived effectiveness of simulations in international business pedagogy: an exploratory analysis. Journal of Teaching in International Business, 16(3), 71.
Fowler, L. (2006). Active learning: an empirical study of the use of simulation games in the introductory financial accounting class. Academy of Educational Leadership Learning, 10(3), 93–103.
Fripp, J. (1993). Learning through simulations. McGraw-Hill.
Garris, R., Ahlers, R., & Driskell, J. E. (2002). Games, motivation, and learning: a research and practice model. Simulation & Gaming, 33, 441–467.
Gold, H., & Gold, S. (2010). Beat the market: an interactive microeconomics simulation. The Journal of Economic Education, 41(2), 216.
Goodwin, J. S., & Franklin, S. G. (1994). The beer distribution game: using simulation to teach systems thinking. The Journal of Management Development, 13(8), 7–15.
Gosen, J., & Washbush, J. (2004). A review of scholarship on assessing experiential learning effectiveness. Simulation & Gaming, 35, 270–293.
Größler, A. (2004). Don't let history repeat itself – methodological issues concerning the use of simulators in teaching and experimentation. System Dynamics Review, 20(3), 263–274.
Haapasalo, H., & Hyvonen, J. (2001). Simulating business and operations management – a learning environment for the electronics industry. International Journal of Production Economics, 73, 261–272.
Hoberman, S., & Mailick, S. (1992). Experiential management development: From learning to practice. Quorum Books.
Holzinger, A., Kickmeier-Rust, M. D., Wassertheurer, S., & Hessinger, M. (2009). Learning performance with interactive simulations in medical education: lessons learned from results of learning complex physiological models with the HAEMOdynamics SIMulator. Computers & Education, 52, 292–301.
Holweg, M., & Bicheno, J. (2002). Supply chain simulation – a tool for education, enhancement and endeavour. International Journal of Production Economics, 78(2), 163–175.
Iacopini, G., Frontespezi, S., Vitale, M. A., Villotti, G., Bella, A., D'Alba, L., et al. (2006). Routine ileoscopy at colonoscopy: a prospective evaluation of learning curve and skill-keeping line. Gastrointestinal Endoscopy, 63(2), 250–256.
Johnson, A. C., & Drougas, A. M. (2002). Using Goldratt's game to introduce simulation in the introductory operations management course. INFORMS Transactions on Education, 3(1). Accessed 02.11.10. http://archive.ite.journal.informs.org/Vol3No1/Vol3No1toc.php
Kebritchi, M., & Hirumi, A. (2008). Examining the pedagogical foundations of modern educational computer games. Computers & Education, 51, 1729–1743.
Keh, H.-C., Wang, K.-M., Wai, S.-S., Huang, J., Hui, L., & Wu, J.-J. (2008). Distance-learning for advanced military education: using wargame simulation course as an example. Journal of Distance Education Technologies, 6(4), 50–61.
Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training development handbook: A guide to human resource development (pp. 18-1–18-27). New York: McGraw-Hill.
Lainema, T., & Hilmola, O.-P. (2005). Learn more, better and faster: computer-based simulation gaming of production and operations. International Journal of Business Performance Management, 7(1), 34–59.
Lane, D. C. (1995). On a resurgence of management simulations and games. The Journal of the Operational Research Society, 46(5), 604–625.
Langley, P. A., & Morecroft, J. D. W. (2004). Performance and learning in a simulation of oil industry dynamics. European Journal of Operational Research, 155, 715–732.
Leemkuil, H., de Jong, T., de Hoog, R., & Christoph, N. (2003). KM QUEST: a collaborative Internet-based simulation game. Simulation & Gaming, 34(1), 89–102.
Léger, P.-M. (2006). Using a simulation game approach to teach enterprise resource planning concepts. Journal of Information Systems Education, 17(4), 441–447.
Lewis, M. A., & Maylor, H. R. (2007). Game playing and operations management education. International Journal of Production Economics, 105, 134–149.
Li, T., Greenberg, B. A., & Nicholls, J. A. F. (2007). Teaching experiential learning: adoption of an innovative course in an MBA marketing curriculum. Journal of Marketing Education, 29(1), 25–33.
Machuca, J. A. D. (2000). Transparent-box business simulators: an aid to manage the complexity of organizations. Simulation & Gaming, 31, 230–239.
Mitchell, R. C. (2004). Combining cases and computer simulations in strategic management courses. Journal of Education for Business, 79(4), 198–204.
Moroney, W. F., & Moroney, B. W. (1999). Flight simulation. In D. J. Garland, J. A. Wise, & V. D. Hopkin (Eds.), Handbook of aviation human factors (pp. 355–388). Mahwah, NJ: Lawrence Erlbaum Associates.
Nackman, G. B., Bermann, M., & Hammond, J. (2003). Effective use of human simulators in surgical education. Journal of Surgical Research, 115(2), 214–218.
Nguyen, H. B., Daniel-Underwood, L., Van Ginkel, C., Wong, M., Lee, D., Lucas, A. S., et al. (2009). An educational course including medical simulation for early goal-directed therapy and the severe sepsis resuscitation bundle: an evaluation for medical student training. Resuscitation, 80, 674–679.
Olhager, J., & Persson, F. (2006). Simulating production and inventory control systems: a learning approach to operational excellence. Production Planning & Control, 17(2), 113–127.
Parush, A., Hamm, H., & Shtub, A. (2002). Learning histories in simulation-based teaching: the effects on self-learning and transfer. Computers & Education, 39, 319–332.
Pfahl, D., Laitenberger, O., Ruhe, G., Dorsch, J., & Krivobokova, T. (2004). Evaluating the learning effectiveness of using simulations in software project management education: results from a twice replicated experiment. Information and Software Technology, 46, 127–147.
Proserpio, L., & Gioia, D. A. (2007). Teaching the virtual generation. Academy of Management Learning and Education, 6(1), 69–80.
Romme, A. G. L. (2003). Learning outcomes of microworlds for management education. Management Learning, 34(1), 51–61.
Rosenørn, T., & Kofoed, L. B. (1998). Reflection in learning processes through simulation/gaming. Simulation & Gaming, 29, 432–440.
Salas, E., Wildman, J. L., & Piccolo, R. F. (2009). Using simulation-based training to enhance management education. Academy of Management Learning and Education, 8(4), 559–573.
Shapiro, S. J. (2003). The marketplace game. Academy of Marketing Science Journal, 31(1), 92–96.
Smolle, J., Prause, G., & Smolle-Juttner, F.-M. (2007). Emergency treatment of chest trauma – an e-learning simulation model for undergraduate medical students. European Journal of Cardio-thoracic Surgery, 32, 644–647.
Stanley, D., & Latimer, K. (2010). 'The Ward': a simulation game for nursing students. Nursing Education in Practice, 11(1), 20–25.
Stieff, M., & Wilensky, U. (2003). Connected chemistry – incorporating interactive simulations into the chemistry classroom. Journal of Science Education and Technology, 12(3), 285–302.
Tompson, G. H., & Dass, P. (2000). Improving students' self-efficacy in strategic management: the relative impact of cases and simulations. Simulation & Gaming, 31, 22–41.
Tsai, T.-C., Harasym, P. H., Nijssen-Jordan, C., & Jennett, P. (2006). Learning gains derived from a high-fidelity mannequin-based simulation in the pediatric emergency department. Journal of Formosan Medical Association, 105(1), 94–98.
Uhles, N., Weimer-Elder, B., & Lee, J. G. (2008). Simulation game provides financial management training. Healthcare Financial Management, 62(1), 82–89.
Vanhoucke, M., Vereecke, A., & Gemmel, P. (2005). The project scheduling game (PSG): simulating time/cost trade-offs in projects. Project Management Journal, 36(1), 51–60.
Wan, H.-D., Chen, F. F., & Saygin, C. (2008). Simulation and training for lean implementation using web-based technology. International Journal of Services Operations and Informatics, 3(1), 1–14.
Washbush, J. B., & Gosen, J. (1998). Total enterprise simulation performance and participant learning. Journal of Workplace Learning, 10(6/7), 314–319.
Wittrock, M. C. (1985). Teaching learners generative strategies for enhancing reading comprehension. Theory into Practice, 24(2), 123–126.
Weiss, S. S. (2008). Mega-simulations in negotiation teaching: extraordinary investments with extraordinary benefits. Negotiation Journal, 24(3), 325–353.
Wells, R. A. (1993). Management games and simulation in management development: an introduction. Journal of Management Development, 9(2), 4–6.
Wolfe, J. (1993). A history of business teaching games in English-speaking and post-socialist countries: the origination and diffusion of a management education and development technology. Simulation & Gaming, 24, 446–463.
Wyatt, A., Fallows, B., & Archer, F. (2004). Do clinical simulations using a human patient simulator in the education of paramedics in trauma care reduce error rates in preclinical performance? Prehospital Emergency Care, 8(4), 435–436.
Zantow, K., Knowlton, D. S., & Sharp, D. C. (2005). More than fun and games: reconsidering the virtues of strategic management simulations. Academy of Management Learning and Education, 4(4), 451–458.
van der Zee, D. J., & Slomp, J. (2009). Simulation as a tool for gaming and training in operations management – a case study. Journal of Simulation, 3, 17–28.
Ziv, A., Small, S. D., & Wolpe, P. R. (2000). Patient safety and simulation-based medical education. Medical Teacher, 22(5), 489–495.