Impact of online instructional game features on college students’ perceived motivational support and cognitive investment: A structural equation modeling study


Internet and Higher Education 17 (2013) 58–68


Wenhao David Huang a,⁎, Tristan E. Johnson b, Seung-Hyun Caleb Han a

a Department of Education Policy, Organization and Leadership, University of Illinois at Urbana-Champaign, Champaign, IL, USA
b Graduate School of Engineering, Northeastern University, Boston, MA, USA
⁎ Corresponding author at: 344 Education, 1310 S. 6th Street, Champaign, IL 61820, USA. Tel.: +1 217 333 0807. E-mail address: [email protected] (W.D. Huang).
http://dx.doi.org/10.1016/j.iheduc.2012.11.004

Article history: Accepted 15 November 2012; Available online 23 November 2012.
Keywords: Digital game-based learning; Design; Motivational support; Cognitive investment; Structural equation modeling.

Abstract
Colleges and universities have begun to understand the instructional potential of digital game-based learning (DGBL) due to digital games' immersive features. These features, however, might overload learners with excessive motivational and cognitive stimuli, thus impeding the intended learning. Current research, however, lacks empirical evidence to align game features with their motivational and cognitive support. Therefore, this study explored the relationship among game features, learners' perceived motivational support, and cognitive investment based on the Integrative Theory of Motivation, Volition, and Performance (MVP). Based on 264 college students' responses after playing an open online instructional game, the findings first revealed three converging factors of DGBL features (game appeal, game involvement, game structure). Second, structural equation modeling identified a significant model that aligns with the MVP theory's constructs. Future research should develop a consolidated design model that considers all identified empirical relationships in order to support efficient digital game-based learning. © 2012 Elsevier Inc. All rights reserved.

1. Introduction

Effective learning should be engaging, such that learners are fully occupied by the learning tasks and environments (Keller, 2008; Spector & Merrill, 2008). With today's emerging learning technologies, options to help learners achieve the state of immersive learning are abundant. Digital game-based learning (DGBL) has been recognized for its capability to motivate and engage learners cognitively, emotionally, and socially (Gee, 2003; Huang, 2011; Huang & Johnson, 2008; Prensky, 2001; Rieber & Matzko, 2001). Playing digital games has become a popular recreational activity in recent years. The Entertainment Software Association (ESA, 2011) reported that 72% of American households play digital games on computers or game consoles; 53% of players are between 18 and 45 years old; and, on average, players have more than 12 years of experience playing digital games. These data suggest that learners now in the higher education system play digital games and might have grown up with video and computer games, which further implies the games' influence on college students' preferred technological experiences in various learning settings (Edery & Mollick, 2009). Higher education institutions have been urged to embrace this instructional approach

in order to engage with college students (Johnson, Adams, & Cummins, 2012). Prior studies have discussed DGBL's various effects on learning. Merrill (2007) and Pannese and Carlesi (2007) proposed that DGBL is able to offer many "learning by doing" opportunities; from the viewpoint of model-centered learning, DGBL can present simplified versions of complex systems, thereby facilitating the development of transferrable mental models (Huang & Johnson, 2008). On the other hand, some remain skeptical of the instructional effectiveness of DGBL (e.g., Ke, 2008; Kirriemuir, 2002; Kirriemuir & McFarlane, 2006; Papastergiou, 2009). A reason for this inconsistency could be that the complex design process of DGBL prevents the empirical connection between game features and their effects on learning processes from being systematically examined (Garris, Ahlers, & Driskell, 2002; Westera, Nadolski, Hummel, & Wopereis, 2008). In order for instructional games to benefit the learning process, game features must first support learners' motivation (Garris et al., 2002; Huang, 2011). Although much research effort has been devoted to the design of DGBL (e.g., Amory, 2007; Ang, Zaphiris, & Mahmood, 2007; Asgari, 2005; Gunter, Kenny, & Vick, 2008), the empirical linkage between game features in DGBL and their observable effect on learning motivation remains elusive. That is, we know that DGBL can motivate learning, but we are not sure how specific game features might impact learning motivation, which is multifaceted, dynamic, and process-oriented (Huang, Huang, & Tschopp, 2010). This uncertainty


may drive DGBL design to focus predominantly on deploying excessive game features with the intention of motivating and engaging learners. Not only could the abundance of game features overwhelm and de-motivate learners, it could also overtax learners' cognitive processing capacities (Keller, 2008; Rieber & Noah, 2008). Current literature that investigates the relationship among game features, motivational support, and cognitive processing from the design perspective, however, is scarce and thus cannot adequately inform the effective design of DGBL (Huang, 2011). To address this design issue, the purpose of this study was to empirically identify the underlying principal factors of DGBL features in relation to their motivational as well as cognitive support. The following section first presents a variety of design frameworks in the context of DGBL. The paper then reports the literature review on game features in the context of intrinsic motivation. Finally, findings are presented from a structural equation modeling analysis that links specific game feature factors to corresponding perceived motivational support and cognitive investment.

2. Literature survey

2.1. Design of digital game-based learning environments

A substantial amount of effort has been invested in developing DGBL design frameworks and models. By integrating considerations of instruction, game design, motivation, and learning, Garris et al. (2002) proposed the Input–Process–Outcome Game Model. The input consists of instructional content and game characteristics; the process includes a 3-stage cyclic game play (user judgment, user behavior, and system feedback); connected by a debriefing procedure, the outcome focuses on the attainment of learning outcomes. This model further argues for the need to establish empirical connections among game features, motivational support, and learning outcome attainment. With a similar focus on motivation, Dickey (2007) analyzed a derivative of DGBL, massively multiplayer online role-playing games (MMORPGs), and concluded that such digital game-based environments could provide practical design models for creating learning environments to support the development of complex competencies. Character design and narrative environments in the game could foster players' intrinsic motivation to continuously participate in the game playing. It is the role-playing process (character) immersed in a story-telling setting (narrative) that deeply engages players cognitively and affectively. Considering the growth of college-level online learning in recent years, Moreno-Ger, Burgos, Sierra, and Fernández-Manjón (2008) proposed a design framework that integrates DGBL as a learning object in online learning environments. The framework suggests that one must identify the online learning environment's pedagogical requirements prior to the integration. Specifically, the integration should focus on how the inclusion of DGBL in online learning might address learners' diverse needs, and how assessment activities in DGBL can be aligned with their counterparts in the hosting online learning environment. This framework, however, is mostly applicable for repurposing existing commercially available digital games for online learners.
Emerging from concepts of learning theories, motivational design, and cognitive load, the Relevance Embedding Translation Adaptation Immersion & Naturalization (RETAIN) model focuses on the relevance of game content and activities due to their potential impact on learners' cognitive processing (Gunter et al., 2008). The model suggests that the content and activities should be situated in a fantasy world to support immersive learning experiences. Although the model was supported by detailed design and evaluation rubrics, the empirical validation of this evaluation model was not conducted. Considering the heavy developmental cost associated with digital instructional games in higher education, Westera et al. (2008)


proposed a design framework to reduce the design complexity of DGBL. They identified four design components (environments, learning activities, multi-user, and methodology) in DGBL, where the environments should be challenging, learning activities should be complex and based on expert problem-solving processes, the multi-user component should encourage collaborative learning, and the methodology represents the rules of the game. The framework also presents a 3-tier design process consisting of conceptual, technical, and practical levels. This framework, nevertheless, did not address the connection between specific game features and intended motivational or cognitive outcomes. Upon reviewing these DGBL design models, it is evident that DGBL can be saturated with game features and interactions. Designing DGBL therefore entails a complex process that should harness the effect of game features and their interactions for the purpose of supporting learning motivation and sustaining productive learning. The empirical relationship among game features, motivational support, and cognitive learning, unfortunately, remains uninvestigated. Considering motivation's paramount role in sustaining cyclic game play as a learning process (Huang et al., 2010), the relationship between game features and derived motivational support in DGBL must be examined first.

2.2. Motivational support in DGBL

Digital games are often praised for their support in motivating and engaging learners across various contexts (Garris et al., 2002). Recent motivation studies in DGBL have mostly focused on games' support of intrinsic motivation. van Eck (2006, p. 167) argued that the design of games should stress Malone and Lepper's (1987) four motivational factors manifested by explicit game features (i.e., challenge, curiosity, rules, and fantasy) in order to promote positive attitudes toward learning games. Similarly, Papastergiou (2009) incorporated intrinsic motivational factors to measure digital games' motivational appeal in teaching computer science with DGBL. Dickey (2007), in a conjectural analysis, concluded that character design and narrative environments of MMORPGs could foster players' intrinsic motivation and sustain their persistent participation in the game play. Nevertheless, some have implied that DGBL's instructional effectiveness could be compromised by motivational and engagement issues (Eow, Ali, Mahmud, & Baki, 2009; Ke, 2008), because the motivational features might overload learners' cognitive and motivational processing capacities, thus disrupting the intended learning (Keller, 2008; Nelson & Erlandson, 2008). In their integrated model of multimedia learning and motivation, Astleitner and Wiesner (2004) suggested that the design of multimedia learning environments must not overload learners' motivational processing capacities with excessive stimuli, in order to maintain cognitive learning. Situated in serious games, Ritterfeld, Shen, Wang, Nocera, and Wong (2009) confirmed the critical role of multimedia learning processing in affecting learner motivation. Scheiter and Gerjets (2007) suggested that motivational stimuli in self-controlled multimedia learning environments, similar to DGBL, might impose high demand on learners' processing capacities. For motivational factors to be effective in supporting intended learning processes, the learning environments must prevent learners from being distracted and overloaded by extraneous motivational and cognitive stimuli.
The following section explains the relationship among game features and their motivational as well as cognitive support based on the Integrative Theory of Motivation, Volition, and Performance (Keller, 2008).

2.3. Integrative theory of motivation, volition, and performance (MVP) for DGBL

Learning motivation is the result of interactions between learners and the instruction (Driscoll, 2000; Small & Gluck, 1994; Steers &


Porter, 1983). The ARCS Motivational Design Model, for instance, measures the amount of effort invested by learners to achieve the learning goal during those interactions (Small, 2000; Song & Keller, 2001), and consists of four perceptual components: attention, relevance, confidence, and satisfaction (Keller, 1983, 1987a,b). Attention indicates learners' aroused curiosity upon interacting with the instruction (Keller, 1983). Relevance gauges the perceived usefulness and value of the learning experiences in relation to learners' prior experiences. Confidence stresses the importance of building learners' positive expectations towards their performance on the learning task. Satisfaction is the measure of the reflection on and evaluation of the ratio between invested effort and perceived outcome (Keller, 1987b). Prior studies utilized the ARCS Motivational Design Model mainly for two reasons. First, the ARCS model served as a design guideline for providing sufficient motivational support. Shellnut, Knowlton, and Savage (1999) adopted the ARCS model to guide their design and development of multimedia manufacturing engineering courses. The relationships between the courses' attributes and their intended motivational support, however, were not clarified. The second reason for using the ARCS model is to evaluate instructional programs' motivational support, which requires the use of the Instructional Material Motivational Survey (IMMS) (Keller, 1993). Situated in a computer-based language learning environment, Chang and Lehman (2002) applied a modified IMMS to investigate the effect of perceived Relevance on learners' motivation level; Huang, Huang, Diefes-Dux, and Imbrie (2006) validated the IMMS in a large freshman engineering course by targeting a computer-based tutorial teaching basic programming skills. In a recent study situated in DGBL, Huang et al. (2010) also applied the IMMS to evaluate the motivational support of an online educational game. Building upon the ARCS model, Keller (2008) proposed the Integrative Theory of Motivation, Volition, and Performance (MVP) to include learners' volitional control, cognitive information processing, and the final outcome processing. The theory of MVP proposes that the learning process is initiated by motivational processing, which enables the initial goal setting. In this stage the original ARCS model components are incorporated (i.e., attention, relevance, and confidence). The output of motivational processing will then lead to the next stage, volitional processing, where learners transform their performance intentions into learning actions. Once volitional processing is complete, the next stage shifts focus to learners' cognitive learning processes. In this cognitive learning stage learners complete learning tasks upon interacting with the multimedia learning environment. The final stage is outcome processing. Learners will have completed the learning tasks and are able to evaluate the invested effort against the perceived learning gain. If the invested effort is perceived to be greater than the perceived return on learning, learners will feel less satisfied with the learning experience. Consequently, learners might not engage in the next learning cycle. All the aforementioned stages should be closely monitored and managed in order to provide holistic motivational, volitional, and cognitive support for learners. In the context of DGBL, the MVP theory provides unique perspectives for enhancing the efficacy of DGBL in the following areas.
First, a negative outcome processing would impact DGBL processes, since learning through game playing often requires multiple attempts at resolving the same problem. It is imperative to consider learner engagement at the end of each iterative game play cycle in DGBL (Huang et al., 2010). Second, the MVP theory provides a conceptual framework to investigate how game features might impact learners' motivational processing and cognitive processing in DGBL. The following section reviews game features identified in prior literature. Finally, the MVP theory offers a well-supported conceptual design framework for developing testable hypotheses among various design factors and their potential effects on learning.

2.4. Features of DGBL

Playing games should be entertaining (Gredler, 1994). Although players must follow rules, they also experience uncertainty and unpredictability while constrained by predetermined game objectives and voluntarily conquering obstacles in competitive contexts (Abt, 1970; Caillois, 1962; Suits, 1978). In DGBL, many have concurred that games are capable of engaging learners with fantasy-enabling environments, competitive activities, and opportunities for players to control the causal relationships between their actions and the outcomes (Gunter et al., 2008; Moreno-Ger et al., 2008; Raybourn, 2006; Rieber & Noah, 2008; Westera et al., 2008). Table 1 shows game features proposed by prior studies.

2.5. Mental effort investment – cognitive load

The cognitive learning processing driven by learners' motivational processing in the MVP theory warrants further discussion to clarify its operationalization in this study, which is grounded in Cognitive Load Theory (CLT) (Chandler & Sweller, 1991). Cognitive load, in the context of CLT, is defined as a multidimensional construct that includes task-based mental load induced by task characteristics and mental effort invested by learners in their working memory to process information (Paas, Tuovinen, Tabbers, & van Gerven, 2003; Paas & van Merriënboer, 1994; Sweller, van Merriënboer, & Paas, 1998). Mental effort is the measure that reflects the authentic cognitive load of learners, indicating the actual cognitive load allocation as learners interact with task characteristics while achieving the desired performance (Kalyuga, 2007). There are three types of cognitive load that, when combined, compose total cognitive load: intrinsic, extraneous, and germane. For learning to occur, the total cognitive load can never exceed a student's working memory capacity. Since the intrinsic cognitive load cannot be manipulated via instructional interventions, the design of any learning system must optimize the combination of the extraneous cognitive load and the germane cognitive load, which is to reduce the extraneous while increasing the germane cognitive load (van Gerven, Paas, van Merriënboer, & Schmidt, 2006). In other words, the learning system needs to minimize the investment of mental effort on tasks that are irrelevant to schema development (i.e., reducing extraneous cognitive load) while promoting mental effort investment on tasks that draw prior knowledge from learners' long-term memory (i.e., increasing germane cognitive load). Therefore, the measurement of cognitive load focuses on learners' invested mental effort, which is the result of evaluating the imposed mental load and the effort required to achieve the desired performance (Sweller et al., 1998). The subjective category of mental effort measurement has been used frequently as the main indicator of learners' overall cognitive load in earlier studies due to its higher reliability, validity, and sensitivity to students' small cognitive load changes (Paas & van Merriënboer, 1994). The measurement consists of a 9-point symmetrical category scale that asks learners to report their invested mental effort, where "1" corresponds to a "very, very low mental effort" and "9" indicates a "very, very high mental effort." In light of the exploratory nature of the present study, the 9-point mental effort measurement scale was adopted to indicate learners' cognitive learning levels.
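For reference, the cognitive load relationship described above can be restated compactly (a schematic summary of CLT as cited here, not an equation given in the original sources):

```latex
% Total cognitive load as the sum of its three components, which must stay
% within working-memory capacity for learning to occur (Sweller et al., 1998).
\[
CL_{total} \;=\; CL_{intrinsic} + CL_{extraneous} + CL_{germane},
\qquad CL_{total} \;\le\; C_{working\ memory}
\]
```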
In addition, the study employed a recently proposed measure that adds perceived task difficulty and perceived time spent on learning to the self-reported mental effort investment scale, to strengthen the construct validity of the measurement (DeLeeuw & Mayer, 2008).

2.6. Purpose of the study

It is clear that the design of DGBL often incorporates a vast number of game features in the hope that somehow those features can create a substantial motivational impact for learners. This "spray and wait" design strategy raises a critical concern: the cognitive learning processing might be interrupted, since learners' limited processing capacities cannot meet the high demand of motivational stimuli in DGBL. To address these concerns, this study was conducted to test the following hypothesis based on the prior literature review: there are underlying relationships among game features, perceived motivational support, cognitive learning levels, and the final satisfaction towards the learning process in DGBL. To carry out the testing, the first step was to empirically cluster previously identified game features in order to identify converged factors of DGBL. Second, informed by these major game feature factors, the study matched specific game feature factors with learners' perceived motivational support and cognitive investment based on the theory of MVP (Keller, 2008). The consolidation of various game features into a framework promises to simplify the design process, and the connection among game features, perceived motivational support, and learners' cognitive investment can further inform the design of DGBL. Both purposes serve to inform college and university instructors when integrating DGBL for instructional applications.


Table 1. Game features (feature; description; sources).

Presenting challenge: A challenging activity should be perceived as achievable, unpredictable, somewhat vague, and designed to stretch and flex players' existing knowledge or skill levels. The level of challenge must only minimally exceed learners' potential competency capacity to overcome the obstacles. Otherwise this characteristic might frustrate the learners in the early stage of the play. Sources: Baranauskas, Neto, and Borges (1999), Belanich, Daragh, and Kara (2004), Csikszentmihalyi (1990), Garris et al. (2002), Malone (1980), Malone and Lepper (1987), McGrenery (1996) in Kasvi (2000), Rieber and Matzko (2001).

Requiring competition: Competition is essential in games, which may be the factor that differentiates game playing from other human activities. Competition in DGBL has many forms. Players may compete with themselves, the game system, individual players, or other teams to achieve game objectives. Sources: Amory (2007), Crawford (1982), Csikszentmihalyi (1990), Moreno-Ger et al. (2008), Rieber and Noah (2008).

Enforcing rules: Games without rules are meaningless. Rules exemplify problem-solving processes in various forms for learners to follow. Game rules further ensure fair play in the system. Sources: Björk and Holopainen (2003), Garris et al. (2002), Hays (2005), Moreno-Ger et al. (2008), Westera et al. (2008).

Goal-oriented play as tasks: Winning is always the liberal form of goal in games, although it might be presented differently in various game genres. For instance, the goal of a sport game is gaining the highest score possible, while the goal of an adventure game could be completing a variety of tasks. The role of tasks in games is twofold. First, they are building blocks or performance benchmarks for players to achieve the winning goal. Players often need to accomplish tasks in series to gradually develop the competencies in sequence. Second, game tasks help players and the game system evaluate their performance formatively. Incomplete tasks require players to revisit them until the player's performance meets the competency requirement. Sources: Björk and Holopainen (2003), Csikszentmihalyi (1990), de Felix and Johnson (1993), Gredler (1996), Hays (2005), Malone (1980).

Situated in fantasy world or semi-reality: Games can situate players in a world of fantasy that is completely detached from reality. Players can have experiences that are impossible to acquire in the real world and are constantly engaged in the game playing process. Games with semi-reality, on the other hand, replicate the real world environment to a certain extent, but not entirely. Players might be placed in a different spatial or temporal context to experience a different form of life. Sources: Amory (2007), Belanich et al. (2004), Björk and Holopainen (2003), Crawford (1982), Csikszentmihalyi (1990), Garris et al. (2002), Kirriemuir and McFarlane (2006), Malone and Lepper (1987), Rieber and Noah (2008).

Telling stories: Games often have distinctive storylines for players to follow, but these do not necessarily represent the rules of the game. This is particularly true in adventure and historical games. Storylines add contextual references and complexity of interactions to the game. Furthermore, they help players relate their personal experiences and common sense to the game goals, tasks, and rules. Sources: Dickey (2007), Moreno-Ger et al. (2008), Rieber and Matzko (2001), Rieber and Noah (2008).

Engaging learners: This could be the collective effect of many game features, which enable players to immerse themselves cognitively and affectively in the game-based environments. Players, in turn, consider themselves part of the game. The experience of "flow" is the ultimate outcome of deep engagement in games. Competition, fantasy, and mystery are often implemented in games to enhance the engagement effect. Sources: Asgari (2005), Csikszentmihalyi (1990), Dickey (2005), Malone (1980) in Asgari (2005), Malone and Lepper (1987), McGrenery (1996) in Kasvi (2000), Moreno-Ger et al. (2008).

Allowing role-playing: Role-playing complements the challenge, fantasy, storyline, and engaging features of games. Players "pretend" to be someone or something else in all aspects of their interactions with the game system if asked to role-play. This characteristic also enhances players' intrinsic motivation to perform continuously in a challenging game-based environment. Sources: Björk and Holopainen (2003), Gredler (1996), Dickey (2005, 2007).

Supporting learner autonomy: Games often allow players to carry out actions autonomously in the process of completing game tasks. In other words, players have a great extent of control over what paths to take in order to resolve the problem or task at hand. Some suggest that this feature helps develop players' self-identities in the game while sustaining their intrinsic motivation. Furthermore, the autonomy helps players develop a sense of ownership of the decisions they make during the game play. Sources: Belanich et al. (2004), Csikszentmihalyi (1990), Garris et al. (2002), Gredler (1996) in Hays (2005), Malone and Lepper (1987), McGrenery (1996) in Kasvi (2000).

Utilizing multimedia representations: Today's digital games take full advantage of multimedia representations to embody the aforementioned characteristics (e.g., fantasy, storytelling, competition). Not only do multimedia representations help reduce cognitive demand on players' limited capacities, but they also develop players' visual and spatial analysis skills. Sources: Ang et al. (2007), Björk and Holopainen (2003), de Felix and Johnson (1993) in Hays (2005), McGrenery (1996) in Kasvi (2000).

3. Method

This study first employed exploratory factor analysis based on participants' responses after playing the target online instructional game


to identify converging game feature factors. Second, the study conducted structural equation modeling to identify the relationship among game feature factors, learners' perceived motivational support, and cognitive investment. The following sections describe the online instructional game, procedures, data collection, and data analysis.

3.1. Selection of open online instructional game

Digital games have been suggested to be able to effectively engage college students (Johnson et al., 2012). Organizations around the world have taken advantage of this and created numerous open DGBL applications to deliver educational information to the public. Based on existing literature on DGBL's pedagogical elements (Huang & Johnson, 2008), the research team reviewed open online instructional games designed by various organizations in the U.S. (e.g., Centers for Disease Control, NASA). Game genre was not considered a relevant factor in game selection based on prior research (Vogel et al., 2006). The "Trade Ruler" game from the Nobel Prize Foundation was selected as the target online instructional game for three reasons. First, the interaction between learners and the game is enriched by its multimedia components and consistent cognitive activities, which would enable learners to experience a rich game play with numerous playing cycles. Second, the content of the instructional game (economic theory) is novel to the participating college students, ensuring that learners' prior knowledge would not impact their perceived motivational levels (Moos, 2009). Third, it is feasible for participants to finish the game playing process in one sitting. The Trade Ruler game is available at http://www.nobelprize.org/educational/economics/trade/.

3.2. Instruments

For the purpose of identifying convergences of game features, the instrumentation process consisted of two stages according to DeVellis (2003). First, based on previously reported literature on game features (see Table 1), the researchers generated a pool of survey items capturing participants' perceptions after playing the instructional game.

Table 2. Items derived from the gaming characteristics literature and expert input.

1. The game is challenging enough for me to play.
2. The game requires me to compete with other players or the game itself.
3. The game's rules are clearly presented.
4. The game's rules are easy to follow.
5. The game's goals are clearly presented.
6. The game situates me in a fantasy world.
7. The game's storyline is comprehensible.
8. The game engages me deeply in the playing process.
9. The game allows me to role-play.
10. The game allows me to fully control my actions.
11. The game's graphics are attractive.
12. The game's animations are attractive.
13. The game tasks are clearly presented.
14. The game provides all information necessary for me before the playing process.
15. The game provides all information necessary for me during the playing process.
16. The game's audio elements (e.g., background music, narrations) are attractive.
17. The game provides explanatory feedback on my performance.
18. The game provides corrective feedback on my performance.
19. I can easily transfer skills I learned from the game to the real world.
20. The progression of the game task makes sense to me.
21. The game provides enough support to help me accomplish the game tasks.
22. The game provides enough previews to prepare me for the game playing.
23. The game provides opportunities for me to collaborate with other players.
24. The game allows me to learn from my mistakes.
25. The game keeps me interested throughout the playing process.
26. The game is fun to play.

Second, an expert review, for which seven instructional game designers were recruited, was conducted to ensure content validity and to eliminate redundant survey items (DeVellis, 2003, p. 85). As a result, 26 items were included in the game feature survey to collect participants' responses towards the game playing experience. See Table 2 for the item list. For the purpose of understanding the relationship among game feature factors, perceived motivational support, and cognitive investment, the Instructional Material Motivational Survey (IMMS) derived from the ARCS model (Keller, 1987a) was adopted to measure learners' perceived motivational support in Attention, Relevance, and Confidence, as well as outcome processing (Satisfaction). In terms of learners' cognitive investment, the study adopted a self-reported scale measuring learners' perceived cognitive load levels, which asks participants to report their mental effort investment, perceived task difficulty, and perceived time spent on learning (DeLeeuw & Mayer, 2008). See Appendix A for the item list.

3.3. Data collection

Participants were recruited from a subject pool at a public Midwestern university in the United States, which consists of undergraduate students enrolled in an introductory educational psychology course. All participants were required to access the target online instructional game in a computer laboratory with minimal interruption. No time limit was imposed for participants to finish the game, in order to mimic an authentic game playing and learning experience. All participants were instructed to read the game rules on the entry page before starting the game. After completing the game, participants were redirected to an online survey to respond to the game feature survey, the adopted IMMS, and the cognitive load survey on a 9-point symmetric Likert scale (1 = absolutely disagree to 9 = absolutely agree). On average, participants spent less than 45 min to complete the participation. Of the 264 valid cases collected, 50 participants were male (18.9%) and 214 participants were female (81.1%), which adequately reflected the composition of the subject pool in the undergraduate teacher education program. In terms of academic major, 74% of participants were in Education and Liberal Arts, 2.3% were in Business, 7.5% were in Science, and 16.2% reported "Other" as their academic major. With regard to age groups, 73.1% of participants were between 18 and 20 years old, 21.6% were between 21 and 25 years old, and 5.3% were older than 25 years of age. The study participants were not very familiar with the topic of the instructional game.

3.4. Data analysis

The data analysis consists of two stages. The first is to identify the converging game feature factors; factor analyses were used for this purpose. The second is to match the converged factors with perceived motivational support and cognitive investment based on the theory of MVP; structural equation modeling was conducted to identify the underlying relationships.

3.4.1. Identifying converging game feature factors

Based on previous studies on data reduction (Johnson et al., 2007), the data consolidation procedures include two phases, which consist of (1) exploratory factor analysis (EFA) verified by parallel analysis and (2) conceptualization of extracted factors. All items were subjected to exploratory factor analysis using principal components analysis with Varimax rotation to extract factors (McDonald, 1985).
Based on the Kaiser (K1) rule (Kaiser, 1960), factors with eigenvalues exceeding 1.0 were considered. Items with loadings higher than .60 on any factor and without high cross-loadings were retained in the list. In addition, factors with multiple qualified items were retained.
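As an illustration of the extraction and retention rules just described (and of the parallel-analysis verification described in the next paragraph), the following sketch uses the open-source factor_analyzer and NumPy packages; the original authors do not name their software, and the `responses` DataFrame of game-feature items is assumed rather than taken from the study.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo


def extract_factors(responses: pd.DataFrame, loading_cutoff: float = 0.60):
    """Principal components with Varimax rotation, K1 retention, .60 loading rule."""
    _, kmo_total = calculate_kmo(responses)            # sampling adequacy (KMO)

    probe = FactorAnalyzer(rotation=None, method="principal")
    probe.fit(responses)
    eigenvalues, _ = probe.get_eigenvalues()
    n_factors = int((eigenvalues > 1.0).sum())          # Kaiser (K1) rule

    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax", method="principal")
    fa.fit(responses)
    loadings = pd.DataFrame(fa.loadings_, index=responses.columns)

    # Keep items that load >= .60 on exactly one factor (no high cross-loadings).
    keep = loadings.index[(loadings.abs() >= loading_cutoff).sum(axis=1) == 1]
    return kmo_total, n_factors, loadings.loc[keep]


def parallel_analysis_benchmark(n_obs: int, n_items: int,
                                n_datasets: int = 150, seed: int = 0) -> np.ndarray:
    """Mean eigenvalues of random data (Horn's parallel analysis); only factors
    whose observed eigenvalues exceed this benchmark are retained."""
    rng = np.random.default_rng(seed)
    eigs = np.empty((n_datasets, n_items))
    for i in range(n_datasets):
        random_data = rng.standard_normal((n_obs, n_items))
        corr = np.corrcoef(random_data, rowvar=False)
        eigs[i] = np.sort(np.linalg.eigvalsh(corr))[::-1]
    return eigs.mean(axis=0)
```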


Parallel analysis was then employed to verify the findings of the factor analysis, since applying only the K1 rule to extract factors might be insufficient (Hayton, Allen, & Scarpello, 2004). Both analyses resulted in the deletion of items as well as the classification of items into different factors.

3.4.2. Matching game feature factors with motivational support and cognitive investment

Data on learners' perceived motivational support and cognitive investment first went through the same factor extraction process discussed earlier to ensure validity and reliability. Second, structural equation modeling was applied based on the theory of MVP (Keller, 2008). The data points consisted of the converged game feature factors, learners' perceived motivational support based on the IMMS, and cognitive investment based on cognitive load levels. Both the measurement model and the structural model were examined to evaluate the overall model fit against the theoretical framework.
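To make the two-step modeling concrete, the sketch below specifies a latent-variable model of this general shape in the open-source semopy package. The indicator names are hypothetical placeholders, and the structural paths are a sketch based on the relationships reported later in the paper, not the authors' exact specification (the study does not disclose its SEM software).

```python
import pandas as pd
import semopy

# lavaan-style model description: "=~" defines measurement (latent -> indicators),
# "~" defines structural regressions among latent variables.
MODEL_DESC = """
GS =~ gs1 + gs2 + gs3 + gs4 + gs5 + gs6
GI =~ gi1 + gi2 + gi3 + gi4 + gi5
GA =~ ga1 + ga2 + ga3
Attention  =~ att1 + att2 + att3
Relevance  =~ rel1 + rel2 + rel3 + rel4 + rel5
Confidence =~ con1 + con2 + con3 + con4 + con5
CognitiveLoad =~ cl1 + cl2 + cl3
Satisfaction  =~ sat1 + sat2 + sat3 + sat4

Attention  ~ GI + GA
Relevance  ~ GI
Confidence ~ GS
CognitiveLoad ~ Attention + Relevance + Confidence
Satisfaction  ~ Attention + Relevance + Confidence + CognitiveLoad
"""

def fit_mvp_model(data: pd.DataFrame):
    model = semopy.Model(MODEL_DESC)
    model.fit(data)                       # maximum likelihood estimation
    estimates = model.inspect()           # factor loadings and path coefficients
    fit_stats = semopy.calc_stats(model)  # includes chi-square, CFI, NFI, RMSEA
    return estimates, fit_stats
```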


4. Results

Table 4. Identified DGBLE factors and items.

Game structure: The game's rules are easy to follow. The game's goals are clearly presented. The game tasks are clearly presented. The game provides all information necessary for me before the playing process. The game provides all information necessary for me during the playing process. The game provides enough support to help me accomplish the game tasks.
Game involvement: The game situates me in a fantasy world. The game engages me deeply in the playing process. The game allows me to role-play. The game keeps me interested throughout the playing process. The game is fun to play.
Game appeal: The game's graphics are attractive. The game's animations are attractive. The game's audio elements (e.g., background music, narrations) are attractive.

4.1. Converged game feature factors

4.1.1. Factor analysis and parallel analysis

The first exploratory factor analysis showed acceptable sampling adequacy (KMO = .90) and extracted five factors that together accounted for 63.24% of the variance. After reviewing the item loadings, six items were deleted. A second exploratory factor analysis (KMO = .87) was conducted to examine the improvement in cumulative variance after the item deletion. One item was removed due to a high cross-loading. As a result, five factors were extracted based on the K1 rule, explaining 69.01% of the total variance. A parallel analysis using 150 randomly generated datasets of the same sample size (n = 264) and number of variables (20 items) as the real dataset (Hayton et al., 2004) was conducted to verify the five factors identified by the second exploratory factor analysis. The result indicated that only three factors (14 items) should be extracted from the dataset. See Table 3 for the item loadings and corresponding factors. Upon conducting the reliability analysis, the overall reliability (Cronbach's alpha) is .89, F (263, 13), p = .00, indicating good internal consistency.
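The internal-consistency statistic reported above can be computed directly from the retained item scores; the sketch below is a generic implementation and assumes the 264 × 14 response matrix is available as a NumPy array (it is not distributed with the paper).

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)
```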

Table 3. Results of exploratory factor analysis verified by parallel analysis (item number, item, loading).

Factor 1: 4. The game's rules are easy to follow (.843); 5. The game's goals are clearly presented (.732); 13. The game tasks are clearly presented (.824); 14. The game provides all information necessary for me before the playing process (.799); 15. The game provides all information necessary for me during the playing process (.727); 21. The game provides enough support to help me accomplish the game tasks (.644).
Factor 2: 6. The game situates me in a fantasy world (.600); 8. The game engages me deeply in the playing process (.754); 9. The game allows me to role-play (.710); 25. The game keeps me interested throughout the playing process (.725); 26. The game is fun to play (.681).
Factor 3: 11. The game's graphics are attractive (.917); 12. The game's animations are attractive (.901); 16. The game's audio elements (e.g., background music, narrations) are attractive (.726).
Eigenvalues: 7.42 (Factor 1), 2.55 (Factor 2), 1.58 (Factor 3). Percentage of variance: 37.08, 12.75, 7.89.
Note: Extraction method: principal component analysis; rotation method: Varimax rotation with Kaiser normalization; cumulative variance = 57.82%.

4.1.2. Conceptualization of converging factors

According to the results of the factor analysis, and input from the aforementioned expert designer panel, three factors emerged. Factor One consisted of items related to the structure of the DGBL that guides the playing process (Game Structure, GS). Factor Two, with five items, focuses on how the DGBL enabled players to involve themselves deeply in the playing process (Game Involvement, GI). Factor Three encapsulates DGBL's ability to host non-textual multimedia components (Game Appeal, GA). Table 4 shows all three factors and corresponding items.

4.2. Relationship among game feature factors, motivational support, and cognitive investment

4.2.1. Data reduction

Applying the same data reduction procedures described earlier, the 36 items in the adopted IMMS were reduced to 17 items. Attention was measured by three items; Relevance by five items; Confidence by five items; and Satisfaction by four items. See Table 5 for the extracted IMMS items with corresponding loadings and constructs. Cognitive investment items were all retained to maintain the integrity of the construct. See Table 6 for all psychometric properties of the Cognitive Investment construct.

4.2.2. Structural equation modeling

Grounded in the theory of MVP (Keller, 2008), this stage applied structural equation modeling to identify the relationship among game features, learners' perceived motivational support, and cognitive investment. Following the two-step analytical procedure, the researchers first examined the measurement model and then the structural model. The rationale of this two-step approach is to ensure that the conclusion on the structural relationships among the ARCS components and cognitive load is drawn from a set of measurement instruments with desirable psychometric properties.

4.2.2.1. The measurement model. The measurement model was evaluated with two indices. First, convergent validity indicates the extent to which the items of a scale that are theoretically related correlate highly. A composite reliability of 0.70 or more and an average variance extracted of more than 0.50 are deemed acceptable (Fornell & Larcker, 1981). Table 6 summarizes the factor loadings, composite reliability (a) and average variance extracted (p) of the measures of our research model. All the measures fulfill the recommended levels, with composite reliability ranging from 0.75 to 0.90 and average variance extracted ranging from 0.50 to 0.75.
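The two convergent-validity indices above, as well as the square root of AVE used for the discriminant-validity check reported below, can be computed from standardized factor loadings as in the following sketch; the loading values shown are illustrative, not figures from the study.

```python
import numpy as np

def composite_reliability(loadings: np.ndarray) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    error_variances = 1.0 - lam ** 2
    return float(lam.sum() ** 2 / (lam.sum() ** 2 + error_variances.sum()))

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE = mean of squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float((lam ** 2).mean())

# Illustrative three-indicator construct.
lam = np.array([0.85, 0.80, 0.75])
cr, ave = composite_reliability(lam), average_variance_extracted(lam)
# For discriminant validity (Fornell & Larcker, 1981), sqrt(AVE) should exceed
# the construct's correlations with every other construct (cf. Table 7).
print(round(cr, 3), round(ave, 3), round(np.sqrt(ave), 3))
```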


Table 5. Extracted items and corresponding loadings and components from the IMMS (No.; item; mean; S.D.; loading; component).

1. There was something interesting at the beginning of the game that got my attention. Mean 6.040, S.D. 1.854, loading .706, Attention.
2. The interface design of the game is eye-catching. Mean 6.150, S.D. 2.002, loading .829, Attention.
3. The design of the game looks dry and unappealing. Mean 6.560, S.D. 2.018, loading .823, Attention.
4. It is clear to me how the content of the game is related to things I already know. Mean 6.060, S.D. 1.884, loading .624, Relevance.
5. I enjoyed the game so much that I would like to know more about this topic. Mean 4.270, S.D. 2.078, loading .616, Relevance (a).
6. The content of the game is relevant to my interests. Mean 4.980, S.D. 2.008, loading .749, Relevance.
7. I could relate the content of the game to things I have seen, done or thought about in my own life. Mean 5.000, S.D. 1.883, loading .849, Relevance.
8. The content in the game will be useful to me. Mean 4.480, S.D. 2.091, loading .785, Relevance.
9. The game was more difficult to understand than I would like for it to be. Mean 4.790, S.D. 2.045, loading .743, Confidence.
10. The game had so much information that it was hard to pick out and remember the important points. Mean 6.240, S.D. 1.869, loading .800, Confidence.
11. The content of the game is so abstract that it was hard to keep my attention on it. Mean 6.590, S.D. 1.962, loading .720, Confidence (b).
12. The activities in the game were too difficult. Mean 6.730, S.D. 1.905, loading .757, Confidence.
13. I could not really understand quite a bit of the material in the game. Mean 6.080, S.D. 2.268, loading .699, Confidence.
14. Completing the exercises in the game gave me a satisfying feeling of accomplishment. Mean 5.500, S.D. 2.009, loading .678, Satisfaction.
15. I learned some things that were surprising or unexpected with the game. Mean 5.840, S.D. 2.081, loading .676, Satisfaction (b).
16. The wording of feedback after the exercises, or of other comments in the game, helped me feel rewarded for my effort. Mean 6.400, S.D. 2.016, loading .715, Satisfaction.
17. It felt good to successfully complete the game. Mean 5.610, S.D. 1.982, loading .775, Satisfaction.

(a) This item measures Satisfaction in the original IMMS. (b) This item measures Attention in the original IMMS.

Second, discriminant validity indicates the extent to which the measure is not a reflection of some other variables. It is indicated by low correlations between the measure of interest and the measures of other constructs. Evidence for the discriminant validity of the measures can be verified by checking that the square root of the average variance extracted for each construct is higher than the correlations between it and all other constructs (Fornell & Larcker, 1981). As summarized in Table 7, the square root of the average variance extracted for each construct is greater than the correlations between that construct and all other constructs. The results suggest adequate discriminant validity of the measurements.

4.2.2.2. The structural model. The model was estimated using the maximum likelihood method. Fig. 1 depicts fit statistics, overall explanatory power, and estimated path coefficients. The fit statistics indicate that the research model provides a good fit to the data (chi-square = 1077.06, n = 505, p = 0.00; CFI = 0.95; IFI = 0.95; NFI = 0.91; RMSEA = 0.06). As stated by Browne and Cudeck (1989), an acceptable fit exists where CFI > 0.90 and RMSEA < 0.10. In addition, the model accounts for 15.68% of the variance in Attention, 4.84% of the variance in Relevance, and 17.64% of the variance in Confidence. Summarized results for the hypothesis tests are shown in Table 8.

5. Discussion

5.1. Converged game feature factors

The exploratory factor analysis found preliminary evidence of how DGBL features factor together and form the underlying thematic structure of game features. Three key factors were identified: game structure (GS), game involvement (GI), and game appeal (GA). Building upon prior research that addressed similar DGBL features in relation to their theoretical roots in intrinsic motivation (e.g., Garris et al., 2002), our findings provide preliminary empirical support for how game features can be consolidated, thereby supporting learner motivation. The GS factor includes several components such as rules, goals, and explanations of game tasks. When designing DGBL, it is important to consider game structure features such as providing clear demonstrations and instructions for learners to be fully immersed in the game, which might further affect whether or not players win the game. The GI factor consists of components that distinguish DGBL from other interactive instructional strategies (e.g., simulation). When designing DGBL, developers need to account for involvement

features such as opportunities for learners to role-play in a fantasy world and a playing experience that is fun. Finally, the GA factor complements the GI factor in that a game's audio, graphic, and animated elements could help engage learners cognitively and emotionally in the game. Again, all of these components within the aforementioned three factors need to be considered for effective motivational design in DGBL. Upon comparing the converged thematic structures with existing instructional design literature, the GS factor is congruent with design principles of the Four Components Instructional Design model (4C/ID-model) derived from Cognitive Load Theory (van Merriënboer, Clark, & de Croock, 2002), which has been suggested to guide the design of complex learning environments. The GS factor should provide clear gaming tasks and comprehensive support for learners, which is consistent with the Learning Task and Supportive Information components of the 4C/ID-model. The motivational design of instruction can further inform the GI factor to support learners' intrinsic motivation while learning in DGBL. Keller (2008) suggested that in order to effectively manage learner motivation, learning environments must integrate strategies to support learners' initial motivational processing, volitional control, cognitive information processing, and finally, the outcome processing that sustains learning. Huang et al. (2010) further argued that solely considering the intrinsic aspect of learning motivation in DGBL might not be sufficient. Since all DGBL use extrinsic incentives (i.e., game scores, levels of play, competition) to drive players' behaviors, learners' volatile extrinsic motivational states must be addressed. Both intrinsic and extrinsic motives need to be considered when designing stimuli to sustain learners' behaviors in computer-based environments (Davis, Bagozzi, & Warshaw, 1992). Finally, the GA factor can be supported by multimedia's effect in enhancing learning experiences. Cobb (1997) suggested that multimedia, acting as sources external to learners' cognitive capacity, could process information for learners, thus increasing learners' cognitive learning efficiency.

Table 6. Summary of psychometric properties of the measures (construct; composite reliability (a); average variance extracted (p)).

GS: 0.895; 0.590. GI: 0.824; 0.503. GA: 0.897; 0.748. Attention: 0.800; 0.578. Relevance: 0.850; 0.533. Confidence: 0.832; 0.501. Cognitive load: 0.700; 0.500. Satisfaction: 0.764; 0.519.


Table 7. Correlation matrix of the constructs (mean; S.D.; correlations with constructs 1–8).

1. GS: mean 6.377, S.D. 1.547; 1.000
2. GI: mean 6.209, S.D. 1.481; .471, 1.000
3. GA: mean 5.978, S.D. 1.920; .322, .527, 1.000
4. Attention: mean 6.249, S.D. 1.641; .054, .262, .309, 1.000
5. Relevance: mean 4.958, S.D. 1.568; .151, .208, .220, .373, 1.000
6. Confidence: mean 6.085, S.D. 1.561; .318, .112, .090, .170, .362, 1.000
7. Cognitive Load: mean 3.903, S.D. 0.865; −.210, .147, .229, .051, .056, −.321, 1.000
8. Satisfaction: mean 5.836, S.D. 1.610; .206, .329, .249, .490, .556, .381, .051, 1.000

While multimedia elements could engage learners, the design of multimedia learning environments must consider learners' cognitive capacity in terms of available modalities for information processing and mental model development (Mayer, 2001; Moreno & Mayer, 2007). Too many multimedia stimuli are very likely to induce cognitive overload in learners, thus impeding the learning process (Ang et al., 2007; Astleitner & Wiesner, 2004). Components from the identified gaming feature factors, however, are slightly different from the initial conceptualization. When comparing the original game features in Table 1 with the consolidated list in Table 4, the element of story-telling seems to be of less interest to learners. The missing story-telling element might be attributed to the fact that the target online instructional game does not use storylines to guide players' activities.

Another finding related to DGBL development is that learners did not consider game support a critical component during the playing process. Players might prefer to resolve problems by themselves or by working with their peers instead of consulting the game system. For an efficient DGBL motivational design, the game support function should take this finding into account by creating an environment that is conducive to social learning.

5.2. Relationships among game features, motivational support, and cognitive investment

This portion of the discussion responds to the hypothesis proposed earlier. Our findings based on structural equation modeling supported the hypothesis that there are observable relationships among game features, perceived motivational support, cognitive learning levels, and the final satisfaction toward the learning process.


Fig. 1. Result of the proposed research model with significant path coefficients.


Table 8. Summary of hypothesis tests (hypothesis; support).

H1: A -> CL; No. H2: R -> CL; Yes. H3: C -> CL; Yes. H4: A -> S; Yes. H5: R -> S; Yes. H6: C -> S; Yes. H7: CL -> S; Yes.


This finding adds new insights to the applicability of the theory of MVP to learning environments beyond conventional online learning settings (e.g., Kim & Keller, 2011). The findings further suggested that the theory of MVP, to a large extent, is able to explain the relationships among game features, learners' perceived motivational support, the resulting cognitive investment, and the satisfaction level towards the DGBL process in the following three areas. First, game features could collectively influence learners' perceived motivational support. In particular, GS could significantly impact learner confidence; GI could impact learner attention and relevance; and GA influenced learner attention. Studies have concurred that providing learner-centered and structured instructional activities can increase learners' confidence level (e.g., Alfassi, 2003). In those environments learners are aware of the expectations of learning tasks and the available means to meet those expectations. As a result, the perceived confidence in successfully accomplishing the learning tasks could increase (Keller, 2008). Regarding GI's effects on attention, the fantasy element in GI might make learners feel curious about the game playing, thus increasing learner attention. The relationship between GI and relevance might be derived from learners' interests induced by the game playing. By reviewing the items in GI (Table 4) and the Relevance constructs (Table 5), the alignment between the two groups is strong in that both emphasize learners' belief that the content delivered by the game might be useful. This relationship can be explained based on the viewpoint of Self-Determination Theory (Ryan & Deci, 2000), which suggests a fluid interaction between learners' intrinsic and extrinsic motivation. Ryan, Rigby, and Przybylski (2006), in a video game study, further suggested a significant relationship between the enjoyment of playing and learners' perceived relatedness in the game environment. The more players play, the more they can relate to the game. Since GA consists of features pertaining to arousing learners' interests (e.g., graphics, animations), its connection to learner attention is well supported (Keller, 2008). However, the relationship between GA and its effects on cognitive investment (e.g., Mayer & Moreno, 2003) was not found by this study, which implies the possibility that the GA elements might be more than what learners could utilize in order to make the cognitive learning processes efficient. Second, perceived motivational support (attention, relevance, and confidence) could all directly contribute to the final outcome processing (i.e., satisfaction), while only relevance and confidence could significantly influence learners' cognitive investment. This finding concurs with the expectancy-value model depicted by Pintrich (1988), which articulates two approaches for connecting learners' motivation, as a mediating factor, with their cognitive learning. The first approach deals with learner confidence derived from self-efficacy. Learners with high levels of self-efficacy might invest more cognitive effort on the learning tasks than those who do not believe that they can be successful. The second approach explains that learners' cognitive investment can be driven by the perceived extrinsic values (relevance) of the learning tasks.
The absent relationship between learners' perceived attention and their cognitive investment, while not supported by the expectancy-value model, implies a negative effect of excessive multimedia elements on learner curiosity and interest, which consequently diminished the effects of learner attention on their cognitive investment. Finally, the significant path between learners' cognitive investment and the final outcome processing (satisfaction) confirms the relationship proposed by the theory of MVP. This finding suggests that the amount of cognitive effort investment, as the composite of overall cognitive load, can positively impact the perceived satisfaction level at the end of each learning cycle. Additional research efforts, however, are needed to identify which type of cognitive load (extraneous or germane) might contribute to this significant relationship.

6. Conclusion

This study set out to inform higher education institutions' design of DGBL for their students. In particular, it enables instructional game designers to consolidate DGBL features, thus reducing DGBL's demand on learners’ motivational and cognitive processing capacity. With only three game feature factors to focus on, the design complexity and developmental cost of DGBL could be significantly reduced, which in turn may lower the barrier to integrating DGBL for instructional purposes in colleges and universities. Furthermore, the findings of this study could help future investigations of DGBL's instructional effectiveness, since research efforts can focus on a shorter list of variables when examining the relationship between game features and learning outcomes. While this study was able to identify multiple empirical relationships among game feature factors, motivational components, cognitive investment, and outcome processing, one must be cautious in interpreting the results due to three limitations. First, as the DGBL features are mostly derived from the intrinsic aspect of learner motivation, the findings in this study should not be generalized to explain the effect of extrinsic motives. Second, it is unclear whether or not the target online instructional game was originally designed by following proven instructional design models; therefore, the study results cannot imply any relationship between the identified DGBL features and the intended learning outcomes. Finally, the findings of this study are only applicable to single-player DGBL environments in which collaborative play and socialization among players are kept to a minimum. Future research will extend the identified empirical framework to examine the relationship between specific learning outcomes (e.g., concept learning, procedural learning) and DGBL features.
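Researchers who wish to extend or replicate this framework with such a shorter list of variables could specify the structural portion of the identified model directly in standard SEM software. The sketch below is a minimal illustration using the Python package semopy and its lavaan-style model syntax; the latent labels and indicator names (e.g., ga1, att1) are placeholders introduced here for demonstration and do not correspond to the article's actual item assignments.

```python
import pandas as pd
import semopy

# The structural paths below mirror the relationships reported in this study;
# the measurement part (indicator names such as ga1, att1) is illustrative only.
MODEL_DESC = """
GA =~ ga1 + ga2 + ga3
GI =~ gi1 + gi2 + gi3
GS =~ gs1 + gs2 + gs3
Attention =~ att1 + att2 + att3
Relevance =~ rel1 + rel2 + rel3
Confidence =~ con1 + con2 + con3
Investment =~ inv1 + inv2 + inv3
Satisfaction =~ sat1 + sat2 + sat3
Attention ~ GA + GI
Relevance ~ GI
Confidence ~ GS
Investment ~ Relevance + Confidence
Satisfaction ~ Attention + Relevance + Confidence
"""

def fit_dgbl_model(data: pd.DataFrame) -> pd.DataFrame:
    """Fit the hypothesized model; `data` holds one column per observed indicator."""
    model = semopy.Model(MODEL_DESC)
    model.fit(data)
    print(semopy.calc_stats(model))  # global fit indices (chi-square, CFI, RMSEA, ...)
    return model.inspect()           # parameter estimates and standard errors
```

The returned table of estimates could then be evaluated against the same fit criteria and path-significance checks used in the original analysis.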

Appendix A

IMMS Items

1. When I first looked at the game, I had the impression that it would be easy for me.
2. There was something interesting at the beginning of the game that got my attention.
3. The game was more difficult to understand than I would like for it to be.
4. After reading the introductory information, I felt confident that I knew what I was supposed to learn from the game.
5. Completing the exercises in the game gave me a satisfying feeling of accomplishment.
6. It is clear to me how the content of the game is related to things I already know.
7. The game had so much information that it was hard to pick out and remember the important points.
8. The interface design of the game is eye-catching.
9. There were examples that showed me how the game could be important to some people in the learning setting.
10. Completing activities in the game successfully was important to me.
11. The quality of the writing in the game helped to hold my attention.
12. The content of the game is so abstract that it was hard to keep my attention on it.
13. As I worked on the game, I was confident that I could learn the content.
14. I enjoyed the game so much that I would like to know more about this topic.
15. The design of the game looks dry and unappealing.

16. The content of the game is relevant to my interests.
17. The way the information is arranged in the game helped keep my attention.
18. There are explanations or examples of how people use the knowledge in the game.
19. The activities in the game were too difficult.
20. The game has things that stimulated my curiosity.
21. I really enjoyed learning with the game.
22. The amount of repetition in the game caused me to get bored sometimes.
23. The content and style of writing in the game convey the impression that its content is worth knowing.
24. I learned some things that were surprising or unexpected with the game.
25. After working on the game for a while, I was confident that I would be able to pass a test on the content.
26. The game was not relevant to my needs because I already knew most of it.
27. The wording of feedback after the exercises, or of other comments in the game, helped me feel rewarded for my effort.
28. The variety of reading passages, activities, illustrations, etc., helped keep my attention on the game.
29. The style of writing in the game is boring.
30. I could relate the content of the game to things I have seen, done or thought about in my own life.
31. There are so many words on each game screen/page that it is irritating.
32. It felt good to successfully complete the game.
33. The content in the game will be useful to me.
34. I could not really understand quite a bit of the material in the game.
35. The good organization of the content in the game helped me be confident that I would learn this material.
36. It was a pleasure to work on such a well-designed game.

Mental Effort Investment Items

1. How much mental effort did you invest to learn the content from the game?
2. How difficult was it for you to learn the content from the game?
3. How much time did you spend to finish the entire game?
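For readers who wish to reuse the instrument, the sketch below illustrates one way the IMMS responses could be scored. It rests on assumptions that are not taken from the article: a 5-point response format, columns named imms_1 through imms_36, and a reverse-scored set consisting of the negatively worded items as inferred from their wording above.

```python
import pandas as pd

# Hypothetical scoring sketch for the IMMS items listed above.
# Assumptions (not from the article): 5-point response scale, columns named
# "imms_1"..."imms_36", and reverse scoring of the negatively worded items
# (item numbers inferred from the item wording, not a published key).
REVERSE_WORDED = {3, 7, 12, 15, 19, 22, 26, 29, 31, 34}
SCALE_MAX = 5

def score_imms(responses: pd.DataFrame) -> pd.Series:
    """Return one overall motivation score per respondent (mean of 36 items)."""
    scored = responses.copy()
    for item in REVERSE_WORDED:
        col = f"imms_{item}"
        scored[col] = (SCALE_MAX + 1) - scored[col]  # e.g., 5 -> 1, 4 -> 2
    item_cols = [f"imms_{i}" for i in range(1, 37)]
    return scored[item_cols].mean(axis=1)

# Illustrative usage with made-up responses for two respondents:
# df = pd.DataFrame({f"imms_{i}": [3, 4] for i in range(1, 37)})
# print(score_imms(df))
```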

References

Abt, C. (1970). Serious games. New York: Viking Press.
Alfassi, M. (2003). Promoting the will and skill of students at academic risk: An evaluation of an instructional design geared to foster achievement, self-efficacy and motivation. Journal of Instructional Psychology, 30, 28–40.
Amory, A. (2007). Game object model version II: A theoretical framework for educational game development. Educational Technology Research and Development, 55, 55–77.
Ang, C. S., Zaphiris, P., & Mahmood, S. (2007). A model of cognitive loads in massively multiplayer online role playing games. Interacting with Computers, 19, 167–179.
Asgari, M. (2005). A three-factor model of motivation and game design. Digital Games Research Conference (DIGRA), Vancouver, British Columbia, Canada.
Astleitner, H., & Wiesner, C. (2004). An integrated model of multimedia learning and motivation. Journal of Educational Multimedia and Hypermedia, 13, 3–21.
Baranauskas, M. C. C., Neto, N. G. G., & Borges, M. A. F. (1999). Learning at work through a multi-user synchronous simulation game. Paper presented at the PEG'99 Conference, University of Exeter, Exeter, UK.
Belanich, J., Daragh, E. S., & Kara, L. O. (2004). Instructional characteristics and motivational features of a PC-based game. U.S. Army Research Institute for the Behavioral and Social Sciences.
Björk, S., & Holopainen, J. (2003). Describing games: An interaction-centric structural framework. Digital Games Research Conference (DIGRA). Retrieved from http://www.digra.org/dl/db/05150.10348
Browne, M. W., & Cudeck, R. (1989). Single sample cross-validation indices for covariance structures. Multivariate Behavioral Research, 24, 445–455.
Caillois, R. (1962). Man, play, and games. London: Thames and Hudson.
Chandler, P., & Sweller, J. (1991). Cognitive load theory and the format of instruction. Cognition and Instruction, 8, 293–332.
Chang, M. M., & Lehman, J. D. (2002). Learning foreign language through an interactive multimedia program: An experimental study on the effects of the relevance component of the ARCS model. CALICO Journal, 20, 81–98.
Cobb, T. (1997). Cognitive efficiency: Toward a revised theory of media. Educational Technology Research and Development, 45, 21–35.
Crawford, C. (1982). The art of computer game design. Retrieved from http://pdf.textfiles.com/books/cgd-crawford.pdf
Csikszentmihalyi, M. (1990). Finding flow: The psychology of optimal experience. New York: Harper Perennial.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1992). Extrinsic and intrinsic motivation to use computers in the workplace. Journal of Applied Social Psychology, 22, 1111–1132.
de Felix, J. W., & Johnson, R. T. (1993). Learning from video games. Computers in the Schools, 9(2/3), 119–134.


DeLeeuw, K. E., & Mayer, R. E. (2008). A comparison of three measures of cognitive load: Evidence for separable measures of intrinsic, extraneous, and germane load. Journal of Educational Psychology, 100, 223–234.
DeVellis, R. F. (2003). Scale development: Theory and application (2nd ed.). Thousand Oaks, CA: Sage Publications.
Dickey, M. D. (2005). Engaging by design: How engagement strategies in popular computer and video games can inform instructional design. Educational Technology Research and Development, 53, 67–83.
Dickey, M. D. (2007). Game design and learning: A conjectural analysis of how massively multiple online role-playing games (MMORPGs) foster intrinsic motivation. Educational Technology Research and Development, 55, 253–273.
Driscoll, M. P. (2000). Introduction to theories of learning and instruction. In M. P. Driscoll (Ed.), Psychology of learning for instruction (pp. 3–28). Boston, MA: Allyn and Bacon.
Edery, D., & Mollick, E. (2009). Changing the game: How video games are transforming the future of business. Upper Saddle River, NJ: FT Press.
Entertainment Software Association (2011). Annual report. Washington, DC: Entertainment Software Association. Retrieved from http://www.theesa.com/about/ESA_2011_Annual_Report.pdf
Eow, Y. L., Ali, W. Z. b. W., Mahmud, R. b., & Baki, R. (2009). Form one students’ engagement with computer games and its effect on their academic achievement in a Malaysian secondary school. Computers & Education, 53, 1082–1091.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18, 39–50.
Garris, R., Ahlers, R., & Driskell, J. E. (2002). Games, motivation, and learning: A research and practice model. Simulation & Gaming, 33, 441–467.
Gee, J. (2003). What video games have to teach us about learning and literacy. New York: Palgrave Macmillan.
Gredler, M. (1994). Designing and evaluating games and simulations: A process approach. Houston, TX: Gulf Publishing Company.
Gredler, M. (1996). Educational games and simulations: A technology in search of a research paradigm. In D. H. Jonassen (Ed.), Handbook of research on educational communications and technology. New York: Macmillan.
Gunter, G. A., Kenny, R. F., & Vick, E. H. (2008). Taking educational games seriously: Using the RETAIN model to design endogenous fantasy into standalone educational games. Educational Technology Research and Development, 56, 511–537.
Hays, R. T. (2005). The effectiveness of instructional games: A literature review and discussion. Orlando, FL: Naval Air Warfare Center Training Systems Division.
Hayton, J. C., Allen, D. G., & Scarpello, V. (2004). Factor retention decisions in exploratory factor analysis: A tutorial on parallel analysis. Organizational Research Methods, 7, 191–205.
Huang, W. H. (2011). Learners’ motivational processing and mental effort investment in an online game-based learning environment: A preliminary analysis. Computers in Human Behavior, 27, 694–704.
Huang, W. H., Huang, W. Y., Diefes-Dux, H., & Imbrie, P. K. (2006). Preliminary validation of ARCS model-based Instructional Material Motivational Survey (IMMS) in a computer-based tutorial setting using LISREL measurement model. British Journal of Educational Technology, 37, 243–259.
Huang, W. H., Huang, W. Y., & Tschopp, J. A. (2010). Sustaining iterative game playing processes in DGBL: The relationship between motivational processing and outcome processing. Computers & Education, 55, 789–797.
Huang, W., & Johnson, T. (2008). Instructional game design using cognitive load theory. In R. Ferdig (Ed.), Handbook of research on effective electronic gaming in education (pp. 1143–1165). Hershey, PA: Information Science Reference.
Johnson, L., Adams, S., & Cummins, M. (2012). The NMC horizon report: 2012 higher education edition. Austin, TX: New Media Consortium. Retrieved from http://www.nmc.org/pdf/2010-Horizon-Report.pdf
Johnson, T. E., Lee, Y. M., Lee, M. Y., O'Connor, D. L., Khalil, M. K., & Huang, X. X. (2007). Measuring sharedness of team-related knowledge: Design and validation of a shared mental model instrument. Human Resource Development International, 10, 437–454.
Kaiser, H. F. (1960). The application of electronic computers to factor analysis. Educational and Psychological Measurement, 20, 141–151.
Kalyuga, S. (2007). Enhancing instructional efficiency of interactive e-learning environments: A cognitive load perspective. Educational Psychology Review, 19, 387–399.
Kasvi, J. J. J. (2000). Not just fun and games—Internet games as a training medium. In P. Kymäläinen, & L. Seppänen (Eds.), Cosiga - Learning with computerised simulation games (pp. 23–34). Espoo: HUT.
Ke, F. (2008). A case study of computer gaming for math: Engaged learning from gameplay? Computers & Education, 52, 1609–1620.
Keller, J. M. (1983). Motivational design of instruction. In C. M. Reigeluth (Ed.), Instructional design theories and models: An overview of their current status (pp. 386–434). Hillsdale, NJ: Lawrence Erlbaum Associates.
Keller, J. M. (1987a). Strategies for stimulating the motivation to learn. Performance and Instruction, 26, 1–7.
Keller, J. M. (1987b). The systematic process of motivational design. Performance and Instruction, 26(9/10), 1–8.
Keller, J. M. (1993). Motivation by design. Unpublished manuscript, Florida State University, Florida.
Keller, J. M. (2008). An integrative theory of motivation, volition, and performance. Technology, Instruction, Cognition and Learning, 6, 79–104.
Kim, C. M., & Keller, J. M. (2011). Towards technology integration: The impact of motivational and volitional email messages. Educational Technology Research and Development, 59, 91–111.
Kirriemuir, J. (2002). Video gaming, education and digital learning technologies. D-Lib Magazine. Retrieved from http://www.dlib.org/dlib/february02/kirriemuir/02kirriemuir.html


Kirriemuir, J., & McFarlane, A. (2006). Literature review in games and learning. Futurelab Series. Futurelab.
Malone, T. W. (1980). What makes things fun to learn? A study of intrinsically motivating computer games. Palo Alto, CA: Xerox Palo Alto Research Center.
Malone, T. W., & Lepper, M. R. (1987). Making learning fun: A taxonomy of intrinsic motivations for learning. In R. E. Snow, & M. J. Farr (Eds.), Aptitude, learning, and instruction: Cognitive and affective process analyses (Vol. 3, pp. 223–253). Hillsdale, NJ: Lawrence Erlbaum.
Mayer, R. E. (2001). Multimedia learning. New York: Cambridge University Press.
Mayer, R. E., & Moreno, R. (2003). Nine ways to reduce cognitive load in multimedia learning. Educational Psychologist, 38, 43–52.
McDonald, R. P. (1985). Factor analysis and related methods. Hillsdale, NJ: Lawrence Erlbaum.
McGrenery, J. (1996). Design: Educational electronic multi-player games - A literature review. University of British Columbia.
Merrill, M. D. (2007). First principles of instruction: A synthesis. In R. A. Reiser, & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (2nd ed., pp. 62–71). Upper Saddle River, NJ: Merrill/Prentice Hall.
Moos, D. C. (2009). Note-taking while learning hypermedia: Cognitive and motivational considerations. Computers in Human Behavior, 25, 1120–1128.
Moreno, R., & Mayer, R. (2007). Interactive multimodal learning environments. Educational Psychology Review, 19, 309–326.
Moreno-Ger, P., Burgos, D., Sierra, J. L., & Fernández-Manjón, B. (2008). Educational game design for online education. Computers in Human Behavior, 24, 2530–2540.
Nelson, B. C., & Erlandson, B. E. (2008). Managing cognitive load in educational multi-user virtual environments: Reflection on design practice. Educational Technology Research and Development, 56, 619–641.
Paas, F., Tuovinen, J. E., Tabbers, H., & van Gerven, P. W. M. (2003). Cognitive load measurement as a means to advance cognitive load theory. Educational Psychologist, 38, 63–71.
Paas, F. G. W. C., & van Merriënboer, J. J. G. (1994). Instructional control of cognitive load in the training of complex cognitive tasks. Educational Psychology Review, 6, 351–371.
Pannese, L., & Carlesi, M. (2007). Games and learning come together to maximize effectiveness: The challenge of bridging the gap. British Journal of Educational Technology, 38, 438–454.
Papastergiou, M. (2009). Digital game-based learning in high school computer science education: Impact on educational effectiveness and student motivation. Computers & Education, 52, 1–12.
Pintrich, P. R. (1988). A process-oriented view of student motivation and cognition. In J. S. Stark, & L. A. Mets (Eds.), Improving teaching and learning through research. New Directions for Institutional Research, 57 (pp. 65–79). San Francisco, CA: Jossey-Bass.
Prenksy, M. (2001). Digital natives, digital immigrants. On the Horizon. Retrieved from http://www.marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf
Raybourn, E. M. (2006). Simulation experience design methods for training the forces to think adaptively. Proceedings of the Interservice/Industry Training, Simulation and Education Conference. Orlando, FL.

Rieber, L. P., & Matzko, M. J. (2001). Serious design of serious play in physics. Educational Technology Research and Development, 41, 14–24.
Rieber, L., & Noah, D. (2008). Games, simulations, and visual metaphors in education: Antagonism between enjoyment and learning. Educational Media International, 45, 77–92.
Ritterfeld, U., Shen, C., Wang, H., Nocera, L., & Wong, W. L. (2009). Multimodality and interactivity: Connecting properties of serious games with educational outcomes. Cyberpsychology & Behavior, 12, 691–697.
Ryan, R. M., & Deci, E. L. (2000). Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. American Psychologist, 55, 68–78.
Ryan, R. M., Rigby, C. S., & Przybylski, A. (2006). The motivational pull of video games: A self-determination theory approach. Motivation and Emotion, 30, 347–363.
Scheiter, K., & Gerjets, P. (2007). Learner control in hypermedia environments. Educational Psychology Review, 19, 285–307.
Shellnut, B., Knowlton, A., & Savage, T. (1999). Applying the ARCS model to the design and development of computer-based modules for manufacturing engineering courses. Educational Technology Research and Development, 47, 100–110.
Small, R. V. (2000). Motivation in instructional design. Teacher Librarian, 27, 29–31.
Small, R. V., & Gluck, M. (1994). The relationship of motivational conditions to effective instructional attributes: A magnitude scaling approach. Educational Technology, 34, 33–40.
Song, S. H., & Keller, J. M. (2001). Effectiveness of motivationally adaptive computer-assisted instruction on the dynamic aspects of motivation. Educational Technology Research and Development, 49, 5–22.
Spector, J. M., & Merrill, M. D. (2008). Editorial. Distance Education, 29, 123–126.
Steers, R. M., & Porter, L. M. (1983). Motivation and work behavior. New York: McGraw-Hill.
Suits, B. (1978). The grasshopper: Games, life, and utopia. Toronto: University of Toronto Press.
Sweller, J., van Merriënboer, J. J. G., & Paas, F. G. W. C. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10, 251–296.
van Eck, R. (2006). Digital game-based learning: It's not just the digital natives who are restless. EDUCAUSE Review, 41, 16–30.
van Gerven, P. W. M., Paas, F., van Merriënboer, J. J. G., & Schmidt, H. G. (2006). Modality and variability as factors in training the elderly. Applied Cognitive Psychology, 20, 311–320.
van Merriënboer, J. J. G., Clark, R., & de Croock, M. B. M. (2002). Blueprints for complex learning: The 4C/ID-model. Educational Technology Research and Development, 50, 39–64.
Vogel, J. F., Vogel, D. S., Cannon-Bowers, J., Bowers, C. A., Muse, K., & Wright, M. (2006). Computer gaming and interactive simulations for learning: A meta-analysis. Journal of Educational Computing Research, 34, 229–243.
Westera, W., Nadolski, R. J., Hummel, H. G. K., & Wopereis, I. G. J. H. (2008). Serious games for higher education: A framework for reducing design complexity. Journal of Computer Assisted Learning, 24, 420–432.