Students' perceptions of instructors' roles in blended and online learning environments: A comparative study

Computers & Education 81 (2015) 315–325
Contents lists available at ScienceDirect
Computers & Education journal homepage: www.elsevier.com/locate/compedu

Min-Ling Hung a, *, Chien Chou b

a Teacher Education Center, Ming Chuan University, 5 De Ming Rd., Gui Shan District, Taoyuan County 333, Taiwan
b Institute of Education, National Chiao Tung University, 1001 Ta-Hsueh Rd., Hsinchu 30010, Taiwan

Article info

Article history: Received 9 September 2014; Received in revised form 23 October 2014; Accepted 24 October 2014; Available online 1 November 2014

Keywords: Distance education and telelearning; Interactive learning environments; Teaching/learning strategies; Evaluation methodologies

Abstract

This study develops an instrument, the Online Instructor Role and Behavior Scale (OIRBS), and uses it to examine students' perceptions of instructors' roles in blended and online learning environments. A total sample of 750 university students participated in this study. Through a confirmatory factor analysis, the OIRBS was validated in five constructs: course designer and organizer (CDO), discussion facilitator (DF), social supporter (SS), technology facilitator (TF), and assessment designer (AD). The results show that the five-factor structure remained invariant across the blended learning and online learning environments. Students in both the blended and online learning environments assigned the greatest weight to the CDO dimension, followed by the TF and DF dimensions. In addition, students in the online learning environments scored higher in the DF dimension than did those in the blended learning environments. © 2014 Elsevier Ltd. All rights reserved.

1. Introduction

With technology rapidly expanding into higher education, online instruction has emerged as a popular mode of teaching and a substantial supplement to traditional teaching. Over the past few years, a growing number of studies have explored the perspectives of online instructors who use various technologies and pedagogies for teaching (e.g., Bailey & Card, 2009; Ellis, Hughes, Weyers, & Riding, 2009; Motaghian, Hassanzadeh, & Moghadam, 2013; Zingaro & Porter, 2014). Research in this field has generally concluded that educators regard traditional education as chiefly instructor-centered and online education as chiefly student-centered. Owing to the ongoing shift from traditional classroom-based education to online education, many instructors no longer have direct control of the teaching process; they act more as facilitators than as traditional lecturers (Arbaugh, 2010; Schoonenboom, 2012).

Instructors have many concerns when taking on the role of online educators. The preliminary concern is how to adapt to the relatively new role and thus effectively shoulder the related responsibilities required by online education. A significant role adjustment may be required of students as well if they want to be successful in an online learning environment: students may shift from being traditional, passive classroom learners to being active online inquirers. With such changes in learning contexts and in the roles of instructors and students, corresponding changes may have taken place in students' expectations and perceptions regarding the competence with which teachers should provide assistance, whether in advance of engaging in online studies or while in the process of doing so (Matzat, 2013; Zingaro & Porter, 2014).

In addition to online education, blended teaching is growing in popularity. Educators regard it as an essential teaching component that promotes effective learning (Matzat, 2013; Ocak, 2011). Dziuban, Moskal, and Hartman (2005) identified two principal advantages from which participants in blended teaching can benefit: strengthened learning engagement and strengthened interaction. However, Humbert (2007) showed that faculty members are under sometimes oppressive pressure to deal with online interactions and technical issues in blended courses. Ocak (2011) proposed that the reasons for faculty members' lack of interest in teaching blended courses include formidably complex course structures, the necessity of intimidatingly careful preparation and planning, and a lack of effective communication. In response to such problems, Salmon and Lawless (2006) stated that instructors' changing roles constitute a critical issue in blended teaching.

The purpose of this study, therefore, is to examine students' perceptions of instructors' roles and associated behaviors in learning environments that are entirely or partially web-based.

* Corresponding author. Tel.: +886 3 350 7001; fax: +886 3 3593887. E-mail address: [email protected] (M.-L. Hung).
http://dx.doi.org/10.1016/j.compedu.2014.10.022
0360-1315/© 2014 Elsevier Ltd. All rights reserved.

In this study, we have adopted the definition that Lin and Overbaugh (2009) assigned to the term 'blended instruction': it is teaching "in which a blend of both traditional classroom instruction and online learning activities are utilized, including synchronous and asynchronous communication modes" (p. 999). To examine students' perceptions of instructors' roles in different web-based learning environments, we created the Online Instructor Role and Behavior Scale (OIRBS) and examined its psychometric properties in two samples of students. The first sample comprised students enrolled in courses delivered in a blended learning environment, while the second sample comprised students enrolled in a course delivered in an online learning environment. In this regard, we asked four research questions:

1. Can a measurement model of the OIRBS be established?
2. If it can be established, is the measurement model of the OIRBS invariant across two distinct learning environments?
3. What perceptions do college students have toward their instructors' roles in two distinct learning environments?
4. Do the learning environments correspond to any difference in college students' perceptions of the roles and the associated behaviors of instructors?

2. Literature review

2.1. Studies on online instructor functions

There is a growing understanding that teaching online is different from teaching face-to-face. Cho and Cho (2014) pointed out that online instructors' scaffolding for interaction had a significantly positive influence on students' behavioral and emotional engagement. This finding strongly suggests that a particular set of pedagogies should be in place to help online teachers teach. Knowlton (2000) argued that instructors no longer amount to an umpire, a judge, or a dictator; instead, they serve students in the capacity of a coach, a counselor, a mentor, and a facilitator.

In an interview-based study of online instructors, Hsieh (2010) examined interactive activities, evaluation criteria, and self-expectations to identify the experiences of online instructors. In a similar study, Liu, Bonk, Magjuka, Lee, and Su (2005) conducted interviews with 28 faculty members and explored four dimensions of online teachers' roles: the pedagogical, managerial, social, and technical dimensions. That study suggested that instructors attempting to establish a more engaging environment for online learning should play roles that have been transformed pedagogically, socially, and technologically. Another relevant study was undertaken by Lim and Lee (2008). These researchers argued that teachers in computer-supported learning environments should have technical, managerial, and facilitative skills, and that discussions about teachers' roles should be open to a more diverse set of views. Similarly, Wilson, Ludwig-Hardman, Thornam, and Dunlap (2004) directly identified five significant tasks that instructors should perform: (1) providing a learning-oriented infrastructure that comprises syllabi, calendars, communication tools, and instruction resources; (2) modeling various strategies for effective participation, collaboration, and learning; (3) monitoring and assessing students' learning and providing them feedback, remediation, and grades; (4) troubleshooting and resolving instructional, interpersonal, and technical problems; and (5) creating a learning community characterized by an atmosphere of trust and reciprocal concern.

Most of the prior literature, as mentioned above, was based more on conceptual development and qualitative interview data than on quantitative data analysis of instructors' changing roles; however, recent research on online teaching has started to probe perspectives drawn from solid, diverse samples. Mazzolini and Maddison (2007) investigated how instructors' participation rates, the timing of instructor postings, and the nature of these postings are related to students' academic engagement and to their perception of this engagement. The findings indicate that instructors' efforts to post on forums could influence students' discussions and participation on the forums in unexpected ways. Cho and Cho (2014) used a sample of 158 college students and found that instructors' role as a facilitator of social interaction is critical in creating positive online learning environments, a pattern that in turn promotes academic engagement among students.

In fact, recent research has examined how instructors' characteristics, attitudes, and behaviors can influence online courses. For example, Liaw, Huang, and Chen (2007) presented questionnaires to a sample of 30 instructors and 168 college students.
The results indicate that the instructors had very positive attitudes toward e-learning, particularly in regard to perceived self-efficacy, enjoyment, usefulness, and behavioral intention of use. Liaw et al. also noted that system satisfaction and multimedia instruction could positively affect instructors' attitudes toward and enjoyment of e-learning. Similarly, Arbaugh (2010) evaluated faculty members' characteristics and behaviors on display in 46 MBA courses offered by a Midwestern U.S. university. According to the findings, instructor behavior is an important factor in the enhancement of student learning outcomes. Teaching presence and immediacy behaviors were positive predictors of students' perceived learning and satisfaction with the educational delivery medium. Hence, Arbaugh suggested that instructors should structure and organize their courses in advance so that they can focus on efficient engagement with their students while class is in progress.

A number of studies have empirically investigated educational Internet use, which has the potential to motivate students and to strengthen their interactive behaviors and their autonomy in the educational process (Claudia, Steil, & Todesco, 2004). However, some studies have shown that online instructors lack the time, the relevant training, or the support to make proper use of such Internet tools (Muir-Herzig, 2004). In a study conducted in the Netherlands, Mahdizadeh, Biemans, and Mulder (2008) noted that instructors used e-learning tools mainly to present course announcements, news, course materials, and PowerPoint displays. These uses were all for preliminary presentation purposes rather than for advanced communication or collaboration purposes. In other words, even when all kinds of e-learning tools are available, instructors tend to use relatively basic tools for teaching instead of tools for online communication or collaboration. In order to promote instructors' effective online teaching and to eliminate barriers in the teaching process, educators in general should strive to understand instructors' roles in online learning environments as well as instructors' associated behaviors.

2.2. Online instructors' roles and behaviors

To adequately examine students' perceptions and perspectives of online instructors' roles and behaviors, researchers need an appropriate framework and a valid instrument with which they can categorize and measure participants' perceptions.

Kim and Bonk (2006) argued that the most important skills for an online instructor are the ability to moderate or facilitate learning and the ability to develop or plan for high-quality online courses. Liu et al. (2005) placed a stronger emphasis on online instructors' pedagogical roles, including those of course designer, profession-inspirer, feedback-giver, and interaction-facilitator. However, online instructors in general carry out a diverse array of important roles to varying degrees, and neither Kim and Bonk's study nor Liu et al.'s study addressed the particularly important role of assessment designer. Thus, the current study reviews the following additional dimensions that may be involved in instructors' roles and behaviors: course designer and organizer, discussion facilitator, social supporter, technology facilitator, and assessment designer. These roles are also the focus of our proposed Online Instructor Role and Behavior Scale (OIRBS), which will be discussed in greater detail in the method section.

2.2.1. Course designer and organizer

Anderson, Rourke, Garrison, and Archer (2001) stated that instructors' development of digitally formatted courses can get instructors to think through the process, structure, evaluation, and interaction components of the courses well before the first day of class. Instructors should establish clear guidelines for in-class student participation and should present their students with information about course expectations and procedures (Bailey & Card, 2009; Eom, Wen, & Ashill, 2006). In order to integrate technology components into a course, instructors should use course-management software tools, give students links to web sites and supplemental course materials, and generally make all course materials available to students by the first day of class (Bailey & Card, 2009). These learning arrangements can help students shoulder more responsibility for their own learning, can engage students in deeper and broader interactions with course materials, can promote student collaboration in learning processes, and thus can enhance the degree to which students' learning experiences are positive (Heuer & King, 2004). Shea, Li, and Pickett (2006) conducted a study using a sample of 1067 students across 32 different colleges and found that students were more likely to report experiencing a greater sense of a learning community when the courses had effective instructional design and organization.

The dimension of "course design and organization" includes instructors' role in coordinating learning activities and in handling overall course structure. Online instructors use clearly structured content and timetables to convey course expectations to students, in turn improving the quality of the given course and facilitating students' positive learning experience (Liu et al., 2005). In order to construct relevant items for course design and organization in our proposed OIRBS, we created a pool of behavioral items by both writing new items and adapting items from Liu et al. (2005). In this way, we selected three items, an example of which is "The instructor provides clear syllabi (e.g., goals, organization, policies, expectations, and requirements) to students at the beginning of the course."

2.2.2. Discussion facilitator

Rovai (2007) characterized online constructivist learning environments as discourses, typically in the form of online discussions. Hara, Bonk, and Anjeli (2000) reported that online discussions can constitute a text-based digital record of concepts, plans, answers to questions, and strategies, and thus can facilitate meaningful processing of information.
Other research shows that online discussion can help students reflect on their own perspectives (MacKnight, 2000), foster their metacognitive skills (McDuffie & Slavit, 2003), and strengthen their critical-thinking skills (Jeong, 2003). Thus, instructors should seek and implement effective strategies for facilitating online discussion, such as promoting students' motivation to engage in productive discussions and engaging students in socio-emotional discussions and authentic content- and task-oriented discussions (Rovai, 2007). When facilitating discussion, instructors must assess student comments, give feedback on student comments, share opinions with students, ask questions of students, encourage students to explore new concepts in the course, keep students focused on the tasks at hand, draw out shy or reserved students, and praise students for their productive efforts (Arbaugh, 2010; Dringus, Snyder, & Terrell, 2010). Arbaugh (2001) pointed out that instructors could promote discussion and feedback through the use of text-based discussion, emoticons, personal examples, or audio clips. Instructors' attempts to reduce the social and psychological distance between themselves and their students are often on display in the behaviors that the instructors exhibit when directly responding to students' own behaviors. From the above-mentioned studies, we have concluded that the role of discussion facilitator corresponds to an essential dimension in our OIRBS. We created four behavioral items in one of two ways: either by writing novel items or by adapting concepts from Rovai (2007) and Arbaugh (2010). One such item is "The instructor encourages students to engage in critical and reflective thinking in online discussion."

2.2.3. Social supporter

Bailey and Card (2009) identified the fostering of relationships as a significant means by which online instructors express their empathy for students, their passion for teaching, and their strong desire to help students succeed in their college-level learning. In this same vein, Yuan and Kim (2014) uncovered evidence that the dropout problem for some online programs may be attributable to a lack of interaction between learners and instructors. This lack of interaction can also lead to learners' feelings of isolation. Learning communities can strengthen learners' interactions with one another and with their instructors and can, by the same token, alleviate learners' feelings of isolation. Kreijns, Kirschner, Jochems, and van Buuren (2007) defined 'sociability' as the extent to which people perceive a computer-supported collaborative learning (CSCL) environment to be capable of facilitating the emergence of a sound social space with attributes such as trust and belonging, a strong sense of community, and good working relationships.

Shea et al. (2006) conducted a study using Rovai's (2002a, 2002b) Classroom Community Index as a diagnostic instrument to investigate college students' levels of connectedness and learning in both online courses and classroom-based courses with online components. These researchers found that if an instructor reinforced student contributions, thereby furthering students' own knowledge and confirming student understanding, students were more likely to report a strong sense of learning community. Kang and Im (2013) proposed that instructional interaction can predict learners' outcomes and satisfaction in online learning environments. However, the researchers also showed that social interaction such as social intimacy could negatively affect perceived learning achievement and satisfaction. In order to construct the "social supporter"-related items in our proposed OIRBS, we developed several behavioral items. One such item is "The instructor helps foster a sense of community in this online course."

2.2.4. Technology facilitator

One element of the successful and effective implementation of online learning is the effective use of technology (Bailey & Card, 2009). Educators have made efforts to integrate emerging Internet technologies into the teaching and learning process in higher education (Roby, Ashe, Singh, & Clark, 2013). For example, researchers have reported the plausibility of using blogs (Martindale & Wiley, 2005) and wikis (Lamb, 2004) for online student collaboration and reflection. However, Condie and Livingston (2007) argued that, in their sample, most instructors expressed little confidence in the technical aspects of using information and communication technology (ICT) for teaching. Instructors not only doubted the alleged benefits of ICT use for the subject being taught but also exhibited unwillingness to learn proper ICT use for the promotion of students' learning. This evidence of unwillingness and reluctance on the part of instructors raises the question of the extent to which learning institutions require instructors to adapt their practices or adopt new approaches for the purpose of maximizing new technologies' potential support of learning and teaching (Condie & Livingston, 2007; Schoonenboom, 2014). Brill and Galloway (2007) suggested that learning institutions should provide instructors an appropriate degree of support clarifying the diverse positive influences that equally diverse technologies can have on the classroom (e.g., presentations, interactions, attention); in turn, instructors can develop positive attitudes and proficiency in selecting the most useful technologies for specific pedagogical goals.

Berge (1995) pointed out that online instructors' technical role required them to have a minimum degree of knowledge, skill, and comfort with communication tools. Online instructors' technical roles included supporting students with technical resources, addressing technical concerns, diagnosing and clarifying problems encountered, and allowing students sufficient time to learn new programs (Bonk, Kirkley, Hara, & Dennen, 2001). A general principle seems to be that technology can create or bolster unique opportunities for promoting reflective and collaborative learning (Frank, 2006). These technical functions depend on the degree to which instructors not only become comfortable and proficient with the technology being used but also can transfer that level of comfort to the learners (Liu et al., 2005). Since online instructors can strengthen learners' comfort with course-based technical supports, the current study examines behavioral items related to instructors' technology use and support in our OIRBS. One such item is "The instructor uses tools and technologies (e.g., PowerPoint, audio devices, video devices, multimedia) that foster our learning."

2.2.5. Assessment designer

Bransford, Brown, and Cocking (1999) argued that learner-centered environments emphasize forms of assessment that make learners' thinking visible to themselves. Traditional tests usually impose a standardized procedure on all students in a class at the same controlled time and location. In contrast, online assessment and evaluation activities, without face-to-face interactions and observations, can be a completely different process from those in a traditional teaching situation (Azza, 2001). Online testing, if it is to be rigorous, must address the issues of identity security and academic honesty.
Researchers have paid some attention to the growing ease with which students commit text plagiarism using the Internet (Liu, Lo, & Wang, 2013; McMurtry, 2001). Searching the Internet for pertinent textual information, finding it, copying it and pasting it into a document, and passing the document off as an original work can be done easily and quickly (Rovai, 2000). This activity can be a grave problem for online tests. Another problem with online tests is that instructors cannot easily ensure the simultaneity of all students' test-taking (Olt, 2002). If simultaneity is not enforced, earlier test-takers can supply answers to later test-takers whenever the questions on the test remain unchanged. Thus, student assessment in online environments becomes more challenging for instructors, who cannot observe students' test-taking behaviors directly.

Webb, Gibson, and Forkosh-Baruch (2013) identified a need for alternative assessment approaches and instruments for measuring the complex, higher-order outcomes empowered by technology-enriched learning experiences. Moreover, with rapid advancements in technology, instructors can now assess students through personalized measures to evaluate the students' attainment of higher-order learning outcomes (Yeh, 2010). Assessments such as simulations, e-portfolios, and interactive games create a medium for analyzing a broad range of knowledge, even in real time (Clarke & Dede, 2010; Gibson, Aldrich, & Prensky, 2007). Robles and Braathen (2002) argued that online assessment requires a more ongoing and systematic approach than that used with traditional instruction. Hence, online instructors need to use effective assessment strategies and techniques, such as creative designs and approaches in projects, portfolios, self-assessments, peer evaluations, and weekly assignments, all coupled with immediate feedback (Gaytan & McEwen, 2007; Rovai, 2000). Consequently, designing forms of assessment to measure student performance is another important task for online teachers. In order to construct relevant items pertinent to assessment design in our OIRBS, we created a pool of behavioral items by both writing new items and adapting some items from Gaytan and McEwen (2007). One such item is "The instructor designs exam questions or assessment activities that cover information in lectures and reading in proportion to the importance in the course."

As the research reviewed above suggests, online teaching creates a new type of educational experience that requires a re-examination of online instructors' roles and their associated behaviors. In this study, we propose that the following five dimensions of the OIRBS are empirically distinct from one another: course designer and organizer (CDO), discussion facilitator (DF), social supporter (SS), technology facilitator (TF), and assessment designer (AD). Additionally, the above-mentioned studies have led us to the conclusion that learners' perceptions of instructors' roles and associated behaviors in online learning environments are research issues worthy of further investigation. However, these studies have focused on a specific delivery format (i.e., only web-based instruction). There seems to be no cross-format comparison that would help us understand, from students' perspectives, the changing roles that online instructors play in different learning environments.

3. Method

3.1. Participants

The current study's sample consisted of 750 students.
All the students owned a computer, which they typically kept at home or in a dormitory. The participants' courses were of two types: blended courses (classroom-based courses with online components) and online courses.

A total of 367 undergraduate students were enrolled in two blended general-education courses at a private university in northern Taiwan: the first course was "Introduction to Environmental Protection" and the second was "Taiwanese Ecology." The first course had 172 students and the second course had 195 students. Each course had one instructor and two teaching assistants. The students attended face-to-face classes once a week for each of the two blended courses, which provided students with digital learning materials, including videos and slides, via the Moodle learning management system. Students were asked to post questions and comments in discussion spaces each week throughout the semester, and the instructors responded to students' postings. The grade distribution in these two courses was 10% for attendance and participation in face-to-face classroom time, 20% for group projects, 20% for online discussion, 25% for the midterm exam, and 25% for the final exam. The questionnaire was distributed by email after the midterm. The researchers gathered the data pertaining to this learning group over the 2010 spring semester and the 2010 fall semester.

In the online learning environment, 383 participants were enrolled in one cross-school general-education course entitled "Internet Literacy and Ethics." This and similar types of courses were offered via a self-developed e-campus learning management system. With semesters spanning 18 weeks, only three classes in the "Internet Literacy and Ethics" course were face-to-face: the orientation class, the midterm-exam class, and the final-exam class. One instructor and five teaching assistants were responsible for designing discussion topics, responding to individual postings, and providing general comments in online discussion forums. The instructor required students to participate in online forums, where discussions addressed topics assigned during the semester. This kind of participation counted for 30% of students' final grade. Students also needed to develop a project, which counted for 20% of the grade, and the midterm and final exams each counted for 25% of the grade. The instructor spent about 2 h every other day answering students' questions and participating in the discussions. The instructor's postings accounted for approximately 25–30% of total postings. The questionnaire was distributed by email before the final exam. The researchers gathered the data pertaining to this learning group over four consecutive semesters (the spring and fall semesters of 2010 and of 2011; see Table 1).

3.2. Instrument

The instrument used in the current study was the Online Instructor Role and Behavior Scale (OIRBS), consisting of two sections (Section A and Section B) that were identical for the two learning groups examined here. Section A comprised questions regarding demographic characteristics (e.g., gender) and student grade level (e.g., freshman, sophomore, junior, senior). Section B comprised 16 statements (i.e., items) addressing students' post-course views about a given instructor's roles and associated behaviors (regardless of whether the course was of the online or blended variety). To provide students with a clear and understandable instrument, we revised and adapted the draft of the survey to the current study's framework on the basis of three individuals' opinions: an expert with five years of instructional-design experience in online settings and two online instructors.
Students' agreement with each item was indicated on a five-point rating scale with the categories strongly agree, agree, neutral, disagree, and strongly disagree. The Appendix shows the mean and standard deviation of each item in the dimensions of the OIRBS.

4. Results

4.1. Total-sample CFA

The CFA of the OIRBS model rested on our assumption that online-instructor roles and associated behaviors would exhibit a five-factor structure composed of course designer and organizer, discussion facilitator, social supporter, technology facilitator, and assessment designer. In other words, students' responses to the OIRBS could be explained by means of five first-order factors, and covariance among the first-order factors could be explained by means of a second-order factor. Also, each item would have non-zero loadings on the measured first-order factors. Model fit is often evaluated with χ², which is highly sensitive to sample size; we therefore used the ratio of χ² to its degrees of freedom (χ²/df), with a value below 5.0 indicating an acceptable fit between the hypothetical model and the sample data (Carmines & McIver, 1981). Acceptable fit is further indicated by RMSEA values below 0.08, SRMR values below 0.05, and GFI and CFI values greater than 0.90 (Kline, 2005). The fit indices for our model were good (χ²/df = 3.525, RMSEA = 0.058, SRMR = 0.050, GFI = 0.94, CFI = 0.99). In addition to fit indices, we examined structural elements of the model such as factor loadings and squared multiple correlations. As can be seen in Fig. 1, each item loads on its intended factor, with factor loadings ranging from 0.49 to 0.85, and each factor loading is statistically significant. The range of the factor loadings is consistent with the range recommended for social science research, namely between 0.40 and 0.70 (Costello & Osborne, 2005). Fig. 1 presents the analytical results of the CFA.
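The paper does not report which software was used for the CFA. As a minimal sketch only, the hypothesized five-factor, second-order measurement model could be written in lavaan-style syntax for the semopy Python package; the item labels follow the Appendix, while the data file name is a hypothetical placeholder.

```python
# Minimal sketch (not the authors' code): the OIRBS measurement model in
# lavaan-style syntax for semopy. "oirbs_item_responses.csv" is a hypothetical
# 16-column file of item responses (one row per student).
import pandas as pd
from semopy import Model, calc_stats

MODEL_DESC = """
CDO =~ CDO1 + CDO2 + CDO3
DF =~ DF1 + DF2 + DF3 + DF4
SS =~ SS1 + SS2 + SS3
TF =~ TF1 + TF2 + TF3
AD =~ AD1 + AD2 + AD3
OIRBS =~ CDO + DF + SS + TF + AD
"""

data = pd.read_csv("oirbs_item_responses.csv")

model = Model(MODEL_DESC)
model.fit(data)

stats = calc_stats(model)            # includes chi2, DoF, RMSEA, CFI, GFI, ...
chi2 = stats["chi2"].iloc[0]
dof = stats["DoF"].iloc[0]
print(stats.T)
print("chi2/df =", chi2 / dof)       # the ratio reported in the paper
```

The last line of the model description adds the second-order OIRBS factor described in the text; dropping it and allowing the five first-order factors to covary is a simpler alternative specification.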

Table 1
Participant characteristics.

                        Blended course                                       Online course                              Total
Survey period           2010 spring & fall semesters                         Spring & fall semesters of 2010 & 2011
Course name             Introduction to Environmental Protection;            Internet Literacy and Ethics
                        Taiwanese Ecology
Number of respondents   367                                                  383                                        750
Gender: Male            107 (29.2%)                                          197 (51.2%)                                304
Gender: Female          260 (70.8%)                                          186 (48.8%)                                446
Grade: Freshman         41 (11.2%)                                           9 (2.3%)                                   50
Grade: Sophomore        65 (17.7%)                                           42 (11%)                                   107
Grade: Junior           106 (28.9%)                                          170 (44.4%)                                276
Grade: Senior           155 (42.2%)                                          162 (42.3%)                                317

Fig. 1. Results of the CFA: A five-factor model with factor loadings for the 16-item Online Instructor Role and Behavior Scale (OIRBS).

4.2. Measurement invariance analysis (blended vs. online)

Since this study used two samples from different courses with different requirements in two learning environments (blended vs. online), we had to test measurement invariance. Testing measurement invariance involves a series of increasingly restrictive hypotheses. Different levels of invariance are assessed with a hierarchical procedure comparing four multi-group models in a fixed order, from the least to the most restrictive model. An initial baseline model (Model 1) has no between-group invariance constraints on estimated parameters; the groups have the same form without restricting any non-fixed parameters across groups (Bollen, 1989). The baseline model shows a good fit, with an RMSEA of 0.060 and GFI and CFI greater than 0.90. In Model 2, the model form is the same across the groups and the factor pattern coefficients (loadings) are constrained to be identical across groups, because the pattern coefficients carry the information about the relationship between latent scores and observed scores (Steenkamp & Baumgartner, 1998). Model 2 also exhibits a good fit (RMSEA = 0.067, GFI = 0.92, CFI = 0.98). The next step was to assess the model (Model 3) in which factor loadings and measurement error variances were constrained to be equal in the two groups. Model 3 also had a good fit in terms of RMSEA, GFI, and CFI. Model 4 was the most restrictive model in this hierarchy, in which all three parameter matrices (factor loadings, intercepts, and residual variances) were simultaneously tested for equality. In this case of strict factorial invariance, the measurements across groups were not biased in any way and were identical in terms of the construct validity and reliability captured by the latent variables. The results show that the five factors in the OIRBS remained invariant across the blended learning and online learning groups, and the OIRBS can therefore be used as a valid and reliable research instrument for further comparisons. Table 2 shows the results of the invariance-testing procedure.
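The paper judges invariance from the stability of the fit indices across the increasingly constrained models. The conventional companion check for such a nested hierarchy, not reported in the paper, is the chi-square difference test between adjacent models; for example, using the Table 2 values for Models 1 and 2:

```latex
\Delta\chi^2 = \chi^2_{M2} - \chi^2_{M1} = 572.47 - 451.62 = 120.85,
\qquad
\Delta df = df_{M2} - df_{M1} = 214 - 194 = 20 .
```

Because χ² (and hence Δχ²) is highly sensitive to sample size with N = 750, inspecting the stability of CFI, GFI, RMSEA, and SRMR across the constrained models, as the authors do, is a common alternative.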

Table 2
Invariance testing across two different groups.

Model   χ²       df    RMSEA (a)   CFI (b)   GFI (c)   SRMR (d)
1       451.62   194   0.060       0.99      0.93      0.049
2       572.47   214   0.067       0.98      0.92      0.043
3       574.13   226   0.064       0.98      0.91      0.040
4       595.92   230   0.065       0.98      0.91      0.040

(a) Root mean square error of approximation. (b) Comparative fit index. (c) Goodness-of-fit index. (d) Standardized root mean square residual.

4.3. Differences among students' scores in the five OIRBS dimensions

We explore here how students' perceptions of a given instructor's roles and associated behaviors varied across the two samples stemming from the two learning environments. Table 3 presents students' mean scores and standard deviations on the five dimensions in the blended courses. To calculate each student's mean score for every dimension, we calculated the sum of the values pertaining to the answers to each item in a factor and then divided the sum by the number of that factor's items. All students' average scores for the different dimensions ranged from 3.66 to 3.99 on a 5-point Likert-type rating scale. In order to investigate the differences among the five dimensions of the scale, we conducted a multivariate repeated one-way ANOVA. By comparing the means of those five dimensions, we noted a pattern: the higher the mean score, the greater the weight (i.e., importance) students attributed to instructors' roles and behaviors. The results show that Hotelling's trace was significant (F = 32.262, p < 0.001). A post hoc test further revealed that the mean score of course designer and organizer (CDO) is greater than the other four factors' mean scores; the mean score of technology facilitator (TF) is greater than the mean scores of discussion facilitator (DF), social supporter (SS), and assessment designer (AD); and the mean scores of factors DF and AD are greater than the mean score of factor SS.
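As a minimal sketch of this scoring and comparison step (file and column names are assumed; note that the paper reports the multivariate test via Hotelling's trace, whereas the snippet below runs the univariate repeated-measures analogue available in statsmodels):

```python
# Minimal sketch (not the authors' code): compute each student's dimension
# score as the mean of that dimension's items, then compare the five
# dimensions within subjects.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

DIMENSIONS = {
    "CDO": ["CDO1", "CDO2", "CDO3"],
    "DF": ["DF1", "DF2", "DF3", "DF4"],
    "SS": ["SS1", "SS2", "SS3"],
    "TF": ["TF1", "TF2", "TF3"],
    "AD": ["AD1", "AD2", "AD3"],
}

items = pd.read_csv("oirbs_blended_items.csv")  # hypothetical blended-group item file

# Sum of the item responses in each factor divided by the number of items.
scores = pd.DataFrame({dim: items[cols].mean(axis=1) for dim, cols in DIMENSIONS.items()})
scores["student"] = scores.index

# Long format: one row per student per dimension, then a one-way
# repeated-measures ANOVA across the five dimensions.
long = scores.melt(id_vars="student", var_name="dimension", value_name="score")
print(AnovaRM(long, depvar="score", subject="student", within=["dimension"]).fit())
```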

Table 3
Results of a multivariate repeated one-way ANOVA and a post hoc test of the OIRBS for the blended learning environment (n = 367).

Factor                                 Mean    SD
Course designer and organizer (CDO)    3.99    0.71
Discussion facilitator (DF)            3.73    0.67
Social supporter (SS)                  3.66    0.70
Technology facilitator (TF)            3.85    0.75
Assessment designer (AD)               3.73    0.71

F value (Hotelling's trace): 32.262***
Summary of significant differences in paired samples in the post hoc test: CDO > TF > DF, AD > SS
***p < 0.001. All items were measured on a 5-point Likert scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree).

Table 4 presents students' mean scores and standard deviations on the five dimensions as they pertain to the online courses. All students' average scores range from 3.62 to 4.02 on a 5-point Likert-type rating scale. We conducted a multivariate repeated one-way ANOVA to examine the differences among the five dimensions of the scale. The results show that Hotelling's trace was significant (F = 39.698, p < 0.001). A post hoc test further revealed that the mean score of course designer and organizer (CDO) is greater than the other four factors' mean scores, and that the mean scores of discussion facilitator (DF) and technology facilitator (TF) are greater than the mean scores of assessment designer (AD) and social supporter (SS). As shown in Fig. 2, the radar graphs for the two learning environments have the same dimensions but exhibit different shapes.

Table 4
Results of a multivariate repeated one-way ANOVA and a post hoc test of the OIRBS regarding the online learning environment (n = 383).

Dimension                              Mean    SD
Course designer and organizer (CDO)    4.02    0.69
Discussion facilitator (DF)            3.88    0.60
Social supporter (SS)                  3.62    0.71
Technology facilitator (TF)            3.94    0.69
Assessment designer (AD)               3.74    0.67

F value (Hotelling's trace): 39.698***
Summary of significant differences in paired samples in the post hoc test: CDO > DF, TF > AD > SS
***p < 0.001.

Fig. 2. Radar charts for the mean comparison of the five roles in the two learning environments (Unit: five-point Likert scales).

4.4. Results of group comparison

We conducted an independent samples t-test to explore the differences between the blended learning environment and the online learning environment regarding the five measured factors of instructor roles. The results of this analysis revealed a statistically significant group difference between blended and online learners regarding their views of one role (factor): discussion facilitator (DF) (p < 0.01). As shown in Table 5, students weighted the role of discussion facilitator (DF) in online learning environments more heavily than the same role in blended learning environments. No significant group differences were found in students' views of the other four instructor roles (factors): course designer and organizer (CDO), social supporter (SS), technology facilitator (TF), and assessment designer (AD) (Table 5).

Table 5
Results of group comparison.

                                       Blended learning (n = 367)    Online learning (n = 383)
Factor                                 M       SD                    M       SD                   t
Course designer and organizer (CDO)    3.99    0.71                  4.02    0.69                 0.24
Discussion facilitator (DF)            3.73    0.67                  3.88    0.60                 2.52**
Social supporter (SS)                  3.66    0.70                  3.62    0.71                 1.02
Technology facilitator (TF)            3.85    0.75                  3.94    0.69                 1.54
Assessment designer (AD)               3.73    0.71                  3.74    0.67                 0.05

**p < 0.01, ***p < 0.001; statistically significant at a 95% confidence level (two-tailed).
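A minimal sketch of the group comparison reported above (the score files and their names are assumptions; this is not the authors' code):

```python
# Independent-samples t-tests comparing blended and online students on each
# OIRBS dimension score. Hypothetical files hold one row per student with
# columns CDO, DF, SS, TF, AD (the per-student dimension means).
import pandas as pd
from scipy import stats

blended = pd.read_csv("oirbs_blended_scores.csv")
online = pd.read_csv("oirbs_online_scores.csv")

for dim in ["CDO", "DF", "SS", "TF", "AD"]:
    t, p = stats.ttest_ind(online[dim], blended[dim])
    print(f"{dim}: t = {t:.2f}, p = {p:.4f}")
```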



5. Discussion

5.1. The measurement model of the OIRBS

In this study, we first drew from past research to establish the Online Instructor Role and Behavior Scale (OIRBS), and then we statistically investigated whether the five-factor structure underlying this scale could reflect students' perceptions of five online-instructor roles and their associated behaviors. Five main roles for online teaching emerged: course designer and organizer, discussion facilitator, social supporter, technology facilitator, and assessment designer. The first research question probed the measurement model of our OIRBS, and the confirmatory factor analysis results support the measurement model.

The second research question asked whether the OIRBS would yield evidence of invariance across the two learning environments. We used multi-group confirmatory factor analysis to evaluate measurement invariance across the blended learning and online learning groups. Analytical results show that the factor structures of a given instructor's roles, as perceived by students, remained invariant across the blended and online learning environments. That is, evidence emerged of an invariant pattern of factor loadings, measurement error variances, and factor variances between the two groups (learning environments). To put the matter another way, we found that measurement invariance was present across the two learning environments (online learning and blended learning) even though we had drawn our samples from different courses offered by different schools during different semesters. Therefore, we have concluded that the OIRBS items operated equivalently, so that meaningful comparisons across these two groups could be made.

5.2. College students' perceptions of five instructors' roles

The third research question inquired into college students' perceptions of their instructors' roles. Upon examining students' mean scores for the OIRBS factors as shown in Fig. 2, we found that both the college students engaged in blended learning and those engaged in online learning assigned the greatest weight to the dimension of course designer and organizer. This result is consistent with Yusoff and Salim (2012), whose results indicate that both course organization and a clear articulation of course expectations are crucial to the success of a class. The findings of the current study also support the proposition that the most important role of an online instructor is to act as an instructional designer who plans and prepares the course and who provides direct instruction (Ke, 2010). Without such course management and direction, as Arbaugh and Hwang (2006) proposed, students may be lost in the virtual learning world, which differs notably from the traditional classroom.

In addition to the role of course designer and organizer, instructors should play a positive role as technology facilitator in both the blended and online learning groups, as shown in Fig. 2. This finding indicates that instructors, regardless of which of the two learning environments they are teaching in, should be competently knowledgeable about technology to support instruction and learning. Jones (2004) showed that "many teachers who do not consider themselves to be well skilled in using ICT feel anxious about using it in front of a class who perhaps know more than they do" (p. 7). Therefore, it is crucial that instructors grasp how to use technology and have effective technological strategies to enhance courses.

Finally, as also shown in Fig. 2, this study's results show that the participating college students' mean scores for social supporter were by far the lowest scores for any of the five instructor roles. The significance of this finding is that, in each of the two learning environments, the social supporter role was neither obvious nor significant in comparison with the other roles. Although Garrison, Anderson, and Archer (2000) proposed that "social messages" are necessary to sustain interaction and to create and foster a community of discussion, instructors generally play less active roles in facilitating a learning community. Nevertheless, this finding does not suggest that instructors should refrain from becoming social supporters. Teachers, whether in an online or a blended environment, should endeavor to develop relationships of trust with students and to cultivate a general sense of belonging among students.

5.3. Learning-group differences in instructor roles and associated behaviors

One of the research questions in the current study asked whether the learning environment makes any difference in college students' perceptions of instructor roles and associated behaviors. On the basis of the independent samples t-test results (see Table 5), we highlight two interesting findings. First, students in the two learning environments rated the four instructor roles of course designer and organizer, social supporter, technology facilitator, and assessment designer equally. The equal ratings suggest that students expect instructors to perform these roles evenly across the two learning environments. Second, students in online learning environments assigned higher scores to the dimension of discussion facilitator than did students in blended learning environments. Although teaching via an online asynchronous discussion forum presents a challenge to instructors (Mazzolini & Maddison, 2007), the current study's participants in online learning environments were more likely than the participants in blended environments to form positive perceptions of instructors as discussion facilitators or "cheerleaders," whose function was to promote interactions among students.
In comparing the two learning environments, we found that students in online learning environments expected instructors to perform a more active, interactive, and reflective role than did students in blended learning environments. Instructors in blended learning environments generally have significant opportunities to interact with students and to give them immediate feedback during face-to-face encounters; thus, students in this type of environment may be less likely than students in the online learning environment to expect instructors to facilitate online discussion.

6. Implications, limitations, and conclusion

The results of this study reveal that online instructors' role as social supporters merits special attention from researchers. Both the blended-learning mean score and the online-learning mean score for the social-supporter dimension are the lowest among the mean scores for the five instructor roles.


Duncan-Howell (2010) and Matzat (2013) characterized the need to belong as a desire for regular social contact with people to whom one feels connected; in this light, teachers might do well to establish and sustain students' sense of belonging by developing interpersonal relationships and a sense of community, especially in online learning environments. For example, teachers can create learning communities in which group discussions, experience sharing, instant feedback, and so on keep students more involved in the course. Teachers can, if possible, send an email or make a phone call to relatively passive students to ask them the reason for their passivity and to draw them back into the course. Such practices might establish and sustain students' sense of belonging.

In addition, students attributed the highest degree of importance to the role of course designer and organizer in both the blended learning and online learning environments. These results are similar to those discussed in Bailey and Card (2009), who argued that organization is an important practice for online teachers. The current study's findings regarding students' perceptions suggest that online instructors should provide their students with effective organization, which entails well-defined course goals and learning objectives, clear syllabi, firmly articulated expectations, satisfactory availability of all course materials, and the like. Effective use of technology is an important component of effective practice for blended learning environments no less than for online learning environments. From students' perspectives, instructors should consider using a wide variety of technological tools to deliver course materials and to assist with student learning. Thus, institutions can, if possible, focus on providing instructors with technological training to enhance their online teaching.

The limitations of our study merit acknowledgment here and discussion in future research. First, because of its exploratory nature, this study did not check the OIRBS's criterion-related validity; that is, we did not collect students' data on the OIRBS and other similar scales concurrently. Future research may benefit from focusing on possible correlations between the OIRBS and similar scales for more concurrent evidence of validity. In particular, future research may find it worthwhile to address the test-retest reliability of the OIRBS. Second, this study compared two different learning environments (blended learning and online learning) using two distinct samples. We did not probe into the format-related differences as they pertained to course content, instructors, and schools. Future research seeking to examine the usefulness of the OIRBS for all academic disciplines should consider focusing on students from diverse colleges and courses.

In this study, we examined instructors' roles and associated behaviors across two samples of students in two learning environments. It seems that, in both blended learning and online learning contexts, students perceived their instructors as playing multiple roles, expected their instructors to play certain roles, and assigned varying degrees of importance to these roles. This study suggests that instructors reconsider their handling of these roles to support students' successful learning experiences in online as well as blended learning settings. The OIRBS helps clarify some of the benefits and drawbacks of various behaviors that instructors perform when inhabiting a particular role, and to this end, the OIRBS will hopefully strengthen the efficiency, the effectiveness, and the overall appeal with which instructors operate in these contexts. The findings of our present study should spur discovery of more applications of the OIRBS in future research, and we anticipate that these applications will be as diverse as the many contexts of online and blended learning.

Acknowledgments

The authors are grateful for funding from the Ministry of Science and Technology, Taiwan, grant numbers MOST 103-2511-S-130-001 and MOST 103-2511-S-009-009-MY2.

Appendix

List of constructs and items, mean, S.D., and inter-item correlation (N = 750).

Course designer and organizer (CDO)
CDO1 The instructor provides clear syllabi (e.g., goals, organization, policies, expectations, and requirements) to students at the beginning of the course. (M = 4.02, S.D. = 0.80, inter-item correlation = 0.732)
CDO2 The instructor provides supplemental course materials for online courses. (M = 4.08, S.D. = 0.79, inter-item correlation = 0.732)
CDO3 The instructor provides online courses that are well organized and presented. (M = 3.89, S.D. = 0.82, inter-item correlation = 0.752)

Discussion facilitator (DF)
DF1 The instructor encourages students to engage in critical and reflective thinking during online discussion. (M = 3.82, S.D. = 0.77, inter-item correlation = 0.694)
DF2 The instructor plays a role of facilitator, guide, or cheerleader in online discussion. (M = 3.79, S.D. = 0.87, inter-item correlation = 0.681)
DF3 The instructor gives feedback to students in online discussion. (M = 3.86, S.D. = 0.84, inter-item correlation = 0.714)
DF4 The instructor is helpful in guiding the class toward a reasonable understanding of course topics and concepts. (M = 3.73, S.D. = 0.84, inter-item correlation = 0.697)

Social supporter (SS)
SS1 The instructor helps foster a sense of community in this online course. (M = 3.79, S.D. = 0.83, inter-item correlation = 0.722)
SS2 The instructor establishes a harmonious learning climate in this course. (M = 3.84, S.D. = 0.80, inter-item correlation = 0.700)
SS3 The instructor knows students through online interactions. (M = 3.27, S.D. = 0.95, inter-item correlation = 0.616)

Technology facilitator (TF)
TF1 The instructor uses such tools and technologies as PowerPoint, audio, video, and multimedia devices, which are helpful in fostering learning. (M = 3.92, S.D. = 0.87, inter-item correlation = 0.685)
TF2 The instructor exposes students to tools and technologies that are easy to use. (M = 3.94, S.D. = 0.81, inter-item correlation = 0.648)
TF3 The instructor provides students with appropriate technical support when the students face such problems as system disconnects. (M = 3.83, S.D. = 0.80, inter-item correlation = 0.704)

Assessment designer (AD)
AD1 The instructor designs exam questions that facilitate higher-order thinking skills (analysis, synthesis). (M = 3.69, S.D. = 0.83, inter-item correlation = 0.658)
AD2 The instructor designs exam questions or assessment activities that cover information in lectures and reading in proportion to the importance in the course. (M = 3.76, S.D. = 0.80, inter-item correlation = 0.687)
AD3 The instructor contacts students who have not completed assignments and helps them complete assignments. (M = 3.74, S.D. = 0.95, inter-item correlation = 0.580)

Note: 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree.
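The Appendix does not spell out how the per-item "inter-item correlation" was computed. One plausible reading is the average Pearson correlation between an item and the other items of its dimension; a minimal sketch under that assumption (file and column names hypothetical):

```python
# Minimal sketch (an assumption, not the authors' documented procedure):
# each item's value = mean correlation with the other items in its dimension.
import pandas as pd

DIMENSIONS = {
    "CDO": ["CDO1", "CDO2", "CDO3"],
    "DF": ["DF1", "DF2", "DF3", "DF4"],
    "SS": ["SS1", "SS2", "SS3"],
    "TF": ["TF1", "TF2", "TF3"],
    "AD": ["AD1", "AD2", "AD3"],
}

items = pd.read_csv("oirbs_all_items.csv")  # hypothetical file, N = 750 rows

for dim, cols in DIMENSIONS.items():
    corr = items[cols].corr()
    for col in cols:
        # Mean correlation of this item with the other items of the same dimension.
        mean_r = corr.loc[col, [c for c in cols if c != col]].mean()
        print(f"{col}: mean inter-item r = {mean_r:.3f}")
```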

References

Arbaugh, J. B. (2001). How instructor immediacy behaviors affect student satisfaction and learning in web-based courses. Business Communication Quarterly, 64(4), 42–54.
Arbaugh, J. B., & Hwang, A. (2006). Does “teaching presence” exist in online MBA courses? The Internet and Higher Education, 9(1), 9–21.
Arbaugh, J. B. (2010). Sage, guide, both, or even more? An examination of instructor activity in online MBA courses. Computers & Education, 55(3), 1234–1244.
Anderson, T., Rourke, L., Garrison, D. R., & Archer, W. (2001). Assessing teaching presence in a computer conference context. Journal of Asynchronous Learning Networks, 5(2), 1–17.
Azza, A. A. (2001). Learning from the Web: are students ready or not? Journal of Educational Technology & Society, 4(4), 32–38.
Bailey, C. J., & Card, K. A. (2009). Effective pedagogical practices for online teaching: perception of experienced instructors. Internet and Higher Education, 12(3–4), 152–155.
Berge, Z. L. (1995). Facilitating computer conferencing: recommendations from the field. Educational Technology, 15(1), 22–30.
Bollen, K. A. (1989). Structural equations with latent variables. New York: John Wiley & Sons.
Bonk, C. J., Kirkley, J. R., Hara, N., & Dennen, N. (2001). Finding the instructor in post-secondary online learning: pedagogical, social, managerial, and technological locations. In J. Stephenson (Ed.), Teaching and learning online: Pedagogies for new technologies (pp. 76–97). London, UK: Kogan Page.
Bransford, J., Brown, A. L., & Cocking, R. R. (Eds.). (1999). How people learn: Brain, mind, experience, and school. Washington, D.C.: National Academy Press.
Brill, J. M., & Galloway, C. (2007). Perils and promises: University instructors' integration of technology in classroom-based practices. British Journal of Educational Technology, 38(1), 95–105.
Carmines, E. G., & McIver, J. P. (1981). Analyzing models with unobserved variables: analysis of covariance structures. In G. W. Bohrnstedt, & E. F. Borgatta (Eds.), Social measurement: Current issues (pp. 65–115). Beverly Hills, CA: Sage Publications.
Cho, M. H., & Cho, Y. J. (2014). Instructor scaffolding for interaction and students' academic engagement in online learning: mediating role of perceived online class goal structures. Internet and Higher Education, 21, 25–30.
Clarke, J., & Dede, C. (2010). Assessment, technology, and change. Journal of Research in Teacher Education, 42, 309–328.
Claudia, M., Steil, A., & Todesco, J. (2004). Factors influencing the adoption of the Internet as a teaching tool at foreign language schools. Computers & Education, 42(4), 353–374.
Condie, R., & Livingston, K. (2007). Blending online learning with traditional approaches: changing practices. British Journal of Educational Technology, 38(2), 337–348.
Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation, 10(7). Available from http://pareonline.net/getvn.asp?v=10&n=7 Accessed 27.08.14.
Dringus, L. P., Snyder, M. M., & Terrell, S. R. (2010). Facilitating discourse and enhancing teaching presence: using mini audio presentations in online forums. Internet and Higher Education, 13(1–2), 75–77.
Duncan-Howell, J. (2010). Teachers making connections: online communities as a source of professional learning. British Journal of Educational Technology, 41, 324–340.
Dziuban, C., Moskal, P., & Hartman, J. (2005). Higher education, blended learning and the generations: knowledge is power no more. In J. Bourne, & J. C. Moore (Eds.), Elements of quality online education: Engaging communities. Needham, MA: Sloan Center for Online Education.
Ellis, R. A., Hughes, J., Weyers, M., & Riding, P. (2009). University teacher approaches to design and teaching and concepts of learning technologies. Teaching and Teacher Education, 25(1), 109–117.
Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students' perceived learning outcomes and satisfaction in university online education: an empirical investigation. Decision Sciences Journal of Innovative Education, 4(2), 215–235.
Frank, M. (2006). How to teach using today's technology: matching the teaching strategy to the e-learning approach. In L. T. W. Hin, & R. Subramaniam (Eds.), Handbook of research on literacy in technology at the K–12 level (pp. 372–393). Hershey, PA: Idea Group Publishing.
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: computer conferencing in higher education. Internet and Higher Education, 2(2–3), 87–105.
Gaytan, J., & McEwen, B. C. (2007). Effective online instructional and assessment strategies. The American Journal of Distance Education, 21(3), 117–132.
Gibson, D., Aldrich, C., & Prensky, M. (2007). Games and simulations in online learning: Research and development frameworks. Hershey, PA: Information Science Publishing.
Hara, N., Bonk, C. J., & Anjeli, C. (2000). Content analysis of online discussions in an applied educational psychology course. Instructional Science, 28, 115–152.
Heuer, B. P., & King, K. (2004). Leading the band: the role of the instructor in online learning for educators. Journal of Interactive Learning Online, 3(1). Retrieved April 12, 2014, from http://www.ncolr.org/jiol/issues/pdf/3.1.5.pdf.
Hsieh, P. H. (2010). Globally-perceived experiences of online instructors: a preliminary exploration. Computers & Education, 54, 27–36.
Humbert, M. (2007). Adoption of blended learning by faculty: an exploratory analysis. In M. K. McCuddy (Ed.), The challenges of educating people to lead in a challenging world (pp. 423–436). Springer.
Jeong, A. (2003). Sequential analysis of group interaction and critical thinking in online threaded discussions. The American Journal of Distance Education, 17(1), 25–43.
Jones, A. (2004). A review of the research literature on barriers to the uptake of ICT by teachers. Retrieved August 2011, from http://www.becta.org.uk.
Kang, T., & Im, T. (2013). Factors of learner–instructor interaction which predict perceived learning outcomes in online learning environment. Journal of Computer Assisted Learning, 29, 292–301.
Kim, K.-J., & Bonk, C. J. (2006). The future of online teaching and learning in higher education: the survey says. EDUCAUSE Quarterly, 29(4), 22–30.
Ke, F. (2010). Examining online teaching, cognitive, and social presence for adult students. Computers & Education, 55(2), 808–820.
Kline, R. B. (2005). Principles and practice of structural equation modeling (2nd ed.). New York: Guilford Press.
Knowlton, D. S. (2000). A theoretical framework for the online classroom: a defense and delineation of a student-centered pedagogy. New Directions for Teaching and Learning, 84, 5–14.
Kreijns, K., Kirschner, P. A., Jochems, W., & van Buuren, H. (2007). Measuring perceived sociability of computer-supported collaborative learning environments. Computers & Education, 49, 176–192.
Lamb, B. (2004). Wide open spaces: wikis, ready or not. EDUCAUSE Review, 39(5), 36–48.
Lim, K., & Lee, D. Y. (2008). A comprehensive approach to the teacher's role in computer supported learning environments. In Proceedings of the Society for Information Technology and Teacher Education International Conference, Chesapeake, VA.
Lin, S. Y., & Overbaugh, R. C. (2009). Computer-mediated discussion, self-efficacy and gender. British Journal of Educational Technology, 40(6), 999–1013.
Liu, X., Bonk, C. J., Magjuka, R. J., Lee, S., & Su, B. (2005). Exploring four dimensions of online instructor roles: a program level case study. Journal of Asynchronous Learning Networks, 9(4), 29–48.
Liaw, S. S., Huang, H. M., & Chen, G. D. (2007). Surveying instructor and learner attitudes toward e-learning. Computers & Education, 49(2), 1066–1080.
Liu, G. Z., Lo, H. Y., & Wang, H. C. (2013). Design and usability testing of a learning and plagiarism avoidance tutorial system for paraphrasing and citing in English: a case study. Computers & Education, 69, 1–14.
MacKnight, C. (2000). Teaching critical thinking through online discussions. EDUCAUSE Quarterly, 4, 38–41.
Mahdizadeh, H., Biemans, H., & Mulder, M. (2008). Determining factors of the use of e-learning environments by university teachers. Computers & Education, 51(1), 142–154.
Martindale, T., & Wiley, D. A. (2005). Using weblogs in scholarship and teaching. TechTrends, 49(2), 55–61.
Matzat, U. (2013). Do blended virtual learning communities enhance teachers' professional development more than purely virtual ones? A large scale empirical comparison. Computers & Education, 60(1), 40–51.
Mazzolini, M., & Maddison, S. (2007). When to jump in: the role of the instructor in online discussion forums. Computers & Education, 49(2), 193–213.
McDuffie, A. R., & Slavit, D. (2003). Utilizing online discussion to support reflection and challenge beliefs in elementary mathematics methods classrooms. Contemporary Issues in Technology and Teacher Education, 2(4), 447–465.
McMurtry, K. (2001). E-Cheating: combating a 21st century challenge. T.H.E. Journal, 29(4), 36–41.
Motaghian, H., Hassanzadeh, A., & Moghadam, D. K. (2013). Factors affecting university instructors' adoption of web-based learning systems: case study of Iran. Computers & Education, 61, 158–167.
Muir-Herzig, R. G. (2004). Technology and its impact in the classroom. Computers & Education, 42(2), 111–131.
Ocak, M. A. (2011). Why are faculty members not teaching blended courses? Insights from faculty members. Computers & Education, 56(3), 689–699.
Olt, M. (2002). Ethics and distance education: strategies for minimizing academic dishonesty in online assessment. Online Journal of Distance Learning Administration, 5(3). Retrieved August 27, 2014, from http://www.westga.edu/~distance/ojdla/fall53/olt53.html.
Robles, M., & Braathen, S. (2002). Online assessment techniques. The Delta Pi Epsilon Journal, 44(1), 39–49.


Roby, T., Ashe, S., Singh, N., & Clark, C. (2013). Shaping the online experience: how administrators can influence student and instructor perceptions through policy and practice. Internet and Higher Education, 17, 29–37.
Rovai, A. P. (2000). Online and traditional assessments: what is the difference? Internet and Higher Education, 3(3), 141–151.
Rovai, A. P. (2002a). A preliminary look at structural differences in sense of classroom community between higher education traditional and ALN courses. Journal of Asynchronous Learning Networks, 6(1), 41–56.
Rovai, A. P. (2002b). Development of an instrument to measure classroom community. Internet and Higher Education, 5, 197–211.
Rovai, A. P. (2007). Facilitating online discussions effectively. Internet and Higher Education, 10(1), 77–88.
Salmon, G., & Lawless, N. (2006). Management education for the twenty-first century. In C. J. Bonk, & C. Graham (Eds.), The handbook of blended learning: Global perspectives, local designs (pp. 387–399). San Francisco, CA: Pfeiffer Publications.
Schoonenboom, J. (2012). The use of technology as one of the possible means of performing instructor tasks: putting technology acceptance in context. Computers & Education, 59(4), 1309–1316.
Schoonenboom, J. (2014). Using an adapted, task-level technology acceptance model to explain why instructors in higher education intend to use some learning management system tools more than others. Computers & Education, 71, 247–256.
Shea, P., Li, C. S., & Pickett, A. (2006). A study of teaching presence and student sense of learning community in fully online and web-enhanced college courses. Internet and Higher Education, 9(3), 175–190.
Steenkamp, J. E. M., & Baumgartner, H. (1998). Assessing measurement invariance in cross-national consumer research. Journal of Consumer Research, 25(1), 78–107.
Webb, M., Gibson, D., & Forkosh-Baruch, A. (2013). Challenges for information technology supporting educational assessment. Journal of Computer Assisted Learning, 29(5), 270–279.
Wilson, B. C., Ludwig-Hardman, S., Thornam, C., & Dunlap, J. C. (2004, November). Bounded community: designing and facilitating learning communities in formal courses. The International Review of Research in Open and Distance Learning, 5(3). Retrieved August 27, 2014, from http://www.irrodl.org/index.php/irrodl/article/view/204/286.
Yeh, S. S. (2010). Understanding and addressing the achievement gap through individualized instruction and formative assessment. Assessment in Education: Principles, Policy & Practice, 17, 169–182.
Yuan, J., & Kim, C. (2014). Guidelines for facilitating the development of learning communities in online courses. Journal of Computer Assisted Learning, 30(3), 220–232.
Yusoff, N. M., & Salim, S. S. (2012). Investigating cognitive task difficulties and expert skills in e-Learning storyboards using a cognitive task analysis technique. Computers & Education, 58(1), 652–665.
Zingaro, D., & Porter, L. (2014). Peer instruction in computing: the value of instructor intervention. Computers & Education, 71, 87–96.