Journal Pre-proof

Engagement and performance when learning with technologies in upper secondary school
Nina Bergdahl, Jalal Nouri, Uno Fors, Ola Knutsson

PII: S0360-1315(19)30333-1
DOI: https://doi.org/10.1016/j.compedu.2019.103783
Reference: CAE 103783
To appear in: Computers & Education
Received Date: 27 February 2019
Revised Date: 30 August 2019
Accepted Date: 7 December 2019

Please cite this article as: Bergdahl N., Nouri J., Fors U. & Knutsson O., Engagement and performance when learning with technologies in upper secondary school, Computers & Education (2020), doi: https://doi.org/10.1016/j.compedu.2019.103783.

© 2019 Published by Elsevier Ltd.
Manuscript title: Engagement and Performance when Learning with Technologies in Upper Secondary School
Author details:
Nina Bergdahl, PhD student (corresponding author)
[email protected]
Jalal Nouri, PhD
[email protected]
Uno Fors, PhD, DDS, Professor
[email protected]
Ola Knutsson, PhD
[email protected]

Department of Computer and Systems Sciences (DSV), Stockholm University, Nodhuset, Box 7003, S-164 07 Kista, Sweden
Acknowledgements This study was funded by Stockholm City, Sweden, as a part of the research conducted for the “I use IT” project.
Declaration of Interest statement None of the authors know of any conflicts of interest that need to be disclosed.
Engagement and Performance when Learning with Technologies in Upper Secondary School

Abstract: Students need to engage in order to learn. As digitalisation changes the conditions for learning, it is essential to consider how student engagement might be affected. This study explores the relationship between students’ level of engagement in technology-enhanced learning (TEL) and academic outcomes. More specifically, we developed and validated an instrument, LET (Learner–Technology–Engagement), using principal component analysis and confirmatory factor analysis, and distributed it to second- and third-year upper secondary school students. We then matched student responses (n = 410) with their school grades. Using a bivariate correlation test, a one-way ANOVA test, and a post hoc test, we analysed the associations between low-, average-, and high-performance students and their reported engagement and disengagement when learning with technologies. The analysis reveals that high-performance students find it easier to concentrate when working with learning technologies than do average and low performers. We also found significant correlations between low grades and reported time spent on social media and streaming media for purposes other than learning (e.g., YouTube). There were also significant correlations between a decrease in students’ performance and the occurrence of unauthorised multi-tasking via learning technologies while in class: the lower the grades, the more frequently students reported using digital technologies to escape when lessons were boring. In conclusion, high-performance students seem to develop strategies to use digital technologies in supportive and productive ways. Thus, if schools are to use digital technologies in ways that ensure disadvantaged students do not remain disadvantaged when learning with technologies, and that do not replicate problems from analogue classroom interactions, insight into how different performance groups engage and disengage in TEL is critical.
Keywords: student engagement, student disengagement, digital technologies, academic outcome, upper secondary school
Introduction
Digital technologies change the conditions for engagement in learning by, for instance, enabling poly-synchronous communication (Bergdahl, Fors, Hernwall, & Knutsson, 2018; Marcus-Quinn & Hourigan, 2017; Nouri, Spikol, & Cerratto-Pargman, 2016). Building on the consensus that engagement is essential for school success, recent research has begun to explore student engagement in TEL in higher education (Groccia, 2018; Halverson, 2016; O’Brien & Toms, 2008). However, as studies that explore engagement in TEL have primarily focused on higher education, student engagement in upper secondary education appears to have been overlooked. This lack of consideration may have far-reaching consequences, as students who disengage in earlier schooling risk initiating a downward spiral that includes withdrawal, increased absenteeism, and dropping out (Finn, 1989; Tafelski, Hejnal, Maring, McDowell, & Rencher, 2017), all of which can create significant problems in later education (Department for Business Innovation and Skills, 2014; Finn & Zimmer, 2012) and lead to struggles with health in adulthood (Finn, 1989). Moreover, while extant research has focused on student engagement in traditional (analogue) classrooms, there is little consensus regarding the construct of engagement in TEL and how it is operationalised (Henrie, Halverson, & Graham, 2015). It has been pointed out
that student engagement in TEL is ill-conceived, that engagement in TEL manifests differently than engagement in traditional (analogue) classrooms (Halverson, 2016), and that common conceptualisations have ignored how digital technologies shape engagement (Ma, Cheng, & Han, 2018). Following these insights, efforts have been made to develop a conceptualisation of engagement that reflects blended learning (Halverson, 2016; Ma et al., 2018). These studies align with established engagement research, which emphasises that there is a reciprocal relationship between context and engagement (e.g., Skinner & Pitzer, 2012; Wang & Hofkens, 2019). As computers have become commonplace in secondary education, understanding how students engage and disengage when learning with technologies is critical. Even though studies have begun to explore engagement and disengagement when students learn with digital technologies, research has yet to explore this topic in an upper secondary school context (Halverson, 2016; Henrie, Bodily, Larsen, & Graham, 2018; Ma et al., 2018). Guided by the purpose of exploring how students’ self-reported engagement in TEL relates to actual performance in school, we raise the following research question:
- How are different performance groups (low, average, and high) engaging and disengaging in technology-enhanced learning?
Background

What is academic engagement and disengagement?
Engagement often refers to students’ level of involvement with and effort in learning (Fredricks et al., 2004; Skinner & Pitzer, 2012; Wang et al., 2017), and can thus be further qualified as “academic engagement” (e.g., Alrashidi et al., 2016). A number of engagement studies have adopted a general “school engagement” perspective, which, in addition to engagement in academic activities, includes students’ social interactions with peers between classes and participation in after-school activities (e.g., Wang et al., 2017). While the use of digital technologies outside a formal learning context can enable social inclusion and support, which, in turn, can affect students’ experience at school (Timmis, 2012), technologies can also be used to disengage from school and learning. Hence, when exploring academic engagement and disengagement in TEL, we found it important to separate academic engagement, which focuses on participation in academic (rather than non-academic) activities (Alrashidi et al., 2016), from general engagement with technologies to support social interaction outside of learning. While student engagement is essential for learning, it unfortunately risks declining as students advance through education (e.g., Wylie & Hodgen, 2012). Disengagement, however, is more than the absence of engagement, and includes such indicators as withdrawal (Skinner et al., 2009; Wang et al., 2017) and absenteeism, and can lead a student to drop out of school altogether (Department for Business Innovation and Skills, 2014; Tafelski et al., 2017). Even though students who risk failing and dropping out represent a cost for any educational institution in terms of both time and resources, a school report in Sweden discovered that as many as two-thirds of Swedish schools did not meet the standards (Ch. 5a–9 §§, Education Act, Sweden) for addressing and preventing student absenteeism, and that almost none of the schools had investigated the causes of student absence (Swedish School Inspectorate, 2015). Since it is well known that disengagement increases the likelihood of dropping out (Finn, 1989; Wang & Fredricks, 2014), schools should identify signs of disengagement early and take preventive measures to curb future absenteeism. While chronic absenteeism is a serious challenge that schools face (Balfanz & Byrnes, 2013), efforts to tackle absenteeism tend to focus on getting students to attend school.

Assessing students’ participation in learning activities
Finn (1992) identified three distinct student profiles: non-participants, passive participants, and active participants. He concluded that students can be disengaged even when physically present in the classroom; in fact, many students who attended class remained disengaged. When digital technologies are implemented, problems with disengagement might escalate: distracting notifications or urges to use the technologies are some of the challenges that may draw students’ attention away from learning. Therefore, identifying early indicators of disengagement may help schools to remedy the problem before it escalates to absenteeism. In their seminal work, Fredricks et al. (2004) gathered understandings of student engagement from the research literature to aggregate a conceptualisation of engagement. They proposed that engagement is a multi-dimensional construct with at least three dimensions: an emotional, a behavioural and a cognitive dimension. Recently, engagement researchers have added a social dimension (e.g., Fredricks et al., 2016; Wang et al., 2017) and suggested that student disengagement is a separate construct, distinguishable from engagement, that involves more than the mere absence of engagement (Fredricks, Reschly, & Christenson, 2019; Salmela-Aro, Upadyaya, Hakkarainen, Lonka, & Alho, 2017; Wang et al., 2017). Following Fredricks et al. (2004) and Wang et al. (2017), student engagement and disengagement can be described as follows:
• The emotional dimension includes a student’s positive reactions toward teacher instruction, peers, and learning, which can be observed in, for example, accepting instruction and expressions of joy and interest, or negative reactions, such as frustration and boredom.
• The cognitive dimension reflects a student’s investment in learning and can, for instance, be observed in his/her concentration and desire to exert the effort necessary to comprehend complex ideas and master demanding skills, or, alternatively, in simply giving up in the face of challenges.
• The behavioural dimension refers to participation and conduct. This is a spectrum of behaviour that can be observed when a student either takes action to learn (e.g., taking notes, raising a hand to answer a question) or to disrupt learning (e.g., initiating disruptive behaviour or not following classroom rules).
• The social dimension addresses either enjoying/participating in or withdrawing from collaboration or social interaction with peers (for example, this can be seen in students either helping each other or expressing that they do not feel noticed).
The above conceptualisation describes the interrelated dimensions of student engagement: behavioural, emotional, cognitive, and social. These indicators reflect learning in a traditional (analogue) classroom, and thus may not reveal whether a learning situation is technology-enhanced or not. Halverson (2016) suggested that face-to-face and online engagement, although related, are distinct constructs. Halverson contributes a conceptual framework that reflects the emotional and cognitive aspects of student engagement in TEL, in which she suggested that the cognitive dimension includes indicators reflecting, for example, time on task, attention, metacognitive strategies, effort, and
absorption. Another effort to capture and conceptualise engagement in TEL was proposed by Ma, Cheng, and Han (2018), who created an instrument that reflects the behavioural, cognitive, and emotional dimensions of student engagement in higher education. However, their proposed indicators of engagement only partially explore the dimensions of engagement and disengagement.
What do we know about engagement and disengagement in technology-enhanced learning?
TEL researchers are currently investigating new ways to conceptualise and measure student “engagement”, for example by using learning analytics to collect system log data to measure “time on tool” (Henrie et al., 2018), or by analysing “posts in a forum” (e.g., Hew, 2016) or “virtual world interactions” (Pellas, 2014). Some researchers combine these approaches with established engagement theories; for example, Henrie et al. (2018), who compared system log data with self-reports of engagement, concluded that this approach to engagement in TEL does not capture the multi-dimensionality (i.e., the emotional, behavioural, cognitive, or social aspects) of student engagement. Engagement in TEL is a growing field (Bower, 2019). However, there is no consensus among researchers on how to describe or capture it, and variations are many. For example, while Groccia (2018) has discussed “online engagement” and “online learner engagement,” others have mentioned “learners’ technology engagement” (Zhai, Dong, & Yuan, 2018), “user engagement with technology” (O’Brien & Toms, 2008), “blended learning engagement” (Halverson, 2016), “student ICT engagement” (Goldhammer, Gniewosz, & Zylka, 2016) or “student engagement in technology-mediated learning” (Henrie et al., 2015). Some contexts emerged in the dawn of computer-assisted learning, such as language learning (e.g., Parmaxi, Zaphiris, & Ioannou, 2016; Yang, 2011) and computer-supported collaborative learning (e.g., Järvelä & Renninger, 2014). Later, other ways of using digital technologies for learning emerged, such as online environments and virtual worlds (e.g., Hew, 2016; Pellas, 2014), gamification (e.g., Bolliger & Martin, 2018; Zhai et al., 2018), immersive technologies, and virtual reality (e.g., Allcoat & von Mühlenen, 2018). One study used the term “ICT engagement” to address accessibility and the laptop/student ratio in and outside of school (Goldhammer et al., 2016). Other studies approach engagement in online environments (e.g., Groccia, 2018), reinforcing the need for students in online environments to engage as much as students in traditional environments in order to be successful. Yet others address engagement in TEL, which often combines face-to-face instruction with uses of digital technologies (for a review, see Henrie et al., 2015). Thus, research on engagement in TEL has begun to explore the potential of digital technologies in different contexts. For the purposes of this article, engagement in TEL refers to engagement in any learning that occurs in school through the use of digital technologies. The European Framework for the Digital Competence of Educators (Redecker & Punie, 2017) describes an educational landscape in which many (European) schools today offer their students digital technologies to mediate learning. Following Redecker and Punie (ibid.), who view digital technologies as tools and resources, we use the term digital technologies for computers, tablets, smartphones or similar devices (tools) that enable students to use and access applications (e.g., presentation, editing and assessment programs) and online resources (e.g., cloud services, forums, learning platforms) (resources). However, as digital technologies offer a variety of modalities (and thus shape interaction and engagement differently), research that focuses on one digital technology to measure student
engagement (and/or disengagement) cannot offer a comprehensive overview of students’ engagement and disengagement in learning when they learn with the full range of digital technologies in schools (as light is only shed on a small proportion of the expressions of engagement or disengagement). Acknowledgement of this problem is seen in current efforts to grasp and explore different aspects of upper secondary school students’ engagement and disengagement in TEL, focusing on uses of digital technologies as such. One study (Salmela-Aro et al., 2017) approached the potential risks of using digital technologies in schools and noted that students may use them to disengage by directing their attention away from learning, and that excessive use of digital technologies can lead to depression and school burnout (Salmela-Aro et al., 2016; Salmela-Aro et al., 2017). Yet another study (Hietajärvi et al., 2019) approached students’ digital technology use after school (from Grade 6 through higher education) and concluded that different cohorts of students used digital technologies differently. For example, while students’ communicating and social networking were related to low engagement in lower education, they were correlated with school burnout in higher education, and playing action games became progressively more negative (when associated with engagement) as students advanced through education (ibid.). While highly engaged students state that they use digital technologies to succeed academically (Hietajärvi, Salmela-Aro, Tuominen, Hakkarainen & Lonka, 2019), students who readily use digital technologies for social networking outside of school may feel inadequate and isolated at school if such tools are not available (Salmela-Aro, Muotka, Hakkarainen, Alho, & Lonka, 2016). While some studies have sought to identify potential negative implications, there are plenty of studies which suggest that students are not only able to use the technologies, but also appreciate their potential (Trinder, 2015) and approach their use in learning favourably (e.g., Abidin, Mathrani, & Hunter, 2017; Rashid & Asghar, 2016). However, simply distributing technologies will not re-engage students who are disengaged in learning, as they may choose to engage with their digital tools for purposes other than learning (Bergdahl, Knutsson, & Fors, 2018; Rashid & Asghar, 2016). The present study uses an instrument (LET) specifically developed to reflect upper secondary school students’ behavioural, cognitive, emotional and social engagement and disengagement in TEL, which, to our knowledge, has remained unexplored.
Method

Context and participants

Selection criteria
With an interest in determining what experiences of engagement and disengagement in TEL upper secondary school students have, and will bring with them into higher education, we employed a purposive sampling technique (Bryman, 2016) as follows: We first selected the five largest of the national programmes leading to qualification for higher education (The Swedish National Agency for Education, 2018). As access to computers at a structural level is central for technology-enhanced learning, we then selected all schools that had participated in a 1:1 (one laptop per student) investment project and that had used a learning management system for at least ten years. This ensured that students would have experience of learning with technologies at their school. As students in Technology programmes might be more positive toward digital technologies than students in other programmes, we omitted the Technology programme from our inclusion criteria to avoid potential bias. We kept the remaining four largest programmes (see Table 1) and
approached all upper secondary schools governed by Stockholm City that offered at least two of these. Eleven of the fifteen queried schools accepted: seven inner-city schools and four suburban schools.

Table 1. Respondent demographic data

Student demographic data              n      %      Mean   SD
Number of respondents                 410
Gender
  Male                                150    37
  Female                              206    63
Programme
  Social Sciences programme           205    50.2
  Humanities programme                45     11.0
  Economics programme                 72     17.3
  Natural Sciences programme          88     21.5
Year in upper secondary education
  Second year                         179    43.7
  Third year                          231    56.3
Use the new school platform                  95.1
Average grade past school year                      3.4    0.93
Note: Grades are measured on a 6-point scale on which F (or no grade) = 0 and A = 5.
As we performed the data collection when the semesters began, students in their first year were thought to have limited experience of learning with technologies in their particular school and were therefore excluded from the selection criteria (see Table 1). Thus, we asked each of the principals to allow at least two classes of second- or third-year students to fill out the questionnaire. Of the students from the eleven schools who filled out the questionnaires, we received students’ grades (n = 410) from eight of the schools. These responses comprise our final sample.
Developing the questionnaire

The questionnaire was inductively informed by interviews
The questionnaire was informed by interviews and by engagement and disengagement theory (e.g., Fredricks et al., 2004; Wang et al., 2017). We contacted two schools with which our department had established connections and asked to interview students in upper secondary classes. Two teachers asked all students in their respective classes. Eight students were interviewed, and the material was analysed following the Thematic Analysis approach (Braun & Clarke, 2013). The interviews were semi-structured to ensure that we would cover the four dimensions of engagement and disengagement in TEL: emotional, behavioural, cognitive, and social (Fredricks et al., 2004; Wang et al., 2017). For example, the participants were asked to elaborate on how they had engaged (cognitively, emotionally, behaviourally and socially) in their learning while using digital technologies, their concerns and beliefs regarding engagement in learning situations in which digital technologies were used, as well as how different social environments and designs of learning influenced their engagement (e.g., “Can you describe your engagement when learning with digital technologies?” and “What was a learning situation with digital technologies that you felt maximised your engagement?”). The
interviews lasted between 45 and 60 minutes. All interviews were recorded and transcribed verbatim. Thematic Analysis (TA) was used to analyse the interviews (Braun & Clarke, 2012, 2013). The coding was done in NVivo version 11.4.43, following a line-by-line approach that started with coding semantic layers, after which the researcher coded for more latent content (Braun & Clarke, 2006, 2013). As the unit of analysis we used “the user in setting” (Lave, 2009) to identify instances in which students reported feeling engaged or disengaged when learning with technologies, with consideration for the behavioural, cognitive, emotional, and social dimensions (Wang et al., 2017). The coding was done iteratively across the entire data set.

Table 2. LET indicators of engagement and disengagement

Engagement
  Beh1  Uses digital technologies as a support for learning
  Beh2  Uses the Internet to research what others have written and find facts
  Beh3  Uses asynchronous media to rehearse and master content
  Beh4  Uses technologies to work on school assignments
  Cog1  Concentrates well when using digital technologies
  Cog2  Takes own initiative and decides on what digital technologies to use
  Cog3  Needs digital technologies to maximise learning
  Cog4  Uses IT as a cognitive enhancement
  Emo1  Wants to use more digital technologies for learning than what is used today
  Emo2  Perceives using digital technologies for learning as engaging
  Emo3  Relies on technologies to manage education
  Emo4  Finds creating with technologies satisfying
  Emo5  Desires spatiotemporal solutions and personalisation that technologies may offer
  Soc1  Is satisfied with using digital technologies that mediate teachers' insight into the student's learning process
  Soc2  Is satisfied that teachers use digital technologies to provide feedback
  Soc3  Feels that digital technologies are used in ways that enable participation, inclusion and belonging
  Soc4  Experiences teachers' social presence with, as well as inside, applications

Disengagement
  Dbeh1  Turns in assignments late due to unauthorised use of technologies
  Dbeh2  Delegates group work using technologies to a peer, as no more than one is required to complete the task
  Dbeh3  Is prevented from working due to undeveloped technologies and system breakdowns
  Dcog1  Chooses to use digital technologies in unauthorised ways
  Dcog2  Notifications (i.e. from a mobile phone) easily cause a distraction
  Dcog3  Is overwhelmed by information overflow
  Demo1  Is emotionally drawn to an application or digital technology
  Demo2  Uses digital technologies to escape feelings of boredom
  Demo3  Feels frustration over poor communication over the learning platform
  Demo4  Believes that teachers lack the IT skills needed to support individual learning effectively
  Demo5  Resists using the laptop for all reading
  Demo6  Resists using the laptop for all writing
  Dsoc1  Is left to manage digital technologies for learning themselves
  Dsoc2  Coming to school is perceived as meaningless when there are no interpersonal activities
  Dsoc3  Is unhappy with repeatedly being directed to learn by looking things up online
  Dsoc4  Feels upset/dispirited that group work does not involve all students
  Dsoc5  Experiences decreasing engagement due to feeling isolated while using technology
Table 2 shows an overview of the behavioural (Beh), cognitive (Cog), emotional (Emo) and social (Soc) indicators of engagement and disengagement. Questions were mapped to the relevant indicator. After the data was coded, we identified subthemes reflecting behavioural, cognitive, emotional and social engagement and disengagement. From the collection of subthemes, indicators were formulated. The authors discussed interpretations and coding to align
thinking throughout the analysis. In total, we identified 274 items that reflect student engagement in learning, although not exclusively in technology-enhanced learning. As we sought to capture how students describe their engagement and disengagement in TEL, we only retained indicators that reflect students’ engagement or disengagement when learning with technologies. Thus, to capture student-reported engagement in TEL, 49 questions were developed and mapped to the relevant indicators of engagement (ensuring an even distribution across the behavioural, emotional, cognitive, and social dimensions of engagement or disengagement) (see Table 2). This formed the basis of the questionnaire (Appendix B). The questions would be further reduced following pre-testing and validation (see below). Questions related to student demographic data and additional measures such as grades, merit rating, and IT skills (see “Additional measures”) were also included.

Pre-testing
To ensure that students in the target group would find the questions valid (without accidentally running the questionnaire with students who would later be asked to complete it), we pre-tested the questionnaire with five students who had just graduated from upper secondary school. The students were asked to fill in a printed questionnaire, mark questions they reacted to, and reflect together. The pre-testing resulted in changing questions like “It makes me upset when...” to “It makes me upset/dispirited when...”, as the students had put forward that they do not stay in an upset state; rather, a feeling of powerlessness can replace the irritation. Students also pointed out that teachers’ IT skills vary; hence, a question relating to a specific teacher’s IT skills was clarified: “I find that the IT skills of many of my teachers are not enough to match the use of digital tools in a way that supports me in my learning”. We then conducted a second pre-test, in which we asked five students (in their ninth and final year of secondary school) to complete the questionnaire online. This was followed by a reflective session in which the students expressed that some questions were similar or lengthy. To remedy this, questions that had long explanations were shortened, and the mapping against indicators was revisited to see whether any two items were so similar that they could be merged. After the pre-testing, the number of questions was reduced to 44 (see Appendix B).

Validation
While the development of the LET was informed by relevant theory (Fredricks et al., 2004; Wang et al., 2017) and interviews, and subsequently pre-tested for ecological validity, we conducted a principal component analysis (PCA), using a varimax rotation, for further validation. The Kaiser–Meyer–Olkin measure of sampling adequacy was 0.79 and Bartlett’s test of sphericity was significant (χ2 = 3713.47, df = 561, p < 0.0001), confirming the appropriateness of PCA. See Fig. 1 for the scree plot and Table 3 for the rotated eigenvalues and percentages of variance explained. As a result, three items were removed (see Appendix B). Then a confirmatory factor analysis (CFA) using AMOS, a structural equation modelling software program, was executed to confirm the construct validity of the engagement and disengagement scales. As a first step, a structural model with all items listed in Table 2 as indicators of eight latent factors was tested (four factors per scale).
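For readers who want to reproduce this type of validation step, the sketch below illustrates the sampling-adequacy checks and a varimax-rotated extraction in Python. It is a minimal illustration under our own assumptions (the factor_analyzer package and a hypothetical data frame of item responses), not the analysis code used in the study.

```python
# Minimal sketch of the adequacy checks and varimax-rotated extraction
# described above. `items` is a hypothetical DataFrame with one column per
# LET questionnaire item (Likert responses), not the study's data.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

items = pd.read_csv("let_item_responses.csv")  # hypothetical file name

# Bartlett's test of sphericity and Kaiser-Meyer-Olkin sampling adequacy
chi_square, p_value = calculate_bartlett_sphericity(items)
kmo_per_item, kmo_overall = calculate_kmo(items)
print(f"Bartlett chi2 = {chi_square:.2f}, p = {p_value:.4f}")
print(f"KMO (overall) = {kmo_overall:.2f}")

# Principal-component-style extraction with varimax rotation; eleven
# components mirrors the eigenvalue > 1 solution reported in Table 3.
fa = FactorAnalyzer(n_factors=11, rotation="varimax", method="principal")
fa.fit(items)
eigenvalues, _ = fa.get_eigenvalues()
loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(eigenvalues[:11])
print(loadings.round(2))
```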
Figure 1. Scree plot of components and eigenvalues.

Table 3. Total variance explained and percentage of variance by varimax rotation
Component   Eigenvalue   Percentage of variance   Cumulative percentage
1           4.90         14.43                    14.43
2           3.77         11.10                    25.53
3           2.49         7.35                     32.87
4           1.68         4.92                     37.80
5           1.49         4.38                     42.18
6           1.43         4.20                     46.37
7           1.20         3.52                     49.90
8           1.11         3.28                     53.17
9           1.09         3.20                     56.36
10          1.03         3.02                     59.38
11          1.00         2.97                     62.35
This model yielded a chi-square statistic that was highly significant (χ2(219) = 712.09, p < 0.0001), indicating a relatively poor fit. After removal of items with loadings below 0.4, an eight-factor model (see Fig. 2) with a total of 23 items was identified that satisfies the recommendations of Hu and Bentler (1999) (CFI = 0.901; TLI = 0.932; RMSEA = 0.063; SRMR = 0.052), indicating a good fit to the observed data.
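The confirmatory step can be approximated with open-source SEM tooling. The snippet below is a hedged sketch using the semopy package (our substitute for the AMOS software named above); the two latent factors and item names are placeholders standing in for the full eight-factor model.

```python
# Hedged sketch of a CFA fit check similar to the one reported above, using
# semopy instead of AMOS. Factor and item names are placeholders for two of
# the eight latent factors (four engagement and four disengagement factors).
import pandas as pd
import semopy

items = pd.read_csv("let_item_responses.csv")  # hypothetical file name

model_desc = """
BehEngagement =~ Beh1 + Beh2 + Beh4
CogDisengagement =~ Dcog1 + Dcog2 + Dcog3
"""

model = semopy.Model(model_desc)
model.fit(items)

# Fit indices comparable to those reported in the text (CFI, TLI, RMSEA)
stats = semopy.calc_stats(model)
print(stats[["chi2", "DoF", "CFI", "TLI", "RMSEA"]])
```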
Measures

Engagement and disengagement
To capture indicators reflecting the emotional, behavioural, cognitive and social dimensions of engagement and disengagement, we adopted a 6-point Likert scale ranging from “Not at all like me” to “Very much like me” (if nothing else is stated) (see Appendix B).

Educational aspiration and general academic engagement
Students’ educational aspiration has been shown to correlate with engagement (e.g., Wang & Eccles, 2011). Following the NSSE, a nationwide American survey on student engagement (Center for Postsecondary Research, Indiana University School of Education, 2018), and Wang and Eccles (2011), we used a one-item indicator: “If you could do whatever you like, what level of education would you like to finish?” To answer this question, students were given the options of completing upper secondary school, taking independent courses, graduating from university, or obtaining a PhD degree. We also approached student
engagement directly. Students were asked to report on their general engagement in learning on a 6-point Likert scale ranging from “never engaged” to “always very engaged” (see Appendix B).
Figure 2. The final 8-factor model (see Table 2 for item descriptions)
Academic Outcome
In this study, academic outcome refers to students’ grades. Research has shown that both engagement and disengagement may be related to grades (Balwant, 2018; Wang & Eccles, 2012), which is why it is relevant to include a measure of academic outcome. Sweden applies an A–F grading system, in which A is the highest possible grade and F is the lowest grade. The collected grades were converted into a 6-point scale (A = 5, B = 4, C = 3, D = 2, E = 1, and F or no grade = 0). Using their most recent semester grades and the 6-point scale, we calculated the students’ grade point averages (GPA). The students’ GPAs were used to categorize them into one of three performance groups. Students who scored 0.5 standard deviations (SD) above the mean GPA (M = 3.37, SD = 0.93) were assigned to the high-performance group, while those who scored 0.5 SD below the mean GPA were assigned to the low-performance group, in accordance with Hill and Wang (2014). As a result, all of the low achievers had a GPA below 2.44, the average students had a GPA between 2.45 and 3.83, and the high achievers had a GPA of 3.84 or higher. In Sweden, students gain a merit rating when they graduate from secondary school. The rating is calculated on the basis of the final grades in year nine of primary school. Apart from using student grades to reflect academic outcome, students were also asked to report on their merit rating.
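As an illustration of the grade conversion and grouping procedure described above, the sketch below converts letter grades to the 6-point scale and assigns each student to a performance group based on 0.5 SD around the mean GPA. The column names, example codes and the pandas implementation are our own assumptions, not the study's code.

```python
# Sketch of the grade conversion and performance grouping described above.
# Student codes and column names are hypothetical examples.
import pandas as pd

GRADE_POINTS = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1, "F": 0}

def gpa(letter_grades):
    """Convert a list of letter grades (F or missing counts as 0) to a GPA."""
    return sum(GRADE_POINTS.get(g, 0) for g in letter_grades) / len(letter_grades)

students = pd.DataFrame({
    "student": ["BM51", "KX07", "QT33"],                     # pseudonymised codes
    "grades": [["A", "B", "C"], ["E", "D", "F"], ["C", "C", "B"]],
})
students["gpa"] = students["grades"].apply(gpa)

mean, sd = students["gpa"].mean(), students["gpa"].std()

def performance_group(g):
    # 0.5 SD above the mean -> high; 0.5 SD below -> low; otherwise average
    if g >= mean + 0.5 * sd:
        return "high"
    if g <= mean - 0.5 * sd:
        return "low"
    return "average"

students["group"] = students["gpa"].apply(performance_group)
print(students[["student", "gpa", "group"]])
```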
Time on tool and students’ perceived IT competence
Using a 5-point Likert scale ranging from “not at all” to “very high,” we asked the students to rate how competent they felt using digital technologies for learning. We also wanted to explore to what extent the students used digital technologies for purposes other than learning, and thus asked them how many hours each week they spent gaming, browsing social media, or watching YouTube (or other streaming services), as well as how much time they spent using applications to create media/film/music/images/text (for non-school-related activities).

Ethics
All questionnaire administrators were sent information to read aloud to the students who participated in the study (see Appendix A). When conducting the online questionnaire, students had to provide consent to access the questionnaire. Students were informed of their right to withdraw at any time. Information about consent was distributed both verbally and in written form. [name of institution’s] survey software Survey and Report was used to distribute and collect the questionnaire data. Once the data was collected and matched to the corresponding grades, all names were assigned random codes (e.g., BM51) to ensure students’ anonymity.
Data Collection and Analysis
The questionnaire was distributed to 872 students in 11 upper secondary schools; 552 students from 8 schools filled out the questionnaire. When asking for students’ grades, we needed to issue several reminders. School personnel then explained that, due to heavy workload, they were late in providing us with the requested data. Although the schools that did not supply the grades did not give a reason, it is plausible that the reasons included heavy workload, limited resources or too many research requests. In the end, we gathered questionnaire responses and matching grades from 410 students (47% of the students responded). (Two schools had response rates as low as around 20%, which affects the average response rate for the whole sample; the majority of the other schools, however, had response rates above 70%.) The data was screened, and descriptive statistics were employed to demonstrate the distribution of the participants across the studied variables using SPSS version 25. Moreover, the variables were examined for skewness and kurtosis. We then conducted Spearman’s correlation tests to explore associations between the variables. Furthermore, a series of one-way ANOVA and Fisher’s Least Significant Difference (LSD) post hoc tests were conducted to compare differences and the effect of engagement in TEL on performance, between and within the performance groups.
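To make the analysis pipeline concrete, the sketch below runs a Spearman correlation, a one-way ANOVA across the three performance groups, and uncorrected pairwise t-tests as a simple stand-in for Fisher's LSD post hoc comparisons. Variable names, the example file and the SciPy-based implementation are illustrative assumptions, not the authors' SPSS procedure.

```python
# Illustrative analysis pipeline: Spearman correlation, one-way ANOVA and
# pairwise comparisons (uncorrected t-tests standing in for Fisher's LSD).
# `df` with columns "gpa", "group" and "social_media_time" is hypothetical.
from itertools import combinations

import pandas as pd
from scipy import stats

df = pd.read_csv("let_responses_with_grades.csv")  # hypothetical file name

# Spearman correlation between academic performance and an engagement variable
rho, p = stats.spearmanr(df["gpa"], df["social_media_time"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")

# One-way ANOVA across the low / average / high performance groups
groups = [g["social_media_time"].to_numpy() for _, g in df.groupby("group")]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_anova:.3f}")

# Pairwise post hoc comparisons (no correction, in the spirit of Fisher's LSD)
for a, b in combinations(df["group"].unique(), 2):
    t, p_pair = stats.ttest_ind(
        df.loc[df["group"] == a, "social_media_time"],
        df.loc[df["group"] == b, "social_media_time"],
    )
    print(f"{a} vs {b}: t = {t:.2f}, p = {p_pair:.3f}")
```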
Results

Descriptive statistics of educational programmes, performance and additional measures
We used descriptive data to explore how student performance was distributed across the educational programmes. While the Social Sciences programme had a higher percentage of low performers, the total proportion of high, average, and low performers was evenly distributed (see Table 4).
Table 4. Performance group and educational programme

Programme           Low performance   Average performance   High performance   Total (n)
Social Sciences     83 (40%)          64 (31%)              59 (29%)           205
Economics           12 (17%)          32 (44%)              28 (39%)           72
Humanities          9 (20%)           11 (24%)              25 (56%)           45
Natural Sciences    23 (26%)          23 (26%)              42 (48%)           88
Total               126 (31%)         130 (32%)             154 (38%)          410
Additional measures were used to reflect students’ educational engagement, aspirations, and outcomes, to further explore the extent to which students perceived that their IT competence was sufficient for school requirements, and to reveal how much time they spent on digital technologies (see Table 5). Students generally reported high IT skills, with time spent on (in descending order) social media, YouTube (or other streaming services), gaming, and non-school-related digital creation (the time spent using applications to create non-school-related media/film/music/images or text).

Table 5. Descriptive statistics on additional measures

Additional measures           N     Max.   Mean    SD      f
General engagement            410   6.00   4.70    0.90    27.25
Educational aspiration              4.00   2.87    0.69    8.11
Merit rating                  400          293     37.62   57.50
IT skills                           6.00   3.85    0.91    2.23
Time spent gaming                   6.00   1.61    1.10    0.47
Time spent on social media          6.00   2.83    1.34    9.08
Time spent on YouTube               6.00   2.64    1.21    2.97
Time spent creating                 6.00   1.30    0.74    3.45
Correlations between engagement in technology-enhanced learning and performance
We then used bivariate correlations to explore the relation between upper secondary school students’ self-reported engagement when learning with digital technologies and their actual academic performance, using the three performance groups. As expected, students’ reported educational aspiration (r = 0.20, p < 0.001) and overall engagement in learning (r = 0.34, p < 0.001) exhibit significant correlations with higher grades (see Table 6).
Table 6. Correlations between academic performance and engagement variables

Variable                                                                   Correlation with academic performance (1)
1 Academic performance                                                     1.00
2 Educational aspiration                                                   .20**
3 General academic engagement                                              .34**
4 Time used gaming                                                         -.04
5 Time used on social media                                                -.19**
6 Time used on YouTube                                                     -.11*
7 Time used to create                                                      -.10*
8 I feel drawn to use my mobile phone                                      -.09
9 I concentrate easily when working with digital technologies             .12*
10 The use of digital technologies makes me hand in my homework late      -.11*
11 I use digital technologies to escape                                   -.14**
12 I use digital technologies to support my learning                      .16**
13 I multitask in class by switching between games and learning           -.15**
14 It upsets me that one student in a group often does more than others   .15**
Statistical significance: ** p < 0.01, * p < 0.05
The results also show that students with higher grades more often stated that they use digital technologies to support learning (r = 0.16, p < 0.01) and that they can concentrate easily when working with digital technologies (r = 0.12, p < 0.05), but also that they more often reported the negative feeling that one student commonly did more than others during group collaborations (r = 0.15, p < 0.01). Interestingly, significant negative correlations are found between high grades and the reported time spent on social media (r = -0.19, p < 0.01), on YouTube (or other streaming services) (r = -0.11, p < 0.05), and on using applications to create non-school-related media/film/music/images or text (r = -0.10, p < 0.05). However, there is no significant correlation between performance and the time students reported spending on gaming (r = -0.04, p = 0.43). Lower grades correlate with several other ways of engaging with technologies, for example, multi-tasking in class (i.e., switching between playing games and focusing on learning) (r = -0.15, p < 0.01), using digital technologies to escape when lessons are perceived to be boring (r = -0.14, p < 0.01), and pointing to the use of digital technologies as a reason why school work is submitted late (r = -0.11, p < 0.05).
Differences between the performance groups
We then explored the differences between the performance groups (low, average, and high) in terms of engagement and disengagement in technology-enhanced learning using a one-way ANOVA test and Fisher’s LSD post hoc test.

Additional measures in technology-enhanced learning and performance groups
As expected, there are significant differences in merit rating, educational aspiration, and academic engagement in learning at the p < 0.01 level between the three performance groups. However, no significant difference was found regarding general engagement between the average-performance and low-performance groups (see Appendix C). There are significant differences between grades and several of the ways in which digital technologies were used. When measured against performance, significant differences were revealed between the performance groups in terms of reported time spent on social media (F(2,406) = 15.74, p = 0.01), time spent on YouTube, and time spent using applications to create non-school-related media/film/music/images or text (F(2,406) = 2.97, p = 0.05 and F(2,406) = 3.45, p = 0.03, respectively). The post hoc test also reveals that average performers reported spending significantly less time on social media than low performers (M = -0.36), but significantly more time than high performers (M = 0.31). Hence, social media use is related to grades: the more time students reported having spent on social media, the lower their grades were. We also assessed time spent on YouTube: high performers differed significantly from low performers in reported time spent on YouTube (M = -0.31). When comparing students’ reported time spent on creating non-school-related media/film/music/images or text, we found a significant difference between the low-performance group and the average-performance group (M = 0.23).

Behavioural (dis)engagement in technology-enhanced learning and performance groups
The ANOVA test reflected significant differences at the p < 0.01 level for certain behavioural engagement and disengagement variables, such as using IT to support learning (F(2,406) = 6.56, p = 0.01). The mean score for using IT to support learning was significantly higher for the high-performance group than for either the average-performance group (M = 0.22) or the low-performance group (M = 0.38). Overall, there
is no significant difference between the performance groups regarding whether students identified their use of digital technologies as the reason why their school work was delayed (F(2,406) = 2.89, p = 0.06). However, students in the low-performance group are significantly more likely than students in the high-performance group to admit that their use of digital technologies during class was a reason for their school work being late (M = -0.38). Thus, while it was rare for digital technologies to delay students, those who used digital technologies to the extent that they submitted their school work after the deadline were low performers.

Cognitive (dis)engagement in technology-enhanced learning and performance groups
Significant differences at the p < 0.05 level between grades and cognitive engagement and disengagement were discovered. The post hoc test revealed significant differences between the high-performance and low-performance groups (M = 0.36) in the reported ability to concentrate easily when learning with digital technologies. While high-performance students report finding it easy to concentrate when using digital technologies for learning, the findings also suggest that low performers might find it harder to concentrate under the same conditions. Moreover, the extent to which students were distracted by notifications from the digital technologies differed significantly at the p < 0.05 level between the three performance groups (F(2,406) = 3.55, p = 0.03). Interestingly, the post hoc test found that only students in the average-performance group claimed not to get distracted by notifications. Comparing the average group to the low- and high-performance groups, we determined that the more the students’ grades deviate toward either end (low: M = -0.43, high: M = -0.38), the more they perceive notifications to be distracting. One potential explanation is that while struggling students need to focus deeply in order to successfully complete a task, and high performers may prefer working with deep concentration without interruption, average performers might not need to engage deeply in order to complete the task, nor have a desire to excel above their educational goals. When comparing the performance groups’ unauthorised multi-tasking with digital technologies in class, we found that the results differed between the performance groups (F(2,406) = 4.83, p = 0.01). We identified a significant difference between the high-performance group and the low-performance group (M = -0.47), with the latter group tending to use the technologies in unauthorised ways more often.

Emotional (dis)engagement in technology-enhanced learning and performance groups
We identified indicators that differed significantly between the performance groups within the dimension of emotional engagement and disengagement. At the p < 0.01 level, we found significant differences between student performance and the use of digital technologies to escape when the lesson is perceived as boring (F(2,406) = 4.33, p = 0.01). The results of the post hoc test indicate significant differences between the high-performance group and both the average-performance group (M = -0.43) and the low-performance group (M = -0.52). There were no significant differences between the average- and low-performance groups (M = -0.09). Thus, both average and low performers tend to turn to digital technologies to escape when a lesson is perceived as boring.
Social (dis)engagement in technology-enhanced learning and performance groups
Significant differences between high, average, and low performers in social engagement and disengagement were identified, which highlights the different needs and preferences
of these groups. For example (significant at the p < 0.05 level), high performers often reported feeling frustrated or upset that one student completed more work than the others in a group (F(2,406) = 3.27, p = 0.04). The post hoc test revealed that both the average-performance (M = -0.33) and the high-performance (M = -0.49) groups differed significantly from the low-performance group with regard to this item. Hence, the higher the student’s grade, the more likely they were to be upset or frustrated when engaging in collaborative TEL. An explanation can be that low-performance students may look for confirmation and support, while average or high performers expect peers in collaborative work to be self-sufficient. Thus, they might find it unfair that struggling students should be enabled to complete tasks on the merits of others, or they may simply wish to spend their school time pursuing their own educational goals. Then, when low performers do not deliver as expected, this might cause frustration amongst the average and high performers.
Discussion
To answer our research question, “How are different performance groups (low, average, and high) engaging and disengaging in technology-enhanced learning?”, we propose that our findings are indicative of unequal possibilities that explain how low-, average- and high-performance students engage in TEL. Schools are challenged not to replicate problems in analogue classroom interactions in blended or online learning, and to ensure that disadvantaged students do not remain disadvantaged when learning with technologies. While research on engagement in digitalisation has focused on equal access to the internet and devices (Goldhammer et al., 2016), our results point out that schools must also ask themselves which students are disadvantaged, or risk being excluded, as a consequence of digitalisation. This study revealed a number of interesting results; however, the following key findings, which further nuance our research question, are especially important.

Technology use can be problematic for low- and average-performance students
While many educational institutions perceive a push toward becoming more digitalised, technology use, particularly for low-performance students, is not unproblematic. Previous research has concluded that digital technologies are used in unauthorised ways in schools (e.g., Bergdahl, Knutsson, & Fors, 2018; Rashid & Asghar, 2016), and our findings show that both the low- and average-performance groups report using digital technologies to escape when a lesson is perceived to be boring. Exploring students’ use of social media or streaming media for purposes other than learning, we found a significant correlation between the amount of time that students report spending on social media and low grades, though we cannot conclude whether low performers are drawn to social media or whether there is some other, more reciprocal relation. Nevertheless, we do know that low-performance students consider their use of digital technologies in class to be a factor in their inability to finish school work on time. In other words, browsing social media does not improve the academic engagement of low-performance students. As low- and average-performance students make up the majority of students in our sample, we can conclude that in-class distractions from digital technologies occur routinely. Indeed, the potentials of digital technologies should be exploited while disruptions that might impede engagement and learning are managed. Additionally, we found significant correlations between certain performance groups and various types of uses of digital technologies, such as watching YouTube and using digital technologies to create non-school-related media/film/music/images or text. While average- and high-performance
students tend to engage in one or the other, low-performance students were significantly overrepresented in engaging in all of these behaviours. This suggests that low-performance students are particularly vulnerable and less capable of resisting the urge to use digital technologies, and hence may require more support to develop their abilities to self-regulate and remain focused on their learning (Järvelä & Renninger, 2014). These findings corroborate previous research, which has found that the rise in adolescents’ media multi-tasking might be associated with low performance (Lin & Parsons, 2018; Ophir, Nass, & Wagner, 2009). On the other hand, we found no correlation between the time students reported spending on gaming, low grades, and low levels of engagement. However, when games were used during class (e.g., switching between playing games and focusing on learning), this was significantly correlated with low grades.

High-performance students use digital technologies productively for learning
Our findings show that high-performance students seem to have developed strategies to use digital technologies in supportive and productive ways. This means that they appear to have the ability to resist urges and avoid distractions from technologies, and that they have developed digital competence that supports them in using technology for learning in productive ways. For example, the results suggest that the more students’ grades deviate toward either end (low or high performance), the more they perceive notifications to be distracting. This indicates that low- and high-performance students may respond differently to such distractions: thus, high-performance students’ strategies are characterised both by what they do and by what they do not do. High-performance students reported using digital technologies to support learning, finding it easy to concentrate when working with digital technologies, and expecting that all students should contribute equally in group work. At the same time, the higher-performance students reported that they did not spend as much time using their digital technologies for non-school-related activities like social media, media streaming or even using applications to create non-school-related media/film/music/images or text. These findings are in line with previous research that has shed light on the digital competence for learning that higher education students make use of favourably (Henrie et al., 2015; Rashid & Asghar, 2016; Nouri, 2018). Thus, although average and low performers have developed ways of engaging with digital technologies in their everyday life, they may need to develop their digital competence, similar to the high performers, to better succeed in school. In addition, our findings imply that high performers reported higher levels of frustration with group work. Although Wang et al. (2017) have suggested that a social dimension of engagement is reflected by, for example, enjoying collaboration or social interaction with peers, the social engagement offered by schools seems to frustrate high performers if they perceive that they need to do more than their peers, or on behalf of their peers. In this study, we found that high performers were not less engaged when working alone; on the contrary. This is a challenge for teachers, and it underlines the importance of establishing inclusive values as well as informed design of collaborative learning activities to ensure all students’ active participation in learning (e.g., Järvelä & Renninger, 2014).
Implications for pedagogical practices
Currently, digitalisation is spreading across educational institutions worldwide (Redecker & Punie, 2017). We argue that teachers and researchers need to develop a more critical approach toward digital technologies, as their potentials are often glorified and marketed with visionary examples of how students become more engaged and learn more (Cuban,
1986). While this is touched on in the European Framework (Redecker & Punie, 2017), our finding that students’ engagement and disengagement when learning with technologies are complex urges us to recommend that educational stakeholders (teachers, researchers and schools) further develop students’ digital competence, which can facilitate productive and self-regulative behaviour when using technologies for learning. These findings, and previous research, suggest that if schools do not guide their students in how to use technologies for learning, students might not develop self-regulatory strategies and consequently may direct their engagement elsewhere (Salmela-Aro et al., 2017). Researchers need to conduct further investigations of why certain student groups disengage when learning with technologies. We believe that the instrument developed in this study could be used to explore early signs of disengagement. It can be one of the ways in which educational institutions and practitioners gain insight into student engagement and disengagement in TEL, and could thus enable timely and targeted interventions. The Horizon Report (Johnson et al., 2016) also suggested that some disengagement can be prevented by using technologies to provide a more flexible and personalised learning environment, which is in line with the preferences declared by both average and low performers in this study.
Limitations of the study
The education system in Sweden applies free school choice. This means that students are free to enrol in schools outside traditional catchment areas. Despite this, there are geographical limitations to the study, as a city and its surrounding suburbs still do not reflect the more rural parts of the country. Our study is further limited by the fact that all of the schools were selected from within a single city, even though Stockholm City is the largest city in Sweden. In conclusion, the study is not generalisable across all upper secondary students, but would reflect upper secondary students’ engagement and disengagement in TEL in similar educational settings. Other limitations are that we examined engagement using students’ self-reports, which are post-reflections rather than “in the moment” data, and that some dimensions in the 8-factor model were captured using only two items (see Fig. 2). However, the challenges of capturing separate indicators of the global constructs are known in the field (cf. Wang et al., 2017). While our results offer a contribution to the growing engagement and disengagement theories, more research is needed. For instance, data collected from multiple sources would potentially enrich the results.
Conclusions
As digitalisation becomes increasingly prevalent in education, new conditions for engagement emerge. Thus, we identified differences between low-, average-, and high-performance students' engagement and disengagement in TEL, which reflect current problems in education.
Our key findings indicate that:
• Both average- and low-performance students use digital technologies to escape when a lesson is perceived to be boring;
• Social media use correlates with low grades;
• Low performers cannot resist the distractive pull of digital technologies;
• High-performance students seem to have developed digital competence to engage with technology productively for learning purposes;
• That which is disengaging for low performers may be engaging for high performers in TEL.
Contribution
We used the LET instrument to approach student engagement and disengagement in TEL. We did this by developing the LET instrument, which is itself one of the contributions of this paper, and by gathering self-reports from students in schools that had implemented digital technologies at the very minimum in the form of individual teacher and student laptops, a learning management system, and Internet access. Analysing the results, we believe that this instrument can be used to understand how students engage and disengage when learning with technologies, and for the early identification of students at risk of receiving low grades or dropping out of school. The study also contributes to the conceptualisation of student engagement and disengagement in TEL. While previous research has often approached either engagement or disengagement, mostly in analogue settings and aimed at higher education, this study explored indicators reflecting upper secondary students' academic engagement and disengagement in TEL. Thus, this study also contributes insights into how low, average, and high performers in upper secondary school differ with regard to their level of engagement when learning with technologies. The insights gained can inform school leaders and decision makers at the upper secondary level in combating disengagement, as well as inform higher education institutions about existing challenges and preferences related to student engagement and disengagement in TEL.
Future research
As this study focused on identifying differences between performance groups, more research is needed to explore individual underlying factors, such as students' emotions and reasoning, to advance investigations of why certain student groups use technology to escape from learning, and of how we can provide learning environments that better fit these students' needs and preferences. Such research could help provide timely, targeted interventions to promote student engagement and combat disengagement. Another avenue for future work is to further investigate the digital skills (for learning) that high-performance students make use of in order to support productive and self-regulated learning with technologies.
References
Abidin, Z., Mathrani, A., & Hunter, R. (2017). Student Engagement with Technology Use in Mathematics Education: An Indonesian Secondary School Context. In Twenty-first Pacific Asia Conference on Information Systems, PACIS 2017 (p. 165).
Allcoat, D., & von Mühlenen, A. (2018). Learning in virtual reality: Effects on performance, emotion and engagement. Research in Learning Technology, 26, 1–13. https://doi.org/10.25304/rlt.v26.2140
Alrashidi, O., Phan, H. P., & Ngu, B. H. (2016). Academic Engagement: An Overview of Its Definitions, Dimensions, and Major Conceptualisations. International Education Studies, 9(12), 41. https://doi.org/10.5539/ies.v9n12p41
Balwant, P. T. (2018). The meaning of student engagement and disengagement in the classroom context: lessons from organisational behaviour. Journal of Further and Higher Education, 42(3), 389–401. https://doi.org/10.1080/0309877X.2017.1281887
Bergdahl, N., Fors, U., Hernwall, P., & Knutsson, O. (2018). The Use of Learning Technologies and Student Engagement in Learning Activities. Nordic Journal of Digital Literacy, 13(2), 113–130. https://doi.org/10.18261/ISSN.18919-943x-201802-04
Bergdahl, N., Knutsson, O., & Fors, U. (2018). Designing for Engagement in TEL – a Teacher-Researcher Collaboration. Designs for Learning, 10(1), 100–111. https://doi.org/10.16993/dfl.113
Bolliger, D. U., & Martin, F. (2018). Instructor and student perceptions of online student engagement strategies. Distance Education, 39(4), 568–583. https://doi.org/10.1080/01587919.2018.1520041
Bower, M. (2019). Technology-mediated learning theory. British Journal of Educational Technology, 50(3), 1035–1048. https://doi.org/10.1111/bjet.12771
Braun, V., & Clarke, V. (2013). Successful Qualitative Research: A Practical Guide for Beginners. London, UK: SAGE Publications.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
Braun, V., & Clarke, V. (2012). Thematic Analysis. In E. Lyons & A. Coyle (Eds.), APA Handbook of Research Methods in Psychology: Volume 2 (2nd ed., Vol. 2, pp. 57–71). Washington, DC: American Psychological Association. https://doi.org/10.1037/13620-004
Center for Postsecondary Research, Indiana University School of Education. (2018). NSSE - National Survey of Student Engagement. Retrieved November 23, 2018, from http://nsse.indiana.edu/
Cuban, L. (1986). Teachers and machines: The classroom use of technology since 1920. New York, NY: Teachers College Press.
Department for Business Innovation and Skills. (2014). Learning from Futuretrack: Dropout from higher education. BIS Research Paper No. 168. London, UK.
Finn, J. D., & Zimmer, K. S. (2012). Student engagement: What is it? Why does it matter? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 97–131). Boston, MA: Springer.
Finn, J. D. (1989). Withdrawing from School. Review of Educational Research, 59(2), 117–142. https://doi.org/10.3102/00346543059002117
Fredricks, J., Blumenfeld, P., & Paris, A. (2004). School Engagement: Potential of the Concept, State of the Evidence. Review of Educational Research, 74(1), 59–109. https://doi.org/10.3102/00346543074001059
Fredricks, J. A., Reschly, A. L., & Christenson, S. L. (Eds.). (2019). Handbook of Student Engagement Interventions: Working with Disengaged Students. Elsevier.
Goldhammer, F., Gniewosz, G., & Zylka, J. (2016). ICT Engagement in Learning Environments. In S. Kuger (Ed.), Assessing Contexts of Learning, Methodology of Educational Measurement and Assessment (pp. 331–351). Springer International Publishing. https://doi.org/10.1007/978-3-319-45357-6_13
Groccia, J. E. (2018). What Is Student Engagement? New Directions for Teaching and Learning, 2018(154), 11–20. https://doi.org/10.1002/tl.20287
Halverson, L. R. (2016). Conceptualizing Blended Learning Engagement. Doctoral dissertation, Brigham Young University, USA.
Henrie, C. R., Bodily, R., Larsen, R., & Graham, C. R. (2018). Exploring the potential of LMS log data as a proxy measure of student engagement. Journal of Computing in Higher Education, 30(2), 344–362. https://doi.org/10.1007/s12528-017-9161-1
Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education, 90, 36–53. https://doi.org/10.1016/j.compedu.2015.09.005
Hew, K. F. (2016). Promoting engagement in online courses: What strategies can we learn from three highly rated MOOCS. British Journal of Educational Technology, 47(2), 320–341. https://doi.org/10.1111/bjet.12235
Hietajärvi, L., Salmela-Aro, K., Tuominen, H., Hakkarainen, K., & Lonka, K. (2019). Beyond screen time: Multidimensionality of socio-digital participation and relations to academic well-being in three educational phases. Computers in Human Behavior, 93, 13–24. https://doi.org/10.1016/J.CHB.2018.11.049
Hill, N. E., & Wang, M.-T. (2014). From Middle School to College: Developing Aspirations, Promoting Engagement, and Indirect Pathways From Parenting to Post High School Enrollment. https://doi.org/10.1037/a0038367.supp
Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55. https://doi.org/10.1080/10705519909540118
Järvelä, S., & Renninger, A. K. (2014). Designing for learning: Interest, motivation, and engagement. In R. K. Sawyer (Ed.), Cambridge handbook of the learning sciences (2nd ed., pp. 6468–6685). New York, NY: Cambridge University Press.
Johnson, L., Becker, S. A., Cummins, M., Estrada, V., Freeman, A., & Hall, C. (2016). NMC Horizon Report: 2016 Higher Education Edition. Retrieved February 13, 2019, from https://www.sconul.ac.uk/sites/default/files/documents/2016-nmc-horizonreport-he-EN-1.pdf
Lave, J. (2009). The practice of learning. In K. Illeris (Ed.), Contemporary theories of learning: Learning theorists… in their own words (pp. 200–208). New York, NY: Routledge.
Lin, L., & Parsons, T. D. (2018). Ecologically Valid Assessments of Attention and Learning Engagement in Media Multitaskers. TechTrends, 62, 518–524. https://doi.org/10.1007/s11528-018-0311-8
Ma, J., Cheng, J., & Han, X. (2018). Initial development process of a student engagement scale in blended learning environment. Proceedings – 6th International Conference of Educational Innovation Through Technology, EITT 2017, 234–237. https://doi.org/10.1109/EITT.2017.63
Marcus-Quinn, A., & Hourigan, T. (Eds.). (2017). Handbook on Digital Learning for K-12 Schools. Springer International Publishing. https://doi.org/10.1007/978-3-319-33808-8
Nouri, J., Spikol, D., & Cerratto-Pargman, T. (2016). A Learning Activity Design Framework for Supporting Mobile Learning. Designs for Learning, 8(1), 1–12. https://doi.org/10.16993/dfl.67
O'Brien, H. L., & Toms, E. G. (2008). What is user engagement? A conceptual framework for defining user engagement with technology. Journal of the American Society for Information Science and Technology, 59(6), 938–955. https://doi.org/10.1002/asi.20801
Ophir, E., Nass, C., & Wagner, A. D. (2009). Cognitive control in media multitaskers. Proceedings of the National Academy of Sciences, 106(37), 15583–15587. https://doi.org/10.1073/pnas.0903620106
Parmaxi, A., Zaphiris, P., & Ioannou, A. (2016). Enacting artifact-based activities for social technologies in language learning using a design-based research approach. Computers in Human Behavior, 63, 556–567. https://doi.org/10.1016/j.chb.2016.05.072
Pellas, N. (2014). The influence of computer self-efficacy, metacognitive self-regulation and self-esteem on student engagement in online learning programs: Evidence from the virtual world of Second Life. Computers in Human Behavior, 35, 157–170. https://doi.org/10.1016/j.chb.2014.02.048
Rashid, T., & Asghar, H. M. (2016). Technology use, self-directed learning, student engagement and academic performance: Examining the interrelations. Computers in Human Behavior, 63, 604–612. https://doi.org/10.1016/j.chb.2016.05.084
Redecker, C., & Punie, Y. (2017). European Framework for the Digital Competence of Educators: DigCompEdu. JRC Science for Policy Report. Seville, Spain. https://doi.org/10.2760/159770
Salmela-Aro, K., Muotka, J., Hakkarainen, K., Alho, K., & Lonka, K. (2016). School Burnout and Engagement Profiles among Digital Natives in Finland: A person-oriented approach. European Journal of Developmental Psychology, 13(6), 704–718. https://doi.org/10.1080/17405629.2015.1107542
Salmela-Aro, K., Upadyaya, K., Hakkarainen, K., Lonka, K., & Alho, K. (2017). The Dark Side of Internet Use: Two Longitudinal Studies of Excessive Internet Use, Depressive Symptoms, School Burnout and Engagement Among Finnish Early and Late Adolescents. Journal of Youth and Adolescence, 46(2), 343–357. https://doi.org/10.1007/s10964-016-0494-2
Skinner, E. A., & Pitzer, J. R. (2012). Developmental Dynamics of Student Engagement, Coping, and Everyday Resilience. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 21–44). https://doi.org/10.1007/978-1-4614-2018-7_2
Skolverket [The Swedish National Agency for Education]. (2018). Preliminär statistik om sökande till gymnasieskolan [Preliminary statistics on applications to upper secondary school] 2018/19. Retrieved February 26, 2019, from https://www.skolverket.se/skolutveckling/statistik/arkiveradestatistiknyheter/statistik/2018-08-23-preliminar-statistik-om-sokande-tillgymnasieskolan-2018-19
Swedish Schools Inspectorate. (2015). Gymnasieskolors arbete med att förebygga studieavbrott [Upper secondary schools' dropout prevention efforts]. Stockholm, Sweden.
Tafelski, J. J., Hejnal, T., Maring, C., McDowell, G., & Rencher, C. (2017). The cost of disengagement: Examining the real story of absenteeism in two Michigan counties. Dissertation Abstracts International Section A: Humanities and Social Sciences, 77.
Timmis, S. (2012). Constant companions: Instant messaging conversations as sustainable supportive study structures amongst undergraduate peers. Computers & Education, 59, 3–18. https://doi.org/10.1016/j.compedu.2011.09.026
Trinder, R. (2015). Blending technology and face-to-face: Advanced students' choices. ReCALL, 28(1), 83–102. https://doi.org/10.1017/S0958344015000166
Wang, M.-T., & Eccles, J. S. (2011). Adolescent Behavioral, Emotional, and Cognitive Engagement Trajectories in School and Their Differential Relations to Educational Success. Journal of Research on Adolescence. https://doi.org/10.1111/j.1532-7795.2011.00753.x
Wang, M.-T., & Eccles, J. S. (2012). Adolescent Behavioral, Emotional, and Cognitive Engagement Trajectories in School and Their Differential Relations to Educational Success. Journal of Research on Adolescence, 22(1), 31–39. https://doi.org/10.1111/j.1532-7795.2011.00753.x
Wang, M.-T., & Hofkens, T. L. (2019). Beyond Classroom Academics: A School-Wide and Multi-Contextual Perspective on Student Engagement in School. Adolescent Research Review. https://doi.org/10.1007/s40894-019-00115-z
Wang, M.-T., Fredricks, J., Ye, F., Hofkens, T., & Linn, J. S. (2017). Conceptualization and Assessment of Adolescents' Engagement and Disengagement in School. European Journal of Psychological Assessment, 1–15. https://doi.org/10.1027/1015-5759/a000431
Wang, M.-T., & Fredricks, J. A. (2014). The Reciprocal Links Between School Engagement, Youth Problem Behaviors, and School Dropout During Adolescence. Child Development, 85(2), 722–737. https://doi.org/10.1111/cdev.12138
Wylie, C., & Hodgen, E. (2012). Trajectories and patterns of student engagement: Evidence from a longitudinal study. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 585–599). Boston, MA: Springer.
Yang, Y. F. (2011). Engaging students in an online situated language learning environment. Computer Assisted Language Learning, 24(2), 181–198. https://doi.org/10.1080/09588221.2010.538700
Zhai, X., Dong, Y., & Yuan, J. (2018). Investigating Learners' Technology Engagement – A Perspective from Ubiquitous Game-Based Learning in Smart Campus. IEEE Access, 6, 10279–10287. https://doi.org/10.1109/ACCESS.2018.2805758
Appendix A “Information to students” – letter sent to questionnaire administrators (Translated into English by the first author)
Hello students,
You have been randomly selected to participate in the research project "I use IT", conducted by [name of institution] on behalf of [project and funding]. There are approximately 400–500 participants. The aim of this research is to gain a nuanced understanding of how IT affects students' academic engagement. According to ethical guidelines, information about the study must be provided before obtaining your consent. You will find the project information online when accessing the questionnaire. It is only when you click to proceed with the questionnaire that you agree to participate. Your responses are important to us, because only by your sharing how you perceive your learning can the opportunities and uses of IT be developed to most benefit the students of tomorrow. Thank you for considering participating in this study.
Instructions
The survey consists of two parts. In the first part, please provide basic facts about yourself. In the second part, you will be asked to answer or respond to various questions or statements. Your school is in the process of changing learning platform. While a few initial questions refer to this change, the rest of the questions concern your general experience of learning with digital technologies. When answering the questions, please respond based on your entire upper secondary school period; do not just think about the period during which you started using a new learning platform (if applicable). Do not stop and think about each question; instead, choose the option that you first believe to be most suitable.
For example: "I know how to use a computer/mobile/iPad to be effective in my learning."
Response options: Not at all like me / To some extent like me / To most extent like me / Very much like me
Appendix B
Questionnaire on student engagement and disengagement in technology-enhanced learning
Unless otherwise stated, items were answered on a six-point Likert scale. Answer by selecting the one of the 6 alternatives that matches you best: "Not at all like me", "Most often not like me", "Often not like me", "Sometimes like me", "Often like me", "Very much like me".
Engagement
1. BEH 1 I use digital technologies (mobile phone / computer / tablet, etc.) to support my learning.
2. BEH 2 When I do school work, I go online and read what others have written and seek facts.
3. BEH 3 I usually read / watch what my teacher publishes (PowerPoints, movies, links to webpages).
4. BEH 4 It positively affects my engagement if I can autonomously switch between Internet, Word, calendar, e-mail, learning platform (or similar) to complete my school work.
5. COG 1 I can easily concentrate when using a mobile phone / computer / tablet or other digital technologies for teaching and learning.
6. COG 2 I take the initiative to also use other IT resources (e.g., sites / programs / apps / technologies) in my learning in addition to those that the teachers recommend.
7. COG 3 In order for me to maximize my learning, I need to have access to a mobile phone / computer / tablet.
8. COG 4 I use different functions, for example the memory, in my mobile phone / computer / tablet instead of remembering everything myself.
9. EMO 1 I think we should use digital technologies to support learning to a much greater extent than we currently do.
10. EMO 2 Overall, digital technologies in school increase my engagement in learning.
11. EMO 2 It is fun and makes me engaged when teachers use several different digital technologies to support our learning during a lesson.
12. EMO 3 To me, it is very important that the technologies (applications, internet, mobile phone / computer / tablet) work, to optimize my learning.
13. EMO 4 When we get to create digitally, I experience great joy as I am allowed to express my creativity.
14. EMO 5 My engagement in school work would increase if IT were used to personalise the content.
15. EMO 5 Even if one is aiming for 100% presence, situations may arise that lead to a certain absence during the school year. Think about your absence when you respond to this statement: "If I could attend lessons from home / from elsewhere, I would miss fewer lessons."
16. SOC 1 I like that, when we use the computers to complete group work, the teachers can view our progress and access the application as well (OneDrive, Google Drive, Wiki, shared workspace etc.).
17. SOC 2 I am satisfied with my teachers' use of digital technologies (e.g., e-mail / Office365 / Socrative) to keep track of my progress / give feedback.
18. SOC 2 I prefer it when my teachers give me feedback digitally instead of on a written paper.
19. SOC 3 My experience is that we are given instructions that involve everyone when we have group work and use digital technologies.
20. SOC 3 In my school, we use computers / tablets / mobile phones in a way that prevents social exclusion in the class.
21. SOC 4 My teacher is present in the online places I am / I log on to (for example, websites, learning platforms, virtual worlds, forums, digital learning materials).
22. SOC 4 My teachers know what applications and programs I use in my learning.
Disengagement
23. DBEH 1 The time I spend on YouTube / games / social media or similar during class is the reason why I finish my school work later than I should.
24. DBEH 2 When we work in a group, we usually ask the student who is best at using the technology (i.e., the app / program) to do the work (e.g., layout, design, or editing).
25. DBEH 3 In my opinion, the digital resources are of too low quality (for example, they are buggy, reboot or freeze); in short, they are not as developed as they could be.
Answer by selecting one of the 5 alternatives: never, a lesson per month, a lesson per week, a lesson per day, every lesson.
26. DCOG 1 How often would you estimate that you are switching between playing games / spending time on social media / watching YouTube (or similar) and writing down the most necessary notes ...
27. DCOG 2 I lose my concentration if I receive a notification on my mobile phone / computer / tablet.
28. DCOG 3 My teacher often posts movies and links to good resources. When I look at all the material, it feels like I am drowning in information.
29. DEMO 1 I find it hard to stay away from the mobile phone / computer / tablet during lessons.
30. DEMO 2 I use social media / YouTube / online browsing or fiddle on my mobile phone to escape when a lesson is boring.
31. DEMO 3 I get frustrated that the information changes without me noticing it.
32. DEMO 3 When my teachers publish material about the lessons (links, PowerPoints, film, etc.), I get frustrated because it is unclear what the teacher expects me to know for upcoming tests.
33. DEMO 4 I find that my teachers' IT competence does not match the use of digital tools to an extent that supports me in my learning.
34. DEMO 5 I do not want to read everything off the computer screens / tablets all the time. Therefore, my engagement is negatively affected when I have to.
35. DEMO 6 I do not want to write everything on the computers / tablets all the time. Therefore, my engagement is negatively affected when I have to.
36. DSOC 1 I have to learn how applications / computer programs work myself.
37. DSOC 1 We, the students, are often more knowledgeable about IT than the teacher, and are therefore left to decide how technologies should be used for learning.
38. DSOC 2 As we are merely instructed to sit alone and search the Internet, I would rather do the school assignments at home than in the classroom.
39. DSOC 3 I am often encouraged to find knowledge by "searching the internet" and I want the lesson to be more thought-through (deliberate) than that.
40. DSOC 4 It makes me feel upset / resigned that the same students repeatedly contribute less to the group work, even though we have computers and can easily share the work.
41. DSOC 5 My engagement decreases when working alone on the computers / tablets.
The following three questions reflect the three items that were removed:
42. I used to use my mobile phone / computer / tablet more for fun, but now I use it more for school work.
43. I am so used to working with IT (computer, tablet, mobile, or other) that I do not reflect upon doing it.
44. When our class gets access to digital technologies that we have few of (e.g., Google Glasses, 3D printers, robots, VR / AR, etc.), I rarely get the opportunity to try them.
Appendix C
ANOVA test of variance between performance groups
Merit rating: Between groups SS = 127461.00, MS = 63730.50; Within groups SS = 449984.60, MS = 1108.34; F = 57.50, Sig. = 0.00
Aspiration: Between groups SS = 7.19, MS = 3.60; Within groups SS = 180.00, MS = 0.44; F = 8.11, Sig. = 0.00
General engagement: Between groups SS = 38.82, MS = 19.41; Within groups SS = 289.18, MS = 0.71; F = 27.25, Sig. = 0.00
Time on social media: Between groups SS = 31.48, MS = 15.74; Within groups SS = 703.87, MS = 1.73; F = 9.08, Sig. = 0.00
Time on YouTube: Between groups SS = 8.70, MS = 4.35; Within groups SS = 594.60, MS = 1.46; F = 2.97, Sig. = 0.05
Time creating: Between groups SS = 3.71, MS = 1.86; Within groups SS = 218.30, MS = 0.54; F = 3.45, Sig. = 0.03
Notifications make me distracted: Between groups SS = 14.74, MS = 7.37; Within groups SS = 842.44, MS = 2.07; F = 3.55, Sig. = 0.03
I concentrate easily when working with digital technologies: Between groups SS = 9.01, MS = 4.50; Within groups SS = 558.40, MS = 1.38; F = 3.27, Sig. = 0.04
IT makes me hand in my homework late: Between groups SS = 11.10, MS = 5.55; Within groups SS = 785.09, MS = 1.93; F = 2.87, Sig. = 0.06
I use digital technologies to escape when the lesson is boring: Between groups SS = 21.99, MS = 11.00; Within groups SS = 1030.15, MS = 2.54; F = 4.33, Sig. = 0.01
I use IT to support my learning: Between groups SS = 10.69, MS = 5.34; Within groups SS = 330.59, MS = 0.81; F = 6.56, Sig. = 0.00
I multitask by both playing games and learning: Between groups SS = 15.78, MS = 7.89; Within groups SS = 664.10, MS = 1.64; F = 4.83, Sig. = 0.01
It upsets me that one student often does more than the rest in a group: Between groups SS = 17.11, MS = 8.55; Within groups SS = 734.15, MS = 1.81; F = 4.73, Sig. = 0.01
Note. SS = Sum of Squares; MS = Mean Square; df between groups = 2 and df within groups = 406 for all variables.
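For readers who wish to run a comparable analysis on their own data, the sketch below illustrates how a one-way ANOVA across three performance groups, followed by a pairwise post hoc comparison, could be computed. This is an illustrative example only, not the authors' original analysis script: the file name and column names (let_responses.csv, performance_group, concentrate_easily) are hypothetical, and Tukey HSD is shown as one common post hoc choice, not necessarily the one used in this study.

import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical input: one row per student, with the performance group
# (low / average / high) and the Likert-coded item responses (1-6).
df = pd.read_csv("let_responses.csv")

# One-way ANOVA: does the mean of the item differ between the three groups?
groups = [g["concentrate_easily"].dropna() for _, g in df.groupby("performance_group")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

# Post hoc (Tukey HSD) pairwise comparisons between the three groups.
clean = df.dropna(subset=["concentrate_easily", "performance_group"])
print(pairwise_tukeyhsd(clean["concentrate_easily"], clean["performance_group"]))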
Highlights
• We developed and validated an instrument to measure student engagement and disengagement in technology-mediated learning.
• The instrument was used to identify early indicators of engagement and disengagement when students learn with technologies.
• We found that high-performance students have the competencies needed to use technologies to engage in their learning.
• However, using digital technologies for learning can be problematic for low- and average-performance students.