A multimodal measure of cultural intelligence for adolescents growing up in culturally diverse societies


International Journal of Intercultural Relations 72 (2019) 109–121




Miriam Schwarzenthal a,⁎, Linda P. Juang a, Maja K. Schachner a,b, Fons J. R. van de Vijver c,d,e,f,1

a University of Potsdam, Germany
b College for Interdisciplinary Educational Research, Germany
c Tilburg University, the Netherlands
d North-West University, South Africa
e University of Queensland, Australia
f Higher School of Economics, Russia

⁎ Corresponding author at: Inclusive Education, Karl-Liebknecht-Straße 24–25, House 31, Room 1.10, 14476 Potsdam, Germany. E-mail address: [email protected] (M. Schwarzenthal).
1 Deceased June 1, 2019.

Keywords: Cultural intelligence; Intercultural competence; Adolescents

Abstract


Adolescents growing up in culturally diverse societies need to develop intercultural competence. To better understand how to develop intercultural competence, we need measures specifically relating to the everyday intercultural experiences of adolescents. However, few measures of intercultural competence are available for this target group. Based on the cultural intelligence (CQ) model (Earley & Ang, 2003), we developed a measure that combines a self-report questionnaire and situational judgment tests (SJTs). The latter comprise brief descriptions of intercultural situations, followed by questions asking the adolescents to interpret and provide a reaction to the situations. The reliability, factor structure, measurement equivalence, and validity of the new measure were tested in two samples of adolescents in culturally diverse regions in North Rhine-Westphalia (N = 631, 48% female, Mage = 13.69 years, SDage = 1.83) and Berlin (N = 1,335, 48% female, Mage = 14.69 years, SDage = 0.74) in Germany. The self-report CQ scale showed good reliability and a four-dimensional factor structure with a higher-order CQ factor. The responses to the SJTs were coded based on a coding manual, and the ratings loaded onto one factor. The measurement models showed metric to scalar measurement equivalence across immigrant background, gender, and grade. The CQ factor and the SJT factor were positively correlated with each other, as well as with related constructs such as openness, perspective-taking, and diversity beliefs. We conclude that the new measure offers a reliable and valid method to assess the intercultural competence of adolescents growing up in culturally diverse societies.

Introduction

Worldwide migration is increasing and, as a result, many societies around the globe are becoming more culturally diverse (United Nations, 2017). In these societies, intercultural competence is crucial to ensure that the human rights of individuals from all cultural backgrounds are acknowledged and respected. As today’s adolescents will become tomorrow’s citizens, intercultural competence should already be fostered from an early age (Barrett, Byram, Lázár, Mompoint-Gaillard, & Philippou, 2013).





However, research on intercultural competence among adolescents in culturally diverse societies, and measures assessing this competence, are scarce. The majority of intercultural competence measures were developed for adults in work or study-abroad contexts (for overviews, see Leung, Ang, & Tan, 2014; Matsumoto & Hwang, 2013). Research investigating adolescents’ intercultural competence mainly builds on Bennett’s (1993) developmental model of intercultural sensitivity (e.g., Holm, Nokelainen, & Tirri, 2009; Straffon, 2003; for an exception, see Reinders, Gniewosz, Gresser, & Schnurr, 2011). Thus, there is a need for measures that are tailored to adolescents growing up in culturally diverse societies, use simple language, and are based on multifaceted models of intercultural competence.

One of the most popular multifaceted models of intercultural competence is the model of cultural intelligence (CQ) (Earley & Ang, 2003). Individuals possessing high CQ enjoy engaging in intercultural interactions (motivational CQ), know about norms and practices in different cultures (cognitive CQ), are aware of their own and others’ cultural affiliations (metacognitive CQ), and exhibit appropriate behavior in intercultural situations (behavioral CQ). The cultural intelligence scale (CQS) (Ang et al., 2007; Van Dyne et al., 2012) has been deemed one of the most valid and reliable intercultural competence measures to date (Leung et al., 2014; Matsumoto & Hwang, 2013). However, it employs abstract language and assesses aspects of intercultural competence (e.g., valuing the benefits of working abroad) that may not be relevant for adolescents. Thus, the CQS presents a valuable foundation but needs to be adapted for adolescents growing up in culturally diverse societies.

Most intercultural competence measures, including the CQS, rely solely on self-report ratings. These are economical and easily applicable in large-scale surveys, but have been criticized because they may mainly reflect a person’s intercultural self-efficacy (Leung et al., 2014), and people might have trouble reporting their own intercultural abilities accurately (Klafehn, Li, & Chiu, 2013). Culture assimilator exercises, often used in intercultural trainings (Brislin, 1986), and situational judgment tests (SJTs), often used in business contexts (Whetzel & McDaniel, 2009), overcome some of these shortcomings. They typically consist of a short description of a situation, followed by questions asking the participants to interpret the situation or to evaluate plausible courses of action. Recently, Rockstuhl, Ang, Ng, Lievens, and Van Dyne (2015) developed intercultural SJTs in which they assessed participants’ interpretations of the situations as well as proposed behaviors. Using a similar approach, Hesse and Göbel (2007) and Busse and Krause (2015) assessed the intercultural competence of secondary school students in Germany. However, the incidents used depicted students studying abroad in English-speaking countries. Thus, there is a need to develop SJTs for adolescents that specifically depict intercultural situations in culturally diverse societies.

The present study addresses this need and responds to calls for a multimodal measure of intercultural competence (Deardorff, 2004) by presenting an adapted CQ scale and SJTs that are tailored to adolescents in culturally diverse contexts.
We test reliability and factor structure based on two samples of adolescents from culturally diverse regions in Germany. To ensure that the measures are interpreted in a similar way across participants, we assess measurement equivalence across immigrant background (i.e., students with at least one parent born abroad vs. students with both parents born in Germany), gender, and school grade. Validity is tested by investigating whether self-reported CQ and SJT scores are positively related to each other, as well as to theoretically related constructs. We expect that openness is positively related to CQ, as it may promote treating intercultural situations as a challenge and a learning opportunity (van der Zee & van Oudenhoven, 2013), and in previous research it was positively related to all subdimensions of CQ (Ang, Van Dyne, & Koh, 2006). Diversity beliefs (Adesokan, Ullrich, van Dick, & Tropp, 2011) and perspective-taking abilities (Davis, 1980) may be positively related to CQ because people with high diversity beliefs see diversity as valuable and enjoy being in heterogeneous groups, while people high on perspective-taking may be more aware of different cultural perspectives and better able to integrate these behaviorally. As SJTs typically show some relation with cognitive ability (Whetzel & McDaniel, 2009), we further expect that students’ grades and their SJT scores will be positively associated. The newly developed measures have already been related to intercultural contact and friendships (Schwarzenthal, Juang, Schachner, van de Vijver, & Handrick, 2017; Schwarzenthal, Juang, Schachner, & Van de Vijver, 2019) and to the classroom cultural diversity climate (Schwarzenthal, Schachner, Juang, & Van de Vijver, 2019) in other studies. The present study is the first to describe the scale development process, report CFAs with the full set of items, test the equivalence of the full measurement models across immigrant background, gender, and grade, and provide validation analyses for the new measures.

Method

Item development and pretesting

We first conducted exploratory interviews with six students, three teachers, and a principal of a culturally diverse school in Germany. Findings from the interviews, along with results from a literature review, were used to adapt the items of the expanded CQ scale (Van Dyne et al., 2012) (see also Supplementary Materials A). Moreover, eight SJTs were developed that were set in the school or peer context. Each SJT comprised a short description of an intercultural situation including a student of immigrant background (person A) and a student of non-immigrant background (person B). These were followed by open-ended questions assessing participants’ interpretation of the incident (“Why does person A/B behave this way?”) and proposed reactions to the incident (“What would you do next in this situation?”). An open response format was chosen over a multiple-choice format, because the latter did not add much beyond self-report measures in other studies (Schnabel, 2015). The adapted self-report CQ items and the SJTs were reviewed by two intercultural experts, pretested with seven students with a variety of cultural affiliations, and piloted with 44 students in two classrooms. After omitting items that students found hard to understand or that showed low item-total correlations, the initial pool of 53 self-report CQ items was reduced to 24 items.
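The item screening described above relied on item-total correlations. As a rough illustration of that screening step, the following is a minimal sketch of corrected item-total correlations in Python; it is not the authors' code, and the cutoff of .30, the file name, and the column layout are hypothetical.

import pandas as pd

def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    # Correlation of each item with the sum of the remaining items in the pool.
    total = items.sum(axis=1)
    return pd.Series({col: items[col].corr(total - items[col]) for col in items.columns})

# Illustrative screening with a hypothetical pilot data file and cutoff:
# pilot = pd.read_csv("pilot_cq_items.csv")
# r_it = corrected_item_total(pilot)
# weak_items = r_it[r_it < .30].index.tolist()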


Table 1
Items of the Adapted Self-Report CQ Scale and Situational Judgment Tests for Secondary School Students in Germany.

Self-Report CQ Scale (Adapted from Van Dyne et al., 2012)
Five-point scale from "Strongly disagree" to "Strongly agree"

Motivation
1 It’s fun for me to interact with people from other cultures.
2 I think it’s exciting to interact with people from other cultures.
3 I find it interesting to talk to people who have a different perspective than I do.
4 I think I can get along with people from another culture I’m not yet familiar with.
5 I think I can work well together with people from completely different cultures.
6 I think I could solve any problems that might arise between myself and people from other cultures.

Cognition
1 I can describe festivals and traditions from various cultures.
2 I can describe what is important in various religions.
3 I can describe how daily life looks for people from various cultures.
4 I can describe how parents treat their children in various cultures.
5 I can describe what’s expected of men and women in various cultures (for instance regarding how they are supposed to behave or what tasks they are expected to do).
6 I can describe what people from various cultures think about certain issues.

Meta-cognition
1 Before I meet people from another culture for the first time, I try to remember what I know about their culture.
2 Before I meet people from another culture for the first time, I think about how I should act (for instance how I should greet them).
3 When I meet people from another culture, I try to find out how to act appropriately in that culture.
4 If I don’t understand the behavior of people from another culture, I try to find out why they might have acted the way they did.
5 When I meet people from another culture, I try to better understand them by imagining how things look from their perspective.
6 If I don’t understand the behavior of people from another culture, I try to think about how their culture might have influenced the way they were acting.

Behavior
1 When I talk to people from another culture, at first I pay more attention to how I am acting.
2 When I talk to people from another culture, sometimes I adapt what I say (for instance regarding which topics I bring up).
3 When I talk to people from another culture, I am considerate regarding their traditions and ways of living.
4 When I talk to people from another culture, I am careful not to say anything that could hurt them.
5 When people from another culture have a different point of view than I do, I try to find a compromise.
6 If there is a misunderstanding between people from different cultures, I try to clear it up.

Situational Judgment Tests (SJTs) (Based on Rockstuhl et al., 2015)

SJTa (used in survey in North Rhine-Westphalia)
You are friends with a student in your class whose parents come from a different country. On a hot summer day, you meet other classmates in town. One of them proposes to go and eat ice cream. Your friend declines and says that he is fasting.
Why does your classmate behave this way? (Please write at least two sentences!)
Why does your friend behave this way? (Please write at least two sentences!)
What would you do next in this situation? (Please write at least two sentences!)

SJTb (used in surveys in North Rhine-Westphalia and Berlin)
For a group project you are discussing the topic of child labor. A boy from your group says that child labor is bad, because children should go to school instead. A girl from your group, who is originally from another country, says that many families would go hungry if their children didn’t work, and that they would have to pay money if they wanted their children to attend school.
Why might the boy from your group have said what he said? (Please write at least two sentences!)
Why might the girl from your group have said what she said? (Please write at least two sentences!)
What would you do if this situation occurred in your class? (Please write at least two sentences!)

SJTc (used in survey in Berlin)
One day a new student arrives in your class, who just recently moved to Germany. A month later you notice that during breaks he still sits alone in the corner of the schoolyard, that he sometimes comes to class late, and that he often doesn’t have his homework with him. Some of your classmates sometimes give him funny looks, but don’t speak with him.
Why might the new student be acting like this? (Please write at least two sentences!)
Why might your classmates be acting like this? (Please write at least two sentences!)
What would you do if this situation occurred in your class? (Please write at least two sentences!)

The three SJTs that produced the longest and most varied student responses were retained. The final items of the adapted CQ scale and the retained SJTs are depicted in Table 1 (for the original German version, see Supplementary Materials B).

Validation in two student samples

The adapted CQ scale, as well as two SJTs each, were used in two surveys among students attending culturally diverse schools in North Rhine-Westphalia (NRW) (N = 631; 6th, 8th, and 10th graders, 48% female, Mage = 13.69 years, SDage = 1.83, rangeage = 11–18 years, 49% of immigrant background) and Berlin (N = 1,335; 9th graders, 48% female, Mage = 14.69 years, SDage = 0.74, rangeage = 13–19 years, 52% of immigrant background, 11% no parental consent to answer questions about parents’ place of birth). After obtaining permission from school principals, the Berlin Senate Committee for Education, Youth, and Science (only in the case of the Berlin dataset), parental approval, and students’ assent or consent, we administered a questionnaire in a 90-minute period during regular school hours. Means and SDs are depicted in Table 2. The subscales of motivational CQ (αNRW = .88, αBerlin = .90), cognitive CQ (αNRW = .88, αBerlin = .90), metacognitive CQ (αNRW = .86, αBerlin = .88), and behavioral CQ (αNRW = .82, αBerlin = .84) showed good reliabilities.
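The reliabilities above are Cronbach's alphas. For reference, a minimal sketch of the standard alpha formula is given below; it is an illustration rather than the authors' analysis code, and the column names are hypothetical.

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Example for the six motivational CQ items (column names hypothetical):
# alpha_mot = cronbach_alpha(df[["mot1", "mot2", "mot3", "mot4", "mot5", "mot6"]])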

Table 2
Means and Standard Deviations of Study Variables (Split by Immigrant Background, Gender, and Grade).
[Table 2 reports means and standard deviations for grade average, the four self-report CQ subscales (motivational, cognitive, metacognitive, and behavioral CQ), the SJT ratings (suspending judgment, considering cultural explanations, considering alternative explanations, and behavior), the Multicultural Personality Questionnaire – Openness subscale, perspective-taking, and diversity beliefs, split by immigrant background, gender, and grade.]
Note. NRW dataset: N = 324 students of non-immigrant background, N = 307 students of immigrant background, N = 320 male students, N = 304 female students, N = 7 did not provide information on gender, N = 195 6th graders, N = 221 8th graders, and N = 215 10th graders; Berlin dataset: N = 501 students of non-immigrant background, N = 689 students of immigrant background, N = 145 did not receive parental consent to provide information on parents’ country of birth, N = 685 male students, N = 645 female students, N = 5 did not provide information on gender, all students in 9th grade.

Coding of the SJTs

The first author developed a coding manual for the SJTs together with the second and third authors (based on the procedure described by Syed & Nelson, 2015) (see Supplementary Materials C). Development of coding categories was guided top-down by previous literature (e.g., the CQ model) and bottom-up by the data at hand. Three coding categories captured the degree to which the adolescents considered cultural influences and suspended judgment when interpreting the behavior of the people in the situation (rated on 3-point scales), and the degree to which they integrated different cultural interests in their proposed reaction to the situation (rated on a 5-point scale). When coding the Berlin dataset, a fourth coding category was introduced that captured the degree to which the adolescents considered alternative explanations when interpreting the behavior in the incidents (3-point scale), to acknowledge that preconceived attributions to culture may be harmful, especially in educational contexts, and that behavior can result from individual, interpersonal, or cultural motives. All responses were coded by one of two trained research assistants and the first author. Differences were resolved via consensus. Two-way random intraclass correlations (Shrout & Fleiss, 1979) confirmed good interrater reliability (ICC(2,2)NRW = .86–.96, ICC(2,2)Berlin = .76–.93).
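The interrater reliability statistic used here is the two-way random, average-measures intraclass correlation, ICC(2,k), of Shrout and Fleiss (1979). As a point of reference, a minimal sketch of that formula follows; it is an illustration under the standard definitions, not the authors' analysis code.

import numpy as np

def icc2k(ratings):
    # ICC(2,k): two-way random effects, average of k raters (Shrout & Fleiss, 1979).
    # `ratings` is an n_subjects x k_raters array of scores on one coding category.
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_subjects = k * ((ratings.mean(axis=1) - grand_mean) ** 2).sum()
    ss_raters = n * ((ratings.mean(axis=0) - grand_mean) ** 2).sum()
    ss_error = ss_total - ss_subjects - ss_raters
    bms = ss_subjects / (n - 1)                # between-subjects mean square
    jms = ss_raters / (k - 1)                  # between-raters mean square
    ems = ss_error / ((n - 1) * (k - 1))       # residual mean square
    return (bms - ems) / (bms + (jms - ems) / n)

# Toy example with two coders rating five responses on a 3-point category:
# print(icc2k([[1, 1], [2, 2], [3, 2], [1, 2], [3, 3]]))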
Data analytic strategy

CQ has been defined as an aggregated, multidimensional construct (Ang et al., 2007), but to date there is little consensus on how it should be modeled. Some studies used an overall CQ score (e.g., Bernardo & Presbitero, 2018), while others employed four-factor models (e.g., Koo Moon, Kwon Choi, & Shik Jung, 2012) or suggested that a general intercultural self-efficacy may underlie the responses on all four CQ subfacets (Leung et al., 2014). Therefore, we compared the fit of a one-factor model of CQ with a model that specifies the four subdimensions as correlated factors, and with a model in which the four subdimensions of CQ load onto a higher-order CQ factor. Each SJT captures understanding and behavior in one particular intercultural situation; therefore, it is likely that the various scores that participants receive for their performance in one and the same SJT form one factor. However, since we are interested in students’ CQ that transcends a range of intercultural situations, we also tested whether all scores that students received for different SJTs can be conceptualized as reflecting one latent dimension. Next, we tested configural, metric, and scalar measurement equivalence across immigrant background, gender, and grade. Finally, we examined the validity of the new measures by investigating how they are related to each other, as well as to related constructs.

Confirmatory factor analyses (CFAs)

CFAs were run in Mplus (Muthén & Muthén, 1998–2011) using maximum likelihood estimation with robust standard errors and accounting for the clustered structure of the data (students in classrooms). A comparative fit index (CFI) of more than .95, a root mean square error of approximation (RMSEA) of less than .06, and a standardized root mean square residual (SRMR) of less than .08 were considered to indicate good fit (Hu & Bentler, 1999). Satorra-Bentler scaled χ2 difference tests were run to compare models, but since these are overly conservative in large samples (Rutkowski & Svetina, 2013), we mainly inspected the change in CFI to compare models (for model fit indices and χ2 difference tests, see Table 3). A model in which all self-report CQ items were specified to load onto a single factor (Model 1) showed poor fit. A four-factor model specifying the four self-report CQ subscales as correlated factors (Model 2) fit significantly better and showed good overall fit. The four factors were highly correlated, especially metacognitive and behavioral CQ (see Supplementary Materials D). Therefore, we specified a four-factor model with a higher-order self-report CQ factor, which showed good fit (Model 3). The CFI did not indicate a difference in model fit between Model 2 and Model 3. Based on theoretical considerations (e.g., a common intercultural self-efficacy explaining the shared variance; Leung et al., 2014), we decided to retain the model with the higher-order CQ factor. A two-factor model with separate factors for each SJT (Model 5) fit slightly better than a model in which all SJT ratings were specified to load onto a single latent SJT factor (Model 4). However, as both models showed very good fit overall, we decided to retain the one-factor model, as we consider the students’ intercultural performance that transcends specific situations to be more important than their performance in a particular intercultural situation. For the factor loadings of both final models, see Fig. 1.

Measurement equivalence

The models with the higher-order self-report CQ factor (Model 3) and the single SJT factor (Model 4) were tested for measurement equivalence across immigrant background and gender in both samples, and across grade in the NRW sample (see Tables 4–6). Based on Cheung and Rensvold (2002), a change in CFI larger than .01 was considered to constitute a meaningful change in model fit. Following the procedure described by Chen, Sousa, and West (2005) to test measurement equivalence of higher-order factor models, scalar equivalence (i.e., equivalence of factor loadings and intercepts) across immigrant background, gender, and grade was supported for the self-report CQ factor. For the SJT factor, with a few exceptions, only metric equivalence (i.e., equivalence of factor loadings) was supported across immigrant background, gender, and grade.

Latent mean differences

Students of immigrant background had a higher latent self-report CQ score than students of non-immigrant background (zNRW = 4.95, pNRW < .001; zBerlin = 1.72, pBerlin = .09), and females tended to have higher latent self-reported CQ than males (zNRW = 1.44, pNRW = .15; zBerlin = 6.36, pBerlin < .001). The latent self-reported CQ of 8th graders (zNRW = -1.74, pNRW = .08) and 10th graders (zNRW = 0.20, pNRW = .84) did not differ from that of 6th graders, and the latent self-report CQ of 10th graders did not differ from that of 8th graders (zNRW = 1.89, pNRW = .06). In the NRW sample, students of immigrant background had a lower latent SJT score than students of non-immigrant background (zNRW = -2.01, pNRW = .04), and females had a higher latent SJT score than males (zNRW = 2.17, pNRW = .03).
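The Satorra-Bentler scaled χ2 difference tests reported in Tables 3–6 combine the scaled χ2 values, degrees of freedom, and scaling correction factors of two nested models. A minimal sketch of that standard computation is given below for reference; it illustrates the general formula and is not the authors' code.

def sb_scaled_chi2_diff(t0_scaled, df0, c0, t1_scaled, df1, c1):
    # Scaled chi-square difference test for two nested models estimated with a
    # scaled (robust) chi-square. Model 0 is the more restrictive (nested) model.
    t0 = c0 * t0_scaled                        # recover the uncorrected chi-squares
    t1 = c1 * t1_scaled
    cd = (df0 * c0 - df1 * c1) / (df0 - df1)   # scaling correction for the difference
    return (t0 - t1) / cd, df0 - df1           # (TRd, delta df)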

Table 3
Results of the Confirmatory Factor Analyses of the Adapted Self-Report CQ Scale and the SJT Ratings.
[Table 3 reports χ2, df, scaling correction factors, RMSEA, CFI, SRMR, AIC, and Satorra-Bentler scaled χ2 difference tests for the self-report CQ models (Model 1: one factor; Model 2: four correlated factors; Model 3: four factors plus higher-order CQ factor) and the SJT models (Model 4: all SJT ratings, one factor; Model 5: two factors, one per SJT) in the NRW (N = 631) and Berlin (N = 1,335) datasets.]
Note. *p < .05. **p < .01. ***p < .001. a Correlated errors between CQ motivation items 1 and 2 and items 1 and 3; CQ cognition items 1 and 2; CQ metacognition items 1 and 2 and items 2 and 3; CQ behavior items 1, 2, and 3. b Correlated errors between SJTa suspending judgment and considering cultural influences, and SJTb considering cultural influences and behavior. c Correlated errors between SJTc suspending judgment and considering cultural influences, SJTc considering cultural influences and considering alternative explanations, SJTb and SJTc considering alternative explanations, and SJTb and SJTc behavior.

Fig. 1. Results of Confirmatory Factor Analyses.

Due to lack of scalar equivalence, the other latent means of the SJT scores should not be compared across groups.

Validity

Using the models with the highest level of equivalence achieved as a baseline, we calculated correlations of the higher-order self-report CQ factor and the SJT factor with each other, as well as with related constructs (for sample items and reliabilities, see Supplementary Materials E), split by immigrant background, gender, and grade. The higher-order self-report CQ factor and the SJT factor were positively correlated with each other in all groups (ranging from r = .31 to .51 in the NRW sample and from r = .33 to .57 in the Berlin sample). In the NRW dataset, self-reported CQ and SJT performance were both positively related to the openness subscale of the Multicultural Personality Questionnaire (van der Zee, van Oudenhoven, Ponterotto, & Fietzer, 2013) and to perspective-taking abilities (Davis, 1980). In the Berlin dataset, self-reported CQ and SJT performance were positively related to diversity beliefs (adapted from Adesokan et al., 2011). Self-reported CQ typically showed stronger relations with these aspects than SJT performance. Students with a higher grade average self-reported higher CQ and performed better on the SJTs, whereby the relations with the SJT factor were stronger than those with self-reported CQ. Relations were largely similar across groups (see Table 7).

Discussion

The aim of this study was to develop and test a multimodal intercultural competence measure that is based on a multifaceted model of intercultural competence and is tailored to adolescents in culturally diverse societies. To achieve this goal, we adapted the cultural intelligence (CQ) questionnaire (Van Dyne et al., 2012) and developed situational judgment tests (SJTs; based on Rockstuhl et al., 2015) for adolescents. Application of the measure in two student samples in different parts of Germany provided support for its reliability, and the coding categories identified for the SJTs resembled those found in previous qualitative research (Sieck, Smith, & Rasmussen, 2013). CFAs and theoretical considerations suggested a four-factor model with a higher-order CQ factor for self-reported CQ, and a one-dimensional model for the SJT ratings. The measures showed metric to scalar equivalence across immigrant background, gender, and grade. The lack of scalar equivalence of the SJT measures may be due to the fact that the incidents were not matched to the participants’ own demographics, which may have led the students to identify with the people in the situations to differing degrees.

Table 4
Equivalence of Latent Factor Models across Immigrant Background.
[Table 4 reports fit indices (χ2, df, scaling correction factor, RMSEA, CFI, SRMR, AIC) and Satorra-Bentler scaled χ2 difference tests for the CQ model (Model 3: four factors with higher-order factor; configural, metric I/II, and scalar I/II equivalence) and the SJT model (Model 4: one factor; configural, metric, and scalar equivalence) in the NRW and Berlin datasets.]
Note. *p < .05. **p < .01. ***p < .001. NRW dataset: N = 324 students of non-immigrant background, N = 307 students of immigrant background; Berlin dataset: N = 501 students of non-immigrant background, N = 689 students of immigrant background, N = 145 did not receive parental consent to provide information on parents’ country of birth.

Table 5
Equivalence of Latent Factor Models across Gender.
[Structure parallels Table 4: fit indices and Satorra-Bentler scaled χ2 difference tests for the CQ model (configural, metric I/II, and scalar I/II equivalence) and the SJT model (configural, metric, and scalar equivalence) in the NRW and Berlin datasets, with male and female students as groups.]
Note. *p < .05. **p < .01. ***p < .001. NRW dataset: N = 320 male students, N = 304 female students, N = 7 did not provide information on gender; Berlin dataset: N = 685 male students, N = 645 female students, N = 5 did not provide information on gender.

Table 6
Equivalence of Latent Factor Models across 6th, 8th, and 10th Graders.
[Structure parallels Tables 4 and 5: fit indices and Satorra-Bentler scaled χ2 difference tests for the CQ model (configural, metric I/II, and scalar I/II equivalence) and the SJT model (configural, metric, and scalar equivalence), NRW dataset only.]
Note. *p < .05. **p < .01. ***p < .001. NRW dataset: N = 195 6th graders, N = 221 8th graders, and N = 215 10th graders.

Table 7
Correlations of Latent Higher-Order Self-Report CQ Factor and SJT Factor with Related Constructs.
[Table 7 reports correlations of the latent higher-order self-report CQ factor and the latent SJT factor with grade average, Multicultural Personality Questionnaire – Openness, and perspective-taking (NRW dataset), and with grade average and diversity beliefs (Berlin dataset), split by immigrant background, gender, and grade.]
Note. *p < .05. **p < .01. ***p < .001. NRW dataset: N = 324 students of non-immigrant background, N = 307 students of immigrant background, N = 320 male students, N = 304 female students, N = 7 did not provide information on gender, N = 195 6th graders, N = 221 8th graders, and N = 215 10th graders; Berlin dataset: N = 501 students of non-immigrant background, N = 689 students of immigrant background, N = 145 did not receive parental consent to provide information on parents’ country of birth, N = 685 male students, N = 645 female students, N = 5 did not provide information on gender, all students in 9th grade.

Future research should develop a larger number of SJTs that are balanced across major demographic variables. Mean comparisons between different groups of students should always be treated with caution, because intercultural situations may be interpreted differently due to differences in previous intercultural experiences, salience of cultural identities, and discrimination experiences (Frankenberg, Kupper, Wagner, & Bongard, 2013; Phinney, Jacoby, & Silva, 2007). Students of immigrant background and females reported higher CQ. The former may have more previous intercultural experience, as they or their parents immigrated to a new country and thus had to learn to switch between different cultures (Ward, 2001). The latter tend to show higher levels of empathic concern during adolescence (Van der Graaff et al., 2014), which may also translate to their performance in intercultural situations.

Supporting the validity of the measure, self-reported CQ and SJT performance were positively correlated with each other and with related measures. The results suggest that self-reported CQ (as compared to SJT performance) is more strongly related to other self-report measures capturing openness or positive beliefs about diversity, and thus may mainly assess attitudes or intercultural self-efficacy (Leung et al., 2014), while SJT performance (as compared to self-reported CQ) is more strongly related to cognitive aspects such as the students’ grades, and thus may rather capture cognitive aspects of intercultural competence (Whetzel & McDaniel, 2009).

Despite these positive results, the new measure also has weaknesses. Some subscales of CQ (especially metacognitive and behavioral CQ) were highly correlated, suggesting that they may not assess the separate subdimensions of CQ as accurately as intended and might rather capture general intercultural self-efficacy (Klafehn et al., 2013). Since only two SJTs could be used in each study due to space constraints, we only assessed students’ intercultural competence in a limited range of intercultural situations. Future research should use more SJTs to confirm whether intercultural competence is indeed stable across different intercultural situations. The brief nature of the SJTs made it difficult to apply a dynamic conceptualization of culture. In order to delineate a situation as intercultural (Barrett, 2018), we placed salient cultural cues in the incidents, such as references to the students’ heritage countries. However, culture is not only tied to heritage countries, but may also vary between different generations, social classes, or regions, and students may adopt elements of their heritage cultures to varying degrees (Morris, Chiu, & Liu, 2015). Future research should construct SJTs that more clearly reflect a dynamic understanding of culture and should explore alternative methods of SJT presentation and assessment, such as video-based SJTs, interviews, or think-aloud procedures.

The new measure may be a valuable tool not only for researchers but also for practitioners, for example to evaluate the effects of intercultural trainings in the school context. After studying the detailed coding manual, discussing it, and coding sample responses, intercultural trainers or interculturally experienced teachers could also code the responses. To conclude, we successfully developed and tested a multimodal intercultural competence measure for adolescents in culturally diverse contexts.
The new measure provides an important resource for future research and may help to broaden our knowledge of how adolescents growing up in culturally diverse societies acquire crucial intercultural skills.

Acknowledgements

The data collection for the dataset from North Rhine-Westphalia was part of a larger cross-national study on inclusive identity (P.I.: Byron Adams, Ph.D., Tilburg University, The Netherlands), which was financially supported by the National Research Foundation (NRF – grant number: 74653) and the University of Johannesburg.

Appendix A. Supplementary data

Supplementary material related to this article can be found, in the online version, at https://doi.org/10.1016/j.ijintrel.2019.07.007.

References

Adesokan, A. A., Ullrich, J., van Dick, R., & Tropp, L. R. (2011). Diversity beliefs as moderator of the contact–prejudice relationship. Social Psychology, 42, 271–278. https://doi.org/10.1027/1864-9335/a000058
Ang, S., Van Dyne, L., & Koh, C. (2006). Personality correlates of the four-factor model of cultural intelligence. Group & Organization Management, 31, 100–123. https://doi.org/10.1177/1059601105275267
Ang, S., Van Dyne, L., Koh, C., Ng, K. Y., Templer, K. J., Tay, C., et al. (2007). Cultural intelligence: Its measurement and effects on cultural judgment and decision making, cultural adaptation and task performance. Management and Organization Review, 3, 335–371. https://doi.org/10.1111/j.1740-8784.2007.00082.x
Barrett, M. (2018). How schools can promote the intercultural competence of young people. European Psychologist, 23, 93–104. https://doi.org/10.1027/1016-9040/a000308
Barrett, M., Byram, M., Lázár, I., Mompoint-Gaillard, P., & Philippou, S. (2013). Developing intercultural competence through education. Strasbourg, France: Council of Europe Publishing.
Bennett, M. J. (1993). Towards ethnorelativism: A developmental model of intercultural sensitivity (revised). In R. M. Paige (Ed.), Education for the intercultural experience (pp. 21–71). Yarmouth, ME, USA: Intercultural Press.
Bernardo, A. B. I., & Presbitero, A. (2018). Cognitive flexibility and cultural intelligence: Exploring the cognitive aspects of effective functioning in culturally diverse contexts. International Journal of Intercultural Relations, 66, 12–21. https://doi.org/10.1016/j.ijintrel.2018.06.001
Brislin, R. W. (1986). A culture general assimilator: Preparation for various types of sojourns. International Journal of Intercultural Relations, 10, 215–234. https://doi.org/10.1016/0147-1767(86)90007-6
Busse, V., & Krause, U.-M. (2015). Addressing cultural diversity: Effects of a problem-based intercultural learning unit. Learning Environments Research, 18, 425–452. https://doi.org/10.1007/s10984-015-9193-2
Chen, F. F., Sousa, K. H., & West, S. G. (2005). Teacher’s corner: Testing measurement invariance of second-order factor models. Structural Equation Modeling: A Multidisciplinary Journal, 12, 471–492. https://doi.org/10.1207/s15328007sem1203_7


Cheung, G. W., & Rensvold, R. B. (2002). Evaluating goodness-of-fit indexes for testing measurement invariance. Structural Equation Modeling: A Multidisciplinary Journal, 9, 233–255. https://doi.org/10.1207/s15328007sem0902_5
Davis, M. H. (1980). A multidimensional approach to individual differences in empathy. JSAS Catalog of Selected Documents in Psychology, 10, 85–104.
Deardorff, D. K. (2004). The identification and assessment of intercultural competence as a student outcome of internationalization at institutions of higher education in the United States (Doctor of Education). Raleigh, North Carolina, USA: North Carolina State University.
Earley, P. C., & Ang, S. (2003). Cultural intelligence: Individual interactions across cultures. Palo Alto, California, USA: Stanford University Press.
Frankenberg, E., Kupper, K., Wagner, R., & Bongard, S. (2013). Immigrant youth in Germany. European Psychologist, 18, 158–168. https://doi.org/10.1027/1016-9040/a000154
Hesse, H.-G., & Göbel, K. (2007). Interkulturelle Kompetenz [Intercultural competence]. In E. Klieme & B. Beck (Eds.), Sprachliche Kompetenzen. Konzepte und Messung. DESI-Studie (Deutsch Englisch Schülerleistungen International) (pp. 256–272). Weinheim, Germany: Beltz.
Holm, K., Nokelainen, P., & Tirri, K. (2009). Relationship of gender and academic achievement to Finnish students’ intercultural sensitivity. High Ability Studies, 20, 187–200. https://doi.org/10.1080/13598130903358543
Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6, 1–55. https://doi.org/10.1080/10705519909540118
Klafehn, J., Li, C., & Chiu, C.-y. (2013). To know or not to know, is that the question? Exploring the role and assessment of metacognition in cross-cultural contexts. Journal of Cross-cultural Psychology, 44, 963–991. https://doi.org/10.1177/0022022113492893
Koo Moon, H., Kwon Choi, B., & Shik Jung, J. (2012). Previous international experience, cross-cultural training, and expatriates’ cross-cultural adjustment: Effects of cultural intelligence and goal orientation. Human Resource Development Quarterly, 23, 285–330. https://doi.org/10.1002/hrdq.21131
Leung, K., Ang, S., & Tan, M. L. (2014). Intercultural competence. Annual Review of Organizational Psychology and Organizational Behavior, 1, 489–519. https://doi.org/10.1146/annurev-orgpsych-031413-091229
Matsumoto, D., & Hwang, H. C. (2013). Assessing cross-cultural competence: A review of available tests. Journal of Cross-cultural Psychology, 44, 849–873. https://doi.org/10.1177/0022022113492891
Morris, M. W., Chiu, C. Y., & Liu, Z. (2015). Polycultural psychology. Annual Review of Psychology, 66, 631–659. https://doi.org/10.1146/annurev-psych-010814-015001
Muthén, L. K., & Muthén, B. O. (1998–2011). Mplus user’s guide. Los Angeles, CA, USA: Muthén & Muthén.
Phinney, J. S., Jacoby, B., & Silva, C. (2007). Positive intergroup attitudes: The role of ethnic identity. International Journal of Behavioral Development, 31, 478–490. https://doi.org/10.1177/0165025407081466
Reinders, H., Gniewosz, B., Gresser, A., & Schnurr, S. (2011). Erfassung interkultureller Kompetenzen bei Kindern und Jugendlichen. Das Würzburger Interkulturelle Kompetenz-Inventar (WIKI-KJ) [Assessing intercultural competences among children and adolescents. The Würzburg Intercultural Competence-Inventory (WIKI-KJ)]. Diskurs Kindheits- und Jugendforschung, 4, 429–452.
Rockstuhl, T., Ang, S., Ng, K. Y., Lievens, F., & Van Dyne, L. (2015). Putting judging situations into situational judgment tests: Evidence from intercultural multimedia SJTs. Journal of Applied Psychology, 100, 464–480. https://doi.org/10.1037/a0038098
Rutkowski, L., & Svetina, D. (2013). Assessing the hypothesis of measurement invariance in the context of large-scale international surveys. Educational and Psychological Measurement, 74, 31–57. https://doi.org/10.1177/0013164413498257
Schnabel, D. (2015). Intercultural competence: Development and validation of a theoretical framework, a cross-cultural multimethod test, and a collaborative assessment intervention (Ph.D.). Tübingen, Germany: Eberhard Karls Universität.
Schwarzenthal, M., Juang, L. P., Schachner, M. K., van de Vijver, A. J. R., & Handrick, A. (2017). From tolerance to understanding: Exploring the development of intercultural competence in multiethnic contexts from early to late adolescence. Journal of Community & Applied Social Psychology, 27, 388–399. https://doi.org/10.1002/casp.2317
Schwarzenthal, M., Juang, L. P., Schachner, M. K., & Van de Vijver, A. J. R. (2019). “When birds of a different feather flock together”? Intercultural socialization in adolescents’ friendships. International Journal of Intercultural Relations, 72, 61–75.
Schwarzenthal, M., Schachner, M., Juang, L. P., & Van de Vijver, A. J. R. (2019). Reaping the benefits of cultural diversity: Classroom cultural diversity climate and students’ intercultural competence. European Journal of Social Psychology, in press.
Shrout, P. E., & Fleiss, J. L. (1979). Intraclass correlations: Uses in assessing rater reliability. Psychological Bulletin, 86, 420–428. https://doi.org/10.1037/0033-2909.86.2.420
Sieck, W. R., Smith, J. L., & Rasmussen, L. J. (2013). Metacognitive strategies for making sense of cross-cultural encounters. Journal of Cross-cultural Psychology, 44, 1007–1023. https://doi.org/10.1177/0022022113492890
Straffon, D. A. (2003). Assessing the intercultural sensitivity of high school students attending an international school. International Journal of Intercultural Relations, 27, 487–501. https://doi.org/10.1016/S0147-1767(03)00035-X
Syed, M., & Nelson, S. C. (2015). Guidelines for establishing reliability when coding narrative data. Emerging Adulthood, 3, 375–387. https://doi.org/10.1177/2167696815587648
United Nations (2017). The international migration report 2017 (highlights). Retrieved from https://www.un.org/development/desa/publications/international-migration-report-2017.html
Van der Graaff, J., Branje, S., De Wied, M., Hawk, S., Van Lier, P., & Meeus, W. (2014). Perspective taking and empathic concern in adolescence: Gender differences in developmental changes. Developmental Psychology, 50, 881–888. https://doi.org/10.1037/a0034325
van der Zee, K. I., & van Oudenhoven, J. P. (2013). Culture shock or challenge? The role of personality as a determinant of intercultural competence. Journal of Cross-cultural Psychology, 44, 928–940. https://doi.org/10.1177/0022022113493138
van der Zee, K. I., van Oudenhoven, J. P., Ponterotto, J. G., & Fietzer, A. W. (2013). Multicultural personality questionnaire: Development of a short form. Journal of Personality Assessment, 95, 118–124. https://doi.org/10.1080/00223891.2012.718302
Van Dyne, L., Ang, S., Ng, K. Y., Rockstuhl, T., Tan, M. L., & Koh, C. (2012). Sub-dimensions of the four factor model of cultural intelligence: Expanding the conceptualization and measurement of cultural intelligence. Social and Personality Psychology Compass, 6, 295–313. https://doi.org/10.1111/j.1751-9004.2012.00429.x
Ward, C. (2001). The ABCs of acculturation. In D. Matsumoto (Ed.), Handbook of culture and psychology (pp. 411–445). New York, USA: Oxford University Press.
Whetzel, D. L., & McDaniel, M. A. (2009). Situational judgment tests: An overview of current research. Human Resource Management Review, 19, 188–202. https://doi.org/10.1016/j.hrmr.2009.03.007
