Antecedents to the adoption of augmented reality smart glasses: A closer look at privacy risks


Journal of Business Research 92 (2018) 374–384


Philipp A. Rauschnabel (a,*), Jun He (b), Young K. Ro (b)

(a) Universität der Bundeswehr München, College of Business, Werner-Heisenberg-Weg 39, 85579 Neubiberg, Germany
(b) University of Michigan-Dearborn, College of Business, 19000 Hubbard Drive, Dearborn, MI 48126, USA

* Corresponding author. E-mail addresses: [email protected] (P.A. Rauschnabel), [email protected] (J. He), [email protected] (Y.K. Ro).

Keywords: Augmented reality; smart glasses; benefits; risks; other people's privacy; ARSG; mixed reality

Abstract

Numerous research studies and corporate press releases highlight the potential of a new form of wearable device appearing on the technology landscape: augmented reality smart glasses (ARSGs), i.e., digital eyeglasses that integrate virtual information into the user's field of vision. Yet little is known about this nascent technology. Therefore, the authors develop and empirically test a theoretical model to assess ARSG usage. Core findings are that expected utilitarian, hedonic, and symbolic benefits drive consumers' reactions to ARSGs. The results also show that the extent to which ARSGs threaten other people's, but not one's own, privacy can strongly influence users' decision making. A qualitative second study identifies multiple explanations for this surprising privacy finding. Theoretical and managerial implications conclude the article.

1. Introduction

Glasses-like devices, introduced or announced by Samsung, Facebook, Amazon.com, Magic Leap, Everysight, Microsoft, and other companies in recent years, offer augmented reality (AR) technology that integrates virtual and physical information into a user's field of vision. These augmented reality smart glasses (ARSGs) offer tremendous application opportunities in marketing, entertainment, logistics, manufacturing, health care, and other areas (Eisenmann, Barley, & Kind, 2014; Rauschnabel, 2018; Scholz & Duffy, 2018). A recent Goldman Sachs (2016, p. 4) study concludes that "as the technology advances, price points decline, and an entire new marketplace of applications (both business and consumer) hits the market, we believe VR [virtual reality]/AR has the potential to spawn a multibillion-dollar industry, and possibly be as game-changing as the advent of the PC." Many companies have already adopted ARSGs, for example in product engineering, employee coaching, warehousing and logistics, and medical applications. Microsoft launched its AR-based HoloLens, and various firms have invested in this technology (Microsoft, 2016). In contrast, Google Glass is an example of a device with less market success (Haque, 2015). AR-device manufacturers and app developers therefore might benefit from deeper insights into the factors that explain consumers' reactions to ARSGs. In particular: Which factors determine consumers' adoption of ARSGs? Extant literature provides only limited answers to this question.



While the managerial importance of this inquiry is high, ARSGs are also theoretically noteworthy. First, they are worn like regular spectacles, so fashion-related factors might be relevant determinants in explaining consumer acceptance (Haque, 2015). Established media and technology adoption theories, however, typically do not include such variables. Second, ARSGs' AR component breaks the boundaries between reality and "virtuality" (Craig, 2013). Except for a few current mobile applications of AR (e.g., Pokémon Go), this boundary crossing is novel to most consumers. The potential of AR to dramatically change business and marketing has been discussed in managerial (e.g., Javornik, 2016a, b) and academic outlets (Scholz & Smith, 2016) alike, but with only a few exceptions (e.g., Javornik, 2016a; Pantano et al., 2017; Scholz & Duffy, 2018), the theoretical understanding of how people react to these technological developments remains limited. Third, various sensors (e.g., cameras) are employed to integrate and process virtual information with real-world information, so privacy concerns arise with the use of ARSGs (Eisenmann et al., 2014). Prior research has widely replicated the finding that people react negatively to technologies that collect too much personal information about them (Debatin et al., 2009); however, ARSGs can also threaten the privacy of other people (Haque, 2015). These boundary-crossing characteristics mean that ARSGs differ from other technologies, and as such, existing theoretical models may not appropriately explain consumers' reactions to ARSGs. Therefore,


considering the unique and nascent nature of ARSGs, this research aims to investigate their expected benefits and perceived risks from the view of consumers. Building on the literature on technology acceptance, privacy risks, and uses and gratifications theory (U&G), we develop a model to tackle the following research question: "Which factors drive the adoption of ARSGs?" (RQ1). While Study 1 confirms the proposed benefits, some unexpected findings with regard to privacy risks remain. Therefore, we conducted a second study to investigate the theoretical mechanisms behind these unexpected findings in more detail. Thus, RQ2 is: "Why do (or don't) people care about their own vs. other people's privacy?"

2. Literature review and theory

With the rise of smart mobile technologies, an "always and everywhere" online mentality has become ubiquitous. Many of these new devices offer a user the opportunity to install Internet-based apps. Smart technologies in consumer markets began with handheld devices such as personal digital assistants. The breakthrough of smart devices was the Apple iPhone in 2007 (smartphone), followed by tablets (e.g., iPads) in 2010. With the advent of smartwatches (e.g., Apple Watch), smart devices became wearable (Chuah et al., 2016). Recently, several manufacturers have launched a new generation of smart devices: ARSGs, i.e., glasses-like smart devices that integrate virtual information into a user's field of vision. Microsoft HoloLens and ODG R-7 are examples of existing ARSGs.

2.1. Overview of prior research

As discussed, ARSGs are interesting for two reasons. First, the AR component of ARSGs belongs to a recently established research domain. AR is much more realistic on ARSGs than on other mobile or stationary devices, so looking at AR research alone is not sufficient to explain ARSG adoption. Second, ARSGs are a specific form of smart wearable device. An increasing number of studies have investigated the acceptance of wearable technologies, which reflects a second novel research stream (see Kalantari, 2017 for a review). This study is one of the first to combine these research streams.

In recent years, AR has received increased attention in business publications in which applications, benefits, and practical success stories have been shared (e.g., Javornik, 2016a, b). It is therefore not surprising that scholars from various disciplines have studied user acceptance of AR. Scholars from business disciplines have discussed both the managerial relevance and behavioral aspects of AR. For example, Javornik (2016a, b) found that a good augmentation leads to a 'flow' experience, which subsequently drives consumers' reactions. Likewise, Jung, Chung, and Leue (2015) show that theme park users' satisfaction with AR apps is influenced by the quality of the AR content, the system as a whole, and the degree to which the AR information is personalized.

The second relevant stream of research investigates the acceptance of smart wearable devices, most of which (e.g., fitness trackers, smartwatches) do not contain AR components. Chuah et al. (2016), for instance, investigated the acceptance of smartwatches using TAM and found that people tended to categorize them as fashion, technology, or both ('fashnology') and made judgments based on the visibility of the device and its usefulness. Yang et al. (2016) focused on wearable devices in general rather than on a particular form of device. They found that several established technology acceptance factors such as usefulness, enjoyment, image, and financial risk impacted consumers' evaluations of their devices. Surprisingly, consumer research incorporating risk factors, especially privacy, in the context of AR is scarce.

Finally, scant research exists on the adoption of ARSGs. A few exceptions include studies from Rauschnabel, Brem, and Ivens (2015), tom Dieck and Jung (2015), and Rauschnabel (2018), who explored the adoption intention of ARSGs and showed that numerous benefits and social norms are positively associated with adoption intention. Weiz, Anand, and Ernst (2016) found that usefulness and social norms influenced actual use of Google Glass, and Eisenmann et al.'s (2014) case study on Google Glass discussed various technological and social factors, as well as business applications. Finally, Rauschnabel and Ro (2016) investigated the relationship between Google's reputation in handling user data and consumers' reactions to Google Glass. Surprisingly, they found no significant relationship, raising questions about the relevance of privacy concerns to ARSG adopters. A lack of knowledge on privacy issues for ARSGs is not the only gap in the extant literature, as various other antecedents have received little, if any, attention. For example, how wearing ARSGs affects a user's (ideal) appearance remains under-researched. In this investigation, we draw from two theoretical lenses to better understand ARSG adoption: (1) uses and gratifications (U&G) research, as consumers' needs and motives drive their reactions towards media and technology and thus also explain their reactions to ARSGs; and (2) privacy research, as ARSGs may threaten the privacy of both the user and others.

2.2. Benefits in the context of ARSGs

Uses and gratifications theory (U&G) provides an additional theoretical lens for understanding the motivational aspects of ARSG adoption and usage. Espoused by communications scholars, U&G was originally applied to address how and why people accept new forms of media but has grown in prevalence among scholars more broadly (e.g., Eighmey & McCord, 1998). U&G is a theoretical motivational paradigm (Katz, 1959) that addresses individuals' motivations to adopt a particular technology (Ruggiero, 2000), as potential users seek different gratifications from various technologies (Sheldon, 2008). U&G scholars have developed various categories of individual needs or gratifications (Baldus, Voorhees, & Calantone, 2015), including utilitarian (gaining benefits, information), hedonic (diversion, release from problems and stress, entertainment), and symbolic (social advantage, connection, self-expression). In general, U&G addresses motivational drivers for media use, determinants that affect these drivers, and consequences of technology- and media-related behaviors (Sheldon, 2008). In addition, U&G is a robust theory that can be adjusted to various contexts and integrated with other theories (Nysveen, Pedersen, & Thorbjørnsen, 2005; Rauschnabel, 2018).


2.3. Privacy research

The development of information technology can pose threats to individual privacy (Collier, 1995). According to Collier (1995, p. 41), "[Privacy concerns are] about the perceived threat to our individual privacy owing to the staggering and increasing power of information-processing technology to collect vast amounts of information about us…outside our knowledge, let alone our control." As technologies become increasingly personal, ubiquitous, and pervasive, privacy concerns will grow in importance. Accordingly, many scholars define privacy concerns as general concerns that reflect a user's inherent worries about the potential loss of personal information from using a target technology (Malhotra, Kim, & Agarwal, 2004). Privacy concerns affect the perceived trustworthiness of the technology and create a psychological barrier of risk, which involves uncertainty and vulnerability (Barney & Hansen, 1994), and therefore affect individuals' willingness to adopt a new technology (Connolly & Bannister, 2007).

Centering on personal privacy, however, leaves a gap in the social context of new technology users. For example, users of social network sites often post information about people they know without asking their permission (Nissenbaum, 2010). This development demonstrates that in a technology-enabled and connected world, the flow of


information is beyond the control of any one individual. As such, privacy concerns also extend to other people, an issue that has not been investigated yet.

3. Model and hypotheses development

3.1. Model overview

ARSGs have unique characteristics that limit the application of a single theory to explain consumers' reactions. The innovativeness of the technology, its fashion-design component, the potential social interaction with non-users, and possible risks require a comprehensive model that addresses both motivations and risk perceptions in predicting reactions to ARSGs. In particular, we draw on prior media and technology frameworks and privacy research to investigate the technological, motivational, and privacy risk–related factors associated with ARSGs. Our model theorizes that various expected benefits and perceived risks determine consumers' reactions to ARSGs. We assess the model in two different studies. In Study 1, we test the hypothesized effects using a quantitative approach. Study 2 is a qualitative follow-up investigation that identifies explanations for selected findings from Study 1.

3.2. Expected benefits

3.2.1. Expected utilitarian benefits

Technology and media research has widely examined and validated the role of utilitarian factors as antecedents to technology adoption (King & He, 2006). For this study, we define expected utilitarian benefits as a user's assumptions about the extent to which ARSGs make his or her life more efficient (Venkatesh et al., 2012). Examples of such task-oriented outcomes of ARSGs include organizing appointments, searching for information, using navigation, and so forth (Rauschnabel, 2018).

H1. Expected utilitarian benefits have a positive effect on consumers' intention to adopt ARSGs.

3.2.2. Expected hedonic benefits

Expected hedonic benefits refer to the degree to which users expect to gain hedonic rewards (e.g., entertainment, enjoyment) through the use of ARSGs. Following related research (e.g., Rauschnabel, 2018), we hypothesize:

H2. Expected hedonic benefits have a positive effect on consumers' intention to adopt ARSGs.

3.2.3. Expected symbolic benefits

Expected symbolic benefits refer to the degree to which a user expects to gain symbolic rewards (e.g., making a positive impression on others) from using ARSGs. Prior frameworks (cf. King & He, 2006) have looked at image (the degree to which a person thinks that using a technology enhances his or her status in a workplace environment), whereas the framework proposed in this study emphasizes the role of wearing ARSGs as a means of improving the symbolic perception one receives from other people.

The marketing literature strongly supports this rationale. In particular, ARSGs are worn like regular glasses and thus are similar to clothes, garments, and other accessories. In an era of hectic and short social interactions, individuals evaluate the appearance of others and make value judgments quickly. Fashion factors affecting physical appearance (e.g., clothing, jewelry, cosmetics) are thus of particular importance (e.g., Holman, 1980; Tunca & Fueller, 2009). Bierhoff (1989) echoes this view and finds that initial personal judgments are driven more by visible cues (here, ARSGs worn on a user's face) than by less visible cues (e.g., carrying a smartphone). Finally, Belk (1978, p. 39) concludes that "[I]n virtually all cultures, visible products and services are the bases for inferences about the status, personality, and disposition of the owner or consumer of these goods." While research agrees that people use visible cues to judge others, evidence also shows that people intentionally manage visible cues. For example, consumers tend to buy luxury products that are visible to others for symbolic reasons such as social status (Wilcox, Kim, & Sen, 2009) or buy apparel because they like the design. In the context of wearables, Chuah et al. (2016) showed that people tend to evaluate wearables based on visibility, and theorize that this is due to the impression such wearables make on others. Therefore, the current study argues that ARSGs, having a fashion component, can be intentionally used by consumers to improve how they are perceived by their peers. Thus, we introduce the construct "symbolic benefits" to the ARSG literature and theorize it as an antecedent to the adoption of ARSGs:

H3. Expected symbolic benefits have a positive effect on consumers' intention to adopt ARSGs.

3.3. Perceived risks

3.3.1. Perceived risk to one's own privacy

A core criticism of ARSGs is the potential threat to privacy. There are two possible threats to privacy: threatening the privacy of the user (H4) and of the people associated with the user (H5). A user's own privacy could be threatened if, for example, hackers gain access to a device. Concern about privacy threats thwarts the development of trust in the technology and results in a perceived risk of privacy loss (Connolly & Bannister, 2007). As people generally care about their privacy, the risk of losing it is likely to be a barrier to adoption.

H4. Perceived risk to personal privacy has a negative effect on consumers' intention to adopt ARSGs.

3.3.2. Perceived risk to other people's privacy

As ARSGs automatically screen and process a user's environment, the privacy of both users and those around them can be affected. In decision making, people often consider how other people perceive their behavior (Nolan et al., 2008). In this vein, research shows that two normative factors are particularly persuasive in situations where behavior is visible to others: perceptions about what most people do, and what other people – particularly those who are important to a person – expect that person to do (Nolan et al., 2008). Likewise, most people are generally interested in maintaining their social relationships and avoiding interpersonal conflicts. This likely includes a person's desire to avoid negative reactions from his or her peers arising from the fear of threatening their privacy. Indeed, anecdotal evidence of negative reactions from other people who fear for their privacy comes from news sources that report various cases in which Google Glass users were physically assaulted by non-users (Eisenmann et al., 2014). Thus, we propose that the more a consumer believes that ARSGs threaten the privacy of other people (termed 'other people's privacy'), the more negatively this belief influences consumer behavior.

In addition, two other assumptions are important to note. First, a user can only threaten the privacy of other people in the presence of other people. Second, prior research also shows that the kind of people and the context of an action matter in decision making. For example, if a person perceives other people as similar or particularly relevant to them, they consider social conformity more than if those people are different from them (Goldstein et al., 2008). Therefore, we will investigate further theoretical mechanisms and boundary conditions to explain why and when a threat to other people's privacy matters in Study 2.


H5. Perceived risk to other people's privacy has a negative effect on consumers' intention to adopt ARSGs.

3.3.3. Perceived risk of losing autonomy

Research has examined the role of new technologies in heightening users' fears of being controlled, often in the context of users' perceived autonomy (Walter & Lopez, 2008). Rooted in self-determination theory (Deci & Ryan, 2002), perceived autonomy reflects the degree to which individuals believe they have full autonomy over their actions without external interference in a certain situation. When individuals believe their choices depend on their autonomous decisions, they feel psychologically free and are intrinsically motivated (Deci & Ryan, 2002). Conversely, in a situation in which choice depends largely on external factors with little or no autonomous decision making, people feel a loss of autonomy over their actions and are therefore reluctant to act. This situation is most obvious in the use of new technologies such as self-service technology. The literature suggests that perceived control over a technology affects consumers' tendencies to adopt and use the technology (Wünderlich, Wangenheim, & Bitner, 2013). Perceived control refers to users' confidence in using technology at their disposal to achieve desirable outcomes (Lee & Allaway, 2002). Accordingly, we hypothesize:

H6. Loss of autonomy has a negative effect on consumers' intention to adopt ARSGs.
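Taken together, H1–H6 imply a single structural equation for adoption intention. The restatement below is our schematic summary (the notation is ours, not the authors'), with $\mathbf{c}_i$ collecting the Study 1 controls (age, gender, ease of use, familiarity with ARSGs, and device type):

$$
\text{Adoption}_i =
\underbrace{\beta_1\,\mathrm{UB}_i + \beta_2\,\mathrm{HB}_i + \beta_3\,\mathrm{SB}_i}_{\text{expected benefits (H1–H3): }\beta_{1,2,3}>0}
+ \underbrace{\beta_4\,\mathrm{PrivSelf}_i + \beta_5\,\mathrm{PrivOther}_i + \beta_6\,\mathrm{Auton}_i}_{\text{perceived risks (H4–H6): }\beta_{4,5,6}<0}
+ \boldsymbol{\gamma}^{\top}\mathbf{c}_i + \varepsilon_i
$$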

4. Study 1: Antecedents to the adoption of augmented reality smart glasses

4.1. Research design

The purpose of Study 1 is to investigate the relationship between the expected benefits and perceived risks of ARSGs and adoption. A total of 285 students (58% male; age: M = 23.2 years, SD = 4.6) at a North American university took part in an online survey for partial course credit. The questionnaire began with a brief description of ARSGs (see Appendix B). Respondents were then randomly assigned to one of seven ARSG device groups (Microsoft HoloLens, Epson Moverio, Sony SmartEyeglass, ZeissGlasses, Everysight Raptor, Google Glass, and ODG R-7). Two photos of the device were shown to each respondent before and while all device-specific measures were administered. We adopted multi-item scales from the literature to measure each construct (see Appendix A). Confirmatory factor analysis and tests for common method variance and discriminant validity did not elicit any concerns (χ2 = 6445.2, df = 465, p < .001; CFI = 0.95; TLI = 0.94; RMSEA = 0.05; SRMR = 0.04).

Fig. 1. Study 1.

4.2. Results

We applied structural equation modeling in Mplus 7.1 and controlled for age, gender, ease of use, familiarity with ARSGs in general, and the type of device. An assessment of the overall model fit did not elicit any concerns (χ2 = 7475.8; df = 752; p < .001; CFI = 0.93; TLI = 0.93; RMSEA = 0.05; SRMR = 0.05). Fig. 1 depicts the results. All three expected benefits are significantly related to adoption (utilitarian: βH1 = 0.388, p < .001; hedonic: βH2 = 0.224, p < .01; symbolic: βH3 = 0.165, p < .01), in support of H1–H3. The results do not show any significant effects for the risk of threatening one's own privacy (βH4 = 0.022, p = .42) or losing autonomy (βH6 = 0.086, p = .13), thus rejecting H4 and H6. However, the results show a weak effect for the risk of threatening other people's privacy (βH5 = −0.169, p < .05), in support of H5.
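For readers who want to reproduce this style of analysis outside Mplus, the sketch below shows how the measurement and structural model could be specified in Python with the semopy package. This is a minimal illustration under stated assumptions, not the authors' code: semopy stands in for Mplus 7.1, and the indicator names and data file are hypothetical placeholders.

```python
# Hedged sketch of the Study 1 SEM in Python/semopy (the paper used
# Mplus 7.1). Indicator and file names are hypothetical; only the
# hypothesized paths H1-H6 and the controls follow the paper.
import pandas as pd
import semopy

MODEL_DESC = """
# measurement model (indicators per construct, cf. Appendix A)
UTIL       =~ util1 + util2 + util3
HED        =~ hed1 + hed2 + hed3
SYM        =~ sym1 + sym2 + sym3
PRIV_SELF  =~ ps1 + ps2 + ps3 + ps4 + ps5
PRIV_OTHER =~ po1 + po2 + po3 + po4 + po5
AUTONOMY   =~ aut1 + aut2 + aut3 + aut4
ADOPT      =~ adopt1 + adopt2

# structural model: H1-H3 (benefits) and H4-H6 (risks), plus controls
ADOPT ~ UTIL + HED + SYM + PRIV_SELF + PRIV_OTHER + AUTONOMY + age + gender
"""

df = pd.read_csv("arsg_survey.csv")  # hypothetical item-level survey data

model = semopy.Model(MODEL_DESC)
model.fit(df)

print(model.inspect())             # path estimates (the betas for H1-H6)
print(semopy.calc_stats(model).T)  # fit indices (chi-square, CFI, RMSEA, ...)
```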

4.3. Discussion

Study 1 shows that all expected benefits affect consumers' reactions to ARSGs. In contrast, only one of the proposed risks—the risk of threatening other people's privacy—showed a significant effect.



Fig. 2. Study 2 (privacy-related findings).

5. Study 2: A closer look at privacy risks

Findings of the prior study suggest that people tend to be influenced by the threat to other people's privacy but not to their own. To further explore this counter-intuitive finding, we conducted a qualitative post hoc study¹ to better understand the mechanisms underlying the effects of the two different privacy risks.

5.1. Research process and data analysis

We used unstructured interviews in our qualitative study, a technique that provides researchers with maximum flexibility. This methodology does not risk imposing predefined mental structures on the informants (as semi-structured interviews often do) and allows for the generation of contextual, nuanced, and authentic accounts of informants' experiences and how they interpret them (Schultze & Avital, 2011). Our sample consists of 21 people with different demographic backgrounds and technology experience (see Appendix C), who replied to the researchers' invitation email and agreed to participate in a scientific study on technologies. Interviews lasted from 20 to 60 min each (M = 34). We encountered the first redundancies after approximately twelve to fourteen interviews and stopped data collection when all involved researchers agreed that a state of theoretical saturation had occurred (Guest, Bunce, & Johnson, 2006).

The interviews started with a vague explanation of the study background, i.e., a study on concerns about innovative technologies. After guaranteeing anonymity and obtaining approval for recording, respondents were asked to introduce themselves and to explain their

technology experience and use in general. These questions served as opening questions to alleviate formality. We then shifted the conversation to ARSGs and asked for their assessment, evaluation, and prior experience (e.g., "How do you evaluate ARSGs?", "What are the advantages and disadvantages of ARSGs?"). Once respondents mentioned privacy-related topics, we directed the conversation to underlying mechanisms by asking stimulating questions (Corbin & Strauss, 2015), such as "Please tell me more about this", "Can you tell me about a situation in which this has happened or could happen?", or "How serious is that concern?". Once respondents did not mention additional insights, we debriefed them and thanked them for their participation.

We followed an iterative process for data analysis, oscillating between data and theory (Corbin & Strauss, 2015). Building on suggestions from thematic analysis (Boyatzis, 1998; Braun & Clarke, 2006), we identified privacy-related themes in the transcripts and integrated our findings into our conceptual framework (Fig. 2). Following common procedures in qualitative research (Epp & Price, 2010), we analyzed our data using open (Miles & Huberman, 1994), axial (Goulding, 2005), and selective coding (Spiggle, 1994). To provide evidence for our analysis and interpretations, we include and discuss quotes from our informants throughout the findings section (Patton, 2015), which is presented below.
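To make the bookkeeping behind this coding process concrete, the snippet below tallies how many distinct respondents and quotes support each coded theme; the findings below report counts in exactly this form (e.g., "10 respondents, 17 quotes"). The coded pairs are invented examples for illustration, not the study's data.

```python
# Illustrative tally of coded (respondent, theme) pairs from the
# transcripts; the pairs below are invented placeholders.
from collections import defaultdict

coded_quotes = [
    ("Daisy", "flexibility_and_control"),
    ("Jona", "flexibility_and_control"),
    ("Marc", "resignation"),
    ("Alex", "resignation"),
    ("Derek", "nothing_to_hide"),
    ("Stacey", "abstractness_of_consequences"),
]

quotes_per_theme = defaultdict(int)
respondents_per_theme = defaultdict(set)
for respondent, theme in coded_quotes:
    quotes_per_theme[theme] += 1
    respondents_per_theme[theme].add(respondent)

for theme, n_quotes in quotes_per_theme.items():
    print(f"{theme}: {len(respondents_per_theme[theme])} respondents, "
          f"{n_quotes} quotes")
```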

5.2. Findings Fig. 2 summarizes our findings exploring the effects of the two privacy risk constructs from the main model. It presents mechanisms that explain why (box “mechanisms”) and when (box “boundary conditions”) the perceived risk to other people's privacy impacts consumer behavior, as well as themes explaining why the established risk factor to a user's own privacy does not show significant effects in the quantitative study (box “preventers”).

¹ We thank an anonymous reviewer for this suggestion. The post hoc study's interviews were centered on privacy risks; we only discuss privacy-related findings relevant to the results of H4 and H5 in the paper.


5.2.1. Mechanisms related to a user's privacy

Study 1 indicates that consumers do not incorporate the threat to their personal privacy into their decision making when it comes to ARSGs. The researchers identified 56 quotations that refer to four explanations for this finding, which we term "preventers" (cf. Fig. 2), since these factors 'prevent' existing privacy risks from influencing behavior.

5.2.1.1. Flexibility and control. Six interviewees stated in eleven different quotes that their own privacy was not a big concern, partially because they perceived strong control over ARSGs. Interviewees argued they could remove or switch off ARSGs in very personal situations. For example, Daisy explained that she would just take the ARSGs off in private situations, such as at home, where hackers could see very personal things. Another interviewee expressed that he would not use ARSGs until he was familiar with the technology, including the underlying data collection mechanisms. Jona stated that privacy issues would lead to a more careful use of ARSGs:

"Privacy-wise [third parties and hackers] can see everything I see, which is why I'd have to be careful when I wear them."

5.2.1.2. Resignation. Fifteen interviewees also compared the risk ARSGs pose to their privacy with the threat from other media and technologies they were already using, such as smartphones, smartwatches, or Facebook. In 21 different quotes, they indicated their resignation regarding privacy. While interviewees generally stated that ARSGs could collect more information than existing platforms or devices, some of them stated explicitly that the kinds of threats were similar. Interviewees showed a form of resignation: threats to their privacy were part of the evolution of digitization, they had to live with that threat, and there was little a user could change. Marc even argued that we were living in an era with 'no privacy at all'. Several interviewees suggested that the loss of privacy was the cost of enjoying the benefits of effective and affordable digital assets in our lives. The following two examples indicate that people "pay" with their privacy and – at least to a certain extent – accept this as an unchangeable development:

"I like technology; it makes things easier. But generally, when we are gonna utilize a technology, we have to sacrifice privacy." (Aaron)

"[Google] is recording everything on your phone. Let's say while we are sitting over here right now having this conversation, Google might be recording it….I feel like that is happening right now…when it comes to Google glasses, it's not really anything different. That can happen, we live in an age right now where there's no privacy, we are leaving an imprint everywhere. So Google glasses would not be any different." (Alex)

5.2.1.3. Abstractness of consequences. Another theme emerged around potential consequences, indicated by six respondents (in seven quotes). Even though people are aware of privacy risks, some of the interviewees mentioned that the consequences are somehow abstract, vague, or unrealistic. For example, one interviewee shifted the conversation to her smartphone, mentioned that similar risks existed there, and stated that "I don't really think about it…that someone could hack into my phone" (Stacey). When asked about the potential consequences of someone hacking into their ARSGs, most interviewees did not come up with specific ideas. In contrast, some interviewees came up with more specific consequences of data breaches for other technologies. One frequently mentioned example was online banking, where hackers could gain access to their financial resources. While interviewees generally were aware that their privacy could be threatened, most of them could not clearly articulate the consequences of having their ARSGs hacked and expressed this more vaguely, such as 'it feels bad'.

5.2.1.4. Nothing to hide. A frequently mentioned topic (10 respondents, 17 quotes) concerning personal privacy was the statement that consumers had 'nothing to hide'. In specific situations, many of the interviewees did not care if someone had access to their personal information. For example, some interviewees argued that their lives were not that interesting, so they doubted that other people would be interested in collecting their personal information. Likewise, Derek stated that he did not have many secrets to hide from others, as he was not involved in any unethical or illegal actions:

"I am not sure why it could be interesting to hack into my glasses. And if so, I'd have nothing to hide. The only people that complain are those that engage in any criminal actions. The same as with video surveillance. Who cares about being recorded while walking through a store unless you are a shoplifter?"

5.2.2. Mechanisms related to other people's privacy

We also identified six separate factors (in 84 different quotes) that explain why the risk of threatening other people's privacy affects consumer reactions negatively. We discuss these mechanisms in detail in the subsequent sections.

5.2.2.1. Legal fears. Six interviewees argued (in nine quotes) that using a technology that could track and collect data about other people violated their personal rights. Some interviewees referred to the U.S. Constitution and other forms of personal rights. Interviewees expressed this with terms such as 'huge legal violations' (Marc), 'violating the rights of people' (Alex), and other legal concerns. Interviewees associated these violations with illegal actions such as secret recordings of people or collecting and processing personal information for which they as users feel lawfully responsible. For example, Marc stated that he would be afraid of any legal consequences, especially since "nobody knows what is allowed and what is not".

5.2.2.2. Ethical fears. While legal fears are associated with what is or is not permissible, another theme emerged among 16 respondents (24 quotes) with regard to what is morally acceptable. Many interviewees perceived the collection of information about others, especially if those others were not aware of it, as highly unethical. One interviewee, for example, referred to the "old golden rule: Treat other people how you want to be treated" (Aric). He further expressed that he expected his privacy to be respected by other people, so he would not violate other people's privacy in return. These ethical fears are especially prevalent for ARSGs due to their inconspicuous design (some of them look very similar to traditional glasses), which means many people might not be aware of them.

Daisy: "Looking at other people and tracking them without even asking them…I think that's a huge violation"

Interviewer: "Ethically or legally?"

Daisy: "ehm…especially ethically…probably legally, I am not sure what the legal issues are…but definitely ethically….I feel everyone should have their own privacy, and by wearing them, I'd violate it"

5.2.2.3. Artificial environment. Some of the interviewees (seven, in nine quotes) felt that threatening other people's privacy could alter their social interactions with peers who were skeptical about ARSGs. Interviewees used terms such as 'like in a fake world' (Daisy), 'untrue behavior' (Lena), 'less communicative' (Susan), 'artificial conversations' (Jona), and 'less relaxed' (Derek) as examples of how their peers would interact with them. Interviewees argued that peers who perceived ARSGs as a threat to their privacy could act in a socially desirable manner, not expressing critical thoughts or particular feelings, and being less communicative. For example, Nina stated:


"People would not want to communicate and interact with you as much if they knew that information was recorded. They don't really know what you're capable of with those glasses so I think people would try to avoid communication."

5.2.2.4. Negative norms. Related to the topics discussed before, there was a consensus among 12 interviewees (16 quotes) that threatening other people's privacy would lead to negative reactions from other people – for example, people asking ARSG wearers to stop using the technology. Extreme examples of this have been reported in the media, where Google Glass users have been physically attacked (Eisenmann et al., 2014). Prior technology research has studied the role of social norms, defined as the degree to which a person's peers expect him or her to adopt a particular technology (ranging from low to high). In this study, we identified a novel construct that we term 'negative social norms', indicating that a user's peers might expect that he or she not use ARSGs. Negative social norms are therefore not 'weak' social norms (i.e., not: "I do not expect that you use it") but actually the reverse (i.e., "I expect that you do not use it"). The magnitude of negative social norms, however, depends on the context. For example, Aric stated that he'd get 'different reactions' based on who the people around him were, and John even stated that non-users might 'hate' being surrounded by ARSG users, and if this were the case, he would be "socially shamed and probably have to take it off…because it's making everyone else around me uncomfortable". Other interviewees explained that peers might be more tolerant in certain situations (e.g., a work environment being monitored by cameras) than in others (e.g., being out with friends).

Derek: "I can't imagine that anyone of my friends would like the idea of having me wear smart glasses while being around them."

Interviewer: "Tell me more about this assumption."

Derek: "Well, they would question what I am doing, if I am recording or analyzing them or whatever. I mean, they might not think that I want to intentionally put them at risk…at least my friends, but they might be afraid that I do not have control over what the technology is doing in the background…I think most of my friends would expect me to put down the glasses while being around them."

5.2.2.5. Negative perception. In Study 1, we showed that high symbolic benefits drive consumers' intended adoption of ARSGs. However, wearing ARSGs might also lead to negative reactions from peers, particularly because of privacy concerns, and such a perception is mostly associated with a user's character. Eleven interviewees expressed this fear in various terms in 13 quotes. For example, Daisy argued that she'd be perceived as 'disrespecting others', i.e., as a person who did not care about other people. Other statements related to being perceived as a person who was 'rude', had 'a bad personality', or was 'egoistic'. Some interviewees even used the term 'Glasshole'.

5.2.2.6. Protection of peers. Finally, ten interviewees associated their concerns about others' privacy with a willingness to protect their peers (in 13 different quotes). In other words, interviewees who felt that wearing ARSGs could threaten other people's privacy mentioned that they might not use the device in certain situations in order to protect people. John stated that he would like to see regulations in the technology and software that manage the protection of people around him. One interviewee even argued that it was her responsibility to protect the people who were important to her, such as her children. She argued that collecting data about them could 'put them at risk', and thus she felt negatively about the idea of wearing ARSGs when being around them.

5.2.2.7. Boundary conditions. First, the presence of other people seems to matter. Nine informants supported this by discussing the identified mechanisms with examples where other people were present (ten different quotes). As discussed in the hypothesis section, this boundary condition is plausible, as other people's privacy can only be violated if other people are present. Second, the usage context seems to matter, as indicated by 16 respondents (in 22 quotes). For example, some interviewees referred to specific usage contexts where privacy, in general, was not prevalent, such as a workplace environment. In the workplace, supervisors and co-workers might observe one another, cameras might surveil the workspace, or work behavior is tracked regardless. Whereas this specific example is in line with the findings of Study 2, other examples include particular situations where other people's privacy is even more important. For example, when people interact with friends and talk about very personal things, they care about their privacy, whereas the situation can be less relevant when conversations are less private or deep. For example, Lena stated:

"If my girlfriend talks and shares some very personal problems with me, or if we have a conversation that's deeper than just small talk, it is my responsibility to keep this secret."

5.3. Discussion of Study 2

Study 2 sheds additional light on the underlying mechanisms of the two privacy constructs. The results of this study identify different mechanisms through which privacy concerns affect people's reactions to ARSGs. On a theoretical level, the theory of bounded rationality may provide an explanation (Smith et al., 2011): people generally tend to act as 'satisficers', who seek a satisfactory problem solution rather than an optimal one (Smith et al., 2011). Given the existence of two similar risks (i.e., to one's own and to other people's privacy), humans are more sensitive to the one whose consequences might occur sooner rather than later ("hyperbolic discounting", see O'Donoghue & Rabin, 2001; Smith et al., 2011). With regard to ARSGs, the immediacy of the consequences of a privacy violation may play a role in people's reactions to the technology. Although people are aware that hackers may steal personal data by hacking the device (as reflected in Study 2), leading to identity theft, such a consequence is only a remote possibility and may happen far in the future. For example, 'flexibility and control' can serve as a strategy to reduce the likelihood of these risks, and 'nothing to hide', 'resignation', and 'abstractness of consequences' (i.e., the preventers in Fig. 2) are reasons why the consequences of threats to one's own privacy do not carry much weight. In contrast, when threatening other people's privacy, many of the consequences (i.e., the mechanisms listed in Fig. 2) can happen immediately. For example, a user might immediately experience ethical or legal fears, and other people might immediately change their behaviors, react negatively to ARSG users, and so forth.

6. General discussion

6.1. Summary of findings

Although recent market prognostications suggest that the adoption of ARSGs should be explosive (Goldman Sachs, 2016), knowledge regarding success factors is lacking. This investigation provides a first attempt to address this research gap. While utilitarian, hedonic, and symbolic benefits are positively related to the adoption of ARSGs, only the perceived risk of threatening other people's privacy showed significant effects among the proposed risks. The qualitative study provides further insights into the counter-intuitive findings on privacy concerns. These findings lead to several implications.



6.2. Theoretical contributions


The first major theoretical contribution of this research is a better understanding of ARSGs achieved by combining the research streams of technology, media, and privacy into an investigatory framework. Most of the existing scholarship studying new technologies has utilized each of these disparate streams independently, but this study synthesizes the relevant findings of each stream to investigate ARSGs. Our empirical testing of the proposed framework provides theoretically notable findings. In line with prior technology and media research, utilitarian, hedonic, and symbolic benefits determine consumers' reactions to ARSGs (Rauschnabel, 2018). However, while some symbolic factors such as norms or image appear in the management literature (e.g., Venkatesh et al., 2012), no prior research has specifically examined the degree to which a wearable technology influences users' symbolic appearance to others. ARSGs are not just a technology but also, to some extent, a fashion accessory. In addition, consumers tend to value the potential of ARSGs to make their lives more efficient (utilitarian benefits) and enjoyable (hedonic benefits).

Prior research has also shown that privacy concerns reduce a user's intention to adopt a technology. In Study 1, we could not confirm this effect for ARSGs, which is in line with Rauschnabel and Ro's (2016) study on Google Glass. Instead, the novel construct – the perceived threat to other people's privacy – seems to matter. Study 2 provided qualitative insights into these counter-intuitive findings on privacy, which we discuss below.

Perceived loss of autonomy also did not exert the expected effects in Study 1. This result deviates from previous research on service and self-service technologies (e.g., Wünderlich et al., 2013). A plausible explanation may be that people view ARSGs as an innovative but immature technology with underdeveloped applications. In an early stage of exploratory use, people do not expect complete autonomy over a technology. Further research is necessary to investigate this link, especially in later stages of the product life cycle.

The second major contribution addresses the domain of privacy research (Connolly & Bannister, 2007; Nissenbaum, 2010). The topic of privacy concerns is not just one of the core publicly raised criticisms of ARSGs; it also has a long tradition in the literature (e.g., Collier, 1995). Study 1 did not confirm this link for ARSGs and thus replicates Rauschnabel and Ro (2016). However, so far no explanation for this counter-intuitive finding has been offered. The findings of Study 2 (Fig. 2; box "Preventers") indicate that while people are aware of these risks, the risks do not impact their behavior because people feel a certain level of control over the device (e.g., turning it off in specific situations), do not feel that their personal life is worth hiding, and/or see the consequences as abstract. Moreover, many respondents reported a feeling of resignation, meaning that they accept living in a world in which privacy no longer exists.

In addition, we extended ARSG privacy research by introducing a new dimension of privacy concerns: the degree to which using ARSGs can threaten other people's privacy. This risk is negatively associated with consumers' reactions in both studies, suggesting that people tend to incorporate other people's privacy concerns more than their own. Results of Study 2 indicate that this is influenced by ethical and legal fears, the potential of creating an artificial environment, and negative normative reactions from other people. In addition, consumers expressed a willingness to protect their peers from the consequences of privacy threats; peers might otherwise form a negative perception of a user. The importance of other people's privacy seems to be more pronounced in situations where other people are present and in situations where people expect privacy (e.g., in personal conversations, but not at work). Therefore, besides introducing and validating a novel perspective on ARSG privacy, this research also introduces six mechanisms that explain these findings.

6.3. Managerial implications

The main managerial implications of this research address manufacturers of ARSGs and thus complement those on AR in general (e.g., Javornik, 2016a, b). Manufacturers should decide whether to position their ARSG devices more as a technology for the workplace or as 'fashnology', i.e., a fashionable device that is worn outside the workplace (Rauschnabel, 2018). When targeting enterprise markets, the positioning should highlight utilitarian benefits but also incorporate symbolic aspects. In consumer markets, hedonic value also plays an important role. In the following paragraphs, we discuss some possible strategies manufacturers can apply.

Without a doubt, symbolic benefits require a well-designed device, but beyond that, they require additional symbolic associations. Haque (2015) suggests associating ARSGs with 'coolness' and a freedom that liberates "people from established ideas and norms" (p. 1). Therefore, focusing on existing norms regarding how conventional glasses appear when worn might not be the only way to think about possible design strategies for ARSGs. In fact, many successful fashion and technology trends were substantially different from existing ones when they emerged. For example, the iPhone was the first mass-market cellphone without a physical keypad, and initially, many consumers might have perceived this design feature as strange; it is now the de facto standard for smartphones. One promising way to develop symbolic designs, including a 'coolness' factor and an associated lifestyle, is to collaborate with designers, artists, and/or other brands (e.g., co-branding). Regardless, we contend that the development of ARSGs can substantially benefit from the contribution of fashion designers.

When thinking about utilitarian benefits, manufacturers need to provide benefits that go beyond those of smartphones. For example, Berinato (2015) concluded that Google Glass did not offer more than "a short list of functions that you can easily do on your phone in front of your face". Although still in the developmental stage, Microsoft's HoloLens provides insights into what may be possible in the near future: ARSGs identify objects and locations, 'understand' the real world, and realistically integrate well-designed objects, which goes beyond the 'grabbing of information' that people can also do with a smartphone (Berinato, 2015). For manufacturers, this requires technologies that most existing mobile devices do not yet possess, such as 3D displays, multiple sensors, or indoor object recognition. For app developers, this means that concepts should not just replicate existing applications. For example, retailers often seek ways to offer new shopping experiences. A home improvement company could create an ARSG app in which consumers navigate through an empty room, choose furniture or wall colors, walk around the decorated room, and perhaps even receive personalized recommendations.

Hedonic benefits can be influenced by particular apps. Therefore, manufacturers should promote their devices – especially in consumer markets – as a new opportunity for hedonic scenarios, such as gaming. As discussed for utilitarian benefits, the hedonic benefits should be substantially different from those of existing technologies. Current games provide numerous ideas about how the gaming landscape might be shaped. For example, RoboRaid is a game for HoloLens in which virtual robots break out of 'real' walls; users have to shoot the robots and hide behind 'real' objects in order to avoid being shot themselves. Besides gaming, marketers can systematically assess customer journeys and identify opportunities to make consumers' decision making more pleasant. For example, AR applications could integrate virtual shopping shelves into a user's living room and thus might make shopping more convenient and enjoyable.

Another important area for manufacturers is privacy. Since the findings of this study show that people incorporate threats to other people's privacy into their decision making, manufacturers must address and communicate this appropriately. One way is to ban certain functionalities via software (e.g., face recognition). However, history has shown that hackers often find ways to navigate around such bans. Another


approach could be the installation of physical blinds with which consumers can cover the camera in specific situations, similar to how many people protect their privacy by placing stickers over their laptop cameras. Should manufacturers or app developers therefore neglect a user's privacy? Looking at the results, one could conclude 'yes'. However, besides national privacy laws that require several actions from manufacturers, we argue that companies should still ensure that users and their privacy are well protected.

6.4. Limitations and future research directions

Although the pre-market nature of this research in one country increases the managerial relevance of the findings, caution must be taken in extrapolating the findings to other contexts. In addition, the quantitative study focuses on direct effects only, whereas Study 2 identified some underlying mechanisms. Future studies should consider these findings as a source of inspiration for further empirical research. For example, future studies could investigate the role of the two privacy constructs in more detail. These investigations could compare the mechanisms between ARSGs and other established or innovative technologies. Findings could, for example, identify characteristics of technologies that make the incorporation of one's own vs. other people's privacy more or less likely. This is particularly important given the ever-increasing interconnectedness and ubiquity of intelligent products (Wünderlich et al., 2015). In this research, we focused on one particular wearable device—namely, ARSGs. This leads to the question of the degree to which these findings can be extended to other wearables.

Acknowledgements

The authors gratefully acknowledge valuable feedback from Barry J. Babin, Mike Brady, Petra Riefler, Bryan Krulikowski, Aaron Ahuvia, Reto Felix, Daniel Hein, and the 2017 Summer AMA conference reviewers, track chairs, and attendees. We give our special thanks to the anonymous Journal of Business Research reviewers and Associate Editor Nancy Wünderlich for their invaluable feedback.

Appendix A. Measures

Expected utilitarian benefits (Venkatesh et al., 2012); AVE = 0.79, CR = 0.92
I find these smart glasses can be useful in my daily life.
Using these smart glasses can help me accomplish things more quickly.
Using these smart glasses can increase my productivity.

Expected hedonic benefits (Venkatesh et al., 2012); AVE = 0.88, CR = 0.95
Using these smart glasses can be fun.
Using these smart glasses can be enjoyable.
Using these smart glasses can be very entertaining.

Expected symbolic benefits (Moore & Benbasat, 1991; Richins, 1994); AVE = 0.81, CR = 0.93
Wearing these smart glasses would…
…worsen my appearance. [R]
…make me unattractive to others. [R]
…make me look worse. [R]

Perceived risks to personal privacy (Malhotra et al., 2004); AVE = 0.72, CR = 0.93
These smart glasses would collect too much information about a user.
I would be concerned about my privacy when using these smart glasses.
I have doubts as to how well my privacy is protected while using these smart glasses.
My personal information would be misused when the camera is running.
My personal information would be accessed by unknown parties when using smart glasses in my everyday life.

Perceived risks to other people's privacy (Malhotra et al., 2004); AVE = 0.79, CR = 0.95
These smart glasses would collect too much information about people around me.
I would be concerned about other people's privacy when using these smart glasses.
I have doubts as to how well other people's privacy is protected while using these smart glasses.
Other people's information would be misused when the camera is running.
Other people's information would be accessed by unknown parties when using smart glasses in my everyday life.

Perceived loss of autonomy (Walter & Lopez, 2008); AVE = 0.58, CR = 0.84
Using these smart glasses could decrease my control over daily activities.
Using these smart glasses could decrease my discretion over daily decisions.
Using these smart glasses could decrease my control over each step of various tasks.
Using these smart glasses may increase monitoring of how I do certain things by the glasses manufacturer.

Ease of use (Venkatesh et al., 2012); AVE = 0.68, CR = 0.86
Learning how to use these smart glasses will be easy for me.
I expect these smart glasses to be easy to use.
I assume that it will be easy for me to become skillful at using these smart glasses.

Familiarity with ARSGs (Rauschnabel & Ro, 2016); AVE = 0.80, CR = 0.92
are unfamiliar to me – are familiar to me
I do not recognize – I do recognize
I have never heard of before – I have heard of before

Adoption (Lu, Yao, & Yu, 2005); AVE = 0.72, CR = 0.84
I intend to purchase these smart glasses.
If I have the financial resources, I would buy these smart glasses.
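The AVE and CR values above follow the standard formulas for standardized loadings: AVE = (Σ λi²)/n and CR = (Σ λi)² / ((Σ λi)² + Σ(1 − λi²)). A short sketch with made-up loadings (not the study's estimates) shows the computation; the placeholder values happen to yield AVE ≈ 0.79 and CR ≈ 0.92, the figures reported for the first construct.

```python
# Composite reliability (CR) and average variance extracted (AVE)
# from standardized loadings; the loadings are hypothetical.
import numpy as np

loadings = np.array([0.91, 0.88, 0.87])  # made-up standardized loadings
error_var = 1.0 - loadings**2            # indicator error variances

ave = np.mean(loadings**2)
cr = loadings.sum()**2 / (loadings.sum()**2 + error_var.sum())

print(f"AVE = {ave:.2f}, CR = {cr:.2f}")  # AVE = 0.79, CR = 0.92
```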


Appendix B. Survey description

Smart Glasses Survey

Smart glasses are smart, wearable miniature computers that utilize various sensors (such as cameras, microphones, and GPS) to capture a user's physical environment. This information can then be processed and understood by the smart glasses and overlaid with digital information in the user's field of view, creating an "augmented reality." For instance, smart glasses can recognize buildings, landscapes, texts, or human faces. This physical information can then be augmented with additional virtual information. Some examples of how this works include:

• A user can look at a famous building and receive the corresponding Wikipedia information about that building.
• A user can look at another person and receive his/her public social media profile information.
• Smart glasses can be used as a navigation system that guides a driver through a city by showing her the route in her field of view and warning the user about speed limits or dangers.
• A user looks at a book written in a foreign language, and the smart glasses translate the book into the user's preferred language within their field of view.
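All of these examples follow the same sense–recognize–augment–render loop described above. The sketch below illustrates that loop in schematic form only: every function in it is a hypothetical placeholder rather than an actual ARSG API, and real devices implement each step with specialized hardware and software.

```python
# A minimal, hypothetical sketch of the recognize-and-overlay loop that the
# survey description outlines. All functions are placeholders standing in
# for real device SDK calls; none of this is an actual ARSG API.

from dataclasses import dataclass

@dataclass
class Overlay:
    label: str  # virtual information to render
    x: int      # position in the user's field of view
    y: int

def recognize_objects(frame):
    """Placeholder: a real device would run computer-vision models here."""
    return [("Eiffel Tower", 120, 80)]  # (entity, x, y) detections

def lookup_info(entity):
    """Placeholder: e.g., fetch the Wikipedia summary for a landmark."""
    return f"{entity}: wrought-iron lattice tower in Paris"

def render(frame, overlays):
    """Placeholder: the display would draw overlays into the field of view."""
    for o in overlays:
        print(f"[{o.x},{o.y}] {o.label}")

def process_frame(frame):
    # 1. Sensors capture the physical environment (the camera frame).
    # 2. Recognition turns pixels into entities (buildings, text, faces).
    # 3. Each recognized entity is augmented with virtual information.
    overlays = [Overlay(lookup_info(e), x, y) for e, x, y in recognize_objects(frame)]
    # 4. The augmented view is rendered into the user's field of view.
    render(frame, overlays)

process_frame(frame=None)  # stand-in for a live camera frame
```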

Appendix C. Sample Study 2

No. | Pseudonym | Age/Gender | Background | ARSGs experience*
1 | Stacey | 25/F | Undergraduate degree in dietetics, now MBA student | Low
2 | Tom | 43/M | Junior executive in the media entertainment industry | Very high
3 | Nina | 24/F | Business analyst, has an undergraduate university degree | Low
4 | Jona | 27/M | Support specialist at an automotive service club, full-time employed | Moderate
5 | Rosanna | 37/F | Office secretary and professional performer | Very low
6 | Alex | 35/M | Has an MBA and works for a large automotive company as a data security manager | Moderate
7 | Susan | 35/F | Medical degree, currently a graduate student | Moderate
8 | Lena | 29/F | Housewife with a nursing degree | Very high
9 | Steve | 25/M | Full-time undergraduate business student | High
10 | Aric | 26/M | Business student with a part-time job in construction | High
11 | Anne | 39/F | Business student, former in-home nurse | High
12 | Daisy | 21/F | Undergraduate business student | High
13 | Erin | 23/F | Business student and part-time data analyst | High
14 | John | 37/M | Junior director, data analytics company | Very high
15 | Aaron | 27/M | Social media & communications manager | High
16 | Mike | 40/M | Faculty in MIS | Moderate
17 | Derek | 28/M | Part-time graduate student and consultant in a commercial research institute | Very high
18 | Paula | 45/F | Accountant | Low
19 | Edward | 26/M | Engineer, automotive start-up | High
20 | Ben | 39/M | Consultant, lean healthcare | High
21 | Marc | 32/M | Middle school teacher | Very high

* Including familiarity.

Philipp A. Rauschnabel, PhD, is a Professor of Digital Marketing and Media Innovation at Universität der Bundeswehr München (Munich; effective 9/2018). Prior to that, he held academic positions at the University of Michigan-Dearborn (USA) and Darmstadt University of Applied Sciences (h_da), where this research was conducted. His research addresses contemporary issues in wearable technologies, new media, augmented reality, and branding. He frequently presents research findings at academic and industry conferences and consults for organizations in the fields of media, strategy, market research, and branding.

Jun He is an Associate Professor of MIS in the College of Business at the University of Michigan-Dearborn. He received his M.B.A. from Tsinghua University (China) and his Ph.D. from the University of Pittsburgh. His research interests include technology acceptance, healthcare systems, team behavior and project management, and research methodology. He has presented papers at many national and international conferences and has published in numerous journals, including Communications of the Association for Information Systems, Health and Technology, Information and Management, Journal of Management Information Systems, and Omega, as well as in two books: Current Topics in Management and Planning for IS.

Young K. Ro, Ph.D., is a Professor of Operations Management at the University of Michigan-Dearborn College of Business. He holds a B.S. and an M.S. in industrial engineering from Purdue University and a Ph.D. in industrial and operations engineering from the University of Michigan. He is presently engaged in several research projects concerning product development, supply chain relationships, and the management of technology in businesses.

References

Baldus, B. J., Voorhees, C., & Calantone, R. (2015). Online brand community engagement: Scale development and validation. Journal of Business Research, 68(5), 978–985.
Barney, J. B., & Hansen, M. H. (1994). Trustworthiness as a source of competitive advantage. Strategic Management Journal, 15(S1), 175–190.
Belk, R. W. (1978). Assessing the effects of visible consumption on impression formation. Advances in Consumer Research, 5(1), 39–47.
Berinato, S. (2015). What HoloLens has that Google Glass didn't. Harvard Business Review Blog. https://hbr.org/2015/01/what-hololens-has-that-google-glass-didnt
Bierhoff, H. W. (1989). Person perception and attribution. New York: Springer.
Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. Thousand Oaks, CA: Sage.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
Chuah, S. H. W., Rauschnabel, P. A., Krey, N., Nguyen, B., Ramayah, T., & Lade, S. (2016). Wearable technologies: The role of usefulness and visibility in smartwatch adoption. Computers in Human Behavior, 65, 276–284.
Collier, G. (1995). Information privacy. Information Management & Computer Security, 3(1), 41–45.
Connolly, R., & Bannister, F. (2007). Consumer trust in Internet shopping in Ireland: Towards the development of a more effective trust measurement instrument. Journal of Information Technology, 22, 102–118.
Corbin, J., & Strauss, A. (2015). Basics of qualitative research: Techniques and procedures for developing grounded theory. Thousand Oaks, CA: Sage.
Craig, A. B. (2013). Understanding augmented reality: Concepts and applications. Waltham, MA: Elsevier.
Debatin, B., Lovejoy, J. P., Horn, A. K., & Hughes, B. N. (2009). Facebook and online privacy: Attitudes, behaviors, and unintended consequences. Journal of Computer-Mediated Communication, 15(1), 83–108.
Deci, E. L., & Ryan, R. M. (Eds.). (2002). Handbook of self-determination research. Rochester, NY: University of Rochester Press.
Eighmey, J., & McCord, L. (1998). Adding value in the information age: Uses and gratifications of sites on the World Wide Web. Journal of Business Research, 41(3), 187–194.
Eisenmann, T., Barley, L., & Kind, L. (2014). Google Glass. Harvard Business School (Case Study).
Epp, A. M., & Price, L. (2010). The storied life of singularized objects: Forces of agency and network transformation. Journal of Consumer Research, 36(5), 820–837.
Goldman Sachs (2016). Virtual & augmented reality. Retrieved from http://goldmansachs.com/our-thinking/pages/technology-driving-innovation-folder/virtual-and-augmented-reality/report.pdf, Accessed date: 31 March 2016.
Goldstein, N. J., Cialdini, R. B., & Griskevicius, V. (2008). A room with a viewpoint: Using social norms to motivate environmental conservation in hotels. Journal of Consumer Research, 35(3), 472–482.
Goulding, C. (2005). Grounded theory, ethnography and phenomenology: A comparative analysis of three qualitative strategies for marketing research. European Journal of Marketing, 39(3/4), 294–308.
Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18(1), 59–82.
Haque, U. (2015). Google Glass failed because it just wasn't cool. Harvard Business Review Blog. https://hbr.org/2015/01/google-glass-failed-because-it-just-wasnt-cool
Holman, R. H. (1980). Clothing as communication: An empirical investigation. Advances in Consumer Research, 7(1), 372–377.
Javornik, A. (2016a). Augmented reality: Research agenda for studying the impact of its media characteristics on consumer behaviour. Journal of Retailing and Consumer Services, 30, 252–261.
Javornik, A. (2016b). 'It's an illusion, but it looks real!' Consumer affective, cognitive and behavioural responses to augmented reality applications. Journal of Marketing Management, 32(9–10), 987–1011.
Jung, T., Chung, N., & Leue, M. C. (2015). The determinants of recommendations to use augmented reality technologies: The case of a Korean theme park. Tourism Management, 49, 75–86.
Kalantari, M. (2017). Consumers' adoption of wearable technologies: Literature review, synthesis, and future research agenda. International Journal of Technology Marketing, 12(3), 274–307.
Katz, E. (1959). Mass communications research and the study of popular culture: An editorial note on a possible future for this journal. Studies in Public Communication, 2, 1–6.
King, W. R., & He, J. (2006). A meta-analysis of the technology acceptance model. Information & Management, 43(6), 740–755.
Lee, J., & Allaway, A. (2002). Effects of personal control on adoption of self-service technology innovations. Journal of Services Marketing, 16(6), 553–572.
Lu, J., Yao, J. E., & Yu, C. S. (2005). Personal innovativeness, social influences and adoption of wireless Internet services via mobile technology. Journal of Strategic Information Systems, 14(3), 245–268.
Malhotra, N. K., Kim, S., & Agarwal, J. (2004). Internet users' information privacy concerns (IUIPC): The construct, the scale, and a causal model. Information Systems Research, 15(4), 336–355.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: A sourcebook. Beverly Hills, CA: Sage.
Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perceptions of adopting an information technology innovation. Information Systems Research, 2(3), 192–222.
Nissenbaum, H. (2010). Privacy in context: Technology, policy, and the integrity of social life. Stanford, CA: Stanford Law & Politics.
Nolan, J. M., Schultz, P. W., Cialdini, R. B., Goldstein, N. J., & Griskevicius, V. (2008). Normative social influence is underdetected. Personality and Social Psychology Bulletin, 34(7), 913–923.
Nysveen, H., Pedersen, P. E., & Thorbjørnsen, H. (2005). Intentions to use mobile services: Antecedents and cross-service comparisons. Journal of the Academy of Marketing Science, 33(3), 330–346.
O'Donoghue, T., & Rabin, M. (2001). Choice and procrastination. The Quarterly Journal of Economics, 116(1), 121–160.
Pantano, E., Rese, A., & Baier, D. (2017). Enhancing the online decision-making process by using augmented reality: A two country comparison of youth markets. Journal of Retailing and Consumer Services, 38, 81–95.
Patton, M. Q. (2015). Qualitative evaluation and research methods (4th ed.). Newbury Park, CA: Sage.
Rauschnabel, P. A. (2018). Virtually enhancing the real world with holograms: An exploration of expected gratifications of using augmented reality smart glasses. Psychology and Marketing, 35(8), 557–572.
Rauschnabel, P. A., Brem, A., & Ivens, B. (2015). Who will buy smart glasses? Empirical results of two pre-market-entry studies on the role of personality in individual awareness and intended adoption of Google Glass wearables. Computers in Human Behavior, 49, 635–647.
Rauschnabel, P. A., & Ro, Y. K. (2016). Augmented reality smart glasses: An investigation of technology acceptance drivers. International Journal of Technology Marketing, 11(2), 123–148.
Richins, M. L. (1994). Special possessions and the expression of material values. Journal of Consumer Research, 21(3), 522–533.
Ruggiero, T. E. (2000). Uses and gratifications theory in the 21st century. Mass Communication & Society, 3(1), 3–37.
Scholz, J., & Duffy, K. (2018). We ARe at home: How augmented reality reshapes mobile marketing and consumer-brand relationships. Journal of Retailing and Consumer Services, 44, 11–23.
Scholz, J., & Smith, A. N. (2016). Augmented reality: Designing immersive experiences that maximize consumer engagement. Business Horizons, 59(2), 149–161.
Schultze, U., & Avital, M. (2011). Designing interviews to generate rich data for information systems research. Information and Organization, 21(1), 1–16.
Sheldon, P. (2008). Student favorite: Facebook and motives for its use. Southwestern Mass Communication Journal, 23(2), 39–53.
Smith, H. J., Dinev, T., & Xu, H. (2011). Information privacy research: An interdisciplinary review. MIS Quarterly, 35(4), 989–1016.
Spiggle, S. (1994). Analysis and interpretation of qualitative data in consumer research. Journal of Consumer Research, 21(3), 491–503.
tom Dieck, M. C., & Jung, T. (2015). A theoretical model of mobile augmented reality acceptance in urban heritage tourism. Current Issues in Tourism, 1–21.
Tunca, S., & Fueller, J. (2009). Impression formation in a world full of fake products. Advances in Consumer Research, 36, 287–292.
Venkatesh, V., Thong, J. Y., & Xu, X. (2012). Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Quarterly, 36(1), 157–178.
Walter, Z., & Lopez, M. S. (2008). Physician acceptance of information technologies: Role of perceived threat to professional autonomy. Decision Support Systems, 46(1), 206–215.
Weiz, D., Anand, G., & Ernst, C. P. H. (2016). The influence of subjective norm on the usage of smartglasses. In C. P. H. Ernst (Ed.), The drivers of wearable device usage (pp. 1–11). Springer.
Wilcox, K., Kim, H. M., & Sen, S. (2009). Why do consumers buy counterfeit luxury brands? Journal of Marketing Research, 46(2), 247–259.
Wünderlich, N. V., Heinonen, K., Ostrom, A. L., Patricio, L., Sousa, R., Voss, C., & Lemmink, J. G. (2015). "Futurizing" smart service: Implications for service researchers and managers. Journal of Services Marketing, 29(6/7), 442–447.
Wünderlich, N. V., Wangenheim, F. V., & Bitner, M. J. (2013). High tech and high touch: A framework for understanding user attitudes and behaviors related to smart interactive services. Journal of Service Research, 16(1), 3–20.
Yang, H., Yu, J., Zo, H., & Choi, M. (2016). User acceptance of wearable devices: An extended perspective of perceived value. Telematics and Informatics, 33(2), 256–269.
