Accepted Manuscript

Do you Want to be a Cyborg? The Moderating Effect of Ethics on Neural Implant Acceptance
Eva Reinares-Lara, Cristina Olarte-Pascual, Jorge Pelegrín-Borondo

PII: S0747-5632(18)30137-7
DOI: 10.1016/j.chb.2018.03.032
Reference: CHB 5436
To appear in: Computers in Human Behavior
Received Date: 13 October 2017
Revised Date: 13 January 2018
Accepted Date: 20 March 2018
Please cite this article as: Eva Reinares-Lara, Cristina Olarte-Pascual, Jorge Pelegrín-Borondo, Do you Want to be a Cyborg? The Moderating Effect of Ethics on Neural Implant Acceptance, Computers in Human Behavior (2018), doi: 10.1016/j.chb.2018.03.032
DO YOU WANT TO BE A CYBORG? THE MODERATING EFFECT OF ETHICS ON NEURAL IMPLANT ACCEPTANCE

Prof. Eva Reinares-Lara, Ph.D. Professor, Department of Business Administration, Universidad Rey Juan Carlos, Facultad de Ciencias Jurídicas y Sociales, Paseo Artilleros s/n, 28032, Vicálvaro, Madrid (Spain), e-mail: [email protected]

Cristina Olarte-Pascual, Ph.D. Professor, Department of Business Administration, Universidad de La Rioja, Facultad de Ciencias Empresariales, La Cigüeña 60, 26006, Logroño, La Rioja (Spain), +34941299381, e-mail: [email protected]

Prof. Jorge Pelegrín-Borondo, Ph.D. Professor, Department of Business Administration, Universidad de La Rioja, Facultad de Ciencias Empresariales, La Cigüeña 60, 26006, Logroño, La Rioja (Spain), +34941299388, e-mail: [email protected]

Acknowledgements. This research was funded by the Spanish Ministry of Economy and Competitiveness through Research Project ECO2014-59688-R, under the Spanish National Program for Research, Development, and Innovation Oriented Toward Societal Challenges, within the context of the 2013-2016 Spanish National Scientific and Technical Research and Innovation Plan. The authors would also like to acknowledge the bridge grants for research projects awarded by the University of La Rioja (2017 call), subsidized by Banco Santander (reference: APPI17/05), and the COBEMADE research group at the University of La Rioja.

Author contributions. The three co-authors participated in all stages of the work, including the conception and design of the research, the revision of intellectual content, and the drafting of the work.

Abstract

The development of neural implants to increase people’s memory is enabling the creation of cyborgs (human-machine hybrids) with superior capacities. This paper aims to advance new technology acceptance models by analyzing the moderating effect of ethics on an integrative Cognitive-Affective-Normative (CAN) model to understand the acceptance of brain implants to increase capacities. The model is tested on a sample of 900 individuals segmented by their ethical assessment of these insideables: ethically in favor, ethically against, or ethically indifferent. The results show that an individual’s ethical assessment of memory implants explains differences in his or her intention to use them but does not moderate the influence of performance expectancy, effort expectancy, positive emotions, negative emotions, or social influence on the intention to use them. The results have theoretical implications for technology acceptance models and open new lines of research concerning the future cyborg society.

Keywords: Cyborg; Ethics; Insideables; Memory Implants; Technology Acceptance Models.

JEL classification: M3
DO YOU WANT TO BE A CYBORG? THE MODERATING EFFECT OF ETHICS ON NEURAL IMPLANT ACCEPTANCE
1. Introduction

From a scientific point of view, much of the research on technology acceptance is conducted on products already on the market. Very little of the literature has looked at products based on cutting-edge technology, i.e., products arising from highly innovative scientific developments involving transcendental changes in how we understand life and/or the universe. In this context, brain implants have astonishing potential to transform humanity (McGee and Maguire, 2007). Neural control technologies are enabling scientists to create cyborgs straight out of science fiction. A cyborg is a human-machine hybrid (Haddow et al., 2015). Park (2014: 304) defines it as “a human being with an electronic device implanted in or permanently attached to their body for the purpose of enhancing their individual senses or abilities.” Brain implants can be classified into two types: therapeutic and capacity-enhancing. The therapy/enhancement distinction is made between interventions carried out to treat disease or disability and interventions to enhance a person’s normal functioning or endow him or her with entirely new capacities (McGee and Maguire, 2007). Brain implants for therapeutic purposes are seemingly uncontroversial. However, in the debate on the adoption of capacity-enhancing practices versus therapeutic ones, cyborgs pose an ethical dilemma when an individual’s consciousness is modified by the merging of human body and machine (Schermer, 2009; Park, 2014). According to Warwick (2014), the cause for concern is not so much the therapeutic uses as the modification of an individual’s nature as a result of the linking of human and machine mental functioning.
The direct connections with the brain, even more than the connections with the nervous system, open up the possibility for individuals to communicate with each other, control machines with their thoughts, or be constantly connected to the Internet (Warwick, 2003; 2014; Berger, 2011; McGee and Maguire, 2007). According to Park (2014), although the body’s apparent integrity depends on cultural aspects, non-medical implants could become mainstream once the benefits they provide outstrip those of wearables. For Ochsner et al. (2015), it is only a matter of time until non-disabled people begin to use brain chips; these authors further predict that the military use of neurotechnology to adapt the human body to specific circumstances imposed by military or combat operations will become widespread. The development of human capacities through the integration of these technologies opens up the possibility of human beings becoming fundamentally different. A cyborg with a brain that was part human and part machine would retain some links to its human background, but its view of life and of what is and is not possible would be very different from a human’s, since its values, morals, and ethics would relate to its own life and what it believes is or is not important (Warwick, 2003; Park, 2014; Schermer, 2009; Jotterand, 2008).
Many authors have pointed to ethical dilemmas related to the risks and potential harm of using nanodevices in the brain (Milleson, 2013; Nijboer et al., 2013; Berger et al., 2008; Schermer, 2009; Clausen, 2008; Ford, 2007; Glannon, 2007; Hansson, 2005; Mnyusiwalla et al., 2003; etc.). These ethical dilemmas cover a range of areas, including social, economic, environmental, educational, moral, and philosophical concerns, such as: a) issues related to security, data protection, and body control; b) conflicts between a potential personality change, autonomy, and informed consent; or c) effects on personal identity, resource allocation, and the use of such technologies for capacity enhancement. Consideration should also be given to issues related to malfunctions and improper use or overuse. Research on the acceptance of new technologies is based on the technology acceptance models TAM (Davis, 1989; Davis et al., 1989) and TAM2 (Venkatesh and Davis, 2000) and their extensions through the Unified Theory of Acceptance and Use of Technology (UTAUT; Venkatesh et al., 2003) and UTAUT2 (Venkatesh et al., 2012). Drawing on these models, Pelegrín-Borondo et al. (2016; 2017) developed Cognitive-Affective-Normative (CAN) models for the acceptance of technological implants to increase capacities (insideables) that explain more than 73% of the intention to use this state-of-the-art technology. The benefits of including cognitive and affective factors to better understand subjects’ product assessments are widely recognized in the literature (e.g., Shiv and Fedorikhin, 1999; Levav and McGraw, 2009; Bigné et al., 2008; Zielke, 2011). While the CAN model has been successfully applied to the general concept of technological implants, it is unclear whether it would be as powerful if applied to a specific type of implant. Furthermore, the antecedents for the model’s application point to the potential importance of the ethical component in the acceptance process.
In light of the ethical implications of the cyborgization process, this paper will aim to analyze the moderating effect of ethics on an integrative CAN model to understand acceptance of a particular type of insideable, namely, brain implants to increase capacities.

2. Literature Review

2.1. Influence of Performance Expectancy and Effort Expectancy on the Intention to Use Memory Implants

The UTAUT (Venkatesh et al., 2003) and UTAUT2 (Venkatesh et al., 2012, p. 159) models introduced the variables performance expectancy and effort expectancy, which have a high predictive power with regard to the intention to use a given technology. Adapted to the context of the acceptance and use of memory implants, these constructs could be defined as “the degree to which a person believes that using a memory implant will improve his or her performance”
(performance expectancy) and “the degree of ease associated with the use of the memory implant” (effort expectancy). Scientifically speaking, sense enhancement is already a reality; Warwick (2014) connected a human nervous system to the Internet to enable basic forms of thought transmission. The same author further noted that many people are likely to want to update and integrate themselves with machines. Such successful integration of the human body with capacity-enhancing technology is easy to imagine from the perspective of computational theory of mind and cyborg theory (Selinger and Engström, 2008; Reinares-Lara et al., 2016). This integration of technology in the body parallels natural processes of evolution, in which “reasonable” people will seek to enhance their capacities as much as technology allows (Schermer, 2009; Selinger and Engström, 2008; Rosahl, 2004; Bhattacharyya and Kedzior, 2012). Today, few discussions of cyborgian nature fail to address the concept of capability enhancement (Parkhust, 2012). This could mean that ordinary (non-implanted) humans will be superseded by these superior (implanted) humans. In short, it is clear that, in the long term, the use of implants to connect the human brain to an information network could yield the advantages of computer intelligence, communication, and other sensory capabilities. From a cognitive point of view, the brain will process any type of information to which it has access. Thus, not only will the most direct or hard-wired interfaces, such as neuroelectronic technology, influence its structure, but so will any type of technological extension that is accessible through its sensory system (Greiner, 2014). With regard to the influence of effort expectancy on the intention to use, the literature shows that perceived ease of use favorably influences the acceptance of new health technologies (Bertrand and Bouchard, 2008; Alaiad and Zhou, 2014; Handy et al., 2001).
In the field of technological implants to increase capacities, Reinares-Lara et al. (2016) were unable to demonstrate the influence of the perceived ease of use of such implants on attitudes toward them. According to these authors, the most plausible explanation for this finding is that, due to the miniaturized scale of the devices and their ability to work automatically in the human body, people do not feel the need for information on what they do or how difficult they might be to use. This finding suggests that established models, such as the TAM (Davis et al., 1989), are of limited applicability in the context of individual acceptance of nanoimplants. In contrast, Pelegrín-Borondo et al. (2016; 2017) found that perceived ease of use did positively influence the intention to use an implant when the decision was being made for oneself, but that this variable had no influence on the intention to use one when the decision was being made for one’s child (2016).
In light of the contradictory conceptual framework for expectations regarding technological implants to develop capacities, the following hypotheses were proposed in the context of neural implants to increase memory:

H1. Performance expectancy regarding the use of memory implants to increase capacities positively affects the intention to use them.

H2. Effort expectancy regarding the use of memory implants to increase capacities positively affects the intention to use them.

2.2. Influence of Emotions on the Intention to Use Memory Implants

In order to better understand subjects’ behavior, an affective dimension was added (Zielke, 2011; Levav and McGraw, 2009; Dean et al., 2008; Van Waterschoot et al., 2008), since some emotions have been shown to stimulate action, while others cause it to be inhibited or changed (Cohen et al., 2006; White and Yu, 2005; Mano, 2004; Han et al., 2007; Schwarz, 2000; Bagozzi et al., 1999). In the field of body implants, the idea of the dissolution of the limits of what the human body is as a result of the use of medical implants has been shown to generate feelings of apprehension, anxiety, and even fear (Buchanan-Oliver and Cruz, 2011). According to Park (2014), despite the acceptance of body modification for aesthetic reasons, most people’s initial reaction to the integration of technology in their body is fear that it will damage their body’s integrity. Advances in transplant technology, such as animal organ transplants or the use of technological devices, generate fear of dehumanization. Likewise, there is fear of the very idea of the existence of the cyborg (Lai, 2012). Everything controversial about biotechnological interventions raises moral questions and disputes that evoke both horror and admiration (Schermer, 2009). Similarly, Reinares-Lara et al. (2016) found that emotions influence attitude toward nanoimplants. Pelegrín-Borondo et al.
(2017) subsequently confirmed the influence of emotions on behavioral intention toward technological implants to increase capacities and identified a new affective dimension, consisting of emotions of anxiety, which was not significant. This latter dimension is a disaggregation of the negative emotions dimension of the PANAS scale (Watson et al., 1988). These authors were likewise unable to confirm that the fear of dehumanization caused by the idea of the cyborg, identified by Lai (2012), had a decisive impact on the decision to accept implants in the body. In light of these previous findings on technological implants, the following hypotheses were proposed in the field of neural implants to increase memory:
H3. Positive emotions toward the use of memory implants to increase capacities positively affect the intention to use them.

H4. Negative emotions toward the use of memory implants to increase capacities negatively affect the intention to use them.

H5. Emotions of anxiety toward the use of memory implants to increase capacities negatively affect the intention to use them.

2.3. Social Influence on the Intention to Use Memory Implants

Social influence is defined as the degree to which individuals perceive that people who are important to them believe they should use a new technology (Venkatesh et al., 2003: 451). The Theory of Reasoned Action (TRA) (Fishbein and Ajzen, 1975) and its extension, the Theory of Planned Behavior (TPB) (Ajzen, 1991), together with the TAM2 model (Venkatesh and Davis, 2000) and its extensions through UTAUT (Venkatesh et al., 2003) and UTAUT2 (Venkatesh et al., 2012), provide the rationale for connecting social influence to the intention to use this new technology. Brain surgery has always been socially controversial. It is a highly invasive therapy targeting a very complex and sensitive organic system that forms the basis for the most important human mental states and activities. Social concern regarding its development and techniques is natural. This concern is likely to increase with the development of neurostimulation, especially if it is achieved through nanoimplants (Berger et al., 2008). Family, friends, and society influence the decision to undergo body modification (Von Soest et al., 2006; Most et al., 2007; Hyde et al., 2010; Adams, 2010; Javo and Sørlie, 2010; Dorneles de Andrade, 2010). In the context of implants in the human body, Most et al. (2007) found that family environment influences attitudes toward cochlear implants.
Likewise, social pressure to maintain a youthful and attractive image has been shown to play an important role in the context of body modification to increase seductive capacity (De Andrade, 2010; Von Soest et al., 2006). In keeping with the aims of the present paper, Pelegrín-Borondo et al. (2016; 2017) demonstrated the high explanatory power of social norms with regard to the intention to use technological implants to increase capacities. Based on the above, the following hypothesis was proposed:

H6. Social influence to use memory implants to increase capacities positively affects the intention to use them.
2.4. Moderating Effect of Ethics on the Intention to Use Memory Implants

The shift from wearable technology to the cyborgization of the future is reshaping social thought beyond technological or biological issues; it is necessary to reflect on the essential bases and parameters that will govern the future cyborg society (Greiner, 2014). Milleson (2013) and Mnyusiwalla et al. (2003) have noted the lack of literature on the relationship between nanotechnology and the brain and its ethical implications. According to Clausen (2011), the therapeutic benefits of the brain-hardware interface positively influence the ethical assessment, but further research is required into this perspective from non-medical fields. Berger et al. (2008) argued that there is a moral obligation to research and develop nanotechnologically improved neural implants due to the potential benefits for patients. And yet, the development of brain implants poses several legal, social, and ethical problems. In the context of the circular evolution of ethics, what individuals believe to be ethical influences their behavior and, over time, the behaviors they observe influence what they believe to be ethical (Goel et al., 2016). Applied ethics is a branch of ethics that looks at specific issues in certain fields, such as euthanasia or abortion in medical ethics or social responsibility in business ethics (Cohen, 2005; Frey, Heath and Wellman, 2004). Decisions and actions are often guided by applied ethical perceptions rather than absolute judgments of what can or should be done (Cohen, 2005; LaFollette, 2002). Psychological contract theory conceptualizes decision-making in a subjective way. This theoretical basis can be used to address similar decisions made by individuals in the absence of absolute rules regarding what can or cannot be done (Dunfee, 2006; Thompson and Hart, 2006; Goel et al., 2016).
In the present paper, it was deemed necessary to analyze, within the framework of this theory, the moderating effect of ethics on the variables that determine the intention to use neural implants to increase memory, where ethics is understood as an individual perception of which use behaviors are appropriate from an applied ethics perspective (Thompson and Hart, 2006). Given the existence of groups of people both for and against cyborgization (Olarte et al., 2015), for the purpose of studying the acceptance of neural implants, a nano-level analysis was used, as proposed by Thompson and Hart (2006). Ethical beliefs depend on individual factors, such as sex, age, and education, as well as factors specific to the situation (Ford and Richardson, 1994). As noted by Park (2014), from an ethical point of view, becoming a cyborg should be an individual decision. In this context, the interest of the present paper is to advance knowledge of the potential moderating effect of the ethical component on the cognitive, affective, and social dimensions
on the intention to use memory implants. In light of the scant literature on this moderating influence, it was included through the following propositions:

P1. Ethics has a moderating influence on the positive relationship between performance expectancy regarding the use of memory implants to increase capacities and the intention to use them.

P2. Ethics has a moderating influence on the positive relationship between effort expectancy regarding the use of memory implants to increase capacities and the intention to use them.

P3. Ethics has a moderating influence on the positive relationship between positive emotions toward the use of memory implants to increase capacities and the intention to use them.

P4. Ethics has a moderating influence on the negative relationship between negative emotions toward the use of memory implants to increase capacities and the intention to use them.

P5. Ethics has a moderating influence on the negative relationship between emotions of anxiety toward the use of memory implants to increase capacities and the intention to use them.

P6. Ethics has a moderating influence on the positive relationship between social influence to use memory implants to increase capacities and the intention to use them.

Figure 1. Theoretical model of memory implant acceptance. [Diagram: the cognitive dimension (performance expectancy, H1 = +; effort expectancy, H2 = +), the affective dimension (positive emotions, H3 = +; negative emotions, H4 = -; anxiety, H5 = -), and the normative dimension (social influence, H6 = +) point to intention to use, with ethics moderating all six paths (P1-P6).]
In the literature on technology acceptance and adoption, many researchers have gathered variables from earlier studies that have been shown to influence acceptance (Hameed et al., 2012). Thus, the proposed hypotheses and propositions form an integrative theoretical model of the variables that influence the intention to use memory implants (Figure 1).
3. Method

A structured personal survey was administered to a sample of 900 people aged 18 and older living in Spain, distributed equally with regard to gender and age.

Table 1. Technical details of the research and sample description

TECHNICAL DETAILS
Universe: Individuals aged 18 and over
Sampling procedure: Stratified equally by sex and age
Data collection: Personal survey using a structured questionnaire
Scope: Spain
Sample size: 900 individuals
Date of fieldwork: November 15-28, 2016

SAMPLE CHARACTERISTICS
Sex: 50% men; 50% women
Age: 20% aged 18-25; 20% aged 26-35; 20% aged 36-45; 20% aged 46-55; 20% aged 56-75
Educational attainment: 26.8% basic education; 43.4% upper-secondary education; 29.8% university education
Implants: 23.3% had an implant for health or aesthetic reasons; 76.7% did not
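The equal stratification by sex and age in Table 1 implies a simple quota design. The following sketch works out the per-cell quotas; the cell counts are our own arithmetic, not figures reported in the paper:

```python
# Illustrative quota arithmetic for the design in Table 1: 900 respondents
# stratified equally by sex (2 levels) and age band (5 levels).
sexes = ["men", "women"]
age_bands = ["18-25", "26-35", "36-45", "46-55", "56-75"]

n_total = 900
per_cell = n_total // (len(sexes) * len(age_bands))  # respondents per sex x age cell

quotas = {(s, a): per_cell for s in sexes for a in age_bands}
print(per_cell, sum(quotas.values()))  # 90 900
```

With two sexes and five age bands, each of the ten cells holds 90 respondents, reproducing the 50/50 and 20/20/20/20/20 marginals in the sample description.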
Before answering the survey, the respondents were provided with the following explanation of the concept to be tested due to its novelty: “A neural implant to increase memory is a technological device that is implanted in the brain of a healthy person, not for medical or health-related purposes, but rather to increase his or her memory. A pilot test conducted by Wake Forest University and the University of Southern California has demonstrated the effectiveness of memory implants.” All the surveys were recorded to assure the quality of the fieldwork and the data collected by the survey takers. The measurement scales used in the questionnaire were developed based on the literature review (Table 2). Because of the novelty of the object of study and to verify comprehension of the scales, the questionnaire was pretested with a sample of 40 individuals. Table 2. Constructs, items, and scales
Table 2. Constructs, items, and scales

Cognitive dimension

Performance Expectancy (Venkatesh et al., 2012; 7-point Likert scale)
PE1. The memory implant will be useful in my daily life
PE2. Using the memory implant will increase my chances of achieving things that are important to me
PE3. Using the memory implant will help me accomplish things more quickly
PE4. Using the memory implant will increase my productivity

Effort Expectancy (Venkatesh et al., 2012; 7-point Likert scale)
EE1. Learning how to use the memory implant will be easy for me
EE2. My interaction with the memory implant will be clear and understandable
EE3. I will find the memory implant easy to use
EE4. It will be easy for me to become skillful at using the memory implant

Affective dimension

Positive Emotions (Watson et al., 1988)
POS1. Interested; POS2. Excited; POS3. Determined; POS4. Enthusiastic; POS5. Proud; POS6. Inspired; POS7. Strong; POS8. Active

Negative Emotions (Watson et al., 1988)
NEG1. Distressed; NEG2. Upset; NEG3. Guilty; NEG4. Ashamed; NEG5. Scared; NEG6. Hostile; NEG7. Afraid; NEG8. Irritable; NEG9. Alert

Anxiety (Watson et al., 1988)
A1. Nervous; A2. Attentive; A3. Jittery

Normative dimension

Social Influence (Venkatesh et al., 2012; 7-point Likert scale)
SI1. People who are important to me will think that I should use the memory implant
SI2. People who influence my behavior will think that I should use the memory implant
SI3. People whose opinions I value will prefer that I use the memory implant

Ethics (Reidenbach and Robin, 1988; 1990; semantic differential, -3 to +3)
Unethical – Ethical; Unfair – Fair; Unjust – Just; Not morally right – Morally right; Not acceptable to my family – Acceptable to my family; Culturally unacceptable – Culturally acceptable; Not personally satisfying – Personally satisfying; Violates an unwritten contract – Does not violate an unwritten contract

Intention to Use (Venkatesh and Davis, 2000; 7-point Likert scale)
IU1. I intend to use the memory implant
IU2. I predict that I will use the memory implant

Note: The items removed during the refinement process are shown in italics.
Partial least squares structural equation modeling (PLS-SEM) was used in accordance with the recommendations of Hair et al. (2011: 144) (“if the goal is predicting key target constructs or identifying key ‘driver’ constructs, select PLS-SEM”) and Sarstedt et al. (2016), who point to its suitability when the measurement model conceptualization is reflective and the representation of the constructs in the model is composite. The SmartPLS 3.0 software was chosen because PLS-SEM is less sensitive to violations of the assumption of data normality (Ram et al., 2014).
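PLS-SEM typically judges path significance by bootstrapping. The sketch below illustrates only that resampling logic on simulated construct scores, with the standardized bivariate coefficient standing in for a structural path; it is not the SmartPLS estimation itself, and all variable names and numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 900  # same sample size as the study; the data themselves are simulated

# Hypothetical construct scores: performance expectancy (pe) driving
# intention to use (iu) through a positive structural path.
pe = rng.normal(size=n)
iu = 0.5 * pe + rng.normal(scale=0.8, size=n)

# Bootstrap the standardized path coefficient (for a single predictor,
# the correlation equals the standardized OLS slope).
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, size=n)  # resample respondents with replacement
    boot.append(np.corrcoef(pe[idx], iu[idx])[0, 1])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(lo > 0)  # a 95% CI excluding zero marks the path as significant
```

The percentile interval here excludes zero by construction of the simulation; in the actual analysis, SmartPLS performs the analogous resampling over the full inner and outer model.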
4. Results

The results of the exploratory factor analysis performed on the ethics items yielded a single factor, explaining 61.27% of the variance. The KMO index was good (0.93), and the Bartlett test of sphericity yielded a p-value < 0.001. Subsequently, a covariance-based confirmatory factor analysis was performed using the EQS 6 software. The goodness of fit of this confirmatory factor analysis was satisfactory: BBNFI = 0.98; BBNNFI = 0.97; CFI = 0.98; robust CFI = 0.99; GFI = 0.97; AGFI = 0.95. With regard to convergent validity, as Table 3 shows, the indicators converge in a single factor, and the standardized lambda parameters were significant and > 0.5 (Anderson and Gerbing, 1988). The average variance extracted (AVE) was > 0.5 (Hair et al., 1999), and the composite reliability coefficient was > 0.7 (MacKenzie et al., 2005). Cronbach’s alpha was 0.91.

Table 3. Composite reliability and convergent validity

F1. Ethics (reflective). Composite reliability = 0.91 (> 0.7); average variance extracted (AVE) = 0.55 (> 0.5).
Indicator: standardized parameter (t-value); all parameters > 0.5, all t-values > 1.96
  Ethical: 0.78 (27.6)
  Fair: 0.76 (26.5)
  Just: 0.70 (23.7)
  Moral: 0.83 (30.5)
  Acceptable for my family: 0.75 (26.1)
  Culturally acceptable: 0.73 (25.3)
  Personally satisfactory: 0.72 (24.6)
  Does not break a social contract: 0.67 (22.2)
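The composite reliability and AVE reported in Table 3 follow directly from the standardized loadings via the standard formulas. A minimal check, using the eight loadings from Table 3:

```python
import numpy as np

# Standardized loadings of the eight ethics indicators (Table 3)
loadings = np.array([0.78, 0.76, 0.70, 0.83, 0.75, 0.73, 0.72, 0.67])

# Composite reliability:
# (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
sum_sq = loadings.sum() ** 2
cr = sum_sq / (sum_sq + (1 - loadings ** 2).sum())

# AVE: mean of the squared standardized loadings
ave = (loadings ** 2).mean()

print(round(cr, 2), round(ave, 2))  # 0.91 0.55
```

Both values reproduce the thresholds cited in the text: CR > 0.7 (MacKenzie et al., 2005) and AVE > 0.5 (Hair et al., 1999).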
Once it had been verified that all the items were variables reflective of the ethical assessment of implants to increase memory and that the scale was reliable and valid, these observable variables were used to form clusters based on the ethical assessment. The ANOVA results reflect the existence of mean differences between the groups, with p-values < 0.01. In the discriminant analysis, Wilks’ lambda (p < 0.01) supports the hypothesis of differences between the groups in the scores given to the independent variables. Box’s M test yielded a value of 313.66, with an F-statistic p-value of 0.01, making it possible to reject the null hypothesis of equal variance-covariance matrices across the groups. Finally, the confusion matrix showed that 97.8% of the original cases had been correctly classified. All of this confirmed that the three clusters obtained were different and correctly identified. The three clusters were assigned names in accordance with their characteristics:
Group 1 (G1). Ethically in favor. In this group, the 8 variables of the ethics scale had scores ranging from 1.43 to 1.82 (on a scale of -3 to 3) (ethical 1.67, fair 1.72, just 1.61, moral 1.84, acceptable for my family 1.72, culturally acceptable 1.82, personally satisfactory 1.67, and does not break an unwritten social contract 1.43). These people can thus be said to consider neural implants to increase memory ethical. This group accounted for 33.5% of the sample.

Group 2 (G2). Ethically against. For this group, all the variables had negative scores, ranging from -2.23 to -1.48, well below the scale’s midpoint of 0. This group accounted for 26.9% of the sample.

Group 3 (G3). Ethically indifferent. This group had not decided whether or not neural implants to increase memory are ethical. The scores for the variables were very close to 0 (-0.18 to 0.14). This was the largest group, accounting for 39.6% of the sample.

Once the groups had been established, their interest in neural implants to increase memory was compared. The p-values of the ANOVA and the Kruskal-Wallis H test showed significant differences between the three groups in the intention to use these insideables and their predicted use thereof (Table 4). The group of people ethically in favor registered mean values close to 4 and medians of 4 (scale of 1 to 7) for intention to use and predicted use. The group of people ethically against had mean values close to 1.7 and a median of 1. The ethically indifferent group had mean values close to 2.6 and a median of 2. For all three groups, the standard deviation was high. It was thus considered useful to analyze the variables explaining this variability through causal models. This variability also reflects the fact that Groups 1 and 3 included people who both did and did not intend to use the memory implants.
Specifically, 33.9% of all respondents had a score of 4 or more for intention to use, and 33.3% for predicted use.

Table 4. Comparison of the intention to use vs. predicted use of memory implants between groups (ethics)

Indicator            | ANOVA / Kruskal-Wallis H p-value | G1. Ethically in favor: Mean (SD); Median | G2. Ethically against: Mean (SD); Median | G3. Ethically indifferent: Mean (SD); Median
Intention to use     | <0.001 / <0.001 | 3.96 (2.11); 4 | 1.68 (1.43); 1 | 2.64 (1.70); 2
I predict I will use | <0.001 / <0.001 | 3.97 (2.01); 4 | 1.71 (1.44); 1 | 2.56 (1.70); 2
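The two omnibus tests behind Table 4 can be sketched as follows. This is an illustration with synthetic scores whose means and standard deviations roughly match the reported group values; it is not the study's data.

```python
# Hedged sketch: one-way ANOVA and Kruskal-Wallis H test comparing
# intention-to-use scores (1-7) across the three ethics groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
g1 = np.clip(rng.normal(3.96, 2.11, 302).round(), 1, 7)  # ethically in favor
g2 = np.clip(rng.normal(1.68, 1.43, 242).round(), 1, 7)  # ethically against
g3 = np.clip(rng.normal(2.64, 1.70, 356).round(), 1, 7)  # ethically indifferent

f_stat, p_anova = stats.f_oneway(g1, g2, g3)     # parametric comparison of means
h_stat, p_kw = stats.kruskal(g1, g2, g3)         # rank-based (non-parametric) check
print(f"ANOVA p = {p_anova:.2e}; Kruskal-Wallis p = {p_kw:.2e}")
```

Running both tests, as in Table 4, guards against the mean comparison being driven by the skewed, bounded 1-7 response scale.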
The results of the proposed explanatory models and a comparison between them by means of a multigroup analysis are shown below.

4.1 Exploratory Factor Analysis and Evaluation of the Measurement Model

The exploratory factor analyses of the performance expectancy (PE), effort expectancy (EE), social influence (SI), and intention to use (IU) scales each yielded a single factor, with high explained variances and KMO indices: PE = 80.22% (KMO = 0.84), EE = 77.75% (KMO = 0.84), SI = 88.40% (KMO = 0.77), IU = 93.55% (KMO = 0.50). The Bartlett test of sphericity was significant at the < 0.001 level for all the scales. On the scale of emotions elicited by the idea of implants, the exploratory factor analysis identified three factors explaining 58.73% of the variance. The KMO index was good (0.92), and the Bartlett test of sphericity was significant at the < 0.001 level. The three factors were the same ones obtained by Pelegrín-Borondo et al. (2017) in the development of the CAN model applied to insideables: positive emotions, negative emotions, and emotions of anxiety.

With regard to the reflective measurement model, for the indicators to achieve a satisfactory level of reliability, the standardized factor loadings must be > 0.7 and significant (t-value > 1.96) (Hair et al., 2013). In the present study, the observable variables with loadings < 0.7 and t-values < 1.96 were eliminated, and the model was respecified to obtain greater convergence (Anderson and Gerbing, 1988). The variables inspired, disgusted, and nervous had standardized loadings slightly < 0.7 but t-values > 1.96 and were therefore kept: according to Hair et al. (2013), the 0.7 rule for standardized loadings should be applied flexibly, especially when the indicators increase the factor's content validity. The rest of the variables had standardized loadings > 0.7 and t-values > 1.96 (see Table 5).
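The two sampling-adequacy checks used above, Bartlett's test of sphericity and the KMO index, can be computed from first principles from a correlation matrix. The sketch below uses synthetic two-factor data, not the study's responses, and is an assumption-laden illustration of the formulas only.

```python
# Hedged sketch: Bartlett's test of sphericity and the KMO index from scratch.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)
n, p = 900, 6
factor = rng.normal(size=(n, 2))                      # two latent factors
loadings = rng.uniform(0.6, 0.9, size=(p, 2))         # hypothetical loadings
X = factor @ loadings.T + rng.normal(scale=0.5, size=(n, p))

R = np.corrcoef(X, rowvar=False)                      # item correlation matrix

# Bartlett's test of sphericity: H0 is that R is an identity matrix.
stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
df = p * (p - 1) / 2
p_bartlett = chi2.sf(stat, df)

# KMO: observed correlations vs. partial correlations (from the inverse of R).
A = np.linalg.inv(R)
d = np.sqrt(np.outer(np.diag(A), np.diag(A)))
partial = -A / d
off = ~np.eye(p, dtype=bool)
kmo = (R[off] ** 2).sum() / ((R[off] ** 2).sum() + (partial[off] ** 2).sum())
print(f"Bartlett p = {p_bartlett:.3g}, KMO = {kmo:.2f}")
```

A significant Bartlett test and a KMO well above 0.5, as reported for the study's scales, indicate that the correlation structure is factorable.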
The reliability of the individual items was thus satisfactory (Hair et al., 2011).

Table 5. Model loadings and cross-loadings (t-values in parentheses)

Item | PE | EE | POS | NEG | A | SI | IU
PE1  | 0.90 (122.2) | 0.48 | 0.58 | -0.01 | 0.50 | 0.51 | 0.20
PE2  | 0.90 (117.0) | 0.48 | 0.59 | 0.04 | 0.49 | 0.49 | 0.24
PE3  | 0.89 (105.2) | 0.46 | 0.56 | 0.10 | 0.48 | 0.46 | 0.23
PE4  | 0.88 (96.2) | 0.45 | 0.54 | 0.05 | 0.47 | 0.45 | 0.21
EE1  | 0.41 | 0.86 (71.1) | 0.35 | 0.05 | 0.34 | 0.36 | 0.16
EE2  | 0.55 | 0.90 (107.6) | 0.47 | 0.03 | 0.44 | 0.45 | 0.21
EE3  | 0.43 | 0.89 (93.1) | 0.40 | 0.05 | 0.37 | 0.41 | 0.17
EE4  | 0.44 | 0.88 (93.3) | 0.41 | 0.05 | 0.43 | 0.41 | 0.18
POS1 | 0.59 | 0.41 | 0.77 (51.3) | 0.06 | 0.33 | 0.42 | 0.61
POS2 | 0.39 | 0.28 | 0.70 (29.8) | 0.34 | 0.39 | 0.33 | 0.45
POS3 | 0.51 | 0.36 | 0.82 (65.5) | 0.27 | 0.42 | 0.43 | 0.48
POS4 | 0.56 | 0.40 | 0.87 (81.0) | 0.13 | 0.42 | 0.45 | 0.59
POS5 | 0.47 | 0.36 | 0.82 (61.4) | 0.21 | 0.42 | 0.44 | 0.55
POS6 | 0.44 | 0.32 | 0.67 (29.3) | 0.15 | 0.36 | 0.27 | 0.39
POS7 | 0.48 | 0.39 | 0.79 (49.5) | 0.20 | 0.49 | 0.51 | 0.61
POS8 | 0.47 | 0.36 | 0.80 (49.5) | 0.23 | 0.50 | 0.44 | 0.51
NEG1 | 0.10 | 0.05 | 0.19 | 0.74 (0.92) | 0.33 | 0.11 | 0.12
NEG2 | -0.09 | 0.03 | 0.07 | 0.61 (5.2) | 0.33 | 0.06 | 0.02
NEG3 | 0.04 | 0.06 | 0.23 | 0.82 (12.7) | 0.31 | 0.16 | 0.14
NEG4 | -0.04 | -0.01 | 0.11 | 0.70 (7.6) | 0.35 | 0.11 | 0.07
A1   | 0.13 | 0.08 | 0.33 | 0.49 | 0.66 (13.2) | 0.13 | 0.12
A2   | 0.28 | 0.22 | 0.53 | 0.27 | 0.91 (42.2) | 0.29 | 0.35
A3   | 0.10 | 0.11 | 0.30 | 0.43 | 0.76 (20.6) | 0.19 | 0.15
SI1  | 0.51 | 0.43 | 0.50 | 0.17 | 0.27 | 0.94 (156.7) | 0.51
SI2  | 0.50 | 0.40 | 0.48 | 0.18 | 0.27 | 0.94 (147.2) | 0.52
SI3  | 0.52 | 0.44 | 0.52 | 0.13 | 0.26 | 0.94 (187.1) | 0.55
IU1  | 0.53 | 0.46 | 0.66 | 0.15 | 0.30 | 0.55 | 0.97 (263.2)
IU2  | 0.50 | 0.44 | 0.65 | 0.14 | 0.30 | 0.54 | 0.97 (259.4)

Note: PE = performance expectancy and EE = effort expectancy (cognitive dimension); POS = positive emotions, NEG = negative emotions, and A = anxiety (affective dimension); SI = social influence (normative dimension); IU = intention to use.
As shown in Table 6, all constructs had Cronbach's alpha and composite reliability values > 0.7, indicating satisfactory construct reliability. Likewise, all constructs had an average variance extracted (AVE) > 0.5, confirming convergent validity. With regard to discriminant validity (Roldán and Sánchez-Franco, 2012): (1) the square root of the AVE of each construct was greater than the inter-construct correlations; and (2) the model loadings were greater than the cross-loadings.

Table 6. Construct reliability, convergent validity and discriminant validity

Construct | Composite reliability (> 0.7) | Cronbach's alpha | AVE (> 0.5) | PE | EE | POS | NEG | A | SI | IU
Performance expectancy (PE) | 0.94 | 0.92 | 0.80 | 0.72 | | | | | |
Effort expectancy (EE) | 0.93 | 0.91 | 0.78 | 0.43 | 0.79 | | | | |
Positive emotions (POS) | 0.93 | 0.91 | 0.61 | 0.25 | 0.53 | 0.78 | | | |
Negative emotions (NEG) | 0.81 | 0.73 | 0.52 | 0.05 | 0.20 | 0.46 | 0.88 | | |
Anxiety (A) | 0.83 | 0.74 | 0.62 | 0.05 | 0.25 | 0.63 | 0.52 | 0.90 | |
Social influence (SI) | 0.96 | 0.93 | 0.88 | 0.17 | 0.29 | 0.53 | 0.45 | 0.54 | 0.94 |
Intention to use (IU) | 0.97 | 0.93 | 0.94 | 0.15 | 0.31 | 0.68 | 0.46 | 0.53 | 0.56 | 0.97

Note: The diagonal elements (the last value in each row) are the square root of the AVE. The off-diagonal elements are the inter-construct correlations.
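The reliability statistics in Table 6 follow directly from the standardized loadings in Table 5. As a minimal sketch, the functions below compute composite reliability and AVE; applied to the three social-influence loadings (0.94, 0.94, 0.94), they reproduce the SI row of Table 6 (CR ≈ 0.96, AVE ≈ 0.88).

```python
# Minimal sketch of the Table 6 reliability calculations from standardized loadings.
import numpy as np

def composite_reliability(lam: np.ndarray) -> float:
    # CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
    num = lam.sum() ** 2
    return num / (num + (1 - lam ** 2).sum())

def ave(lam: np.ndarray) -> float:
    # AVE = mean of the squared standardized loadings
    return (lam ** 2).mean()

si = np.array([0.94, 0.94, 0.94])   # SI1-SI3 loadings from Table 5
print(round(composite_reliability(si), 2), round(ave(si), 2))  # 0.96 0.88
```

The discriminant-validity check then compares the square root of each AVE against the inter-construct correlations, as described above.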
4.2. Assessment of the Structural Model and Multigroup Analysis

In order to compare the groups, the database was separated into three groups, and bootstrapping with 5,000 resamples was used to assess the significance of the path coefficients in each one (Hair et al., 2011). The initial results yielded negative explained-variance scores for the anxiety dimension's contribution to intention to use in all three models. This was due to redundancy: the anxiety variable was highly correlated with the negative emotions variable in two groups (G1 = 0.48; G3 = 0.49) and with positive emotions in all three (G1 = 0.62; G2 = 0.37; G3 = 0.55). According to Falk and Miller (1992: 76), when correlations are substantial, redundancy is more likely; to eliminate it, the variables producing the redundancy should be removed, unless doing so would result in a large decrease in R2. Following the elimination of the anxiety variable, R2 was 45.8% for G1, 47.2% for G2, and 36.9% for G3 (see Table 7), quite similar to the values obtained prior to its elimination (G1 = 45.8%; G2 = 47.4%; G3 = 37.9%). The Stone-Geisser cross-validated redundancy measure Q2 for the models without the anxiety dimension (G1 = 0.399; G2 = 0.409; G3 = 0.308) was likewise quite similar to the values obtained with it (G1 = 0.395; G2 = 0.412; G3 = 0.316). Its elimination was therefore advisable and did not pose any problem. The variance explained (R2) by the three final models was good, and in each model the Q2 confirmed the model's predictive relevance (i.e., Q2 > 0), as "Q2 values larger than 0 indicate that the exogenous constructs have predictive relevance for the endogenous construct under consideration" (Hair et al., 2011, p. 145). All in all, the models were highly predictive of the intention to use the memory implants.
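The bootstrapping logic used to assess path significance can be illustrated as follows. This is a hedged stand-in: the "path coefficient" here is an ordinary least-squares slope on synthetic data rather than a PLS estimate, but the resampling mechanics (5,000 resamples of cases with replacement, t-value from the bootstrap standard error) are the same.

```python
# Hedged sketch: bootstrap (5,000 resamples) t-value for a path coefficient.
import numpy as np

rng = np.random.default_rng(3)
n = 300
x = rng.normal(size=n)                        # e.g., positive emotions score
y = 0.5 * x + rng.normal(scale=0.8, size=n)   # intention to use (synthetic)

def path(x, y):
    return np.polyfit(x, y, 1)[0]             # slope as stand-in path coefficient

est = path(x, y)
boot = np.empty(5000)
for b in range(5000):
    idx = rng.integers(0, n, n)               # resample cases with replacement
    boot[b] = path(x[idx], y[idx])

t_value = est / boot.std(ddof=1)              # estimate / bootstrap standard error
print(f"path = {est:.2f}, bootstrap t = {t_value:.1f}")
```

The resulting t-value is then compared against the one-tailed cutoffs used in Table 7 (t > 1.65, 2.33, and 3.09 for p < 0.05, 0.01, and 0.001, respectively).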
Table 7 shows, for each of the three groups, the amount of variance each antecedent explained in the dependent variable intention to use, the path coefficients, and the t-values based on a one-tailed Student's t distribution with 4,999 degrees of freedom (Henseler et al., 2009). The negative variance-explained value for performance expectancy in the ethically indifferent group is "due to the fact that the original relationship between the two variables is so close to zero that the difference in the signs simply reflects random variation around zero" (Falk and Miller, 1992, p. 75).

In all of the models (G1, G2, and G3), positive emotions and social influence were found to have a very significant influence on the intention to use memory implants, supporting Hypotheses H3 and H6. Support was also found for the influence of effort expectancy on the intention to use these insideables (H2), although this influence was weaker in all the groups. Performance expectancy was found to have a slight influence (p < 0.05) on the intention to use memory implants in the ethically against group but not in the others; thus, only partial support was found for H1. The influence of negative emotions on the intention to use was not significant in any group, so H4 was rejected. With regard to the influence of anxiety, following the elimination of this dimension from the models for all three groups, the R2 remained virtually unchanged, indicating it did not have a significant effect; H5 was thus also rejected.

Table 7. Effects of the endogenous variables

Ethically in favor (G1). Intention to use: R2 = 45.8%; Q2 = 0.399
Hypothesis | Path coefficient | t-value | Correlation | Variance explained
H1: Performance expectancy => (+) Intention to use | 0.03 n.s. | 0.56 | 0.43 | 1.38%
H2: Effort expectancy => (+) Intention to use | 0.11* | 1.80 | 0.40 | 4.28%
H3: Positive emotions => (+) Intention to use | 0.52*** | 8.53 | 0.65 | 33.50%
H4: Negative emotions => (-) Intention to use | 0.00 n.s. | 0.08 | 0.23 | 0.00%
H6: Social influence => (+) Intention to use | 0.16** | 2.49 | 0.43 | 6.67%

Ethically against (G2). Intention to use: R2 = 47.2%; Q2 = 0.409
H1: Performance expectancy => (+) Intention to use | 0.10* | 1.72 | 0.47 | 4.46%
H2: Effort expectancy => (+) Intention to use | 0.08* | 1.77 | 0.34 | 2.61%
H3: Positive emotions => (+) Intention to use | 0.42*** | 5.23 | 0.61 | 25.54%
H4: Negative emotions => (-) Intention to use | -0.05 n.s. | 1.00 | -0.14 | 0.70%
H6: Social influence => (+) Intention to use | 0.26*** | 4.57 | 0.52 | 13.83%

Ethically indifferent (G3). Intention to use: R2 = 36.9%; Q2 = 0.308
H1: Performance expectancy => (+) Intention to use | -0.01 n.s. | 0.21 | 0.35 | -0.39%
H2: Effort expectancy => (+) Intention to use | 0.14** | 2.95 | 0.35 | 4.88%
H3: Positive emotions => (+) Intention to use | 0.37*** | 7.06 | 0.52 | 19.08%
H4: Negative emotions => (-) Intention to use | 0.04 n.s. | 0.91 | 0.26 | 1.10%
H6: Social influence => (+) Intention to use | 0.26*** | 4.86 | 0.47 | 12.19%

Note: * p < 0.05 => t > 1.65; ** p < 0.01 => t > 2.33; *** p < 0.001 => t > 3.09; n.s. = not significant (based on a one-tailed Student's t (4,999) distribution).

As Table 8 shows, in the three comparisons between groups (G1 vs. G2, G1 vs. G3, and G2 vs. G3), the p-values of both parametric tests were > 0.05 for all the analyzed relationships, meaning no difference was found between the groups. Henseler's non-parametric test yielded p-values > 0.05 in all cases except the influence of positive emotions on the intention to use memory implants in the comparison between the ethically in favor (G1) and ethically indifferent (G3) groups, where it was 0.033; with this single exception, this test likewise revealed no differences between groups. Nor did the test of confidence intervals reveal differences in any case. Consequently, the hypothesis that the models explaining the intention to use memory implants differ based on the ethical assessment thereof (H7) was rejected.

Table 8. Multi-group comparison

Ethically in favor (G1) vs. against (G2)
Hypothesis | Dif. path (G1-G2) | PEV | PW-S | PH | 2.5%, 97.5% (G1) | 2.5%, 97.5% (G2) | Signif.
Performance expectancy => Intention to use | 0.049 | 0.427 | 0.421 | 0.789 | -0.08, 0.14 | -0.01, 0.20 | n.s.
Effort expectancy => Intention to use | 0.063 | 0.689 | 0.676 | 0.339 | -0.01, 0.23 | -0.01, 0.16 | n.s.
Positive emotions => Intention to use | 0.101 | 0.311 | 0.319 | 0.159 | 0.39, 0.64 | 0.24, 0.55 | n.s.
Negative emotions => Intention to use | 0.109 | 0.459 | 0.463 | 0.225 | -0.09, 0.08 | -0.14, 0.08 | n.s.
Social influence => Intention to use | 0.030 | 0.199 | 0.191 | 0.904 | 0.04, 0.28 | 0.15, 0.37 | n.s.

Ethically in favor (G1) vs. indifferent (G3)
Hypothesis | Dif. path (G1-G3) | PEV | PW-S | PH | 2.5%, 97.5% (G1) | 2.5%, 97.5% (G3) | Signif.
Performance expectancy => Intention to use | 0.044 | 0.583 | 0.582 | 0.293 | -0.08, 0.14 | -0.12, 0.09 | n.s.
Effort expectancy => Intention to use | 0.035 | 0.646 | 0.649 | 0.677 | -0.01, 0.23 | 0.04, 0.23 | n.s.
Positive emotions => Intention to use | 0.150 | 0.061 | 0.063 | 0.033 | 0.40, 0.63 | 0.27, 0.47 | n.s.
Negative emotions => Intention to use | 0.047 | 0.479 | 0.470 | 0.764 | -0.10, 0.07 | -0.06, 0.14 | n.s.
Social influence => Intention to use | 0.106 | 0.196 | 0.197 | 0.901 | 0.04, 0.27 | 0.16, 0.37 | n.s.

Ethically against (G2) vs. indifferent (G3)
Hypothesis | Dif. path (G2-G3) | PEV | PW-S | PH | 2.5%, 97.5% (G2) | 2.5%, 97.5% (G3) | Signif.
Performance expectancy => Intention to use | 0.106 | 0.182 | 0.165 | 0.082 | 0.00, 0.20 | -0.12, 0.10 | n.s.
Effort expectancy => Intention to use | 0.065 | 0.340 | 0.313 | 0.844 | -0.01, 0.16 | 0.05, 0.24 | n.s.
Positive emotions => Intention to use | 0.049 | 0.591 | 0.608 | 0.297 | 0.23, 0.55 | 0.27, 0.47 | n.s.
Negative emotions => Intention to use | 0.095 | 0.190 | 0.178 | 0.908 | -0.13, 0.07 | -0.06, 0.13 | n.s.
Social influence => Intention to use | 0.003 | 0.972 | 0.971 | 0.487 | 0.15, 0.38 | 0.16, 0.37 | n.s.

Notes: Levels of significance based on a two-tailed Student's t (4,999) distribution. Dif. path = difference between path coefficients. PEV = p-value, equal-variances test. PW-S = p-value, Welch-Satterthwaite test. PH = p-value, Henseler test. n.s. = non-significant difference.
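The idea behind the non-parametric multigroup comparison can be sketched with a simple permutation test: shuffle group membership and ask how often the absolute difference between group path coefficients exceeds the observed difference. This is a hedged illustration in the spirit of the Henseler test, with OLS slopes on synthetic data standing in for the PLS path coefficients.

```python
# Hedged sketch: permutation test for a between-group difference in a
# path coefficient (proxied here by an OLS slope), two synthetic groups.
import numpy as np

rng = np.random.default_rng(4)
n1, n2 = 300, 240
x1, x2 = rng.normal(size=n1), rng.normal(size=n2)
y1 = 0.52 * x1 + rng.normal(scale=0.8, size=n1)   # G1-like relationship
y2 = 0.42 * x2 + rng.normal(scale=0.8, size=n2)   # G2-like relationship

slope = lambda x, y: np.polyfit(x, y, 1)[0]
obs_diff = abs(slope(x1, y1) - slope(x2, y2))

# Pool the cases, repeatedly permute the group labels, and count how often
# the permuted difference is at least as large as the observed one.
x, y = np.concatenate([x1, x2]), np.concatenate([y1, y2])
n_perm, count = 1000, 0
for _ in range(n_perm):
    perm = rng.permutation(n1 + n2)
    g1, g2 = perm[:n1], perm[n1:]
    if abs(slope(x[g1], y[g1]) - slope(x[g2], y[g2])) >= obs_diff:
        count += 1
p_value = (count + 1) / (n_perm + 1)
print(f"observed difference = {obs_diff:.3f}, permutation p = {p_value:.3f}")
```

With small true differences such as those in Table 8 (e.g., 0.10 between G1 and G2 for positive emotions), this kind of test typically fails to reject equality, which is consistent with the rejection of H7 above.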
5. Conclusions

This paper has analyzed and validated a model explaining the intention to use memory implants and the moderating effect of ethics in a process of evolution toward the cyborg. In this context, an expanded model was proposed combining cognitive, affective, and normative variables with ethics in the acceptance of a controversial cutting-edge technology. The proposed model explains the intention to use memory implants to increase capacities well, with a variance explained (R2) of 45.8% in the group of people with a positive ethical assessment of the technology, 47.2% in the group with a negative ethical assessment, and 36.9% in the group with an indeterminate ethical assessment.

Although the intention to use memory implants to increase capacities was shown to vary considerably depending on the ethical assessment thereof (the intention to use increases as the ethical assessment becomes more positive), ethics did not moderate the effects of the cognitive, affective, and normative variables of the proposed theoretical model. The three analyzed models explaining the intention to use these insideables (ethically in favor, ethically indifferent, and ethically against) behaved essentially the same way, and ethics did not moderate acceptance behavior toward the technology. Thus, while the ethical assessment of memory implants helps explain differences in the intention to use them, it does not moderate the influence of performance expectancy, effort expectancy, positive and negative emotions, or social influence on this intention. To explain this finding, the three elements that might come into play in the assessment of an ethical decision were considered, namely: (1) the object, (2) the intention, and (3) the circumstances.

First, the object is ethically controversial.
As noted, according to authors such as Park (2014) or Jotterand (2008), the life view of a cyborg would be different from that of a human, despite certain links to its human background, since its values, morals, and ethics would be related to its own life and to what it feels is or is not important or possible.
As for intention, the therapy/enhancement distinction is key to the ethical assessment of the purpose of the interventions carried out, whether to treat illness or disability or to enhance people's normal functioning or endow them with entirely new capacities. The underlying intention in the use of brain implants for therapeutic purposes does not seem to be ethically controversial. However, according to Warwick (2014), cyborgs pose an ethical dilemma when an individual's consciousness is altered by the merging of human and machine (Schermer, 2009; Park, 2014). In accordance with the theory of the circular evolution of ethics, the perceived benefits of non-medical implants, as compared to wearable technology, reflect the influence of the evolving behavior observed in the group ethically in favor of this technology on what society at large ultimately considers ethical (Goel et al., 2016).

With regard to the circumstances, while the ethical issues are real, this research has shown that more than a quarter of society holds a negative ethical opinion. Schermer (2009) has noted that it is not easy to adequately address the moral issues posed by such new technologies, because they also challenge some of the central concepts and categories used to understand and respond to moral questions. Hansson (2005: 523), for example, has argued that brain implants may be "reason to reconsider our criteria for personal identity and personality changes." These new technologies can also change elements of common morals, norms, and values.

Notwithstanding the above, regardless of individuals' ethical assessments, the variables showing the highest predictive capacity were the positive emotions produced by the idea of a memory implant and social influence. Negative emotions did not have a significant impact on the explanation of the intention to use.
Thus, the affective dimension – specifically, its positive component – and the normative dimension are what have the greatest predictive power regarding the intention to use. The cognitive dimension proved to have less explanatory capacity. The influence of effort expectancy on the intention to use memory implants was low and, contrary to what was expected, performance expectancy had no explanatory capacity with regard to the intention to use these insideables. These findings regarding memory implants are very similar to those obtained with regard to the concept of insideables in general. As in Pelegrín-Borondo et al. (2017), positive emotions and the social norm had the greatest predictive capacity in all three groups (ethically in favor, ethically against, and ethically indifferent). The main difference is that the predictive capacity of the model applied to insideables in general was 73.92%, while in the three models obtained to explain the acceptance of memory implants in particular, it was lower. Among the possible explanations for this, consideration should be given to the following.
First, the model was applied to a specific product type rather than a product category. Second, the CAN model has been shown to work better in cases in which there is greater dispersion in the intention to use. Memory implants are highly invasive of the brain, which generates negative emotions (e.g., Buchanan-Oliver and Cruz, 2011; Park, 2014; Lai, 2012; Schermer, 2009) and homogeneous scores for the intention to use within each segment: those who consider them ethical cluster around 4 (on a scale of 1 to 7); those who are ethically against them, around 1.7; and those who are ethically indifferent, around 2.5. As the results are concentrated at similar values within each group, the range of variation in the intention to use and, thus, the model's explanatory capacity, is lower. It is worth noting that only 14% and 13% of the sample gave scores of 6 or 7 to the intention to use and predicted use, respectively.

The main limitation of this work arises from the fact that the research was conducted in a single country. As noted in the discussion of the results, moral questions, on which individuals base their ethical assessments, are inherently cultural. To address this limitation, the study should be extended to other countries in order to examine cross-cultural differences in the model's application. Second, because a considerable segment of society – more than a third – has a positive ethical assessment of the use of memory implants, future multidisciplinary research should seek to identify and analyze the social, legal, economic and/or philosophical bases and parameters that might govern a cyborg society.
Finally, because the predictive capacity of the model applied to insideables in general was greater than that obtained to explain the acceptance of memory implants in particular, it is crucial to advance knowledge of cutting-edge technology acceptance models and to continue testing the present model's application to both product categories and specific products.

References

Adams, J. (2010). Motivational narratives and assessments of the body after cosmetic surgery. Qualitative Health Research, 20, 755-767. doi:10.1177/1049732310362984

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179-211.

Alaiad, A., & Zhou, L. (2014). The determinants of home healthcare robots adoption: An empirical investigation. International Journal of Medical Informatics, 83(11), 825-840. doi:10.1016/j.ijmedinf.2014.07.003
Anderson, J.C., & Gerbing, D.W. (1988). Structural equation modeling in practice: A review and recommended two-step approach. Psychological Bulletin, 103(3), 411-423. doi:10.1037/0033-2909.103.3.411

Bagozzi, R.P., Gopinath, M., & Nyer, P.U. (1999). The role of emotions in marketing. Journal of the Academy of Marketing Science, 27, 184-206. doi:10.1177/0092070399272005

Berger, F., Gevers, S., Siep, L., & Weltring, K.M. (2008). Ethical, legal and social aspects of brain-implants using nano-scale materials and techniques. NanoEthics, 2(3), 241-249. doi:10.1007/s11569-008-0044-9

Berger, J.L. (2011). Medical implant device with RFID tag and method of identification of device. U.S. Patent No. 7,932,825. Washington, DC: U.S. Patent and Trademark Office. Retrieved April 10, 2017 from https://www.google.com/patents/US7932825

Bertrand, M., & Bouchard, S. (2008). Applying the Technology Acceptance Model to VR with people who are favorable to its use. Journal of Cyber Therapy & Rehabilitation, 1(2), 200-210.

Bhattacharyya, A., & Kedzior, R. (2012). Consuming the cyborg. Advances in Consumer Research, 40, 960-961.

Bigné, J.E., Mattila, A.S., & Andreu, L. (2008). The impact of experiential consumption cognition and emotion on behavioural intentions. Journal of Services Marketing, 22(4), 303-315. doi:10.1108/08876040810881704

Buchanan-Oliver, M., & Cruz, A. (2011). Discourses of technology consumption: Ambivalence, fear, and liminality. In R. Ahluwalia, T.L. Chartrand, & R.K. Ratner (Eds.), Advances in Consumer Research, 39 (pp. 287-291). Duluth, MN: Association for Consumer Research.

Clausen, J. (2008). Moving minds: Ethical aspects of neural motor prostheses. Biotechnology Journal, 3(12), 1493-1501. doi:10.1002/biot.200800244

Clausen, J. (2011). Conceptual and ethical issues with brain-hardware interfaces. Current Opinion in Psychiatry, 24(6), 495-501. doi:10.1097/YCO.0b013e32834bb8ca

Cohen, J.B., Pham, M.T., & Andrade, E.B. (2006). The nature and role of affect in consumer behavior. In C.P. Haugtvedt, P. Herr, & F. Kardes (Eds.), Handbook of consumer psychology (pp. 297-348). Mahwah, NJ: Lawrence Erlbaum.

Cohen, A.I. (2005). Contemporary debates in applied ethics. Wiley-Blackwell.

Davis, F.D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly: Management Information Systems, 13(3), 319-340. doi:10.2307/249008
Davis, F.D., Bagozzi, R.P., & Warshaw, P.R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982-1003. doi:10.1287/mnsc.35.8.982

De Andrade, D.D. (2010). On norms and bodies: Findings from field research on cosmetic surgery in Rio de Janeiro, Brazil. Reproductive Health Matters, 18(35), 74-83. doi:10.1016/S0968-8080(10)35519-4

Dean, M., Raats, M., & Shepherd, R. (2008). Moral concerns and consumer choice of fresh and processed organic foods. Journal of Applied Social Psychology, 38(8), 2088-2107. doi:10.1111/j.1559-1816.2008.00382.x

Dunfee, T., & Donaldson, T. (1995). Contractarian business ethics: Current status and next steps. Business Ethics Quarterly, 5(2), 173-186. doi:10.2307/3857352

Falk, R.F., & Miller, N.B. (1992). A primer for soft modeling. University of Akron Press.

Fishbein, M., & Ajzen, I. (1975). Belief, attitude, intention, and behavior: An introduction to theory and research. Reading, MA: Addison-Wesley.

Ford, P.J. (2007). Neurosurgical implants: Clinical protocol considerations. Cambridge Quarterly of Healthcare Ethics, 16(3), 308-311. doi:10.1017/S096318010707034X

Ford, R., & Richardson, W. (1994). Ethical decision making: A review of the empirical literature. Journal of Business Ethics, 13(3), 205-221. doi:10.1007/BF02074820

Frey, R.G., & Heath Wellman, C. (2004). A companion to applied ethics. Blackwell Publishing.

Glannon, W. (2007). Bioethics and the brain. Oxford: Oxford University Press.

Goel, L., Hart, D., Junglas, I., & Ives, B. (2016). Acceptable IS use: Conceptualization and measurement. Computers in Human Behavior, 55, 322-328. doi:10.1016/j.chb.2015.09.029

Greiner, S. (2014). Cyborg bodies: Self-reflections on sensory augmentations. NanoEthics, 8(3), 299-302. doi:10.1007/s11569-014-0207-9

Haddow, G., King, E., Kunkler, I., & McLaren, D. (2015). Cyborgs in the everyday: Masculinity and biosensing prostate cancer. Science as Culture, 24(4), 484-506.

Hair, J.F., Anderson, R., Tatham, R., & Black, W. (1999). Análisis multivariante (5th ed.). Madrid: Prentice Hall Iberia.

Hair, J.F., Ringle, C.M., & Sarstedt, M. (2011). PLS-SEM: Indeed a silver bullet. Journal of Marketing Theory and Practice, 19(2), 139-151. doi:10.2753/MTP1069-6679190202

Hair, J.F., Ringle, C.M., & Sarstedt, M. (2013). Partial least squares structural equation modeling: Rigorous applications, better results and higher acceptance. Long Range Planning, 46(1/2), 1-12. doi:10.1016/j.lrp.2013.08.016
Hameed, M.A., Counsell, S., & Swift, S. (2012). A conceptual model for the process of IT innovation adoption in organizations. Journal of Engineering and Technology Management, 29(3), 358-390. doi:10.1016/j.jengtecman.2012.03.007

Han, S., Lerner, J., & Keltner, D. (2007). Feelings and consumer decision making: The appraisal-tendency framework. Journal of Consumer Psychology, 17(3), 158-168.

Handy, J., Hunter, I., & Whiddett, R. (2001). User acceptance of inter-organizational electronic medical records. Health Informatics Journal, 7(2), 103-107. doi:10.1177/146045820100700208

Hansson, S.O. (2005). Implant ethics. Journal of Medical Ethics, 31(9), 519-525. doi:10.1136/jme.2004.009803

Henseler, J., Ringle, C.M., & Sinkovics, R.R. (2009). The use of partial least squares path modeling in international marketing. In New challenges to international marketing (pp. 277-319). Emerald Group Publishing Limited.

Hyde, M., Punch, R., & Komesaroff, L. (2010). Coming to a decision about cochlear implantation: Parents making choices for their deaf children. Journal of Deaf Studies and Deaf Education, 15(2), 162-178. URL: http://www.jstor.org/stable/42659026

Javo, I.M., & Sørlie, T. (2010). Psychosocial predictors of an interest in cosmetic surgery among young Norwegian women: A population-based study. Plastic Surgical Nursing, 30(3), 180-186. doi:10.1097/PRS.0b013e3181bcf290

Jotterand, F. (2008). Beyond therapy and enhancement: The alteration of human nature. NanoEthics, 2(1), 15-23. doi:10.1007/s11569-008-0025-z

LaFollette, H. (2002). Ethics in practice (2nd ed.). Blackwell Publishing.

Lai, A.L. (2012). Cyborg as commodity: Exploring conception of self-identity, body and citizenship within the context of emerging transplant technologies. Advances in Consumer Research, 40, 386-394. URL: http://hdl.handle.net/2381/27579

Levav, J., & McGraw, A.P. (2009). Emotional accounting: How feelings about money influence consumer choice. Journal of Marketing Research, 46(1), 66-80. doi:10.1509/jmkr.46.1.66

MacKenzie, S.B., Podsakoff, P.M., & Jarvis, C.B. (2005). The problem of measurement model misspecification in behavioral and organizational research and some recommended solutions. Journal of Applied Psychology, 90(4), 710-730. doi:10.1037/0021-9010.90.4.710

Mano, H. (2004). Emotion and consumption: Perspectives and issues. Motivation and Emotion, 28(1), 107-120. doi:10.1023/B:MOEM.0000027280.10731.76
McGee, E.M., & Maguire, G.Q. (2007). Becoming borg to become immortal: Regulating brain implant technologies. Cambridge Quarterly of Healthcare Ethics, 16(3), 291-302. doi:10.1017/S0963180107070326

Milleson, V. (2013). Nanotechnology, the brain, and the future: Ethical considerations. In Nanotechnology, the Brain, and the Future (pp. 79-96). Netherlands: Springer. doi:10.1007/978-94-007-1787-9_5

Mnyusiwalla, A., Daar, A.S., & Singer, P.A. (2003). "Mind the gap": Science and ethics in nanotechnology. Nanotechnology, 14(3), R9-R13. doi:10.1088/0957-4484/14/3/201

Most, T., Wiesel, A., & Blitzer, T. (2007). Identity and attitudes towards cochlear implant among deaf and hard of hearing adolescents. Deafness & Education International, 9(2), 68-82. doi:10.1002/dei.207

Nijboer, F., Clausen, J., Allison, B.Z., & Haselager, P. (2013). The Asilomar survey: Stakeholders' opinions on ethical issues related to brain-computer interfacing. Neuroethics, 6(3), 541-578. doi:10.1007/s12152-011-9132-6

Ochsner, B., Spöhrer, M., & Stock, R. (2015). Human, non-human, and beyond: Cochlear implants in socio-technological environments. NanoEthics, 9(3), 237-250. doi:10.1007/s11569-015-0242-1

Olarte-Pascual, C., Pelegrín-Borondo, J., & Reinares-Lara, E. (2015). Implants to increase innate capacities: Integrated vs. apocalyptic attitudes. Is there a new market? Universia Business Review, 48, 86-117.

Park, E. (2014). Ethical issues in cyborg technology: Diversity and inclusion. NanoEthics, 8(3), 303-306. doi:10.1007/s11569-014-0206-x

Parkhurst, A. (2012). Becoming cyborgian: Procrastinating the singularity. The New Bioethics, 18(1), 68-80. doi:10.1179/2050287713Z.0000000006

Pelegrín-Borondo, J., Reinares-Lara, E., & Olarte-Pascual, C. (2017). Assessing the acceptance of technological implants (the cyborg): Evidences and challenges. Computers in Human Behavior, 70, 104-112. doi:10.1016/j.chb.2016.12.063

Pelegrín-Borondo, J., Reinares-Lara, E., Olarte-Pascual, C., & Sierra-García, M. (2016). Assessing the moderating effect of the end user in consumer behavior: The acceptance of technological implants to increase innate human capacities. Frontiers in Psychology, 7, 1-13. doi:10.3389/fpsyg.2016.00132
Ram, J., Corkindale, D., & Wu, M.-L. (2014). ERP adoption and value creation: Examining the contributions of antecedents. Journal of Engineering and Technology Management, 33, 113–133. doi:10.1016/j.jengtecman.2014.04.001
Reidenbach, R. E., & Robin, D. P. (1988). Some initial steps toward improving the measurement of ethical evaluations of marketing activities. Journal of Business Ethics, 7(11), 871–879. doi:10.1007/BF00383050
Reidenbach, R. E., & Robin, D. P. (1990). Toward the development of a multidimensional scale for improving evaluations of business ethics. Journal of Business Ethics, 9(8), 639–653. doi:10.1007/BF00383391
Reinares-Lara, E., Olarte-Pascual, C., Pelegrín-Borondo, J., & Pino, G. (2016). Nanoimplants that enhance human capabilities: A cognitive-affective approach to assess individuals' acceptance of this controversial technology. Psychology & Marketing, 33(9), 704–712. doi:10.1002/mar.20911
Roldán, J. L., & Sánchez-Franco, M. J. (2012). Variance-based structural equation modeling: Guidelines for using partial least squares in information systems research. In M. Mora, O. Gelman, A. Steenkamp, & M. Raisinghani (Eds.), Research methodologies, innovations and philosophies in software systems engineering and information systems (pp. 193–222). Hershey, PA: Information Science Reference.
Rosahl, S. K. (2004). Vanishing senses: Restoration of sensory functions by electronic implants. Poiesis & Praxis, 2, 285–295. doi:10.1007/s10202-003-0057-y
Sarstedt, M., Hair, J. F., Ringle, C. M., Thiele, K. O., & Gudergan, S. P. (2016). Estimation issues with PLS and CBSEM: Where the bias lies! Journal of Business Research, 69(10), 3998–4010. doi:10.1016/j.jbusres.2016.06.007
Schermer, M. (2009). The mind and the machine. On the conceptual and moral implications of brain-machine interaction. NanoEthics, 3(3), 217–230. doi:10.1007/s11569-009-0076-9
Selinger, E., & Engström, T. (2008). A moratorium on cyborgs: Computation, cognition, and commerce. Phenomenology and the Cognitive Sciences, 7(3), 327–341. doi:10.1007/s11097-008-9104-4
Shiv, B., & Fedorikhin, A. (1999). Heart and mind in conflict: The interplay of affect and cognition in consumer decision making. Journal of Consumer Research, 26(3), 278–292.
Thompson, J., & Hart, D. (2006). Psychological contracts: A nano-level perspective on social contract theory. Journal of Business Ethics, 68(3), 229–241. doi:10.1007/s10551-006-9012-x
Van Waterschoot, W., Kumar Sinha, P., Van Kenhove, P., & De Wulf, K. (2008). Consumer learning and its impact on store format selection. Journal of Retailing and Consumer Services, 15(3), 194–210. doi:10.1016/j.jretconser.2007.03.005
Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the Technology Acceptance Model: Four longitudinal field studies. Management Science, 46(2), 186–204. doi:10.1287/mnsc.46.2.186.11926
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly: Management Information Systems, 27(3), 425–478. doi:10.2307/30036540
Venkatesh, V., Thong, J., & Xu, X. (2012). Consumer acceptance and use of information technology: Extending the Unified Theory of Acceptance and Use of Technology. MIS Quarterly: Management Information Systems, 36(1), 157–178. URL: https://ssrn.com/abstract=2002388
Von Soest, T., Kvalem, I. L., Skolleborg, K. Chr., & Roald, H. E. (2006). Psychosocial factors predicting the motivation to undergo cosmetic surgery. Plastic and Reconstructive Surgery, 117(1), 51–62. doi:10.1097/01.prs.0000194902.89912.f1
Warwick, K. (2003). Cyborg morals, cyborg values, cyborg ethics. Ethics and Information Technology, 5(3), 131–137. doi:10.1023/B:ETIN.0000006870.65865.cf
Warwick, K. (2014). The cyborg revolution. NanoEthics, 8(3), 263–273. doi:10.1007/s11569-014-0212-z
Watson, D., Clark, L. A., & Tellegen, A. (1988). Development and validation of brief measures of positive and negative affect: The PANAS scales. Journal of Personality and Social Psychology, 54(6), 1063–1070. doi:10.1037/0022-3514.54.6.1063
White, C., & Yu, Y. T. (2005). Satisfaction emotions and consumer behavioral intentions. Journal of Services Marketing, 19(6), 411–420. doi:10.1108/08876040510620184
Wismeijer, D., Van Waas, M. A. J., Vermeeren, J. I. J. F., Muldel, J., & Kalk, W. (1997). Patient satisfaction with implant-supported mandibular overdentures: A comparison of three treatment strategies with ITI-dental implants. International Journal of Oral and Maxillofacial Surgery, 26(4), 263–267.
Yang, H. D., & Yoo, Y. (2004). It's all about attitude: Revisiting the Technology Acceptance Model. Decision Support Systems, 38(1), 19–31. doi:10.1016/S0167-9236(03)00062-9
Zielke, S. (2011). Integrating emotions in the analysis of retail price images. Psychology & Marketing, 28, 330–359. doi:10.1002/mar.20355
Highlights
The development of neural implants enables the creation of cyborgs with superior capacities.
The intention to use memory implants is linked to the individual's ethical assessment.
Affective and normative factors have the greatest influence on acceptance.
Within the affective dimension, positive emotions have the greatest impact.