Contextualizing privacy on health-related use of information technology


Computers in Human Behavior 105 (2020) 106204


Journal homepage: http://www.elsevier.com/locate/comphumbeh

Yong Jin Park a, Donghee Don Shin b,*

a Department of Strategic, Legal, Management Communications, Graduate Department of Communication, Culture and Media Studies, School of Communications, Howard University, 2400 Sixth Street NW, Washington, DC 20059
b Zayed University, Abu Dhabi, PO Box 144534, UAE

ARTICLE INFO

Keywords: privacy; data; algorithm; health information technologies

ABSTRACT

Privacy amid the rapid digitalization of medical records is a critical ingredient in the success of electronic health services. This paper explores the potential roles of privacy attitudes concerning medical data, based on a large national sample (n = 2638) from the U.S. Health Information National Trends Survey. We examine the ways in which privacy concern and confidence are (a) mediated through one's interest in sharing information with health professionals and (b) moderated by one's medical condition and reliance on the Internet. Evidence from this study provides insights into the factors shaping health-related engagement with information technologies, helping us argue that privacy is a key predictor. The discussion offers interpretations of how people's perceived need for medical data mediates privacy concern, contextualizing the affordances of health technologies in future algorithmic applications.

If a person’s medical information is the key to finding clinical treatment, how to maintain the privacy of health records is a central issue that determines the success of medical practice. Increasingly, people interact with health-care providers using digital media technologies (Lin, Zhang, Song, & Omori, 2016; Lupton, 2019; Park, Chung, & Shin, 2018). They find valuable advice concerning their health online; schedule a doctor’s appointment through email; share their experiences about a recent hospital visit on social media; and track their daily exercise habits in smartphone apps. People also follow automated, machine-based health advice. These activities, once thought to be ‘private’, now leave permanent traces, which can be used to track and store one’s life history in perpetual digital dossiers of health-related profiles. Accompanying the acceleration of medical data collection are rapid advancements in algorithmic computing capacities to aggregate, analyze, and draw sensitive inferences about individuals from their health data (Gandy & Nemorin, 2018, pp. 1–15; Libert, 2015; Park, Chung, & Shin, 2018).

This paper examines these issues from the perspective of individual users, as opposed to that of health providers. We seek to explore whether and to what extent privacy concern and confidence regarding medical data may discourage or encourage people’s engagement in health-related use of information technologies, such as email, text messaging, smartphones, mobile apps, and social media. We draw upon the notion of contextual integrity (Nissenbaum, 2011) and synthesize diverse

streams of privacy research to highlight: (a) the contextual condition that influences one’s decisions to engage in or disengage from health-related use of digital technologies and (b) the contextual process in which the interest in sharing one’s private boundaries mediates the behavioral effects of privacy concern and confidence.

Our rationale is that the shaping of privacy in algorithm-based systems will determine their future success. Thus, it is of primary importance to better understand the effects of privacy concern and confidence, as contextually specified in the use of health-related technologies. Accordingly, we organize this paper as follows. We first theorize the contextual condition and process of privacy concern and confidence, and propose a set of hypotheses regarding their links to behavioral consequences. Subsequently, we describe our approach to the data analyses and discuss how the potential benefits of an algorithm-based health system will depend upon people’s perceptions of an increasingly risky data environment.

1. Theorizing a contextual integrity approach to digital health privacy

A sense of privacy is closely related to the nature of the context in which individuals come to evaluate the extent of privacy violation with their

* Corresponding author.
E-mail addresses: [email protected] (Y.J. Park), [email protected] (D.D. Shin).
https://doi.org/10.1016/j.chb.2019.106204
Received 5 July 2019; Received in revised form 18 October 2019; Accepted 15 November 2019; Available online 18 November 2019
0747-5632/© 2019 Elsevier Ltd. All rights reserved.


normative expectations or values. Nissenbaum (2011) called this the ‘contextual integrity’ of privacy, meaning that distinctive social contexts give birth to different sets of norms and appropriateness related to privacy practices, needs, and expectations. This notion of contextual integrity effectively helps us reject privacy as a binary matter of public versus private, inviting more subtle insights into how individual perceptions of privacy protection or violation are contingent upon contextual appropriateness. People choose to reveal their identities according to specific contexts, as privacy norms and expectations may differ by contextual need and the interest in sharing ‘selves’ with others. That is, being able to perform in public and manage identities by contextual demands is essential to humans as social beings.

We premise that the medical record and its privacy epitomize the notion of contextual integrity (Barocas & Nissenbaum, 2014). Digital technologies disrupt relationships among physicians, patients, and nurses, as the health data ecology in emerging algorithmic applications extends to numerous entities such as insurance companies, pharmacists, advertising sponsors, online content publishers, and so forth. For those with a medical condition, the range of personal data becomes vast, spanning symptoms, diagnoses, and prescriptions, as well as biographical metadata, propelling a new set of privacy practices and norms to evolve (Libert, 2015). To us, this is a disruptive change giving rise to a contextual conundrum in which sharing personal data with health professionals has become a critical point of personal decision, yet with no clear norm or boundary of privacy established over the flow of medical data. One might project that both individuals and health professionals can equally benefit from sophisticated algorithmic analytics of largely shared digital databases.
For instance, hereditary disease detection, preventive treatment, and probabilistic diagnostics are all possible with the help of personal data. However, the potential threat to privacy can hinder the development of a health care system, not merely because users shy away after privacy violations, but also because mistrust, concern, and a lack of confidence in the overall system may follow. Sharing data can translate into immediate progress and efficiency in health services. On the other hand, it raises the possibility of personal data being abused for manipulation, hidden influence, or even discrimination against those with preexisting medical issues (Gandy & Nemorin, 2018, pp. 1–15).

A tenet of privacy studies has been individuals’ concern about whether they are able to control their data (Katz & Tassone, 1990). Westin (2003) famously developed categories of privacy concern to cluster individuals into three groups: the privacy unconcerned (people with no privacy concern at all), pragmatists (those who are willing to divulge data for benefits), and privacy absolutists (individuals with a high concern about losing control of their personal information). Yet the predictive power of attitudinal measures of concern was soon questioned in the literature, as scholars began to detect the so-called ‘privacy paradox.’ The privacy paradox refers to a somewhat paradoxical phenomenon whereby people tend to say they are concerned about privacy loss, but when asked about actual behaviors, confess that they rarely take any protective action. Nor do they hesitate to divulge data for such compensations as free access, monetary rewards, or coupons (Chen et al., 2018; Dienlin & Trepte, 2015; Trepte et al., 2014). This study argues that the privacy paradox (no apparent effect of privacy attitude) should be taken seriously in the context of health data to detect potential effects of legitimate worries over a medical system.
Previous studies have offered various plausible accounts of the privacy paradox. One explanation is that although people do care about privacy, they engage in cost-benefit analysis and may not seek privacy protection when they see that revealing data will result in specific rewards. Behavioral economists subscribe to this view because the transactional cost of taking protective measures, or of making decisions that are alternatives to seeking rewards such as monetary compensation or free access, often turns out to be cumbersome. Scholars (McDonald & Cranor, 2008) have also pointed out the somewhat burdensome obligation of, for instance, weighing websites’ privacy statements, which brings similar behavioral consequences of non-action, as almost no one would invest

time and effort to read a privacy policy.

Another robust explanation derives from the knowledge-deficit hypothesis. That is to say, a behavioral gap exists between those who understand the potential risks of personal data loss and those who do not. Studies (Acquisti & Grossklags, 2005) attributed inadequate protective behavior and reckless disclosure to a lack of knowledge, and demonstrated that there are social consequences in the form of widening disparities among different populations. Scholars (Hargittai & Marwick, 2016; Trepte et al., 2014) suggested that knowledge fixes the disjuncture between privacy concern and behavior, and even helps the cognitive calculus behind privacy behavior. A lack of knowledge tends to lead people to resign themselves to data disclosure because they perceive privacy violation as unavoidable and do not fully understand the potential pitfalls of digital activities. Here this study intends to spur further debate by adding the notion of contextual integrity as a plausible explanation.

1.1. Research question and hypotheses

Careful consideration is in order. First, the assumption of the privacy paradox postulates that concern has only a direct behavioral effect. This account eliminates nuanced contextual conditions that may attenuate attitudinal effects. Second, the focus on behavioral consequences alone neglects a subtle psychological process in which specific individual contextual needs intervene. This may be particularly pertinent to those with a medical issue, or those who might have a rational interest in seeking out medical help, as well as a psychological need to share and open personal boundaries to connect with health professionals. Third, the behavioral consequences of privacy confidence (being self-assured about data control), as opposed to concern (being worried about data loss), are rarely analyzed.
Humans are complex: their behavioral outcomes can be influenced by how worried they are, as much as by how confident they are, about their privacy when they actively cope with the disclosure of personal boundaries (Marwick & boyd, 2014). The realistic context in which people engage with health-related technologies would be influenced by both attitudes.

RQ1. To what extent will privacy concern and confidence toward medical data affect health-related use of information technologies?

One of the key arguments in this work is that the paradoxical puzzle of privacy may be solvable when we specify the behavioral effects of privacy attitudes in a more delicate contextual condition and process (Barocas & Nissenbaum, 2014; Libert, 2015). We make two contextual specifications. First, in assessing contextual conditions, we specify the following backgrounds: (a) those who primarily rely on the Internet for health information and (b) those who have a chronic medical issue. Second, we propose the contextual process in which the effects of privacy concern and confidence are mediated through one’s interest in sharing health-related personal data (that is, the opposite of enclosing oneself in one’s private boundaries).

A study by Rice (2006; Tan & Goonawardene, 2017) indicated that people who searched for health information online were more likely to be involved in healthy activities than those who did not. Previous research (Cotten & Gupta, 2004) also suggested that those who relied on the Internet for health information were more health-conscious and held stronger health beliefs. Thus, it is reasonable to speculate that there is a distinctive attitude toward health information, particularly among those who primarily rely on the Internet for health-related purposes. Here we reason that the behavioral effects of privacy attitudes, such as concern and confidence, may be moderated by the context of health-related online experience.
Put differently, we propose conditional effects in which the effects of concern and confidence might depend upon the extent to which a person relies on the Internet for health information. Those with a higher reliance on the Internet, for instance, might forgo their health privacy concerns because they see outstanding informational benefits and stay inclined to engage with the technologies. On the other hand, it is plausible to reason that those who are chronically ill might also feel


the further need to protect their medical records. That is to say, specific medical conditions may trigger people to be even more data-conscious in their health-related digital engagement, especially when they already have a certain level of privacy concern (McMullan, 2006). Alternatively, it is possible that a medical condition can attenuate any behavioral effect of privacy concern because medical care and its benefits may be salient to people, while data practices per se are less so. In a similar vein, a low level of privacy confidence might drive people away from the use of health-related technologies, despite having a certain illness. Reflecting these expectations, we hypothesize the moderating effects of (a) a medical condition and (b) a reliance on the Internet for health information. We still remain cautious about formulating a directional hypothesis because contextual moderation can be supportive of either direction.

interest to share data to the outcome variables of health-related digital engagement. Put differently, people with higher concern may not be interested in sharing data because they are turned away from contextual as well as psychological needs. On the other hand, those with higher confidence will be interested in sharing data and feel able to open up personal boundaries to connect, which can lead to a higher level of health-related engagement. To the best of our knowledge, however, direct empirical evidence concerning the mediated paths remains scarce. Further, the extant literature has tended to focus mostly on the context of e-commerce. Nevertheless, much of the reasoning above enables us to hypothesize mediating effects as follows.

H2. The effects of privacy concern and confidence on health-related use of information technologies will be indirect, mediated through the interest to share personal health data.

The hypothesized relationships are summarized in Fig. 1.

H1. The effects of privacy concern and confidence on health-related use of information technologies will differ by the levels of (a) a medical condition and (b) a reliance on the Internet for health information.

2. Method

The above contextual condition aside, we propose to assess the contextual process and estimate how the effects of privacy attitudes (concern and confidence) might be mediated by specific individual contexts. Prior studies (Chen, Quan-Haase, & Park, 2018; Jensen & Potts, 2004) showed that privacy concern was inversely related to the willingness to divulge data to e-commerce sites. Other studies (Olivero & Lunt, 2004) also indicated that perceived risk and mistrust were associated with a higher level of demand for data control (that is, less willingness to give up or share personal data), negatively affecting the relationship between online retailers and consumers.

Applying this to health privacy in the context of health-information technologies, we expect that one’s interest in sharing personal medical data will mediate the effects of privacy concern and confidence. In other words, the behavioral effects of privacy attitudes will be indirect, as there can be a specific contextual need to open personal boundaries and share personal details to connect with health professionals. Our ability to express and accurately share personal or emotional boundaries is central to building relationships in human life (Acquisti, Brandimarte, & Loewenstein, 2015; Jiang & Hancock, 2013). That is, the interest in sharing personal details and associated health data may be spurred or discouraged by privacy concern and confidence, eventually mediating their behavioral effects.

Notice two paths. The first path runs from health privacy concern and confidence to the interest to share data, with the second path from the

2.1. Participant sample

This study re-examined secondary data from the Health Information National Trends Survey (2014), a nationally representative sample of adults (18 years or older) in the U.S. (N = 3677). We analyzed data collected from August 20 to November 17, 2014, with a response rate of 26.3% for completed returns. The analysis for this study filtered out those who did not have a smartphone or tablet. This helped us capture a sample of respondents who had easier or more rapid access to those digital devices and to personalized mobile applications related to health. Table 1 summarizes the characteristics of both the smartphone/tablet (n = 2392, the sample of interest to this study) and Internet user samples (n = 2638).

Just over 90 percent of Internet users in the HINTS sample had a smartphone or tablet, reflecting a trend of increasing smartphone penetration. The rate was higher than in the general U.S. population, where 81 percent of the public owned a smartphone device in 2016. Nevertheless, there was no discernible demographic difference between the Internet and smartphone/tablet sample users (Table 1). The age of the smartphone/tablet sample was not necessarily skewed toward the younger generation (M = 50.18), with no difference from Internet users (M = 51.54). Although we had close to an equal representation of men and women (60.4% and 59.6% female, respectively, for the smartphone/tablet and Internet samples), the proportion of women was slightly higher than in the U.S. population. The average education level of the smartphone/tablet

Fig. 1. Hypothesized relationships.

Y.J. Park and D.D. Shin

Computers in Human Behavior 105 (2020) 106204


Table 1
Characteristics of the Internet (n = 2638) and smartphone/tablet (n = 2392) samples.

                                 % or M          SD             Min                       Max
Age                              51.54 / 50.18   15.40 / 14.97  18 / 18                   94 / 94
Female (coded high)              59.6 / 60.4     –              0                         1
Education                        5.19 / 5.17     1.45 / 1.49    –                         –
Income                           5.68 / 5.74     2.10 / 2.13    1 (≤ $9,999)              9 (≥ $200,000)
Nonwhite (coded higher)          0.22 / 0.25     0.42 / 0.43    0                         1
Condition                        1.22 / 1.18     1.28 / 1.27    0 (no medical condition)  6 (aggregate)
Internet Reliance for Health     0.77 / 0.76     0.42 / 0.42    0                         1
Health Privacy Confidence        1.94            0.66           1                         3
Health Privacy Concern           1.89            0.69           1                         3
Interest in Sharing              25.46           8.22           9                         36

Note. Where two entries are given, the first denotes the Internet sample and the second the smartphone/tablet sample.

2.3. Outcome variables: health-related use of information technologies

To operationalize the outcome variables, health-related use of information technologies was conceptualized along three dimensions: (a) information exchange with a health provider, (b) no exchange at all, and (c) having health-related software or mobile applications. We used dichotomous items, with Yes coded as 1 and No as 0. For (a) information exchange, respondents were asked to report whether in the past 12 months they had used each of the following to exchange medical information with a health professional: email (M = 0.29, SD = 0.45), text (M = 0.08, SD = 0.28), social media (M = 0.03, SD = 0.19), and an app on a smartphone or mobile device (M = 0.06, SD = 0.24). For (b) no exchange, a unitary item was used. Respondents were asked whether, “In the past 12 months, you have used ‘None’ (of the above) to exchange medical information with a health care professional” (M = 0.63, SD = 0.48). A unitary item was also used for (c) having health-related software/apps. The wording of the question was: “On your tablet or smartphone, do you have any software applications or ‘apps’ related to health?” (M = 0.35, SD = 0.48).

Here it is important to note the dimensional distinction. The information exchange items (a) capture active engagement or interaction with a health professional via digital technologies, whereas the item for no exchange (b) recognizes disengagement from such technological uses. Having health-related software or applications (c), on the other hand, indicates potential access to machine-based automated algorithms, not necessarily via personal contact with a health provider. These items could have been combined to create a parsimonious index. However, we opted to analyze them as disparate measures because we were interested in the different forms of health-related technologies, which were key to this study. For instance, email use and its formality is distinctively different from texting, which is usually reserved for more intimate and informal relationships. Likewise, social media platforms are often used for fostering social support and interconnectedness. This is also in line with our conceptual premise that the contextual subtlety in privacy (and its protection) as perceived by individuals may produce significant differences in behavioral outcomes. By parceling out the different forms of personalized technologies, we can detect such differences more clearly, as the type and extent of influences from the predictive variables can be observed precisely.
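The dichotomous coding described above can be made concrete with a small sketch. The column names and responses below are hypothetical, not the actual HINTS variable names, and in HINTS the “None” response is its own survey item; deriving it from the four channels here serves only to illustrate the dimensional logic:

```python
import pandas as pd

# Hypothetical raw responses; column names are illustrative only.
raw = pd.DataFrame({
    "exchanged_email":  ["Yes", "No",  "No", "No"],
    "exchanged_text":   ["No",  "No",  "No", "Yes"],
    "exchanged_social": ["No",  "No",  "No", "No"],
    "exchanged_app":    ["No",  "Yes", "No", "No"],
    "has_health_app":   ["Yes", "No",  "No", "Yes"],
})

# Yes -> 1, No -> 0 for every dichotomous item
coded = (raw == "Yes").astype(int)

# 'No exchange' derived as: none of the four exchange channels used
exchange_cols = ["exchanged_email", "exchanged_text",
                 "exchanged_social", "exchanged_app"]
coded["no_exchange"] = (coded[exchange_cols].sum(axis=1) == 0).astype(int)
```

Keeping the six indicators separate, rather than summing them into one index, is what allows each technology's association with the privacy predictors to be estimated on its own.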




sample was 5.17 (some college, on a scale of 7), with the income level at $49,000 (M = 5.74, on a scale of 9). In terms of race, 75 percent of the smartphone/tablet user sample was white. These self-reported demographic profiles served as control variables in our analyses.

2.2. Measures of mediator and moderators

In specifying the contextual process, the mediating variable was conceptualized in terms of a person’s interest in sharing personal data with a health professional. Participants were asked, on a scale of 4 (1 = not at all, 4 = very), about the extent of their interest in sharing each of nine types of medical data. These were: general health tips; medication reminders; lab/test results; diagnostic information (e.g., medical illnesses or diseases); vital signs (e.g., heart rate, blood pressure, glucose level); lifestyle behaviors (e.g., physical activity, food intake, and the like); symptoms (e.g., nausea, pain, dizziness, etc.); and digital images/video (e.g., photos of skin lesions). We constructed an additive scale by summing the individual scores of the nine items (M = 25.46, SD = 8.22). Certainly, the sharing of these data may overlap, because different medical information is often needed simultaneously for multiple clinical purposes. Thus, instead of teasing out which type of data is more likely to be shared than another at a given point, we examined the combined interest and its contribution to the changes in the dependent variables. Cronbach’s alpha reliability coefficient for the scale was .92.

We used two moderating variables to specify the contextual condition: (a) a reliance on the Internet for health information and (b) a medical condition. A reliance on the Internet for health information was measured by asking respondents to identify the source of information to which they go first when looking for a medical or health-related topic. Respondents were given twelve categories: brochures, pamphlets, etc.; cancer organization; family; friend/co-worker; doctor or health care provider; library; magazines; newspaper; telephone information number; complementary, alternative, or unconventional practitioner; and the Internet. The categories were recoded as Internet (1) and all others (0) (M = 0.76, SD = 0.42). A medical condition was measured by counting the number of chronic conditions each respondent reported (diabetes, high blood pressure, heart attack/condition, chronic lung disease, arthritis, or depression–anxiety). A variable was created based on each respondent’s cumulative score (M = 1.18, SD = 1.27), with each of any chronic health issues coded as 1.

2.4. Privacy predictors

As for the predictive variables, this study accounts for variance attributable to two sets of privacy variables: (a) concern and (b) confidence. As discussed earlier, the dominant measure of privacy attitude has been concern (Westin, 2003). Although useful, this is a self-reported measure of affect that assesses an emotional state of worry or fear in a single dimension. But privacy attitude has both negative and positive dimensions, and excluding the positive dimension would have precluded the ability to compare the extents of behavioral disjuncture invoked by positive and negative appraisals of the use of new technologies. Thus, this study included two separate measures of privacy confidence and concern in all relevant analyses.

Two unitary items, related to electronic health data, were used. Privacy concern was measured by asking, “If your medical information is sent electronically from one health care provider to another, how concerned are you?” (M = 1.89, SD = 0.69, on a 3-point scale of 1 = not concerned, 2 = somewhat concerned, 3 = very concerned). Privacy confidence was measured by asking, “How confident are you that safeguards (including the use of technology) are in place to protect your medical records?” (M = 1.94, SD = 0.66, on a 3-point scale of 1 = not confident, 2 = somewhat confident, 3 = very confident). To assess the discriminant validity of the two items, an unrotated principal component factor analysis was performed. The factor loadings confirmed the one-component solution


actual process in which the discrete relationships among all variables can be more precisely examined. Certainly, a large sample size like ours (n = 2392) warrants Sobel tests, which allow the legitimate use of standard errors without the need to resort to bootstrapping simulation.
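A Sobel test of the kind invoked here can be computed directly from the two path coefficients and their standard errors. A minimal sketch follows; the coefficient values are illustrative assumptions, not the study's estimates:

```python
import math

def sobel_test(a, se_a, b, se_b):
    """Sobel z for the indirect effect a*b.

    a, se_a : path from predictor to mediator, with its standard error
    b, se_b : path from mediator to outcome (controlling the predictor)
    """
    indirect = a * b
    se_ab = math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
    z = indirect / se_ab
    p = math.erfc(abs(z) / math.sqrt(2))  # two-tailed normal p-value
    return indirect, z, p

# Illustrative (hypothetical) path estimates
indirect, z, p = sobel_test(a=0.40, se_a=0.05, b=0.25, se_b=0.04)
```

The test treats the product of the two paths as approximately normal, which is why a large sample is needed for the standard-error formula to be trustworthy without bootstrapping.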

(−0.802, at an eigenvalue of 1.28), with 64.3% of the total variance accounted for. The negative loading signified that each item was related to the factor solution in an opposite direction. Accordingly, the two items of concern and confidence were treated as respective measures, instead of being combined into one factor score. The inter-item correlation (r) was −0.28 and significant at p < .01, lending support to the separation of the two attitudinal items.
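This result can be checked analytically: with two standardized items, the leading eigenvalue of the 2 x 2 correlation matrix is 1 + |r|, so an inter-item r of −.28 implies exactly the reported eigenvalue of 1.28 and 64% of variance explained. A sketch on simulated data (not the survey data) reproduces the opposite-signed loadings:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2392

# Simulated 3-point concern/confidence items with a built-in negative
# association (an illustrative stand-in for the survey items)
latent = rng.normal(size=n)
concern = np.clip(np.round(2 + 0.6 * latent + rng.normal(0, 0.6, n)), 1, 3)
confidence = np.clip(np.round(2 - 0.6 * latent + rng.normal(0, 0.6, n)), 1, 3)

X = np.column_stack([concern, confidence])
Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize the items
R = np.corrcoef(Z, rowvar=False)           # 2x2 correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)       # unrotated principal components

k = np.argmax(eigvals)                     # leading component
loadings = eigvecs[:, k] * np.sqrt(eigvals[k])
explained = eigvals[k] / eigvals.sum()     # proportion of variance explained
```

Because the two items correlate negatively, the leading component's loadings carry opposite signs, which is the pattern the authors use to justify keeping concern and confidence separate.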

4. Results

RQ1 asked the extent to which privacy attitudes toward medical data affected health-related use of information technologies, with preliminary results concerning bivariate correlations shown in Fig. 2. The directions of the correlations were in line with what would be expected of the consequences of privacy attitudes. Those who expressed concern tended to report disengagement from information technologies, while those with higher confidence were more likely to report engagement. For instance, email exchange was negatively correlated with concern (r = −0.09, p < .01), but positively with confidence (r = 0.05, p < .05). Those who did not engage in any of the technologies expressed higher concern (r = 0.06, p < .01) and lower confidence (r = −0.06, p < .01). Still, the patterns were not consistent. Text message exchange was positively correlated with confidence (r = 0.07, p < .01), but not with concern at all. Social media exchange was not correlated with either concern or confidence, whereas having a health-related app/software on a smartphone or mobile device was only marginally correlated with confidence (r = 0.06, p < .10).

Table 2 shows the results of the multivariate logistic regressions that examined the relationships between the predictors of privacy concern and confidence, controlling for all variables in one model (RQ1). The paradoxical non-effects of privacy attitudes were even more evident than in the correlational analyses. First, in terms of privacy concern, we found no significant effect. The effects of privacy confidence, on the other hand, were present ((b) = 1.87, p < .01, mobile app exchange; (b) = 1.67, p < .05, text message exchange; (b) = 0.82, p < .05, no exchange). However, its effects were absent for email exchange, social media exchange, and having a health-related app/software. In other words, although privacy confidence had more predictive power than privacy concern, its effects were sporadic, with no consistency.
Interestingly, both privacy confidence and concern failed to predict email exchange, social media exchange, and having a health-related app/software. Reflecting these results, the overall explanatory power (Nagelkerke R2) in each model remained as low as 0.040. Before turning to the findings for H1 and H2, it is critical to note the significant predictors among the control variables to better understand the effects of the privacy predictors. As shown in Table 2, the contributions of the socio-demographic variables were larger than those of privacy concern

3. Data analysis

All the effects of privacy concern and confidence were first examined with zero-order bivariate correlations. Then, ordinary least squares (OLS) regression was run to investigate the relationships between the privacy predictors and health-related use of technologies (RQ1). Following the privacy predictors, we entered demographics and all other variables to see whether the bivariate relationships would survive more robust controls. H1 moderation (contextual condition) was investigated via interaction terms between (a) a reliance on the Internet for health, (b) a medical condition, and each of the privacy predictors. H2 mediation (contextual process) was examined in two stages. The first stage consisted of two successive OLS regressions, with the first regression using the independent variables (concern and confidence) to predict the mediator (a person’s interest in exchanging personal data) and, then, the effects of the mediator estimated on the dependent variables. In the second stage, Sobel tests were performed to determine the mediating effects on the overall models, that is, to see whether the inclusion of the mediator in the models would produce a significant reduction or change in the effects of the independent variables.

Before moving on to the results, a few comments regarding the test of mediation are in order. The most conventional mediation test has been the approach of Baron and Kenny (1986). Their method, however, rigidly requires the independent, dependent, and mediator variables to all be correlated. Moreover, Baron and Kenny (1986) paid little attention to the size and significance of the indirect effects in each path. This study follows the approach suggested by Valkenburg and Peter (2007), as described above. Their suggestion is useful for the purpose of this study, because we start with the assumption of the privacy paradox that there may be no evidence that X (concern) and Y (behavior) are directly associated.
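The two successive regressions of the first stage can be sketched with plain OLS on simulated data. All variable names and effect sizes below are illustrative assumptions, not the study's estimates:

```python
import numpy as np

def ols(y, X):
    """OLS coefficients and standard errors; X must include an intercept column."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = len(y) - X.shape[1]
    sigma2 = resid @ resid / dof
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta, se

rng = np.random.default_rng(0)
n = 2392
concern = rng.normal(size=n)                     # X: privacy concern
interest = -0.3 * concern + rng.normal(size=n)   # mediator: interest in sharing
engage = 0.5 * interest + rng.normal(size=n)     # Y: health-related engagement

ones = np.ones(n)
# Stage 1: predictor -> mediator
beta1, se1 = ols(interest, np.column_stack([ones, concern]))
a, se_a = beta1[1], se1[1]
# Stage 2: mediator -> outcome, controlling for the predictor
beta2, se2 = ols(engage, np.column_stack([ones, concern, interest]))
b, se_b = beta2[2], se2[2]

indirect = a * b   # estimated indirect effect of concern on engagement
```

The Sobel test then operates on exactly these four quantities (a, se_a, b, se_b) to judge whether the indirect path is significant.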
In other words, this study is interested in parceling out each step of the proposed mediation, attending to all other variables in three models (i.e., from X1 to Y, X1 to X2, and X2 to Y). Conceptually, there is no difference between Andrew Hayes's approach and ours. While Hayes's model provides a parsimonious test of intervening effects when the main interest lies in the explanatory power of the variable, we take an 'old school' route. For the present study, a primary goal is detecting the

Fig. 2. Correlations between confidence, concern, and each activity.

Y.J. Park and D.D. Shin

Computers in Human Behavior 105 (2020) 106204

Table 2
Logistic regression on privacy concern and confidence (Yes/No).

                                Having App/     Mobile App      Text Message    Social Media    Email           No
                                Software        Exchange        Exchange        Exchange        Exchange        Exchange
Constant                        0.22 (0.49)**   0.01 (1.28)**   0.02 (1.23)**   0.02 (2.02)†    0.03 (0.84)***  13.01 (0.48)***
Demographic Background
  Income                        1.10 (0.03)**   1.05 (0.08)     1.12 (0.07)     0.77 (0.11)*    1.13 (0.12)*    0.86 (0.03)***
  Gender (Female)               1.29 (0.12)     0.95 (0.30)     0.86 (0.28)     0.74 (0.50)     0.97 (0.19)     1.07 (0.11)
  Race (Nonwhite)               1.24 (0.13)     0.84 (0.34)     1.54 (0.30)     1.14 (0.52)     1.13 (0.21)     0.71 (0.13)*
  Education                     1.16 (0.04)**   1.13 (0.12)     1.12 (0.12)     1.29 (0.20)     1.20 (0.08)*    0.83 (0.04)***
  Age                           0.97 (0.00)     0.99 (0.01)     0.99 (0.01)     1.02 (0.01)     0.99 (0.00)     1.00 (0.00)
Contextual Background
  Medical Condition             1.22 (0.05)***  1.22 (0.11)     1.12 (0.11)     1.00 (0.19)     1.20 (0.08)*    0.85 (0.04)**
  Internet Reliance for Health  1.16 (0.14)     0.77 (0.35)     0.65 (0.32)     0.59 (0.52)     1.00 (0.24)     0.94 (0.14)
Privacy Predictor Variables
  Confidence                    1.14 (0.09)     1.87 (0.24)**   1.67 (0.23)*    0.95 (0.37)     1.47 (0.15)     0.82 (0.09)*
  Concern                       1.13 (0.09)     1.16 (0.23)     1.00 (0.22)     1.07 (0.36)     1.00 (0.15)     1.12 (0.09)
Nagelkerke R²                   .082            .043            .050            .063            .069            .071

Note. Entries are odds ratios; entries in parentheses are standard errors. † p < .10; * p < .05; ** p < .01; *** p < .001.

Socio-demographic variables outweighed privacy concern and confidence, with income and education the most consistent predictors. Those with lower income were more likely to disengage from the use of technologies altogether ((b) = 0.86, p < .001). People with higher income tended to report higher levels of email exchange and of having a health-related app/software ((b) = 1.13, p < .05; 1.10, p < .01), while they were less likely to rely on social media exchange ((b) = 0.77, p < .05). In a similar vein, education played a critical role. People with higher education tended to have a health-related app/software ((b) = 1.16, p < .01) and were more likely to use email for health data exchange ((b) = 1.20, p < .05). As with income, those with lower education were also more likely to disengage from any of the technologies ((b) = 0.83, p < .001), suggesting far more robust effects of socio-demographics than of privacy concern and confidence.

H1 hypothesized the contextual condition in which (a) a medical condition and (b) reliance on the Internet for health moderate the behavioral effects of privacy concern and privacy confidence. Table 3 shows mixed support. We found a significant interaction in which a medical condition moderated privacy concern in predicting social media exchange ((b) = 0.72, p < .05, se = 0.13), although the main association for concern was not significant. This finding attests to the possibility that privacy concern, when coupled with no medical issue, may have led to disengagement from the use of technologies altogether. Similarly, a medical condition moderated privacy confidence in predicting mobile app exchange ((b) = 0.72, p < .05, se = 0.16), indicating that those with a medical condition were likely to disengage from the use of information technology when they lacked privacy confidence. In addition, a medical condition interacted with privacy confidence for no exchange, although the significance was marginal ((b) = 1.13, p < .10).
In terms of reliance on the Internet for health, none of the interactions was significant.

Table 4 reports the findings on the mediating role of a person's interest in sharing personal data with a health professional (H2). The upper part of Table 4 shows the results of the first analysis, which regressed the mediator on the IVs; the results of the second set of analyses, which regressed the DVs on the mediator, are displayed in the lower part of the table. The support for mediation was clear. First, privacy confidence and privacy concern both significantly predicted the interest in sharing, but in opposite directions ((b) = 0.12, p < .001, for confidence; (b) = -0.09, p < .01, for concern). Second, nearly all dimensions of health-related use of information technologies were predicted by the interest in sharing, controlling for the other variables. The only DV not directly related to the interest in sharing was social media exchange (see Table 5).

The Sobel tests confirmed the presence of mediation. The indirect effects were sizable and consistent, for instance, for text message exchange (z = 3.33, p < .001, confidence; z = 2.72, p < .01, concern). For no exchange, the indirect relationships were also significant and robust (z = 3.63, p < .001, confidence; z = 2.88, p < .01, concern). However, for having a health-related app/software, the relationships were only marginal, with the z-scores not reaching the critical value of 1.96. Thus, the mediator did not carry the influence of the independent variables to the dependent variable in this dimension. Consistent with the main regressions above, the Sobel test yielded no significant indirect effect for social media exchange.

5. Discussion

5.1. Theoretical implication

The aim of this study was to generate theoretical insights from the notion of contextual integrity and to use it as a framework for better

Table 3
Moderating effects.

Interaction Terms                                    Having App/   Mobile App    Text Message  Social Media  Email         No
                                                     Software      Exchange      Exchange      Exchange      Exchange      Exchange
Privacy Confidence × Medical Condition               1.02 (0.07)   0.72 (0.13)*  1.10 (0.12)   0.79 (0.17)   0.87 (0.77)   1.13 (0.73)†
Privacy Confidence × Internet Reliance for Health    1.38 (0.22)   0.66 (0.45)   0.80 (0.39)   0.97 (0.49)   0.99 (0.23)   0.86 (0.21)
Privacy Concern × Medical Condition                  1.01 (0.06)   0.83 (0.12)   1.01 (0.11)   0.72 (0.16)*  0.97 (0.07)   1.04 (0.06)
Privacy Concern × Internet Reliance for Health       1.21 (0.22)   0.57 (0.41)   0.93 (0.37)   0.52 (0.46)   0.85 (0.22)   1.12 (0.20)
Nagelkerke R²                                        .082          .055          .040          .070          .080          .075

Note. Entries in parentheses are standard errors. † p < .10; * p < .05.


The results of this study show that the phenomenon widely known as the 'privacy paradox' was very much present in the context of health-related use of information technologies, as shown by the tenuous behavioral effects of both concern and confidence in our regression analysis. These patterns of non-significant associations (notwithstanding privacy confidence being significant in spurring engagement via text and mobile app) provide theoretical guidance for re-conceptualizing the effects of privacy attitudes as contextually sensitive rather than as purely direct consequences. Our findings regarding mediation help sharpen the contour of this line of thinking by specifying the contextual process in which the behavioral effects of privacy concern and confidence are mediated through a person's interest in sharing personal data. That is to say, active engagement with health-information technologies seems to be not only a manifestation of one's concern (or confidence) alone, but also an illustration of the social norms and contexts in which one can feel able to develop the interest in sharing and open up personal boundaries. This is consistent with the notion of contextual integrity, in this case extending its applicability to health data flows. We stress that the contextual integrity of privacy may well be reflected through people's interest, as well as their perceptions and evaluations, regarding the appropriateness of sharing personal details. Being able to open personal boundaries is essential to human wellbeing in that this state of interest helps us connect to others (Acquisti et al., 2015; Jiang & Hancock, 2013). What we argue is that privacy attitudes (concern and confidence alike) are likely to configure the way the interest in sharing personal data develops, which eventually determines how people come to connect with the rapidly digitalizing health system. Put in theoretical context, the effects of privacy concern and confidence are not necessarily direct but indirect, and it is fruitful to understand privacy in light of contextual integrity, which helps us dissect the subtle contextual process and identify some key pieces of the puzzle.

Table 4
Mediating effects.

First Set of Regression Analysis (OLS)
DV: Interest in Sharing                    B (SE)           β
  IV: Privacy Confidence                   1.43 (0.32)      0.12 ***
  IV: Privacy Concern                      -1.01 (0.31)     -0.09 **
  Constant                                 22.82 (1.66)

Second Set of Regression Analysis (Logistic)
                                           B (SE)           exp(b)
DV: No Exchange
  IV: Privacy Confidence                   -0.11 (0.09)     0.89
  IV: Privacy Concern                      0.07 (0.09)      1.08
  MV: Interest in Sharing                  -0.05 (0.00)     0.94 ***
  Constant                                 4.06 (0.54)
DV: Email Exchange
  IV: Privacy Confidence                   0.07 (0.10)      1.07
  IV: Privacy Concern                      -0.16 (0.10)     0.85
  MV: Interest in Sharing                  0.07 (0.00)      1.07 ***
  Constant                                 4.76 (0.58)
DV: Social Media Exchange
  IV: Privacy Confidence                   0.03 (0.24)      1.03
  IV: Privacy Concern                      0.06 (0.23)      1.06
  MV: Interest in Sharing                  0.02 (0.02)      1.02
  Constant                                 3.53 (1.34)
DV: Text Message Exchange
  IV: Privacy Confidence                   0.40 (0.16)      1.49 *
  IV: Privacy Concern                      -0.06 (0.16)     0.93
  MV: Interest in Sharing                  0.05 (0.01)      1.05 **
  Constant                                 5.18 (0.96)
DV: Mobile App Exchange
  IV: Privacy Confidence                   0.56 (0.20)      1.75 **
  IV: Privacy Concern                      0.27 (0.19)      1.32
  MV: Interest in Sharing                  0.08 (0.02)      1.08 ***
  Constant                                 6.19 (1.17)
DV: Having Health-Related App/Software
  IV: Privacy Confidence                   0.10 (0.09)      1.11
  IV: Privacy Concern                      0.19 (0.09)      1.20 *
  MV: Interest in Sharing                  0.04 (0.00)      1.04 ***
  Constant                                 2.58 (0.54)

Note. Entries in parentheses are standard errors. * p < .05; ** p < .01; *** p < .001.

5.2. Moderation finding

The results of the moderation analysis offer further support for the contextual-integrity explanation, though the findings were discrete rather than uniform. Foremost, privacy confidence, when specified with the contextual condition, gained predictive power for mobile app exchange: people with lower confidence remained disengaged from its use despite having a chronic medical condition. On the other hand, a medical condition also moderated the effect of privacy concern in predicting social media use, as those who had a medical issue did not necessarily engage with social media when they had privacy concern. In this context, the finding that the direct effect of a chronic medical condition on the use of technologies remained minuscule is telling. In other words, the medical condition alone did not lead to active engagement with health-information technologies, while its impact was discretely moderated by privacy concern and confidence. The important lesson is the persistent weight of privacy attitudes in influencing people's engagement in a highly selective and situated fashion, despite their tenuous direct effects. Extending this perspective, it is significant to see how the effects of 'objective' contextual conditions, such as having a chronic medical issue, can be qualified by the 'subjective' state of positive (confidence) and negative (concern) appraisals of privacy. This is not to say that such health-related facts do not matter when it comes to technology use. Rather, it appears that people are less inclined to engage with health providers via technologies so long as their privacy perceptions concerning those technologies do not fit.

Table 5
Indirect paths tested using Sobel's method (z-scores).

                                     Confidence → Interest    Concern → Interest
                                     in Sharing → DV          in Sharing → DV
No Exchange                          3.63 ***                 2.88 **
Email Exchange                       3.87 ***                 3.00 ***
Social Media Exchange                0.97                     0.95
Text Message Exchange                3.33 ***                 2.72 **
Mobile App Exchange                  2.98 **                  2.52 **
Having Health-Related App/Software   1.90 †                   1.77 †

Note. † p < .10; ** p < .01; *** p < .001.
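For reference, these z-scores follow Sobel's first-order formula for an indirect effect, where a is the coefficient from the independent variable to the mediator, b is the coefficient from the mediator to the dependent variable, and s_a and s_b are their standard errors:

```latex
z = \frac{ab}{\sqrt{b^{2} s_{a}^{2} + a^{2} s_{b}^{2}}}
```

A value of |z| greater than 1.96 corresponds to p < .05 (two-tailed), the critical value referenced in the text.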

understanding the shaping of a digital-based health system from the perspective of individual users. This aim was justified by the theoretical need to conceptualize the paradoxical non-behavioral effects of privacy attitudes (concern and confidence) in the context of health privacy. We are not arguing that the conceptual premise of contextual integrity has sole explanatory power, but rather that the broader context surrounding the flow of personal data amid the rapid digitalization of health services calls for additional theoretical explanation. With the publicly accessible measures of the HINTS data, this study was able to generate a set of moderators and a mediator that can be grouped into two contextual categories: process and condition. That is to say, privacy, when its contexts are specified, helps us better understand how the shaping of personal information flow in an algorithm-based health system will be a critical factor in inducing (or curtailing) people's engagement, with significant behavioral consequences.

5.3. Mediation finding

The result of the marginally significant mediation for having a health-related app or software is perplexing. Foremost, the Sobel test revealed tenuous yet significant mediations of privacy concern and


confidence. But when the mediation was broken down into each step, privacy concern in the second regression turned positive, indicating that concerned people were more likely to rely on a health-related app or software. While this finding supports prior studies that found privacy concern inversely related to protective behavior, as well as the general theme of the privacy paradox (Hargittai & Marwick, 2016), it adds another complexity: the addition of the mediator helped privacy concern stand out as a predictor of having an algorithmic, machine-based app or software. The most plausible interpretation is that access to automated apps and software coexists with higher privacy concern because such tools give the false impression that an app or software does not involve personal surveillance via health providers. We speculate this given public misunderstanding of data algorithms in the broader digital ecosystem, although we did not have a direct measure of people's awareness in this particular regard.

Similarly, our current work remains limited in that we could not gain more precise insights into the behavioral effects of privacy concern and confidence. Thus, future studies should ask how the rationality of privacy remains contextual, bounded by one's interest as well as socio-demographic background. In that case, privacy attitude would serve not as the predictor but as a mediator that relays the effects of socio-demographics to the development of one's interest, that is, two mediators intervening in the influence of socio-demographics. A longitudinal panel dataset would shed further light by enabling us to dissect (a) the temporal priorities among this study's constructs, especially concerning the use of technologies and privacy attitudes, and (b) the question of causality, which is not addressed in this work. The 2014 HINTS data set will also need updates, though our measures of privacy concern, confidence, and interest would not be critically different from what we would obtain in recent years. In particular, we see the need to include better measures of constantly evolving technologies, because we could not include measures related to algorithm-based consumer devices, such as Google Home and Amazon Alexa, or health-related wearables. The inclusion of those items will help us better capture the extent of health-related digital engagement. Relatedly, we were puzzled by the lack of substantial relationships for social media use, but this may be explained by its strikingly low frequency. The mean of social media use was 0.03, with very low variation (SD = 0.19), suggesting that it was extremely rare for people to use social media for health-related purposes. The lack of significant interactions with reliance on the Internet also indicates that its explanatory power has perhaps begun to disappear because of the prevalence of online use (with 76% of respondents indicating reliance on the Internet for health purposes).
As noted earlier, other digital consumption measures, which will help researchers discern the roles of emerging algorithm-based devices, should follow in future studies. Surely, the political, cultural, or interpersonal contexts surrounding different populations, such as teenagers, the elderly, or minorities, might invoke different conceptual underpinnings, helping us detect forms of the privacy paradox that are not parallel across contexts (Bergström, 2015; Park, 2018). To illustrate, the contextual boundaries of privacy can differ even within the single domain of health, when we dissect different health conditions as well as different digital platforms, apps, and software. Thus, it would be fruitful to account for such contextual complexities in future research by asking how the behavioral effects of privacy attitudes occur, not in isolation, but in connection with other digital domains.

5.4. Finding related to socio-demographics

It is interesting that the multivariate model, accounting for socio-demographics, showed that those with higher education and income, as well as younger people, were more likely to turn to mobile-based algorithmic apps and software for health-related issues. The irony is that access to mobile-based software and apps, while conveniently helping people bypass face-to-face interaction with health providers, can have unintended consequences. The puzzle of privacy concern set aside for the moment, it appears, at least based on this study's findings, that those with higher socioeconomic backgrounds are not necessarily immune from the privacy vulnerabilities associated with reliance on algorithm-based health-related apps and software.

6. Future studies and implications

In the end, the fundamental insight is that privacy is shaped, not in a contextual vacuum, but at the intersection of differentiated contextual processes and conditions, as they shape how the digitalization of a health system eventually comes to be used and realized. This is to make the case that it is impossible to divorce privacy, its concern and confidence, from the success of the future medical system in its algorithmic transition and from the societal context in which privacy, its values, and its norms are evaluated and practiced. The fact that privacy concern and confidence have no direct impact does not mean that privacy does not matter or has no behavioral consequence, as skeptics might argue. Instead, we suggest it should be taken as a lesson that privacy is in fact 'contextually bounded': privacy concern and confidence become tangible and bring meaningful consequences only when highly contextualized. In the end, the effects of privacy remain deeply embedded in specific contextual processes and conditions, without which it is hard to detect precisely how people's concern, emerging norms, and values about personal data come to take shape.
We comment here on several substantive issues worth further research. First, what this study strived to achieve is an understanding of the contextual specificities that bound the rationality of behavioral responses to concern and confidence, as those behaviors are qualified as resulting from social norms or emerging perceptions, notably the contextual interest in sharing. We are concerned that socio-demographic gaps persisted in the dimension of disengagement (no exchange). This study has not explicitly investigated socio-demographics under the notion of contextual integrity, but this will be a particularly fruitful inquiry, for instance, into how and to what extent different communities and those with low income remain disengaged from the digital health system because of privacy concern (Park & Chung, 2017; Tifferet, 2018). Future studies should also examine how socio-demographics incubate the perceived context in which individuals evaluate the appropriateness of privacy differently, potentially producing the interest (or disinterest) in data sharing that influences the extent of digital engagement.

Appendix A. Supplementary data

Supplementary data to this article can be found online at https://doi.org/10.1016/j.chb.2019.106204.

References

Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and human behavior in the age of information. Science, 347(6221), 509–514.
Acquisti, A., & Grossklags, J. (2005). Privacy and rationality in individual decision making. IEEE Security and Privacy, 3(1), 26–33.
Barocas, S., & Nissenbaum, H. (2014). Big data's end run around procedural privacy protections. Communications of the ACM, 57(11), 31–33.
Baron, R. M., & Kenny, D. A. (1986). The moderator–mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51(6), 1173.
Bergström, A. (2015). Online privacy concerns: A broad approach to understanding the concerns of different groups for different uses. Computers in Human Behavior, 53, 419–426.
Chen, W., Quan-Haase, A., & Park, Y. J. (2018). Privacy and data management: The user and producer perspectives. American Behavioral Scientist, 62(10), 1316–1318.
Cotten, S. R., & Gupta, S. S. (2004). Characteristics of online and offline health information seekers and factors that discriminate between them. Social Science & Medicine, 59(9), 1795–1806.
Dienlin, T., & Trepte, S. (2015). Is the privacy paradox a relic of the past? An in-depth analysis of privacy attitudes and privacy behaviors. European Journal of Social Psychology, 45(3), 285–297.
Gandy, O. H., & Nemorin, S. (2018). Toward a political economy of nudge: Smart city variations. Information, Communication & Society.
Hargittai, E., & Marwick, A. (2016). "What can I really do?" Explaining the privacy paradox with online apathy. International Journal of Communication, 10, 21.
Jensen, C., & Potts, C. (2004). Privacy policies as decision-making tools: An evaluation of online privacy notices. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 471–478). ACM.
Jiang, C. L., & Hancock, J. T. (2013). Absence makes the communication grow fonder: Geographic separation, interpersonal media, and intimacy in dating relationships. Journal of Communication, 63(3), 556–577.
Katz, J. E., & Tassone, A. R. (1990). A report: Public opinion trends: Privacy and information technology. Public Opinion Quarterly, 54(1), 125–143.
Libert, T. (2015). Privacy implications of health information seeking on the web. Communications of the ACM, 58(3), 68–77.
Lin, W. Y., Zhang, X., Song, H., & Omori, K. (2016). Health information seeking in the Web 2.0 age: Trust in social media, uncertainty reduction, and self-disclosure. Computers in Human Behavior, 56, 289–294.
Lupton, D. (2019). The thing-power of the human-app health assemblage: Thinking with vital materialism. Social Theory & Health, 17(2), 125–139.
Marwick, A. E., & boyd, D. (2014). Networked privacy: How teenagers negotiate context in social media. New Media & Society, 16(7), 1051–1067.
McDonald, A. M., & Cranor, L. F. (2008). The cost of reading privacy policies. ISJLP, 4, 543.
McMullan, M. (2006). Patients using the Internet to obtain health information: How this affects the patient–health professional relationship. Patient Education and Counseling, 63(1–2), 24–28.
Nissenbaum, H. (2011). A contextual approach to privacy online. Dædalus, 140(4), 32–48.
Olivero, N., & Lunt, P. (2004). Privacy versus willingness to disclose in e-commerce exchanges: The effect of risk awareness on the relative role of trust and control. Journal of Economic Psychology, 25(2), 243–262.
Park, Y. J. (2018). Social antecedents and consequences of political privacy. New Media and Society, 20(7), 2352–2369.
Park, Y. J., & Chung, J. E. (2017). Health privacy as sociotechnical capital. Computers in Human Behavior, 76, 227–236.
Park, Y. J., Chung, J. E., & Shin, D. H. (2018). The structuration of digital ecosystem, privacy, and big data intelligence. American Behavioral Scientist, 62(10), 1319–1337.
Rice, R. E. (2006). Influences, usage, and outcomes of internet health information searching: Multivariate results from the pew surveys. International Journal of Medical Informatics, 75(1), 8–28.
Tan, S. S. L., & Goonawardene, N. (2017). Internet health information seeking and the patient–physician relationship: A systematic review. Journal of Medical Internet Research, 19(1).
Tifferet, S. (2018). Gender differences in privacy tendencies on social network sites: A meta-analysis. Computers in Human Behavior, 93, 1–12.
Trepte, S., Teutsch, D., Masur, P. K., Eicher, C., Fischer, M., Hennhöfer, A., et al. (2014). Do people know about privacy and data protection strategies? Towards the "online privacy literacy scale" (OPLIS). Reforming European Data Protection Law, 20, 333.
Valkenburg, P. M., & Peter, J. (2007). Online communication and adolescent well-being: Testing the stimulation versus the displacement hypothesis. Journal of Computer-Mediated Communication, 12(4), 1169–1182.
Westin, A. F. (2003). Social and political dimensions of privacy. Journal of Social Issues, 59(2), 431–453.