An analytical framework for online privacy research: What is missing?



Accepted Manuscript

Title: An analytical framework for online privacy research: What is missing?
Authors: Avshalom Ginosar, Yaron Ariel
PII: S0378-7206(17)30091-5
DOI: http://dx.doi.org/10.1016/j.im.2017.02.004
Reference: INFMAN 2979
To appear in: Information and Management
Received date: 13-6-2016
Revised date: 21-1-2017
Accepted date: 2-2-2017

Please cite this article as: Avshalom Ginosar, Yaron Ariel, An analytical framework for online privacy research: What is missing?, Information and Management, http://dx.doi.org/10.1016/j.im.2017.02.004

TITLE PAGE

An analytical framework for online privacy research: What is missing?

Authors: Avshalom Ginosar & Yaron Ariel Affiliation: Department of Communication, The Academic College of Yezreel Valley, Israel

Abstract

The study of online privacy addresses three separate domains: user privacy concerns and behavior, website privacy notices and practices, and state privacy policies and regulations. This study suggests an analytical framework that combines these domains. While most of the framework's variables are addressed in the current literature, one is missing: the views of website managers. Our survey reveals that managers share users' concerns but report that their own websites do not violate users' privacy, even though these websites ask users to provide personal information. The younger the website managers were, the less concerned they were, but the more they acted to safeguard user privacy.

Keywords: online privacy; privacy policy; personal information; privacy concerns; privacy behavior


Highlights

- An analytical framework is suggested for interdisciplinary online privacy research
- Website managers' views and knowledge are a neglected topic in privacy research
- Website managers indicate that their own websites do not violate users' privacy
- The younger the website manager, the less he/she is concerned about user privacy
- The younger the website manager, the more he/she acts to safeguard user privacy



1. Introduction

Privacy has become a major issue in research on the interface between Internet technologies and society (Bennett & Parsons, 2013; Smith, Dinev, & Xu, 2011). When mapping the privacy research literature, Greenaway & Chan (2005) identified three levels of analysis: the individual, the sectoral/national, and the organizational. The individual level of research mainly addresses Internet users and focuses on their concerns regarding online privacy as well as their behavior in response to these concerns (e.g. Awad & Krishnan, 2006; Earp & Baumer, 2003; Luo, 2002; Paine, Reips, Stieger, Joinson, & Buchanan, 2007). The sectoral/national level focuses mainly on policy-makers and regulators as protectors of users' privacy, and on the benefits and pitfalls of state regulation vs. industry self-regulation (e.g. Ang, 2001; Hirsch, 2011; Rasmus & Stine, 2013; Strauss & Rogerson, 2002). The third, organizational, level addresses the individual organization (a firm or a website). At this level, two different points of view can be found. The first is the study of website privacy notices, their legal implications, and their effect on users' online behavior. Most of these studies are conducted by law or communications scholars (Birnhack & Elkin-Koren, 2011; Earp, Antón, Aiman-Smith, & Stufflebeam, 2005; Fernback & Papacharissi, 2007; Jensen & Potts, 2004; Pan & Zinkhan, 2006; Tsai, Egelman, Cranor, & Acquisti, 2011). The second point of view concerns firm–customer relationships, fair procedures for handling customers' personal information, and the benefits of such procedures for business (Culnan & Armstrong, 1999; Gerlach, Widjaja & Buxmann, 2015; Mollick & Mykytyn, 2009; Schwaig, Kane & Storey, 2006). These studies are mostly conducted by scholars of organizational behavior, management, or information studies.

The most salient topic at all three levels of research is the online disclosure of personal information and its implications for users' behavior on the one hand, and for website policies and practices on the other (Bansal, Zahedi, & Gefen, 2016; Joinson, Reips, Buchanan, & Paine Schofield, 2010; Taddei & Contena, 2013). However, scholars of different disciplines usually address this topic from only one analytical level (according to the classification of Greenaway & Chan, 2005, mentioned above). Therefore, the first aim of this work is to propose a unified analytical framework for studying the issue. The framework consists of three research domains: the users, the state, and the websites. It points to variables related to each of these domains and to the presumed connections between them.

However, one of the most relevant components within the website domain is hardly addressed in the privacy research literature. Website owners and managers, as individuals, have not been consulted regarding their concerns, views, ideas, and efforts (if any) concerning online privacy, its related risks, and the implications for their users and websites alike. Therefore, a survey which directly addressed website owners and managers was conducted in order to fill this gap in the current privacy literature. Presenting the empirical findings of this survey is the second aim of this work. It should be noted that although the suggested analytical framework includes all three research domains mentioned above, we decided to empirically investigate only the websites domain, and more specifically the websites' owners and managers, who are the missing link in the current privacy literature.
Our decision was practical, and we hope that future studies will employ our suggested framework to investigate online privacy with reference to all three research domains.

The next section reviews the privacy literature with respect to the three domains upon which the proposed framework is built. The framework and its various components and connections are then presented, after which we introduce the results of the empirical study and present the views and ideas of website owners/managers about online privacy. Finally, we discuss some theoretical and practical implications and draw our conclusions.

2. Online privacy: Three domains of research

Privacy has often been associated with the ability of an individual to control the terms under which personal information is acquired and used (Westin, 1967). This definition, originally formulated for the offline world, refers to what can be called "informational privacy," which is only one type of privacy (alongside physical privacy, psychological privacy, etc.). Yet Westin's definition has been adopted in various online environments as well, such as online commercial transactions (Culnan & Armstrong, 1999; Greenaway & Chan, 2005; Schwaig, Kane & Storey, 2006), surveillance issues (Lyon, 2001; Oulasvirta et al., 2014), social networks (Gerlach, Widjaja & Buxmann, 2015; Xu, Michael & Chan, 2013), and, more generally, discussions of any online information system (Hsu & Kuo, 2003). Bennett and Parsons (2013) argue that although the very essence of internet protocols (regardless of the specific online environment) has disrupted traditional conceptions of personal information control, "basic information privacy principles have become widely accepted, creating a common consensus about how responsible organizations should collect and process personally identifiable information (PII)" (p. 502). Among these principles are: only collect personal information for defined and relevant purposes; only use and disclose information in ways that are consistent with those purposes; grant access and correction rights to individuals; and, finally, keep the data secure. Two main observations should be made. First, these principles are addressed in one way or another in various types of online activities and websites (commercial, social, governmental, etc.). Second, in most cases three groups of stakeholders are relevant: the users, the state, and the websites themselves. Each of these groups responds to the possible risks of online privacy. In the next sections, the three groups of stakeholders are discussed.

2.1 The users: Privacy paradox and privacy calculus

Internet users are preoccupied with three main privacy concerns while browsing the Net. The first is about the collection of their personal information without their informed consent; the second is about the extent to which their information is shared with third parties; and the third is that their personal information may be used for unauthorized, secondary purposes (Mollick & Mykytyn, 2009).
Different users behave differently in response to these concerns, in terms of the extent to which they are willing to expose their personal information (Bansal, Zahedi, & Gefen, 2016; Fogel & Nehmad, 2009; Riquelme & Roman, 2014). Traditionally, as Westin (1967) argued, one half of the population is pragmatic and willing to trade off privacy for certain benefits gained by disclosing personal information; the other half is divided between the unconcerned and the fundamentalists. In online environments, the proportions of these three groups have changed, as Westin (2003) and Sheehan (2002) indicated, but the main division is still relevant. However, it seems that users' behavior regarding the willingness to disclose personal information is much more complex than this classification suggests. Knijnenburg, Kobsa, & Jin (2013), for example, argue that these distinct groups do not necessarily differ in their overall degree of disclosure; rather, they differ in their disclosure tendencies for each kind of personal information. This means that users can be classified according to different "disclosure profiles."


Regardless of the type of user classification, many studies point to a gap between users' concerns regarding the risks of privacy violation on the one hand, and their willingness to disclose personal information on the other. In the research literature this gap is called "the privacy paradox" (Barnes, 2006; Norberg et al., 2007) and is explained mainly by three variables: (a) the different kinds of benefits – material as well as social – that users can gain by disclosing personal information (Gross & Acquisti, 2005; Kachhi & Link, 2009; Sayre & Horne, 2000; Taddicken, 2014; Xu et al., 2011); (b) the different levels of trust that users have, either in specific websites or in specific activities on the Internet (Barney & Hansen, 1994; Earp et al., 2005; Joinson et al., 2010; Mollick & Mykytyn, 2009; Norberg et al., 2007; Tsai et al., 2011); and (c) the different levels of knowledge about, and awareness of, the Internet's specific technological characteristics as well as their risks. This knowledge and awareness can be considered online literacy, and a user's level of online literacy affects his/her online privacy literacy as well. Livingstone, Bober, and Helsper (2005), who investigated children and youth, demonstrated that the more time users had spent online, the more skilled they became in using the internet, and Livingstone (2011) showed that children who were more digitally literate were less likely to be harmed online. In the same vein, Bartsch and Dienlin (2016), who studied Facebook users, found that people who spend more time on Facebook and who change their privacy settings more frequently (meaning they are more experienced) reported having better online privacy literacy, which enabled them to behave more cautiously regarding their privacy. Trepte et al.
(2015) pointed to five dimensions of online privacy literacy, including knowledge about the practices of organizations and online service providers; knowledge about technical aspects of online privacy and data protection; knowledge about laws and legal aspects of online data protection (at both the national and supranational levels); and knowledge about user strategies for individual privacy regulation. Park (2013) adds the dimension of understanding privacy policy. Based on all these, one can argue that the level of privacy literacy affects the level of user trust in online activities (Acquisti & Grossklags, 2005; Blank et al., 2014; Paine et al., 2007; Rader, 2014). The issue of online trust is discussed in more detail in section 2.2, which deals with websites' privacy policies.

Using the three variables discussed above and the relationships between them, the "privacy calculus" approach attempts to explain – and even predict – users' intentions to disclose personal information. Privacy calculus is based on the comparison of expected benefits (variable a) and perceived risks (variables b and c) in a given context (Culnan & Armstrong, 1999; Laufer & Wolfe, 1977; Smith, Dinev & Xu, 2011). Li (2012) suggests adopting a "dual-calculus model," which consists of a "privacy calculus" (the trade-off between expected benefits and privacy risks) and a "risk calculus" (the trade-off between privacy risks and the efficacy of coping mechanisms). Li argues that the dual-calculus model better predicts an individual's intention to disclose personal information online. However, this suggestion refers not only to a user's perceptions of benefits and risks, but also to the existence of "coping mechanisms" related to the risks. The existence of such mechanisms depends on each website's privacy policy and behavior. This is the main topic of the next section.

2.2 The websites: Privacy as a managerial tool

The user-centered perspective discussed above focuses on different factors which might affect users' online behavior, in particular the disclosure of their personal information. This behavior in turn affects website success, and therefore the question is: do websites attempt to influence users' behavior, and in what ways? In other words, how do websites shape their privacy policies in order to make it more appealing for users to disclose their personal information in spite of their concerns? Hsu and Kuo (2003), for example, argue that subjective ethical norms and organization-based self-esteem are the main factors that influence information systems professionals in shaping policies that protect users' privacy. However, this assertion seems naïve. If we regard websites – commercial, social, or governmental – as organizations aiming at success, we should consider two other, more convincing, approaches. The first is institutional theory (IT), according to which most organizations create privacy policies as responses to external pressures and pursue legitimacy through these policies. The second is the resource-based view (RBV), according to which users' information is considered by the organization an important resource for achieving a competitive advantage (Greenaway & Chan, 2005). In other words, a website's privacy policy should be considered a managerial concept, or an integral part of the website's business model (Barney & Hansen, 1994; Gerlach, Widjaja & Buxmann, 2015).
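Li's (2012) dual-calculus idea reviewed in the previous section can be made concrete with a minimal decision sketch. The function names, numeric inputs, and the simple subtractive form below are our own illustrative assumptions, not part of any cited model:

```python
# Illustrative sketch of the "dual-calculus" idea: a user is predicted to
# disclose personal information only if BOTH trade-offs come out positive.
# All weights and inputs are hypothetical, chosen for illustration only.

def privacy_calculus(expected_benefit: float, perceived_risk: float) -> float:
    """Trade-off between expected benefits and privacy risks."""
    return expected_benefit - perceived_risk

def risk_calculus(perceived_risk: float, coping_efficacy: float) -> float:
    """Trade-off between privacy risks and the efficacy of coping mechanisms."""
    return coping_efficacy - perceived_risk

def intends_to_disclose(benefit: float, risk: float, coping: float) -> bool:
    # Disclosure is predicted only when benefits outweigh risks AND the user
    # believes available coping mechanisms can handle the residual risk.
    return privacy_calculus(benefit, risk) > 0 and risk_calculus(risk, coping) > 0

# A user who sees high benefit and effective coping mechanisms discloses:
print(intends_to_disclose(benefit=0.8, risk=0.4, coping=0.6))  # True
# The same benefit with no effective coping mechanism does not:
print(intends_to_disclose(benefit=0.8, risk=0.4, coping=0.2))  # False
```

The point of the two-gate structure is that a favorable benefit–risk trade-off alone is not enough under the dual-calculus view; the risk-coping trade-off must also pass.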
Both approaches share the view that adopting user-friendly privacy policies serves the organization by yielding a business advantage: legitimacy according to the IT approach, or competitive advantage according to the RBV approach. The goal of such policies is to establish trusting relationships between the organization (the website) and its customers (the website users). This is a kind of "social contract" based on the balance between the users/customers' need for privacy and the firm/website's need to collect and use personal information. The balance between these two contradictory needs can be achieved if the organization adopts "fair information practices" (FIPs) as the main component of its privacy policy. Five such practices are commonly addressed, based on the OECD guidelines and the US Federal Trade Commission (FTC) regulations (Milne & Culnan, 2002): (a) Notice – users/consumers have a right to know if personal information is being collected and how it will be used; (b) Choice – users/consumers have a choice about whether information collected for one purpose may be used for other purposes, and about whether information will be shared with third parties; (c) Access – users/consumers have a right to access their own information and to correct errors; (d) Security – websites/firms must protect personal information from unauthorized access during transmission and storage; and (e) Enforcement – procedures are needed to ensure that organizations comply with their policies and that consumers' complaints are addressed (Culnan & Armstrong, 1999; Culnan & Bies, 2003; Greenaway & Chan, 2005; Mollick & Mykytyn, 2009).

Empirical research conducted by Schwaig, Kane and Storey (2006) demonstrated that most of the 500 biggest firms in the US (the "Fortune 500") complied with the Notice component of the FIPs, but many of the firms on the list failed to address the other components. Another survey, conducted by the FTC since 1998, examined whether organizations post their privacy policies online and whether those policies include FIP components. Milne & Culnan (2002) compared the survey data between 1998 and 2001 and found that each year more organizations (compared to the previous year) disclosed their privacy policies online (1998 – 2%; 1999 – 14%; 2000 – 20%; 2001 – 55%). Furthermore, in 2001 more than half of the websites had implemented, at least in part, three components of the FIPs (Notice, Choice, and Security).

An important and interesting question in this regard is whether the publication and implementation of privacy policies based on the FIPs affect user behavior by convincing users to disclose personal information. However, a pre-condition for a privacy policy to have an effect on users is that users read it. Many studies have demonstrated that only about half of users bother to do so (e.g. Earp & Baumer, 2003; Gerlach, Widjaja & Buxmann, 2015; Milne & Culnan, 2002; Steinfeld, 2016).
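The five FIP components and the pattern of partial compliance reported in the surveys above can be expressed as a simple checklist score. The scoring scheme and the names below are our own illustrative sketch, not an instrument from the cited studies:

```python
# Hypothetical FIP compliance score. The five components follow the
# OECD/FTC-based list in the text; the scoring itself is illustrative.
FIP_COMPONENTS = ("notice", "choice", "access", "security", "enforcement")

def fip_score(policy: dict) -> float:
    """Fraction of the five fair information practices a policy implements."""
    return sum(bool(policy.get(c)) for c in FIP_COMPONENTS) / len(FIP_COMPONENTS)

# A pattern like the one Milne & Culnan (2002) report for 2001: many sites
# implemented Notice, Choice, and Security, but not Access or Enforcement.
partial_policy = {"notice": True, "choice": True, "security": True}
print(fip_score(partial_policy))  # 0.6
```

A scalar score of this kind could, for instance, let a researcher compare compliance levels across a sample of websites, though the studies cited here report component-by-component results rather than a single index.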
Yet, these and other studies show that the length and language of a privacy policy notice, and whether reading it is mandatory or optional, are among the variables that affect users' willingness to read such notices. Furthermore, some studies demonstrate that when users do read privacy notices, the notices have an effect on user behavior. Here are three examples from three different online domains. The first is the study conducted by Gerlach, Widjaja & Buxmann (2015), which reveals a connection between the content of the privacy policies of online social networks (OSNs) and the disclosure of personal information by users. The authors show that the level of permissiveness of a privacy policy affects user perceptions of privacy risks (a mediating variable), which in turn have an impact on user behavior. This finding aligns with the second example, another recent study showing that transparency regarding the identity, intentions, and practices of the data collector in different surveillance scenarios decreases the privacy concerns of users (Oulasvirta et al., 2014). The third example, from the online consumer domain, demonstrates a causal relationship between three privacy policy variables (the users' informed consent to collecting data, limiting data sharing within the organization, and limiting the secondary use of data) and customers' perceptions of organizational fairness (Mollick & Mykytyn, 2009).

The three studies above share one common understanding: transparent and fair privacy policies and practices affect users' intentions to disclose personal information. Yet many other studies point to trust as an intervening variable between website policy and practices on the one hand and users' behavior (their willingness to disclose personal information) on the other. Joinson et al. (2010) argue that a high level of trust compensates for low privacy, and vice versa. Online trust, as Beldad, de Jong, & Steehouder (2010) demonstrate, can be influenced by three main factors: (a) users' experience with the relevant technology and users' tendency to trust; (b) the quality of a specific website in terms of security assurances; and (c) the reputation of a specific online organization. However, studies of online trust reveal diverse factors that affect users' trust levels in different ways. Some point to users' personality traits (e.g. Fogel & Nehmad, 2009; Riquelme & Roman, 2014); some to users' perceived control over their personal information (Taddei & Contena, 2013); and some to users' online experience and their general attitudes towards technology (Dutton & Shepherd, 2006; Blank & Dutton, 2012). Other studies point to factors such as different online contexts (e.g. Bansal, Zahedi, & Gefen, 2016; Bergström, 2015) and the form and content of websites' privacy statements and/or assurances (e.g. Earp et al., 2005; Joinson et al., 2010; Luo, 2002; Pan & Zinkhan, 2006; Tsai et al., 2011; Wu, Huang, Yen, & Popova, 2012).
To sum up, from the organizational point of view, the main variables that affect users' willingness to disclose personal information are (a) the type and content of the privacy notice, (b) the practices a website uses in handling users' personal information, and (c) the benefits that an online firm (through its website) can provide to users in exchange for their personal information. These three variables and their effects on users' behavior have been studied for several years in various online environments. However, another variable that affects all three is missing from the current privacy literature: the knowledge, views, and beliefs of website owners and managers regarding online privacy. Owners and managers are the people who make the decisions regarding specific privacy policies and the use of certain practices. It is fair to assume that their views and beliefs are partly shaped by legal requirements and direct regulation. However, it is also reasonable to argue that website owners' and managers' personality traits, capabilities, and experience affect their decisions as well. Therefore, in order to fill this gap in the current privacy literature, we conducted the present study, in which we directly approached website owners and managers (see section 4).


2.3 The state: Privacy policy challenges

As shown in the last two sections, websites as well as users face problems of online privacy risks and react to these risks in different ways. However, as Bennett & Parsons (2013) demonstrate, the issue has also been framed as a policy problem that is addressed through both national and international privacy regimes. It is worth noting the ongoing debate in the privacy policy literature about the benefits and pitfalls of state policy and regulation on the one hand, and industry self-regulation on the other, regarding which is the better regulatory mechanism for reducing privacy risks and meeting users' concerns (Kobsa, 2007). The proponents of state regulation maintain that strong government regulation is necessary to protect unsuspecting users from the self-interested behavior of online firms. On the other hand, supporters of industry self-regulation argue that internet businesses have a market incentive to protect user privacy in order to avoid losing customers (Hirsch, 2010; Strauss & Rogerson, 2002). Hirsch suggests an alternative approach: a co-regulatory regime, in which the government and the internet industry share responsibility for users' privacy.

The state's basic policy instrument is, of course, legislation. Privacy law is first of all about the right to be let alone, but it is also about the ability to control information about oneself, and consequently about the right to profit from one's personal information. Following this, Fernback & Papacharissi (2007) indicate that "privacy encompasses both social and economic dimensions and therefore the concept of privacy itself can be regarded as a public good" (p. 731). The first data protection law was enacted in Germany in 1970. Since then, more than 90 countries have adopted similar legislation aimed at restricting the ways in which personal information is collected, processed, stored, and used by online organizations (Millard, 2014).
Millard points to broad European legislation in terms of data protection laws at both the European Union level and the national level. However, other researchers (such as Luzak, 2014) argue that the main European legislation is not clear enough and that guidelines for future regulation and standardization are needed. Millard (2014) adds that, as in Europe, many countries in the Asia-Pacific region have initiated similar legislation, while in the USA the processing of personal data is subject to a complex patchwork of federal and state sector-specific laws and regulations. Six principles for the processing and use of data can be identified in existing data protection laws worldwide: (1) personal data is processed only with consent or some other legal justification; (2) data is processed fairly and lawfully; (3) data is adequate, relevant, and not excessive for specific, identified purposes; (4) data is accurate and kept up to date as necessary; (5) data is kept in an identifiable form only for as long as necessary; and (6) data is protected against unauthorized or unlawful processing, and against accidental loss or destruction (Millard, 2014: 335).


An interesting question – to which there is no clear answer – is: does privacy legislation affect the online behavior of users and websites alike? Birnhack & Elkin-Koren (2011), for example, who investigated the compliance of 1360 websites with legal requirements related to information privacy, found that only a small minority of these websites complied. Luzak (2014) concluded that even though the European ePrivacy Directive requires users' informed consent to the gathering, storing, and processing of their personal data, it is impossible to talk about "informed consent" when users are not aware of the cookies through which the data is collected. However, in their recent study, Miltgen & Smith (2015) found that when users trust the state's privacy regulatory system, they feel that they can behave online in a less cautious way. In the next section, the proposed analytical framework is presented, based on the main variables discussed in this section.

3. An analytical framework for online privacy research

The privacy literature reviewed above points to three main domains of investigation when studying online privacy and the disclosure of personal information: (a) international and national regulation, (b) website policy and behavior, and (c) users' characteristics, attitudes, and behavior. In most empirical privacy studies, authors usually address one, rarely two, of these domains. This has motivated several attempts in recent years to suggest more comprehensive models encompassing variables from all these domains. A notable such attempt is the "Antecedents-Privacy-Concerns-Outcomes (APCO) Macro Model" (Smith, Dinev & Xu, 2011). This model brings together most of the variables discussed in the previous section while suggesting connections between various antecedents of privacy concern and its outcomes.
Privacy concern plays a role in this model as both a dependent variable (on the antecedents) and an independent variable (explaining behavior). While the APCO model emphasizes external antecedents related to deliberative cognitive responses, Dinev, McConnell & Smith (2015) suggest the "Enhanced APCO Model," which adds variables taken from behavioral economics and psychology, such as biases and heuristics.

Both versions of the APCO model focus on connections among individual variables related to online privacy. The framework suggested in this work (see Figure 1) takes a different, more institutional, analytical path. The main difference is the grouping of the variables into three main groups according to the three aforementioned research domains: users, websites, and the state. Variables within each group relate to one another, while each group as a whole affects the other two. The framework also presents three external variables – online privacy risks, users' privacy concerns, and international privacy regulation – that affect the grouped variables. In addition, two variables – benefits and trust – are positioned between the Websites box and the Users box. On the one hand, websites' privacy notices, FIPs, and temptations for users (the three variables within the Websites box) point to the benefits that users gain from disclosing their private information. On the other hand, users' online knowledge and experience (variables positioned within the Users box) affect users' trust in a specific website. Finally, both the benefits that users gain and users' trust in a website affect users' online behavior (the extent to which they disclose personal information). Accordingly, "benefits" and "trust" are positioned between the Websites and Users boxes rather than within either of them.

All the variables included in the framework (with one exception, discussed below) and the connections between them are mentioned in the literature reviewed in section 2 of this paper, and therefore need not be elaborated on here. We believe that the suggested framework enables scholars of online privacy from different disciplines to look at the phenomenon from a broader perspective, in contrast with the relatively narrow point of view of each discipline, which usually addresses only one of the discussed research domains. Such a broad perspective might lead to various empirical studies, such as (a) investigating the effects of state privacy regulation on websites' FIPs; (b) studying the relationships between international regulation, state regulation, and website privacy policies; or (c) searching for connections between perceived risks, legal requirements, and privacy behavior.

Yet there is one variable within the websites domain (the central box in the model) to which, as far as we could find, no study directly refers: the views, beliefs, and knowledge of website owners and managers regarding online privacy. We think that these actors play a critical role in shaping website privacy policies and practices with respect to the three main features discussed in the literature: privacy notices, fair information practices (FIPs), and the types of temptation offered to users in exchange for their personal information (all represented in the Websites box of the framework). In order to fill this gap in the privacy literature, we conducted a survey of website owners and managers, which is reported in the next section.
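The grouping of variables into three domains described above can be summarized as a small data encoding. The representation below is ours, mirroring the boxes of Figure 1, and is purely illustrative:

```python
from dataclasses import dataclass, field

# Illustrative encoding of the framework's three domains and their variables,
# mirroring the boxes of Figure 1. The structure is ours, not the authors'.
@dataclass
class Domain:
    name: str
    variables: list = field(default_factory=list)

framework = {
    "users": Domain("Users", ["personal traits (e.g. tendency to trust)",
                              "perceived risks",
                              "online knowledge / experience"]),
    "websites": Domain("Websites", ["managers' views / knowledge",
                                    "privacy notices",
                                    "fair information practices (FIPs)",
                                    "temptations for users"]),
    "state": Domain("The State", ["legal requirements", "direct regulation"]),
}

# Mediating variables sit between the Websites and Users boxes:
mediators = ["benefits", "trust"]
# External variables affecting the grouped domains:
external = ["online privacy risks", "users' privacy concerns",
            "international privacy regulation"]

print(len(framework["websites"].variables))  # 4
```

Encoding the framework this way makes the missing variable explicit: "managers' views / knowledge" appears in the Websites domain even though, as noted, no prior study has measured it.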


[Figure 1. Analytical framework for online privacy research. The figure comprises four boxes: Users (privacy concerns; personal traits, e.g., tendency to trust; perceived risks; online knowledge/experience), Websites (managers' views/knowledge; privacy notices; fair information practices (FIPs); temptations for users), The State (legal requirements; direct regulation), and International privacy regulation. These are connected to online privacy risks, to the mediating variables benefits and trust, and to the outcome variable online behavior (disclosing personal information).]

4. The missing variable: Website owners' and managers' views and knowledge

4.1 Methodology and participants

An online survey was distributed between August and October 2015 among individuals who operate various types of Israeli-based websites. The main website types were commercial, social, educational, local authority, and governmental. Some of the commercial and educational website operators are also the website owners. Potential participants were approached personally through one or more of three channels: the website's "contact" box, personal email addresses, and telephone calls. Altogether, almost 1000 individuals were asked to participate in the survey, of whom 100 (about 10%) responded positively and completed the questionnaire. Because of a requirement for anonymity, we have no way to identify those who responded, either by name or by type of website; therefore, we cannot be sure that our sample represents well the 1000 individuals who were approached. A ten percent response rate is not high, but it is not unusual in online surveying. Several past studies have pointed to such difficulties in online surveys (questions of representativeness and relatively low response rates) while still noting their benefits, such as convenience, low administration cost, and ease of follow-up (see, for example, Baruch & Holtom, 2008; Deutskens, Ruyter, Wetzels, & Oosterveld, 2004; Evans & Mathur, 2005).

The survey consisted of 43 questions based on Westin's (1967; 2003) questionnaires, which have been validated in many studies with different population groups and in various contexts. The questionnaire has three parts. The first part consists of twelve statements that directly or indirectly address views about trust and concerns regarding the violation of user privacy by websites and third parties. Two of the statements reflect the respondents' trust that laws and technologies can prevent privacy violation.
The other ten statements address various situations in which user privacy might be violated, and the associated website behavior that might prevent such violation. The second part of the questionnaire addresses certain activities of the respondent's website with regard to users' personal information. In addition, in this part the respondents were asked for their views on whether users should be asked to provide seven specific personal details while browsing their websites. In the first two parts of the questionnaire, the respondents were asked to indicate their level of agreement with each statement on a six-point Likert scale. In the third part, the respondents were asked to provide demographic information such as age, gender, and education. Prior to distribution, the questionnaire was approved by the Research Ethics Committee of our institution (equivalent to IRB approval in the US).


Based on the website managers' responses to the first two parts of the questionnaire, three indexes were built: a "privacy concern index," a "privacy-related actions index," and a (privacy) "data demand index." Each index reflects the average value for one of the three issues investigated. One quarter of the participants indicated that they were both the website owner and the manager, while one third defined themselves as employees who functioned as managers. Forty-three percent of these two groups indicated that they were the website's content manager. Fifty-four percent of the participants were male and 45% were female. Almost half of the participants held a high school degree, while one quarter held higher education degrees. One reason might be that some high-tech experts in Israel gain their expertise while serving in the army and then join the labor market with no higher education; another might be that not all governmental employees need to be college graduates in order to join the service. The participants' ages ranged from 21 to 68 (average = 41.35; SD = 12.18).

4.2 Findings

The findings are presented in three phases of analysis, mainly to make them easier for readers to follow. The first phase, a descriptive one, presents the responses to each statement in the first two parts of the questionnaire and then the three indexes. The second phase addresses the connections between the investigated variables: the managers' demographic data (taken from the responses to the third part of the questionnaire), their concerns (the first part), and the actual behavior of their websites (the second part). We then used PLS-SEM software to estimate a recursive structural equation model in order to give meaning to these connections. The third and last phase analyzes how well the respondents understood the statements.
This phase is as descriptive as the first one; however, we present these data separately because we believe that this issue, website managers' literacy, is significant for understanding the findings presented in the other two phases.

Phase 1: Participants' responses to the statements and the three indexes

The first group of statements addresses concern and trust. Some of the "concern statements" are articulated not as direct expressions of concern but as actions to be taken in order to protect users' privacy. In this way, it is possible to understand whether the respondents think there is reason for concern. As shown in Table 1, the managers' level of trust in existing laws and technologies is relatively low (27%), while their levels of concern are very high (from 64% to 100%). The "acceptance" column refers to respondents' agreement with each statement.

Table 1: Levels of Concern and Trust (n=100)

Statement | Acceptance (%)
I trust existing technologies that protect users' online privacy | 27
I trust existing laws that protect users' privacy and their information | 27
I am concerned that commercial corporations will misuse information that users upload to the Net | 64
The duration of time in which users' personal information is kept by content providers on the Internet should be limited | 70
The amount of users' personal information that content providers collect and keep should be limited | 74
The misuse of users' personal information by other users should be prevented | 80
When users provide online personal information to state authorities, a balance between state interests and users' right for privacy should be guaranteed | 81
The enforcement of privacy notices published on websites should be strengthened | 86
The use that content providers do with users' personal information should be limited | 91
Users' control of their online personal information should be guaranteed | 95
The use that content providers do with users' personal information without users' permission should be limited | 96
When users provide online personal information to commercial firms, it should be guaranteed that this information would be kept and not be transferred to a third party | 100

Based on the responses to the statements about concern and trust, a "concern index" was built as the average value of the twelve statements. After appropriate reverse coding, the index values ranged from 1 (low concern) to 5 (high concern). The internal consistency of the index according to Cronbach's alpha (α) = 0.75, and the index average = 6..4 (SD = 0.64). This index value demonstrates a considerably high level of concern, or at least a declared high level of concern (we discuss this distinction later).
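The index construction described here (reverse coding the trust-worded items so all statements point in the same direction, averaging each respondent's answers, and checking internal consistency with Cronbach's alpha) can be sketched as follows. The response matrix is hypothetical, not the study's data; the 1-6 range follows the questionnaire's six-point Likert format.

```python
import numpy as np

def reverse_code(responses, scale_max=6, scale_min=1):
    """Reverse-code Likert items so all statements point the same way."""
    return scale_max + scale_min - responses

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses of 5 managers to 4 concern statements (1-6 scale);
# the first item is trust-worded, so it is reverse-coded before averaging.
resp = np.array([
    [2, 5, 5, 6],
    [3, 4, 5, 5],
    [1, 6, 6, 6],
    [4, 3, 4, 4],
    [2, 5, 6, 5],
])
resp[:, 0] = reverse_code(resp[:, 0])
concern_index = resp.mean(axis=1)   # one index score per respondent
alpha = cronbach_alpha(resp)        # internal consistency of the index
```

The same recipe applies to the privacy-related actions index and the data demand index, with their own scale bounds.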

The second group of statements aims at exploring the actual behavior of the respondents' websites in possible situations in which users' privacy might be violated. As Table 2 shows, the managers' responses point to relatively moderate levels of actions that might lead to such violations.


Table 2: Privacy-related actions (n=100)

Statement | Acceptance (%)
My website transfers information about users to state authorities | 5
My website transfers users to other websites without their permission | 10
My website transfers information about users to commercial firms | 10
My website tailors advertisements according to the user's profile | 22
My website automatically crosschecks users' information | 23
My website keeps users' names for a certain period of time | 44
My website checks what websites users are coming from | 48
My website collects information about users' browsing habits | 55
My website identifies the device users use for browsing | 67

Based on the responses to the above nine statements, a "privacy-related actions index" was built as the average value of the responses to these statements. The index values ranged from 1 (Never) to 5 (Always). The internal consistency of the index according to Cronbach's alpha (α) = 0.78, and the index average = 1.89 (SD = 0.89). This indicates a relatively low level of activity that might violate users' privacy.

The respondents were asked about their views regarding websites' requirement of seven specific items defined as users' personal information: name, street address, place of work, date of birth, gender, sexual preference, and hobbies. Three possibilities were suggested: allowing the requirement of the specific personal detail, limiting it, and banning it. As can be seen in Figure 2, the respondents' answers depended on the type of information. For three of the personal details (name, gender, and date of birth), most respondents approved the requirement. For another three (hobbies, street address, and place of work), most respondents supported limiting the requirement, and only with regard to sexual preference did most respondents favor banning the requirement.


[Figure 2. Managers' views: Personal details users should provide (%). A bar chart showing, for each of the seven personal details (name, gender, date of birth, hobbies, street address, place of work, sexual preference), the percentage of respondents who would allow, limit, or ban the requirement.]

Based on these responses, a privacy "data demand index" was built as the average value of the relevant statements. The values in this index ranged from 1 (Ban) to 3 (Allow). The internal consistency of the index according to Cronbach's alpha (α) = 0.84, and the index average = ..98 (SD = 0.89). This relatively high value can be seen, not surprisingly, as reflecting the willingness of website managers to acquire users' personal information.

Phase 2: Towards a causal model

In this phase of analysis, we attempted to find the relationships between the various investigated variables and then describe their combined relationships in a model. The most significant relationships were found between a manager's age, his or her level of concern, and the website's behavior regarding user privacy. A positive and significant Pearson correlation was found between the managers' age and the privacy concern index (r = 0.378, p < 0.01). However, a negative and significant Pearson correlation was found between the managers' age and the privacy-related activities performed on the website (r = -0.101, p < 0.01). These findings reveal that younger managers are less concerned about the violation of users' privacy but are better at protecting users' privacy while operating their websites. In contrast, older managers are more concerned about the violation of users' privacy, but the level of protection of users' privacy on their websites is relatively lower. This divergence between the levels of concern and behavior on the one hand and the manager's age on the other is the first part of what we call "the website privacy paradox," to which we return later.
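A Pearson correlation of this kind is computed directly from the two variable vectors. The sketch below uses hypothetical age and concern-index values (not the study's raw data) to illustrate the sign convention behind the positive r = 0.378.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two variables."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum())

# Hypothetical data: six managers' ages and their concern-index scores.
age     = np.array([25, 31, 38, 44, 52, 60])
concern = np.array([3.8, 4.0, 4.1, 4.3, 4.5, 4.6])
r = pearson_r(age, concern)  # positive r: older managers report more concern
```

A negative r, as reported for age versus the privacy-related actions index, would arise when one vector decreases as the other increases.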

In order to investigate differences in managers' concern across levels of education, we used a one-way analysis of variance (ANOVA). We found differences in levels of concern between the education levels (F(98) = 3.65, p < 0.05). The highest level of concern (M = 4.44, SD = 0.46) was shown by respondents with postgraduate degrees, closely followed by respondents with high school education (M = 4.36, SD = 0.44), while the lowest level of concern (M = 4.0, SD = 0.50) was shown by respondents with bachelor's degrees. Our data do not point to any possible explanation for these education-related differences; future research should address this point with reference to privacy literacy, which was discussed earlier.
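The one-way ANOVA behind the reported F statistic partitions the variance of the concern scores into between-group and within-group components. A minimal sketch, using hypothetical scores for the three education groups:

```python
import numpy as np

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA over a list of sample arrays."""
    groups = [np.asarray(g, float) for g in groups]
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    k, n = len(groups), all_vals.size
    ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    ms_between = ss_between / (k - 1)   # df_between = k - 1
    ms_within = ss_within / (n - k)     # df_within  = n - k
    return ms_between / ms_within

# Hypothetical concern scores by education level (high school, BA, postgraduate).
high_school = [4.3, 4.4, 4.5, 4.2]
bachelor    = [3.9, 4.0, 4.1, 4.0]
postgrad    = [4.4, 4.5, 4.3, 4.6]
F = one_way_anova_F([high_school, bachelor, postgrad])
```

A large F relative to its degrees of freedom indicates that group means differ more than within-group noise would explain.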

Finally, we used PLS-SEM software to estimate a recursive structural equation model (see Figure 3) in which causation is directed in a single direction. Based on the model's goodness of fit (χ2 = 7.122, p < .05), we can inspect the various connections measured in this study and reported above, leading to the values of the privacy-related actions index. As can be seen in the model, the overall explained variance of this path analysis is 5% (based on the adjusted R-square). The attributional variables (age, sex, education) display mixed effects on the indexes: age has a positive effect (r = 0.231) on the data demand index and a negative effect (r = -0.173) on the concern index; sex (measured here as a binary/dummy variable) has a negative effect (r = -0.287) on the data demand index and a positive effect (r = 0.312) on the concern index. Thus, women tend toward higher values on the concern index and men toward higher values on the data demand index. Education level has a negative effect on both indexes (r = -0.173 for the data demand index and r = -0.322 for the concern index). The model also shows a positive effect of the data demand index (r = 0.153) and a negative effect of the concern index (r = -0.191), both weaker than the previous indicators. This might be the result of other variables that were not measured in this research (some, mainly exogenous ones, are suggested in our analytical framework for online privacy research; see Figure 1).
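The adjusted R-square reported for the path analysis penalizes plain R-square for the number of predictors. The sketch below, under assumed synthetic data (three attributional predictors with a deliberately weak effect, fitted by ordinary least squares rather than PLS), shows the computation:

```python
import numpy as np

def adjusted_r2(y, y_hat, n_predictors):
    """Adjusted R-squared for a fitted model with n_predictors regressors."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    n = y.size
    ss_res = ((y - y_hat) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1 - ss_res / ss_tot
    return 1 - (1 - r2) * (n - 1) / (n - n_predictors - 1)

# Hypothetical setup: predict an actions index from age, sex, education.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # 3 standardized predictors
y = 0.2 * X[:, 0] + rng.normal(size=100)      # weak signal, much noise
X1 = np.column_stack([np.ones(100), X])       # add intercept column
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
adj = adjusted_r2(y, X1 @ beta, n_predictors=3)
```

With a weak signal like this, the adjusted value stays small, in the spirit of the 5% explained variance reported above.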


Figure 3: A path model describing the attribution variables and indexes

Discriminant validity was assessed following the Fornell and Larcker (1981) criterion. According to this criterion, validity is established if a latent variable in the model accounts for more variance in its related indicator variables than it shares with other constructs. Chin (2010) suggests that, ideally, the average variance extracted (AVE) should be greater than 0.50. The path model (Figure 3) yields AVE values ranging from 0.487 to 0.557.

Phase 3: The managers' understanding of the statements

The most unexpected finding is that a relatively high percentage of the respondents (13% to 31%) claimed that they did not understand one or more of the privacy issues presented in the questionnaire (see Table 3). Moreover, the managers who did not understand the issues were more convinced that they had met users' privacy concerns than the managers who claimed to understand the same issues.

Table 3: Statements that managers claim not to understand

Statement | % Not understood
When users provide online personal information to commercial firms, it should be guaranteed that this information would be kept and not be transferred to a third party | 31
Content providers' use of users' personal information should be limited | 25
The misuse of users' personal information by other users should be prevented | 24
Users' control of their online personal information should be guaranteed | 23
The use that content providers do with users' personal information without users' permission should be limited | 23
The duration of time in which users' personal information is kept by content providers on the Internet should be limited | 17
I am concerned that commercial corporations will misuse information that users upload to the net | 16
When users provide online personal information to state authorities, a balance between state interests and users' right for privacy should be guaranteed | 13

This unexpected phenomenon of website managers who claim not to understand certain possible situations of privacy violation is the second part of what we call "the website privacy paradox". As mentioned earlier, the first part of the paradox addresses the relationships between managers' age, concern, and behavior. We discuss the two parts of this paradox in the next section.

5. Discussion

This research consists of two main parts: first, it proposes an analytical framework for studying online privacy, and second, it presents a study of website owners' and managers' views and knowledge regarding user privacy.

Most of the current online privacy literature addresses one of three levels of analysis (Greenaway & Chan, 2005): (a) the individual level, which mainly addresses internet users and focuses on their concerns regarding online privacy as well as their behavior following these concerns; (b) the sectoral/national level, which focuses mainly on policy-makers and regulators as protectors of users' privacy; and (c) the organizational level, in which the research unit is usually the individual organization (a firm or a website). The separation between these research levels is understandable because of the interdisciplinary nature of online privacy research. While internet and communications scholars, as well as psychologists and sociologists, focus on the individual level, public policy, law, and regulation scholars are more interested in the sectoral/national level, and scholars of organizational behavior, management, and information studies focus more on the organizational level. Online commerce and advertising researchers conduct studies addressing all three levels. The interdisciplinary nature of the issue, however, drove us to suggest the analytical framework (Figure 1), which brings together the variables addressed in all relevant dimensions of online privacy while still relating each variable to a specific research domain. We hope that this framework will contribute to future empirical studies that take into consideration the relevant variables from all three dimensions and provide research on privacy in general, and on the issue of disclosing personal information in particular, with more holistic answers. Scholars who choose to focus on one domain can still use the framework to identify exogenous variables for the specific research domain they chose to study.

As to the second part of this paper, we hope that the findings of the survey of website owners and managers will add to the current knowledge about how and why website privacy policies and practices are formed. The individuals in our sample indicated that they support various limitations on website activities when handling users' personal information: for example, limitations on the amount of such information that can be collected and kept, limitations regarding the use of this information, and limitations regarding the sharing of the information with third parties. In addition, the respondents support the idea that users should have control over their personal information (see Table 1). These findings can be explained in one of two ways: (1) website managers share the same concerns that users have regarding online privacy, or, to put it another way, website managers really care about their users' privacy; or (2) website managers are aware of users' privacy concerns (even though they do not necessarily share them) and feel that they should meet those concerns with appropriate policies for the benefit of their businesses. While the first explanation is rather naïve, the second is more practical. The findings regarding the actual behavior of the websites (see Table 2) and the findings about the managers' views regarding the requirement of users' personal information (see Figure 2) support the second explanation.
Website managers do see the importance, and even the necessity, of collecting and storing users' personal information for the benefit of their websites; however, they report relatively cautious behavior when handling this information. This analysis supports the view that websites treat privacy policies as a managerial concept (Barney & Hansen, 1994; Gerlach, Widjaja & Buxmann, 2015) used to gain either legitimacy or competitive advantage (Greenaway & Chan, 2005). The relatively low average value of our privacy-related actions index (1.89 on a 5-point scale) implies that the managers support fair information practices when handling users' personal information (Culnan & Armstrong, 1999; Culnan & Bies, 2003; Mollick & Mykytyn, 2009).

Furthermore, this study demonstrates that a manager's age might affect the website's privacy policies. The younger the manager, the lower his or her level of privacy concern and the higher his or her level of cautious behavior. This might be seen as a paradox ("the website privacy paradox"), because one might expect that when the level of concern is high, the level of cautious behavior would be high as well. However, this paradox can be explained by the managers' different levels of knowledge (or online literacy). Such differences in knowledge are one of the explanations for the gap between users' privacy concerns and their willingness to provide personal information (Acquisti & Grossklags, 2005; Blank et al., 2014; Paine et al., 2007; Rader, 2014). In the same way, we suggest that younger website managers are more informed about, and more acquainted with, internet-specific technological characteristics, as well as internet practices and risks, compared to older managers. Therefore, the informed, younger managers are less concerned about their users' privacy on the one hand, but on the other hand can more easily and naturally provide users with solutions to their privacy concerns. This explanation also serves the other part of the website privacy paradox: the relatively high percentage of managers in our sample who claimed not to understand some of the privacy situations they were asked about in the questionnaire (see Table 3). If one accepts the assumption that a younger manager holds more knowledge about privacy risks and solutions than an older manager, it follows that while older managers might not be familiar with all the privacy risks, they are still sure that they can secure their users' privacy. All these findings regarding website managers lead us to think that media literacy in general, and online privacy literacy in particular (Trepte et al., 2015), should become a major issue not only in education but in regulation as well. The practical ways to do so are beyond the scope of this work.

Finally, we would like to point to three main limitations of this research. (a) In our framework, as well as in the empirical research, we did not distinguish between the various types of websites: commercial, public, governmental, etc. Making such a distinction might change the findings regarding the relationship between the identity of the website manager (his/her age and education) on the one hand, and the levels of privacy concerns and privacy-related activities on the other.
Future research should address this distinction in order to disclose not only the effect of demographic differences on managers' privacy behavior but also the effect of different website types. (b) Statistically, the response rate in this survey (10%) is acceptable for online surveys; however, it is not high enough to ensure that those who responded to the questionnaire are a representative sample of Israeli website managers. (c) The demographic variables tested in our survey (age, gender, and education) were chosen based on previous studies conducted in other countries; future studies could test the effect of other variables (such as the manager's academic discipline or income level) on privacy behavior. One last comment: we believe that, in spite of these limitations, this empirical study signifies the opportunities for future research on online privacy. At the theoretical level, the analytical framework we suggested provides privacy researchers with a broad perspective on the issue and enables scholars to conduct interdisciplinary research on online privacy.


Short Biography Avshalom Ginosar, PhD, is a Senior Lecturer at the Communication Department at the Academic College of Yezreel Valley, Israel. His main research areas include media policies and regulation, journalism ethics, and online journalism.

Yaron Ariel, PhD, is a Lecturer at the Communication Department at the Academic College of Yezreel Valley, Israel. His main research areas include new media and online communities.


References

Acquisti, A. and Grossklags, J. (2005). Privacy and rationality in individual decision making. IEEE Security & Privacy, 3(1): 26-33.

Ang, P. H. (2001). The role of self-regulation of privacy and the internet. Journal of Interactive Advertising, 1(2): 1-9.

Awad, N. F. and Krishnan, M. (2006). The personalization privacy paradox: An empirical evaluation of information transparency and the willingness to be profiled online for personalization. MIS Quarterly, 30(1): 13-28.

Bansal, G., Zahedi, F. M. and Gefen, D. (2016). Do context and personality matter? Trust and privacy concerns in disclosing private information online. Information & Management, 53: 1-21.

Barnes, S. B. (2006). A privacy paradox: Social networking in the United States. First Monday, 11(9).

Barney, J. B. and Hansen, M. (1994). Trustworthiness as a source of competitive advantage. Strategic Management Journal, 15: 175-190.

Bartsch, M. and Dienlin, T. (2016). Control your Facebook: An analysis of online privacy literacy. Computers in Human Behavior, 56: 147-154.

Baruch, Y. and Holtom, B. C. (2008). Survey response rate level and trends in organizational research. Human Relations, 61(8): 1139-1160.

Beldad, A., De Jong, M. and Steehouder, M. (2010). How shall I trust the faceless and intangible? A literature review on the antecedents of online trust. Computers in Human Behavior, 26: 857-869.

Bennett, C. J. and Parsons, C. (2013). Privacy and surveillance: The multidisciplinary literature on the capture, use, and disclosure of personal information in cyberspace. In Dutton, W. (ed.), The Oxford Handbook of Internet Studies. Oxford, UK: Oxford University Press, pp. 486-508.


Bergström, A. (2015). Online privacy concerns: A broad approach to understanding the concerns of different groups for different uses. Computers in Human Behavior, 53: 419-426.

Birnhack, M. and Elkin-Koren, N. (2011). Does law matter online? Empirical evidence on privacy law compliance. Michigan Telecommunications & Technology Law Review, 17: 337-384.

Blank, G., Bolsover, G. and Dubois, E. (2014). A new privacy paradox: Young people and privacy on social network sites. Prepared for the Annual Meeting of the American Sociological Association, 16-19 August 2014, San Francisco, California.

Blank, G. and Dutton, W. H. (2012). Age and trust in the internet: The centrality of experience and attitudes towards technology in Britain. Social Science Computer Review, 30(2): 135-151.

Chin, W. W. (2010). How to write up and report PLS analyses. In Esposito Vinzi, V., Chin, W. W., Henseler, J. and Wang, H. (eds.), Handbook of Partial Least Squares: Concepts, Methods and Applications in Marketing and Related Fields. Berlin: Springer, pp. 655-690.

Culnan, M. J. and Armstrong, P. K. (1999). Information privacy concerns, procedural fairness, and impersonal trust: An empirical investigation. Organization Science, 10(1): 104-115.

Culnan, M. J. and Bies, R. J. (2003). Consumer privacy: Balancing economic and justice considerations. Journal of Social Issues, 59(2): 323-342.

Deutskens, E., Ruyter, K. D., Wetzels, M. and Oosterveld, P. (2004). Response rate and response quality of internet-based surveys: An experimental study. Marketing Letters, 15(1): 21-36.

Dinev, T., McConnell, A. R. and Smith, H. J. (2015). Informing privacy research through information systems, psychology, and behavioral economics: Thinking outside the "APCO" box. Information Systems Research, 26(4): 639-655.

Dutton, W. H. and Shepherd, A. (2006). Trust in the internet as an experience technology. Information, Communication & Society, 9(4): 433-451.


Earp, J. B., Antón, A. I., Aiman-Smith, L. and Stufflebeam, W. H. (2005). Examining internet privacy policies within the context of user privacy values. IEEE Transactions on Engineering Management, 52(2): 227-237.

Earp, J. B. and Baumer, D. (2003). Innovative web use to learn about consumer behavior and online privacy. Communications of the ACM, 46(4): 81-83.

Evans, J. R. and Mathur, A. (2005). The value of online surveys. Internet Research, 15(2): 195-219.

Fernback, J. and Papacharissi, Z. (2007). Online privacy as legal safeguard: The relationship among consumer, online portal, and privacy policies. New Media & Society, 9(5): 715-734.

Fogel, J. and Nehmad, E. (2009). Internet social network communities: Risk taking, trust, and privacy concerns. Computers in Human Behavior, 25: 153-160.

Fornell, C. and Larcker, D. F. (1981). Evaluating structural equation models with unobserved variables and measurement error. Journal of Marketing Research, 18: 39-50.

Gerlach, J., Widjaja, T. and Buxmann, P. (2015). Handle with care: How online social network providers' privacy policies impact users' information sharing behavior. Journal of Strategic Information Systems, 24: 33-43.

Greenaway, K. E. and Chan, Y. E. (2005). Theoretical explanations for firms' information privacy behaviors. Journal of the Association for Information Systems, 6(6): 171-198.

Gross, R. and Acquisti, A. (2005). Information revelation and privacy in online social networks. Proceedings of the 2005 ACM Workshop on Privacy in the Electronic Society, Virginia, USA, pp. 71-80.

Hirsch, D. D. (2011). The law and policy of online privacy: Regulation, self-regulation or co-regulation? Seattle University Law Review, 34: 439-480.

Hsu, M. and Kuo, F. (2003). The effect of organization-based self-esteem and deindividuation in protecting personal information privacy. Journal of Business Ethics, 42: 305-320.

Jensen, C. and Potts, C. (2004). Privacy policies as decision-making tools: An evaluation of online privacy notices. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 471-478.


Joinson, A. N., Reips, U., Buchanan, T. and Schofield, C. B. P. (2010). Privacy, trust, and self-disclosure online. Human-Computer Interaction, 25(1): 1-24.

Kachhi, D. and Link, M. W. (2009). Too much information: Does the internet dig too deep? Journal of Advertising Research, 49(1): 74-81.

Knijnenburg, B. P., Kobsa, A. and Jin, H. (2013). Dimensionality of information disclosure behavior. International Journal of Human-Computer Studies, 71: 1144-1162.

Kobsa, A. (2007). Privacy-enhanced web personalization. In Brusilovsky, P., Kobsa, A. and Nejdl, W. (eds.), The Adaptive Web. Berlin: Springer, pp. 628-670.

Laufer, R. S. and Wolfe, M. (1977). Privacy as a concept and a social issue: A multidimensional developmental theory. Journal of Social Issues, 33(3): 22-42.

Li, Y. (2012). Theories in online information privacy research: A critical review and an integrated framework. Decision Support Systems, 54: 471-481.

Livingstone, S., Bober, M. and Helsper, E. (2005). Internet literacy among children and young people: Findings from the UK Children Go Online Project. London, UK: LSE. http://eprints.lse.ac.uk/397/

Livingstone, S., Haddon, L., Görzig, A. and Ólafsson, K. (2011). EU Kids Online final report. London School of Economics & Political Science, London, UK. http://eprints.lse.ac.uk/39351/ [Retrieved 28.9.2016]

Luo, X. (2002). Trust production and privacy concerns on the internet: A framework based on relationship marketing and social exchange theory. Industrial Marketing Management, 31(2): 111-118.

Luzak, J. A. (2014). Privacy notice for dummies? Towards European guidelines on how to give "clear and comprehensive information" on the cookies' use in order to protect the internet users' right to online privacy. Journal of Consumer Policy, 37: 547-559.

Lyon, D. (2001). Surveillance Society: Monitoring Everyday Life. Philadelphia: Open University Press.

Marsden, C. T. (2008). Beyond Europe: The internet, regulation, and multi-stakeholder governance—representing the consumer interest? Journal of Consumer Policy, 31(1): 115-132.


Millard, C. (2014). Data privacy in the cloud. In Graham, M. and Dutton, W. H. (eds). Society & the Internet. Oxford: Oxford University Press, pp. 333-347.

Milne, G. R. and Culnan, M. J. (2002). Using the content of online privacy notices to inform public policy: A longitudinal analysis of the 1998-2001 U.S. web surveys. The Information Society, 18: 345-359.

Miltgen, C. L. and Smith, H. J. (2015). Exploring information privacy regulation, risks, trust, and behavior. Information & Management, 52: 741-759.

Mollick, J. S. and Mykytyn, P. P. (2009). An empirical investigation on the effects of privacy policies on perceived fairness of online vendors. Journal of Internet Commerce, 8: 88-112.

Murray, A. D. (2011). Cyberspace regulation. In Levi-Faur, D. (ed.). Handbook on the Politics of Regulation. Cheltenham, Glos, UK; Northampton, MA: Edward Elgar Publishing, pp. 267-282.

Norberg, P. A., Horne, D. R. and Horne, D. A. (2007). The privacy paradox: Personal information disclosure intentions versus behaviors. Journal of Consumer Affairs, 41(1): 100-126.

Oulasvirta, A., Suomalainen, T., Hamari, J., Lampinen, A. and Karvonen, K. (2014). Transparency of intentions decreases privacy concerns in ubiquitous surveillance. CyberPsychology, Behavior, and Social Networking, 17: 633-638.

Paine, C., Reips, U., Stieger, S., Joinson, A. and Buchanan, T. (2007). Internet users' perceptions of 'privacy concerns' and 'privacy actions'. International Journal of Human-Computer Studies, 65(6): 526-536.

Pan, Y. and Zinkhan, G. M. (2006). Exploring the impact of online privacy disclosures on consumer trust. Journal of Retailing, 82(4): 331-338.

Park, Y. J. (2013). Digital literacy and privacy behavior online. Communication Research, 40: 215-236.

Rader, E. (2014). Awareness of behavioral tracking and information privacy concern in Facebook and Google. Proceedings of the Symposium on Usable Privacy and Security (SOUPS), Menlo Park, CA, USA.


Rasmus, H. and Stine, L. (2013). Regulatory response? Tracking the influence of technological developments on privacy regulation in Denmark from 2000 to 2011. Policy & Internet, 3: 289-303.

Riquelme, I. P. and Roman, S. (2014). Is the influence of privacy and security on online trust the same for all types of consumers? Electronic Markets, 24: 135-149.

Sayre, S. and Horne, D. (2000). Trading secrets for savings: How concerned are consumers about club cards as a privacy threat? Advances in Consumer Research, 27(1): 151-155.

Schwaig, K. S., Kane, G. C. and Storey, V. C. (2006). Compliance to the fair information practices: How are the Fortune 500 handling online privacy disclosures? Information & Management, 43: 805-820.

Sheehan, K. B. (2002). Toward a typology of internet users and online privacy concerns. The Information Society, 18(1): 21-32.

Smith, H. J., Dinev, T. and Xu, H. (2011). Information privacy research: An interdisciplinary review. MIS Quarterly, 35(4): 989-1016.

Steinfeld, N. (2016). "I agree to the terms and conditions": (How) do users read privacy policies online? An eye-tracking experiment. Computers in Human Behavior, 55: 992-1000.

Strauss, J. and Rogerson, K. S. (2002). Policies for online privacy in the United States and the European Union. Telematics and Informatics, 19(2): 173-192.

Taddei, S. and Contena, B. (2013). Privacy, trust and control: Which relationships with online self-disclosure? Computers in Human Behavior, 29: 821-826.

Taddicken, M. (2014). The 'privacy paradox' in the social web: The impact of privacy concerns, individual characteristics, and the perceived social relevance on different forms of self-disclosure. Journal of Computer-Mediated Communication, 19(2): 248-273.

Trepte, S., Teutsch, D., Masur, P. K., Eicher, C., Fischer, M., Hennhöfer, A. and Lind, F. (2015). Do people know about privacy and data protection strategies? Towards the "Online Privacy Literacy Scale" (OPLIS). In Gutwirth, S., Leenes, R. and De Hert, P. (eds). Reforming European Data Protection Law. Dordrecht: Springer, pp. 333-366.

Tsai, J. Y., Egelman, S., Cranor, L. and Acquisti, A. (2011). The effect of online privacy information on purchasing behavior: An experimental study. Information Systems Research, 22(2): 254-268.

Westin, A. F. (1967). Privacy and Freedom. New York: Atheneum.

Westin, A. F. (2003). Social and political dimensions of privacy. Journal of Social Issues, 59(2): 431-453.

Wu, K., Huang, S. Y., Yen, D. C. and Popova, I. (2012). The effect of online privacy policy on consumer privacy concern and trust. Computers in Human Behavior, 28: 889-897.

Xu, H., Luo, X. R., Carroll, J. M. and Rosson, M. B. (2011). The personalization privacy paradox: An exploratory study of decision making process for location-aware marketing. Decision Support Systems, 51(1): 42-52.

Xu, F., Michael, K. and Chan, X. (2013). Factors affecting privacy disclosure on social network sites: An integrated model. Electronic Commerce Research, 13: 151-168.
