Supporting negotiation mechanism privacy authority method in cloud computing


Knowledge-Based Systems 51 (2013) 48–59


Changbo Ke a,*, Zhiqiu Huang a, Mei Tang b

a College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing, Jiangsu 210016, China
b Business Administration School, Nanjing University of Finance & Economics, Nanjing, Jiangsu 210046, China

Article history: Received 12 August 2012; received in revised form 28 June 2013; accepted 5 July 2013; available online 16 July 2013.

Abstract: Cloud computing has become a software paradigm that provides services dynamically according to user requirements. However, it is difficult to control personal privacy information because of the openness, virtualization, multi-tenancy and service outsourcing characteristics of the cloud. Therefore, how to protect user privacy information has become a research focus in cloud computing. Considering the service outsourcing characteristic, we propose a privacy information description method and negotiation mechanism. Firstly, we describe privacy properties with a Privacy Negotiation Language (PNL) based on description logic. Secondly, we obtain the privacy attribute sequence through pre-negotiation between user and service composer. Thirdly, through exchanging privacy disclosure assertions, we obtain a privacy policy that satisfies both parties. In the end, we put forward the privacy policy negotiation algorithm. Through a case study we demonstrate the feasibility and correctness of this method. © 2013 Elsevier B.V. All rights reserved.

Keywords: Cloud computing; Description logic; Privacy property; Privacy policy; Privacy preference

1. Introduction

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction [1]. With the characteristics of service outsourcing, virtualization, distribution and multi-tenancy, cloud computing has become a new computing paradigm and research focus. These characteristics enhance service quality and reduce the waste of computing resources; for example, service outsourcing enhances service capability and specialization through service composition [2]. However, because privacy information is transparent to the outsourcing service provider, users worry that it will be hard to prevent their privacy data from being illegally propagated and used. For example, Google was sued by many users in America because of its new unified privacy policy implemented from March 1st, 2012; in Europe, the implementation of this policy was investigated by the European Union and postponed. According to the analysis by the American Electronic Privacy Information Center, Google's new privacy policy does not consider how privacy data is used in the product, nor to whom privacy data is propagated according to the user's privacy requirement, and it may conflict with local laws. Therefore,

* Corresponding author. Tel.: +86 15205176348. E-mail addresses: [email protected] (C. Ke), [email protected] (Z. Huang).
http://dx.doi.org/10.1016/j.knosys.2013.07.001

privacy protection in cloud computing has become a research focus in this evolving computing paradigm. Privacy was originally proposed as the human right to be let alone [3]. In the domain of information systems and software engineering, privacy protection means the capability of preventing individual information from being collected, disclosed and stored by others [4]. The Platform for Privacy Preferences (P3P) [5], developed by the World Wide Web Consortium (W3C) in 2002, provides a standard, machine-understandable privacy policy, which is matched against the user's privacy preference. According to the matching results, the user can select a service that meets the privacy preference. However, privacy requirements described with P3P lack semantic information, and P3P applies only to web sites, without supporting privacy protection in service composition. Therefore P3P cannot be applied in cloud computing, since all entities in cloud computing are services and provide service through service composition. In 2005 the Extensible Access Control Markup Language (XACML) [6] was proposed by the Organization for the Advancement of Structured Information Standards (OASIS). XACML 2.0 [7] extends the support for privacy policy through a profile for privacy policies. However, different users in cloud computing have different privacy requirements, requiring different definitions of sensitive privacy information. XACML privacy policies apply only to the service provider without considering the user privacy requirement, and can hardly guarantee that the composite service satisfies the user privacy requirement. Pearson et al. [8,9] defined privacy protection in cloud computing as the capability of the user to control that Personal Sensitive Information (PSI) is not collected, used, disclosed and stored by the cloud service provider. They provided certain theoretical guidance, but did not


put forward a specific solution for privacy protection in cloud computing. Cloud computing is a three-tiered framework: Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS). Considering the service outsourcing characteristic of cloud computing, our research focuses on privacy protection of services and service composition in SaaS. In this paper, we propose a privacy authority method supporting a negotiation mechanism. First of all, we make two assumptions:

• Privacy data is encrypted when transferred over network links, Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) in cloud computing.
• Services in Software as a Service (SaaS) are traditional web services.

On the basis of these assumptions, this paper contributes the following aspects: firstly, a privacy property description method, which is semantics-oriented and machine-readable; secondly, a negotiation mechanism for privacy policy, which can also be extended to other negotiation scenarios, for example negotiation between service composers and atomic service providers; thirdly, a negotiation algorithm and framework, which can serve as the theoretical basis for implementing a privacy policy negotiation system in cloud computing.

The rest of this paper is structured as follows. In Section 2 we introduce related works. In Section 3 we describe our research motivation. In Section 4 we present basic theories, including the syntax and semantics of description logic and the description of privacy properties. In Section 5 we put forward the privacy policy negotiation method, including the pre-negotiation of privacy policy, the exchange of privacy disclosure assertions, and the framework of privacy policy negotiation. In Section 6 we present a case study to demonstrate the feasibility and correctness of our method. In Section 7 we evaluate the algorithm and analyze the limitations of our method. In the end we conclude and point out future works in Section 8.

2. Related works

We classify the related works on privacy protection into computing-process-oriented and data-oriented privacy protection. The former falls into five categories: modeling and verification of privacy requirements, matching, negotiation, disclosure, and risk. The latter falls into three categories: obfuscation, encryption and anonymity of privacy data. In the mean time, we compare them by contribution, applied computing paradigm, support for service composition and support for semantics, and highlight our work in the table. In this paper, we mainly discuss the related works on privacy policy negotiation. Zhu et al. [16] proposed a semantic web service privacy framework, which defines a privacy ontology using DAML-S and allows the service provider to clarify the privacy data required in the service input. This framework also provides a privacy negotiation protocol through which user and service provider can negotiate automatically. El-Khatib et al. [17] presented a negotiation protocol addressing the inconsistency between user and service provider privacy policies, and also put forward a privacy system in which privacy policies based on P3P can be defined. Yan et al. [18] set up a framework of parsimonious semantic trust negotiation, which greatly reduces the degree of disclosed privacy identity information without exchanging entire attribute certificates. Zhu et al. and Yan et al. research only the negotiation method, not a specific computing paradigm. El-Khatib et al. research privacy policy negotiation in service computing based on P3P, which applies only to web sites and does not support service composition; moreover, privacy information described by P3P has no semantics and can hardly be extracted and negotiated automatically. Our work describes privacy attributes based on description logic, and supports service composition with semantics and with automatic extraction and negotiation of privacy information. Therefore, our work supports privacy policy negotiation in cloud computing. The comparison is detailed in Table 1.

3. Motivations

To clarify our research issue explicitly, we present an application scenario. Suppose Tom wants to buy a commodity from seller B through service A in cloud computing. Service A requires Tom to input his sensitive privacy information, such as real name, bank account, mobile phone number and detailed address. Without negotiating a privacy agreement with service A, Tom may worry about two aspects.

(1) Privacy information may be illegally used or propagated by service A or seller B. Because there is no privacy agreement, Tom cannot sue service A or seller B to recover financial or spiritual losses. If Tom does not eagerly want this service, once a privacy conflict with the service provider occurs, Tom will stop the service and select other services. This scenario is shown in Fig. 1a.

(2) If Tom eagerly wants the service, he provides his sensitive privacy information to service A. However, the privacy information is disclosed, causing financial or spiritual losses. This scenario is shown in Fig. 1b.

In this paper, our motivation is to build a service that automatically provides privacy policy negotiation for both user and service provider in cloud computing. Through this service, a privacy agreement is obtained, protecting the privacy rights of the user and constraining the service provider.

4. Basic theories

4.1. Syntax and semantics of description logic

Description logic is the basis of the Ontology Web Language for Services (OWL-S); it is a decidable subset of first-order logic and a formalism for representing knowledge. Description logic is also called term logic, terminological knowledge representation language, concept language, and term representation language. Description logic is composed of concepts, roles and individuals; complex concepts and roles can be described with simple ones. In this paper, we build a privacy negotiation model between service provider and user on the basis of description logic, transforming the pre-negotiation of privacy policy into a decidability problem for the Tableau algorithm. Suppose A and B are atomic concepts, C and D are concept descriptions, φ and φ′ are atomic formulas, p and q represent individuals, and R and S represent atomic roles. The basic constructors include atomic negation (¬), intersection (⊓), value restriction (∀) and limited existential quantification (∃). The basic description logic is called ALC. All concept descriptions in ALC can be generated by syntax rule (1):

C, D → A | {p} | ¬A | C ⊓ D | C ⊔ D | ∀R.C | ∃R.C | ⊥ | ⊤    (1)

All formulas in ALC can be obtained from the atomic formulas by rule (2):

φ, φ′ → C(p) | R(p, q) | ¬φ | φ ∨ φ′ | φ ∧ φ′ | φ → φ′ | true | false    (2)

The syntax and semantics of ALC are shown in Table 2.
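To make rule (1) concrete, here is a small sketch of ours (not part of the paper) that encodes ALC concept descriptions as a Python AST and computes the negation normal form nnf() that the Tableau algorithm of this section relies on; all identifiers are illustrative.

```python
from dataclasses import dataclass

# Concept constructors of ALC, mirroring syntax rule (1):
# C, D -> A | ¬A | C ⊓ D | C ⊔ D | ∀R.C | ∃R.C
@dataclass(frozen=True)
class Atom:            # atomic concept A
    name: str

@dataclass(frozen=True)
class Not:             # negation ¬C
    c: object

@dataclass(frozen=True)
class And:             # intersection C ⊓ D
    left: object
    right: object

@dataclass(frozen=True)
class Or:              # union C ⊔ D
    left: object
    right: object

@dataclass(frozen=True)
class Forall:          # value restriction ∀R.C
    role: str
    c: object

@dataclass(frozen=True)
class Exists:          # limited existential quantification ∃R.C
    role: str
    c: object

def nnf(c):
    """Push negations inward so they apply to atoms only (negation normal form)."""
    if isinstance(c, Not):
        inner = c.c
        if isinstance(inner, Atom):
            return c                                    # ¬A is already in NNF
        if isinstance(inner, Not):
            return nnf(inner.c)                         # ¬¬C = C
        if isinstance(inner, And):                      # ¬(C ⊓ D) = ¬C ⊔ ¬D
            return Or(nnf(Not(inner.left)), nnf(Not(inner.right)))
        if isinstance(inner, Or):                       # ¬(C ⊔ D) = ¬C ⊓ ¬D
            return And(nnf(Not(inner.left)), nnf(Not(inner.right)))
        if isinstance(inner, Forall):                   # ¬∀R.C = ∃R.¬C
            return Exists(inner.role, nnf(Not(inner.c)))
        if isinstance(inner, Exists):                   # ¬∃R.C = ∀R.¬C
            return Forall(inner.role, nnf(Not(inner.c)))
    if isinstance(c, And):
        return And(nnf(c.left), nnf(c.right))
    if isinstance(c, Or):
        return Or(nnf(c.left), nnf(c.right))
    if isinstance(c, Forall):
        return Forall(c.role, nnf(c.c))
    if isinstance(c, Exists):
        return Exists(c.role, nnf(c.c))
    return c

# Example using Table 2's instance names: ¬∃hasIDNumber.Name ==> ∀hasIDNumber.¬Name
print(nnf(Not(Exists("hasIDNumber", Atom("Name")))))
```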


Table 1. Comparison of related works (N/A: not applicable).

Computing process oriented privacy protection:
- Modeling and verification: Hamadi et al. [10], conceptual modeling of privacy-aware Web service protocols (service computing); Guermouche et al. [11], privacy-aware Web service protocol replaceability (service computing); Mokhtari et al. [12], verification of privacy timed properties (service computing); Liu et al. [13], minimal privacy authorization in Web services collaboration (service computing).
- Matching: Barth et al. [14], privacy and utility in business processes (service computing); Zhiqiang et al. [15], privacy-protection policy for pervasive computing (pervasive computing).
- Negotiation: Zhu et al. [16], role-based collaboration and its kernel mechanisms (N/A); El-Khatib et al. [17], a privacy negotiation protocol for Web services (service computing); Yan et al. [18], parsimonious semantic trust negotiation (N/A); our work, supporting negotiation mechanism privacy authority method (cloud computing, with support for both service composition and semantics).
- Disclosure: Kolter et al. [19], visualizing past personal data disclosures (service computing); Liu et al. [20], analysis of the minimal privacy disclosure (service computing).
- Risk: Yu et al. [21], modeling and measuring privacy risks (service computing); Jason et al. [22], privacy risk models (ubiquitous computing).

Data oriented privacy protection:
- Obfuscation: Ni et al. [23], a mixed mode data obfuscation method AENDO (N/A); Bakken et al. [24], anonymity and desensitization of usable data sets (N/A).
- Encryption: Bao et al. [25], an efficient and practical scheme for privacy protection in the e-commerce of digital goods (N/A); Gilburd et al. [26], k-TTP, a new privacy model for large-scale distributed environments (N/A).
- Anonymity: Ye et al. [27], anonymizing classification data using rough set theory (N/A); Sweeney [28], achieving k-anonymity privacy protection using generalization and suppression (N/A).

Fig. 1a. Service terminated because of a privacy conflict (Tom offers nickName, officePhone and a non-detailed address, while cloud service A demands realName, mobilePhone and detailedAddress, so the interaction ends).

Fig. 1b. Service proceeds and privacy information is disclosed (Tom provides realName, mobilePhone and detailedAddress to cloud service A, which illegally uses and propagates them).

The Tableau algorithm detects satisfiability among concepts in description logic. Since reasoning problems in description logic can be reduced to concept satisfiability, most reasoners, such as Pellet and FaCT, use the Tableau algorithm. Suppose the negation normal form of concept A is nnf(A), and the notation [path] on each concept records the path by which the concept was generated. The reasoning rules of the Tableau algorithm are as follows:

(1) Extension rule: suppose A is an atomic concept and A ⊑ B; if A[path] ∈ 𝒜(x) and nnf(B) ∉ 𝒜(x), then 𝒜(x) := 𝒜(x) ∪ {nnf(B)[path.A]}.

Table 2. Syntax and semantics of ALC.

Constructor | Syntax | Semantics | Instance
Atomic concept | A | A^I ⊆ Δ^I | Name
Atomic relationship | R | R^I ⊆ Δ^I × Δ^I | hasIDNumber
Atomic negation | ¬A | Δ^I \ A^I | ¬Name
Intersection | A ⊓ B | A^I ∩ B^I | Name ⊓ IDNumber
Value restriction | ∀R.C | {x | ∀y: ⟨x, y⟩ ∈ R^I → y ∈ C^I} | ∀hasIDNumber.Name
Limited existential quantification | ∃R.C | {x | ∃y: ⟨x, y⟩ ∈ R^I ∧ y ∈ C^I} | ∃hasIDNumber.Name

(2) ⊔ rule: if C₁ ⊔ C₂ ∈ 𝒜(x) and {C₁, C₂} ∩ 𝒜(x) = ∅, then 𝒜(x) := 𝒜(x) ∪ {C} for some C ∈ {C₁, C₂}.
(3) ⊓ rule: if C₁ ⊓ C₂ ∈ 𝒜(x) and {C₁, C₂} ⊄ 𝒜(x), then 𝒜(x) := 𝒜(x) ∪ {C₁, C₂}.
(4) ∃ rule: if ∃S.C ∈ 𝒜(x) and x has no S-successor y with C ∈ 𝒜(y), then add a node y, set 𝒜(x, y) = S and 𝒜(y) = {C}.
(5) ∀ rule: if ∀S.C ∈ 𝒜(x), y is an S-successor of x, and C ∉ 𝒜(y), then 𝒜(y) := 𝒜(y) ∪ {C}.
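For illustration only (not the authors' reasoner), the sketch below applies the ⊓, ⊔, ∃ and ∀ rules above to decide satisfiability of a concept already in negation normal form; the extension rule, TBox axioms and blocking are omitted, which suffices for the acyclic examples in this paper.

```python
import itertools

# Concepts are NNF tuples: ('atom', 'A'), ('not', ('atom', 'A')),
# ('and', C, D), ('or', C, D), ('exists', 'R', C), ('forall', 'R', C).

def satisfiable(concept):
    """Toy ALC tableau: True iff `concept` (already in NNF) is satisfiable.
    No TBox, no extension rule, no blocking -- enough for acyclic examples."""
    fresh = itertools.count(1)

    def expand(labels, edges):
        changed = True
        while changed:                                     # deterministic rules first
            changed = False
            for x in list(labels):
                for c in list(labels[x]):
                    if c[0] == 'and':                      # (3) the ⊓ rule
                        for part in (c[1], c[2]):
                            if part not in labels[x]:
                                labels[x].add(part)
                                changed = True
                    elif c[0] == 'exists':                 # (4) the ∃ rule
                        succs = edges.get((x, c[1]), set())
                        if not any(c[2] in labels[y] for y in succs):
                            y = next(fresh)
                            labels[y] = {c[2]}
                            edges.setdefault((x, c[1]), set()).add(y)
                            changed = True
                    elif c[0] == 'forall':                 # (5) the ∀ rule
                        for y in edges.get((x, c[1]), set()):
                            if c[2] not in labels[y]:
                                labels[y].add(c[2])
                                changed = True
        for x in labels:                                   # clash: A and ¬A together
            for c in labels[x]:
                if c[0] == 'not' and c[1] in labels[x]:
                    return None
        for x in labels:                                   # (2) the ⊔ rule: branch
            for c in labels[x]:
                if c[0] == 'or' and c[1] not in labels[x] and c[2] not in labels[x]:
                    for pick in (c[1], c[2]):
                        trial = {n: set(s) for n, s in labels.items()}
                        trial[x].add(pick)
                        if expand(trial, {k: set(v) for k, v in edges.items()}) is not None:
                            return labels
                    return None
        return labels

    return expand({0: {concept}}, {}) is not None

# Name ⊓ ¬Name clashes; so does ∃hasID.Name ⊓ ∀hasID.¬Name.
print(satisfiable(('and', ('atom', 'Name'), ('not', ('atom', 'Name')))))      # False
print(satisfiable(('and', ('exists', 'hasID', ('atom', 'Name')),
                          ('forall', 'hasID', ('not', ('atom', 'Name'))))))   # False
print(satisfiable(('or', ('atom', 'VIP'), ('not', ('atom', 'VIP')))))         # True
```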

4.2. Description of privacy property

By analyzing the description documents of the atomic services that join service composition in cloud computing, the negotiation engine obtains, from the input and pre-condition, the user privacy information to be used or disclosed. In the mean time, by analyzing the user privacy requirement, the negotiation engine obtains the privacy information the user wants to protect. Both kinds of obtained privacy information are known as privacy attributes. Under the framework of privacy policy negotiation, conflict detection over privacy attributes and the exchange of privacy disclosure assertions between the negotiating parties are done automatically by the respective negotiation engines. Therefore, privacy policy assertions need to be machine-understandable and need to support contextual semantic reasoning. In this paper, we take advantage of ontology and description logic to describe and reason about the privacy policy at the bottom of the framework. Before elaborating the relevant theory of privacy description, we present an instance that is used throughout the text.

Instance: The participants are Seller, Buyer, ES (service composer), Bank, Shipper, and Post office. ES has a business license issued by the Industrial and Commercial Bureau and is certified as a legal e-commerce platform. ES can issue reputation certification to those VIP sellers who have a reputation value over 600, or a credit over 6000 issued by a bank. The Bank can issue credit card certification to both Seller and Buyer. One day, Tom wants to purchase some furniture from service provider corporation S via the cloud service composer CSC. Tom has the following privacy requirements. Furniture corporation S had better be a VIP seller. In the transaction, only a VIP seller can obtain Tom's realName and phoneNumber (mobile phone), while a non-VIP seller can only obtain Tom's nickName or officePhoneNumber. Any phone number can only be provided to corporation S and the Shipper. Tom's bank account can only be provided to the Bank. Address and zip code can only be provided to the Shipper. Within a certain period after the transaction is done, all participants have to delete all privacy information automatically. Maintenance service is provided through the tracking number and phone number. Corporation S especially requires feedback about service quality, obtained through phone number and address, from those customers whose order amount is over $20,000. Considering the above requirements, the point is whether Tom can successfully negotiate with S to obtain an interaction sequence and then place an order.

Fig. 2. Ontology description of privacy attributes (a ⊃ b denotes class inclusion; b ∈ a denotes membership; PA: Privacy Attribute).

4.2.1. Privacy property description for service provider

We obtain the relationships among privacy attributes through mapping onto knowledge ontology, and then describe them with an ontology tree. The ontology tree of privacy attributes is shown in Fig. 2. Thing is the root node, the superclass of all privacy attributes. Except for the bottom layer, the relationship between layers is subsumption, namely the relationship of class and subclass; for example, class Name includes the subclasses realName and nickName. Only the bottom layer belongs to its upper layer; for example, Country, Province and City belong to class Address.

Definition 1 (Privacy attribute of service provider PA-S). We describe a privacy attribute as a 5-tuple, PA-S = ⟨Issuer, Owner, Subject, PDA-S, Signature⟩. Issuer is the mark of the privacy attribute, recording the parent class and child class of a privacy attribute class. Owner is a participant of the service composition and the owner of the privacy attribute. Subject is the name of the privacy attribute. PDA-S is the Privacy Disclosure Assertion of the Service. Signature is a digital signature. Each privacy attribute corresponds to one concept in the privacy attribute ontology. Subject can be described as follows:

Subject = C(op₁:I₁, op₂:I₂, op₃:I₃, …, opₙ:Iₙ) ∨ C(dp₁:D₁, dp₂:D₂, dp₃:D₃, …, dpₙ:Dₙ),

where C represents a class of the privacy attribute ontology, opᵢ and dpᵢ represent an object property and a data property of the class, respectively, satisfying ∀(i ≠ j) → (opᵢ ≠ opⱼ) ∧ (dpᵢ ≠ dpⱼ); Iᵢ represents an instance of the privacy attribute ontology, and Dᵢ is a certain value or constant.

Definition 2 (Delegation of authority statement DAS). A DAS is part of a privacy disclosure assertion: the DAS must be satisfied for each disclosure of a privacy attribute, to prove whether a service participant has authority over the privacy attribute. It can be expressed as below:


DAS: [(serverConstr • authorizer) ⇐ assertion]

Suppose authorizer = {official, serviceComp}, where official represents an official organization, serviceComp represents a service composer, and serverConstr represents the rule or law constraints on service participants issued by an official organization or service composer; for example, ES is an e-commerce platform. The DAS clarifies that service participants must satisfy the assertion expression. A valid DAS must include a signature on the assertion by the official organization or service composer, and in the mean time this signature is included in the privacy attribute.

Example 1. The DAS issued by ES is as follows: to be a VIP user of ES, the user must have a reputation value over 600, or a credit over 6000 issued by a bank. It can be expressed as:

VIP • ES ⇐ credit[>6000](rating) • Bank ∨ reputation[>600](value) • ES    (3)
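Read operationally, formula (3) is a disjunctive credential check. A minimal sketch of ours follows; the credential encoding is an assumption, while the thresholds come from Example 1.

```python
def is_vip(credentials):
    """DAS of formula (3): VIP•ES ⇐ credit[>6000](rating)•Bank ∨ reputation[>600](value)•ES.

    credentials: dict of (attribute, issuer) -> value, each entry standing for
    a signed assertion from that issuer."""
    return (credentials.get(('credit', 'Bank'), 0) > 6000
            or credentials.get(('reputation', 'ES'), 0) > 600)

print(is_vip({('credit', 'Bank'): 6500}))    # True: bank credit over 6000
print(is_vip({('reputation', 'ES'): 580}))   # False: reputation too low
```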

Definition 3 (Privacy disclosure assertion PDA-S). A privacy disclosure assertion is a constraint assertion under which a certain service participant may own a certain privacy attribute. It can be expressed as below:

PDA-S = subjConstr H ownerConstr,
subjConstr = C(op₁:{ow₁, …, owₙ}, …, opₙ:{ow₁, …, owₙ}) ∧ C(op₁:ow₁(vt₁), …, opₙ:owₙ(vtₙ)),
ownerConstr = DAS.

A privacy disclosure assertion PDA-S is composed of two parts: subjConstr, the constraint on the subject of the privacy attribute, and ownerConstr, the constraint on the service participant who owns the privacy attribute. subjConstr mainly constrains the ownership of the privacy attribute and the validity period during which service participants may use it. C represents the class of the privacy attribute, specifying that privacy attribute exchange can only proceed within the corresponding class or its subclasses. In op₁:{ow₁, …, owₙ}, op₁ represents an instance of the class and {ow₁, …, owₙ} the service participants corresponding to that instance; op₁:{ow₁, …, owₙ}, …, opₙ:{ow₁, …, owₙ} thus covers all instances of the class owned by service participants. op₁:ow₁(vt₁), …, opₙ:owₙ(vtₙ) represents the validity time for which each service participant owns the privacy attribute instance. If the data domain of vtᵢ is integer, then owᵢ is a unary predicate on the data domain of vtᵢ; common predicates are ≥a or ≤a, where a is a constant. ownerConstr constrains the service participants who own the privacy attributes; ownerConstr = DAS means that service participants who own privacy attributes must also meet the DAS requirement.

Example 2. Corporation S plans to sell furniture via ES on the internet. The privacy disclosure assertion for the user's address issued by ES is as follows. Supposing corporation S is a VIP seller of ES, and ES requires that the buyer's address can only be disclosed to the shipper and must be deleted within 2 h after the goods are delivered and the deal is finished, we can express it as:

address:shipper ∧ [address: ≤2h(validtime)] H Seller:VIP • ES    (4)

Substituting formula (3) into formula (4), we obtain:

address:shipper ∧ {address: ≤2h(validtime)} H credit[>6000](rating) • Bank ∨ reputation[>600](value) • ES    (5)

Definition 4 (Privacy disclosure strategy PDS). A PDS is an ordered sequence of privacy disclosure assertions, namely PDS = PDA-S₁ ∧ PDA-S₂ ∧ PDA-S₃ ∧ ⋯ ∧ PDA-Sₙ.

4.2.2. Privacy property description for user

Definition 5 (Privacy attribute of user PA-U). The description of PA-U is similar to that of PA-S. We also describe it as a 5-tuple, PA-U = ⟨Issuer, Service, Subject, PDA-U, Signature⟩, where Issuer is the mark of the privacy attribute, recording the parent class and child class of a privacy attribute class; Service is the object to which the personal privacy information is disclosed; Subject is the name of the privacy attribute; PDA-U is the Privacy Disclosure Assertion of the User; and Signature is a digital signature. The description of Subject here is the same as in Definition 1.

Definition 6 (Privacy disclosure assertion of user PDA-U). A privacy disclosure assertion of the user is a constraint assertion under which a certain service provider may own a certain privacy attribute. It can be expressed as below:

PDA-U = subjConstr H serverConstr,
subjConstr = C(op₁:{ow₁, …, owₙ}, …, opₙ:{ow₁, …, owₙ}) ∧ C(op₁:ow₁(vt₁), …, opₙ:owₙ(vtₙ)),
serverConstr = trustDegree.

A privacy disclosure assertion PDA-U is composed of two parts: subjConstr, the constraint on the subject of the privacy attribute, and serverConstr, the constraint on the service provider who owns the privacy attribute. The expression of subjConstr is the same as in Definition 3, while serverConstr is determined by the user's trust degree in the service or service provider. The trust degree can be expressed as trustDegree = {S, DAS, Re}, where S represents Security, certifying the truth and integrity of data and the trustworthiness of QoS; DAS is the delegation of authority statement owned by the service or service provider; and Re represents the reputation of the service or service provider [29]. A DAS is usually issued by an official organization or the service composer to a service provider, stating, for example, whether the Seller is a VIP, or whether the Shipper or Bank has a business license issued by an official organization. We assume in this paper that all shippers and banks have business licenses.

Definition 7 (Privacy preference PP). A PP is an ordered sequence of privacy disclosure assertions, namely

PP = PDA-U₁(C₁(op₁)) ∧ PDA-U₂(C₂(op₂)) ∧ PDA-U₃(C₃(op₃)) ∧ ⋯ ∧ PDA-Uₙ(Cₙ(opₙ)).

Example 3. Tom requires that corporation S had better be a VIP seller of ES. Only to a VIP seller can Tom's real name and mobile phone number be disclosed; to a non-VIP seller, only Tom's nickname or office phone number can be disclosed. The bank account can only be provided to the bank, and the address and zip code can only be provided to the shipper. Once the transaction is done, ES and all service participants must delete all privacy information within 20 min. The privacy disclosure assertions are shown in Table 3. Therefore, the privacy preference is:

PP = PDA-U₁(realName) ∧ PDA-U₂(phoneNumber) ∧ PDA-U₃(nickName ∨ officePhoneNumber) ∧ PDA-U₄(creditCard) ∧ PDA-U₅(address ∧ zipCode) ∧ PDA-U₆(allPA).

Table 3. Instance of privacy disclosure assertions.

Privacy requirement | PDA-U
realName can only be provided to a VIP seller of ES | PDA-U₁ = realName:[Seller:VIP • ES]
phoneNumber (mobile phone) can only be provided to a VIP seller of ES | PDA-U₂ = phoneNumber:[Seller:VIP • ES]
nickName or officePhoneNumber can be disclosed to a non-VIP seller of ES | PDA-U₃ = nickName ∨ officePhoneNumber:[Seller:¬VIP • ES]
The bank account can only be provided to the bank | PDA-U₄ = creditCard:Bank
Address and zipCode can only be provided to the shipper or post office | PDA-U₅ = address ∧ zipCode:shipper ∨ postoffice
Once the transaction is done, ES and all participants must delete all privacy information within 20 min | PDA-U₆ = success ∧ [allPA: ≤20 min(validtime)]
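The assertions in Table 3 are straightforward to mechanize. In the sketch below (our illustration, not the paper's PNL engine; the recipient and validity encodings are assumptions), Tom's PDA-U entries are stored as data and a proposed disclosure is checked against them; PDA-U₆'s 20-minute deletion deadline is folded into each entry's validity.

```python
from dataclasses import dataclass

@dataclass
class PDAU:
    """One privacy disclosure assertion of the user (simplified from Table 3)."""
    attributes: frozenset      # subject:name values the assertion covers
    recipients: frozenset      # who may receive them
    validity_min: int          # retention limit in minutes (from PDA-U6)

PP = [  # Tom's privacy preference: an ordered sequence of PDA-U (Definition 7)
    PDAU(frozenset({'realName'}), frozenset({'VIPSeller'}), 20),
    PDAU(frozenset({'phoneNumber'}), frozenset({'VIPSeller'}), 20),
    PDAU(frozenset({'nickName', 'officePhoneNumber'}), frozenset({'nonVIPSeller'}), 20),
    PDAU(frozenset({'creditCard'}), frozenset({'Bank'}), 20),
    PDAU(frozenset({'address', 'zipCode'}), frozenset({'Shipper', 'PostOffice'}), 20),
]

def may_disclose(attribute, recipient, retention_min):
    """True iff some PDA-U permits this disclosure within its validity time."""
    return any(attribute in a.attributes
               and recipient in a.recipients
               and retention_min <= a.validity_min
               for a in PP)

print(may_disclose('realName', 'VIPSeller', 15))     # True
print(may_disclose('realName', 'nonVIPSeller', 15))  # False: wrong recipient
print(may_disclose('address', 'Shipper', 30))        # False: exceeds 20 min
```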

We take advantage of the above Privacy Negotiation Language (PNL) syntax to describe privacy attributes and service privacy strategy assertions in a simplified way, and to make user privacy requirements and the privacy strategy of the service provider easy to understand. The basis of PNL is description logic; therefore, classes, objects and data in privacy attributes correspond to concepts, roles and characters in description logic, respectively. We design the following privacy description logic function PDL():

serverConstr • authorizer → serverConstr ⊓ authorizer;
PDA₁ ∨ PDA₂ → PDL(pda₁) ⊔ PDL(pda₂);
PDA₁ ∧ PDA₂ → PDL(pda₁) ⊓ PDL(pda₂);
(pda) → (PDL(pda));
subjConstr H ownerConstr → ∃hasID.(PDL(subjConstr)) ⊓ ∃authorizerIs.(PDL(ownerConstr));
subjConstr H serverConstr → ∃hasID.(PDL(subjConstr)) ⊓ ∃userIs.(PDL(serverConstr));
C(op₁:{ow₁, …, owₙ}, …, opₙ:{ow₁, …, owₙ}) ∧ C(op₁:ow₁(vt₁), …, opₙ:owₙ(vtₙ)) → C ⊓ (∃op₁.{ow₁, …, owₙ} ⊓ ⋯ ⊓ ∃opₙ.{ow₁, …, owₙ} ⊓ ow₁(vt₁) ⊓ ⋯ ⊓ owₙ(vtₙ)).
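As one way to mechanize the PDL() mapping (our sketch; the tuple encoding of PNL assertions is an assumption), the translator below emits DL concept strings following the rules above.

```python
def pdl(assertion):
    """Translate a tuple-encoded PNL assertion into a DL concept string,
    following the PDL() rules above."""
    tag = assertion[0]
    if tag == 'atom':          # a plain constraint, kept as a concept name
        return assertion[1]
    if tag == 'or':            # PDA1 ∨ PDA2 -> PDL(pda1) ⊔ PDL(pda2)
        return f"({pdl(assertion[1])} ⊔ {pdl(assertion[2])})"
    if tag == 'and':           # PDA1 ∧ PDA2 -> PDL(pda1) ⊓ PDL(pda2)
        return f"({pdl(assertion[1])} ⊓ {pdl(assertion[2])})"
    if tag == 'owner':         # subjConstr H ownerConstr
        return f"(∃hasID.{pdl(assertion[1])} ⊓ ∃authorizerIs.{pdl(assertion[2])})"
    if tag == 'server':        # subjConstr H serverConstr
        return f"(∃hasID.{pdl(assertion[1])} ⊓ ∃userIs.{pdl(assertion[2])})"
    raise ValueError(f"unknown assertion: {assertion}")

# Shaped like formula (4): address:shipper ∧ [address: ≤2h(validtime)] H Seller:VIP•ES
pda = ('owner',
       ('and', ('atom', 'AddressToShipper'), ('atom', 'Valid2h')),
       ('atom', 'VIPSeller'))
print(pdl(pda))
# (∃hasID.(AddressToShipper ⊓ Valid2h) ⊓ ∃authorizerIs.VIPSeller)
```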

5. Privacy policy negotiation

Privacy policy negotiation is composed of two steps. Step 1: with the Tableau algorithm of description logic, by detecting conflicts in the privacy attribute collections, we obtain the Privacy Knowledge Base (PKB) that satisfies the user requirements. Then, by reasoning over the PKB, we get the user's sensitive privacy attribute collections and, in the mean time, the privacy attribute collections that satisfy the user requirement, namely the privacy attribute sequences that can be used for privacy policy negotiation. Step 2: through the ordered exchange of privacy disclosure assertions along these sequences between user and service provider, we obtain the privacy policy that meets the privacy requirements of both.

5.1. Pre-negotiation of privacy policy

Definition 8 (Instance assertion axiom φ). For the instance collection corresponding to the user's privacy attribute classes, the instance assertion axiom φ can be expressed as φ ≡ ∀Cᵢ(opᵢ) ⊓ ¬∃Cₖ(opₖ), where {Cᵢ(opᵢ), Cₖ(opₖ)} is a sensitive privacy attribute collection.

Definition 9 (User privacy knowledge base UPKB). The UPKB can be expressed as UPKB = ⟨subject:name, Tableau(), φ⟩, in which subject:name is {C₁(op₁, op₂, …, opₙ), C₂(op₁, op₂, …, opₙ), …, Cₙ(op₁, op₂, …, opₙ)}, representing the objects corresponding to the privacy attribute classes; Tableau() is the conflict detecting algorithm over privacy attributes; and φ is the instance assertion axiom transformed from the user privacy requirement. By taking advantage of the algorithm Tableau(subject:name, φ), we can check whether the privacy attribute collection subject:name meets the user privacy requirement.

Definition 10 (Sensitive privacy attribute collection SPAC). Suppose PP ⊨ {C₁(op₁), C₂(op₂), C₃(op₃), …, Cᵢ(opᵢ), …, Cₙ(opₙ)}. If Cₖ(opₖ) is the subject:name of another privacy attribute while UPKB ⊭ {C₁(op₁), C₂(op₂), C₃(op₃), …, Cᵢ(opᵢ), …, Cₖ(opₖ), …, Cₙ(opₙ)}, namely Cᵢ(opᵢ) ⊓ Cₖ(opₖ) = ∅, which means that Cᵢ(opᵢ) and Cₖ(opₖ) cannot exist at the same time, then we call {Cᵢ(opᵢ), Cₖ(opₖ)} a sensitive privacy attribute collection.

Definition 11 (Min-privacy attribute collection MPAC). A privacy attribute collection is an MPAC if it satisfies the following conditions:

(1) It contains no sensitive privacy attribute collection SPAC, namely UPKB ⊨ {C₁(op₁), C₂(op₂), C₃(op₃), …, Cᵢ(opᵢ), …, Cₙ(opₙ)};
(2) It satisfies the input and pre-condition of the service and thus enables the service to be provided to the user: execute(service) = true;
(3) There is no redundant privacy attribute, namely subject:name ∩ (service(input) ∪ service(pre_condition)) = ∅.

MPAC algorithm: the MPAC algorithm is based on the Tableau algorithm, and we obtain the MPAC with it. The first line inputs the instance collection corresponding to the privacy attribute classes; the second line outputs the MPAC. The third and fourth lines initialize the privacy attribute instance collection stack and the MPAC stack, respectively. The fifth line pushes the privacy attribute instance collection onto its stack. The sixth to fourteenth lines form an iteration: while the privacy attribute instance collection stack is not empty, pop the top element and start conflict detection against the instance assertion axiom by taking advantage of the Tableau algorithm. If there is no conflict, push the instance onto the min-privacy attribute collection stack; otherwise search for the brother node of the instance's class in the privacy ontology tree, and push the instance corresponding to the brother node onto the privacy attribute instance collection stack. From the fifteenth line to the end: if the privacy attribute collection obtained after conflict detection enables the service to operate successfully and there are no redundant privacy attributes, return the privacy attribute collection, which is the MPAC; otherwise search for and rebind a new service.


Algorithm 1. MPAC(subject:name, φ)

1  Input: subject:name, φ
2  Output: MPAC
3  Init Stack(subject:name);
4  Init Stack(MPAC);
5  Push(Stack(subject:name), {C₁(op₁), C₂(op₂), C₃(op₃), …, Cᵢ(opᵢ), …, Cₙ(opₙ)});
6  while (Stack(subject:name) ≠ ∅) do
7      Pop(Stack(subject:name), Cᵢ(opᵢ));  // pop the top of Stack(subject:name)
8      Tableau(Cᵢ(opᵢ), φ);  // detect conflicts of Cᵢ(opᵢ) with the instance assertion axiom φ
9      if (conflict = false) do
10         Push(Stack(MPAC), Cᵢ(opᵢ));  // push Cᵢ(opᵢ) onto Stack(MPAC)
11     else
12         Push(Stack(subject:name), brother(Cᵢ(opᵢ)));  // substitute Cᵢ(opᵢ) with its brother node and push that onto Stack(subject:name)
13     end if
14 end while
15 if (execute(service) = true ∧  // check whether the service can be executed successfully
16     subject:name ∩ (service(input) ∪ service(pre_condition)) = ∅)  // check redundancy
17     return true;
18 else
19     rebind service;
20 end if
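A minimal executable reading of Algorithm 1 (our sketch, not the authors' code): the conflict oracle tableau_conflict() and the ontology lookup brother() are stand-ins, and the paper's redundancy test from Definition 11 is simplified here to "the collection covers the service's input and pre-condition".

```python
def mpac(subject_name, phi, tableau_conflict, brother,
         service_inputs, service_preconditions):
    """Pre-negotiation (Algorithm 1): derive a min-privacy attribute collection.

    subject_name: privacy attribute instances Ci(opi) required by the composer;
    phi: the user's instance assertion axiom;
    tableau_conflict(attr, phi): True if attr conflicts with phi;
    brother(attr): brother node of attr's class in the ontology tree, or None."""
    stack = list(subject_name)          # Stack(subject:name)
    mpac_set = []                       # Stack(MPAC)
    while stack:
        attr = stack.pop()
        if not tableau_conflict(attr, phi):
            mpac_set.append(attr)       # no conflict: keep the attribute
        else:
            substitute = brother(attr)  # conflict: try the brother node
            if substitute is None:
                return None             # no substitute: service must be rebound
            stack.append(substitute)
    needed = service_inputs | service_preconditions
    if not needed <= set(mpac_set):     # service cannot execute
        return None                     # rebind service
    return mpac_set

# Toy run of the paper's example: Address conflicts with phi, while its
# brother node AddressWithoutCommunity does not.
conflicts = {'Address'}
brothers = {'Address': 'AddressWithoutCommunity'}
result = mpac(['realName', 'Address', 'officePhone'],
              phi=None,
              tableau_conflict=lambda a, phi: a in conflicts,
              brother=lambda a: brothers.get(a),
              service_inputs={'realName', 'officePhone'},
              service_preconditions={'AddressWithoutCommunity'})
print(result)  # ['officePhone', 'AddressWithoutCommunity', 'realName']
```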

The pre-negotiation of the privacy policy is completed by the user and the service composer together. From the user privacy requirement we obtain the instance collection corresponding to the privacy attribute classes and transform it into the instance assertion axiom φ. The service composer holds the privacy attribute collection that enables the service to operate successfully for the user. Firstly, the user sends the service requirement to the service composer. Upon receiving the requirement, the service composer shows the user the privacy attribute collection required to be disclosed during service operation, namely {C₁(op₁), C₂(op₂), C₃(op₃), …, Cᵢ(opᵢ), …, Cₙ(opₙ)}. Then, at the client, the MPAC algorithm is started to check whether the required privacy attribute collection meets the user privacy requirement. The main checks are as follows:

(1) whether each single privacy attribute satisfies the user's instance assertion axiom, namely φ ⊨ Cᵢ(opᵢ);
(2) whether the privacy attribute collection satisfies the user privacy knowledge base, namely UPKB ⊨ {C₁(op₁), C₂(op₂), C₃(op₃), …, Cᵢ(opᵢ), …, Cₙ(opₙ)}, i.e., UPKB ⊨ subject:name;
(3) whether the privacy attribute collection enables the service to be executed successfully, namely execute(service) = true;
(4) whether there is any redundant privacy attribute, namely subject:name ∩ (service(input) ∪ service(pre_condition)) = ∅.

The checked result is reported from the client to the server. If the result of item (1) is φ ⊭ Cᵢ(opᵢ), the ontology tree searching algorithm is started at the server to search for the brother node brother(Cᵢ(opᵢ)) of this privacy attribute in the ontology tree. The brother node is substituted and rechecked until the user's instance assertion axiom φ is satisfied; item (2) is automatically satisfied by this time. If φ cannot be satisfied in the end, the user sends a message requiring the service composer to rebind the service. If the result of item (3) is execute(service) = false, the service composer has to rebind a new service, show the user the privacy attributes to be disclosed in service operation again, and restart checking from item (1). If the result of item (4) is subject:name ∩ (service(input) ∪ service(pre_condition)) ≠ ∅, which means there is a redundant privacy attribute, the service composer deletes it according to the user requirement and returns the result to the user for confirmation, obtaining the min-privacy attribute collection. The pre-negotiation process is then done successfully. The detailed pre-negotiation process is shown in Fig. 3 (taking a successful pre-negotiation as an example). {C₁(op₁), C₂(op₂), C₃(op₃), …, Cᵢ(opᵢ), …, Cₙ(opₙ)} is equivalent to subject:name; to show the element variation of the collection we use the former expression, and to show the whole collection we use the latter.

Axiom 1. The privacy attribute sequence satisfying the min-privacy attribute collection is the privacy attribute exchange sequence corresponding to the privacy disclosure assertions.

Fig. 3. Pre-negotiation of privacy policy.

5.2. Exchange of privacy disclosure assertion

Once the pre-negotiation between user and service composer succeeds, we obtain the sequence of subject:name, namely {C₁(op₁), C₂(op₂), C₃(op₃), …, Cⱼ(opⱼ), …, Cₙ(opₙ)}. For the user, each Cᵤ(opᵤ) has its corresponding privacy disclosure assertion PDA-U, written Cᵤ(opᵤ) ↦ PDA-U. For the service composer, each Cₛ(opₛ) has its corresponding privacy disclosure assertion PDA-S, written Cₛ(opₛ) ↦ PDA-S. These assertions set constraints on how the service provider may use the privacy attributes, which privacy attributes can be passed to the service provider, and the validity time of using each privacy attribute.

Definition 12 (PDA-U/PDA-S semantic matching SM). PDA-U/PDA-S semantic matching satisfies one of the conditions below:

(1) Cᵤ(opᵤ) ↦ PDA-U ≡ Cₛ(opₛ) ↦ PDA-S, which means that the PDA-U corresponding to the user's privacy attribute class instance is equivalent to the PDA-S corresponding to the service provider's privacy attribute class instance.
(2) Cᵤ(opᵤ) ↦ PDA-U ⊑ Cₛ(opₛ) ↦ PDA-S, which means that the PDA-U corresponding to the user's privacy attribute class instance is subsumed by the PDA-S corresponding to the service provider's privacy attribute class instance.

Suppose PDA-Sᵢ and Brother(PDA-Sᵢ) satisfy PDA-Sᵢ ∨ Brother(PDA-Sᵢ) ⊒ PDA-Uᵢ. Algorithm 2 presents the semantic matching algorithm. The first line inputs the PDA-U and PDA-S sequences corresponding to the min-privacy attribute collection obtained through pre-negotiation; the second line outputs the privacy disclosure strategy obtained after negotiation between user and service provider. The third to fifth lines initialize the PDA-U stack, the PDA-S stack and the Service Level Agreement (SLA) queue, respectively. The sixth and seventh lines push the privacy disclosure assertions {PDA-Uₙ, PDA-Uₙ₋₁, …, PDA-Uᵢ, …, PDA-U₁} and {PDA-Sₙ, PDA-Sₙ₋₁, …, PDA-Sᵢ, …, PDA-S₁} onto their stacks in reverse order. From the eighth line to the end, the algorithm iterates: while both stacks are not empty, pop the top elements PDA-Uᵢ and PDA-Sᵢ, and match the DAS of PDA-Sᵢ against the trust degree of PDA-Uᵢ. If PDA-Uᵢ is equivalent to PDA-Sᵢ, enter either of them (PDA-Uᵢ ∨ PDA-Sᵢ) into the SLA queue. If PDA-Uᵢ is subsumed by PDA-Sᵢ, or the DAS of PDA-Sᵢ matches the trust degree of PDA-Uᵢ, enter PDA-Sᵢ into the SLA queue. If PDA-Uᵢ ⊒ PDA-Sᵢ, search for the corresponding brother privacy disclosure assertion Brother(Cᵢ(opᵢ) ↦ PDA-Sᵢ) in the ontology tree, named PDA-Sₖ, push it onto Stack(PDA-S), push PDA-Uᵢ back onto Stack(PDA-U), and keep iterating until a privacy disclosure assertion satisfying the user requirement is found.

Algorithm 2. SM((PDA-Sᵢ, PDA-Uᵢ), PDA)

1  Input: {PDA-Sᵢ}, {PDA-Uᵢ}
2  Output: {PDA}
3  Init Stack(PDA-U);
4  Init Stack(PDA-S);
5  Init Queue(SLA);
6  Push(Stack(PDA-U), {PDA-Uₙ, PDA-Uₙ₋₁, …, PDA-Uᵢ, …, PDA-U₁});
7  Push(Stack(PDA-S), {PDA-Sₙ, PDA-Sₙ₋₁, …, PDA-Sᵢ, …, PDA-S₁});
8  while (Stack(PDA-U) ≠ ∅ ∧ Stack(PDA-S) ≠ ∅) do
9      Pop(Stack(PDA-U), PDA-Uᵢ);
10     Pop(Stack(PDA-S), PDA-Sᵢ);
11     Match(PDA-Sᵢ[DAS], PDA-Uᵢ[trustDegree]);  // match PDA-S[DAS] with PDA-U[trustDegree]
12     if (PDA-Uᵢ ≡ PDA-Sᵢ) do
13         EnQueue(Queue(SLA), PDA-Uᵢ ∨ PDA-Sᵢ);
14     else if (PDA-Uᵢ ⊑ PDA-Sᵢ ∨ Match(DAS, trustDegree) = true)
15         EnQueue(Queue(SLA), PDA-Sᵢ);
16     else
17         Brother(Cᵢ(opᵢ) ↦ PDA-Sᵢ);  // substitute PDA-Sᵢ with the brother node of Cᵢ(opᵢ)
18         Push(Stack(PDA-S), PDA-Sₖ);
19         Push(Stack(PDA-U), PDA-Uᵢ);
20     end if
21     end if
22     end if
23 end while
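A compact executable reading of Algorithm 2 under simplifying assumptions (ours, not the paper's implementation): assertion comparison is reduced to equiv/subsumed_by oracles and DAS matching to a boolean callback.

```python
def semantic_match(pda_s_seq, pda_u_seq, equiv, subsumed_by, match_das, brother):
    """Exchange of privacy disclosure assertions (Algorithm 2).

    pda_s_seq / pda_u_seq: assertion sequences from the pre-negotiation;
    equiv(u, s): u ≡ s; subsumed_by(u, s): u ⊑ s;
    match_das(s, u): DAS of s matches the trust degree of u;
    brother(s): brother-node assertion for s, or None."""
    stack_u = list(reversed(pda_u_seq))   # reversed push so index 1 is popped first
    stack_s = list(reversed(pda_s_seq))
    sla = []                              # Queue(SLA)
    while stack_u and stack_s:
        u = stack_u.pop()
        s = stack_s.pop()
        if equiv(u, s):
            sla.append(s)                 # either side's assertion works
        elif subsumed_by(u, s) or match_das(s, u):
            sla.append(s)                 # the service assertion is acceptable
        else:
            s2 = brother(s)               # try a brother assertion
            if s2 is None:
                return None               # negotiation fails
            stack_s.append(s2)
            stack_u.append(u)             # re-negotiate the same PDA-U
    return sla

# Toy run: validity times stand in for assertions; u ⊑ s when s retains data
# for at most as long as u allows (the case study's 20 min vs. 15 min).
sla = semantic_match([15, 15, 15], [20, 20, 20],
                     equiv=lambda u, s: u == s,
                     subsumed_by=lambda u, s: s <= u,
                     match_das=lambda s, u: False,
                     brother=lambda s: None)
print(sla)  # [15, 15, 15]
```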

The exchange of privacy disclosure assertions between user and service composer is also a semantic matching process. Firstly, at the server, PDA-Sᵢ is sent to the client according to the privacy assertion exchange sequence obtained by pre-negotiation. Upon receipt by the client, the description logic reasoner Reasoner(PDA-Sᵢ, PDA-Uᵢ) is started. There are three possible reasoning results:

(1) PDA-Uᵢ ≡ PDA-Sᵢ. Since they are equivalent, the client sends a success negotiation message to the server and puts the privacy attribute corresponding to PDA-Uᵢ or PDA-Sᵢ into the SLA.
(2) PDA-Uᵢ ⊑ PDA-Sᵢ ∨ Match(DAS, trustDegree) = true. Since PDA-Uᵢ is subsumed by PDA-Sᵢ, or the DAS of the service provider matches the user's trust degree, the client sends a success negotiation message to the server, regards PDA-Sᵢ as the final privacy disclosure assertion, and puts its corresponding privacy attribute into the SLA.
(3) PDA-Uᵢ ⊒ PDA-Sᵢ. Since PDA-Uᵢ subsumes PDA-Sᵢ, the client sends a failure negotiation message to the server. The server searches for the corresponding brother node of Cᵢ(opᵢ) and sends the PDA-Sₖ of the brother node to the client, to re-negotiate against the user's PDA-Uᵢ, and the process keeps iterating. The detailed negotiation process is shown in Fig. 4.

5.3. Framework of privacy policy negotiation

The framework of privacy policy negotiation has two layers, as shown in Fig. 5:

(1) Mapping layer, which supports the mapping between the Privacy Attribute Collection (PAC) and the Knowledge Domain Ontology (KDO), so that the semantic relationships among privacy attributes can be determined and the privacy attribute ontology can be set up. During privacy policy pre-negotiation, once conflicts are detected, the negotiation engine can substitute the conflicting attribute with a brother attribute in the ontology tree, found through the semantic relationships among privacy attributes, and find the privacy attribute sequence that satisfies the user privacy requirement.

(2) Negotiation layer, which has two periods, namely the pre-negotiation period and the Privacy Disclosure Assertion (PDA) exchange period. (I) During pre-negotiation, the negotiation engine first analyzes the user requirement document and the service input and pre-condition provided by the service provider, obtaining the user's Privacy Preference (PP) and the service's Privacy Attribute Collection (PAC), respectively. Secondly, it detects conflicts between PP and PAC, to discover privacy attributes that do not satisfy the user privacy requirement. Thirdly, the search engine substitutes each discovered privacy attribute with a brother attribute in the ontology tree by calling the mapping layer. (II) During the PDA exchange period, the PDAs corresponding to the service privacy attributes and the user privacy requirement are exchanged, and this exchange process iterates. Through this process, a PDA collection satisfying both service provider and user, namely a Privacy Disclosure Strategy (PDS), may be found. The PDS is then included in the Service Level Agreement (SLA).


Fig. 4. Exchange of privacy disclosure assertion.

Fig. 5. Privacy policy negotiation framework (the user negotiation engine, holding the Privacy Preference (PP), and the server negotiation engine, holding the Privacy Disclosure Strategy (PDS), negotiate the Delegation of Authority Statement (DAS) and exchange Privacy Disclosure Assertions (PDA) between service requester and services composer; both sides map the Privacy Attribute Collection (PAC) onto the Knowledge Domain Ontology).

6. Case study

Tom wants to purchase some furniture from service provider corporation S via the cloud service composer CSC. Corporation S is a non-VIP seller of the e-commerce service (ES). For a non-VIP seller of ES, Tom requires that, if his realName is disclosed, only his officePhone may be disclosed at the same time, and the address must not include community information. realName, AddressWithoutCommunity and officePhone can only be provided to the shipper or post office. Once the transaction is done, all service participants, including CSC, ES and corporation S, have to delete all user privacy information within 20 min. For the non-VIP seller corporation S of ES, CSC only allows corporation S to provide Tom's realName, AddressWithoutCommunity and officePhone to the shipper or post office; once the transaction is done, S and all service participants have to delete all user privacy information automatically within 15 min.

The service includes the Customer (Tom), the cloud service composer CSC, and three associated participants: the online purchase platform ES, the Seller (S), and the Post office or Shipper. Name, address, postcode and phone are the customer's personal privacy data. At the beginning of the transaction, the Customer sends an order requirement (OrdReq) to the Seller. CSC sends a message requesting the required privacy attributes (PriReq) to the three associated participants, so as to collect the relevant privacy attributes, and then informs Tom via the privacy negotiation service that the collected privacy attributes are ready. The privacy negotiation service is then triggered to pre-negotiate the privacy policy and exchange privacy disclosure assertions. Once the Seller receives the OrdReq and the success negotiation message (NegoSucc) from CSC, it selects the Post office or Shipper to send the goods to Tom, and sends the Post office requirement (PostReq) or Shipper requirement (ShipReq). Upon delivery of the goods (PostEnd/ShipEnd), the Post office or Shipper requires Tom to make the payment. Once the payment is received, the Seller is informed that the deal is done (PostSucc/ShipSucc). The transaction process is shown in Fig. 6.

We assign values to the user privacy attribute collection {subject:name} required by ES: userName (Brobo); realName (Tom); Street (YUDAO STREET); City (NANJING); Province (JIANGSU); Country (CHINA); officePhone (+86 0258686866) ∪ Mobile (+86 123456789); postCode (210016).



Fig. 6. Online purchase case.

According to user Tom's privacy requirement, we can obtain the instance assertion axiom φ, which is φ = ∃hasRealName.Name(Tom) ⊓ AddressWithoutCommunity(YUDAO STREET, NANJING CITY, JIANGSU PROVINCE, CHINA) ⊓ hasOfficePhone(Brobo, +86 0258686866), in which the non-atomic concepts Address and AddressWithoutCommunity can be expressed as: Address ≡ Community ⊓ Street ⊓ City ⊓ Province ⊓ Country; AddressWithoutCommunity ≡ Address ⊓ ∀hasAddress.¬Community. We can start conflict detection by taking advantage of Tableau():

(1) Extend the non-atomic concept AddressWithoutCommunity with the extension rule (suppose A is an atomic concept and A ⊑ B; if A[path] ∈ 𝒜(x) and nnf(B) ∉ 𝒜(x), then 𝒜(x) := 𝒜(x) ∪ {nnf(B)[path.A]}). We obtain φ = ∃hasRealName.Name(Brobo) ⊓ Address ⊓ ∀hasAddress.¬Community(YUDAO STREET, NANJING CITY, JIANGSU PROVINCE, CHINA) ⊓ hasOfficePhone(Brobo, +86 0258686866).

(2) Extend the non-atomic concept Address with the extension rule again, obtaining φ = ∃hasRealName.Name(Brobo) ⊓ Community ⊓ Street ⊓ City ⊓ Province ⊓ Country ⊓ ∀hasAddress.¬Community(YUDAO STREET, NANJING CITY, JIANGSU PROVINCE, CHINA) ⊓ hasOfficePhone(Brobo, +86 0258686866).

(3) Apply the ∃ rule of the Tableau algorithm: if ∃S.C ∈ 𝒜(x) and x has no S-successor y with C ∈ 𝒜(y), then add a node y, assign 𝒜(x, y) = S and 𝒜(y) = {C}. Simplifying the above formula, we obtain φ = Name(Tom) ⊓ hasRealName(Brobo, Tom) ⊓ Community ⊓ Street ⊓ City ⊓ Province ⊓ Country ⊓ ∀hasAddress.¬Community(YUDAO STREET, NANJING CITY, JIANGSU PROVINCE, CHINA) ⊓ hasOfficePhone(Brobo, +86 0258686866).

(4) Apply the ∀ rule of the Tableau algorithm: if ∀S.C ∈ 𝒜(x) and y is an S-successor of x with C ∉ 𝒜(y), then 𝒜(y) := 𝒜(y) ∪ {C}. Simplifying the above formula, we obtain φ = Name(Tom) ⊓ hasRealName(Brobo, Tom) ⊓ Street ⊓ City ⊓ Province ⊓ Country ⊓ hasOfficePhone(Brobo, +86 0258686866).

(5) Apply the ⊓ rule of the Tableau algorithm: if (a) C₁ ⊓ C₂ ∈ φ(x) and x is not directly blocked, and (b) {C₁, C₂} ⊄ φ(x), then φ(x) := φ(x) ∪ {C₁, C₂}. This yields φ = Name(Tom), hasRealName(Brobo, Tom), Street, City, Province, Country, hasOfficePhone(Brobo, +86 0258686866).

(6) Simplifying the above, φ = realName, Street, City, Province, Country, officePhone; substituting the values of the privacy attributes, φ = Tom, YUDAO, NANJING, JIANGSU, CHINA, +86 0258686866, which satisfies UPKB = ⟨subject:name, Tableau(), φ⟩. Therefore there are no conflicts, and this privacy attribute collection is the user privacy knowledge base.

From Definition 10 we know that Tom's {realName, Address} and {realName, mobilePhone} are sensitive privacy attribute collections. From Definition 11 we know that UPKB ⊨ {realName, AddressWithoutCommunity, officePhone} is exactly Tom's min-privacy attribute collection; if Tom did not want to disclose his office phone, the service could not be executed, and the collection would not be a min-privacy attribute collection. According to Axiom 1, {realName, AddressWithoutCommunity, officePhone} is the privacy attribute exchange sequence corresponding to the privacy disclosure assertions. To simplify the privacy disclosure assertions for the non-VIP seller of ES, we omit the serverConstr part. According to the user privacy requirement, we obtain the user privacy disclosure assertions PDA-Uᵢ:

PDA-U₁ = realName:(ES ∧ S ∧ shipper ∨ postoffice) ∧ ≤20 min(validtime);
PDA-U₂ = AddressWithoutCommunity:(ES ∧ S ∧ shipper ∨ postoffice) ∧ ≤20 min(validtime);
PDA-U₃ = officePhone:(ES ∧ S ∧ shipper ∨ postoffice) ∧ ≤20 min(validtime).

According to the service privacy strategy of the service composer, we can obtain the privacy disclosure assertions PDA-Sᵢ:

PDA-S₁ = realName:(ES ∧ S ∧ shipper ∨ postoffice) ∧ ≤15 min(validtime);
PDA-S₂ = AddressWithoutCommunity:(ES ∧ S ∧ shipper ∨ postoffice) ∧ ≤15 min(validtime);
PDA-S₃ = officePhone:(ES ∧ S ∧ shipper ∨ postoffice) ∧ ≤15 min(validtime).

We omit the transformation of the above privacy disclosure assertions by the description logic transformation function PDL() and the reasoning by Reasoner(PDA-Sᵢ, PDA-Uᵢ). According to the theory in Section 5.2, through the exchange of privacy disclosure assertions we obtain PDA-Uᵢ ⊑ PDA-Sᵢ. Therefore we enter PDA-S₁ ∧ PDA-S₂ ∧ PDA-S₃ into the SLA. The detailed negotiation process for the privacy policy is as follows:


Fig. 7. Negotiation process between Tom and CSC.

(1) Pre-negotiation phase: Tom sends an OrdReq to service provider corporation S via the cloud service composer CSC to purchase some furniture. CSC collects the privacy attributes {realName, Address, OfficePhone} required in the transaction and pre-negotiates through the client algorithm Tableau(Cᵢ(opᵢ), φ). {realName, Address} is identified as a sensitive privacy attribute collection SPAC; therefore Address does not meet the user's instance assertion axiom, namely φ ⊭ Address. Through the function brother(Address), the brother node AddressWithoutCommunity is obtained from the ontology tree, and the iteration continues. With pre-negotiation again, {realName, AddressWithoutCommunity, OfficePhone} meets the user privacy knowledge base UPKB, namely UPKB ⊨ subject:name, and the service composition can be executed normally, namely execute(service) = true, without redundant privacy attributes, namely subject:name ∩ (service(input) ∪ service(pre_condition)) = ∅. At this moment, the pre-negotiation succeeds.

(2) Privacy disclosure assertion exchange phase: once the privacy attribute collection {realName, AddressWithoutCommunity, OfficePhone} is obtained through pre-negotiation, semantic matching on the privacy disclosure assertions is triggered by Reasoner(PDA-Sᵢ, PDA-Uᵢ). The service provider's privacy disclosure assertions are stricter than Tom's, namely PDA-U₁ ⊑ PDA-S₁, PDA-U₂ ⊑ PDA-S₂ and PDA-U₃ ⊑ PDA-S₃, so the negotiation succeeds. The detailed process is shown in Fig. 7.

7. Discussion

7.1. Algorithm evaluation

In the pre-negotiation phase, in Algorithm 1, the number of elements in the user privacy attribute collection is generally less than or equal to the number of elements in the privacy attribute collection required by the service composer, namely |PA-U| ≤ |PA-S|. Therefore, the frequency of conflict detection depends on the user's sensitive privacy attribute collections. Moreover, the number of elements in the sensitive privacy attribute collections must be less than or equal to the number of user privacy attributes, namely |SPAC| ≤ |PA-U|. Supposing |SPAC| = m, the number of iterations is at most the total number of brother nodes of the m sensitive privacy attributes in the privacy ontology tree. We can then derive the sequence subject:name corresponding to the privacy disclosure assertions, namely {C₁(op₁), C₂(op₂), C₃(op₃), …, Cⱼ(opⱼ), …, Cₙ(opₙ)}.

In the privacy disclosure assertion exchange phase, in Algorithm 2, supposing |{C₁(op₁), C₂(op₂), C₃(op₃), …, Cⱼ(opⱼ), …, Cₙ(opₙ)}| = n, the frequency of privacy disclosure assertion exchange depends on the number of assertions of each C(op). Moreover, the privacy assertions corresponding to each C(op) are finite. Therefore, the number of iterations of the algorithm is at most the total number of brother nodes of each C(op) in the privacy assertion ontology tree. From the above analysis we can derive that the number of iterations of the algorithm is finite and the negotiation result can be reached; therefore, we prove the correctness and effectiveness of the algorithm.

7.2. Method limitations analysis

This paper presents a privacy policy negotiation mechanism for cloud computing. Our method has the following limitations:

(1) In cloud computing, all entities are services; our method is also a service provided to the user and the service provider. However, as a privacy policy negotiation service, it will hold a great amount of user privacy data, and we do not discuss how to guarantee the security of user privacy data inside this service. This issue relates to research on information security; avoiding malicious attacks on the internet requires encryption of the user privacy data.

(2) The structure of cloud computing is distributed: service composers and service providers are connected through the internet, and privacy policy negotiation requires internet transmission. We do not discuss how to guarantee the security of user privacy data in the process of internet transmission. This issue also relates to research on information security, and solving it requires encryption.

(3) Some services require human participation, for example courier services. During such processes, user privacy information may be disclosed because of human factors. Solving this issue requires work ethics and professional techniques.

(4) In this paper, the privacy policy is obtained through negotiation between the user and the service composer. This method can shield the uncertainty of service providers caused by virtualization. However, as the negotiation agent of the service providers, the service composer may implement the privacy policy improperly for its own interests. In the mean time, it is also hard to guarantee that service providers obey the privacy policy during service provision. Therefore, the next step is to supervise the service composer and service providers to implement the privacy policy correctly.


8. Conclusions and future work

In this paper, we present a description method for privacy properties and a negotiation mechanism for privacy policies based on description logic. Considering the service outsourcing characteristic, this paper provides a theoretical basis and an implementation method for protecting user privacy information in open cloud computing. Our method can effectively protect user privacy and prevent service providers from illegally using and propagating it. Future work is to supervise whether service providers obey the negotiated privacy policy in the process of service composition.

Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant 61272083), China Postdoctoral Science Foundation (Grant 20110491411), Jiangsu Planned Projects for Postdoctoral Research Funds (Grant 1101092C), Funding for Outstanding Doctoral Dissertation in NUAA (Grant BCXJ12-14), and the Fundamental Research Funds for the Central Universities.

References

[1] P. Mell, T. Grance, Draft NIST Working Definition of Cloud Computing, referenced on June 3rd, 2009.
[2] M. Armbrust, A. Fox, R. Griffith, A.D. Joseph, R.H. Katz, A. Konwinski, G. Lee, D.A. Patterson, A. Rabkin, I. Stoica, M. Zaharia, Above the Clouds: a Berkeley View of Cloud Computing, University of California, Berkeley, Tech. Rep. UCB-EECS-2009-28, February 2009.
[3] L.D. Brandeis, S.D. Warren, S.A. Childress, The Right to Privacy.
[4] I. Goldberg, D. Wagner, E. Brewer, Privacy-enhancing technologies for the Internet, in: Proceedings of the 42nd IEEE International Computer Conference (COMPCON'97), 1997, pp. 103-109.
[5] F. Xiao, Z. Huang, Z. Cao, J. Hu, L. Liu, Modeling cost-aware Web services composition using PTCCS, in: Proceedings of the International Conference on Web Services (ICWS 2009), 2009.
[6] G. Yee, L. Korba, Privacy policy compliance for Web services, in: Proceedings of the 2004 IEEE International Conference on Web Services (ICWS 2004), 2004, pp. 158-165.
[7] J. Zhang, C.K. Chang, L.J. Zhang, P.C.K. Hung, Toward a service-oriented development through a case study, IEEE Transactions on Systems, Man, and Cybernetics, Part A 37 (6) (2007) 955-969.
[8] S. Pearson, Taking account of privacy when designing cloud computing services, in: ICSE-Cloud'09, Vancouver, IEEE; also available as HP Labs Technical Report HPL-2009-54.
[9] S. Pearson, A. Charlesworth, Accountability as a Way Forward for Privacy Protection in the Cloud, HP Labs Technical Report HPL-2009-178.
[10] R. Hamadi, H.Y. Paik, B. Benatallah, Conceptual modeling of privacy-aware Web service protocols, in: Proceedings of the 19th International Conference on Advanced Information Systems Engineering (CAiSE 2007), 2007, pp. 233-248.
[11] N. Guermouche, S. Benbernou, E. Coquery, M.S. Hacid, Privacy-aware Web service protocol replaceability, in: Proceedings of the International Conference on Web Services (ICWS 2007), 2007, pp. 1048-1055.
[12] K. Mokhtari, S. Benbernou, M. Hacid, E. Coquery, F. Leymann, Verification of privacy timed properties in Web service protocols, in: Proceedings of the IEEE International Conference on Services Computing (SCC 2008), 2008, pp. 593-594.
[13] L. Liu, H. Zhu, Z. Huang, D. Xie, Minimal privacy authorization in web services collaboration, Computer Standards & Interfaces 33 (3) (2011) 332-343.
[14] A. Barth, J. Mitchell, A. Datta, S. Sundaram, Privacy and utility in business processes, in: Proceedings of the 20th IEEE Computer Security Foundations Symposium (CSF 2007), 2007, pp. 279-294.
[15] W. Zhiqiang, K. Mijun, et al., Research on privacy-protection policy for pervasive computing, Chinese Journal of Computers 33 (1) (2010) 128-138.
[16] H. Zhu, M.C. Zhou, Role-based collaboration and its kernel mechanisms, IEEE Transactions on Systems, Man, and Cybernetics, Part C 36 (4) (2006) 578-589.
[17] K. El-Khatib, A privacy negotiation protocol for web services, in: Workshop on Collaboration Agents: Autonomous Agents for Collaborative Environments, Halifax, 2003, pp. 85-92.
[18] Y. Zhang, D.-G. Feng, Parsimonious semantic trust negotiation, Chinese Journal of Computers 32 (2009) 1989-2003.
[19] J. Kolter, M. Netter, G. Pernul, Visualizing past personal data disclosures, ARES (2010) 131-139.
[20] L. Liu, H. Zhu, Z. Huang, Analysis of the minimal privacy disclosure for web services collaborations with role mechanisms, Expert Systems with Applications 38 (4) (2011) 4540-4549.
[21] T. Yu, Y. Zhang, K.J. Lin, Modeling and measuring privacy risks in QoS Web services, in: Proceedings of the 8th IEEE International Conference on E-Commerce Technology (CEC 2006)/3rd IEEE International Conference on Enterprise Computing, E-Commerce and E-Services (EEE 2006).
[22] J.I. Hong, J.D. Ng, S. Lederer, J.A. Landay, Privacy risk models for designing privacy-sensitive ubiquitous computing systems, in: Conference on Designing Interactive Systems, 2004, pp. 91-100.
[23] W. Ni, Z. Chong, Clustering-oriented privacy-preserving data publishing, Knowledge-Based Systems 35 (2012) 264-270.
[24] D.E. Bakken, R. Parameswaran, D.M. Blough, A.A. Franz, T.J. Palmer, Data obfuscation: anonymity and desensitization of usable data sets, IEEE Security & Privacy 2 (6) (2004) 34-41.
[25] F. Bao, R. Deng, P. Feng, An efficient and practical scheme for privacy protection in the e-commerce of digital goods, Information Security and Cryptology - ICISC 2000 (2001) 162-170.
[26] B. Gilburd, A. Schuster, R. Wolff, k-TTP: a new privacy model for large-scale distributed environments, in: Proceedings of the Tenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM, 2004, pp. 563-568.
[27] M. Ye, X. Wu, X. Hu, et al., Anonymizing classification data using rough set theory, Knowledge-Based Systems (2013).
[28] L. Sweeney, Achieving k-anonymity privacy protection using generalization and suppression, International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 10 (05) (2002) 571-588.


[10] R. Hamadi, H.Y. Paik, B. Benatallah, Conceptual modeling of privacy-aware Web service protocols, in: Proceeding of the 19th International Conference on Advanced Information System Engineering (CAiSE 2007), 2007, pp. 233–248. [11] N. Guermouche, S. Benbernou, E. Coquery, M.S. Hacid, Privacy-aware Web service protocol replaceability, in: Proceeding of the International conference on Web services (ICWS 2007), 2007, pp.1048–1055. [12] K. Mokhtari, S. Benbernou, M. Hacid, E. Coquery, F. Leymann, Verification of privacy timed properties in Web service protocols, in: Proc. of the IEEE In’l Conf. on Services Computing(SCC 2008), 2008, pp. 593–594. [13] LinYuan Liu, Haibin Zhu, Zhiqiu Huang, Dongqing Xie, Minimal privacy authorization in web services collaboration, Computer Standards & Interfaces 33 (3) (2011) 332–343. [14] A. Barth, J. Mitchell, A. Datta, S. Sundaram, Privacy and utility in business processes. In: Proc. of the 20th IEEE Computer Security Foundations Symp. (CSF 2007), 2007, pp. 279–294. [15] Wei Zhiqiang, Kang Mijun, et al., Research on privacy-protection policy for pervasive computing, Chinese Journal of Computers. 33 (1) (2010) 128– 138. [16] H. Zhu, M.C. Zhou, Role-based collaboration and its kernel mechanisms, IEEE Transaction on Systems, Man, Cybernetics, Part C 36 (4) (2006) 578–589. [17] K. El-Khatib, A privacy negotiation protocol for web services, in: Workshop on Collaboration Agents: Autonomous Agents for Collaborative Environments Halifax, 2003, pp. 85–92. [18] ZHANG Yan, FEN Deng-Guo, Parsimonious semantic trust negotiation, Chinese journal of computers 32 (2009) 1989–2003. [19] Jan Kolter, Michael Netter, Günther Pernul, Visualizing past personal data disclosures, ARES (2010) 131–139. [20] LinYuan Liu, Haibin Zhu, Zhiqiu Huang, Analysis of the minimal privacy disclosure for web services collaborations with role mechanisms, Expert Systems with Applications 38 (4) (2011) 4540–4549. [21] T. Yu, Y. Zhang, K.J. Lin, Modeling and measuring privacy risks in QoS Web services, in: Proceeding of 8th IEEE International Conference on E-Commerce Technology (CEC 2006)/3th IEEE International Conference on Enterprise Computing, E-Commerce and E-Services (EEE 2006), 4. [22] Jason I. Hong, Jennifer D. Ng, Scott Lederer, James A. Landay, Privacy risk models for designing privacy-sensitive ubiquitous computing systems, Conference on Designing Interactive Systems, 2004, pp. 91–100. [23] Weiwei Ni, Zhihong Chong, Clustering-oriented privacy-preserving data publishing, Knowledge-Based Systems (35) (2012) 264–270. [24] David E. Bakken, Rupa Parameswaran, Douglas M. Blough, Andy A. Franz, Ty J. Palmer: Data Obfuscation: Anonymity and Desensitization of Usable Data Sets. IEEE Security & Privacy 2 (6) (2004) 34–41. [25] F. Bao, R. Deng, P. Feng, An efficient and practical scheme for privacy protection in the e-commerce of digital goods, Information Security and Cryptology—ICISC 2000 (2001) 162–170. [26] B. Gilburd, A. Schuster, R. Wolff, k-TTP: a new privacy model for large-scale distributed environments, in: Proceedings of the Tenth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM, 2004, pp. 563–568. [27] M. Ye, X. Wu, X. Hu, et al., Anonymizing classification data using rough set theory, Knowledge-Based Systems (2013). [28] L. Sweeney, Achieving k-anonymity privacy protection using generalization and suppression, International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 10 (05) (2002) 571–588. 
[29] Li HaiHua, Du XiaoYong, Tian Xuan, A capability enhanced trust evaluation model for web services, Chinese Journal of Computers 31 (8) (2008) 1473– 1474.