Computer Standards & Interfaces 31 (2009) 305–308


Formal apparatus for measurement of lightweight protocols

Denis Trček ⁎, Damjan Kovač

Laboratory of E-media, Faculty of Computer and Information Science, Tržaška c. 25, 1000 Ljubljana, Slovenia

⁎ Corresponding author. E-mail addresses: [email protected] (D. Trček), [email protected] (D. Kovač).

Article history: Received 16 March 2007; received in revised form 26 January 2008; accepted 24 February 2008; available online 6 March 2008.

Keywords: Ubiquitous computing; Lightweight protocols; Security; Security services metrics; Standardization

Abstract

Lightweight protocols are an important topic in the area of computer communications. With the proliferation of security services, not only ordinary communication protocols but also cryptographic protocols, i.e. security services, have become a subject of research into appropriate lightweight solutions. At first glance it may seem surprising, but the evidence suggests that there is a permanent need for lightweight protocols. This need is ever increasing, due to the gap between desktop computers (and other ordinary computing devices) and mobile wireless devices that have inherently limited resources. However, the notion of a lightweight protocol has not been formally addressed in the literature, which is the purpose of this paper. A formal model that can be used to evaluate lightweight properties of protocols is presented and the appropriate metrics are introduced. Although the model and the metrics target weak processing devices, they can also be deployed for ordinary computing environments and may present a methodology for the evaluation of lightweight cryptographic protocols in standardization processes.

1. Introduction

Current information technology research and standardization are largely focused on service oriented architectures (SOAs). These architectures are based on a distributed computing paradigm that is already decades old. Among the earliest standardization and implementation efforts was the OSI protocol stack, introduced roughly twenty years ago. One particular representative that was focused on distributed services was Job Transfer and Manipulation or JTM [1]. Besides OSI, other approaches existed as well, but none of them, OSI included, were really successful.

About ten years ago the situation changed significantly with the proliferation of the WWW [2] and XML [3]. These two cornerstones (together with TCP/IP technology) provided the necessary infrastructure that was open, flexible and globally available. On this basis, the distributed computing paradigm oriented towards services that support various business needs gained enough potential to take root. Web services (WS) were born, consisting of three building blocks: the Simple Object Access Protocol or SOAP [4], the Web Services Description Language or WSDL [5], and Universal Description, Discovery and Integration or UDDI [6]. The first is a protocol for the exchange of messages between distributed entities, the second covers the description of available services, and the third enables registration and discovery of services.

WS is the first important trend in the current development of globally interconnected information systems. Two other important trends are mobility and wireless communications. Wireless communications are the key enabler on which mobility and, consequently, the proliferation of embedded and ubiquitous computing are based.


Further, devices in the above described communication scenarios include not only "ordinary" mobile terminals (e.g. phones and palmtops), but more and more weak processing devices. To be specific, a significant and rapidly increasing proportion of communication devices consists of smart-cards and Radio Frequency Identification devices or RFIDs [7,8]. A common property of these devices is their low level of resources, be it processing power, storage capacity, communication bandwidth, or power/energy supply.

One might object that, due to Moore's law, the above mentioned devices will become more and more capable, which is true. But this fact does not eliminate the permanent need for lightweight protocols. Moore's law applies equally to other devices, like mainframes, desktop computers, etc. Thus the gap will always exist, because weak mobile devices are inherently limited by their physical dimensions, weight and production costs. Other, non-technical reasons for the use of lightweight protocols may also exist. For example, a decade ago, when crypto products were under the control of the US Dept. of Defense, it was prohibited to export products that deployed strong symmetric and asymmetric encryption. However, it was possible to export products based on strong one-way hash functions. Consequently, lightweight protocols that deployed strong one-way hash functions were not subject to such restrictions.

Thus lightweight protocols remain an important research area, not only for basic security, but more and more for ensuring privacy. And this is where our paper comes in. In Section 2 a standardization perspective on lightweight protocols is given. In Section 3 formal metrics are introduced; they provide quantitative means for analysis (and synthesis) of lightweight communication protocols, and can serve as a measuring standard for the whole range of distributed application environments, not only wireless and ubiquitous ones. The conclusion is in Section 4.


2. Standardization perspective on lightweight protocols

To better convey the message of this paper and its justification, some historical facts need to be given first from a standardization perspective. One of the most widely known lightweight protocols is the Lightweight Directory Access Protocol or LDAP [9]. It has its roots in the OSI architecture mentioned in the first section. Within OSI, a universal directory was proposed and specified in the X.500 standard [10]. Access to this X.500 directory was by means of the Directory Access Protocol or DAP, which was complex (as were the majority of other OSI solutions). DAP introduced overhead that was excessive even for the majority of fixed and wired computing devices of that time. In order to reduce this overhead, LDAP was developed at the University of Michigan. It is a client-server protocol, initially intended only as a front-end for accessing X.500, but it gained wide acceptance and its use actually exceeded the initially planned ways of application. Despite being lightweight, LDAP was still too demanding for certain applications that required low connection establishment overhead and exchange of small amounts of data. For this reason, IETF standardized the Connectionless Lightweight Directory Access Protocol (CLDAP) [11].

Another well known example of lightweight protocols from the nineties is the KryptoKnight family of protocols [12,13]. KryptoKnight deployed strong one-way hash functions to provide security services, such as strong authentication with session key exchange. This family played an important role in the area of GSS-API [14]. A similar solution from that period that provided the same functionality was the MBAKE family of lightweight protocols. It also deployed message authentication codes based on strong one-way hash functions (initial work is described in [15], while the complete family is given in [16]).

At the turn of the century, when we were entering the era of contemporary computing environments, web services came into focus. For the core representative protocol, its standard states that "SOAP Version 1.2 is a lightweight protocol intended for exchanging structured information in a decentralized, distributed environment." Initially, this protocol was lightweight, but many authors now agree that it no longer is (see e.g. [17,18]).

Besides de iure standards, some important de facto standards have to be mentioned. Important examples are the Lightweight Access Point Protocol (LWAPP) [19], which standardizes the communication protocol between access points and WLAN systems (switches, routers, etc.), and the Lightweight Extensible Authentication Protocol (LEAP) [20], which supports effective authentication in WLANs. It may sound surprising, but the concept of lightweight protocols goes back to the era when the Internet was born: the User Datagram Protocol (UDP), which provides an unreliable transport service, is considered to be lightweight [21].

3. Formal apparatus

Clearly, there is now room for discussion as to which properties of the above mentioned protocols qualify them as lightweight. The following questions have to be considered:

• What does a lightweight protocol actually mean?
• When can a protocol be characterized as lightweight?
• Is one protocol "lighter" than another protocol?

It is evident that some formalism and metrics have to be introduced to provide answers to the above questions. Last but not least, science is about measuring phenomena.
To cope better with our problem area, typical current computing devices in contemporary networks will be briefly analyzed. The wide variety of these devices can be structured into the following categories (from the most powerful to the weakest): mainframes, desktop and laptop computers, palmtop computers, mobile phones, smart-cards and non-processor based devices, e.g. RFIDs.

Consideration of the resources of these typical devices needed to implement security and privacy assuring services leads to the following:

• With regard to mainframe computers, there are no practical limitations on the implementation of security and privacy assuring services, and consequently very little need for lightweight solutions.
• The resources of desktop and laptop computers are comparable to those of the above mentioned category and are of the same order of magnitude. Thus, a similar statement holds true for these kinds of systems.
• With regard to palmtops, their physical dimensions are constrained, which is reflected in their computing potential. Certain lightweight solutions are welcome.
• The resources of mobile phones are similar to those of palmtops, or even more constrained.
• With smart-cards the situation becomes tighter. These are small devices (tiny microcontrollers) with very limited dimensions, usually without power autonomy. The need for lightweight protocols here becomes evident.
• With regard to non-processor based devices, i.e. RFIDs, there is no doubt that these are the weakest computing devices. For these devices, lightweight protocols are a must.

Key characteristics of contemporary network devices are given and quantified in Table 1 below, where the typical range of resources of the above mentioned categories is summarized with approximate figures. In Table 1, permanent storage includes hard disks and built-in flash memory (without taking into account expansion cards). Further, smart-cards are assumed to be processor cards and, together with RFIDs, they are battery powered or passive. Although RFIDs do not have processors, they are driven by an internal clock, so stating the frequency of the clock that drives their gate operations makes sense.

Turning now to the formal treatment, one of the most widely used theoretical models in computer science is the Turing machine. It is aimed at studying what is theoretically possible, i.e. what can in principle be computed. Related to this model is the measurement of the computational complexity (in terms of time and space) of a certain problem as a function of its increasingly large input. These premises differ from what a computer scientist or an engineer faces when designing a new cryptographic protocol. First, the input size is often small and has nothing to do with asymptotic behavior. Second, a concrete and real computing architecture has to be considered, because the protocol will run on this architecture, and the model has to reflect the general nature of current computing devices. The majority of current computing architectures are still best described by von Neumann's model. However, if we want to include the weakest devices, namely RFIDs, whose number in the global network is expected to outgrow all other kinds of computing devices, von Neumann's architecture is not appropriate. In addition, none of the models above addresses communications.

Table 1
Categories of networking devices with their typical current system resources

                     Desktop computer   Palmtop computer   Mobile phone   Smart-card            RFID circuit
Processor speed      3 GHz              0.3 GHz            0.3 GHz        7.5 MHz               20 kHz
RAM                  2 GB               64 MB              18 MB          4 KB                  few K gates
Permanent storage    320 GB             128 MB             50 MB          96 KB                 2 KB
Network              1 Gbps             54 Mbps            54 Mbps        115.4 kbps            100 kbps
Autonomy             Full               Few hours          Few hours      None or few months    None or few months


In the rest of the paper we will focus on wireless devices and particularly RFIDs. They are the most primitive computing devices and are expected to be very widely deployed in ubiquitous computing environments. Thus RFIDs will form the basis on which the notion of lightweight protocols is formalized and the appropriate metrics are provided. Concentrating on RFIDs also gives a common basis for evaluating protocol costs for the rest of the (more complex) computing devices. RFIDs consist of two basic circuitries: the core is a circuit with logical gates, and the second is the communication circuitry (RF part). A suitable model requires the following characteristics:

• The measuring model should support implementation of any logical function.
• The measuring model should include communication costs.
• The production technology should be taken into account.

The key components of the required model are given in Fig. 1. Taking into account current technology (and the fact that this technology with its principles has dominated our research area for decades), NAND gates are taken to represent the metric. The reason is as follows. To realize any logical function, a logically complete set of Boolean functions has to be used. Put another way, using a logically complete set of functions over the Boolean algebra, any Boolean function can be implemented. The first such complete set consists of conjunction and negation. Its implementation with NAND gates is shown in Fig. 2: it requires one NAND gate for negation and two NAND gates for conjunction. Of course, this means that the NAND Boolean function is itself logically complete over the Boolean algebra. The second such set consists of disjunction and negation; it requires three NAND gates for disjunction. Thus our metric could equally be based on NOR gates.

Finally, it should be taken into account that RFIDs require clocked elements. Every protocol implementation requires some storage, and for this purpose D storage cells are chosen for the metric, because they are clocked. Each D cell, i.e. flip-flop, requires five NAND gates [22]. In addition to storage, typical logical functions that are needed for the provision of security (cryptographic protocols) are bitwise XOR and addition mod 2^n. Bitwise XOR of variables x and y, denoted by x ⊕ y, can be obtained as [x AND NOT(y)] OR [NOT(x) AND y]. After some optimization steps, the result is four NAND gates. Similarly, a one-bit full adder can be implemented with eleven NAND gates, so performing mod 2^n addition requires n times eleven NAND gates. This explanation should be sufficient for the purpose of this paper; more details about implementing Boolean functions with NAND gates can be found in [22].

A slightly harder issue is the introduction of a metric for the communications part. The communication circuitry does the coding of the messages and sends/receives them over the distance through electromagnetic coupling. The more complex the protocol, the larger the number of bits that will be transferred. These bits can be thought of as being stored in some memory and transported from the transmitter to the receiver by means of this memory.

Fig. 1. Key components of a primitive computing device that serve for derivation of metrics for measuring complexity of cryptographic protocols.


Fig. 2. One complete set of Boolean functions implemented with NAND gates.
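As a cross-check of the gate counts quoted above (one NAND gate for negation, two for conjunction, three for disjunction, four for XOR), the following Python sketch, which is our illustration and not part of any referenced implementation, builds these Boolean functions exclusively from NAND and verifies them exhaustively; the comments record the corresponding gate counts.

```python
# Minimal sketch: Boolean functions built from NAND only, with gate counts
# matching the figures quoted in the text (NOT = 1, AND = 2, OR = 3, XOR = 4).
from itertools import product

def nand(a: int, b: int) -> int:
    return 1 - (a & b)

def not_(a):          # 1 NAND gate
    return nand(a, a)

def and_(a, b):       # 2 NAND gates
    return not_(nand(a, b))

def or_(a, b):        # 3 NAND gates
    return nand(not_(a), not_(b))

def xor(a, b):        # 4 NAND gates (one shared intermediate gate)
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# Exhaustive check against Python's built-in Boolean operators.
for a, b in product((0, 1), repeat=2):
    assert not_(a) == 1 - a
    assert and_(a, b) == (a & b)
    assert or_(a, b) == (a | b)
    assert xor(a, b) == (a ^ b)
print("All NAND-based constructions verified.")
```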

These transferred bits can thus be accounted for with D storage cells implemented with NAND gates, which gives the metric for the communications part. To summarize, the metric for the RF part is the number of bits that need to be transferred, and the cost of these bits is measured by the number of NAND-based D storage cells required to store them.

It is now possible to formally define the notion of a lightweight protocol. We introduce the cost N of a protocol as the total number of NAND gates required for its implementation; it includes storage (S), processing (P) and communications (C) gates, i.e. N = S + P + C. This cost N serves as the metric for evaluating lightweight protocols.

The above definition is useful for comparing the costs of lightweight protocols. Stating an absolute limit on whether a protocol is lightweight or not is a matter of context and depends on the current state of technology. Currently, a reasonable limit for a lightweight protocol in RFID environments is a cost of two thousand NAND gates: one thousand gates for the gates supporting crypto-operations in the RFID, and one thousand for communications (the reader should note that the total number of gates in current RFIDs is approx. 5000).

An example of a lightweight protocol follows, derived from the one developed by L. Lamport [23]. This is a one-way hash function based protocol for authentication that protects privacy by means of so-called pseudonyms. Its implementation in RFIDs goes as follows:

1. An RFID producer chooses a random seed, which it processes iteratively, e.g. 128 times, by sending it through a strong one-way function (suppose the function produces 160 bits of output). In the first iteration the seed is used to give the first output, i.e. y1 = H(x1), where x1 is the seed and H denotes a strong one-way hash function. Afterwards, y1 is used as the input to the one-way hash function to give the next output, and so on until y128 is obtained.
2. All these outputs, i.e. y1, …, y128, are written into RFID memory locations, where each location is fused (or writable). Once a location transmits its value, it is blanked.
3. When an interrogator sends a value n, the RFID responds by sending back the n-th output of the hashed seed. Afterwards it blows the fuse (or erases the content of the memory location) to prevent further access to the n-th location.
4. When interrogated the next time, the RFID responds with the (n − 1)-th output of the hashed seed, and so on.

If we assume that blowing a fuse (or erasing the content) introduces no additional cost, the cost of one protocol run consists of 160 bits for storing the hashed value (S), 160 bits for its transmission (C), and 7 bits for the transmission of the address of the chosen RFID memory location in a certain run of the protocol (there are 128 different values). The total cost of a protocol run is thus 5 ⁎ 160 NAND gates + 5 ⁎ 160 NAND gates + 5 ⁎ 7 NAND gates = 1635 NAND gates, which qualifies the above protocol as a lightweight protocol.

It is worth mentioning that logical gates are often used as a metric in the evaluation of cryptography related solutions; an example can be found in [24]. But there is a need to evaluate complete protocols, not only crypto primitives.
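To make the worked cost figure above concrete, here is a minimal sketch (our illustration, not code from the paper): SHA-1 merely stands in for the unspecified 160-bit one-way hash function H, the function names are hypothetical, and the gate-count constants mirror the figures used in the text. It generates the hash chain on the producer side, simulates a tag response with location blanking, and reproduces the cost N = S + P + C.

```python
# Illustrative sketch of the Lamport-style RFID pseudonym protocol and of the
# NAND-gate cost metric N = S + P + C described above. SHA-1 is used here only
# because it outputs 160 bits, as assumed in the text.
import hashlib
from math import ceil, log2

GATES_PER_D_CELL = 5     # one D flip-flop = 5 NAND gates [22]
CHAIN_LENGTH = 128       # number of pseudonyms stored on the tag
HASH_BITS = 160          # output size of the one-way function H

def H(data: bytes) -> bytes:
    """Strong one-way hash function (160-bit output assumed)."""
    return hashlib.sha1(data).digest()

def build_chain(seed: bytes, length: int = CHAIN_LENGTH) -> list:
    """Producer side: y1 = H(seed), y2 = H(y1), ... written into tag memory."""
    chain, y = [], seed
    for _ in range(length):
        y = H(y)
        chain.append(y)
    return chain

def tag_respond(memory: list, n: int):
    """Tag side: return the n-th output (1-based) and blank that location."""
    value = memory[n - 1]
    memory[n - 1] = None           # blow the fuse / erase the location
    return value

# Cost of one protocol run, in NAND gates.
storage_bits = HASH_BITS                            # S: one hashed value on the tag
comm_bits = HASH_BITS + ceil(log2(CHAIN_LENGTH))    # C: response + 7-bit address
processing_bits = 0                                 # P: hashing is done by the producer
N = GATES_PER_D_CELL * (storage_bits + processing_bits + comm_bits)
print(N)   # 5*160 + 5*0 + 5*167 = 1635 NAND gates

# Example run: the interrogator asks for pseudonym 128, then 127, and so on.
memory = build_chain(b"random-seed")
assert tag_respond(memory, 128) is not None
assert memory[127] is None         # that location can no longer be read
```

Since the total stays below the two-thousand-gate limit proposed above, the protocol qualifies as lightweight under this metric.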


Thus data transfers, which have typically not been taken into account, have to be addressed as well, together with storage capacity and complete protocol runs.

Finally, a few words about other possible metrics should be given. Besides the number of logical gates, energy consumption could be a candidate. However, a distinction has to be made between energy consumption and power consumption, even in the case of passive and active RFIDs [25]. If we opt for energy consumption, a significant part of the devices will not be properly addressed, and if we opt for power consumption, the rest of the devices will not be properly addressed. Therefore a power or energy consumption approach would be problematic as a basis for general metrics. On the other hand, translating the logical gates based metric into energy or power consumption is certainly possible. The presented metric also helps to avoid the common misconception that protocols based on strong one-way hash functions are lightweight by default; in many cases a symmetric cipher like AES can be cheaper to implement than e.g. SHA-1 [26].

4. Conclusions

We are entering an era of ubiquitous computing, driven by wireless and mobile communication devices. A growing proportion of these devices (anticipated to become dominant) have limited resources; they include smart-cards and RFIDs. At the same time, security and privacy requirements are getting tighter. Assuring privacy in ordinary computing environments is not an easy problem (see e.g. [27]), and dealing with severe restrictions on the consumption of resources in ubiquitous computing environments is an even harder issue. These requirements imply the need for new lightweight protocols, especially because of the growing number of RFID devices.

Although the notion of lightweight protocols is used extensively in the literature and in standards, it has not been formally elaborated. In order to provide appropriate metrics to support related standardization, this paper presents a model that focuses on the weak computing devices in the network (primarily RFIDs, but also smart-cards) and provides a measurement apparatus for these purposes. The model is designed to reflect engineering reality, and it enables the transformation of abstractly presented protocols into a form that allows their evaluation for real computing environments. In short, it provides metrics for the evaluation of security and privacy supporting cryptographic protocols. It can also serve to measure properties of protocols that are intended for devices with more resources.

It is worth pointing out that RFID technology is already being complemented by RuBee technology, which is currently under standardization at IEEE [28]. RuBee's principles of operation are similar to those of RFIDs, and the metrics introduced here remain applicable to technologies we will use in the near future.

References

[1] ISO, Job Transfer and Manipulation, Concepts and Service, ISO Standard 8831, Geneva, 1989.
[2] R. Fielding, et al., Hypertext Transfer Protocol — HTTP v 1.1, IETF RFC 2616, Reston, 1999.
[3] T. Bray, J. Paoli, C.M. Sperberg-McQueen, E. Maler, F. Yergeau (Eds.), Extensible Markup Language (XML) 1.1, W3C REC-xml11-20040204, Cambridge/Sophia Antipolis/Kanagawa, 2004, http://www.w3.org/TR/xml11/.
[4] M. Gudgin, M. Hadley, N. Mendelsohn, J.J. Moreau, H.F. Nielsen, SOAP Version 1.2, Part 1: Messaging Framework, W3C Recommendation, 2003, http://www.w3.org/TR/2003/REC-soap12-part1-20030624/.
[5] E. Christensen, F. Curbera, G. Meredith, S. Weerawarana, Web Services Description Language (WSDL) v 1.1, W3C NOTE-wsdl-20010315, Cambridge/Sophia Antipolis/Kanagawa, 2001, http://www.w3.org/TR/2001/NOTE-wsdl-20010315.
[6] L. Clement, A. Hatley, C. von Riegen, T. Rogers (Eds.), UDDI v 3.0.2, OASIS, Billerica, 2004, http://uddi.org/pubs/uddi-v3.0.2-20041019.htm.
[7] ANSI, Radio Frequency Identification (RFID), INCITS 256 Standard, Washington D.C., 1999.
[8] S.A. Weis, Security and Privacy in Radio-Frequency Identification Devices, M.Sc. Thesis, MIT, Cambridge, 2003.
[9] W. Yeong, T. Howes, S. Kille, Lightweight Directory Access Protocol, IETF RFC 1777, Reston, 1995.
[10] CCITT, The Directory, CCITT X.500 Standard, Geneva, 1988.
[11] K. Zeilenga, The Connection-Less Lightweight Directory Access Protocol (CLDAP), IETF RFC 3352, Reston, 2003.
[12] R. Bird, I. Gopal, A. Herzberg, P. Janson, S. Kutten, S. Molva, M. Yung, The KryptoKnight family of light-weight protocols for authentication and key distribution, IEEE/ACM Transactions on Networking 3 (1) (1995) 31–41.
[13] P. Janson, G. Tsudik, Secure and minimal protocols for authenticated key distribution, Computer Communications 18 (9), 645–653, Elsevier.
[14] The Open Group, Secure Communication Services, X/Open SS, San Francisco, 1996.
[15] D. Trček, Hash functions based one-time passwords, Proc. of the 14th Worldwide Congress on Computer and Communications Security Protection, Paris, 1996, pp. 269–276.
[16] D. Trček, MAC based lightweight protocols for strong authentication and key exchange, Journal of Information Sciences and Engineering 21 (4) (2005) 753–765.
[17] C. Shirky, R. Dornfest, SOAP, O'Reilly Web Services Devcenter, 2002, http://tim.oreilly.com/pub/a/webservices/2002/04/16/soap.html.
[18] H. Niss, Application to Application Web Services — Interactive Web Services, IT University of Copenhagen, Copenhagen, 2004, www.itu.dk/courses/IWSJ/E2004/slides/app2app.pdf.
[19] Cisco Systems, Understanding the Lightweight Access Point Protocol, White Paper, San Jose, 2005.
[20] Cisco Systems, Lightweight Extensible Authentication Protocol, Cisco Q & A, San Jose, 2005.
[21] J. Postel, User Datagram Protocol, IETF RFC 768, Reston, 1980.
[22] L. Vodovnik, S. Rebersek, Digital Circuits, Faculty of Electrical Engineering, Ljubljana, 1986.
[23] L. Lamport, Password authentication with insecure communication, Communications of the ACM 24 (11) (1981) 770–772.
[24] M. Feldhofer, J. Wolkerstorfer, V. Rijmen, AES implementation on a grain of sand, IEE Proceedings, Information Security 152 (1) (2005) 13–20.
[25] M. Feldhofer, C. Rechberger, A case against currently used hash functions in RFID protocols, Proc. of OTM Workshops, 2006, pp. 372–381.
[26] S. Zhang, J. Ford, F. Makedon, A privacy-preserving collaborative filtering scheme with two-way communication, Proc. of the 7th ACM Conference on Electronic Commerce, ACM, New York, 2006, pp. 316–323.
[27] D. Trček, Security and privacy in RFID based wireless networks, Handbook of Research on Wireless Security, IGI Global Publishing, Hershey, 2008.
[28] J.K. Stevens, IEEE Begins Wireless, Long-Wavelength Standard for Healthcare, Retail and Livestock Visibility Networks, 2006, http://standards.ieee.org/announcements/pr_p19021Rubee.html.

Prof. Dr. Denis Trček heads the Laboratory of E-media at the Faculty of Computer and Information Science, University of Ljubljana. He has been involved in the field of computer networks and information systems security and privacy for almost twenty years. He has taken part in various European projects, as well as domestic projects in the government, banking and insurance sectors. His bibliography includes over one hundred titles, including works published by renowned publishers like Springer and John Wiley. D. Trček has served (and still serves) as a consultant and a member of various international bodies and boards, from editorial to professional ones. He is an inventor of a patented family of light-weight cryptographic protocols. His interests include e-business, security, trust management, privacy and human factor modeling.

Damjan Kovač is a Ph.D. candidate and holds an M.Sc. in Computer Science from the Faculty of Computer and Information Science, University of Ljubljana, Slovenia. His research interests are distributed software architectures in e-business environments, service oriented computing, trust and reputation management, service orchestration languages (BPEL), and information security.