Net Neutrality in Europe: Desperately seeking a market failure




Telecommunications Policy 35 (2011) 1–11



Pietro Crocioni*
Centre for Management under Regulation, Warwick Business School, Scarman Road, University of Warwick, Coventry CV4 7AL, UK


Abstract

Available online 4 January 2011

Net Neutrality has become the focus of attention in the regulatory debate on the Internet. This article attempts to strip the debate down to its bare essentials. It identifies two main types of Net Neutrality obligations that have been put forward and assesses what types of potential concerns they may be designed to address. It concludes that, while some of these concerns may be important, it remains doubtful (at least in Europe) that an ex ante per se rule, such as those proposed under the Net Neutrality banner, is the best way to address them. © 2010 Elsevier Ltd. All rights reserved.

Keywords: Net Neutrality; Foreclosure; Competitive bottleneck; Congestion

"I will not be someone who comes up with a solution first and then looks for a problem to attach to it. I am not a police officer in search of a busy corner."1

1. Introduction

A lot has been written and said about Net Neutrality (NN) in journals, conferences and regulatory proceedings, and there is an emerging economic literature on the issue. Yet most discussions of NN still lack clarity as to what NN means as a regulatory remedy and, more importantly, what market failure it is supposed to address. This is probably due to the genesis of the concept. The term NN was initially put forward by legal scholars, later spilling over into the regulatory debate and becoming the subject of economic research. This is why the way NN is often discussed feels like a prescriptive, though not clearly defined, remedy desperately in search of a market failure to address. This is not to say that concerns like those identified by NN advocates are never likely to arise. What is questioned here is whether a blanket per se rule is likely to be appropriate in most circumstances faced in Europe.

This article attempts to provide an economic regulation framework to assess the main types of NN obligations discussed (although it does not cover all nuances and variations), to assess what types of concerns they could address and to clarify under which circumstances such a remedy may be appropriate. The discussion is confined to those aspects of the NN debate that relate to economic regulation. The debate goes beyond this to cover issues such as freedom of speech or social policies, which are not discussed here.

The article is organised as follows. Section 2 provides a basic description of the Internet vertical chain connecting consumers to Content and Application Providers (CAPs) and highlights the precise focus of the NN debate. Section 3 puts forward two alternative definitions of NN. The more relevant and recent regulatory initiatives are briefly discussed in Section 4. Section 5 hypothetically considers what may happen if strict NN obligations were put in place.
Section 6 works backwards from the definitions in Section 3 to consider what type of market failure NN may be tailored to address. Two broad potential purposes of a NN obligation have been identified: it could be aimed at preventing exclusionary behaviour (Section 7) or at addressing a wider array of potential regulatory concerns (Section 8). Final remarks are confined to Section 9.

* Tel.: +44 24 7652 4306; fax: +44 24 7652 3719. E-mail address: [email protected]
1 Neelie Kroes, "Net Neutrality in Europe", speech in Paris, 13 April 2010.

0308-5961/$ - see front matter © 2010 Elsevier Ltd. All rights reserved. doi:10.1016/j.telpol.2010.12.007


Fig. 1. The Internet vertical chain. [Diagram: fixed and wireless consumers connect through the local exchange or mobile mast, access network and backhaul to retail ISPs, where traffic shaping occurs; retail ISPs interconnect via peering points and traffic peering agreements with the Internet core; on the other side, service providers reach the core via web hosting and CDN servers.]

2. The Internet vertical chain

NN is a broad term indicating a remedy to address potential concerns in the provision of Internet access services to CAPs. Complex vertical chains and commercial arrangements allow consumers to access content and applications on the Internet and, vice versa, allow providers of the latter to reach consumers. Understanding this vertical chain – Fig. 1 – and where exactly a potential NN remedy would sit is critical in assessing whether there is a potential market failure concern.

In order to connect to the Internet, consumers subscribe to an Internet Service Provider (ISP).2 In Europe consumers can choose among many ISPs. Mobile networks also act as ISPs, while on the fixed side the ISPs of the fixed incumbent (and of any rival cable provider) compete with rivals that rely on leasing mandated wholesale access from the incumbent. Therefore, few ISPs rely on self-provided infrastructure; most are dependent on the incumbent's network infrastructure and, hence, on effective access regulation. This includes the local loop, local exchanges and backhaul. Beyond this point of the network these (retail) ISPs rely on peering agreements with ISPs that provide regional or worldwide connectivity in the Internet—the Internet core in Fig. 1.

On the other side sit the CAPs—e.g. Google, eBay, YouTube, Facebook, etc. They need to upload their content or applications to a web hosting facility and then enter into peering agreements for distribution with a regional or worldwide ISP. In the past, in order to be available worldwide, a CAP had to rely on peering agreements with a number of ISPs.
Recently, in response to the increased risk of congestion in the Internet core, some CAPs, such as Yahoo, have set up their own Content Distribution Networks (CDNs), while others have relied on the services provided by third parties such as Akamai.3 CDN providers use their own network of servers at the edge of the network to store information and content close to consumers. In other words, consumers can access content on servers much closer to them without the need to send and receive traffic across the Internet core. While consumers may not notice much difference in the quality of their Internet connection, CDNs are an attempt at reducing the access quality concerns that a "best-effort" model runs into.

A number of agreements and contracts allow consumers to access content available worldwide. In the Internet core these take the form of peering agreements, either on a Bill & Keep basis or, often, where the size or coverage of the ISPs is unequal or the traffic unbalanced, involving payments. CAPs pay for uploading their content, for web hosting and for worldwide connectivity, including CDNs. On the other side, consumers pay ISPs for connectivity to the Internet; in order to provide it, many ISPs need to lease access network and backhaul facilities and pay for Internet core connectivity.

Two issues are central to the NN debate. First, CAPs do not pay the ISPs chosen by consumers in order to deliver their content to those ISPs' customers. They pay for the delivery of their content close to the consumers' chosen ISP, but not for the last part of the journey through the ISP's network. As mentioned above, payments between ISPs are often governed by peering agreements. This is one of the two critical aspects of the NN debate. As a result, ISPs do not use prices towards CAPs to ration demand, should it become excessive. This could result in congestion, with all consumers experiencing a lower quality of service.
The second aspect of the NN debate is that content and applications are largely delivered on a "best-effort" basis, in other words without quality assurance. This is why content is at times slow to download to one's PC, why email sometimes takes a long time to be delivered, or why video images freeze. The practical consequence is that ISPs can only provide one type of connectivity service based on "best-effort". This suffices for the majority of content and applications that are not sensitive to delays, but could effectively destroy the value of applications for which delays or latency are critical. This is the case for online games, video, Voice over IP (VoIP) or more speculative applications such as telemedicine. While the current Internet is often described as operating under a "best-effort" model, in reality ISPs already engage in some form of traffic management. For example, they may give precedence to some forms of traffic over others or may block consumers' use of applications that generate a substantial amount of traffic, such as peer-to-peer. In some cases they may engage in these activities not to better

2 In this article the term ISP refers to the retail ISP—i.e. those ISPs that consumers subscribe to in order to obtain connectivity to the Internet.
3 These are now understood to carry as much as 40 per cent of world Internet traffic (Yoo, 2010).


serve their customers but to harm rival CAPs. What they have not yet done, though, is to use prices towards CAPs to manage incoming traffic.

3. What is Net Neutrality?

The curious aspect of the debate on NN is that it focuses on a remedy rather than a market failure, but even as a "remedy" NN is often not clearly defined. Proponents of a NN per se rule identify two broad concerns – not necessarily market failures – from which one can deduce what form such a rule could take (Schuett, 2010). First, there is a concern that ISPs could charge positive prices to CAPs in order to deliver their services to their subscribers. This is, indeed, one of the main concerns of some NN proponents (Economides & Tag, 2009). A simple rule to prevent this is to impose a zero price cap for CAPs' access to the ISPs' subscribers.4 The second, but often related, concern is that ISPs could discriminate in favour of their own content against that provided by rivals. A zero price cap rule would also prevent this. However, under a non-discrimination rule ISPs could be allowed to charge a positive price but be prevented from discriminating. A non-discrimination rule consists of a prohibition on charging a higher price and/or providing a lower quality of access to rival content providers. This is an important distinction when assessing the possible market failure concerns these different forms of NN obligations may be able to address.

4. The status of the regulatory debate

The NN debate commenced about a decade ago in the US. In the last few years it has become one of the main issues for the Federal Communications Commission (FCC). The FCC intervened in a couple of cases sanctioning the behaviour of ISPs/networks: first in the Madison River case (2005), where a local US telecoms operator blocked Internet users' access to Voice over IP (VoIP) services, and later in the more famous Comcast case (2008).
The cable operator was found to be interrupting and degrading peer-to-peer traffic (and particularly BitTorrent traffic) on its network. An important element of the complaint, brought by customers and advocacy groups, was that Comcast had changed its policy on traffic management without informing its customers of the change. The FCC sanctioned Comcast, arguing that its practices violated the FCC's "Internet Policy Statement". This spelled out four Internet Freedoms for consumers: to access lawful content, to use applications, to attach personal devices that do not harm the network, and to obtain service plan information and an entitlement to competition among providers. In 2009 the FCC added two further Freedoms, the first of which has proved controversial:

• A non-discrimination principle whereby ISPs cannot discriminate against particular CAPs, though they may engage in "reasonable network management".

• A consumer transparency principle whereby ISPs must disclose their network management practices to consumers, CAPs and the FCC.5

In April 2010 the District of Columbia Court of Appeals ruled in favour of Comcast, arguing that the FCC lacked a sufficient statutory basis for its order. Therefore, the situation in the US is still unclear. In Europe the new Regulatory Framework was approved in 2009 (though it still needs to be transposed in many Member States). In terms of NN it introduced new duties and powers for national regulators to enforce consumer transparency and also a potential new tool, a minimum Quality of Service (QoS) obligation, at the discretion of national regulators. No ex ante non-discrimination rule was introduced. However, the European Commission has issued a questionnaire in preparation for a consultation6 in which, among other things, it raised some questions related to discriminatory practices. This is additional to the existing rules that allow regulators to impose ex ante remedies if certain conditions are met.7 Other European national regulators have started to consult on NN (Norwegian, Swedish, French, British, etc.). In particular, the British regulator, Ofcom, has published a discussion document8 in which it suggested that the best approach may be first to ensure that competition between ISPs remains vibrant and that consumers have, and can act on, the relevant information, before considering additional and more radical interventions.

5. Today's Internet and the effect of NN

Today's Internet is characterised neither by a situation where operators engage in substantial traffic management – i.e. by either extensively prioritising traffic or charging CAPs – nor by the free "best-effort" world sought after by NN proponents.

4 This discussion is based on first principles and abstracts from considerations as to whether, if such a rule were deemed appropriate, it could be implemented under the current legal framework in Europe.
5 Federal Communications Commission (FCC), Notice of Proposed Rulemaking, In the Matter of Preserving the Open Internet/Broadband Industry Practices, GN Docket No. 09-191/WC Docket No. 07-52, October 22, 2009.
6 Retrieved from http://ec.europa.eu/information_society/policy/ecomm/doc/library/public_consult/net_neutrality/nn_questionnaire.pdf.
7 First, the national regulatory authority has to pass the three criteria test in order to be able to impose ex ante regulation. This requires that in the market there are high and non-transitory barriers to entry, that the market structure does not tend towards effective competition and that competition law is insufficient to address the market failures of concern. Having passed the test, the authority needs to identify a relevant market and find that some providers have Significant Market Power (SMP) in order to impose any ex ante remedies.
8 Retrieved from http://stakeholders.ofcom.org.uk/binaries/consultations/net-neutrality/summary/netneutrality.pdf.


In Europe several fixed ISPs restrict peer-to-peer traffic at peak times and/or throttle traffic when there is congestion, while most, but not all, mobile operators charge consumers to use VoIP services and often impose usage caps on consumers. Although some traffic management takes place today, its extent and form may be influenced by the current debate on NN and, hence, by the potential threat of regulation. Nonetheless, it may be useful to consider the risks that could arise in a hypothetical world where a strict NN obligation were mandated. If a "best-effort" world with no traffic management and with a zero price cap were enforced, consumers may lose out in a number of ways. Sections 6–8 consider whether or not the benefits from traffic management could outweigh its risks.

It may be useful to run a road traffic analogy to illustrate the risks. In heavily populated urban areas there is a risk that, if any car could drive into the city centre for free, road congestion would be likely to follow. All drivers would be worse off because of longer travel times, but the worst affected would be those for whom time savings on the road are particularly valuable. This is why some city centres, such as London or Singapore, have introduced charging mechanisms to ration demand and ensure that priority is given to those drivers that value their trip to the city centre most. Similarly, without charging or prioritisation ISPs may not be able to constrain incoming traffic demand, and congestion may follow. Like drivers sitting in queues, Internet users would sit in front of their screens waiting for their content and applications to download. Congestion is a negative externality that reduces the value that consumers derive from driving their cars and surfing the Internet. Unable to use charging to ration demand, ISPs or city-centre local authorities could only resort to second best tools to constrain demand.
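The rationing logic of the road-pricing analogy can be illustrated with a toy calculation. This is a hypothetical sketch: the applications, valuations and capacity figure below are all invented for the purpose of illustration.

```python
# Hypothetical illustration: capacity for only 3 of 6 traffic flows.
# Each flow has an invented consumer valuation.
flows = [
    {"app": "voip",  "value": 9},
    {"app": "game",  "value": 8},
    {"app": "p2p",   "value": 7},   # a high-valuation peer-to-peer user
    {"app": "video", "value": 6},
    {"app": "p2p",   "value": 2},
    {"app": "email", "value": 1},
]
CAPACITY = 3

def price_rationing(flows, capacity):
    """First-best rationing: a congestion price admits the flows with the
    highest valuations, whatever application class they belong to."""
    admitted = sorted(flows, key=lambda f: f["value"], reverse=True)[:capacity]
    return sum(f["value"] for f in admitted)

def block_class(flows, capacity, blocked="p2p"):
    """Second-best rationing: block an entire traffic class regardless of
    the valuations of its individual users."""
    kept = [f for f in flows if f["app"] != blocked][:capacity]
    return sum(f["value"] for f in kept)

print(price_rationing(flows, CAPACITY))  # 24: voip + game + high-value p2p
print(block_class(flows, CAPACITY))      # 23: the valuable p2p flow is lost
```

With these numbers, price-based rationing admits the high-valuation peer-to-peer flow, while blanket blocking of the class excludes it: this is the sense in which blocking is only a second best tool.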
ISPs could throttle incoming traffic or block traffic-intensive applications such as peer-to-peer,9 in the same way as local authorities could restrict parking by taxing parking spaces or imposing parking fines. These, however, are often second best tools because they can only imperfectly, if at all, identify those with the highest valuation for the service and let them in while restricting those whose marginal value for the service is limited.

There is an important difference, though, between the road and Internet traffic cases. The provision of Internet connectivity is a two-sided market, as discussed in Section 8 below. While road traffic demand could be rationed by making it more expensive for drivers to enter a city centre, Internet traffic could be rationed by acting on CAPs, on consumers, or on both. Consumers generally pay a flat fee to get connected to the Internet, but the traffic they generate could also be controlled by usage-related fees.10 While this is possible, consumers do not seem to like it much, as they have historically moved rapidly away from metered access. More importantly, it may be less efficient than acting on the content and application side. For example, while it may be relatively easy to identify which services generate substantial traffic at peak times and are less sensitive to delays, designing retail tariffs that achieve the same effect may be more complex. As discussed in Section 8, it may also be more efficient to recover costs from both sides rather than one.

But why constrain demand? Congestion provides important signals. It makes clear when new capacity may be needed or when, if that is not efficient at least in the short term – for example because investment in capacity is lumpy – demand needs to be rationed to allow use only by those that value the access most, whether for car trips or Internet surfing. To continue the road traffic analogy, prioritisation plays a substantial role in traffic management.
Those with the highest valuation who are prepared to take a taxi can enjoy dedicated lanes and do not need to park. Similarly, priority is given for a number of reasons to buses, fire brigade, police and ambulance vehicles, etc. This reflects society's demand to give priority to certain types of traffic which carry a higher value for society. Similarly, not all Internet content and applications are valued the same. Some are more valuable than others. In other words, services and applications are heterogeneous.

To the extent that there are capacity constraints – which are currently particularly strong on mobile compared to fixed networks – there are three main risks under a strict NN regime.11 First, congestion could emerge and make everyone worse off. Internet traffic is rapidly increasing and it seems likely that, without further investment in network capacity, congestion will occur more frequently.12 Second, for delay-sensitive applications – first and foremost online games, but also video and VoIP, and services yet to be commercialised or taken up, such as telemedicine13 – transmission delays, known as latency,14 could, without prioritisation, destroy the entire

9 Traditionally the Internet architecture was organised in a "client-server" way. Content and applications are stored in large computers at centralised locations and consumers contact these computers to access the content. This means that the traffic uploaded to these servers is small, while the lion's share is that from the computers to the users. Peer-to-peer technology, which may now account for almost half of world Internet traffic, is very different. The distinction between computers that store applications and those that request them is no longer relevant: both perform both functions simultaneously. The reason why this is causing significant congestion in certain areas of the network is that traffic is exchanged by these edge computers for as long as they are left running by consumers. Peer-to-peer applications, therefore, increase traffic and may result in congestion in the last mile of the network. This is particularly burdensome on networks that share bandwidth locally, such as cable and mobile. See Yoo (2010) for more details.
10 There are some signs that wireless operators, such as AT&T, are starting to introduce caps to manage congestion. One way to interpret this is in the light of the FCC's stance in favour of NN. If that is interpreted as a sign that the FCC would not accept ISPs charging CAPs, ISPs may then choose the only other feasible way to ration demand – i.e. acting on the consumers' side (Cain Miller & Stone, 2010).
11 There are also other risks. For example, Section 8 discusses how it may be more efficient to share cost recovery between both sides. Theoretically, at least, this could lead to a reduction in consumers' prices for connecting to the Internet and increase Internet take-up.
12 It is extremely difficult to predict when this may become a substantial concern and where along the vertical Internet chain. However, ISPs and CAPs are taking measures to reduce its potential occurrence and/or its negative effects – e.g. the emergence of CDNs, locating content closer to consumers, can be seen as an attempt to reduce the risk of congestion.
13 Games are likely to be the worst affected because delays may seriously ruin consumers' enjoyment. For video, buffering could reduce the risk that delays will affect enjoyment of the service. VoIP, on the other hand, requires such limited capacity that it is unlikely to be efficient not to prioritise it.
14 Latency is defined as the amount of time it takes a packet of data to move across a network connection. When a packet is sent, there is "latent" time while the sending computer waits for confirmation that the packet has been received. The higher the latency, the longer the delays and the lower the quality of the underlying service.


value consumers derive from these applications. As a result, some applications may never be developed in the first place. Third, and closely related to this, not only do applications differ in their access requirements, but consumers are also not identical in their requirements and preferences. As consumers are heterogeneous in their preferences, they would prefer to assign priority to some content and applications but not others. Restricting all of them to the "best-effort" world would neither satisfy their preferences nor maximise the benefits they derive from the Internet.

A strict interpretation of NN would severely constrain the ability of ISPs to tailor their access services to their consumers' demand and to react to exogenous factors. In the presence of risks of congestion, capacity constraints, heterogeneous consumers, and application and content providers requiring different quality of access, inefficiencies are likely to emerge if ISPs were prevented from rationing demand. They could ration demand by charging, by differentiating the quality of access to consumers and CAPs, and by price discriminating between the two types of customers or sides of the market. First, when capacity is constrained and there is excessive demand, prices have two functions: they signal that new capacity may need to be put in place, or that rationing demand – to allow in those services that are more valuable to consumers at the expense of those that are not – may be necessary. Second, when consumers have heterogeneous preferences and services and applications may require different priority levels, it is efficient to provide different qualities of access at different prices.
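The efficiency of quality differentiation under heterogeneous preferences can be made concrete with a toy example. All willingness-to-pay figures and prices below are invented for illustration; the example is a sketch in the spirit of the Hermalin and Katz (2007) argument discussed next, not a reproduction of their model.

```python
# Hypothetical willingness to pay of three consumer types for basic
# ("best-effort") and priority access; all numbers are invented.
consumers = {
    "delay-tolerant browser": {"basic": 3.0, "priority": 3.2},
    "video gamer":            {"basic": 1.0, "priority": 6.0},
    "occasional user":        {"basic": 0.8, "priority": 0.9},
}
PRICES = {"basic": 0.5, "priority": 2.5}  # assumed tariff menu

def surplus_two_tier(consumers, prices):
    """Each consumer picks the tier (or opts out) maximising net surplus."""
    total = 0.0
    for wtp in consumers.values():
        best = max(wtp[tier] - prices[tier] for tier in prices)
        total += max(best, 0.0)  # opting out yields zero surplus
    return total

def surplus_single_tier(consumers, price):
    """A strict NN world: only basic, best-effort access is on offer."""
    return sum(max(wtp["basic"] - price, 0.0) for wtp in consumers.values())

print(round(surplus_two_tier(consumers, PRICES), 2))                 # 6.3
print(round(surplus_single_tier(consumers, PRICES["basic"]), 2))     # 3.3
```

With these invented numbers the gamer, who values priority highly, gains most from the second tier, while the delay-tolerant types keep buying basic access: consumer surplus is higher under the two-tier menu than when everyone is restricted to best-effort.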
Therefore, a world without NN could lead to a two-tier access system where "best-effort" access is provided for free and priority access for a (positive) price.15 As discussed by Weisman and Kulick (2010), while the NN debate tends to focus on (price) discrimination, here the discussion is about service differentiation in response to consumers' and suppliers' heterogeneous preferences. Hermalin and Katz (2007) conclude that a NN obligation would force ISPs to offer a single quality of access service. Consumers that would have purchased only low-quality access no longer connect to the Internet. Consumers that would like to purchase a high-quality service have to purchase a low-quality one. Both lose from NN. Some consumers with a demand for intermediate quality of access who end up purchasing under NN may be better off. They conclude that the overall effect of a NN obligation is ambiguous but unlikely to be positive. Last, in the presence of a large proportion of fixed and common costs in the provision of Internet access/connectivity, price discrimination is likely to be an efficient way to recover such costs on the basis of relative demand price elasticities within and across the two sides of this market.

6. What may NN be trying to remedy?

Although for a long time the debate on NN focused on whether such a rule was appropriate, it has remained unclear what the rule is supposed to remedy. Above it was argued that there are likely to be substantial benefits from traffic management – i.e. from diverging from the world that would exist under a strict NN interpretation – especially in the presence of congestion, capacity constraints, consumers' heterogeneous preferences, different access requirements across services and applications, and fixed and common costs. However, some potential concerns have also started to take shape in the (economic) debate. In assessing the best regulatory approach it is important to balance potential benefits and costs.
First, traffic management is a term encompassing a wide range of practices, which may also include behaviour that could have exclusionary effects against rival CAPs and lead to consumer harm. Second, there are issues that have been raised which can be grouped together under the term "wider regulatory concerns". A wide array of concerns has been mentioned in the debate, but essentially these can be summarised as:

• a concern about the potential presence of a "bottleneck monopoly"; and
• closely related concerns about the investment and innovation incentives of CAPs on one side and ISPs and/or network providers on the other.

While the "competitive bottleneck" is a static efficiency argument, the latter is a dynamic one. In the next two sections these arguments are further examined to better understand under which circumstances a concern may arise and, if there were a risk, what would be the most appropriate regulatory response. In particular, the choice seems restricted to the per se ex ante rules that some NN proponents put forward (clearly acknowledging that there are variations in how such rules can be designed) and a case-by-case ex post approach that recognises that traffic management carries both benefits and risks for consumers and that per se rules are not appropriate in these circumstances.

However, two aspects of the more recent regulatory debate have acquired a particular relevance in Europe after the enactment of the new European Regulatory Framework.16 These could be thought of as alternative or complementary measures to the ex ante or ex post access rules that have so far been central in the NN debate. First, as ISPs move away from providing only "best-effort" access, the services offered by ISPs to consumers are expected to become more differentiated and complex. This may raise concerns that consumers may not be able to grasp these technical

15 Some (Economides & Tag, 2009) explicitly assume that service differentiation does not take place. As a result, they assume that if ISPs started charging they would provide the same access service to all content and application providers. This was deemed unrealistic by Hahn and Singer (2008).
16 See http://ec.europa.eu/information_society/policy/ecomm/tomorrow/index_en.htm for an overview and legal documents.


features and, as a result, not be able to make informed choices. Ensuring that consumers have access to the right type of information, understand it and are able to make the right choices is essential to ensuring that the retail ISP market remains competitive (Faulhaber, 2010). This is a complementary measure to access regulation.

Second, the new European Regulatory Framework also contains the possibility for regulators to impose a minimum Quality of Service (QoS) obligation. This is best interpreted as a Universal Service Obligation (USO), which raises similar, but perhaps more complex, types of issues to those related to voice telephony. QoS is only a tool to set a floor on the quality of the Internet traffic experienced by consumers. Unlike the USO in voice telephony, there is neither a geographic nor a social tariff aspect. When would such a measure be justified? Presumably this could occur if the proportion of prioritised traffic grew to a level such that the best-effort "lane" became smaller and services slower. If consumers' demand for Internet access to many CAPs were stronger than that for managed or prioritised services, this risk would seem small: ISPs would provide that access. QoS could coexist with prioritised and tiered arrangements but would ensure, if appropriate, that not all capacity was used for prioritised traffic.

QoS presents some important practical and implementation difficulties. For example, if the QoS level were set too high, consumers demanding low QoS may not be willing to pay and would exit the market (Hermalin & Katz, 2007). The difficulties in setting the correct QoS do not appear trivial and, hence, the risk of regulatory failure is not insignificant. Furthermore, the technical definition is unclear. While voice telephony was a simple and technologically stable service, it is unclear which service should be used as a reference for QoS for Internet traffic.
The answer would depend on whether it is the quality necessary to support, for example, high-quality online games or low-quality video. As services and applications, unlike voice telephony, change rapidly with technological evolution, QoS may have to be frequently adapted. Its verification and enforcement present equally complex problems. Presumably ISPs would be responsible for ensuring QoS. However, the quality of Internet traffic is not fully under the control of ISPs – e.g. if CAPs choose hosting services of poor quality.
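The attribution problem behind verifying a minimum QoS obligation can be sketched as follows. The segment breakdown, latency figures and threshold are all hypothetical; the sketch only illustrates why an end-to-end quality floor is hard to enforce against the ISP alone.

```python
# Hypothetical sketch: end-to-end latency aggregates path segments,
# only some of which the consumer's ISP actually controls.
QOS_LATENCY_CEILING_MS = 100  # assumed regulatory limit on latency (ms)

def end_to_end_latency(segments):
    """Total latency experienced by the consumer: sum over all segments."""
    return sum(segments.values())

def isp_compliant(segments, isp_keys, ceiling_ms=QOS_LATENCY_CEILING_MS):
    """Attributing a breach is the hard part: check whether the
    ISP-controlled portion of the path alone stays within the limit."""
    isp_part = sum(segments[key] for key in isp_keys)
    return isp_part <= ceiling_ms

# Invented figures: the breach originates in poor third-party hosting.
path = {"access": 30, "backhaul": 20, "core": 25, "hosting": 60}
print(end_to_end_latency(path))                     # 135: limit breached
print(isp_compliant(path, ["access", "backhaul"]))  # True: ISP segments fine
```

In this invented case the consumer's experience breaches the floor even though the ISP-controlled segments are well within it, which is the enforcement difficulty noted in the text.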

7. Do exclusionary concerns justify a per se prohibition?

Perhaps conceptually the simplest concern that NN may be designed to address is the risk of anti-competitive exclusion. ISPs with market power may have the ability and incentive to engage in anti-competitive practices that exclude or marginalise their rivals and lead to consumer harm. ISPs may either block access to, price discriminate against, or degrade the quality of access provided to rival CAPs. While the NN debate focuses on ISPs and CAPs, as in any other vertically organised market, market power could emerge at different levels of the vertical chain. The critical question is how best to intervene to prevent the harm that could arise if market power emerged and were used anti-competitively. Conceptually this could take two forms. ISPs could either defend the position of their existing content and application services from new entry, or they could attempt to leverage a position of market power in retail Internet access into the relevant markets for content and applications.17 If this were a substantial and recurrent concern requiring per se ex ante rules, either a zero price cap or a non-discrimination obligation could work, but it seems that the latter, in the form of a no price squeeze rule,18 may suffice to prevent such behaviour.

Consumer harm from exclusionary behaviour (including leveraging19) could arise when certain conditions are met (Crocioni, 2008). Although these are well known, it is critical to spell them out in order to better understand to what extent this type of behaviour raises strong concerns of consumer harm in the context of Internet access. These are cumulative conditions. First, the ISPs should have market power. Currently in Europe there is no indication that this is a serious concern, as no regulator has intervened either ex ante or ex post so far.
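The logic behind a no price squeeze rule of the kind mentioned above is usually formalised as an imputation (margin squeeze) test. A minimal sketch with invented numbers follows; the function name and all figures are hypothetical illustrations, not a regulator's actual methodology.

```python
# Hypothetical imputation test: the integrated firm's retail price, net of
# the access charge it levies on rivals, must cover its own downstream cost,
# so that an equally efficient rival could profitably match the retail price.
def passes_imputation_test(retail_price, access_charge, downstream_cost):
    """True if the margin left to an equally efficient rival paying the
    access charge is enough to cover the downstream cost of serving."""
    return retail_price - access_charge >= downstream_cost

# Invented numbers: retail broadband at 20, access charge at 12 or 17,
# downstream (retailing) cost of 5 per subscriber.
print(passes_imputation_test(20.0, 12.0, 5.0))  # True: margin of 8 covers 5
print(passes_imputation_test(20.0, 17.0, 5.0))  # False: margin of 3 < 5
```

The same arithmetic applies whether the squeezed input is wholesale network access or, as in the NN context, the fee an ISP charges rival CAPs for delivery relative to the terms enjoyed by its own content arm.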
While it is possible that some ISPs in any European country may acquire a position of retail market power, it is at least unclear how large this risk currently is.20 Competition between ISPs in Europe is largely the result of mandated wholesale access regulation21 reducing barriers to entry into the Internet

17 Note that while the NN debate has focused on exclusionary concerns such as market power leveraging by ISPs, the same behaviour could take place if CAPs had market power and attempted to leverage it into the ISP market (although this latter strategy may not be feasible or rational in the presence of wholesale regulation in the provision of Internet connectivity).
18 This abstracts from any practical implementation difficulties for such a rule.
19 There are a number of ways a firm could leverage its market power. First, ISPs could engage in refusal to supply. They could either refuse to supply access (including supplying only at higher prices) to certain CAPs which offer rival content, or provide lower quality access to rival content providers. Second, they could engage in tying and bundling – i.e. ISPs could effectively limit the ability of rival content providers to compete in the provision of content by bundling content and access services. The main concern raised in the debate on NN appears to relate principally, but not solely, to the former. For example, Economides and Tag (2009) argue that a vertically integrated ISP would have an incentive to disrupt or lower the quality of the services of its non-integrated CAP competitors in order to further its own content or applications. They argue that in order to discriminate against a competitor it would be sufficient for the ISP to set a fee high enough to effectively block profitable operation by the competitor.
20 There are no known examples of national regulators having successfully concluded a market review of the provision of Internet access at the retail level and imposed remedies.
21 This consists mainly of mandatory access in Wholesale Broadband Access (WBA) with mandatory bitstream access and Wholesale Local Access (WLA) with mandatory provision of Local Loop Unbundling (LLU). A high-level description of the situation in the UK illustrates the difference with the US. There are hundreds of ISPs offering retail packages to UK consumers. Cable provider Virgin Media covers about 50 per cent of UK homes, while PSTN incumbent British Telecom (BT) covers the whole country. All ISPs rely on mandatory access imposed on BT. The latter is under an obligation to provide LLU at cost across the whole country. In addition, in areas where LLU has not been successful BT must also provide bitstream access at cost. In order to tailor the access obligation to local competitive conditions the UK NRA, Ofcom, has divided the territory into three areas. In Market 1 (about 16 per cent of UK premises) there is no real


access retail market. The situation may be different in the US, where there is effectively a duopoly of cable and telephony providers. Second, the ISP should either be vertically integrated22 into the provision of content and applications or planning to do so in the near future. Currently, in Europe ISPs may provide some, but limited, content and application services in addition to Internet connectivity.23 Hence, it is possible that in some cases they may have some incentive to engage in exclusionary behaviour if they could profit from putting their content and platform rivals at a disadvantage. While this is a possible occurrence, it is currently unclear how large this risk may be. Third, even if the first two conditions are met, ISPs would not have an incentive to exclude competitors under all circumstances. It may not be profit maximising for the vertically integrated ISP to disrupt the quality of the services of its competitors to further its own products, even if it were a monopolist. This is because even a monopolist ISP may benefit from valuable complements, and it may be better off charging a higher price for Internet access instead of trying to force customers onto its own services. It may be instructive to compare the risk of exclusionary conduct under a traditional voice telephony service and under Internet access. An entrant providing voice telephony – a mature service – and gaining one subscriber would take away one sale from the vertically integrated incumbent. In essence, rivals' services are perfect substitutes for the incumbent's services.24 Unlike voice telephony, where there is effectively only one downstream service, an Internet connection provides access to a huge range of downstream services. An ISP normally provides only a handful of downstream services which are in direct competition with other downstream providers.
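The complements argument can be made concrete with a stylised calculation (my own illustration with hypothetical numbers, not a model from the article or the cited literature): a monopolist ISP compares the profit from carrying a valuable independent CAP, pricing access to capture the bundle's value, against the profit from excluding the CAP in favour of its own, less valued service.

```python
# Hedged one-monopoly-profit sketch with invented numbers: a monopolist ISP
# sells access; consumers also value an independent CAP's service.
# Compare (a) carrying the rival CAP and pricing access to capture the extra
# value vs (b) excluding the rival to push the ISP's own, inferior service.

ACCESS_VALUE = 10.0        # consumer value of bare connectivity
RIVAL_CAP_VALUE = 6.0      # extra value from the independent CAP's service
OWN_CAP_VALUE = 2.0        # extra value from the ISP's own (inferior) service

# (a) Carry the rival: the ISP can price access up to the full bundle value.
profit_carry = ACCESS_VALUE + RIVAL_CAP_VALUE

# (b) Exclude the rival: the bundle is worth less to the consumer, so the
# price the ISP can charge is lower even though it keeps both revenue streams.
profit_exclude = ACCESS_VALUE + OWN_CAP_VALUE

print(profit_carry, profit_exclude)  # carrying the valuable complement wins
```

Under these assumptions exclusion sacrifices more in access revenue than it gains in service revenue; the calculus changes only if something, such as a tightly regulated access price, prevents the ISP from capturing the complement's value.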
For many services, such as those provided by eBay, Amazon or Google, to mention a few, it seems unlikely that any ISP could compete by vertically integrating into service provision. This means that, unlike for telephony, direct competition between ISPs and most downstream CAPs is restricted to a few services. All other downstream services are better thought of as complements to the ISPs' connectivity and other services, rather than substitutes. This has important implications for the choice of remedy, should a concern be substantial enough to warrant intervention. There seems to be a strong case for ex post rather than ex ante intervention for the Internet as, unlike in the case of voice telephony, exclusionary concerns may only arise in a few circumstances. Ford and Stern (2010) stress that if NN is interpreted as a zero price cap rule, this may increase the exclusionary incentives of an ISP vertically integrated into downstream services. As the ISP would not raise any revenues by allowing access to downstream rivals, it may have an incentive to hamper those rivals. The only way to do that, given that it cannot act on its access price, is to degrade the quality of access supplied to downstream competitors. This concern is known as "sabotage" and arises when market power is curbed by tight – i.e. cost-based – upstream price regulation.25 A zero price cap obligation would be an even tighter form of regulation, potentially exacerbating this type of concern, which, however, may only emerge if strict NN obligations were imposed. Given that currently there do not seem to be significant risks or evidence that exclusionary behaviour could be an endemic feature of competition in retail Internet access markets in Europe, it does not seem appropriate to consider per se ex ante rules to prevent exclusionary behaviour by ISPs. Ex post rules seem likely to be a better choice (Cave & Crocioni, 2007).

8. Wider regulatory concerns?
In addition to the potentially exclusionary concerns, a number of "wider" regulatory concerns may arise in the absence of a NN obligation. The next sections focus on two. First, there is a theoretical, static welfare argument that if ISPs were free to price, this would lead to a "competitive bottleneck" outcome where CAPs are charged "too much" even in the absence of any exclusionary strategy. Second, the debate has also focused on dynamic concerns about the investment incentives of CAPs on one side and of ISPs/networks on the other.

8.1. Is there a risk of "competitive bottleneck" absent NN?

Internet access can be thought of as a two-sided market. ISPs act as the platforms allowing consumers to access CAPs' services on the Internet and, vice versa, allowing the latter to provide their services and applications to consumers. As in any two-sided market, the key factor is the presence of cross-group externalities – i.e. the demand of each side depends in part on the participation or usage of the other. The benefit that a consumer derives from accessing the Internet depends on the

(footnote 21 continued) prospect of wholesale competition and BT, being the only supplier, must provide bitstream access at cost. In Market 2 (about 14 per cent of UK premises) there are between 2 and 3 wholesale suppliers for each home. BT has an overall wholesale share of almost 70 per cent and must provide access at cost, but with more pricing freedom than in Market 1. Market 3 (about 70 per cent of UK premises) is the competitive part. There are more than 4 suppliers for each home and BT's wholesale share is less than 30 per cent. As a result there are no obligations on BT.
22 Or provide services that are strict or strong complements.
23 The situation may be different in the US.
Typically in the US (and to a much lesser extent in the UK and some European countries), cable operators are vertically integrated and hold rights to a significant amount of valuable (video) content, as the network is used to deliver television services as well as broadband.
24 While this provides incentives for entry deterrence, it does not do so under all circumstances, as the post-Chicago literature has clarified (Crocioni, 2008).
25 For a review of the literature and an application see Cave, Correa, and Crocioni (2006).


[Fig. 2. "Competitive bottleneck" or efficient price? The figure depicts the range of possible prices to CAPs absent a NN obligation: at the top, the "competitive bottleneck" price ("too high"); at the bottom, the zero price under a NN obligation ("too low"); in between, the efficient price, which takes into account cross-group externalities but does not allow for the exploitation of market power. Some factors could counter the "competitive bottleneck" outcome: Google, Amazon, etc. – and also niche content providers – may have bargaining power; CAPs may reverse the outcome with exclusive contracts; and consumers may not single-home (home, office, mobile, etc.).]

amount (and quality) of content and applications available – e.g. the number of websites and related content he or she can access. At the same time, the benefit for a firm of having a website or providing content or applications through the Internet depends on how many consumers can access the Internet – e.g. the penetration (and usage) of the Internet. The main implication is that when a platform sets the prices on each side it will take into account the cross-group externalities between the two sides in order to get the right balance of participation on both sides. This means that in some cases the prices charged to the two sides may be skewed. A commonly given example is that of Free To Air (FTA) broadcasting. It could be argued that advertisers place significant value on additional viewers/listeners, while the latter do not particularly value – and perhaps negatively value – additional advertising. This is reflected in the relative prices, as advertisers contribute most, if not all, of the cost of programmes while viewers or listeners need to be "bribed" to watch or listen. There is general agreement that this is an efficient way to set prices in a two-sided market (Rochet & Tirole, 2006). There is, however, a particular type of two-sided market (known as the "competitive bottleneck") for which the economic literature has identified a potential market failure (Armstrong, 2006). The "competitive bottleneck" refers to a situation where customers on one side multi-home – i.e. subscribe to more than one platform – while those on the other side single-home – i.e. subscribe to one platform. In this situation, each platform has market power in providing access to its own single-homing customers.26 This arises because each platform controls access to each of its own members or subscribers, and customers on the other side have no choice but to deal with that provider.
Market failure manifests itself in a distorted price structure (even if and when all profits made on one side are dissipated on the other),27 which cannot be corrected by increasing the degree of competition (e.g. via an increase in the number of competitors).28 In other words, one side – the multi-homing one – is charged "too much". Internet access may be described as a "competitive bottleneck" market. Consumers could be thought of as subscribing to only one ISP (i.e. they single-home) while CAPs are available to subscribers of all ISPs (i.e. they multi-home). Therefore, the basic theory, if corroborated by evidence, would predict that each ISP is likely to have market power in providing CAPs with access to its single-homing customers. As a result, ISPs may charge CAPs "too much". This discussion suggests that the efficient access price for CAPs may lie between two extremes. Fig. 2 illustrates this. Start from the world envisaged by the extreme view of NN, that of a zero price cap. Is that likely to be efficient? In order for a zero price to CAPs to be efficient, one would need to be satisfied that certain conditions were met. One possible justification of such a price structure is that consumers value additional CAPs significantly more than the latter value consumers. In other words, CAPs should be "bribed" (via a zero price) to participate because this is significantly valued by consumers. This is exactly the opposite of what is believed to be the case for FTA broadcasting. However, the Internet, like broadcasting, is largely supported by revenues from advertisers. Therefore, zero seems unlikely to be an efficient price to CAPs under most circumstances.
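The skewed price structures that arise in two-sided markets can be mimicked with a toy numerical model (a hedged sketch with invented linear demands, not drawn from the article or the cited literature): a monopoly platform chooses a price for each side when one side (here labelled "advertisers", as in the FTA example) values extra participation on the other side far more than vice versa.

```python
# Toy two-sided pricing sketch (hypothetical parameters): a monopoly platform
# sets prices (p_c, p_a) for consumers and advertisers. Each side's
# participation rises with the other side's participation (cross-group
# externalities), approximated by a fixed-point iteration.

def participation(p_own, other_side, value=10.0, externality=0.0):
    """Linear demand: share joining, given own price and the other side's size."""
    q = (value + externality * other_side - p_own) / value
    return min(max(q, 0.0), 1.0)

def equilibrium(p_c, p_a, ext_c=1.0, ext_a=8.0):
    """Iterate to a participation fixed point for given prices.
    ext_a >> ext_c mimics FTA broadcasting: advertisers value extra
    consumers far more than consumers value extra adverts."""
    n_c = n_a = 0.5
    for _ in range(60):
        n_c = participation(p_c, n_a, externality=ext_c)
        n_a = participation(p_a, n_c, externality=ext_a)
    return n_c, n_a

def best_prices():
    """Grid search for the profit-maximising price pair."""
    best = (float("-inf"), None)
    grid = [x / 2.0 for x in range(-10, 41)]  # allow negative ("bribe") prices
    for p_c in grid:
        for p_a in grid:
            n_c, n_a = equilibrium(p_c, p_a)
            profit = p_c * n_c + p_a * n_a
            if profit > best[0]:
                best = (profit, (p_c, p_a))
    return best[1]

p_c, p_a = best_prices()
# The side conferring the larger externality (consumers) is charged less.
print(p_c, p_a)
```

In this parameterisation the search settles on a low consumer price and a high advertiser price, reproducing the FTA-style skew; rebalancing the externality parameters rebalances the two prices.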

26 This market power would arise in a narrow wholesale market, which could be termed that for the termination of Internet traffic. This could, but may not, be different from the market that would be relevant to assess leveraging concerns.
27 This is often referred to as the waterbed effect. It describes the situation where any extra profits made on the multi-homing side are competed away to attract single-homing consumers on the other side. The waterbed effect could be either complete – i.e. any profit generated from content providers is competed away on the consumers' side – or incomplete – i.e. only some of the profits are competed away.
28 "Competitive" in the "competitive bottleneck" term refers to the fact that this market power – i.e. the bottleneck – exists irrespective of the number of competing platforms.


Suppose now that ISPs had the freedom to charge CAPs. The risk is that a "competitive bottleneck" may emerge and ISPs may charge "too much". This too would be unlikely to be efficient. An efficient level of charges for CAPs is likely to sit in between these two extremes. That price should factor in the relative sizes of the cross-group externalities but not allow for the exploitation of the "bottleneck monopoly" market power. The key question is where the market would be heading absent regulatory intervention. The "competitive bottleneck" case rests on specific assumptions which may not hold here. First, CAPs particularly valued by consumers (e.g. Google, Yahoo, Amazon, YouTube, etc.)29 could exert bargaining power vis-a-vis ISPs. This is particularly likely if the degree of competition between ISPs is high. If consumers are loyal to CAPs and able to switch easily among ISPs, there will be an incentive for ISPs to price competitively on the CAPs' side of the market so as to retain the content required to attract and retain subscribers. Second, in an unrestrained market CAPs may reverse any "competitive bottleneck" outcome by committing to exclusive contracts. Providers of content highly valued by consumers may agree to provide it exclusively to one ISP in return for some "compensation", rather than having to pay "too much" to the ISPs for access. There is no sign of that yet, but there are examples in other markets. For example, in the UK the mobile operator O2 paid for the exclusive right to sell Apple's iPhone. Yet this could be thought of as a "competitive bottleneck" setting: consumers largely subscribe to one mobile operator while handsets are generally available to customers of each mobile network. All these factors30 may limit the ability of each ISP to benefit from its potential "bottleneck monopoly" position in relation to access to its own single-homing customers by charging "too much" to CAPs.
Therefore, the "competitive bottleneck" outcome is likely to be an extreme one and relies on CAPs not having valuable content and/or bargaining power. If this is not the case, it is likely that CAPs will face lower prices than predicted by the simple "competitive bottleneck" story. Moreover, while currently there are no explicit restrictions, other than perhaps those due to historical custom, there is no sign of ISPs charging, let alone overcharging, CAPs. This runs counter to the prediction of the "competitive bottleneck" theory. To sum up, neither a zero price nor a price that would emerge from a "competitive bottleneck" situation is likely to be optimal. If a zero price was inefficient and, absent NN restrictions, ISPs started charging CAPs, this could lead to a rebalancing of the relative prices charged to the two sides (Rochet & Tirole, 2006). Under a zero price, access and content providers may pay "too little" and consumers "too much". If rebalancing was efficient, it could lead to lower prices for consumers and increase Internet take-up. Free Internet packages could emerge, where consumers accept being targeted by more adverts, as in the FTA broadcasting model. These considerations are very relevant for any European regulator seeking to justify the imposition of ex ante remedies on ISPs.31

8.2. Investment incentives

So far, the discussion has focused on static efficiency. However, both sides of the debate have claimed that there could be important effects on dynamic efficiency. CAPs have claimed that if ISPs were allowed to charge them, the ISPs would extract all the rents and, hence, negatively affect CAPs' investment and innovation incentives. As sometimes put, this would harm the emergence of the "next Google". Implicit in this argument is the claim that it is CAPs, and not ISPs, who generate most of the value to consumers. Conversely, ISPs and/or network operators have claimed that if they were not allowed to charge, CAPs would "free ride" on their investment.
The dynamic efficiency argument against ISPs' charging is assessed first. Although the intuition seems straightforward, it has also been the focus of some academic work (Choi & Kim, 2008; Musacchio, Schwartz, & Walrand, 2009) arguing that the investment incentives of CAPs are inversely related to the ISPs' ability to extract rent (through higher prices) from prioritisation. The apparent intuition is simple. As the ISPs' access charges to deliver content and applications increase, the profits of the CAPs decline and with them their incentives to invest and innovate. Therefore, the argument goes, absent a NN constraint (i.e. a zero price cap), ISPs' charging could reduce innovation in content or at the "edge of the network". However, absent market power of ISPs – i.e. unless the "competitive bottleneck" argument were considered valid – this argument does not appear very convincing, for a number of reasons. First, if a firm has a valuable service proposition, one would expect it to still be able to obtain financing (absent imperfections in capital markets). In the case of CAPs this includes obtaining funds to pay for access and perhaps prioritisation. A start-up firm could launch and establish itself in the "best-effort" tier before moving on to prioritised access. Second, the concern about reducing CAPs' investment may only be reasonable if one assumed that the greater the amount of content the better. However, as argued above, it appears unlikely that a zero price cap rule is efficient. This raises the possibility that a zero price cap rule may lead to "too much" investment in content and applications. Third, as argued above, absent a NN obligation it is likely that ISPs will offer prioritised access for a fee and interruptible or "best-effort" access for free. To the extent that consumers demand a particular service and have a preference for prioritised access, CAPs will be able to offer a better quality service for which

29 Even small or niche content providers may have some bargaining power if some users particularly value their content.
30 Furthermore, while the assumption that consumers single-home appears to describe the subscription decision well – i.e. a consumer needs to subscribe to only one ISP in order to obtain access to the Internet – consumers often have access to the Internet from home and separately from work. They may also have access to the Internet via a fixed and a mobile line. Hence, if at least a proportion of consumers in fact multi-homed, it could be argued that the market power that ISPs may have under the "competitive bottleneck" situation would be reduced.
31 One way to do so may be to pass the three criteria test mentioned in footnote 7, define a market such as "data termination" on each ISP and find evidence of SMP.


consumers or advertisers (depending on the business model) would be willing to pay. Fourth, and this is critical, delay-sensitive services such as online games and telemedicine will never emerge if priority access cannot be offered. Their incentives to invest are not well served by a "best effort" only regime. Overall, therefore, the concerns about the investment incentives of CAPs do not appear particularly convincing.32 In addition, and separately from this, there are the concerns about the need to manage congestion, which would require rationing demand. This leads the discussion to the "free riding" argument made by ISPs and/or network providers. The description is in itself a simplification, given that ISPs exchange traffic on a bill-and-keep (B&K) basis, with payments if traffic is unbalanced. The argument is rather whether the balance of payments should shift in favour of "retail" ISPs. A static argument has already been discussed in the previous section. In a two-sided market framework it is likely to be efficient to recover costs from both sides, although there may be a risk that "too much" is recovered from the CAPs' side. In terms of dynamic incentives, it is perhaps useful to assume first that there is no excess demand and consider the incentives of the ISPs and/or networks if demand increased to the point that new investment was required. To continue the road traffic analogy, suppose that a bridge operator is not allowed to charge the cars and lorries that go through it (it may get revenues from other sources – e.g. state contributions). Without a pricing mechanism to ration demand, congestion could arise. In order to satisfy demand, the bridge operator would need to increase capacity, but the only way to do so is by a lumpy investment – i.e. building a second bridge. The latter will initially be under-utilised until traffic increases again to the point that a third bridge may have to be built.
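The bridge arithmetic can be sketched with deliberately invented numbers (a hedged illustration, not data from the article): under a zero price every trip is served, so the lumpy second-bridge investment is triggered as soon as demand outgrows one bridge, whereas a congestion charge that prices off the lowest-value trips defers the same investment by several years.

```python
# Hedged toy sketch (hypothetical numbers): a bridge with lumpy capacity
# faces demand growing each year. Under a zero price every trip is served,
# so a second bridge is needed as soon as demand exceeds capacity. With a
# congestion charge, low-value trips are priced off and the same lumpy
# investment can be deferred.

CAPACITY = 100   # trips one bridge can carry per day

def demand(price, year):
    """Linear demand, growing 20% per year; the price rations low-value trips."""
    potential = 100 * (1.2 ** year)
    willingness = max(0.0, 1.0 - price / 10.0)  # share of trips worth the charge
    return potential * willingness

def first_build_year(price, horizon=15):
    """First year in which demand at this price exceeds one bridge's capacity."""
    for year in range(horizon):
        if demand(price, year) > CAPACITY:
            return year
    return None

free_build = first_build_year(price=0.0)    # zero price: build almost at once
priced_build = first_build_year(price=5.0)  # a charge halving trips taken
print(free_build, priced_build)
```

With these parameters the priced regime defers the second bridge by three years relative to the zero-price regime, during which the charge also signals whether the extra capacity is actually valued.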
Suppose now that the firm operating the bridge was allowed to charge the cars and lorries that go over it. By rationing demand to those who value passing over the bridge the most, it would be able to delay the investment in the second bridge until substantial demand arose. Increased investment in capacity which is then under-utilised is unlikely to be efficient. Similarly, for the Internet it may be more efficient to delay the capacity investment, but with a strict NN obligation (i.e. a zero price cap) such a delay would not be possible. Thus, the ability to charge CAPs may prevent over-investment in capacity, as imposing a NN obligation would remove possibly the most efficient mechanism for ISPs to ration demand. Put simply (Cave et al., 2009), it would be inefficient to respond to congestion by investing to double capacity when prioritising traffic could improve consumers' welfare (allowing delay-sensitive services to emerge) at a fraction of the investment. A perhaps more succinct and clearer description of the issue comes from a spokesman for the UK mobile operator O2, commenting on its recent decision to introduce data usage caps: "we can't be building a six-lane highway every time the traffic increases – which it is, doubling every four months" (Meyer, 2010).

9. Conclusions

This article has examined the NN debate, mainly from a European viewpoint. It has attempted to provide an economic framework under which to examine the arguments put forward by both sides. This has been, so far, a confused debate, starting from a remedy and only later considering what problem it may be for. Bringing it back to a more traditional regulatory framework clarifies its scope and boundaries. From an economic regulation perspective, this debate is better cast in terms of assessing the benefits of traffic management and the risks that ISPs may engage in behaviour that is harmful to consumers.
There are clear benefits from traffic management; hence, the question is whether the risks of harmful behaviour outweigh those benefits. In essence, if one assumed strong competition between ISPs and informed consumers, it would seem difficult to conclude that there is currently a need for the type of ex ante remedies proposed by NN advocates. This is not to say that potential concerns may never emerge in the provision of Internet access services by ISPs. Indeed, exclusionary behaviour by ISPs with market power could be possible. However, in the light of the available evidence and the main facts, it seems currently unlikely that ex ante regulation could be justified. Equally, concerns about the emergence of a "competitive bottleneck" seem remote today, as the prediction of high access prices for CAPs is not borne out: ISPs are not charging CAPs positive prices at all. Therefore, it seems currently inappropriate, at least in Europe, to argue in favour of ex ante restrictions on the ISPs' ability to manage traffic, including pricing and their ability to differentiate their access services to CAPs. This leaves ex post intervention under competition law to deal with exclusionary concerns.

References

Armstrong, M. (2006). Competition in two-sided markets. RAND Journal of Economics, 37(3), 668–691.
Cain Miller, C., & Stone, B. (2010, June 6). App makers worry as data plans are capped. The New York Times. Retrieved from http://www.nytimes.com.
Cave, M., van Eijk, N., Prosperetti, L., Collins, R., de Streel, A., Larouche, P., et al. (2009). Statement by European academics on the inappropriateness of imposing increased internet regulation in the EU. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1329926.
Cave, M., Correa, L., & Crocioni, P. (2006). Regulating for non-price discrimination: The case of UK telecoms. Competition and Regulation in Network Industries, 1(3), 391–415.
Cave, M., & Crocioni, P. (2007). Does Europe need net neutrality rules?
International Journal of Communications, 1, 669–679.
Choi, J. P., & Kim, B. C. (2008). Net neutrality and investment incentives. NET Institute Working Paper No. 08-03. Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1285639.

32 Pricing could potentially have a positive impact on the CAPs' choices of content and application formats and technologies. With pricing they would have stronger incentives to ensure that their services use an efficient amount of capacity. Under a "best effort" no-fee regime, CAPs would have weaker incentives to choose data- or capacity-efficient services and applications.


Crocioni, P. (2008). Leveraging of market power in emerging markets: A review of cases, literature and a suggested framework. Journal of Competition Law & Economics, 4(2), 449–534.
Economides, N., & Tag, J. (2009). Net neutrality on the Internet: A two-sided market analysis. Retrieved from http://www.stern.nyu.edu/networks/Economides_Tag_Net_Neutrality.pdf.
Faulhaber, G. R. (2010). Transparency and broadband Internet service providers. International Journal of Communication, 4, 738–757.
Ford, G., & Stern, M. (2010). Sabotaging content competition: Do proposed net neutrality regulations promote exclusion? Retrieved from http://ssrn.com/abstract=1576565.
Hahn, R. W., & Singer, H. (2008). A two-sided market analysis of net neutrality regulation: Estimating price effects for broadband subscribers. Retrieved from http://www.imaginar.org/its2008/343.pdf.
Hermalin, B. E., & Katz, M. L. (2007). The economics of product-line restrictions with an application to the network neutrality debate. Information Economics and Policy, 19, 215–248.
Meyer, D. (2010, June 10). O2 drops unlimited mobile data allowance. ZDNet. Retrieved from http://www.zdnet.co.uk.
Musacchio, J., Schwartz, G., & Walrand, J. (2009). A two-sided market analysis of provider investment incentives with an application to the net-neutrality issue. Review of Network Economics, 8(1), 22–39. Retrieved from http://www.bepress.com/cgi/viewcontent.cgi?article=1168&context=rne.
Rochet, J. C., & Tirole, J. (2006). Two-sided markets: A progress report. RAND Journal of Economics, 37, 645–667.
Schuett, F. (2010). Network neutrality: A survey of the economic literature. Review of Network Economics, 9(2). doi:10.2202/1446-9022.1224.
Weisman, D., & Kulick, R. B. (2010). Price discrimination, two-sided markets, and net neutrality regulation. Retrieved from http://ssrn.com/abstract=1582972.
Yoo, C. S. (2010). Network neutrality or Internet innovation? Regulation, 33(1), 22–29.