Battle for bandwidth as P2P goes mainstream

Can the ISPs bear the peer-to-peer explosion?

CHRIS JACKSON/GETTY

Technology

P2P networks are devouring the internet’s bandwidth, slamming service providers in the process

ANIL ANANTHASWAMY

WHEN Microsoft released the eagerly awaited Xbox 360 game Halo 3 last month, fans waited through the night outside stores to get their hands on the first copies. How much more convenient it would have been if the game had simply arrived on their computers as soon as it was released. “If we had a delivery service, we could deliver the content electronically and maybe offer a discount,” says Jin Li of Microsoft Research in Redmond, Washington. Unfortunately that wasn’t possible: Microsoft’s connections to the internet would have been overwhelmed had it needed to send out more than a million copies of the game. That could soon change if the company decides to deliver games using a peer-to-peer (P2P) delivery system, which alleviates such bandwidth burdens. While the workings of P2P systems differ between applications, it could go something like this: instead of every customer downloading the game directly from Microsoft’s servers, the software would first be distributed to a small number of computers. These “seed” computers would transmit the game to purchasers, who would

in turn pass the game to other purchasers, or peers, all in a legal and accountable manner. Microsoft itself would need far less bandwidth to deliver the software in this way than if everyone connected directly to its servers.

Microsoft is not alone. P2P networks were first popularised as the technology behind the music file-sharing network Napster. They now look to be the future of high-quality content delivery. Warner Brothers in the US is using the BitTorrent P2P system to deliver video, the Canadian Broadcasting Corporation (CBC) is banking on P2P software to deliver live TV, and universities are building P2P systems to boost robustness at the core of the internet (see “Spread it around”, right). But while P2P applications remove the data bottleneck for the organisation that originates the content, the surge in data exchange between ordinary users’ computers is consuming huge swathes of internet bandwidth (see Graph). The business models of the internet service providers (ISPs) that supply that bandwidth have yet to account for this growth in use. Feeling the pinch, some are fighting back, and the way this plays out will determine whether P2P can realise its potential in delivering high-quality video and software directly to our PCs.

“Once you actually start using P2P networks, you break the business model of the ISPs”

For most of us, most of the time, the internet operates

according to a “client-server” model. Each time you want to download a web page, for example, an individual copy of that page is sent from the web server to your computer. This has worked well for reading news, accessing email, listening to radio and even viewing low-quality

video, since these applications require relatively small amounts of data. But as the internet gears up to deliver high-quality video and television, the client-server model is beginning to creak. Take the problem faced by the CBC. To upload content to users it has to buy bandwidth, which can cost about $150,000 per year for a 45 megabits-per-second “pipe”. Under a client-server model, this could stream high-quality video to at most 60 computers simultaneously, so servicing the CBC’s 6 million customers would be prohibitively expensive, not to mention technically challenging. Like the researchers at Microsoft, to get around these problems, Mohamed Hefeeda and colleagues at Simon Fraser

University in Burnaby, British Columbia, Canada, are building a P2P network for the CBC. Great news for the broadcaster, but what about the ISPs that transport the content between peers?

Home computer users with broadband connections typically buy their bandwidth from an ISP at a monthly flat rate. That connection tends to lie unused most of the time: “The internet service providers are counting on that,” says Dan Wallach, a P2P expert at Rice University in Houston, Texas. In contrast, P2P networks are designed to squeeze every last drop of the network bandwidth available to them. “Once you actually start using P2P networks, you break the business model of the ISPs,” says Wallach.

So what options do ISPs have? Metering bandwidth and charging users who exceed a certain limit is one, but it is unpopular with customers, who prefer to pay a flat rate. To conserve its bandwidth, an ISP can also choose to cut off a customer who generates levels of P2P-like traffic that exceed the fair-use conditions of a home broadband connection. Or it can restrict the bandwidth available to that user – an action euphemistically called “traffic shaping”.

The P2P software developers aren’t taking this sitting down, of course. “There is sort of an arms race going on,” says Wallach. It used to be that P2P applications were easy to detect, because they always used a particular “port” on the computer to communicate. Later, P2P software developers got smarter, constantly shifting the ports they used. In response, Sandvine, Ipoque and other companies that produce programs to detect P2P activity have resorted to something called “deep packet inspection”, a technique that allows them to examine the contents of internet data packets and determine whether they belong to a P2P application. In retaliation, P2P software developers are trying to ensure that the data they send does not show any characteristic patterns, a technique called protocol obfuscation. To make P2P packets even harder to detect, several P2P software providers, including BitTorrent and eDonkey, have more recently moved to total encryption of the packets, according to Mochalski. “It’s a cat-and-mouse game,” he says.
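That arms race can be sketched in code. The toy classifier below is purely illustrative: the port numbers, payload pattern and packet format are invented for the example and are not Sandvine’s or Ipoque’s actual methods. It shows the three rounds described above: port-based detection, port shifting defeating it, pattern-based deep packet inspection catching a plaintext handshake, and encryption defeating the pattern match in turn.

```python
# Toy illustration of the P2P detection arms race.
# Ports, patterns and packets here are invented for the example.

KNOWN_P2P_PORTS = {6881, 6882, 6883}             # classic BitTorrent range
P2P_PAYLOAD_PATTERNS = [b"BitTorrent protocol"]  # plaintext handshake marker

def detect_by_port(packet):
    """First-generation detection: flag traffic on well-known P2P ports."""
    return packet["port"] in KNOWN_P2P_PORTS

def detect_by_dpi(packet):
    """Deep packet inspection: look for protocol markers in the payload."""
    return any(pat in packet["payload"] for pat in P2P_PAYLOAD_PATTERNS)

# Round 1: a client on a standard port is caught by the port check.
p1 = {"port": 6881, "payload": b"\x13BitTorrent protocol..."}
assert detect_by_port(p1)

# Round 2: the client shifts to an arbitrary port; the port check fails,
# but DPI still spots the plaintext handshake in the payload.
p2 = {"port": 40123, "payload": b"\x13BitTorrent protocol..."}
assert not detect_by_port(p2) and detect_by_dpi(p2)

# Round 3: the payload is encrypted (a trivial XOR stands in for real
# crypto), so it shows no characteristic pattern and DPI draws a blank.
p3 = {"port": 40123, "payload": bytes(b ^ 0x5A for b in p2["payload"])}
assert not detect_by_port(p3) and not detect_by_dpi(p3)
```

Real DPI boxes use far richer signatures and flow statistics than a single substring match, but the structure of the cat-and-mouse game is the same.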

SPREAD IT AROUND

P2P technology won’t just deliver high-quality video, says Dan Wallach at Rice University in Houston, Texas. “As P2P techniques mature, you are going to see them used to implement core internet services, or any of a variety of plumbing issues that make the internet go.” One core service is the domain name system (DNS), the internet’s “phonebook”. It allows computers to find websites by translating user-friendly addresses like www.newscientist.com into computer addresses such as 81.144.183.95. Currently these translations are stored by internet service providers on relatively few servers.

Now, researchers at Princeton University have developed CoDNS, which stores copies of the translations on PCs linked in a P2P network. If a server is slow, the network uses these copies to find sites. CoDNS might also be more robust than relying on servers. When hurricane Isabel hit North Carolina in 2003, DNS servers at Duke University went down, preventing the university’s computers from accessing the internet, except for a few that were running CoDNS.
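The fallback idea behind CoDNS can be sketched as follows. This is not Princeton’s actual code or API; the resolver, peer caches, hostnames and failure condition are all simulated for the example. The principle is simply: ask the local server first, and fall back to translations cached by peers when the server fails.

```python
# Toy sketch of a CoDNS-style lookup: use the local DNS server when it
# works, otherwise fall back to translations cached by P2P neighbours.
# The server, peers and addresses are all simulated for the example.

PEER_CACHES = [
    {"www.newscientist.com": "81.144.183.95"},  # address from the article
    {"example.org": "192.0.2.1"},               # RFC 5737 documentation IP
]

def local_server_lookup(name, server_up):
    """Stand-in for a real DNS query; raises when the server is down."""
    if not server_up:
        raise TimeoutError("local DNS server unreachable")
    return {"www.newscientist.com": "81.144.183.95"}.get(name)

def codns_style_lookup(name, server_up=True):
    """Try the local server first; on failure, ask peer caches."""
    try:
        address = local_server_lookup(name, server_up)
        if address is not None:
            return address
    except TimeoutError:
        pass
    for cache in PEER_CACHES:  # fallback: copies held by peers
        if name in cache:
            return cache[name]
    return None

# Resolves normally, and still resolves when the local server is down.
assert codns_style_lookup("www.newscientist.com", server_up=True) == "81.144.183.95"
assert codns_style_lookup("www.newscientist.com", server_up=False) == "81.144.183.95"
```

The real system also has to keep peer caches fresh and decide when a server is merely “slow”, which this sketch ignores.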

[Graph: “Guzzling the internet” – P2P traffic is eating up a growing proportion of internet bandwidth. Percentage of internet traffic by type (email, FTP, web browsing, P2P), 1993 to 2006. Source: CacheLogic Research]

While the battle rages, researchers are working on ways to minimise P2P networks’ impact on ISPs. Hefeeda’s team, for example, is looking to cut the financial burden on ISPs. Peer-to-peer traffic mostly shuttles between a multitude of ISP networks, and such transfers cost the ISPs money. By developing “location-aware” P2P applications that allow computers to recognise which network they are on – and instruct them to share data within their own network as much as possible – Hefeeda hopes to lessen the cost to ISPs. “We try to find the local sender closest to me, not just within the network, but also with the least number of links,” he says, to reduce the load on the internet to a minimum.
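The location-aware idea – prefer a sender on your own network, and among those the one reachable over the fewest links – can be sketched like this. The peer list, network names and hop counts are made up for the illustration; this is not Hefeeda’s actual algorithm or code.

```python
# Toy location-aware peer selection: prefer peers inside our own ISP's
# network, and break ties by the smallest number of links (hops).
# Peers, networks and hop counts are invented for the illustration.

def pick_sender(peers, my_network):
    """Choose the peer to download from: same network first, fewest hops."""
    # Sort key: (0 if on our network else 1, hops) -> local, short paths win.
    return min(peers, key=lambda p: (p["network"] != my_network, p["hops"]))

peers = [
    {"name": "A", "network": "isp-1", "hops": 4},
    {"name": "B", "network": "isp-2", "hops": 1},  # closer, but off-network
    {"name": "C", "network": "isp-1", "hops": 2},
]

# A customer of isp-1 downloads from C: it is on the same ISP's network
# and only 2 hops away, so the transfer never crosses into another
# provider's network and costs the ISP nothing in transit fees.
assert pick_sender(peers, "isp-1")["name"] == "C"
```

A real client would also weigh upload capacity and which pieces of the file each peer holds; locality is just one term in the choice.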


Li is researching another type of location-aware P2P network. Instead of keeping traffic within an ISP’s network, his P2P solutions allow computers to exchange data only within a home or office network. Once a computer within a network has downloaded a software update or game from Microsoft, say, it passes it only to computers on its own network, bypassing the ISP entirely.

There should be a way to keep everyone happy. Content providers want P2P because they can use it to deliver high-quality content and charge extra for it. With such burgeoning demand for their services, ISPs will inevitably profit from the bandwidth explosion – the devil lies in figuring out how to charge for it, and how much. ●

13 October 2007 | NewScientist
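The arithmetic behind the CBC example earlier shows why content providers are so keen. A rough back-of-envelope calculation based on the article’s own figures (the per-stream bit rate is inferred from 60 streams over a 45 megabits-per-second pipe; real provisioning would differ):

```python
# Back-of-envelope: client-server streaming cost for the CBC example.
# Figures from the article: a 45 Mbps pipe costs about $150,000 per year
# and can feed roughly 60 simultaneous high-quality streams.

PIPE_MBPS = 45
PIPE_COST_PER_YEAR = 150_000
STREAMS_PER_PIPE = 60
CUSTOMERS = 6_000_000

stream_mbps = PIPE_MBPS / STREAMS_PER_PIPE    # 0.75 Mbps per stream
pipes_needed = CUSTOMERS // STREAMS_PER_PIPE  # 100,000 pipes
total_mbps = CUSTOMERS * stream_mbps          # 4.5 million Mbps (4.5 Tbps)
cost = pipes_needed * PIPE_COST_PER_YEAR      # $15 billion per year

assert stream_mbps == 0.75
assert pipes_needed == 100_000
assert cost == 15_000_000_000

# With P2P, most of that traffic is carried between the peers themselves,
# so the broadcaster needs only enough bandwidth to feed the seeds.
```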


Call to arms

P2P data now accounts for 60 per cent of daytime internet traffic and 90 per cent at night, according to Klaus Mochalski of German internet-traffic management firm Ipoque, so it is a serious problem for ISPs. Mobile bandwidth providers are especially concerned, he says, because their networks are smaller.