The death of the (open) Internet

A leading US telecommunications company has once again predicted the death of the Internet. But could there be an ulterior motive at play here, rather than the plain truth being told?

by Michael Smith

The Internet will, so we are told by Jim Cicconi, VP of legislative affairs for US telecoms giant AT&T, reach its maximum data capacity by the year 2010 and will then grind, more or less, to a halt.

“The online content surge is at the centre of the most dramatic changes affecting the Internet today,” said Cicconi at the Westminster eForum on Web 2.0 in London in May 2008. He added, “Eight hours of video are loaded onto YouTube every minute. Very soon everything will become High Definition [HD], and HD is seven to 10 times more bandwidth hungry than a typical video. Video will be 80% of all traffic by 2010, up from 30% today.”

Experts agree that, as with any form of infrastructure, investment in the Internet must increase in line with demand. But some are now questioning the motives of telecommunications giants such as AT&T in predicting the ‘death of the Internet’, and such questioning is, in my view, hardly surprising.

AT&T and fellow US telecoms company Comcast are vocal opponents of proposed legislation protecting ‘net neutrality’, which would bar them from charging content providers a premium for preferential traffic routing and improved service quality – a potentially lucrative business. Charging in this way contravenes guidelines from the Federal Communications Commission (FCC), and the pressure to back those guidelines up in law is mounting.

Hence one can well understand why they keep predicting the end of the Internet as we know it. They are looking for a way to make even more money than they already do from the users of the World Wide Web. It is corporate greed, yet again, and in pursuit of it they would rather make the Internet a closed affair, to some extent at least, than actually improve the service so that people could use and enjoy the Web more.

If AT&T’s predictions are believed, they might give US senators an incentive to oppose such legislation: if the Internet really is approaching capacity, it makes economic sense to allow big business right of way on the network, especially if they are prepared to pay for it.

But with regards to the specific prediction that the Internet will reach capacity by 2010, the corroborating evidence comes solely from a study published in 2007 by Nemertes Research and backed by an organisation called the Internet Innovation Alliance (IIA). The IIA is a telecommunications lobbying group that warns of a coming ‘exaflood’ – a catastrophic explosion of data that kills the Internet – and whose members include none other than AT&T.

Jan Dawson of telecommunications analyst company Ovum believes AT&T’s comments relate to its position on net neutrality

The underlying point of these remarks is sound – massive investment in the network capacity of the Internet will be needed in coming years. But AT&T is in the minority in suggesting that a new business model is required to fund it all.

It’s worth noting that Cicconi is responsible at AT&T for regulatory issues, not network investment, making it more likely that this is further positioning in the net neutrality debate.

Michael Holloway of the Open Rights Group believes that technology developments will pre-empt the death of the Internet

Doomsayers periodically claim the Net is reaching maximum capacity. This is a poor argument for network filtering or preferential traffic routing because, as has historically been the case, new technologies will ensure capacity stays ahead of traffic growth. History also shows that network monopolists such as AT&T are uncomfortable with the disruption and financial cost of keeping the Net up to speed.

At all costs we who are interested in the Internet as users, for whatever purpose we may use it, must ensure that it remains free – at least as free, limited as that freedom is, as it is today. Freer still would be better. The only way to achieve that, I think, is through greater use of Open Source software (and hardware).

When I say above that the freedom of the Net, and of its use, is to a degree limited, we must recognise that the biggest switches and routers are not owned by the telecoms companies but belong to the US military, which, regardless of what some might like to claim, in reality owns – or at least once owned – the Internet. It was military computer communications networks that laid the foundation of the Internet as we now know it. So when Internet traffic slows to a trickle and behaves like thick molasses, we should check what is going on in the field of military or national (US) security operations. Nine times out of ten, when the Web slows to a crawl, about half of all capacity has been taken over by the military and the security agencies.

However, even allowing for that, the Internet is presently relatively free. If the likes of AT&T and Comcast, and others of their ilk, get their way – with a nod and a wink towards improving security, protecting children and the like – we will see a restricted Internet; one that is no longer free for all to use.

Already we are seeing attempts to strip bloggers of the protection that some countries grant them as “ordinary” journalists, and to allow governments and other agencies to remove blogs whose political content they dislike. I am not talking about China here, or even Russia, but about the United States and other so-called free and democratic countries. The powers that be are scared of the blogger, of the citizen journalist.

The Internet must remain free, and we must ensure that it stays that way.

© M Smith (Veshengro), September 2008