The upside to the 250GB Comcast cap – If we fight for it!

Listen to this post:

Comcast’s 250 GB Cap

Depending on how up to date you are on your tech news, you may or may not be aware that starting October 1st, Comcast is implementing a set 250 GB/month bandwidth cap. ArsTechnica has a great breakdown with background history, which you can read here to get caught up. While this announcement comes from Comcast, it is representative of troubling behavior and policies which have secretly been in place for years across a number of major ISPs. If you’re a regular reader, you’re probably aware of my issues with my own provider: Cox Communications. Regardless of which ISP you’re on, if you’re in the US you will be affected eventually.

I’m a firm believer that unlimited, or at the very least extremely high, daily bandwidth caps are a hugely important factor in the future of the US as a competitive world player. For a more in-depth look at how broadband is handled abroad and my thoughts on the economic impact, please see my past Part 1 and Part 2 posts on the Technological (Digital) Revolution, available here and here. I would also like to take a moment to be very clear: while I am pointing out a potential upside to the current bandwidth caps, I am in no way a supporter of them, nor am I endorsing them.

Right now ISPs: 1) throttle what they see as P2P-like activity indiscriminately; and 2) oversell their product offering. In large part due to a powerful lobbying arm and a lack of competition, ISPs have been able to get away with murder. They make up their own rules, regularly lie to their customers, and generally do as they want. Recently, when the FCC got involved in a case involving Comcast, we saw the start of what may be some accountability in the industry. All of that said, there is currently no pressure whatsoever on ISPs to deliver the service they’ve sold you. Most providers bill their services as unlimited, but have near-secret soft caps. Because of the nebulous nature of the service descriptions and the lack of understanding most users have of how their high-speed cable bill works, there are seldom any challenges to this failure to deliver.

The Upside

With set caps, the ISPs are defining a fair-use limit, which I assume will be legally binding. Previously, users were told only the speed tier they were paying for, 7 Mb/s for example, and the price. Now price, speed, and bandwidth have all been laid out in concrete terms. This is significant because it removes any valid claim on the ISPs’ behalf that super users (typically P2P users) are clogging the pipes, slowing down other users, and destroying ISPs’ profitability by using excessive bandwidth. The ISPs have now defined for us – the consumer – what constitutes fair use and the service we are paying for. So long as Comcast users use less than 249.99999 GB of bandwidth a month, they are within the allotted bandwidth they have paid for.
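To put those concrete terms in perspective, here is some quick back-of-the-envelope math (a rough sketch in Python; the 7 Mb/s tier and 250 GB cap are the figures above, and I’m assuming decimal gigabytes and a connection running flat-out around the clock):

```python
# How long would it take the example 7 Mb/s tier, running flat-out,
# to hit a 250 GB monthly cap? (Assumes decimal GB: 1 GB = 10^9 bytes.)
advertised_mbps = 7
cap_gb = 250

cap_bits = cap_gb * 1_000_000_000 * 8                  # cap expressed in bits
seconds_to_cap = cap_bits / (advertised_mbps * 1_000_000)
hours_to_cap = seconds_to_cap / 3600

print(f"~{hours_to_cap:.0f} hours, or about {hours_to_cap / 24:.1f} days, "
      f"of sustained {advertised_mbps} Mb/s use reaches {cap_gb} GB.")
```

In other words, even at full advertised speed, non-stop, a subscriber on that tier would need several days of continuous downloading to touch the cap, which is exactly the fair-use line the ISP itself has now drawn.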

There are now NO grounds for ISPs to throttle P2P and other similar traffic, as it cannot be claimed that P2P/gaming/multimedia users operating within their allotted bandwidth are abusing the network. This is especially true since ISPs can NOT differentiate between legal P2P usage and illegal P2P usage. There are hundreds of legal uses for P2P software, and throttling it because of the possibility of illegal use is like banning e-mail because it is used by Nigerian bank scammers. Further, the throttling currently employed by some ISPs may hinder streaming media and online video games, which rely on a constant, timely exchange of data.
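To make the “can’t tell legal from illegal” point concrete, here is a toy sketch in Python. It is purely illustrative and is not how any particular ISP’s equipment actually works; the point is that a protocol-level check only sees that a flow speaks BitTorrent, never whether the payload is lawful:

```python
# Toy illustration: a protocol-level classifier only sees HOW data moves,
# never whether the payload is legal. Every BitTorrent connection opens
# with the same standard handshake bytes.

BT_HANDSHAKE_PREFIX = b"\x13BitTorrent protocol"

def looks_like_bittorrent(first_bytes: bytes) -> bool:
    """Naive DPI-style check: match the BitTorrent handshake prefix."""
    return first_bytes.startswith(BT_HANDSHAKE_PREFIX)

# A swarm distributing a Linux install image and a swarm distributing an
# infringing file open their connections with the exact same bytes...
legal_flow = BT_HANDSHAKE_PREFIX + b"..."       # e.g. a Linux ISO torrent
infringing_flow = BT_HANDSHAKE_PREFIX + b"..."  # e.g. a pirated movie torrent

# ...so any throttling decision made at this level hits both equally.
print(looks_like_bittorrent(legal_flow))        # True
print(looks_like_bittorrent(infringing_flow))   # True
```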

This is a point that is being largely ignored or unrealized, and one which the ISPs will work hard to “overlook.” They made their bed; now it’s time they lie in it. Get your hands off my data and out of my pockets.

Cox Quotes

The following are quotes from a recent exchange I had with Chris, a high-level Cox technician responsible for monitoring customer complaints across newsgroups, blogs, and other similar web resources. It took a lot of back and forth, but he confirms several extremely revealing things about Cox’s customer policies and network maintenance.

If my previous [reply] fails to answer your question on this issue, I simply do not know how to answer your question. What I can tell you is that streaming media and online gaming work with our service and you will observe no impact on the performance of these functions. If you are asking if our network management practices affect these services, I cannot tell you that because I simply do not know. If you are observing performance-related problems with online gaming and streaming media, there is a problem that we need to fix.

This comes from the end of our exchange of 14 e-mails, after extensive back and forth. It is important to note that the discourse started because I posted audio of a Tier 1 tech telling me Cox did throttle P2P, and a Tier 2 tech telling me, immediately after I was escalated, that they did not hinder ANY traffic whatsoever. What I find revealing is that even when pressed repeatedly, he gives me a concrete answer saying that media and gaming won’t be affected, but in the very next sentence undermines any credibility he has on the subject by stating that he has no idea how, or what, they do to manage “P2P” traffic.

I know of no technology that can differentiate between legal and illegal traffic. The primary goal of our network management policy is to ensure that all of our customers have adequate bandwidth and to prevent heavy users from denying service to others.

I’d like to reiterate that P2P technology is an amazing resource, used for countless legal tools and applications. One such use is my P2P-based library modernization project. There is NO valid reason for its restriction, regardless of the hogwash the RIAA and other lobbyist groups have tried to convince us of.

As far as a heavy user is concerned, we are basically trying to prevent a situation where a single customer or a group of customers is over utilizing their connection to the point that it prevents the remaining customers from using their connection and achieving speeds at or close to the advertised rates.  There is a fundamental flaw with the suggestion that you should be able to utilize your connection without any restrictions until reaching the stated monthly caps.  The problem is with the speeds we are offering our customers now you could easily reach those stated caps within a couple of days. Such a high utilization in such a short span of time would likely cause a denial of service to other customers such as I described above.  As far as enforcement of the advertised caps are concerned, due to the lack of an accurate customer facing tool to monitor usage we have been lenient with enforcement with regards to this abuse issue.

I understand this argument, but I don’t buy it. When you pay your taxes to maintain city and state roads, it’s an all-you-can-eat setup. This is no different. By that same token, when traffic in one area or another becomes particularly congested, the government doesn’t turn your car off or flatten your tires. They improve the roads and increase capacity to meet demand. The limitation is how much gas you buy, or in this case, monthly bandwidth.

If you’re a Cox subscriber, you’re probably curious what your monthly caps are. If you’re on their standard plan, you might be surprised to learn you only get 40 GB of downstream and 10 GB of upstream a month. To put that into perspective, that’s less bandwidth in a month, upstream and downstream combined, than the Japanese are allotted in 2 days. The real kicker? They’re also paying less, and their available speeds are significantly higher. View Cox’s full cap breakdown here.
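For a rough sense of what a 40 GB downstream allowance actually buys, here is a small illustrative calculation (the 2 Mb/s video bitrate is my assumption for standard-definition streaming, not a Cox figure):

```python
# What does a 40 GB monthly downstream cap amount to in practice?
# (The streaming bitrate below is an assumption for illustration.)
cap_gb = 40                  # Cox standard-plan downstream cap, per above
assumed_stream_mbps = 2      # assumed bitrate for standard-definition video

cap_bits = cap_gb * 1_000_000_000 * 8
hours_of_video = cap_bits / (assumed_stream_mbps * 1_000_000) / 3600

print(f"{cap_gb} GB at ~{assumed_stream_mbps} Mb/s of video is roughly "
      f"{hours_of_video:.0f} hours of streaming for the entire month.")
```

Under those assumptions that works out to roughly an hour and a half of video a day, before any web browsing, e-mail, gaming, or software updates are counted.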

Thank you for reading. If you’ve enjoyed this post please recommend it via one of the social media bookmarks below.

[Audio] Cox Communications Lies

Listen to this post instead:

Cox Communications Lies Audio Post

On June 25th I put together a lengthy writeup looking at Cox Communications’ misleading advertising campaign and sharing general information that the average consumer might not have access to or be able to locate.

You can view that post here.

Today, after suffering through a number of slow-loading videos across a number of sites, I decided to call in, in the hope that I’d learn something useful. Unfortunately, I learned what I already knew: that Cox and its agents have no problem whatsoever lying to their customers. That, or the level of gross incompetence and lack of internal communication is astonishing.

The one piece of potentially interesting information comes from the Tier 2 tech’s explanation of the way that Cox handles its bandwidth consumption caps. The catch is that since his credibility is suspect (there is a high chance he lied to me about a 100% unfiltered network), this information is suspect as well.

I’ve taken the most relevant 4 minutes and 27 seconds of audio out of my 30-minute call and compiled it. The following audio sequence contains 3 separate clips combined in chronological order. I apologize for not making the transitions between the three more defined and smoother.


Cox Communications Lies

You’ll note that the Tier 1 tech, who was very friendly, readily admits that Cox does employ P2P filtering technology. That statement is supported by my own experience, has been confirmed by me and others with other tech support people within the company, and is shown to be accurate by published research, one example of which I noted in my previous article. To revisit the information ArsTechnica reported:

Of the nine ISPs in the US found to block BitTorrent, Comcast and Cox were far and away the most aggressive. Both blocked more than half of all attempted BitTorrent tests on their networks (82 of 151 tests on Cox were blocked, while 491 of 788 tests on Comcast met the same fate).
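Spelling out the percentages behind those numbers (simple arithmetic on the figures quoted above):

```python
# Blocking rates implied by the ArsTechnica figures quoted above.
print(f"Cox:     {82 / 151:.0%} of BitTorrent tests blocked")   # ~54%
print(f"Comcast: {491 / 788:.0%} of BitTorrent tests blocked")  # ~62%
```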

Additionally, there is a significant amount of data from other major sources confirming these figures, which, if you’re curious, I encourage you to research independently.

Now contrast that with the statements made by the Tier 2 tech, an individual with significantly more technical knowledge and access to Cox’s systems. Further, consider his statements that he has NEVER had any performance complaints similar to mine and that he’s unfamiliar with any information similar to what I just posted. Keep in mind that Cox has the cable monopoly over the majority of the greater Phoenix area, home to a thriving high-tech industry, companies like GoDaddy, Google, Flypaper, and Ipowerweb, several technical/game design institutes, and one of the largest universities in the U.S. Anyone want to take bets on how many of those people value a decent connection?

Thoughts, feedback, personal experiences? Post them in a comment.

Public Libraries in the Digital Age

Library fresco in Prague by Alex Berger

Feeling lazy? Listen to this post instead:

Listen to this post

If you are not overly tech savvy, you probably think that peer-to-peer (P2P) networks are a mostly negative thing, used solely to facilitate the illegal sharing of data, violating copyright, and depriving the rightful authors of their well-deserved due. While P2P networks do facilitate the illegal transfer of a lot of information, that’s not all they are used for (despite the perverted portrayal by the media). In fact, P2P networks are 100% legal and immensely useful. Internet Service Providers (ISPs) hate them because they take an asymmetrical web and make it more symmetrical (users sending and receiving near-equal amounts of data), but the technology is not only useful, it’s very sound.

The Problem

Public libraries are an incredible resource. They have quietly powered just about every major information revolution in the written age. We have free public libraries to thank for the education received by some of our greatest thinkers. Fundamentally, they facilitate the free and easy dissemination of knowledge. The problem many libraries face in the modern environment is competition with the web and P2P networks. Most of us would still prefer to read a 200+ page book in print over its digital alternative, but with e-book readers like Amazon’s Kindle, even that may not last much longer. So how do libraries stay competitive and useful in the modern environment?

The Current Situation

For the sake of illustration I’ll use the Phoenix Public Library System (PPLS). PPLS and other city libraries spend an average of $75,000 a year on subscription databases. That $75,000 is in addition to the $1.5 million spent across Maricopa County as a whole. On December 3, 2007, the Arizona Republic reported:

In Phoenix and its suburbs, they’re free passes to growing numbers of costly subscription-only Internet databases with genealogy research and auto repair instructions, foreign languages courses and antique appraisals. Maricopa County and Valley cities are spending more than $1.5 million a year to make this information free to cardholders.

As you can imagine, with that type of coinage involved they’re not just purchasing offbeat service subscriptions.  Instead they’ve put together a comprehensive, engaging list of offerings.  View it here.

In addition to providing access to these online research tools and services, they have also moved towards offering a comprehensive e-book and audiobook selection, all available for download. The website lists 1,800+ movie titles available for download, 1,500+ audio files, 18,500+ e-books, and 9,200+ audiobooks. It is an impressive assortment, and one that has the potential to cost the library thousands of dollars in bandwidth costs.
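Just to illustrate how quickly digital lending can add up, here is a purely hypothetical estimate (the average file sizes and monthly download counts below are my assumptions, not PPLS figures):

```python
# Hypothetical illustration of monthly transfer from digital lending.
# Catalog counts come from the PPLS site; sizes and download volumes
# are assumptions made up for the sake of the example.
offerings = {
    # section: (catalog items, assumed avg size in GB, assumed downloads/month)
    "movies":     (1_800, 1.5, 2_000),
    "audiobooks": (9_200, 0.2, 5_000),
    "ebooks":     (18_500, 0.005, 10_000),
}

monthly_gb = sum(avg_size_gb * downloads
                 for (_items, avg_size_gb, downloads) in offerings.values())
print(f"Assumed monthly transfer: roughly {monthly_gb:,.0f} GB")
```

Even at a fraction of those assumed volumes, the transfer costs are real, which is exactly where peer-to-peer distribution could help.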

The Solution

PPLS’s offering is impressive, with 60+ database subscriptions and 30,000+ digital offerings, but it’s still minuscule compared to the potential. Why not turn our library system into a networked P2P system running custom software which not only allows the distribution of the content they already have, but also the submission, and potential addition, of hundreds of thousands of new files by authors, documentary producers, and musicians? I know a number of musicians and authors eager to distribute their work freely who would jump at the opportunity to tie into the library network. They would be more than willing to submit their works on a royalty-free basis.

It would also allow libraries to easily share digital catalogs with each other. To ensure availability they would probably still need to offer central download servers, but the load on those could be readily offset by tapping peers that have the files first, before defaulting back to the hard servers. There are hundreds of developers on sites like SourceForge working on open source file-sharing and P2P projects, so the cost of development would be minimal. I believe that, given the benevolent nature of the project, you could attract a number of skilled coders and developers relatively easily and quickly.
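Here is a minimal sketch of that “ask peers first, fall back to the library’s server” idea. It is written in Python with hypothetical class and method names; an actual implementation would sit on top of an existing P2P protocol rather than these stubs:

```python
# Minimal sketch: try patrons' peers before using the library's own server.
from typing import List, Optional

class Peer:
    """A patron's machine that may hold some catalog items locally."""
    def __init__(self, shared_files: dict):
        self.shared_files = shared_files
    def try_download(self, file_id: str) -> Optional[bytes]:
        return self.shared_files.get(file_id)

class CentralServer:
    """The library's server: always has the file, but costs the library bandwidth."""
    def __init__(self, catalog: dict):
        self.catalog = catalog
    def download(self, file_id: str) -> bytes:
        return self.catalog[file_id]

def fetch_title(file_id: str, peers: List[Peer], server: CentralServer) -> bytes:
    # Peers are tried first, so the library's bandwidth is only spent
    # when no patron currently shares the requested item.
    for peer in peers:
        data = peer.try_download(file_id)
        if data is not None:
            return data
    return server.download(file_id)

# Usage: two peers, one of which already holds the requested audiobook.
server = CentralServer({"audiobook-42": b"...audio data..."})
peers = [Peer({}), Peer({"audiobook-42": b"...audio data..."})]
data = fetch_title("audiobook-42", peers, server)
print("got", len(data), "bytes")   # served by a peer, not the central server
```

The design point is simply that the central server remains the guarantee of availability, while popular titles end up being served mostly by patrons’ machines.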

Custom submission, approval, and maintenance controls could be built in fairly easily, ensuring that illegal files were not dumped onto the server. This would give the library complete control over what was made available. It would also differentiate a library-based system from a standard open P2P network while protecting copyrights, and it would mean that all content on the network was safe, unlike a conventional P2P network, where rogue users sometimes submit viruses or mislabeled material.
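One simple way to enforce that control (an illustrative sketch, not a specification) is to only ever distribute files whose cryptographic digest appears in a librarian-approved catalog:

```python
# Illustrative sketch: only share files whose SHA-256 digest was explicitly
# approved by library staff during submission review.
import hashlib

# Digests added by librarians when a submission is reviewed and accepted.
approved_digests = {
    hashlib.sha256(b"royalty-free album submitted by a local musician").hexdigest(),
}

def is_approved(file_bytes: bytes) -> bool:
    """Reject anything not explicitly reviewed: mislabeled or tampered
    files hash differently and are simply never distributed."""
    return hashlib.sha256(file_bytes).hexdigest() in approved_digests

print(is_approved(b"royalty-free album submitted by a local musician"))  # True
print(is_approved(b"same name, different (unreviewed) contents"))        # False
```

Because a mislabeled or tampered file hashes to a different digest, it never enters circulation, no matter what a rogue peer claims it contains.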

If you consider the success of the SETI@home project, and people’s helpful default nature, combined with the knowledge that what they are sharing is legal, I believe many individuals would be more than happy to leave the P2P component running.

As always, I’d love to hear your feedback and impressions.  Please post them in comment form below.