If crowdsourced network quality tools are so good, why don’t operators, universities, and regulators use them more?

A recent story in the Guardian covered a study alleging that American broadband providers are purposely slowing access to websites, amounting to widespread net neutrality violations. The study is reported to have been conducted by BattleForTheNet, which uses the so-called Network Diagnostic Tool (NDT) of the MLab project of the New America Foundation. BattleForTheNet created the website http://internethealthtest.org/, where it asks users to run tests that purportedly measure the quality of the broadband networks they use.

Before judging the veracity of the study or its measurements, consider the following: if network performance can be measured with free crowdsourced tools, why are operators spending millions, if not billions, of dollars on network testing equipment from companies such as Cisco, Alcatel-Lucent, and Ericsson? Moreover, why are engineers at Aalborg University spending so much energy making detailed measurements of wireless networks with highly sophisticated equipment and PhD-level staff? If MLab's measurements are so much better, why don't operators and universities switch to crowdsourced tools?

The answer is that crowdsourced network tools do not measure network quality. The NDT purports to measure network performance from a web browser. This is akin to measuring the weather by putting a thermometer in the ground. There may be relevant information in the dirt, but it does not tell you whether it's cloudy or sunny, or whether the wind is blowing. A browser-based test reports a single end-to-end number that mixes together the test server, the interconnection links along the way, the ISP's network, the home Wi-Fi, and the user's own device and browser; it cannot attribute a slowdown to any one of those segments. Some colleagues have attempted to obtain the actual “study” referenced in the article, but it turns out there is no study after all, and the FCC even rejected an earlier attempt to offer crowdsourced data from these entities as part of its investigation into the issue of network interconnection.
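To make the objection concrete, here is a minimal sketch of what a browser-based throughput test can actually observe. The endpoint URL and file are hypothetical stand-ins, not MLab's actual service:

```typescript
// A minimal, hypothetical browser-style throughput probe (illustrative only;
// the URL and test file below are assumptions, not MLab's real endpoints).
async function measureDownloadMbps(url: string): Promise<number> {
  const start = performance.now();
  const response = await fetch(url, { cache: "no-store" });
  const bytes = (await response.arrayBuffer()).byteLength;
  const seconds = (performance.now() - start) / 1000;
  return (bytes * 8) / (seconds * 1e6); // megabits per second
}

// The single number returned reflects the entire end-to-end path at that
// moment: the test server's load, transit and interconnection links, the
// ISP's access network, the home Wi-Fi, and the user's own device and
// browser. Nothing in this measurement attributes a slowdown to any one
// segment, let alone proves deliberate throttling by the ISP.
measureDownloadMbps("https://example.com/test-100MB.bin")
  .then((mbps) => console.log(`Observed throughput: ${mbps.toFixed(1)} Mbps`));
```

However clever the tool, a number produced this way is a thermometer in the dirt: real information, but not the weather report it is advertised to be.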

It’s unfortunate that the Guardian would print a story on a study that does not exist. Moreover, the article does not critically assess the fact that MLab is a project of the New America Foundation, which, along with Free Press, has vigorously supported the imposition of Title II, giving the FCC the authority to regulate the Internet like the telephone network. Nor did the article, as good journalistic standards require, solicit any other perspectives on the issue.

There is no doubt that crowdsourced tools are cool. Moreover, they can empower users, provided that they are reliable and measure what they purport to measure. That is not the case for the NDT.

The article hints at the possible motivation behind the story with a quotation from Tim Karr of Free Press, a member of the BattleForTheNet coalition: “The irony is that this trove of evidence is becoming public just as many in Congress are trying to strip away the open internet protections that would prevent such bad behavior.”

The FCC’s imposition of Title II may very well be unlawful. Eight lawsuits are pending against it. Rather than wait years for those lawsuits to play out, a number of Republicans and Democrats are trying to craft lasting, legally sound rules. Free Press doesn’t like Congress getting involved, because Congress would appropriately limit the FCC’s power.

BattleForTheNet uses these bogus tests to prop up the FCC’s decision. My research on the so-called net neutrality issue in 20 countries shows that there are two ways to make such rules last: soft law or hard law. The path the FCC has taken, unilaterally carving out new rules, will not work. At the end of the day, Congress is the boss of the FCC, and it is Congress’s job to point out when the agency oversteps its bounds.