
Researchers have finally found a way to keep an eye on the state of net neutrality

Published Aug 27th, 2018 6:40PM EDT
Image: Cultura/REX/Shutterstock


“Day 77 of the post-Title II era: The Internet is free and open, and it’s National Dog Day,” tweeted Matthew Berry, the Chief of Staff at the Federal Communications Commission, early this morning. Berry has been tweeting a wry daily observation about how the internet is still “free and open” every day since the repeal of the 2015 net neutrality rules went into effect earlier this year, not even taking a day off to acknowledge how Verizon throttled California firefighters during a recent wildfire.

Berry’s commentary is funny in a troll-baiting kind of way, but it’s also a daily reminder that we have no way of monitoring whether the internet is still “free and open.” Many of the things that net neutrality is designed to protect against are hidden out of sight, inaccessible — whether through complexity or inconvenience — to the average consumer who is supposed to be making all these informed, rational decisions.

A prime example was the 2014 war of words between Netflix and Verizon. When one Netflix user in Texas reported that his Verizon Fios connection to Netflix — and only Netflix! — was being throttled, the Electronic Frontier Foundation (EFF) had to deploy an army of volunteer testers to work out what was going on. Netflix and Verizon blamed each other, and we were left with the realization that consumers and watchdogs don’t have a good way to monitor what happens to traffic in the opaque world of the internet once it leaves our devices.

In 2014, the EFF called for peering arrangements between internet providers to be made public. With a soft-touch FCC currently in control, that’s unlikely to happen, but a team of researchers has outlined an approach in a new paper that could force transparency.

The paper is titled “Inferring Persistent Interdomain Congestion,” and according to its authors, it outlines a system for measuring congestion on the internet’s backend links, which would make data about how those links are functioning publicly available for the first time. Crucially, it would give net neutrality advocates and technology companies the evidence they need to cry foul if they suspect artificial throttling.

“We use our method to study interdomain links of eight large U.S. broadband access providers from March 2016 to December 2017, and validate our inferences against ground-truth traffic statistics from two of the providers,” the researchers explain in the abstract.

There’s some good short-term news for net neutrality in the paper as well. “For the period of time over which we gathered measurements, we did not find evidence of widespread endemic congestion on interdomain links between access ISPs and directly connected transit and content providers, although some such links exhibited recurring congestion patterns,” the authors continue.

Rather than focusing on the data presented in the paper, it’s more interesting to consider the kind of network monitoring the researchers are proposing. By tracking a single signal over time, the latency to each end of an interconnection link (a technique known as time series latency probing), the researchers were able to monitor congestion without interfering in the day-to-day operation of the network. If a regulator were to mandate such a monitoring system and make its data public, we might one day be able to argue with Mr. Berry’s glib assertions that the internet is still free and open.
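To make that concrete, here is a minimal Python sketch of the latency-probing idea. It illustrates the principle only, not the researchers’ actual tooling: the paper’s measurements use TTL-limited probes run continuously for months, while this version simply pings a hypothetical router on each side of an interconnection link and flags the link when far-side latency stays elevated relative to the near side. The router addresses, the threshold, and the function names are placeholder assumptions invented for illustration.

```python
# Sketch of time-series latency probing: if latency to the far side of
# an interdomain link rises while the near side stays flat, the link
# itself is the likely bottleneck. Router IPs below are placeholders
# (documentation ranges), not real probe targets.

import re
import subprocess
import time
from typing import Optional

NEAR_ROUTER = "192.0.2.1"    # hypothetical router on our ISP's side of the link
FAR_ROUTER = "198.51.100.1"  # hypothetical router on the other network's side

def ping_rtt_ms(host: str) -> Optional[float]:
    """Send one ICMP echo via the system ping (Linux-style flags)
    and return the round-trip time in milliseconds, or None on loss."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", host],
        capture_output=True, text=True,
    )
    match = re.search(r"time=([\d.]+)", result.stdout)
    return float(match.group(1)) if match else None

def collect(samples: int = 10, interval_s: float = 5.0) -> dict:
    """Build parallel latency time series for both ends of the link."""
    series = {"near": [], "far": []}
    for _ in range(samples):
        series["near"].append(ping_rtt_ms(NEAR_ROUTER))
        series["far"].append(ping_rtt_ms(FAR_ROUTER))
        time.sleep(interval_s)
    return series

def looks_congested(series: dict, threshold_ms: float = 20.0) -> bool:
    """Flag the link if far-side latency is persistently elevated
    relative to the near side in most samples."""
    pairs = [(n, f) for n, f in zip(series["near"], series["far"])
             if n is not None and f is not None]
    if not pairs:
        return False
    elevated = sum(1 for n, f in pairs if f - n > threshold_ms)
    return elevated / len(pairs) > 0.5

if __name__ == "__main__":
    data = collect()
    print("Possible congestion" if looks_congested(data) else "Link looks healthy")
```

The real system scales this idea up to thousands of links measured around the clock and validates the inferences against providers’ internal traffic statistics, but the core signal is the same: a latency time series whose far side diverges from its near side when a link saturates.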

Chris Mills, News Editor

Chris Mills has been a news editor and writer for over 15 years, starting at Future Publishing, Gawker Media, and then BGR. He studied at McGill University in Quebec, Canada.