The FCC Wants to Know Your Broadband Habits
Are you getting the speed you’re paying for? The FCC hopes to find out.
At a time when online privacy concerns have forced Facebook and Google to back down, it might seem audacious to ask for 10,000 volunteers to allow the government to monitor every bit and byte of their home Web use. But that is exactly what the U.S. Federal Communications Commission did last week.
Anyone can volunteer for the program at its dedicated website. Selected participants will receive a box made by U.K. firm SamKnows that will monitor their Internet data consumption and connection uptime. The box will also perform hourly tests of connection performance, using dedicated servers to conduct speed tests and loading pages from common Web destinations to track latency, delay, failure rates, and the performance of the ISP’s DNS servers, which convert each Web address into the IP address that locates a server. Users will be able to access detailed results from the box profiling their connection.
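The probe routine described above can be sketched in Python: time a DNS lookup, time a page fetch from a common destination, and aggregate the samples into a mean latency and a failure rate. This is an illustrative sketch under stated assumptions, not the SamKnows firmware; the function names and any hostnames you pass in are hypothetical.

```python
# Illustrative sketch of the kind of hourly probe the FCC boxes run.
# Not the actual SamKnows implementation; names here are assumptions.
import socket
import time
import urllib.request

def time_dns(hostname):
    """Return seconds taken by the resolver to map a name to an IP address."""
    start = time.monotonic()
    socket.getaddrinfo(hostname, 80)
    return time.monotonic() - start

def time_fetch(url, timeout=10):
    """Return (seconds elapsed, success flag) for loading a Web destination."""
    start = time.monotonic()
    try:
        urllib.request.urlopen(url, timeout=timeout).read()
        return time.monotonic() - start, True
    except OSError:
        return time.monotonic() - start, False

def summarize(samples):
    """Aggregate (latency, ok) samples into mean latency and failure rate."""
    latencies = [latency for latency, ok in samples if ok]
    failures = sum(1 for _, ok in samples if not ok)
    mean = sum(latencies) / len(latencies) if latencies else None
    return {"mean_latency_s": mean, "failure_rate": failures / len(samples)}
```

A box would run probes like these on a schedule against a fixed list of test servers and popular sites, then report the summaries upstream.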
“We hope that by providing consumers more information on the nature of the service, efforts like this new project might push the marketplace toward better performance,” says FCC analyst John Horrigan.
The results will appear in a “State of Broadband” report later this year and inform the FCC’s efforts to deliver on the ambitious National Broadband Plan that the American Recovery and Reinvestment Act required the FCC to draw up. Unveiled in March 2010, the plan involves providing 100 million households with access to 100-megabit-per-second broadband, around 20 times faster than the speeds typically available today.
The most valuable data the new FCC trial will yield is the extent to which broadband subscribers get the speed they pay for. Telecom companies typically promise “up to” a certain speed in their advertising, but anecdotal evidence suggests that few customers actually receive this headline number. “When we have real information on what Americans pay for and what we get is where I think we’ll see profound sticker shock,” says Sascha Meinrath of the New America Foundation, a think tank. Meinrath is a cofounder of Measurement Lab, a consortium of academic labs and companies, including Google, that provides open-source connection testing tools online, some of which will be used by the FCC’s black boxes.
Just days before the FCC’s announcement, two organizations, Measurement Lab and Ookla (whose market-leading Speedtest and Pingtest sites also let Web users test their connection speed), made public the results of the connection tests they have performed, releasing more than a billion broadband speed-test records to shine a brighter spotlight on the big U.S. telecom companies.
To further hold ISPs to account, Ookla will soon prompt users to volunteer details of their broadband package, its cost, and their postal code when they run a test. “We will create a promise index to show how well service providers are doing compared to what they offer consumers,” says Ookla founder Mike Apgar.
Although Meinrath welcomes the FCC’s new trial, he points out that it excludes anyone who downloads more than 30 gigabytes a month, which means its results will not represent heavy users, who disproportionately influence how ISPs operate. Heavy users are sometimes singled out by ISPs for subtle adjustments that downgrade service quality in an attempt to deter their business, says Meinrath. “We know from off-the-record chats with network engineers that providers are doing things like increasing the likelihood that heavy users’ packets will be dropped,” he says.
Ookla, which in the last year processed 65 million unique speed or connection-quality tests in the U.S. alone, last week launched a site called NetIndex that lets anyone explore global data drawn from the roughly 1.5 million tests performed every day. Rankings compare nations and cities on upload and download speeds, while maps allow comparison of speeds within an individual country or U.S. state. To reduce the effect of network lag, only tests performed within 300 miles of a test server are included, and all results are based on the last 30 days of data. At the time of this writing, based on the more than 5 million Ookla tests done in the past month, the U.S. ranks 28th in the world, between Norway and Russia. “For the place where the Internet got started, that is not too good,” says Apgar.
All the data behind the NetIndex site is available to download, and the full dataset of all 1.5 billion tests Ookla has ever served is also now available to all on request. Measurement Lab also made available last week the more than 60 terabytes of data from tests using its platform. It is hosted on BigQuery, part of Google’s new cloud storage service, allowing anyone to query the database without powerful data-storage hardware of their own.