The Case for Using Algorithms to Validate Broadband Data

It’s time for the Federal Communications Commission to step into the future by using artificial intelligence tools to address the continuing lack of affordable broadband in many communities—an increasingly entrenched problem of “internet inequality,” which impacts our economy and democracy and threatens the future global competitiveness of our country. By adopting more sophisticated data validation algorithms, the FCC could avoid repeating past mistakes. Such algorithms can not only automate the data validation process but also ensure consistency and learn from previous provider submissions to improve error detection. However, a validation model is only as reliable as the historical data set used to train it. The FCC needs to change its data collection practices, but we also need new information to check the broadband provider data we’re already collecting. For example, a model could compare reported data about certain locations with provider advertisements for those same locations. Better yet, the FCC should allow Americans to comment directly on potential map errors through an improved challenge process, injecting transparency and real-world scrutiny into an otherwise opaque process.
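The cross-checking idea described above—comparing what a provider reports to the regulator against what that same provider advertises at the same location—can be sketched as a simple consistency check. This is a hypothetical illustration only: the record shapes, field names, and 25% tolerance are assumptions for the sketch, not actual FCC data formats or rules.

```python
# Hypothetical sketch: flag filings where a provider's reported speed at a
# location exceeds that provider's own advertised speed there by a wide margin.
# All record shapes and the tolerance threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Reported:
    location_id: str
    provider: str
    reported_mbps: int    # speed the provider reported to the regulator

@dataclass
class Advertised:
    location_id: str
    provider: str
    advertised_mbps: int  # top speed the provider advertises at that location

def flag_discrepancies(reported, advertised, tolerance=0.25):
    """Return (location, provider, reported, advertised) tuples for filings
    whose reported speed exceeds the advertised speed by more than `tolerance`."""
    ads = {(a.location_id, a.provider): a.advertised_mbps for a in advertised}
    flags = []
    for r in reported:
        adv = ads.get((r.location_id, r.provider))
        if adv is not None and r.reported_mbps > adv * (1 + tolerance):
            flags.append((r.location_id, r.provider, r.reported_mbps, adv))
    return flags

reported = [
    Reported("loc-1", "ISP-A", 100),
    Reported("loc-2", "ISP-A", 500),   # reported far above anything advertised
]
advertised = [
    Advertised("loc-1", "ISP-A", 100),
    Advertised("loc-2", "ISP-A", 100),
]
print(flag_discrepancies(reported, advertised))
# → [('loc-2', 'ISP-A', 500, 100)]
```

A production system would of course need fuzzy matching of locations and offers, and the flagged records would feed a human review queue or the public challenge process rather than trigger automatic rejection.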

To fix our maps, we must not only improve the granularity of our data but also ensure that data submissions are vetted and validated by the commission, without relying on providers or consumers to make sure we get this right. Data algorithms can help achieve this goal. With better data, we will be able to make better policy and faithfully execute the core functions of our agency—including addressing internet inequality. The need for accurate data is clear, and the technology to validate that information already exists. We must put it to work for the American people.
