To Reduce Disability Bias in Technology, Start With Disability Data

When people with disabilities interact with technology, they risk facing discriminatory impacts in important and high-stakes contexts. Disability rights and disability justice activists have a long history of fighting discrimination against disabled people, and technology-facilitated disability discrimination, though a newer form of these older injustices, is not going anywhere. It is tempting to write off this bias as the product of the so-called algorithmic “black box,” but disparate and discriminatory algorithmic outcomes can often be traced back to problems with the data on which models are trained, and better data is likely to produce better results. In this paper, we highlight several policy recommendations, including:

  1. Disability data should be collected in all contexts where other demographic data is collected.
  2. Data should be collected and stored in ways that respect personal and data privacy.
  3. New and more inclusive methods of both defining disability and collecting disability data must be developed. 
