Most developers I know assume that software can’t be racist unless the developer actively makes it racist. After all, On the Internet, Nobody Knows You’re a Dog.
To paraphrase Charles Babbage: Racism In, Racism Out. When your inputs are racist, your outputs will be racist. Even if you didn’t do anything racist.
Years ago, I worked for one of the largest residential mortgage brokers in the US. I often tried to impress upon my fellow developers the need to understand the industry’s past so that we could reduce the racism flowing through our code.
The conversations usually fell flat. They would thank me for the interesting history lesson and walk away assured that since they weren’t racist, and weren’t coding anything racist, there was no racism in the software.
Here are two quick examples of how race and racism leak into mortgage software. The first is pretty blunt; the second is subtle.
Colorblind Code Not Allowed
For residential mortgages, the US Government requires asking borrowers their race and gender. The government uses the data to find racism (and sexism) in lending. This data is how we know that Black borrowers pay higher rates and get rejected for loans more often. The data paints a depressing picture: racism is prevalent in the mortgage industry. You have to add race to your code, and it is a good bet that some of your users will exploit that data to discriminate.
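One practical consequence: if you must collect race and gender, you can at least keep those fields away from the code that makes decisions. Here is a minimal sketch of that separation. The class and function names are hypothetical, and the rate adjustments are made-up toy numbers, not any real rate sheet.

```python
# Hypothetical sketch: store government-mandated demographic fields
# separately from the data the pricing engine can see, so decision
# logic cannot (accidentally or deliberately) use them.
from dataclasses import dataclass


@dataclass(frozen=True)
class LoanApplication:
    # Only the fields the pricing engine is allowed to use.
    loan_amount: float
    credit_score: int
    debt_to_income: float


@dataclass(frozen=True)
class DemographicReport:
    # Collected solely for regulatory reporting, keyed to the
    # application by id so pricing code never receives it.
    application_id: str
    race: str
    gender: str


def price_loan(app: LoanApplication) -> float:
    """Toy rate logic; demographic data is not even in scope here."""
    rate = 6.5
    if app.credit_score >= 740:
        rate -= 0.5
    if app.debt_to_income > 0.43:
        rate += 0.75
    return rate
```

The separation doesn’t stop a determined bad actor, but it makes misuse a deliberate act rather than a convenient one, and it makes audits simpler.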
Pricing Is Based Off Racist History
As part of the appraisal process, the appraiser finds “comparable” houses nearby to validate the price. In areas where home values have been depressed by racist history like redlining, comparables act as a racist anchor. Using comparables is like saying “houses in this neighborhood are worth less than houses in other neighborhoods because 60 years ago racists decided that predominantly Black neighborhoods were worth less, and we have decided to continue the process.”
Many states ban asking about salary history because it reinforces past discrimination. There are companies out there pushing back against the use of comps. As a developer you won’t be able to choose the company’s risk models, but with a little work, you can code up alternative models and make better data available.
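The anchoring effect is easy to see in code. Below is a minimal sketch contrasting a comps-based appraisal with one alternative (a cost-based model). The function names, numbers, and the alternative model itself are illustrative assumptions, not any real appraisal system.

```python
# Hypothetical sketch of why comps anchor prices to neighborhood history.
# All figures are invented for illustration.
from statistics import median


def comps_value(comparable_sale_prices):
    """Appraise a home as the median of nearby recent sales ("comps")."""
    return median(comparable_sale_prices)


def cost_based_value(replacement_cost, land_value, depreciation=0.0):
    """One alternative: value the structure and land directly,
    independent of what neighboring homes happened to sell for."""
    return land_value + replacement_cost * (1 - depreciation)


# Two physically identical houses; only the neighborhood's sale
# history differs, e.g. because one area was redlined decades ago.
redlined_comps = [182_000, 175_000, 190_000]  # history-depressed prices
other_comps = [305_000, 298_000, 312_000]

print(comps_value(redlined_comps))  # low comps in -> low appraisal out
print(comps_value(other_comps))
print(cost_based_value(replacement_cost=240_000, land_value=60_000))
```

The comps model reproduces the historical gap forever; the cost-based model values both houses the same. Real alternatives are more sophisticated, but the point stands: the model you choose decides whether past discrimination is carried forward.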
Don’t be Passive
You have an obligation to understand your inputs. You may not be able to sanitize them, but understanding is a vital first step. Google the history of your industry to find where racism has come from in the past, and think about how your code makes it easier or harder for history to repeat itself.
Black Lives Matter.