
Housing: How AI is perpetuating racial bias in home lending

University of New Mexico Associate Professor of Law Sonia Rankin speaks with Yahoo Finance Live about how artificial intelligence and technology in the home lending process are contributing to racial inequities.

Video transcript


JULIE HYMAN: Mortgage and insurance companies have started to employ artificial intelligence as a way to streamline the underwriting process, but our next guest claims that historical discriminatory lending practices are still an issue even with those AI systems. For more on this, let's bring in University of New Mexico Associate Professor of Law, Sonia Rankin, and "Yahoo Finance's" Ronda Lee, who covers real estate for us.


Thank you both for being here. Sonia, I want to start with you. I guess the problem is that if the data you're feeding the AI reflects discrimination, then you can't get rid of that bias. Is that the idea?

SONIA RANKIN: You're absolutely right. I mean, there's this word we use in tech-- garbage in, garbage out. If you're not careful and carefully monitoring the data that you have coming in, well, any kind of manipulation or things you do to the data are going to always be based on this foundation that it was bad information to begin with. And we consider racial bias and historic racial discrimination in the United States to be part of that garbage in.

RONDA LEE: Hi, Sonia. It's Ronda. So one of the things we want to talk about is redlining. We've always talked about redlining in housing, and even though redlining was outlawed, it still persists in other ways, where we use the term "race proxy"-- like the zip code data that is typically used to determine the desirability of homes or flood zones. Given historic redlining in housing, how does that data-- you talked about codifying racial bias. How does that work?

SONIA RANKIN: Absolutely. So if you were to pick any neighborhood, any zip code, it is not by happenstance that the people who live there are there. In fact, it was legally determined which people would be allowed into certain neighborhoods and which would be barred and banned from them, whether because of race or because of religious practices. Some of these rules were actually codified into law. If you look into the deeds of some housing properties, you'll still find restrictive covenants historically written in:

This property must not be owned by people of certain racial backgrounds. Well, when you understand that those are all of the data points that come in, that's what's happening with the AI. We look at it and say, oh my goodness, these are racially discriminatory practices. But to the AI, it's just zeros and ones. It doesn't understand the significance of certain codes or covenants, or of certain people living in certain zip codes. When a person submits an application, the AI isn't looking at it and saying, oh, Sonia, a Black woman, is interested in this option.

What it's looking at is: this is 023011. What do we do about it and where do we put it? It doesn't understand that everything we have going on in the United States, particularly as it relates to race and housing, to historical things like redlining, is built on those practices. And it's just going to continue to exacerbate and codify what we already know to be historically racially motivated data points.

- Another layer to this as well is what goes into somebody's credit score, even before they start to apply for the larger purchases that are going to give them equity over an extended period of time. How are we seeing changes in how credit scores are actually calculated, and in how lending practices are being changed to reflect the modernization of the society we live in?

SONIA RANKIN: So that's a great point. When we start to think about credit scores-- what is being used to determine my creditworthiness-- well, there are things like credit-based insurance scores. There are numbers based on geographic location, home ownership, and motor vehicle records, and we understand that all those things are historically going to be tied to places like where someone goes shopping, right?

How many times has my credit card been swiped in area codes or zip codes that have lower socioeconomic conditions? The idea that I can somehow be penalized for looking for discounts on my groceries or on products, and that this plays its way into my ability to get a mortgage, is really one of the troubling practices that we see.

So even though there are some great things that AI can do-- automating the tasks that can slow down the actuarial professionals who have to make these determinations-- the problem is that in such a quest to speed things up, it starts to skim over and gloss over the things that people would look for and spot. We have had a number of legal decisions precisely to ensure that there are not these discriminatory practices going forward.

- Sonia, one of the things that I want to bring us back to is that recently, during the housing boom, we had a huge increase in denials among Black and Hispanic homeowners trying to refinance. One of the things the Consumer Financial Protection Bureau has said is that companies-- because they're required to tell a homeowner why they've been denied-- are now saying, oh, we can't give you a reason, because the AI algorithm told us to deny this person, or the algorithm is proprietary. This is what we're looking at: yes, AI is good, but when companies are using AI as a reason to justify discriminatory practices, what do we do to fix this?

SONIA RANKIN: So there tends to be this practice. You'll see advertisements essentially saying, hey, we sprinkled AI on it. It's all better. It's a good thing-- there's AI applied. It has removed the discrimination. But in reality, all it is doing is masking the fact that we don't know. We don't know where the AI used in that one mortgage lending company's software went to get its data.

Was it searching Reddit? Did it start to make false connections between, say, Black Muslim women and negative connotations, and start to add those into its determinations? Now, we already know that there are basic words that won't be included in an application, right-- characteristics that have equal protection, constitutional protections attached to them. But we do know that there are lots of proxies for race. A person's zip code can be a proxy for race; so can where they work and the kinds of lending places they use.

All kinds of details, even from the internet of things: what times I enter and exit my home, which my Nest has been recording. Where has my Fitbit marked that I've been for the past 5, 10 days or weeks? All of these products are collecting a story about us, and much of that story will be connected to our race, our culture, our gender, and our lifestyle practices.
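The proxy effect Rankin describes can be sketched in a few lines of Python. This is an entirely hypothetical, synthetic example-- not any real lender's model or data-- but it shows the mechanism: a model trained only on zip code and historical approvals, with race never supplied as an input, still reproduces the racial disparity embedded in its training labels.

```python
# Hypothetical sketch of a "race proxy": race is never an input to the
# model, but because segregation ties group membership to zip code, a
# model trained on biased historical approvals still discriminates.
import random

random.seed(0)

def make_applicant():
    # Synthetic history: group and zip code are near-perfectly correlated
    # (segregation), and past approval rates were biased by zip code.
    group = random.choice(["A", "B"])
    zip_code = "11111" if group == "A" else "22222"
    approved = random.random() < (0.8 if zip_code == "11111" else 0.3)
    return {"group": group, "zip": zip_code, "approved": approved}

history = [make_applicant() for _ in range(10_000)]

# "Train": learn the historical approval rate per zip code.
# Note that race never appears here -- only the zip-code proxy.
rate = {}
for z in ("11111", "22222"):
    subset = [a for a in history if a["zip"] == z]
    rate[z] = sum(a["approved"] for a in subset) / len(subset)

def model_approves(zip_code):
    # Approve anyone whose zip code historically cleared a 50% rate.
    return rate[zip_code] >= 0.5

# Score new applicants and measure the approval rate by group.
new_apps = [make_applicant() for _ in range(1_000)]
by_group = {"A": [], "B": []}
for a in new_apps:
    by_group[a["group"]].append(model_approves(a["zip"]))

for g in ("A", "B"):
    print(g, sum(by_group[g]) / len(by_group[g]))
# Group A is approved at a far higher rate than group B, even though
# the model never saw race -- the zip code carried it in.
```

Dropping the race column, in other words, is not enough; any feature correlated with race (zip code, shopping locations, device data) can carry the same historical bias into the model's decisions.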

JULIE HYMAN: And a lot of the time we're not even fully cognizant of all of that being collected, certainly. Thank you so much for being here. Sonia Rankin, University of New Mexico Associate Professor of Law, and "Yahoo Finance's" Ronda Lee. Thanks to you both.