I’ve written here more than a few times that ours is an industry that needs to more readily embrace the principles of DEI. Yes, we’ve made some strides. We are moving forward. But it’s also clear that we still have a way to go. The numbers tell us, for example, that as of 2020, the gap in homeownership rates between Black and white families remained near a 120-year high. We know that, in 2020, roughly 74% of white families owned their own homes, compared to under 45% of Black families. And we know, sadly, that redlining is still a thing. Maybe fewer do it. Maybe not. But it still exists.
We still have work to do when it comes to DEI.
I recently read an article discussing a Federal Reserve Bank of Philadelphia study which found that, while advances in AI technology have been applied to default prediction and credit scoring in the mortgage lending world, they haven’t really made access to credit any more equitable than it already was.
Imagine that. AI can write a professional-caliber thesis in minutes. It can build a library of effective content. But it isn’t helping to make the underwriting and credit processes more equitable?!
AI’s latest capabilities are certainly nothing less than amazing. You can’t scroll through social media or your favorite news sources these days without hearing about ChatGPT or the Bing AI chatbot that declared its love for its New York Times interviewer. These are strange times indeed. You’d think we’re a few programming tweaks away from the Age of Skynet.
Well, maybe not. But it’s clear that the latest leap in AI technology is a big one. AI can do some amazing things.
So why can’t it make the mortgage lending process more equitable?
Perhaps, at least today, it’s because, as advanced as it is, AI is still a man-made technology. For all of the bells and whistles, we still get out of it what we put into it. And like any technology, even though it’s capable of “learning,” it’s bound by the conventions, formulas, and data fed into it. One machine learning economist from the Philly Fed made the point: “We believe it’s a great time to be thinking not just about if they’re [mortgage lenders] going to use these models–because they are–but also how we can guide the use of these models.”
It’s fairly simple. The decisions we make when designing and implementing these incredible and advanced technologies are going to reflect our own biases, principles, beliefs, and viewpoints. And they’re going to have very real consequences for years to come. It reminds me a little of the ongoing discussion about credit scoring itself. Are there inherent, systemic biases baked into the various models lenders use to ostensibly weigh the risks associated with each applicant? Or do those biases enter at the individual level?
When it comes to DEI and its wide-scale adoption across the mortgage industry, there’s a lot more that needs to be done. The programs, websites, departments and conferences are a pretty good start. But the true embrace of equity in lending starts with the tools and programming at the heart of the decision-making process. AI gives us a great opportunity to move toward that ideal in leaps and bounds. But it also gives us the opportunity to reinforce the deep-seated biases that plague every industry in our country. Like so many other opportunities, this one sits at the individual level. It’s up to us to decide what we want the future of the mortgage industry to look like. And I, for one, vote for a future built upon fairness and equality.
We at LodeStar are grateful to all of our clients, friends and colleagues who take the time to view Deeper Thoughts. Please consider having a look as well at some of our other great content, including our podcast, “LodeStar’s Lending Leaders,” and “A Tale of Two Mortgages: an original webcomic for the mortgage industry, presented by LodeStar.” As always, your feedback is welcomed and appreciated!