Thursday, August 11, 2022

While your credit score is generated from data about you, the way the algorithms behind the score are developed is often based on broader financial trends.

“A lot of times credit scores are based on history from all sorts of other aggregated data, from people who look like you,” said Safiya Noble, professor of Gender Studies and African American Studies at the University of California, Los Angeles.

Noble, who wrote the book “Algorithms of Oppression,” explores how algorithms can perpetuate racism and gender bias.

“And this is where we start to run into trouble,” Noble said. “If you’re part of a group that has traditionally been denied credit or offered predatory products, then your profile may look more like those people’s and you may be penalized.”

As a result, consumers may not have access to a loan, mortgage, or better insurance rates.

David Silberman, senior fellow at the Center for Responsible Lending, said it is part of a larger problem.

“Credit scores are very reflective of the history of discrimination in the country,” he said.

Silberman, who spent a decade at the Consumer Financial Protection Bureau and many years in the financial services industry, has pondered how algorithms can reflect privileges or lack thereof.

“When you start with no assets, with limited income prospects, it affects the type of credit you can get,” he said.

For example, payday lenders focus on African American and Latino neighborhoods and tend to offer loans on less favorable terms, so borrowers using these lenders could be more likely to default.

“Your ability to repay that loan will be impacted, and that will then work its way into creditworthiness on its own,” Silberman said.

According to payment processor Shift, white Americans have an average FICO score of 734, a relatively good score for most financial products. But for Black Americans, it’s 677. A lower score can mean higher interest rates or result in a loan being denied.

Because accurate historical data can still lead to biased algorithms, many researchers and companies are looking for new options to determine creditworthiness, but even that can be risky.

Nicholas Schmidt, CEO of SolasAI, tests algorithms for disparate impact. He said bias can “creep in” anywhere.

“Most people talk about bias in the data. And that’s kind of an obvious thing,” he said.
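The article doesn’t describe SolasAI’s actual methodology, but a common baseline screen for disparate impact is the “four-fifths rule”: compare each group’s approval rate to the most-favored group’s rate, and flag any ratio below 0.8. A minimal sketch, with hypothetical group names and approval counts:

```python
# Minimal disparate-impact screen using the "four-fifths rule."
# All group names and numbers below are hypothetical illustration data,
# not anything reported in the article.

def adverse_impact_ratios(approvals):
    """approvals maps group -> (approved_count, total_applicants)."""
    rates = {g: a / t for g, (a, t) in approvals.items()}
    best = max(rates.values())  # rate of the most-favored group
    return {g: rate / best for g, rate in rates.items()}

approvals = {
    "group_a": (80, 100),   # 80% approval rate
    "group_b": (55, 100),   # 55% approval rate
}

ratios = adverse_impact_ratios(approvals)
for group, ratio in ratios.items():
    flag = "FLAG" if ratio < 0.8 else "ok"
    print(f"{group}: ratio = {ratio:.2f} ({flag})")
```

Here group_b’s approval rate is about 69% of group_a’s, below the 0.8 threshold regulators often use as a screening rule; a real review would go well beyond this single ratio.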

One example he shared was a lender’s algorithm for assessing the credit risk of people who had defaulted on credit card debt. The best predictor, he said, turned out to be how often consumers shopped at convenience stores.

The Patron Convenience Store in southeast Washington, D.C. (Kimberly Adams/Marketplace)

Convenience stores, whether at gas stations, in malls, or freestanding like the Patron Convenience Store in southeast D.C., can get crowded on a Wednesday morning as people shop for lottery tickets and snacks.

“And I’ve been thinking about it. What do you get at a convenience store? Cheap beer, cigarettes, bad candy and lottery tickets,” said Schmidt. “All of these probably correlate fairly well with risky behavior, which probably correlates well with bad credit card scores.”

But then Schmidt and his team thought again and realized that there was a gap in that analysis: food deserts. These are areas where residents have low incomes and do not have easy access to supermarkets or large grocery stores, according to the U.S. Department of Agriculture.

In 2021, about 13.5 million people lived in America’s food deserts — and many of them shopped at convenience stores.

Ekram Aman is a cashier at Penn Way Market, a small neighborhood grocery store in a food desert in southeast Washington, D.C.

Ekram Aman poses next to a shelf of Little Debbie desserts at Penn Way Market in southeast Washington, D.C. (Kimberly Adams/Marketplace)

She said most of her patrons use electronic benefit transfer, a tool for accessing government food aid programs, to buy groceries.

“They say it’s because it is convenient for them. And it’s very convenient, especially for people who don’t drive,” said Aman.

Most customers come from the neighborhood, she said. Sometimes they send their children to pick up food for dinner or some of the household goods packed onto the narrow store’s shelves.

SolasAI’s Schmidt said this is how discrimination can creep in: an algorithm that lumps all of these shoppers together treats data generated by necessity as a signal of risk.

“What you’re going to do is capture the risky behavior of white people in the suburbs, going to convenience stores and buying lottery tickets, bad candy and bad beer,” he said.

But, Schmidt said, you will also capture creditworthy people in cities: people with low incomes and people of color, as well as wealthier people in densely populated cities who shop at bodegas.

Schmidt doesn’t know if that particular variable ended up in a lender’s final model, because financial services firms often adjust their models to account for built-in biases.
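Schmidt’s point, that a facially neutral variable like convenience-store shopping can stand in for geography and group membership, can be shown with a toy simulation. All probabilities below are invented for illustration and do not come from the article or any lender’s model:

```python
# Toy simulation of a proxy variable: convenience-store shopping is driven
# partly by individual choice and partly by living in a food desert, so it
# ends up correlated with neighborhood even though neighborhood is never
# used directly. All probabilities are invented for illustration.
import random

random.seed(0)

def simulate(n=10_000):
    rows = []
    for _ in range(n):
        food_desert = random.random() < 0.2  # assume 20% live in a food desert
        # Residents of food deserts rely on convenience stores for groceries.
        shops_convenience = random.random() < (0.9 if food_desert else 0.3)
        rows.append((food_desert, shops_convenience))
    return rows

rows = simulate()
rate_in = sum(s for f, s in rows if f) / sum(1 for f, _ in rows if f)
rate_out = sum(s for f, s in rows if not f) / sum(1 for f, _ in rows if not f)
print(f"convenience-store shopping rate, food desert: {rate_in:.2f}")
print(f"convenience-store shopping rate, elsewhere:   {rate_out:.2f}")
```

A model that scores frequent convenience-store shoppers as risky would therefore penalize food-desert residents as a group, regardless of individual behavior, which is why fair-lending reviews look for proxies like this.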

But, said David Silberman of the Center for Responsible Lending, these algorithms can only do so much.

“There may be tweaks that marginally bring more people into the system or give a more complete picture of their creditworthiness by looking at a more comprehensive data set,” he said. “But I think that’s marginal. It will not address the fundamental issues of inequality that we need to address.”
