
Your credit score will soon be partially determined by AI monitoring your social media accounts

Credit score companies are some of the most privacy-invasive companies in the world. The amount of data they hold on people is shocking. And it's about to get worse.


The way in which credit scores are calculated is changing rapidly, from human assessment to computers, and more recently to artificial intelligence.

The change grew out of a desire to make credit scores more equitable. So far, however, credit companies have failed to remove bias.

Credit companies started using machine learning to offer "alternative credit" as a way to make credit scores less biased. To do this, they use data that wouldn't normally be included in a credit score to get a sense of a person's trustworthiness. According to these companies, all data is credit data: everything, including sexual orientation, political beliefs, and even what schools you went to.

Financial tech companies such as ZestFinance, Lenddo, SAS, Equifax, and Kreditech feed "alternative data" into their algorithms to generate credit scores, then sell their AI-powered systems to banks and other companies that use them to assess creditworthiness.

Lenddo, for example, offers a Lenddo Score that "complements traditional underwriting tools like credit scores because it relies exclusively on non-traditional data derived from a customer's social data and online behavior." It even offers creditors the option of having applicants install an app on their phones that analyzes their online searches.

In return, customers experience the illusion of agency. Users might think that if they search for "good" things on Google and connect with the right people and groups on social media, they could become eligible for a loan.

"It suggests in some ways, that a person could control their behavior and make themselves more lendable," said Tamara K. Nopper, an alternative data and credit researcher.

Lenders argue that credit scores based on alternative data can benefit those who have been discriminated against and excluded from banking. Not having a credit history, or having a bad one, doesn't mean a person is untrustworthy. "Let us mine your alternative data and we will think about it," they say.

But scanning people's alternative data looks a lot like Orwellian surveillance. Letting a company browse through all your digital footprints amounts to giving up your privacy in the name of credit.

Furthermore, using alternative information to generate credit scores could result in them being even more biased.

AI is fed huge amounts of data in order to detect patterns and produce an output, such as a credit score. Good training results depend on a quality dataset, as the programs can absorb the prejudices of their creators. For example, Amazon had to scrap an AI-powered hiring tool trained on past resumes after it showed signs of being biased against women.
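To see how this happens, here is a minimal, hypothetical sketch (the data and groups are invented for illustration): a naive model that learns approval patterns from past lending decisions will faithfully reproduce the bias baked into those decisions, even when the two groups have identical repayment records.

```python
# Toy illustration with made-up data: a model "trained" on biased past
# lending decisions inherits that bias, even though actual repayment
# behavior is identical across groups.
from collections import defaultdict

# Historical records: (group, repaid_past_loan, was_approved).
# Group "B" was approved less often despite the same repayment rate.
history = [
    ("A", True, True), ("A", True, True), ("A", False, True), ("A", False, False),
    ("B", True, False), ("B", True, True), ("B", False, False), ("B", False, False),
]

# "Training": record how often each (group, repaid) pattern was approved,
# which is exactly what a naive pattern-matching model does.
counts = defaultdict(lambda: [0, 0])  # pattern -> [approvals, total]
for group, repaid, approved in history:
    counts[(group, repaid)][0] += int(approved)
    counts[(group, repaid)][1] += 1

def score(group, repaid):
    """Predicted approval probability, learned from historical decisions."""
    approved, total = counts[(group, repaid)]
    return approved / total

# Two applicants who both repaid their past loans get different scores,
# purely because past decision-makers treated their groups differently.
print(score("A", True))  # 1.0
print(score("B", True))  # 0.5
```

The point is that the model never "decides" to discriminate; it simply mirrors the historical record it was given, which is why dataset quality matters so much.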

Today, there are laws aimed at preventing discrimination in credit scoring on the basis of color, religion, national origin, gender, race, age, and marital status. But for decades, banks targeted specific communities with predatory mortgages and loans.

Experts say that the high rate of foreclosure on those mortgages "wiped out nearly $400 billion in communities of color between 2009 and 2012." The companies that buy up these debts and take people to court target people of color more than anyone else.

Researchers warned that AI programs could effectively reintroduce redlining and other types of bias for certain communities by feeding on deeply biased or incomplete data.

In 2017, researchers at Stanford University found that even an objectively โ€œfairโ€ algorithm could be injected with bias in favor of a specific group depending on the quality of the training dataset.

A 2018 study found that "face-to-face and FinTech lenders charge Latinx/African-American borrowers 6-9 basis points higher interest rates."

Some experts worry that the use of alternative data might lead to a situation similar to the subprime mortgage crisis, with predatory loans once again offered to marginalized communities.

Even FICO, the data analytics company that in 1989 introduced the FICO score, the three-digit number behind most credit scores today, has recognized the dangers of over-relying on AI and alternative data: "[It] can actually obscure risks and shortchange consumers by picking up harmful biases and behaving counterintuitively."