Wednesday, October 08, 2025

The fight against data discrimination with machine learning technology

The Fintech Forecast with ACI Worldwide

The Fintech Forecast is a series of guest articles published each month from thought leaders at ACI Worldwide

Has the rise of the ‘side hustle’ changed the way we should lend?

Millennials have been the generation of side hustlers – no longer relying on a 9-to-5 salary, they are creating multiple income streams. The explosion of apps for finding services has facilitated a ‘gig economy’ in which flexible and temporary jobs are customary. With tax benefits to employers, many millennials are self-employed freelancers, leaving them with little financial data when applying for loans. But how does this economic shift affect the generation’s ability to take out a credit card, get a mortgage or buy insurance? Side hustlers could be increasing their earnings and therefore becoming reliable borrowers, but if they are young or their business is small, they are at a disadvantage when it comes to credit. I spoke with Patricia Rojas from our Data Science team to understand how to tackle the issue.

Digging into the data

The problem lies in the data: traditional credit models rely on static long-term data to assess the reliability of someone’s financial situation. This creates barriers that make it difficult for young people to build their livelihoods by climbing the property ladder or taking a loan out to start a business. We have the technology to solve data discrimination – financial institutions (FIs) just have to use it in the right way. Machine learning and big data have the potential to improve financial inclusion across the globe.

When a customer applies for a loan, banks and FIs rely on customer history to make an informed decision on whether that customer will be a risk. The same is true across other industries, including insurance, which sets how much to charge the customer based on historical data. This is where the problem begins – for first-time buyers or people signing up for their first credit card with little to no credit history, the traditional way of using machine learning to assess riskiness puts them at a disadvantage. If a consumer has been making regular rent payments for ten years and wants to buy a house and take out a mortgage, that rent payment data isn’t always considered.

How do we combat bias with machine learning?

You always start with a sample of your data, not all the data, so it is important to have an accurate sampling process. The sample must be an accurate representation of the population to avoid bias. If the sample is not accurate, results cannot be valid for the entire population.
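To make the sampling point concrete, here is a minimal sketch of stratified sampling in Python. The applicant records and the income-band strata are hypothetical; the idea is simply that drawing the same fraction from every stratum keeps the sample’s composition close to the population’s.

```python
import random
from collections import Counter

def stratified_sample(population, key, fraction, seed=0):
    """Draw a sample that preserves each stratum's share of the population.

    `key` maps a record to its stratum (e.g. an income band); `fraction`
    is the sampling rate applied within every stratum.
    """
    rng = random.Random(seed)
    strata = {}
    for record in population:
        strata.setdefault(key(record), []).append(record)
    sample = []
    for members in strata.values():
        k = max(1, round(len(members) * fraction))
        sample.extend(rng.sample(members, k))
    return sample

# Hypothetical applicant records: (applicant_id, income_band),
# with a 70/30 split between "low" and "high" earners.
applicants = [(i, "low" if i % 10 < 7 else "high") for i in range(1000)]
sample = stratified_sample(applicants, key=lambda r: r[1], fraction=0.1)

# The sample keeps the population's 70/30 low/high split.
shares = Counter(band for _, band in sample)
```

A purely random draw of 100 applicants could easily over- or under-represent one band; stratifying removes that source of sampling bias by construction.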

Many different algorithms can help you understand the bias in your data, but it depends on the use case. For instance, if a financial institution is interested in understanding what type of clients will repay a loan or a mortgage, it may want to analyse the average disposable income for clients. If your sample includes clients that mostly make between 30,000-50,000 USD per year, but one group in the sample makes an unusually large amount of money in comparison, the results will be invalid. Conversely, if the use case is anti-money laundering (AML), you’re looking for the outliers, so removing large anomalies from the sample would make no sense.
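The income example above can be sketched with a standard outlier test. This is a simplified illustration using Tukey’s IQR fences; the income figures are invented, and whether the flagged values are noise to remove (a repayment model) or the signal itself (AML) depends on the use case, exactly as described.

```python
import statistics

def iqr_outliers(values):
    """Flag values outside the Tukey fences: more than 1.5 * IQR
    below the first quartile or above the third quartile."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical annual incomes (USD): one client far outside the
# 30,000-50,000 range the rest of the sample sits in.
incomes = [32_000, 38_000, 41_000, 45_000, 47_000, 49_000, 400_000]
outliers = iqr_outliers(incomes)

# For a repayment model we might exclude the outliers before training;
# for an AML use case, the outliers are what we would investigate.
clean = [v for v in incomes if v not in outliers]
```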

When a side-hustler or self-employed young person wants to take out a mortgage, the data may tell us that, because they don’t have a regular salary, they are a less reliable borrower. To fight this discrimination, FIs can instead look at data from anonymous clients in similar financial situations to supplement the information on the individual. By analysing how those similar customers behave, FIs can feed that insight into their machine learning models. If the analysis shows that most of them make regular mortgage repayments, the likelihood is that our side-hustler will be reliable too. FIs also need to understand what, aside from a regular salary, separates good borrowers from bad ones – and machine learning has the potential to assess this.
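One way to read “look at similar clients” is a nearest-neighbour estimate. The sketch below is a deliberately tiny illustration with invented, anonymised profiles of (monthly income, income volatility, repaid on time); a real model would use many more features and scale them so that income doesn’t dominate the distance.

```python
import math

# Hypothetical anonymised client profiles:
# (monthly_income_usd, income_volatility, repaid_on_time)
history = [
    (3200, 0.40, True), (2900, 0.35, True), (3100, 0.45, True),
    (3000, 0.50, False), (5200, 0.05, True), (1500, 0.60, False),
]

def neighbour_repayment_rate(applicant, history, k=3):
    """Estimate repayment likelihood from the k most similar past clients.

    Similarity here is raw Euclidean distance over the two numeric
    features - a simplification; production models would normalise.
    """
    def distance(profile):
        return math.dist(applicant, profile[:2])
    nearest = sorted(history, key=distance)[:k]
    return sum(p[2] for p in nearest) / k

# A side-hustler with irregular income: no salary history of their own,
# but their closest peers mostly repay on time.
rate = neighbour_repayment_rate((3050, 0.42), history)
```

The applicant is scored on peer behaviour rather than on the absence of a salaried history, which is the supplementation strategy the paragraph describes.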

The importance of model explainability

This is where model explainability comes in – it is an important element in avoiding discrimination in fraud detection models. Let’s look at this in the context of young cardholders. If a fraud analyst reviewing suspicious transactions carries a human bias based on age stereotypes, they could decide to block a transaction based on the age group of the initiator. The model explainer helps identify the real reason behind a suspicious transaction’s high score. Suppose an $800 transaction from a young cardholder arrives alongside a strange email address and receives a high fraud score, while another $800 transaction without the strange email receives a low one. The email is what separates the high score from the low score, not the fact that the cardholder is young. Explainability helps us understand why a transaction got a certain score, so we can make judgements based on data, not bias.
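The two-transaction comparison above can be sketched with a toy scoring model. This is not ACI’s fraud model – it is a hypothetical linear score with made-up weights, chosen because its per-feature contributions can be read off directly, which is the kind of breakdown an explainer reports for each alert.

```python
# Toy fraud score: a linear model whose per-feature contributions are
# transparent, mimicking the attribution an explainer would report.
WEIGHTS = {
    "amount_usd": 0.0005,            # small contribution per dollar
    "strange_email": 0.55,           # suspicious email address flag
    "cardholder_age_under_25": 0.02, # age carries almost no weight
}

def score_with_explanation(transaction):
    """Return the fraud score and each feature's contribution to it."""
    contributions = {f: WEIGHTS[f] * transaction[f] for f in WEIGHTS}
    return sum(contributions.values()), contributions

# The two $800 transactions from the text: same amount, same age group,
# differing only in the strange-email flag.
risky, risky_why = score_with_explanation(
    {"amount_usd": 800, "strange_email": 1, "cardholder_age_under_25": 1})
normal, normal_why = score_with_explanation(
    {"amount_usd": 800, "strange_email": 0, "cardholder_age_under_25": 1})

# The score gap is driven by the email flag, not the cardholder's age.
gap = risky - normal
top_driver = max(risky_why, key=lambda f: risky_why[f] - normal_why[f])
```

Because the amount and age contributions cancel between the two transactions, the explanation isolates the email flag as the driver – the judgement the analyst should act on.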

Fighting data discrimination is in everyone’s interest. It opens access to financial services for reliable customers who would otherwise have been denied. Not only does this result in a better customer experience, it also helps banks and FIs serve more reliable, legitimate customers and make more money. Data discrimination always needs to be a consideration when offering financial services, fighting fraud or combating money laundering. Identifying bias is one thing; acting to remove it from your data is another.
