Breaking News
Apple Card's biased algorithms – expert explains how to reduce bias in AI
It has been reported that Apple's credit card offered different credit limits to men and women. In fact, even Apple co-founder Steve Wozniak has expressed concern that the algorithms used to set limits might be inherently biased against women.
Commenting on the news, and explaining how financial organisations can keep bias in artificial intelligence to a minimum, Peter van der Putten, assistant professor at Leiden University and global director at Pegasystems, said the following:
“AI is as biased as the data used to create it. Even if its designers have the best intentions, errors may creep in through the selection of biased data for machine learning models, as well as through prejudice and assumptions in built-in logic. Therefore, financial organisations need to make sure that the data used to create their algorithms is as free of prejudice as possible. In addition, one should realise that human decisions can also be subjective and flawed, so we should approach these with scrutiny as well.”
“Whilst not impossible, and with published tweets as the only evidence, it is highly unlikely that the Apple Card policies were explicitly built to take gender into account. Credit policies are typically subject to internal model approval. That said, it is not enough simply to remove gender from your prediction models and rules, as other, more innocent-looking pieces of data might be correlated with protected variables like gender. In this scenario, the claimant's statement that ‘all other data was equal’ between husband and wife may or may not hold when looking at the data and decision in detail. Under European law (GDPR), customers have the right to an explanation for these kinds of decisions, so in this situation I would certainly make that request.”
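The proxy-variable problem described above can be illustrated with a minimal sketch. The data below is entirely synthetic and the feature names are hypothetical: a "proxy" feature stands in for any innocent-looking attribute (say, a shopping-category score) that happens to be correlated with gender. A least-squares model fitted on biased historical limits never sees the gender column, yet its predictions still differ systematically by gender because the proxy leaks it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Synthetic, purely illustrative data.
gender = rng.integers(0, 2, n)            # protected variable (0 or 1)
proxy = gender + rng.normal(0, 0.5, n)    # "innocent" feature correlated with gender
income = rng.normal(50, 10, n)            # legitimate feature

# Suppose historical credit limits were biased: group 0 received higher limits.
limit = 1000 * income + 5000 * (1 - gender) + rng.normal(0, 500, n)

# Fit a model that never sees gender ...
X = np.column_stack([np.ones(n), income, proxy])
coef, *_ = np.linalg.lstsq(X, limit, rcond=None)
pred = X @ coef

# ... yet its predictions still differ systematically by gender,
# because the proxy feature carries the gender signal.
gap = pred[gender == 0].mean() - pred[gender == 1].mean()
print(round(gap))  # a large positive gap, despite gender being excluded
```

This is why auditing a model requires checking outcomes across protected groups, not just confirming that the protected attribute was excluded from the inputs.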
“There is also a need for businesses not to rush into AI as simply a cost-saving. The starting point should be how AI and machine learning can be elements in improving end-to-end processes to serve consumers, citizens and the human co-workers of an AI workforce. This requires businesses adopting a responsible mindset to using AI: defining what the company’s values are, building in transparency and removing bias – I call it a human approach to AI.”
“A recent Pega survey into consumer attitudes to AI found that 28% aren't comfortable with its use by businesses. Stories such as this one about the Apple Card will only perpetuate this opinion. Financial organisations must be absolutely transparent about their use of algorithms and AI. The key is for banks to balance transparency with accuracy: the more ‘material’ the AI's outcome, as with these credit limit decisions, the greater the need for transparency and control. Several industries have seen AI amplify inherent prejudice, and this has to be avoided at all costs.”