Morality, Money and Intense Mathematics: The Rabbit Hole of Effective Altruism
It costs around $40,000 to train a guide dog. But that same money could also prevent blindness in around 400 people suffering from trachoma. So, which is better? An Effective Altruist would tell you the second option. … Even if it were the most adorable puppy helping a loved one.
Effective Altruism (EA) ruthlessly applies logic over emotion. Sites like Giving What We Can analyse data to calculate the efficiency of charities, alongside how your income compares to the rest of the world. It’s a thought-provoking (and slightly guilt-inducing) experience, filled with accountability and Excel-spreadsheet energy.
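For a sense of the arithmetic behind that guide-dog comparison, here is a minimal sketch in Python using the rough figures quoted above; the intervention names and numbers are illustrative placeholders, not an actual charity evaluation.

```python
# Illustrative cost-per-outcome comparison using the article's rough figures.
# These are placeholders for the sake of the arithmetic, not official evaluations.
interventions = {
    "Train one guide dog": {"cost_usd": 40_000, "people_helped": 1},
    "Prevent trachoma blindness": {"cost_usd": 40_000, "people_helped": 400},
}

for name, figures in interventions.items():
    cost_per_person = figures["cost_usd"] / figures["people_helped"]
    print(f"{name}: ~${cost_per_person:,.0f} per person helped")
```

Run as written, it prices the guide dog at roughly $40,000 per person helped and trachoma prevention at roughly $100, which is exactly the gap EA evaluators lean on.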
TED speaker, founder and freelancer Siobhan Ballan uses sites like these to inform her decisions. She pledges 10% of her income to the most impactful causes. “The thing that really convinced me to get more involved was a TED talk by Peter Singer entitled ‘The Why and How of Effective Altruism’”, she explains.
This 17-minute video has garnered only 760K views in 12 years, yet it seems to have changed a disproportionate number of lives. It came up again and again across interviews. Singer addresses today’s moral failings and offers a solution: give everything you can to the most impactful causes. Mosquito nets over new trainers, all the way.
It’s easy to see the appeal of EA. Yet no EA leaders would speak to me on the record. In six years and hundreds of articles, I have never been ghosted so much. Interviews were left hanging, mysteriously moved to “background, off-the-record”, or abruptly cancelled. In the aftermath of the Sam Bankman-Fried scandal, there’s been a real nervousness around the media.
PR problems and the dubious “earn to give” model
Now serving a 25-year sentence for fraud, Bankman-Fried was once the EA poster child. Giving up a career in an NGO to become a billionaire, Bankman-Fried embodied EA’s “earn to give” philosophy. The logic is that members maximize donations by earning the highest salaries in “morally neutral” roles. Students were even recruited directly from universities.
EA websites suggest that high-paying, “morally neutral” industries could be “tech entrepreneurship, finance, partner-track in professional services”… For me, at least, this is where the logic unravels. Finance and technology… morally neutral?!
The genocide in Gaza is partially funded with war bonds underwritten by financial firms. Global investment giants profit from the brutal humanitarian crisis in Sudan, while weapons are financed directly by banks. Russia’s unprovoked war against Ukraine is largely paid for with the taxes of global banks that did not exit the country, including Raiffeisen Bank, UniCredit and Citigroup.
Tech conglomerates like Google are now using AI in weaponry. What’s more, devastatingly cruel armaments like the cluster bombs unleashed on civilians in Afghanistan are financed by the likes of JPMorgan, Goldman Sachs, Deutsche Bank, HSBC, Barclays, Citigroup, Crédit Agricole… It’s probably quicker to list the banks that are not enabling them.
The technology sector emits up to 4% of global carbon emissions, creating far more devastation than any single EA donation can ever fix. Meanwhile, the world’s largest banks have handed fossil fuel companies $9.7 trillion since the 2015 Paris Agreement was signed. In what world are they morally neutral?
Effective Altruists hit back that it’s better to have high-powered roles held by people who care for the world. But we have yet to see much evidence.
Elon Musk declared himself an Effective Altruist (a claim EA co-founder William MacAskill somewhat refutes), yet his dismantling of USAID is projected to claim 14 million lives. Facebook co-founder Dustin Moskovitz now takes Bankman-Fried’s old spot as the most generous Effective Altruist. But Facebook contributes to systemic disinformation, privacy violations, and the rise of far-right sentiment. Bill Gates is often cited, yet Microsoft has been at the epicentre of numerous tax and corruption scandals, with sky-high carbon emissions (to match his four private jets). The world would surely be better off if they had never become billionaires.
In the wrong hands, the “earn to give” logic can also be a manipulation tool. One EA ex-member alleges that some high earners demanded graduates do their laundry, because the earners’ “marginal hour is better spent working”.
It gets weirder. EA events also have a history of men asking women for sexual favours and even threesomes… presumably to make them more effective at work, too? Two of the three prohibited behaviours for the EA Global event concern unwanted sexual advances, alongside the sympathetic note that “we understand that human interaction is complex”. Umm… ok.
Source: https://www.effectivealtruism.org/ea-global/events/ea-global-new-york-city-2025
Mars, locusts and lobbying…?
The hyper-focus on logic can lead EA studies into ever-stranger places. Dr Joshua Hobbs runs through some of them, such as whether wild predators should be culled, or how many locusts are worth one human life. Perhaps it’s little surprise that some theories have become “widely mocked”.
The hottest EA trend is ‘longtermism’, which considers how actions today will affect humanity in the future. This is how we get to the “we need to colonize Mars” logic.
Over the past few years, EA has even taken its philosophy into Washington. Lobbying groups like the Center for AI Policy and the Center for AI Safety spent $281,964 and $270,000 respectively on lobbying US politicians last year, raising questions about whether this is an efficient use of funds.
Professor Brian Berkey suggests that longtermists have started to hijack some of the traditional space, and “displaced the focus from things like global health and animal welfare”. Factions are forming in the movement.
Is EA tech’s answer to religion?
EA adds so much to the world of philanthropy: it instils a much-needed sense of accountability and transparency. The (mostly anonymous) members I spoke to are truly some of the kindest people I’ve met. Frankly, I sat there cringing at how little I am doing for the world. I don’t even give 1% of my wealth away, let alone 10%.
But you can have too much of a good thing. The unbroken logic of EA could also be its downfall, descending into far-fetched extremist arguments that sometimes feel more like complicated sci-fi plots than philanthropy. Like religion, the overarching message is plagued with institutional contradictions, scandals… and a few cults. What’s more, the movement has attracted unhinged billionaire narcissists looking to clean up their own tragic images. Not great.
EA continues to expand into new avenues. Armed with longtermist logic, we now have a concerning surge of pale, stale and rich technocrats marching into Washington, convinced they know best how to fix the planet… Elon energy, much? But do they really know? And at what point should you look up from the spreadsheet, space theories or salary-maxing, towards the person on your street who just needs a guide dog?
In the end, it was Siobhan who came up with the best solution. Just give whatever you can, whenever you can, and don’t call it Effective Altruism if you don’t want to.